<![CDATA[Gerard Cornuejols, Carnegie Mellon University]]> 27215 Speaker

Gerard Cornuejols
IBM University Professor of Operations Research
Tepper School of Business
Carnegie Mellon University

Abstract

This talk will be based on joint work with Borozan, Basu, Conforti and Zambelli. We extend a theorem of Lovasz characterizing maximal lattice-free convex sets. This result has implications in integer programming. In particular, we show a one-to-one correspondence between these sets and minimal inequalities.

Bio

Gerard Cornuejols was Editor-in-Chief of MOR from 1999 to 2003 and has served on the Advisory Board of MOR since then. He is renowned for his work on integer programming, for which he has been awarded several major prizes.

]]> Mike Alberghini 1 1356018974 2012-12-20 15:56:14 1475892100 2016-10-08 02:01:40 0 0 event

]]>
2010-03-02T11:00:00-05:00 2010-03-02T12:00:00-05:00 2010-03-02T12:00:00-05:00 2010-03-02 16:00:00 2010-03-02 17:00:00 2010-03-02 17:00:00 2010-03-02T11:00:00-05:00 2010-03-02T12:00:00-05:00 America/New_York America/New_York datetime 2010-03-02 11:00:00 2010-03-02 12:00:00 America/New_York America/New_York datetime <![CDATA[]]> Renato Monteiro, ISyE
Contact Renato Monteiro
404-894-2300

]]>
<![CDATA[Paul Glasserman, Columbia University]]> 27215 Speaker

Paul Glasserman
Jack R. Anderson Professor
Graduate School of Business
Columbia University

Abstract

Among the solutions proposed to the problem of banks "too big to fail" is contingent capital in the form of debt that converts to equity when a bank's regulatory capital ratio falls below a threshold. We analyze the dynamics of such a security with continuous conversion and derive closed-form expressions for its value when the firm's assets are modeled as geometric Brownian motion and the conversion trigger is an asset-based capital ratio. A key step in the analysis is an explicit formula for the fraction of equity held by the original holders of the contingent capital debt as a function of the maximum drop in asset value. We contrast this analysis with the case of a market-based (rather than accounting-based) conversion trigger. 
This is joint work with Behzad Nouri.
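The "maximum drop in asset value" that drives the equity-dilution formula can be made concrete with a small Monte Carlo sketch. This is only an illustration, not the talk's closed-form analysis, and the drift, volatility, horizon, and seed below are hypothetical choices:

```python
import math
import random

def simulate_gbm_max_drop(v0=100.0, mu=0.02, sigma=0.3, T=5.0, steps=1000, seed=7):
    """Simulate one geometric-Brownian-motion asset path and return
    (final asset value, maximum proportional drop from the running peak)."""
    random.seed(seed)
    dt = T / steps
    v, peak, max_drop = v0, v0, 0.0
    for _ in range(steps):
        z = random.gauss(0.0, 1.0)
        # exact one-step GBM update
        v *= math.exp((mu - 0.5 * sigma ** 2) * dt + sigma * math.sqrt(dt) * z)
        peak = max(peak, v)
        max_drop = max(max_drop, 1.0 - v / peak)
    return v, max_drop
```

Averaging the drop statistic over many such paths would give a simulation counterpart to the quantity the explicit formula captures in closed form.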

Bio

Paul Glasserman is the Jack R. Anderson Professor of Business at Columbia Business School, where he served as senior vice dean from 2004 to 2008. Since 2008, he has also been an academic visitor at the Federal Reserve Bank of New York. His research focuses on derivative securities, risk management, stochastic models, and simulation. His publications include the book Monte Carlo Methods in Financial Engineering, which received the 2005 I-Sim Outstanding Simulation Publication Award and the 2006 Lanchester Prize. He also received Risk magazine's 2007 Quant of the Year Award.

]]> Mike Alberghini 1 1356016936 2012-12-20 15:22:16 1475892100 2016-10-08 02:01:40 0 0 event

]]>
2010-11-23T11:00:00-05:00 2010-11-23T12:00:00-05:00 2010-11-23T12:00:00-05:00 2010-11-23 16:00:00 2010-11-23 17:00:00 2010-11-23 17:00:00 2010-11-23T11:00:00-05:00 2010-11-23T12:00:00-05:00 America/New_York America/New_York datetime 2010-11-23 11:00:00 2010-11-23 12:00:00 America/New_York America/New_York datetime <![CDATA[]]> Ton Dieker, ISyE
Contact Ton Dieker
404-385-3140

]]>
<![CDATA[Jorge Nocedal, Northwestern University]]> 27215 Speaker

Jorge Nocedal
Professor
Director of the Computational Science Institute
Northwestern University

Abstract

We present a "semi-stochastic" Newton method motivated by machine learning problems with very large training sets as well as by the availability of powerful distributed computing environments. The method employs sampled Hessian information to accelerate convergence and enjoys convergence guarantees. We illustrate its performance on multiclass logistic models for the speech recognition system developed at Google. An extension of the method to the sparse L1 setting as well as a complexity analysis will also be presented. This is joint work with Will Neveitt (Google), Richard Byrd (Colorado) and Gillian Chin (Northwestern).
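The core idea, a Newton step whose Hessian is estimated from a random subsample while the gradient uses the full data, can be sketched on a one-parameter logistic model. This is a rough illustration and not the authors' algorithm; the damping, sampling fraction, and toy data are all hypothetical:

```python
import math
import random

def sampled_hessian_newton(xs, ys, iters=20, sample_frac=0.5, seed=0):
    """Fit P(y=1|x) = sigmoid(w*x) with Newton iterations: full-data
    gradient, Hessian estimated from a random subsample, damped steps."""
    random.seed(seed)
    w, n = 0.0, len(xs)
    k = max(1, int(sample_frac * n))
    for _ in range(iters):
        # gradient of the average negative log-likelihood over all data
        g = sum((1.0 / (1.0 + math.exp(-w * x)) - y) * x
                for x, y in zip(xs, ys)) / n
        # Hessian estimated from a random subsample of size k
        idx = random.sample(range(n), k)
        h = 0.0
        for i in idx:
            p = 1.0 / (1.0 + math.exp(-w * xs[i]))
            h += p * (1.0 - p) * xs[i] ** 2
        h /= k
        # damp the step to guard against a poorly sampled Hessian
        w -= max(-1.0, min(1.0, g / (h + 1e-8)))
    return w
```

The sampled Hessian keeps the per-iteration cost low while retaining curvature information, which is the trade-off the convergence guarantees address.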

Bio

Jorge Nocedal is a professor in the IEMS and EECS departments at Northwestern University. He obtained a BS from the National University of Mexico and a PhD from Rice University. His research interests are in optimization and scientific computing, and in their application to machine learning, computer-aided design and financial engineering. He is the author (with Steve Wright) of the book Numerical Optimization.

He is a SIAM Fellow, an ISI Highly Cited Researcher (Mathematics category), and was an invited speaker at the 1998 International Congress of Mathematicians. He serves on the editorial board of Mathematical Programming, and in 2011 he will become editor-in-chief of the SIAM Journal on Optimization. In 1998 he was appointed Bette and Neison Harris Professor of Teaching Excellence at Northwestern.

]]> Mike Alberghini 1 1356016289 2012-12-20 15:11:29 1475892100 2016-10-08 02:01:40 0 0 event

]]>
2010-11-30T11:00:00-05:00 2010-11-30T12:00:00-05:00 2010-11-30T12:00:00-05:00 2010-11-30 16:00:00 2010-11-30 17:00:00 2010-11-30 17:00:00 2010-11-30T11:00:00-05:00 2010-11-30T12:00:00-05:00 America/New_York America/New_York datetime 2010-11-30 11:00:00 2010-11-30 12:00:00 America/New_York America/New_York datetime <![CDATA[]]> Renato Monteiro, ISyE
Contact Renato Monteiro
404-894-2300

]]>
<![CDATA[Ed Kaplan, Yale University]]> 27215 Speaker

Edward H. Kaplan

William N. and Marie A. Beach Professor of Management Sciences,
Yale School of Management; 

Professor of Public Health, 
Yale School of Public Health; 

Professor of Engineering, 
Yale School of Engineering and Applied Science

Abstract

This article presents the first model developed specifically for understanding the infiltration and interdiction of ongoing terror plots by undercover intelligence agents, and does so via novel application of ideas from queueing theory and Markov population processes. The resulting "terror queue" models predict the number of undetected terror threats in an area from agent activity/utilization data, and also estimate the rate with which such threats can be interdicted. The models treat terror plots as customers and intelligence agents as servers. Agents spend all of their time either detecting and infiltrating new terror plots (in which case they are "available"), or interdicting already detected terror plots (in which case they are "busy"). Initially we examine a Markov model assuming that intelligence agents, while unable to detect all plots, never err by falsely detecting fake plots. While this model can be solved numerically, a simpler Ornstein-Uhlenbeck diffusion approximation yields some results in closed form while providing nearly identical numerical performance. The transient behavior of the terror queue model is discussed briefly along with a sample sensitivity analysis to study how model predictions compare to simulated results when using estimated versus known terror plot arrival rates. The diffusion model is then extended to allow for the false detection of fake plots. Such false detection is a real feature of counterterror intelligence given that intelligence agents or informants can make mistakes, as well as the proclivity of terrorists to deliberately broadcast false information. The false detection model is illustrated using suicide bombing data from Israel.
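The customers-and-servers analogy can be caricatured with a tiny continuous-time simulation: undetected plots arrive, either mature undetected or are found by a free agent, and a detecting agent stays busy for the interdiction's duration. This is a simplified sketch of the flavor of the model, not the paper's terror queue or its diffusion approximation, and every rate below is hypothetical:

```python
import random

def terror_queue_sim(lam=2.0, mature=1.0, detect=0.5, interdict=4.0,
                     agents=3, T=1000.0, seed=1):
    """Gillespie-style simulation. State: x undetected plots, `busy` agents.
    Events: plot arrival (rate lam); undetected plot matures (mature per plot);
    a free agent detects a plot (detect per plot-agent pair), becoming busy;
    an interdiction completes (interdict per busy agent), freeing the agent.
    Returns (time-average undetected plots, agent utilization)."""
    random.seed(seed)
    t, x, busy = 0.0, 0, 0
    area_x, area_busy = 0.0, 0.0
    while t < T:
        free = agents - busy
        rates = [lam, mature * x, detect * x * free, interdict * busy]
        total = sum(rates)
        dt = random.expovariate(total)
        area_x += x * dt          # accumulate time-weighted state
        area_busy += busy * dt
        t += dt
        u = random.random() * total
        if u < rates[0]:
            x += 1                # new plot arrives undetected
        elif u < rates[0] + rates[1]:
            x -= 1                # plot matures without being detected
        elif u < rates[0] + rates[1] + rates[2]:
            x -= 1                # a free agent detects the plot...
            busy += 1             # ...and begins interdicting it
        else:
            busy -= 1             # interdiction completes; agent freed
    return area_x / t, area_busy / (t * agents)
```

Comparing such simulated averages against an analytical approximation is the kind of validation exercise the abstract's sensitivity analysis describes.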

Bio

Professor Kaplan's research has been reported on the front pages of the New York Times and the Jerusalem Post, editorialized in the Wall Street Journal, recognized by the New York Times Magazine's Year in Ideas, and discussed in Time, Newsweek, US News and World Report, Consumer Reports, and the New Yorker, as well as in person on NBC's Today Show, the Cronkite Report, and National Public Radio.

The author of more than 100 research articles, Professor Kaplan has received both the Lanchester Prize and the Edelman Award, the two top honors in the operations research field. An elected member of both the National Academy of Engineering and the Institute of Medicine of the US National Academies, he has also twice held the prestigious Lady Davis Visiting Professorship at the Hebrew University of Jerusalem, where he has investigated AIDS policy issues facing the State of Israel.

Kaplan's current research focuses on the application of operations research to problems in counterterrorism and homeland security.

]]> Mike Alberghini 1 1356018762 2012-12-20 15:52:42 1475892100 2016-10-08 02:01:40 0 0 event

]]>
2010-03-03T11:00:00-05:00 2010-03-03T12:00:00-05:00 2010-03-03T12:00:00-05:00 2010-03-03 16:00:00 2010-03-03 17:00:00 2010-03-03 17:00:00 2010-03-03T11:00:00-05:00 2010-03-03T12:00:00-05:00 America/New_York America/New_York datetime 2010-03-03 11:00:00 2010-03-03 12:00:00 America/New_York America/New_York datetime <![CDATA[]]> Ton Dieker, ISyE
Contact Ton Dieker
404-385-3140

]]>
<![CDATA[Jose Blanchet, Columbia University]]> 27215 Speaker
Jose Blanchet
Columbia University

Abstract
Our focus is on the development of provably efficient simulation algorithms for estimating large deviations probabilities (such as overflow probabilities) in the context of many-server queues. These types of systems, which have been the subject of much investigation in recent years, pose interesting challenges from a rare-event simulation standpoint, given their measure-valued state descriptor. We shall explain a technique that has the following elements. First, it introduces a pivotal set that is suitably chosen to deal with boundary-type behavior, which is common in the analysis of queueing systems. Second, it takes advantage of Central Limit Theorem approximations that have been developed recently for these types of systems. Third, it uses a novel bridge-sampling approach in order to describe an asymptotically optimal (in a certain sense) importance sampling scheme. This work provides the first systematic approach to developing provably efficient rare-event simulation methodology for these types of systems.

This is joint work with P. Glynn and H. Lam.

Bio
Jose Blanchet is a faculty member of the IEOR Department at Columbia University. He holds a Ph.D. in Management Science and Engineering from Stanford University. Prior to joining Columbia, he was a faculty member in the Statistics Department at Harvard University. Jose is a recipient of the 2009 Best Publication Award given by the INFORMS Applied Probability Society and of a CAREER award in Operations Research given by the NSF in 2008. He previously worked as an analyst at Protego Financial Advisors, a leading investment bank in Mexico. His research interests are in applied probability and Monte Carlo methods. He serves on the editorial boards of Advances in Applied Probability, Journal of Applied Probability, QUESTA, and TOMACS.

]]> Mike Alberghini 1 1356019728 2012-12-20 16:08:48 1475892100 2016-10-08 02:01:40 0 0 event

]]>
2010-01-26T11:00:00-05:00 2010-01-26T12:00:00-05:00 2010-01-26T12:00:00-05:00 2010-01-26 16:00:00 2010-01-26 17:00:00 2010-01-26 17:00:00 2010-01-26T11:00:00-05:00 2010-01-26T12:00:00-05:00 America/New_York America/New_York datetime 2010-01-26 11:00:00 2010-01-26 12:00:00 America/New_York America/New_York datetime <![CDATA[]]> Ton Dieker, ISyE
Contact Ton Dieker
404-385-3140

]]>
<![CDATA[Dimitris Bertsimas, MIT]]> 27215 Speaker

Dimitris Bertsimas
Boeing Leaders for Global Operations Professor
Operations Research/Statistics
Sloan School of Management
Massachusetts Institute of Technology

Abstract

In this presentation, we show a significant role that symmetry, a fundamental concept in convex geometry, plays in determining the power of robust and finitely adaptable solutions in multi-stage stochastic and adaptive optimization problems. We consider a fairly general class of multi-stage mixed integer stochastic and adaptive optimization problems and propose a good approximate solution policy with performance guarantees that depend on the geometric properties such as symmetry of the uncertainty sets. In particular, we show that a class of finitely adaptable solutions is a good approximation for both the multi-stage stochastic as well as the adaptive optimization problem. A finitely adaptable solution specifies a small set of solutions for each stage and the solution policy implements the best solution from the given set depending on the realization of the uncertain parameters in the past stages. To the best of our knowledge, these are the first approximation results for the multi-stage problem in such generality.

(joint work with Vineet Goyal, Columbia University and Andy Sun, MIT)

Bio

Dimitris Bertsimas is currently the Boeing Professor of Operations Research and the co-director of the Operations Research Center at the Massachusetts Institute of Technology. He received a BS in Electrical Engineering and Computer Science from the National Technical University of Athens, Greece, in 1985, an MS in Operations Research from MIT in 1987, and a Ph.D. in Applied Mathematics and Operations Research from MIT in 1988. He has been on the MIT faculty since 1988.

His research interests include optimization, stochastic systems, data mining, and their applications. He has co-authored more than 120 scientific papers and the books "Introduction to Linear Optimization" (with J. Tsitsiklis, Athena Scientific and Dynamic Ideas, 2008), "Data, Models and Decisions" (with R. Freund, Dynamic Ideas, 2004), and "Optimization over Integers" (with R. Weismantel, Dynamic Ideas, 2005). He is currently the department editor for Optimization at Management Science and a former area editor for Financial Engineering at Operations Research. He has supervised 42 doctoral students and is currently supervising 10 others.

He is a member of the National Academy of Engineering and has received several research awards, including the Farkas Prize (2008), the Erlang Prize (1996), the SIAM Prize in Optimization (1996), the Bodossaki Prize (1998), and the Presidential Young Investigator Award (1991-1996).

]]> Mike Alberghini 1 1356018052 2012-12-20 15:40:52 1475892100 2016-10-08 02:01:40 0 0 event

]]>
2010-09-14T12:00:00-04:00 2010-09-14T13:00:00-04:00 2010-09-14T13:00:00-04:00 2010-09-14 16:00:00 2010-09-14 17:00:00 2010-09-14 17:00:00 2010-09-14T12:00:00-04:00 2010-09-14T13:00:00-04:00 America/New_York America/New_York datetime 2010-09-14 12:00:00 2010-09-14 01:00:00 America/New_York America/New_York datetime <![CDATA[]]> Renato Monteiro, ISyE
Contact Renato Monteiro
404-894-2300

]]>
<![CDATA[Sheldon Ross, University of Southern California]]> 27215 Speaker

Sheldon Ross
Epstein Chair Professor
Industrial and Systems Engineering
University of Southern California

Abstract

Suppose there are r gamblers, with gambler i initially having a fortune of n_i. In our first model we suppose that at each stage two of the gamblers are chosen to play a game, equally likely to be won by either player, with the winner of the game receiving 1 from the loser. Any gambler whose fortune becomes 0 leaves, and this continues until there is only a single gambler left. We are interested in the probability that player i is the one left, and in the mean number of games played between specified players i and j. In our second model we suppose that all remaining players contribute 1 to a pot, which is equally likely to be won by each of them. The problem here is to determine the expected number of games played until one player has all the funds.
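The first model is easy to simulate, and the simulation can be checked against a classical fact: since each gambler's fortune is a martingale under fair games, gambler i is the last one standing with probability n_i / N, where N is the total capital. A minimal Monte Carlo sketch (the pair-selection rule, choosing two survivors uniformly at random, is one reading of the model):

```python
import random

def last_gambler_prob(fortunes, trials=2000, target=0, seed=3):
    """Monte Carlo estimate of the probability that gambler `target`
    ends up with all the money in the pairwise fair-game model."""
    random.seed(seed)
    wins = 0
    for _ in range(trials):
        f = list(fortunes)
        alive = [i for i in range(len(f)) if f[i] > 0]
        while len(alive) > 1:
            i, j = random.sample(alive, 2)   # pick two survivors to play
            if random.random() < 0.5:        # fair game: 1 unit changes hands
                f[i] += 1; f[j] -= 1
            else:
                f[i] -= 1; f[j] += 1
            alive = [k for k in alive if f[k] > 0]
        wins += (alive[0] == target)
    return wins / trials
```

With fortunes (1, 2, 3), the martingale argument predicts gambler 0 survives with probability 1/6, and the estimate lands close to that.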

]]> Mike Alberghini 1 1356018471 2012-12-20 15:47:51 1475892100 2016-10-08 02:01:40 0 0 event

]]>
2010-03-16T12:00:00-04:00 2010-03-16T13:00:00-04:00 2010-03-16T13:00:00-04:00 2010-03-16 16:00:00 2010-03-16 17:00:00 2010-03-16 17:00:00 2010-03-16T12:00:00-04:00 2010-03-16T13:00:00-04:00 America/New_York America/New_York datetime 2010-03-16 12:00:00 2010-03-16 01:00:00 America/New_York America/New_York datetime <![CDATA[]]> Ton Dieker, ISyE
Contact Ton Dieker
404-385-3140

]]>
<![CDATA[Ben Van Roy, Stanford University]]> 27215 Speaker
Ben Van Roy
Stanford University

Abstract
When used to guide decisions, linear regression analysis typically involves estimation of regression coefficients via ordinary least squares and their subsequent use in an optimization problem. When features are not chosen perfectly, it can be beneficial to account for the decision objective when computing regression coefficients. Empirical optimization does so but sacrifices performance when features are well-chosen or training data are insufficient. We propose directed regression, an efficient algorithm that combines merits of ordinary least squares and empirical optimization. We demonstrate through computational studies that directed regression generates performance gains over either alternative. We also develop a theory that motivates the algorithm.

]]> Mike Alberghini 1 1356019623 2012-12-20 16:07:03 1475892100 2016-10-08 02:01:40 0 0 event

]]>
2010-02-09T11:00:00-05:00 2010-02-09T12:00:00-05:00 2010-02-09T12:00:00-05:00 2010-02-09 16:00:00 2010-02-09 17:00:00 2010-02-09 17:00:00 2010-02-09T11:00:00-05:00 2010-02-09T12:00:00-05:00 America/New_York America/New_York datetime 2010-02-09 11:00:00 2010-02-09 12:00:00 America/New_York America/New_York datetime <![CDATA[]]> Ton Dieker, ISyE
Contact Ton Dieker
404-385-3140

]]>
<![CDATA[Egon Balas, Carnegie Mellon University]]> 27215 Speaker

Egon Balas
University Professor of Industrial Administration and Applied Mathematics
The Thomas Lord Professor of Operations Research
Carnegie Mellon University

Abstract

Intersection cuts are generated from a polyhedral cone and a convex set S whose interior contains no feasible integer point. We generalize these cuts by replacing the cone with a more general polyhedron C. The resulting generalized intersection cuts dominate the original ones. This leads to a new cutting plane paradigm under which one generates and stores the intersection points of the extreme rays of C with the boundary of S rather than the cuts themselves. These intersection points can then be used to generate deeper cuts in a non-recursive fashion.

(This talk is based on joint work with Francois Margot)
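As a concrete reference point, here is a sketch of the classical intersection cut when the convex set S is the split set {x : 0 <= x_1 <= 1}: for each extreme ray of the cone, find the step at which the ray leaves S; the reciprocals of those steps are the cut coefficients. This illustrates only the starting point; the generalized cuts of the talk, which replace the cone by a polyhedron, are not implemented here:

```python
def intersection_cut_coeffs(f, rays, lo=0.0, hi=1.0):
    """Intersection cut from the split set {x : lo <= x[0] <= hi}.
    f is the cone apex (with lo < f[0] < hi); for each extreme ray r,
    find the step t at which f + t*r crosses the boundary of the split.
    The cut is sum_j (1/t_j) * s_j >= 1 over the ray variables s_j."""
    coeffs = []
    for r in rays:
        d = r[0]
        if d > 1e-12:
            t = (hi - f[0]) / d        # ray exits through x[0] = hi
        elif d < -1e-12:
            t = (lo - f[0]) / d        # ray exits through x[0] = lo
        else:
            t = float('inf')           # ray parallel to the split: never exits
        coeffs.append(0.0 if t == float('inf') else 1.0 / t)
    return coeffs
```

The intersection points f + t_j * r_j are exactly the objects the new paradigm proposes to store and reuse for generating deeper cuts.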

Bio

Egon Balas is University Professor of Industrial Administration and Applied Mathematics, as well as the Thomas Lord Professor of Operations Research, at Carnegie Mellon University. He has a doctorate in Economic Science from the University of Brussels and a doctorate in Mathematics from the University of Paris.

Professor Balas's research interests are in mathematical programming, primarily integer and combinatorial optimization. He has played a leading role in the development of enumerative and cutting plane techniques for 0-1 programming, and is mainly known as the developer of the approach called disjunctive programming or lift-and-project. He has also developed scheduling algorithms and software. Dr. Balas has served or is serving on the editorial boards of Operations Research, Discrete Applied Mathematics, the Journal of Combinatorial Optimization, Computational Optimization and Applications, the European Journal of Operational Research, and Annals of Operations Research, among others. In 1980 Dr. Balas received the US Senior Scientist Award of the Alexander von Humboldt Foundation; in 1995 he received the John von Neumann Theory Prize of INFORMS; and in 2001 he was the first American to be awarded the EURO Gold Medal of the European Association of Operational Research Societies.

]]> Mike Alberghini 1 1356017883 2012-12-20 15:38:03 1475892100 2016-10-08 02:01:40 0 0 event

]]>
2010-10-12T12:00:00-04:00 2010-10-12T13:00:00-04:00 2010-10-12T13:00:00-04:00 2010-10-12 16:00:00 2010-10-12 17:00:00 2010-10-12 17:00:00 2010-10-12T12:00:00-04:00 2010-10-12T13:00:00-04:00 America/New_York America/New_York datetime 2010-10-12 12:00:00 2010-10-12 01:00:00 America/New_York America/New_York datetime <![CDATA[]]> Renato Monteiro, ISyE
Contact Renato Monteiro
404-894-2300

]]>
<![CDATA[Yurii Nesterov]]> 27215 Speaker

Yurii Nesterov
CORE/INMA, 
Catholic University of Louvain (UCL), Belgium

Abstract

In this talk we describe new methods for solving huge-scale optimization problems. For problems of this size, even the simplest full-dimensional vector operations are very expensive. Hence, we suggest applying an optimization technique based on random partial updates of the decision variables. For these methods, we prove global estimates for the rate of convergence. Surprisingly enough, for certain classes of objective functions, our results are better than the standard worst-case bounds for deterministic algorithms. We present constrained and unconstrained versions of the method, and an accelerated variant. Our numerical tests confirm the high efficiency of this technique on problems of very large size.
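The flavor of a random partial update can be seen on a tiny quadratic. The sketch below is generic randomized coordinate descent, not the specific methods or the accelerated variant of the talk: each iteration touches only one row of the matrix, which is what makes such updates affordable at huge scale.

```python
import random

def random_coordinate_descent(A, b, iters=5000, seed=2):
    """Minimize f(x) = 0.5 x'Ax - b'x for symmetric positive definite A
    by exact minimization along one randomly chosen coordinate per step.
    Only row i of A is read in an iteration (a cheap partial update)."""
    random.seed(seed)
    n = len(b)
    x = [0.0] * n
    for _ in range(iters):
        i = random.randrange(n)
        # partial derivative of f at x with respect to coordinate i
        g = sum(A[i][j] * x[j] for j in range(n)) - b[i]
        x[i] -= g / A[i][i]      # exact 1-D minimizer along e_i
    return x
```

At convergence x solves Ax = b; for A = [[4, 1], [1, 3]] and b = [1, 2] the minimizer is (1/11, 7/11).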

The paper can be downloaded as CORE Discussion Paper 2010/2:

http://www.uclouvain.be/en-310431.html

]]> Mike Alberghini 1 1356018297 2012-12-20 15:44:57 1475892100 2016-10-08 02:01:40 0 0 event

]]>
2010-04-01T12:00:00-04:00 2010-04-01T13:00:00-04:00 2010-04-01T13:00:00-04:00 2010-04-01 16:00:00 2010-04-01 17:00:00 2010-04-01 17:00:00 2010-04-01T12:00:00-04:00 2010-04-01T13:00:00-04:00 America/New_York America/New_York datetime 2010-04-01 12:00:00 2010-04-01 01:00:00 America/New_York America/New_York datetime <![CDATA[]]> Renato Monteiro, ISyE
Contact Renato Monteiro
404-894-2300

]]>
<![CDATA[Mike Harrison, Stanford University]]> 27215 Speaker
Michael Harrison
Adams Distinguished Professor of Management 
Stanford University

Abstract
Motivated by applications in financial services, we consider the following customized pricing problem. A seller of some good or service (like auto loans or small business loans) confronts a sequence of potential customers numbered 1, 2, …, T. These customers are drawn at random from a population characterized by a price-response function ρ(p). That is, if the seller offers price p, then the probability of a successful sale is ρ(p). The profit realized from a successful sale is π(p) = p − c, where c > 0 is known.

If the price-response function ρ(·) were also known, then the problem of finding a price p* to maximize ρ(p)π(p) would be simple, and the seller would offer price p* to each of the T customers. We consider the more complicated case where ρ(·) is fixed but initially unknown: roughly speaking, the seller wants to choose prices sequentially so as to maximize the total profit earned from the T potential customers; each successive choice involves a trade-off between refined estimation of the unknown price-response function (learning) and immediate profit (earning).

* Joint work with Bora Keskin and Assaf Zeevi
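The learning-versus-earning tension can be caricatured with an epsilon-greedy policy over a finite price grid. This is not the policy analyzed in the talk, only a minimal sketch of the trade-off; the demand curve, price grid, and exploration rate are hypothetical:

```python
import math
import random

def price_and_learn(true_rho, prices, c=1.0, T=5000, eps=0.1, seed=5):
    """Epsilon-greedy pricing: with probability eps offer a random price
    (learning); otherwise offer the price with the best empirical profit
    estimate so far (earning). Returns (total profit, best-looking price)."""
    random.seed(seed)
    sales = {p: 0 for p in prices}
    offers = {p: 0 for p in prices}
    profit = 0.0
    for _ in range(T):
        if random.random() < eps or min(offers.values()) == 0:
            p = random.choice(prices)            # explore
        else:                                     # exploit current estimate
            p = max(prices, key=lambda q: (q - c) * sales[q] / offers[q])
        offers[p] += 1
        if random.random() < true_rho(p):         # customer buys?
            sales[p] += 1
            profit += p - c
    best = max(prices, key=lambda q: (q - c) * sales[q] / max(offers[q], 1))
    return profit, best
```

For example, with true_rho(p) = exp(-p/2) and c = 1, the profit ρ(p)π(p) is maximized at p = 3, so a good policy should concentrate its offers near that price as T grows.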

]]> Mike Alberghini 1 1356019487 2012-12-20 16:04:47 1475892100 2016-10-08 02:01:40 0 0 event

]]>
2010-02-18T11:00:00-05:00 2010-02-18T12:00:00-05:00 2010-02-18T12:00:00-05:00 2010-02-18 16:00:00 2010-02-18 17:00:00 2010-02-18 17:00:00 2010-02-18T11:00:00-05:00 2010-02-18T12:00:00-05:00 America/New_York America/New_York datetime 2010-02-18 11:00:00 2010-02-18 12:00:00 America/New_York America/New_York datetime <![CDATA[]]> Ton Dieker, ISyE
Contact Ton Dieker
404-385-3140

]]>
<![CDATA[Costis Maglaras, Columbia University]]> 27215 Speaker

Costis Maglaras
David and Lyn Silfen Professor of Business
Decision, Risk and Operations Division,
Graduate School of Business,
Columbia University

Abstract

The first part of the talk will offer a brief overview of algorithmic trading in the US equities market, touching, in passing, upon the topic of high-frequency trading and the nature of current market structure. The emphasis will be on highlighting different facets of this area and discussing corresponding quantitative research problems. The second half of the talk describes a queueing model of limit order book dynamics, and explores questions of optimal limit order placement, market impact, and optimal trade execution.

Bio

Costis Maglaras is the David and Lyn Silfen Professor of Business at Columbia University. His research focuses on quantitative pricing and revenue management, the economics, design and operations of service systems, and financial engineering. He is the author of many research articles spanning the theory and application of stochastic modeling in a variety of fields, more recently in pricing, risk management and valuation of multi-unit real estate portfolios, and in the design of portfolio trading systems and algorithms. He holds editorial positions in many of the flagship journals of his fields of study, he is the recipient of several research and teaching awards, and he teaches and serves as faculty director for the executive education course on Risk Management offered by Columbia Business School.

]]> Mike Alberghini 1 1356017716 2012-12-20 15:35:16 1475892100 2016-10-08 02:01:40 0 0 event The first part of the talk will offer a brief overview of algorithmic trading in the US equities market, touching, in passing, upon the topic of high-frequency trading and the nature of current market structure. The emphasis will be on highlighting different facets of this area and discussing corresponding quantitative research problems. The second half of the talk describes a queueing model of limit order book dynamics, and explores questions of optimal limit order placement, market impact, and optimal trade execution.

]]>
2010-11-02T12:00:00-04:00 2010-11-02T13:00:00-04:00 2010-11-02T13:00:00-04:00 2010-11-02 16:00:00 2010-11-02 17:00:00 2010-11-02 17:00:00 2010-11-02T12:00:00-04:00 2010-11-02T13:00:00-04:00 America/New_York America/New_York datetime 2010-11-02 12:00:00 2010-11-02 01:00:00 America/New_York America/New_York datetime <![CDATA[]]> Ton Dieker, ISyE
Contact Ton Dieker
404-385-3140

]]>
<![CDATA[Adrian Lewis, Cornell University]]> 27215 Speaker

Adrian Lewis
Cornell University

Abstract

Concrete optimization problems, while often nonsmooth, are not pathologically so. The class of "semi-algebraic" sets and functions - those arising from polynomial inequalities - nicely exemplifies nonsmoothness in practice. Semi-algebraic sets (and their generalizations) are common, easy to recognize, and richly structured, supporting powerful variational properties. In particular I will discuss a generic property of such sets - partial smoothness - and its relationship with a proximal algorithm for nonsmooth composite minimization, a versatile model for practical optimization.
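As one concrete instance of proximal schemes for nonsmooth composite minimization, the standard proximal-gradient iteration for minimizing f(x) + g(x) can be sketched as follows. This is a generic illustration under an l1 regularizer, not the specific algorithm of the talk:

```python
def soft_threshold(v, t):
    # Proximal operator of t * ||x||_1, applied elementwise:
    # shrink each coordinate toward zero by t.
    return [max(abs(x) - t, 0.0) * (1.0 if x > 0 else -1.0) for x in v]

def proximal_gradient(grad_f, prox_g, x0, step, iters=200):
    # Generic proximal-gradient iteration for minimizing f(x) + g(x):
    # a gradient step on the smooth part f, then the proximal map of g.
    x = list(x0)
    for _ in range(iters):
        g = grad_f(x)
        y = [xi - step * gi for xi, gi in zip(x, g)]
        x = prox_g(y, step)
    return x

# Illustrative lasso-type problem: minimize 0.5*||x - b||^2 + lam*||x||_1,
# whose exact solution is soft_threshold(b, lam).
b, lam = [3.0, -0.2, 1.0], 0.5
x = proximal_gradient(lambda x: [xi - bi for xi, bi in zip(x, b)],
                      lambda y, s: soft_threshold(y, s * lam),
                      [0.0, 0.0, 0.0], step=1.0)
```

With unit step this toy problem reaches the closed-form solution soft_threshold(b, lam) in one pass; the nonsmoothness of the l1 term is handled entirely through its proximal map.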

Bio

Adrian S. Lewis was born in England in 1962. He is a Professor at Cornell University in the School of Operations Research and Industrial Engineering. Following his B.A., M.A., and Ph.D. degrees from Cambridge, and Research Fellowships at Queens' College, Cambridge and Dalhousie University, Canada, he worked in Canada at the University of Waterloo (1989-2001) and Simon Fraser University (2001-2004). He is an Associate Editor of the SIAM Journal on Optimization, Mathematics of Operations Research, and the SIAM/MPS Book Series on Optimization, and is a Co-Editor for Mathematical Programming. He received the 1995 Aisenstadt Prize, from the Canadian Centre de Recherches Mathematiques, the 2003 Lagrange Prize for Continuous Optimization from SIAM and the Mathematical Programming Society, and an Outstanding Paper Award from SIAM in 2005. He co-authored "Convex Analysis and Nonlinear Optimization" with J.M. Borwein.

Lewis' research concerns variational analysis and nonsmooth optimization, with a particular interest in optimization problems involving eigenvalues.

]]> Mike Alberghini 1 1356018174 2012-12-20 15:42:54 1475892100 2016-10-08 02:01:40 0 0 event Concrete optimization problems, while often nonsmooth, are not pathologically so. The class of "semi-algebraic" sets and functions - those arising from polynomial inequalities - nicely exemplifies nonsmoothness in practice. Semi-algebraic sets (and their generalizations) are common, easy to recognize, and richly structured, supporting powerful variational properties. In particular I will discuss a generic property of such sets - partial smoothness - and its relationship with a proximal algorithm for nonsmooth composite minimization, a versatile model for practical optimization.

]]>
2010-04-06T12:00:00-04:00 2010-04-06T13:00:00-04:00 2010-04-06T13:00:00-04:00 2010-04-06 16:00:00 2010-04-06 17:00:00 2010-04-06 17:00:00 2010-04-06T12:00:00-04:00 2010-04-06T13:00:00-04:00 America/New_York America/New_York datetime 2010-04-06 12:00:00 2010-04-06 01:00:00 America/New_York America/New_York datetime <![CDATA[]]> Renato Monteiro, ISyE
Contact Renato Monteiro
404-894-2300

]]>
<![CDATA[Yinyu Ye, Stanford University]]> 27215 Speaker

Yinyu Ye

Professor of Management Science and Engineering
and, by courtesy, Electrical Engineering

Affiliation: Department of Management Science and Engineering
Stanford University

Abstract

A natural optimization model that formulates many online resource allocation and revenue management problems is the online linear program (LP) where the constraint matrix is revealed column by column along with the objective function. We provide a near-optimal algorithm for this surprisingly general class of online problems under the assumption of random order of arrival and some mild conditions on the size of the LP right-hand-side input. Our learning-based algorithm works by dynamically updating a threshold price vector at geometric time intervals, where the dual prices learned from revealed columns in the previous period are used to determine the sequential decisions in the current period. Our algorithm has a feature of "learning by doing", and the prices are updated at a carefully chosen pace that is neither too fast nor too slow. In particular, our algorithm doesn't assume any distribution information on the input itself, and is thus robust to data uncertainty and variations due to its dynamic learning capability. Applications of our algorithm include many online multi-resource allocation and multi-product revenue management problems such as online routing and packing, online combinatorial auctions, adwords matching, inventory control and yield management.

This is joint work with Shipra Agrawal and Zizhuo Wang.
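The dynamic dual-price updating can be illustrated with a deliberately simplified one-resource version (a hedged sketch, not the paper's algorithm): at geometric checkpoints, re-estimate a threshold price from the columns seen so far and accept later arrivals that clear it.

```python
import random

def online_allocate(revenues, capacity):
    # One-resource sketch of dual-price learning: at geometric checkpoints
    # t = 1, 3, 7, ..., T/2, re-estimate the threshold price from the
    # revenues seen so far (the marginal value at the history-scaled
    # budget), then accept arrivals whose revenue clears the threshold.
    T = len(revenues)
    checkpoints, t = set(), T
    while t > 1:
        t //= 2
        checkpoints.add(t)
    price, used, profit = 0.0, 0, 0.0
    for j, r in enumerate(revenues, start=1):
        if j in checkpoints:
            seen = sorted(revenues[:j], reverse=True)
            k = max(1, int(capacity * j / T))   # budget scaled to history
            price = seen[min(k, len(seen)) - 1]
        if r >= price and used < capacity:
            used += 1
            profit += r
    return profit, used
```

The full algorithm works with a dual price vector obtained from the partial LP over all resources; this scalar version only conveys the "learning by doing" pattern of geometrically spaced price updates.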

Bio

Yinyu Ye received the B.S. degree in System Engineering from the Huazhong University of Science and Technology, Wuhan, China, and the M.S. and Ph.D. degrees in Management Science & Engineering from Stanford University, Stanford. Currently, he is a full Professor of Management Science and Engineering and of the Institute for Computational and Mathematical Engineering, and the Director of the MS&E Industrial Affiliates Program, Stanford University. His current research interests include Continuous and Discrete Optimization, Mathematical Programming, Algorithm Design and Analysis, Computational Game/Market Equilibrium, Metric Distance Geometry, Graph Realization, Dynamic Resource Allocation, and Stochastic and Robust Decision Making, etc.

The following is a list of some of his main achievements:

]]> Mike Alberghini 1 1356019384 2012-12-20 16:03:04 1475892100 2016-10-08 02:01:40 0 0 event A natural optimization model that formulates many online resource allocation and revenue management problems is the online linear program (LP) where the constraint matrix is revealed column by column along with the objective function. We provide a near-optimal algorithm for this surprisingly general class of online problems under the assumption of random order of arrival and some mild conditions on the size of the LP right-hand-side input. Our learning-based algorithm works by dynamically updating a threshold price vector at geometric time intervals, where the dual prices learned from revealed columns in the previous period are used to determine the sequential decisions in the current period. Our algorithm has a feature of "learning by doing", and the prices are updated at a carefully chosen pace that is neither too fast nor too slow. In particular, our algorithm doesn't assume any distribution information on the input itself, and is thus robust to data uncertainty and variations due to its dynamic learning capability. Applications of our algorithm include many online multi-resource allocation and multi-product revenue management problems such as online routing and packing, online combinatorial auctions, adwords matching, inventory control and yield management.

]]>
2010-02-23T11:00:00-05:00 2010-02-23T12:00:00-05:00 2010-02-23T12:00:00-05:00 2010-02-23 16:00:00 2010-02-23 17:00:00 2010-02-23 17:00:00 2010-02-23T11:00:00-05:00 2010-02-23T12:00:00-05:00 America/New_York America/New_York datetime 2010-02-23 11:00:00 2010-02-23 12:00:00 America/New_York America/New_York datetime <![CDATA[]]> Renato Monteiro, ISyE
Contact Renato Monteiro
404-894-2300

]]>
<![CDATA[Mark Squillante, IBM Research]]> 27215 Speaker

Mark S. Squillante,
IBM Research

Abstract

In this talk we present an integrated suite of operations research models and methods that supports the effective and efficient management and planning of human capital supply chains by addressing distinct features and characteristics of human talent and skills. This consists of solutions for: (1) the statistical forecasting of future demand and resource requirements; (2) a new form of risk-based stochastic resource capacity planning; (3) the stochastic modeling and optimization/control of supply evolutionary dynamics over time; (4) a new form of optimal multi-skill supply-demand matching; and (5) the stochastic optimization of business decisions to manage resource shortages and overages. These solutions include contributions in the areas of stochastic models and stochastic optimization/control. The suite of models and methods constitutes an end-to-end solution that is deployed as an important part of the human capital management and planning process within IBM. 

Bio

Mark S. Squillante is a Research Staff Member in the Mathematical Sciences Department at the IBM Thomas J. Watson Research Center, where he leads the Applied Probability and Stochastic Optimization team. His research interests concern mathematical foundations of the analysis, modeling and optimization of the design and control of stochastic systems, including stochastic processes, applied probability, stochastic optimization and control, and their applications. He is the author of many research articles across these areas, and has received several internal (IBM) and external research awards. He is a Fellow of ACM and IEEE, and currently serves on the editorial boards of Operations Research, Stochastic Models and Performance Evaluation.

]]> Mike Alberghini 1 1356017167 2012-12-20 15:26:07 1475892100 2016-10-08 02:01:40 0 0 event In this talk we present an integrated suite of operations research models and methods that supports the effective and efficient management and planning of human capital supply chains by addressing distinct features and characteristics of human talent and skills. This consists of solutions for: (1) the statistical forecasting of future demand and resource requirements; (2) a new form of risk-based stochastic resource capacity planning; (3) the stochastic modeling and optimization/control of supply evolutionary dynamics over time; (4) a new form of optimal multi-skill supply-demand matching; and (5) the stochastic optimization of business decisions to manage resource shortages and overages. These solutions include contributions in the areas of stochastic models and stochastic optimization/control. The suite of models and methods constitutes an end-to-end solution that is deployed as an important part of the human capital management and planning process within IBM.

]]>
2010-11-16T11:00:00-05:00 2010-11-16T12:00:00-05:00 2010-11-16T12:00:00-05:00 2010-11-16 16:00:00 2010-11-16 17:00:00 2010-11-16 17:00:00 2010-11-16T11:00:00-05:00 2010-11-16T12:00:00-05:00 America/New_York America/New_York datetime 2010-11-16 11:00:00 2010-11-16 12:00:00 America/New_York America/New_York datetime <![CDATA[]]> Ton Dieker, ISyE
Contact Ton Dieker
404-385-3140

]]>
<![CDATA[On Recurrence and Transience in Heavy-Tailed Generalized Semi-Markov Processes]]> 27187 TITLE:  On Recurrence and Transience in Heavy-Tailed Generalized Semi-Markov Processes

SPEAKER:  Peter J. Haas, IBM Research

ABSTRACT:

The generalized semi-Markov process (GSMP) is the usual model for the underlying stochastic process of a complex discrete-event system. It is important to understand fundamental behavioral properties of the GSMP model, such as the conditions under which the states of a GSMP are recurrent. For example, recurrence is necessary for the validity of steady-state simulation output analysis methods such as the regenerative method, spectral method, and the method of batch means. We review some sufficient conditions for recurrence in irreducible finite-state GSMPs. These conditions include requirements on the "clocks" that govern the occurrence times of state transitions. For example, each clock-setting distribution must have finite mean. We then show that, in contrast to ordinary semi-Markov processes, an
irreducible finite-state GSMP can have transient states in the presence of multiple clock-setting distributions with heavy tails. (Joint work with Peter Glynn.)


]]> Anita Race 1 1292408891 2010-12-15 10:28:11 1475891616 2016-10-08 01:53:36 0 0 event 2010-12-16T11:00:00-05:00 2010-12-16T12:00:00-05:00 2010-12-16T12:00:00-05:00 2010-12-16 16:00:00 2010-12-16 17:00:00 2010-12-16 17:00:00 2010-12-16T11:00:00-05:00 2010-12-16T12:00:00-05:00 America/New_York America/New_York datetime 2010-12-16 11:00:00 2010-12-16 12:00:00 America/New_York America/New_York datetime <![CDATA[]]>
<![CDATA[Winter Break Begins]]> 27215 Mike Alberghini 1 1292261576 2010-12-13 17:32:56 1475891616 2016-10-08 01:53:36 0 0 event 2010-12-24T00:00:00-05:00 2010-12-24T00:00:00-05:00 2010-12-24T00:00:00-05:00 2010-12-24 05:00:00 2010-12-24 05:00:00 2010-12-24 05:00:00 2010-12-24T00:00:00-05:00 2010-12-24T00:00:00-05:00 America/New_York America/New_York datetime 2010-12-24 12:00:00 2010-12-24 12:00:00 America/New_York America/New_York datetime <![CDATA[]]> <![CDATA[Operations Research and Public Health]]> 27187 TITLE:  Operations Research and Public Health

SPEAKER: Margaret L. Brandeau

ABSTRACT:

What is the most cost-effective way to use limited HIV prevention and treatment resources?  How should the Centers for Disease Control and Prevention revise national immunization recommendations so that gaps in vaccination coverage will be filled in a cost-effective manner?  To what extent should local communities stockpile antibiotics for response to a potential bioterror attack?  How can humanitarian relief organizations manage their inventories most effectively?  This talk will describe examples from past and ongoing model-based analyses of public health policy questions.  We also provide perspectives on key elements of a successful policy analysis and discuss ways in which such analysis can influence policy.

 About the Speaker

 Margaret Brandeau is Professor of Management Science and Engineering and Professor of Medicine (by Courtesy) at Stanford University.  Her research focuses on the development of applied mathematical and economic models to support health policy decisions.  Her recent work has focused on HIV prevention and treatment programs, programs to control the spread of Hepatitis B virus, and evaluating preparedness plans for bioterror response.  She is a Fellow of the Institute for Operations Research and Management Science (INFORMS), and has received the President’s Award from INFORMS (recognizing important contributions to the welfare of society), the Pierskalla Prize from INFORMS (for research excellence in health care management science), and a Presidential Young Investigator Award from the National Science Foundation, among other awards.  Professor Brandeau earned a BS in Mathematics and an MS in Operations Research from MIT, and a PhD in Engineering-Economic Systems from Stanford University.

]]> Anita Race 1 1291025960 2010-11-29 10:19:20 1475891612 2016-10-08 01:53:32 0 0 event 2010-11-30T12:00:00-05:00 2010-11-30T13:30:00-05:00 2010-11-30T13:30:00-05:00 2010-11-30 17:00:00 2010-11-30 18:30:00 2010-11-30 18:30:00 2010-11-30T12:00:00-05:00 2010-11-30T13:30:00-05:00 America/New_York America/New_York datetime 2010-11-30 12:00:00 2010-11-30 01:30:00 America/New_York America/New_York datetime <![CDATA[]]>
<![CDATA[Serologic surveillance to monitor infection attack rates and severity of an influenza pandemic in real-time]]> 27187 TITLE: Serologic surveillance to monitor infection attack rates and severity
of an influenza pandemic in real-time

SPEAKER:  Professor Joseph Wu

ABSTRACT:

Early estimation of the transmissibility and severity of an emerging influenza pandemic is an urgent public health priority. This is challenging because many influenza infections are subclinical.
Population-based serologic surveillance allows accurate estimates of infection attack rates (IAR) and severity. During 2009, we tested more than 17,800 serological specimens in Hong Kong throughout the first wave of the H1N1 pandemic. Using these data we estimated
that the basic reproductive number was 1.38 and that 0.6% of infections led to hospitalization. We developed a novel statistical method for real-time serologic surveillance and estimated that about 1,000 specimens per week would allow accurate estimates of IAR and severity as soon as the true IAR has reached 2%. Serologic monitoring should be considered in updated pandemic plans.
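A back-of-envelope binomial calculation (ignoring test sensitivity and specificity, and far cruder than the statistical method developed in the talk) illustrates why roughly 1,000 specimens per week suffice once the true IAR reaches 2%:

```python
import math

def iar_standard_error(p, n):
    # Binomial standard error of a seroprevalence-based IAR estimate
    # from n specimens when the true infection attack rate is p.
    return math.sqrt(p * (1.0 - p) / n)

# At a true IAR of 2%, one week of ~1,000 specimens pins the estimate
# down to within roughly +/- 0.9% at the 95% level (1.96 standard errors).
se = iar_standard_error(0.02, 1000)
half_width = 1.96 * se
```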

]]> Anita Race 1 1291279669 2010-12-02 08:47:49 1475891612 2016-10-08 01:53:32 0 0 event 2010-12-08T11:00:00-05:00 2010-12-08T12:00:00-05:00 2010-12-08T12:00:00-05:00 2010-12-08 16:00:00 2010-12-08 17:00:00 2010-12-08 17:00:00 2010-12-08T11:00:00-05:00 2010-12-08T12:00:00-05:00 America/New_York America/New_York datetime 2010-12-08 11:00:00 2010-12-08 12:00:00 America/New_York America/New_York datetime <![CDATA[]]>
<![CDATA[Pricing Contingent Capital]]> 27187 TITLE:  Pricing Contingent Capital

SPEAKER:  Paul Glasserman

ABSTRACT:

Among the solutions proposed to the problem of banks "too big to fail" is contingent capital in the form of debt that converts to equity when a bank's regulatory capital ratio falls below a threshold. We analyze the dynamics of such a security with continuous conversion and derive closed form expressions for its value when the firm's assets are modeled as geometric Brownian motion and the conversion trigger is an asset-based capital ratio. A key step in the analysis is an explicit formula for the fraction of equity held by the original holders of the contingent capital debt as a function of the maximum drop in asset value. We contrast this analysis with the case of a market-based (rather than accounting-based) conversion trigger. This is joint work with Behzad Nouri.

Paul Glasserman is the Jack R. Anderson Professor of Business at Columbia Business School, where he served as senior vice dean in 2004-2008. Since 2008, he has also been an academic visitor at the Federal Reserve Bank of New York. His research focuses on derivative securities, risk management, stochastic models, and simulation. His publications include the book Monte Carlo Methods in Financial Engineering, which received the 2005 I-Sim Outstanding Simulation Publication Award and the 2006 Lanchester Prize. He also received Risk magazine's 2007 Quant of the Year Award.


]]> Anita Race 1 1289559355 2010-11-12 10:55:55 1475891608 2016-10-08 01:53:28 0 0 event 2010-11-23T11:00:00-05:00 2010-11-23T12:00:00-05:00 2010-11-23T12:00:00-05:00 2010-11-23 16:00:00 2010-11-23 17:00:00 2010-11-23 17:00:00 2010-11-23T11:00:00-05:00 2010-11-23T12:00:00-05:00 America/New_York America/New_York datetime 2010-11-23 11:00:00 2010-11-23 12:00:00 America/New_York America/New_York datetime <![CDATA[]]>
<![CDATA[Bringing Awareness to Haiti Aid Relief]]> 27187 TITLE: Bringing Awareness to Haiti Aid Relief

At the invitation of the Georgia Tech Center for Health and Humanitarian Logistics, Dr. John Hardman, president and CEO of the Carter Center, and François Grünewald, director of URD (Urgency, Recovery and Development), will discuss the situation in Haiti and what still needs to be done to ensure recovery.

This lecture will be moderated by Reginald DesRoches, professor and associate chair of Civil & Environmental Engineering and key technical leader in the response to the Haiti Earthquake.

It will be followed by the screening of a series of documentaries on Haiti.

To register: http://www.france-atlanta.org/spip.php?article50

For a complete listing of events taking place on the Georgia Tech campus, visit http://www.global.gatech.edu/france-atlanta/. For other lectures, workshops, conferences, and symposiums associated with the full France-Atlanta collaboration, visit http://www.france-atlanta.org/

]]> Anita Race 1 1290095772 2010-11-18 15:56:12 1475891608 2016-10-08 01:53:28 0 0 event 2010-12-06T16:00:00-05:00 2010-12-06T18:00:00-05:00 2010-12-06T18:00:00-05:00 2010-12-06 21:00:00 2010-12-06 23:00:00 2010-12-06 23:00:00 2010-12-06T16:00:00-05:00 2010-12-06T18:00:00-05:00 America/New_York America/New_York datetime 2010-12-06 04:00:00 2010-12-06 06:00:00 America/New_York America/New_York datetime <![CDATA[]]>
<![CDATA[Regulating Local Monopolies in Electricity Transmission: A Real-world Application of the StoNED Method]]> 27187 TITLE: Regulating Local Monopolies in Electricity Transmission: A Real-world Application of the StoNED Method

SPEAKER:  Andrew Johnson

ABSTRACT:

The Finnish electricity market has a competitive energy generation market and a monopolistic transmission system. To regulate the local monopoly power of network operators, the government regulator uses frontier estimation methods (e.g., Stochastic Frontier Analysis (SFA) and nonparametric Data Envelopment Analysis (DEA)) to identify excessive transmission costs, taking into account outputs and the operating environment. We describe the new regulatory system developed for the Finnish regulator, which is based on the method Stochastic Non-smooth Envelopment of Data (StoNED) and utilizes panel data to detect the excessive costs from random noise.

The literature of productive efficiency analysis is divided into two main branches: the parametric SFA and nonparametric DEA. StoNED is a new frontier estimation framework that combines the virtues of both DEA and SFA in a unified approach to frontier analysis. StoNED follows the SFA approach by including a stochastic component. In contrast to SFA, however, the proposed method does not make any prior assumptions about the functional form of the production function. In that respect, StoNED is similar to DEA, and only imposes free disposability, convexity, and some returns to scale specification.

The main advantage of the StoNED approach over the parametric SFA approach is its independence from ad hoc parametric assumptions about the functional form of the production function (or cost/distance functions). In contrast to the flexible functional forms, one can impose monotonicity, concavity and homogeneity constraints without sacrificing the flexibility of the regression function. Additionally, the main advantage of StoNED over the nonparametric DEA approach is robustness to outliers, data errors, and other stochastic noise in the data. In DEA the frontier is spanned by a relatively small number of efficient firms; in our method, however, all observations influence the shape of the frontier. Also many standard tools from parametric regression such as goodness of fit statistics and statistical tests are directly applicable in our approach. This is collaborative work with Timo Kuosmanen of Aalto University in Finland.

 

Andrew L Johnson is an Assistant Professor in the Department of Industrial and Systems Engineering at Texas A&M University. He obtained his B.S. in Industrial and Systems Engineering from Virginia Tech and his M.S. and Ph.D. from the H. Milton Stewart School of Industrial and Systems Engineering at Georgia Tech. His research interests include productivity and efficiency measurement, warehouse design and operations, material handling and mechanism design. He is a member of INFORMS, the National Eagle Scout Association, and the German Club of Virginia Tech.

]]> Anita Race 1 1289398215 2010-11-10 14:10:15 1475891604 2016-10-08 01:53:24 0 0 event 2010-11-19T11:00:00-05:00 2010-11-19T12:00:00-05:00 2010-11-19T12:00:00-05:00 2010-11-19 16:00:00 2010-11-19 17:00:00 2010-11-19 17:00:00 2010-11-19T11:00:00-05:00 2010-11-19T12:00:00-05:00 America/New_York America/New_York datetime 2010-11-19 11:00:00 2010-11-19 12:00:00 America/New_York America/New_York datetime <![CDATA[]]>
<![CDATA[T-statistic based correlation and heterogeneity robust inference, with applications to risk, inequality and concentration measurement]]> 27187 TITLE: T-statistic based correlation and heterogeneity robust inference, with applications to risk, inequality and concentration measurement

SPEAKER: Rustam Ibragimov

ABSTRACT:

Many risk, inequality, poverty and concentration measures are extremely sensitive to outliers, dependence, heterogeneity and heavy tails. In this paper we focus on robust measurement of risk, inequality, poverty and concentration under heterogeneity, dependence and heavy-tailedness of largely unknown form using the recent results on t-statistic based heterogeneity and correlation robust inference in Ibragimov and Muller (2007). The robust large sample inference on risk, inequality, poverty and concentration measures is conducted as follows: partition the observations into q>=2 groups, calculate the empirical measures for each group and conduct a standard test with the resulting q estimators of the population measures.

Numerical results confirm the appealing properties of the t-statistic based robust inference method in this context, and its applicability to many widely used risk, inequality, poverty and concentration measures, including Sharpe ratio; value at risk and expected shortfall; Gini coefficient; Theil index, mean logarithmic deviation and generalized entropy measures; Atkinson measures; coefficient of variation and Herfindahl-Hirschman index; head count, poverty gap and squared poverty gap indices and other Foster-Greer-Thorbecke measures of poverty, among others. The results discussed in the paper further indicate a strong link between the t-statistic based robust inference methods and stochastic analogues of the majorization conditions that are usually imposed on risk, inequality, poverty and concentration measures related to
self-normalized sums or their transforms, as in the case of the Sharpe ratio, coefficient of variation and Herfindahl-Hirschman index.
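The group-based procedure described in the abstract can be sketched in a few lines; the data series and the choice of estimator below are illustrative stand-ins, not from the paper:

```python
import math
import statistics

def group_t_statistic(data, q, estimator):
    # Partition the sample into q groups, apply the estimator to each group,
    # and form a standard one-sample t-statistic from the q group estimates.
    groups = [data[i::q] for i in range(q)]
    estimates = [estimator(g) for g in groups]
    m = statistics.mean(estimates)
    s = statistics.stdev(estimates)
    return m / (s / math.sqrt(q))

# Example: test that the population mean (a stand-in for, e.g., a Sharpe
# ratio or Gini coefficient estimator) differs from zero with q = 8 groups;
# compare |t| against the Student-t critical value with q - 1 df (2.365 at
# the 5% level for q = 8).
data = [((-1) ** i) * (1.0 / (1 + i)) + 0.5 for i in range(400)]
t_stat = group_t_statistic(data, 8, statistics.mean)
```

The robustness comes from comparing the q group estimates to each other rather than relying on a variance formula for the full sample, which heavy tails or dependence of unknown form would invalidate.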

]]> Anita Race 1 1289313883 2010-11-09 14:44:43 1475891604 2016-10-08 01:53:24 0 0 event 2010-11-11T11:00:00-05:00 2010-11-11T12:00:00-05:00 2010-11-11T12:00:00-05:00 2010-11-11 16:00:00 2010-11-11 17:00:00 2010-11-11 17:00:00 2010-11-11T11:00:00-05:00 2010-11-11T12:00:00-05:00 America/New_York America/New_York datetime 2010-11-11 11:00:00 2010-11-11 12:00:00 America/New_York America/New_York datetime <![CDATA[]]>
<![CDATA[Accurate Emulators for Large-Scale Computer Experiments]]> 27187 TITLE: Accurate Emulators for Large-Scale Computer Experiments

SPEAKER:  Ben Haaland

ABSTRACT:

A multistep procedure is introduced to statisticians for modeling large-scale computer experiments. In practice, the procedure shows substantial improvements in overall accuracy. We introduce the terms nominal and numeric error and decompose the overall error of an emulator into nominal and numeric portions. For the multistep procedure, we develop bounds on the numeric and nominal error. These bounds show that substantial gains in overall accuracy can be attained with the multistep approach.

]]> Anita Race 1 1289314010 2010-11-09 14:46:50 1475891604 2016-10-08 01:53:24 0 0 event 2010-11-12T11:00:00-05:00 2010-11-12T12:00:00-05:00 2010-11-12T12:00:00-05:00 2010-11-12 16:00:00 2010-11-12 17:00:00 2010-11-12 17:00:00 2010-11-12T11:00:00-05:00 2010-11-12T12:00:00-05:00 America/New_York America/New_York datetime 2010-11-12 11:00:00 2010-11-12 12:00:00 America/New_York America/New_York datetime <![CDATA[]]>
<![CDATA[Exact Simulation of the Equilibrium Distribution of Reflected Stochastic Networks with Levy Input]]> 27187 TITLE: Exact Simulation of the Equilibrium Distribution of Reflected Stochastic
Networks with Levy Input

SPEAKER:  Jose Blanchet

ABSTRACT:

Reflected stochastic networks arise in the analysis of a large class of queueing systems. The most popular model of this type is perhaps reflected Brownian motion, which arises in the heavy-traffic analysis of generalized Jackson networks. In this talk we discuss Monte Carlo
simulation strategies for the steady-state analysis of reflected stochastic networks. In particular, we show how to exactly (i.e. without bias) simulate the equilibrium distribution of a reflected stochastic network with compound Poisson input and how to provide samples that are
close (with explicit and controlled error bounds) to both the transient and the steady-state distribution of reflected Brownian motion in the positive orthant. (Joint work with Xinyun Chen.)

]]> Anita Race 1 1289398350 2010-11-10 14:12:30 1475891604 2016-10-08 01:53:24 0 0 event 2010-11-15T11:00:00-05:00 2010-11-15T12:00:00-05:00 2010-11-15T12:00:00-05:00 2010-11-15 16:00:00 2010-11-15 17:00:00 2010-11-15 17:00:00 2010-11-15T11:00:00-05:00 2010-11-15T12:00:00-05:00 America/New_York America/New_York datetime 2010-11-15 11:00:00 2010-11-15 12:00:00 America/New_York America/New_York datetime <![CDATA[]]>
<![CDATA[Integrated Stochastic Resource Planning of Human Capital Supply Chains]]> 27187 TITLE: Integrated Stochastic Resource Planning of Human Capital Supply Chains

SPEAKER:   Mark Squillante

ABSTRACT:

In this talk we present an integrated suite of operations research models and methods that supports the effective and efficient management and planning of human capital supply chains by addressing distinct features and characteristics of human talent and skills. This consists of solutions for: (1) the statistical forecasting of future demand and resource requirements; (2) a new form of risk-based stochastic resource capacity planning; (3) the stochastic modeling and optimization/control of supply evolutionary dynamics over time; (4) a new form of optimal multi-skill supply-demand matching; and (5) the stochastic optimization of business decisions to manage resource shortages and overages. These solutions include contributions in the areas of stochastic models and stochastic optimization/control.  The suite of models and methods constitutes an end-to-end solution that is deployed as an important part of the human capital management and planning process within IBM.

Mark S. Squillante is a Research Staff Member in the Mathematical Sciences Department at the IBM Thomas J. Watson Research Center, where he leads the Applied Probability and Stochastic Optimization team.  His research interests concern mathematical foundations of the analysis, modeling and optimization of the design and control of stochastic systems, including stochastic processes, applied probability, stochastic optimization and control, and their applications.  He is the author of many research articles across these areas, and has received several internal (IBM) and external research awards.  He is a Fellow of ACM and IEEE, and currently serves on the editorial boards of Operations Research, Stochastic Models and Performance Evaluation.

]]> Anita Race 1 1288785907 2010-11-03 12:05:07 1475891604 2016-10-08 01:53:24 0 0 event 2010-11-16T11:00:00-05:00 2010-11-16T12:00:00-05:00 2010-11-16T12:00:00-05:00 2010-11-16 16:00:00 2010-11-16 17:00:00 2010-11-16 17:00:00 2010-11-16T11:00:00-05:00 2010-11-16T12:00:00-05:00 America/New_York America/New_York datetime 2010-11-16 11:00:00 2010-11-16 12:00:00 America/New_York America/New_York datetime <![CDATA[]]>
<![CDATA[Functional Regression Models]]> 27187 TITLE: Functional Regression Models

SPEAKER:  Hans-Georg Mueller

ABSTRACT:

Functional regression has emerged as a useful approach for the analysis of complex data that combine functional or longitudinal predictors with scalar or functional responses. A major emphasis has been the functional linear regression model, which allows one to implement dimension reduction in a simple and straightforward way but may be too restrictive. We will discuss flexible extensions of this model, including functional quadratic, polynomial and additive models. Of special interest is differentiation with respect to a functional argument, for which additive models are particularly well suited. Another extension is local models, where the focus is on the dependency of a Gaussian process or its derivatives at a given time on the value of a predictor process at the same or a different time. The methods will be illustrated with densely as well as sparsely sampled functional data. This talk is based on joint work with Wenjing Yang and Fang Yao.

]]> Anita Race 1 1288084453 2010-10-26 09:14:13 1475891600 2016-10-08 01:53:20 0 0 event 2010-10-28T12:00:00-04:00 2010-10-28T13:00:00-04:00 2010-10-28T13:00:00-04:00 2010-10-28 16:00:00 2010-10-28 17:00:00 2010-10-28 17:00:00 2010-10-28T12:00:00-04:00 2010-10-28T13:00:00-04:00 America/New_York America/New_York datetime 2010-10-28 12:00:00 2010-10-28 01:00:00 America/New_York America/New_York datetime <![CDATA[]]>
<![CDATA[A multiclass queueing model of limit order book dynamics]]> 27187 TITLE: A multiclass queueing model of limit order book dynamics

SPEAKER: Costis Maglaras

ABSTRACT:

The first part of the talk will offer a brief overview of algorithmic trading in the US equities market, touching, in passing, on the topic of high-frequency trading and the nature of current market structure. The emphasis will be on highlighting different facets of this area and discussing corresponding quantitative research problems. The second half of the talk describes a queueing model of limit order book dynamics, and explores questions of optimal limit order placement, market impact, and optimal trade execution.

Bio:
Costis Maglaras is the David and Lyn Silfen Professor of Business at Columbia University. His research focuses on quantitative pricing and revenue management, the economics, design and operations of service systems, and financial engineering. He is the author of many research articles spanning the theory and application of stochastic modeling in a variety of fields, more recently in pricing, risk management and valuation of multi-unit real estate portfolios, and in the design of portfolio trading systems and algorithms. He holds editorial positions at many of the flagship journals of his fields of study, is the recipient of several research and teaching awards, and teaches and serves as faculty director for the executive education course on Risk Management offered by Columbia Business School.

]]> Anita Race 1 1288605411 2010-11-01 09:56:51 1475891600 2016-10-08 01:53:20 0 0 event 2010-11-02T12:00:00-04:00 2010-11-02T13:00:00-04:00 2010-11-02T13:00:00-04:00 2010-11-02 16:00:00 2010-11-02 17:00:00 2010-11-02 17:00:00 2010-11-02T12:00:00-04:00 2010-11-02T13:00:00-04:00 America/New_York America/New_York datetime 2010-11-02 12:00:00 2010-11-02 01:00:00 America/New_York America/New_York datetime <![CDATA[]]>
<![CDATA[A New Optimization Method for Machine Learning and Stochastic Optimization]]> 27187 TITLE: A New Optimization Method for Machine Learning and Stochastic Optimization

SPEAKER:  Jorge Nocedal

ABSTRACT:

We present a "semi-stochastic" Newton method motivated by machine learning problems with very large training sets as well as by the availability of powerful distributed computing environments. The method employs sampled Hessian information to accelerate convergence and enjoys convergence guarantees. We illustrate its performance on multiclass logistic models for the speech recognition system developed at Google. An extension of the method to the sparse L1 setting as well as a complexity analysis will also be presented.  This is joint work with Will Neveitt (Google), Richard Byrd (Colorado) and Gillian Chin (Northwestern).

Short Bio:
Jorge Nocedal is a professor in the IEMS and EECS departments at Northwestern University. He obtained a BS from the National University of Mexico and a PhD from Rice University. His research interests are in optimization and scientific computing, and in their application to machine learning, computer-aided design and financial engineering. He is the author (with Steve Wright) of the book "Numerical Optimization."

He is a SIAM Fellow, an ISI Highly Cited Researcher (Mathematics Category), and was an invited speaker at the 1998 International Congress of Mathematicians. He serves on the editorial board of Mathematical Programming, and in 2011 he will become editor-in-chief of SIAM Journal on Optimization. In 1998 he was appointed Bette and Neison Harris Professor of Teaching Excellence at Northwestern.


]]> Anita Race 1 1287391296 2010-10-18 08:41:36 1475891596 2016-10-08 01:53:16 0 0 event 2010-11-30T11:00:00-05:00 2010-11-30T12:00:00-05:00 2010-11-30T12:00:00-05:00 2010-11-30 16:00:00 2010-11-30 17:00:00 2010-11-30 17:00:00 2010-11-30T11:00:00-05:00 2010-11-30T12:00:00-05:00 America/New_York America/New_York datetime 2010-11-30 11:00:00 2010-11-30 12:00:00 America/New_York America/New_York datetime <![CDATA[]]>
<![CDATA[Monitoring a Large Number of Data Streams via Thresholding]]> 27187 TITLE: Monitoring a Large Number of Data Streams via Thresholding

SPEAKER: Yajun Mei

ABSTRACT:

In the modern information age one often monitors a large number of data streams with the aim of offering the potential for early detection of a "trigger" event. In this talk, we are interested in detecting the event as soon as possible, but we do not know when the event will occur, nor do we know which subset of data streams will be affected by the event. Motivated by the applications in censoring sensor networks and by the case when one has a prior knowledge that at most r data streams will be affected, we propose scalable global monitoring schemes based on the sum of the local detection statistics that are "large" under either hard thresholding or top-r thresholding rules or both. The proposed schemes are shown to possess certain asymptotic optimality properties.
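To make the two combining rules concrete, here is a minimal illustration of our own (not the proposed schemes themselves): a global monitoring statistic formed from per-stream local detection statistics by hard thresholding and by top-r thresholding. The local values are placeholders standing in for, e.g., per-stream CUSUM statistics.

```python
# Illustrative sketch (ours): global statistics built from local
# detection statistics via hard thresholding and top-r thresholding.
import numpy as np

def hard_threshold_stat(local_stats, b):
    """Sum only the local statistics that exceed the threshold b."""
    s = np.asarray(local_stats, dtype=float)
    return float(s[s > b].sum())

def top_r_stat(local_stats, r):
    """Sum the r largest local statistics."""
    s = np.sort(np.asarray(local_stats, dtype=float))[::-1]
    return float(s[:r].sum())

local = [0.2, 5.1, 0.4, 3.3, 0.1]   # placeholder per-stream statistics
alarm = hard_threshold_stat(local, b=1.0) > 8.0  # raise alarm if global stat is large
```

Both rules suppress the noise contributed by the many unaffected streams, which is what makes the schemes scalable when only a few of the monitored streams react to the event.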

]]> Anita Race 1 1287482017 2010-10-19 09:53:37 1475891596 2016-10-08 01:53:16 0 0 event 2010-10-21T12:00:00-04:00 2010-10-21T13:00:00-04:00 2010-10-21T13:00:00-04:00 2010-10-21 16:00:00 2010-10-21 17:00:00 2010-10-21 17:00:00 2010-10-21T12:00:00-04:00 2010-10-21T13:00:00-04:00 America/New_York America/New_York datetime 2010-10-21 12:00:00 2010-10-21 01:00:00 America/New_York America/New_York datetime <![CDATA[]]>
<![CDATA[Statistical Methods for Analysis of Diffusion Weighted Magnetic Resonance Imaging]]> 27187 TITLE: Statistical Methods for Analysis of Diffusion Weighted Magnetic Resonance Imaging

SPEAKER: Sofia Olhede

ABSTRACT:

High angular resolution diffusion imaging data is the observed characteristic function for the local diffusion of water molecules in tissue. This data is used to infer structural information in brain imaging.  Non-parametric scalar measures are proposed to summarize such data, and to locally characterize spatial features of the diffusion probability density function (PDF), relying on the geometry of the characteristic function.  Summary statistics are defined so that their distributions are, to first order, both independent of nuisance parameters and analytically tractable.  The dominant direction of the diffusion at a spatial location (voxel) is determined, and a new set of axes are introduced in Fourier space. Variation quantified in these axes determines the local spatial properties of the diffusion density.  Non-parametric hypothesis tests for determining whether the diffusion is unimodal, isotropic or multi-modal are proposed.  More subtle characteristics of white-matter microstructure, such as the degree of anisotropy of the PDF and symmetry compared with a variety of asymmetric PDF alternatives, may
be ascertained directly in the Fourier domain without parametric assumptions on the form of the diffusion PDF.  We simulate a set of diffusion processes and characterize their local properties using the newly introduced summaries.  We show how complex white-matter
structures across multiple voxels exhibit clear ellipsoidal and asymmetric structure in simulation, and assess the performance of the statistics in clinically-acquired magnetic resonance imaging data.  Joint work with Brandon Whitcher, GSK.

BIO: Sofia C. Olhede was awarded the M.Sci. and Ph.D. degrees in mathematics from Imperial College London, London, U.K., in 2000 and 2003, respectively. She was a Lecturer (2002-2006) and Senior Lecturer (2006-2007) with the Mathematics Department, Imperial College London. In 2007, she joined the Department of Statistical Science, University College
London, where she is Pearson Professor of Statistics and Honorary Professor of Computer Science. Her research interests include the analysis of nonstationary time series, inhomogeneous random fields and applications in geoscience, medical imaging and oceanography. Prof. Olhede is an Associate Editor of the Journal of the Royal Statistical Society, Series B (Statistical Methodology) and of the IEEE Transactions on Signal
Processing. She is a member of the Programme Committee of the International Centre for Mathematical Sciences, and is an Isaac Newton Institute Correspondent.

]]> Anita Race 1 1286965856 2010-10-13 10:30:56 1475891592 2016-10-08 01:53:12 0 0 event 2010-10-14T12:00:00-04:00 2010-10-14T13:00:00-04:00 2010-10-14T13:00:00-04:00 2010-10-14 16:00:00 2010-10-14 17:00:00 2010-10-14 17:00:00 2010-10-14T12:00:00-04:00 2010-10-14T13:00:00-04:00 America/New_York America/New_York datetime 2010-10-14 12:00:00 2010-10-14 01:00:00 America/New_York America/New_York datetime <![CDATA[]]>
<![CDATA[High dimensional inverse covariance matrix estimation]]> 27187 TITLE: High dimensional inverse covariance matrix estimation

SPEAKER:  Ming Yuan

ABSTRACT:

More and more often in practice, one needs to estimate a high dimensional covariance matrix. In this talk, we discuss how this task is often related to the sparsity of the inverse covariance matrix. In particular, we consider estimating an (inverse) covariance matrix that can be well approximated by "sparse" matrices. Taking advantage of the connection between multivariate linear regression and entries of the inverse covariance matrix, we introduce an estimating procedure that can effectively exploit such "sparsity".  The proposed method can be computed using linear programming and therefore has the potential to be used in very high dimensional problems. Oracle inequalities are established for the estimation error in terms of several operator norms, showing that the method is adaptive to different types of sparsity of the problem.
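The regression connection the abstract alludes to can be checked numerically. The small sketch below (our illustration, not the speaker's procedure) verifies the standard identity that the population coefficients from regressing X_j on the remaining variables equal -Omega[rest, j] / Omega[j, j], where Omega is the inverse covariance matrix.

```python
# Numerical check (ours) of the regression/precision-matrix identity.
import numpy as np

Sigma = np.array([[1.0, 0.5, 0.0],
                  [0.5, 1.0, 0.3],
                  [0.0, 0.3, 1.0]])    # an illustrative covariance matrix
Omega = np.linalg.inv(Sigma)           # its inverse (precision matrix)

j, rest = 0, [1, 2]
# Population regression coefficients of X_j on the remaining variables:
beta = np.linalg.solve(Sigma[np.ix_(rest, rest)], Sigma[np.ix_(rest, [j])]).ravel()
assert np.allclose(beta, -Omega[rest, j] / Omega[j, j])
```

A zero entry of Omega thus corresponds to a zero regression coefficient, which is why sparse regression machinery carries over to sparse precision-matrix estimation.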

]]> Anita Race 1 1286356541 2010-10-06 09:15:41 1475891551 2016-10-08 01:52:31 0 0 event 2010-10-07T12:00:00-04:00 2010-10-07T13:00:00-04:00 2010-10-07T13:00:00-04:00 2010-10-07 16:00:00 2010-10-07 17:00:00 2010-10-07 17:00:00 2010-10-07T12:00:00-04:00 2010-10-07T13:00:00-04:00 America/New_York America/New_York datetime 2010-10-07 12:00:00 2010-10-07 01:00:00 America/New_York America/New_York datetime <![CDATA[]]>
<![CDATA[Robust Risk Management]]> 27187 TITLE: Robust Risk Management

SPEAKER: Apostolos Fertis

ABSTRACT:

Coherent risks can be expressed as the worst-case expectation when the probability distribution varies in some uncertainty set, according to the representation theorem. Very often, randomness can be divided into two stages, and there is additional information about the possible first stage scenarios. Traditional coherent risks, such as the CVaR, fail to make use of this information. In this talk, we introduce a new class of risk measures, called robust risk measures, which combine the uncertainty set of a traditional risk measure with the additional information about the first stage scenarios. We state and prove a representation theorem for the robust risk measures, which facilitates their computation. We define and show how to compute the Robust CVaR, the robust risk constructed based on CVaR. We compare the optimal-Robust CVaR and optimal-CVaR portfolios under diverse scenarios constructed using real New York Stock Exchange (NYSE) and NASDAQ data from 2005 to 2010.
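For readers unfamiliar with CVaR, the empirical version of the plain (non-robust) measure is simply the average of the worst tail losses. A sketch with illustrative numbers of our own, not the robust variant introduced in the talk:

```python
# Empirical CVaR sketch (textbook definition; loss data are invented).
import numpy as np

def cvar(losses, alpha=0.95):
    """Average of the worst (1 - alpha) fraction of the losses."""
    s = np.sort(np.asarray(losses, dtype=float))
    k = int(np.ceil((1 - alpha) * len(s)))   # number of tail observations
    return float(s[-k:].mean())

losses = np.array([1.0, 2.0, 3.0, 10.0, 50.0])
tail_risk = cvar(losses, alpha=0.8)   # tail holds the single worst loss here
```

The robust risk measures of the talk replace this single empirical distribution with a worst case over an uncertainty set informed by the first-stage scenarios.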

Bio:
Apostolos Fertis completed his PhD at the Electrical Engineering and Computer Science Department of the Massachusetts Institute of Technology in 2009.  Currently he is a researcher at the Institute for Operations Research (IFOR) in Zurich.
In his PhD thesis, under the supervision of Professor Dimitris Bertsimas, he investigated the application of the robust optimization concept in confronting the uncertainty in the samples used to produce statistical estimates. In January 2010, he initiated the "Robust Risk Management" research project at the IFOR. The project aspires to introduce a new idea in uncertainty management by combining traditional risk management techniques with robust optimization.

]]> Anita Race 1 1286363264 2010-10-06 11:07:44 1475891551 2016-10-08 01:52:31 0 0 event 2010-11-09T11:00:00-05:00 2010-11-09T12:00:00-05:00 2010-11-09T12:00:00-05:00 2010-11-09 16:00:00 2010-11-09 17:00:00 2010-11-09 17:00:00 2010-11-09T11:00:00-05:00 2010-11-09T12:00:00-05:00 America/New_York America/New_York datetime 2010-11-09 11:00:00 2010-11-09 12:00:00 America/New_York America/New_York datetime <![CDATA[]]>
<![CDATA[Perfect Sampling of Stochastic Perpetuities]]> 27187 TITLE: Perfect Sampling of Stochastic Perpetuities

SPEAKER:  Jose Blanchet

ABSTRACT:

A stochastic perpetuity is the net present value, with i.i.d. random discount factors, of an infinite stream of i.i.d. rewards in time. Under reasonable assumptions on the rewards and discounts we describe how to generate exact (unbiased) samples of stochastic perpetuities. The algorithm is based on a variation of dominated coupling from the past. The dominating process involves exact sampling of the delay sequence of a single server queue starting from the distant past. (This is joint work with K. Sigman.)
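The talk's algorithm draws exact samples via dominated coupling from the past; as context only, the sketch below shows the naive truncation D_n = R_1 + B_1 R_2 + B_1 B_2 R_3 + ... that motivates it, which is biased by the discarded tail. The reward and discount distributions are illustrative choices of ours, not assumptions from the talk.

```python
# Naive truncated simulation of a stochastic perpetuity (for intuition
# only; the talk's method produces exact samples via coupling from the
# past).  Distributions below are illustrative choices.
import numpy as np

def truncated_perpetuity(n_terms, rng):
    total, discount = 0.0, 1.0
    for _ in range(n_terms):
        reward = rng.exponential(1.0)        # i.i.d. rewards R_k, mean 1
        total += discount * reward
        discount *= rng.uniform(0.5, 0.9)    # i.i.d. discount factors B_k
    return total

rng = np.random.default_rng(0)
samples = [truncated_perpetuity(50, rng) for _ in range(2000)]
# Mean should be near E[R] / (1 - E[B]) = 1 / 0.3 for these choices.
```

Truncation leaves a residual bias of order E[B]^n; the appeal of the perfect sampling scheme is that it removes this bias entirely rather than merely making it small.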

]]> Anita Race 1 1284970756 2010-09-20 08:19:16 1475891542 2016-10-08 01:52:22 0 0 event 2010-09-23T12:00:00-04:00 2010-09-23T13:00:00-04:00 2010-09-23T13:00:00-04:00 2010-09-23 16:00:00 2010-09-23 17:00:00 2010-09-23 17:00:00 2010-09-23T12:00:00-04:00 2010-09-23T13:00:00-04:00 America/New_York America/New_York datetime 2010-09-23 12:00:00 2010-09-23 01:00:00 America/New_York America/New_York datetime <![CDATA[]]>
<![CDATA[Generalized intersection cuts and a new cut generating paradigm]]> 27187 TITLE:   Generalized intersection cuts and a new cut generating paradigm

SPEAKER:  Egon Balas

ABSTRACT:

Intersection cuts are generated from a polyhedral cone and a convex set S whose interior contains no feasible integer point. We generalize these cuts by replacing the cone with a more general polyhedron  C.  The resulting generalized intersection cuts dominate the original ones. This leads to a new cutting plane paradigm under which one generates and stores the intersection points of the extreme rays of C  with the boundary of S rather than the cuts themselves. These intersection points can then be used to generate deeper cuts in a non-recursive fashion.
(This talk is based on joint work with Francois Margot)


Bio:
Egon Balas is University Professor of Industrial Administration and Applied Mathematics, as well as the Thomas Lord Professor of Operations Research, at Carnegie Mellon University. He has a doctorate in Economic Science from the University of Brussels and a doctorate in Mathematics from the University of Paris.

Professor Balas's research interests are in mathematical programming, primarily integer and combinatorial optimization. He has played a leading role in the development of enumerative and cutting plane techniques for 0-1 programming, and is mainly known as the developer of the approach called disjunctive programming or lift-and-project. He has also developed scheduling algorithms and software. Dr. Balas has served or is serving on the editorial boards of Operations Research, Discrete Applied Mathematics, the Journal of Combinatorial Optimization, Computational Optimization and Applications, the European Journal of Operational Research, and Annals of Operations Research, among others. In 1980 Dr. Balas received the US Senior Scientist Award of the Alexander von Humboldt Foundation; in 1995 he received the John von Neumann Theory Prize of INFORMS; and in 2001 he was the first American to be awarded the EURO Gold Medal of the European Association of Operational Research Societies.

 

]]> Anita Race 1 1285317947 2010-09-24 08:45:47 1475891542 2016-10-08 01:52:22 0 0 event 2010-10-12T12:00:00-04:00 2010-10-12T13:00:00-04:00 2010-10-12T13:00:00-04:00 2010-10-12 16:00:00 2010-10-12 17:00:00 2010-10-12 17:00:00 2010-10-12T12:00:00-04:00 2010-10-12T13:00:00-04:00 America/New_York America/New_York datetime 2010-10-12 12:00:00 2010-10-12 01:00:00 America/New_York America/New_York datetime <![CDATA[]]>
<![CDATA[Recent Advances in Centralized and Decentralized Multi-Location Transshipment Problems]]> 27187 TITLE:  Recent Advances in Centralized and Decentralized Multi-Location Transshipment Problems

SPEAKER:  Dr. Michal Tzur

ABSTRACT:

The multi-location replenishment and transshipment problem is concerned with several retailers facing random demand for the same item at distinct markets, who may use transshipments to eliminate excess inventories or shortages after demand is realized. For a centralized system we summarize recent advances related to operational and design aspects of the problem. Our main focus in this talk is on the decentralized system, in which the retailers operate to maximize their own profit. This causes incentive problems that prevent coordination, even with two retailers who may pay each other for transshipped units. We propose a new mechanism based on a transshipment fund which is the first to coordinate the system, in a fully non-cooperative setting, for all instances of two retailers as well as all instances of any number of retailers. The computation and information requirements of this mechanism are realistic and relatively modest. We also present necessary and sufficient conditions for coordination and prove they are always satisfied with our mechanism.

Bio:  

Michal Tzur is an Associate Professor in the Industrial Engineering department at the Faculty of Engineering at Tel Aviv University in Israel, where she served as the head of the undergraduate program and as the department chair. Michal joined Tel Aviv University after spending three years at the Operations and Information Management department at Wharton School at the University of Pennsylvania. She received her B.A. from Tel Aviv University and her M. Phil and Ph.D. from Columbia University. During the years 2002-2004 she visited the IEMS department at Northwestern University. Her areas of expertise are supply chain management, inventory routing and transportation.

]]> Anita Race 1 1284544033 2010-09-15 09:47:13 1475891538 2016-10-08 01:52:18 0 0 event 2010-09-22T14:00:00-04:00 2010-09-22T15:00:00-04:00 2010-09-22T15:00:00-04:00 2010-09-22 18:00:00 2010-09-22 19:00:00 2010-09-22 19:00:00 2010-09-22T14:00:00-04:00 2010-09-22T15:00:00-04:00 America/New_York America/New_York datetime 2010-09-22 02:00:00 2010-09-22 03:00:00 America/New_York America/New_York datetime <![CDATA[]]>
<![CDATA[Inference for unlabelled graphs]]> 27187 TITLE: Inference for unlabelled graphs

SPEAKER: Professor Peter Bickel

ABSTRACT:

A great deal of attention has recently been paid to determining sub-communities on the basis of relations, corresponding to edges, between individuals, corresponding to vertices of an unlabelled graph (Newman, SIAM Review 2003; Airoldi et al., JMLR 2008; Leskovec & Kleinberg et al., SIGKDD 2005). We have developed a nonparametric framework for probabilistic ergodic models of infinite unlabelled graphs (PNAS 2009) and made some connections with modularities arising in the physics literature and community models in the social sciences. A fundamental difficulty in implementing these procedures is computational complexity. We develop approaches which bypass these difficulties.

 (This is joint work with Aiyou Chen and Liza Levina)

]]> Anita Race 1 1283853814 2010-09-07 10:03:34 1475891535 2016-10-08 01:52:15 0 0 event 2010-09-14T13:00:00-04:00 2010-09-14T14:00:00-04:00 2010-09-14T14:00:00-04:00 2010-09-14 17:00:00 2010-09-14 18:00:00 2010-09-14 18:00:00 2010-09-14T13:00:00-04:00 2010-09-14T14:00:00-04:00 America/New_York America/New_York datetime 2010-09-14 01:00:00 2010-09-14 02:00:00 America/New_York America/New_York datetime <![CDATA[]]>
<![CDATA[Health Systems Seminar Series with Dr. Julie Simmons Ivy]]> 27187 TITLE:  When to Respond: A Multi-Agent Stochastic Alert Threshold Model for Declaring a Disease Outbreak

SPEAKER:  Julie Simmons Ivy, Associate Professor, Edward P. Fitts Department of Industrial and Systems Engineering, North Carolina State University 

ABSTRACT:

Influenza pandemics are considered one of the most significant and widely spread threats to public health. In this research, we explore the relationship between local and state health departments with respect to issuing alerts and responding to a potential disease outbreak such as influenza. We modeled the public health system as a multi-agent (or decentralized) partially observable Markov decision process where local and state health departments are decision makers. The model is used to determine when local and state decision makers should issue an alert or initiate mitigation actions such as vaccination in response to the existence of a disease threat. The model incorporates the fact that health departments have imperfect information about the exact number of infected people. The objective of the model is to minimize both false alerts and late alerts while identifying the optimal timing for alerting decisions. Providing such a balance between false and late alerts has the potential to increase the credibility and efficiency of the public health system while improving immediate response and care in the event of a public health emergency. Using data from the 2009-2010 H1N1 influenza outbreak to estimate model parameters including observations and transition probabilities, computational results for near optimal solutions are obtained.  In order to gain insight regarding the structure of optimal policies at the local and state levels, various model parameters including false and late alerting costs are explored. 

This research is a part of the North Carolina Preparedness and Emergency Response Research Center (NCPERRC) and was supported by the Centers for Disease Control and Prevention (CDC) Grant 1PO1 TP 000296-02.

]]> Anita Race 1 1283853597 2010-09-07 09:59:57 1475891535 2016-10-08 01:52:15 0 0 event Georgia Tech's Health Systems Institute welcomes Dr. Julie Simmons Ivy, on "When to Respond: A Multi-Agent Stochastic Alert Threshold Model for Declaring a Disease Outbreak."

]]>
2010-10-06T12:00:00-04:00 2010-10-06T13:00:00-04:00 2010-10-06T13:00:00-04:00 2010-10-06 16:00:00 2010-10-06 17:00:00 2010-10-06 17:00:00 2010-10-06T12:00:00-04:00 2010-10-06T13:00:00-04:00 America/New_York America/New_York datetime 2010-10-06 12:00:00 2010-10-06 01:00:00 America/New_York America/New_York datetime <![CDATA[]]> <![CDATA[Dr. Julie Simmons Ivy]]> <![CDATA[Health Systems Institute at Georgia Tech and Emory University]]>
<![CDATA[Advances in multistage optimization]]> 27187 TITLE:  Advances in multistage optimization

SPEAKER:  Dimitris Bertsimas (Boeing Prof. of OR)

ABSTRACT:

In this presentation, we show a significant role that symmetry, a fundamental concept in convex geometry, plays in determining the power of robust and finitely adaptable solutions in multi-stage stochastic and adaptive optimization problems. We consider a fairly general class of multi-stage mixed integer stochastic and adaptive optimization problems and propose a good approximate solution policy with performance guarantees that depend on the
geometric properties such as symmetry of the uncertainty sets. In particular, we show that a class of finitely adaptable solutions is a good approximation for both the multi-stage stochastic as well as the adaptive optimization problem. A finitely adaptable solution specifies a small set of solutions for each stage and the solution policy implements the best solution from the given
set depending on the realization of the uncertain parameters in the past stages. To the best of our knowledge, these are the first approximation results for the multi-stage problem in such generality.    (joint work with Vineet Goyal, Columbia University and Andy Sun, MIT)

Bio:

Dimitris Bertsimas is currently the Boeing Professor of Operations Research and the codirector of the Operations Research Center at the Massachusetts Institute of Technology. He received a BS in Electrical Engineering and Computer Science from the National Technical University of Athens, Greece, in 1985, an MS in Operations Research from MIT in 1987, and a PhD in Applied Mathematics and Operations Research from MIT in 1988. He has been on the MIT faculty since 1988.

]]> Anita Race 1 1282561227 2010-08-23 11:00:27 1475891531 2016-10-08 01:52:11 0 0 event 2010-09-14T12:00:00-04:00 2010-09-14T13:00:00-04:00 2010-09-14T13:00:00-04:00 2010-09-14 16:00:00 2010-09-14 17:00:00 2010-09-14 17:00:00 2010-09-14T12:00:00-04:00 2010-09-14T13:00:00-04:00 America/New_York America/New_York datetime 2010-09-14 12:00:00 2010-09-14 01:00:00 America/New_York America/New_York datetime <![CDATA[]]>
<![CDATA[Reducing Operating Room Labor Costs: Capturing Workload Information & Dynamic Adjustments of Staffing Level]]> 27187 TITLE:   Reducing Operating Room Labor Costs: Capturing Workload Information & Dynamic Adjustments of Staffing Level

SPEAKER:    Professor Polly Biyu He

ABSTRACT:

We study the problem of setting nurse staffing levels in hospital operating rooms when there is uncertainty about the daily workload. We demonstrate in this healthcare service setting how information availability and choices of decision models affect a newsvendor's performance. We develop empirical models to predict the daily workload distribution and study how its mean and variance change with the information available. In particular, we consider different information sets available at the time of decision: no information, information on number of cases, and information on number and types of elective cases. We use these models to derive optimal staffing rules based on historical data from a US teaching hospital and prospectively test the performance of these rules. Our empirical results suggest that hospitals could potentially reduce their staffing costs by an average of 39-49% (depending on the absence or presence of emergency cases) by deferring the staffing decision until procedure-type information is available. However, in reality, contractual and scheduling constraints often require operating room managers to reserve staffed hours several months in advance, when little information about the cases is known. This motivates us to consider the problem of adjusting the staffing level given information updates. Specifically, we develop decision models that allow the OR manager to adjust the staffing level with some adjustment costs when he or she has better information. We study how adjustment costs affect the optimal staffing policy and the value of having the flexibility to adjust staffing. We also demonstrate how to implement our adjustment policies by applying the optimal decision rules derived from our models to the hospital data.

Joint work with Stefano Zenios, Franklin Dexter and Alex Macario.
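The underlying trade-off in the staffing decision is the classic newsvendor quantile rule. The stylized sketch below uses invented distributions and costs of our own, not the hospital's data or the paper's empirical model; it shows how better information, by shrinking the forecast variance, moves the staffing level toward the mean workload.

```python
# Stylized newsvendor staffing sketch; all numbers are illustrative.
import numpy as np

def staffing_level(workload_samples, under_cost, over_cost):
    """Staff to the critical-ratio quantile of the predicted workload."""
    critical_ratio = under_cost / (under_cost + over_cost)
    return float(np.quantile(workload_samples, critical_ratio))

rng = np.random.default_rng(0)
vague = rng.normal(100, 30, 10_000)      # forecast before case details are known
informed = rng.normal(100, 10, 10_000)   # forecast given procedure-type information

# Understaffing (overtime) costed at 1.5x overstaffing (idle staffed time):
level_vague = staffing_level(vague, 1.5, 1.0)
level_informed = staffing_level(informed, 1.5, 1.0)
```

With the same critical ratio, the informed forecast's tighter distribution yields a staffing level closer to the mean, which is the mechanism behind the cost savings from deferring the decision.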

 

 

]]> Anita Race 1 1283934867 2010-09-08 08:34:27 1475891531 2016-10-08 01:52:11 0 0 event 2010-09-16T12:00:00-04:00 2010-09-16T13:00:00-04:00 2010-09-16T13:00:00-04:00 2010-09-16 16:00:00 2010-09-16 17:00:00 2010-09-16 17:00:00 2010-09-16T12:00:00-04:00 2010-09-16T13:00:00-04:00 America/New_York America/New_York datetime 2010-09-16 12:00:00 2010-09-16 01:00:00 America/New_York America/New_York datetime <![CDATA[]]>
<![CDATA[PRODUCTION PLANNING MODELS WITH RESOURCES SUBJECT TO CONGESTION]]> 27187 TITLE:   PRODUCTION PLANNING MODELS WITH RESOURCES SUBJECT TO CONGESTION

SPEAKER:   Reha Uzsoy

ABSTRACT:

A fundamental difficulty in developing effective production planning models has been accurately reflecting the nonlinear dependency between workload and lead times. We develop a mathematical programming model for production planning in multiproduct, single stage systems that captures the nonlinear dependency between workload and lead times. We then use outer linearization of this nonlinear model to obtain a linear programming formulation and extend it to multistage systems. Extensive computational experiments validate the approach and compare its results to conventional models that assume workload-independent planning lead times. We conclude with some future directions including applications to integrated planning of production starts and safety stocks.

Bio: Reha Uzsoy is Clifton A. Anderson Distinguished Professor in the Edward P. Fitts Department of Industrial and Systems Engineering at North Carolina State University. He holds BS degrees in Industrial Engineering and Mathematics and an MS in Industrial Engineering from Bogazici University, Istanbul, Turkey. He received his Ph.D in Industrial and Systems Engineering in 1990 from the University of Florida. His teaching and research interests are in production planning, scheduling, and supply chain management. He is the author of one book, two edited books, and over eighty refereed journal publications. Before coming to the US he worked as a production engineer with Arcelik AS, a major appliance manufacturer in Istanbul, Turkey. He has also worked as a visiting researcher at Intel Corporation and IC Delco. His research has been supported by the National Science Foundation, Intel Corporation, Hitachi Semiconductor, Harris Corporation, Kimberly Clark, Union Pacific, Ascension Health and General Motors. He was named a Fellow of the Institute of Industrial Engineers in 2005, Outstanding Young Industrial Engineer in Education in 1997 and a University Faculty Fellow by Purdue University in 2001, and has received awards for both undergraduate and graduate teaching.

]]> Anita Race 1 1282732243 2010-08-25 10:30:43 1475891531 2016-10-08 01:52:11 0 0 event 2010-09-21T12:00:00-04:00 2010-09-21T13:00:00-04:00 2010-09-21T13:00:00-04:00 2010-09-21 16:00:00 2010-09-21 17:00:00 2010-09-21 17:00:00 2010-09-21T12:00:00-04:00 2010-09-21T13:00:00-04:00 America/New_York America/New_York datetime 2010-09-21 12:00:00 2010-09-21 01:00:00 America/New_York America/New_York datetime <![CDATA[]]>
<![CDATA[Graph-based methods for efficiently constructing factorial designs]]> 27187 SIAC Seminar

TITLE: Graph-based methods for efficiently constructing factorial designs

SPEAKER: Dr. Abhishek K. Shrivastava

ABSTRACT:

Fractional factorial designs are among the most popular classes of experimental designs in practice. Usually, a factorial design is selected after comparing all the designs in a catalog of designs of a given size. Designs in these catalogs should be distinct under relabeling of factors, level labels and run order, i.e., they should be mutually non-isomorphic. Testing two designs for isomorphism is computationally hard, and constructing non-isomorphic catalogs is tougher still, given the large number of designs that must be compared for isomorphism. In this talk, I will present a new approach for solving design isomorphism by representing designs as graphs. The resulting graph isomorphism problem can be efficiently solved using methods available in the literature. Further, in the case of regular designs, I will show how these graph representations can be exploited to speed up catalog construction by reducing the number of isomorphism tests. I will demonstrate the gains from this approach by presenting results for 2-level regular fractional factorial and 2-level split-plot designs.
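To fix ideas, the toy sketch below (our illustration, not the talk's graph-based method) checks two tiny 2-level designs for isomorphism by brute force over factor permutations, level flips, and run reordering. Brute force is fine at this size; the point of the graph representation is to make such tests tractable for the large designs in real catalogs.

```python
# Toy design-isomorphism check by brute force (illustrative only).
from itertools import permutations, product

def canonical(design):
    """Lexicographically smallest relabeling of a 2-level design, over
    all column permutations, per-column level flips, and row orders."""
    n_cols = len(design[0])
    best = None
    for cols in permutations(range(n_cols)):
        for flips in product([0, 1], repeat=n_cols):
            rows = sorted(tuple(r[c] ^ f for c, f in zip(cols, flips))
                          for r in design)
            best = rows if best is None else min(best, rows)
    return best

def isomorphic(d1, d2):
    return canonical(d1) == canonical(d2)

d1 = [(0, 0, 0), (0, 1, 1), (1, 0, 1), (1, 1, 0)]   # half-fraction, C = A xor B
d2 = [(1, 1, 1), (1, 0, 0), (0, 1, 0), (0, 0, 1)]   # d1 with every level flipped
assert isomorphic(d1, d2)
```

Relabeling preserves the multiset of pairwise Hamming distances between runs, so designs with different distance profiles are immediately non-isomorphic; the canonical-form comparison above detects this automatically.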

 

Bio:

Abhishek K. Shrivastava is an Assistant Professor in the Department of Manufacturing Engineering and Engineering Management at City University of Hong Kong. He received his B. Tech. (Honors) in Industrial Engineering from I.I.T. Kharagpur, India, in 2003, and his Ph.D. in Industrial Engineering from Texas A&M University, College Station, USA, in 2009. His research interests are in statistical modeling and analysis of complex systems, design of experiments and rare event detection. He is a member of INFORMS, IIE, ASA and IMS.

]]> Anita Race 1 1281528233 2010-08-11 12:03:53 1475891527 2016-10-08 01:52:07 0 0 event 2010-08-26T13:00:00-04:00 2010-08-26T14:30:00-04:00 2010-08-26T14:30:00-04:00 2010-08-26 17:00:00 2010-08-26 18:30:00 2010-08-26 18:30:00 2010-08-26T13:00:00-04:00 2010-08-26T14:30:00-04:00 America/New_York America/New_York datetime 2010-08-26 01:00:00 2010-08-26 02:30:00 America/New_York America/New_York datetime <![CDATA[]]>
<![CDATA[ISyE New MS Student Information Session 1]]> 27215 The information session for new ISyE Masters students will be held in the Executive Classroom (Room 228) of the main ISyE Building.

In addition to this session, there is one on Thursday, August 19th from 1-2pm at the same location.  Students only need to attend one of the sessions.

]]> Mike Alberghini 1 1279710237 2010-07-21 11:03:57 1475891517 2016-10-08 01:51:57 0 0 event Info session for new ISyE Masters students

]]>
2010-08-17T02:30:00-04:00 2010-08-17T03:30:00-04:00 2010-08-17T03:30:00-04:00 2010-08-17 06:30:00 2010-08-17 07:30:00 2010-08-17 07:30:00 2010-08-17T02:30:00-04:00 2010-08-17T03:30:00-04:00 America/New_York America/New_York datetime 2010-08-17 02:30:00 2010-08-17 03:30:00 America/New_York America/New_York datetime <![CDATA[]]>
<![CDATA[ISyE New MS Student Information Session 2]]> 27215 The information session for new ISyE Masters students will be held in the Executive Classroom (Room 228) of the main ISyE Building.

In addition to this session, there is one on Tuesday, August 17th from 1:30-2:30pm at the same location.  Students only need to attend one of the sessions.

]]> Mike Alberghini 1 1279710371 2010-07-21 11:06:11 1475891517 2016-10-08 01:51:57 0 0 event information session for new ISyE Masters students

 

]]>
2010-08-19T02:00:00-04:00 2010-08-19T03:00:00-04:00 2010-08-19T03:00:00-04:00 2010-08-19 06:00:00 2010-08-19 07:00:00 2010-08-19 07:00:00 2010-08-19T02:00:00-04:00 2010-08-19T03:00:00-04:00 America/New_York America/New_York datetime 2010-08-19 02:00:00 2010-08-19 03:00:00 America/New_York America/New_York datetime <![CDATA[]]>
<![CDATA[ISyE New PhD Student Information Session]]> 27215 The orientation session for new ISyE PhD students will be held in the Executive Classroom (Room 228) of the main ISyE Building.  There will be a pizza lunch in the atrium immediately after the session.

]]> Mike Alberghini 1 1279710510 2010-07-21 11:08:30 1475891517 2016-10-08 01:51:57 0 0 event information session for new ISyE PhD students 

]]>
2010-08-20T11:15:00-04:00 2010-08-20T13:00:00-04:00 2010-08-20T13:00:00-04:00 2010-08-20 15:15:00 2010-08-20 17:00:00 2010-08-20 17:00:00 2010-08-20T11:15:00-04:00 2010-08-20T13:00:00-04:00 America/New_York America/New_York datetime 2010-08-20 11:15:00 2010-08-20 01:00:00 America/New_York America/New_York datetime <![CDATA[]]>
<![CDATA[Alpha Alignment Factor: A Solution to the Underestimation of Risk for Optimized Active Portfolios]]> 27187 TITLE: Alpha Alignment Factor: A Solution to the Underestimation of Risk for Optimized Active Portfolios

SPEAKER: Dr. Anureet Saxena

ABSTRACT:

The underestimation of risk of optimized portfolios is a consistent criticism about risk models. Quantitative portfolio managers have historically used a variety of ad hoc techniques to overcome this issue in their investment processes. In this paper, we construct a theory explaining why risk models underestimate the risk of optimized portfolios. We show that the problem is not necessarily with a risk model, but is rather the interaction of expected returns, constraints, and a risk model in an optimizer. We develop an optimization technique that incorporates a dynamic Alpha Alignment Factor (AAF) into the factor risk model during the optimization process. Using actual portfolio manager backtests, we illustrate both how pervasive the underestimation problem can be and the effectiveness of the proposed AAF in correcting the bias of the risk estimates of optimized portfolios.

Speaker bio:
Dr. Anureet Saxena is a research associate at Axioma Inc. Prior to joining Axioma in 2008, he held research positions at Carnegie Mellon University (Egon Balas, mentor), the Tata Institute of Fundamental Research (Narendra Karmarkar, mentor) and the IBM T.J. Watson Research Center.
His research interests include mixed integer linear and non-linear programming, stochastic programming and quantitative finance. He has published four scholarly articles in Mathematical Programming and has delivered more than thirty presentations at various professional meetings. Dr. Saxena is the recipient of the 2008 Gerald L. Thompson doctoral dissertation award in management science for his thesis titled Integer Programming, a Technology. He holds an MS and a PhD in industrial administration from the Tepper School of Business (CMU) and a BTech in computer science and engineering from IIT Bombay, and is currently a Level 2 candidate in the Chartered Financial Analyst (CFA) program.

]]> Anita Race 1 1271243899 2010-04-14 11:18:19 1475891478 2016-10-08 01:51:18 0 0 event Alpha Alignment Factor: A Solution to the Underestimation of Risk for Optimized Active Portfolios

]]>
2010-04-20T12:00:00-04:00 2010-04-20T13:00:00-04:00 2010-04-20T13:00:00-04:00 2010-04-20 16:00:00 2010-04-20 17:00:00 2010-04-20 17:00:00 2010-04-20T12:00:00-04:00 2010-04-20T13:00:00-04:00 America/New_York America/New_York datetime 2010-04-20 12:00:00 2010-04-20 01:00:00 America/New_York America/New_York datetime <![CDATA[]]>
<![CDATA[Residual Updating Algorithms for Kernel Interpolation]]> 27187 TITLE: Residual Updating Algorithms for Kernel Interpolation

SPEAKER: Greg Fasshauer

ABSTRACT:

I will first present two scattered data approximation methods from a numerical analysis point of view: radial basis function or kernel interpolation and moving least squares approximation. Then I will introduce the idea of approximate moving least squares approximation and connect all three methods via a residual updating algorithm. In parallel I will attempt to point out connections to an analogous set of methods (kriging, local polynomial regression and higher-order kernels for density estimation) in statistics. The idea of residual updating will be illuminated both from a more analytical perspective and at the numerical linear algebra level where we have rediscovered an old algorithm due to Riley [1].
[1] J.D. Riley. Solving systems of linear equations with a positive definite, symmetric, but possibly ill-conditioned matrix. Mathematical Tables and Other Aids to Computation 9/51 (1955), 96–101.
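Riley's residual-updating idea is compact enough to sketch. The code below is an illustrative reconstruction, not material from the talk: an ill-conditioned Gaussian-kernel interpolation system is solved by repeatedly fitting the current residual with a ridge-regularized (hence only approximate) solve and accumulating the corrections.

```python
import numpy as np

def gauss_kernel(x, y, eps=2.0):
    """Gaussian radial kernel matrix K[i, j] = exp(-(eps*(x_i - y_j))^2)."""
    return np.exp(-(eps * (x[:, None] - y[None, :])) ** 2)

rng = np.random.default_rng(0)
x = np.sort(rng.uniform(0.0, 1.0, 20))   # scattered nodes
f = np.sin(2 * np.pi * x)                # data to interpolate

K = gauss_kernel(x, x)
mu = 1e-2                  # ridge parameter: each solve is only approximate
coeffs = np.zeros_like(f)
residual = f.copy()
for _ in range(50):
    # Fit the current residual approximately, then accumulate the update.
    coeffs += np.linalg.solve(K + mu * np.eye(len(x)), residual)
    residual = f - K @ coeffs

err0 = np.linalg.norm(f)              # residual norm before any update
final_err = np.linalg.norm(residual)  # shrinks toward exact interpolation
```

Each pass damps the residual component with eigenvalue lambda by the factor mu/(lambda + mu), so the iterates converge to the exact interpolant while only well-conditioned systems are ever solved.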

Brief bio:
Greg Fasshauer
Professor, Associate Chair and Director of Undergraduate Studies
Illinois Institute of Technology
Department of Applied Mathematics
Chicago, IL 60616

Education and Positions
* Since 1997: Assistant, associate and full professor, Department of Applied Mathematics, IIT
* Ralph P. Boas Visiting Assistant Professor: Department of Mathematics, Northwestern University (1995-1997)
* Ph.D. (Mathematics): Vanderbilt University with Larry L. Schumaker (1995)
* M.A. (Mathematics): Vanderbilt University with Larry L. Schumaker (1993)
* Diplom (Mathematics) & Staatsexamen (Mathematics and English): University of Stuttgart with Klaus Höllig (1991)

Research Interests (currently supported by NSF)
* Meshfree approximation methods for multivariate approximation and their application
* Radial basis functions and positive definite kernels
* Approximation theory
* Computer-aided geometric design
* Spline theory
* Numerical solution of PDEs

Books and 40+ papers
* Meshfree Approximation Methods with MATLAB
Interdisciplinary Mathematical Sciences - Vol. 6 World Scientific Publishers, Singapore, 2007
* Progress on Meshless Methods (edited with A.J.M. Ferreira, E.J. Kansa, and V.M.A. Leitão)
Computational Methods in Applied Sciences, Vol. 11 Springer, Berlin, 2009

]]> Anita Race 1 1271663225 2010-04-19 07:47:05 1475891478 2016-10-08 01:51:18 0 0 event Residual Updating Algorithms for Kernel Interpolation

]]>
2010-04-21T11:00:00-04:00 2010-04-21T12:00:00-04:00 2010-04-21T12:00:00-04:00 2010-04-21 15:00:00 2010-04-21 16:00:00 2010-04-21 16:00:00 2010-04-21T11:00:00-04:00 2010-04-21T12:00:00-04:00 America/New_York America/New_York datetime 2010-04-21 11:00:00 2010-04-21 12:00:00 America/New_York America/New_York datetime <![CDATA[]]>
<![CDATA[Two pedagogical challenges: teaching mathematical modeling and choosing performance measures.]]> 27187 TITLE: Two pedagogical challenges: teaching mathematical modeling and choosing performance measures

SPEAKER:  Professor Steve Pollock

ABSTRACT:

Developing mathematical models to describe (in some useful way) operational situations is a critical component of doing effective Operations Research. Yet, for most students, learning and internalizing the craft (or art) of doing mathematical modeling is a difficult and often mysterious journey. And this struggle can actually be made worse by using a normal classroom lecture-reading-studying mode of teaching. Moreover, the pedagogical challenge to those who try to develop and hone students' abilities to model is complicated when measures of performance normally used to judge "success" in this educational process are either not clear or possibly pernicious. Indeed, choosing measures of performance for just about any operational problem is often problematic.

The talk will address these issues via personal experiences, anecdotes and opinions.  Any formal mathematics presented or discussed will be incidental, elementary or purposely unnecessarily complicated.

Biosketch:

Stephen M. Pollock is Herrick Emeritus Professor of Manufacturing and Professor Emeritus of Industrial and Operations Engineering at the University of Michigan. He has been involved in applying operations research and decision analysis methods to understand and influence a variety of operational phenomena involving: military search and detection, criminal recidivism, manufacturing process monitoring, sequential allocation of resources, predictive and proactive maintenance, sports, networks of queues, the stochastic behavior of infectious disease epidemics and the optimization of radiation oncology plans.
 
After receiving his Ph.D. at M.I.T. in 1964 he was a member of the technical staff at Arthur D. Little, Inc. before joining the faculty at the U.S. Naval Postgraduate School in 1965 and the University of Michigan in 1969.  He was chair of the IOE Department from 1981 through 1990.  In 1992 he was the recipient of the Stephen S. Attwood Award, the highest honor awarded to a faculty member by the College of Engineering. He has authored over 65 technical papers, co-edited two books, and has served as a consultant to over 30 industrial, governmental and service organizations.
 
Professor Pollock was Associate Editor and Area Editor of Operations Research, Senior Editor of IIE Transactions, Associate Editor of Management Science and on the editorial boards of other journals. He has served on various advisory boards for the National Science Foundation, on the Army Science Board, and has chaired and been a member of many National Research Council committees, boards and panels. He was President of the Operations Research Society of America in 1986 and was awarded the 2001 INFORMS Kimball Medal for contributions to operations research and the management sciences. He is a fellow of INFORMS and the AAAS and a member of the National Academy of Engineering.

]]> Anita Race 1 1271062245 2010-04-12 08:50:45 1475891478 2016-10-08 01:51:18 0 0 event Two pedagogical challenges: teaching mathematical modeling and choosing performance measures.

]]>
2010-04-13T11:00:00-04:00 2010-04-13T12:00:00-04:00 2010-04-13T12:00:00-04:00 2010-04-13 15:00:00 2010-04-13 16:00:00 2010-04-13 16:00:00 2010-04-13T11:00:00-04:00 2010-04-13T12:00:00-04:00 America/New_York America/New_York datetime 2010-04-13 11:00:00 2010-04-13 12:00:00 America/New_York America/New_York datetime <![CDATA[]]>
<![CDATA[Some recent developments in multi-state and degradation models for reliability inference]]> 27187 TITLE: Some recent developments in multi-state and degradation models for reliability inference

SPEAKER: Prof. Vijay Nair

ABSTRACT:

Traditional reliability analysis is based on time-to-failure data. In this talk, we discuss two different directions that lead to more informative reliability inference. The first is multi-state models. Challenges in statistical inference based on interval-censored data will be described, and methods for doing likelihood-based inference in semi-Markov models will be discussed. The second part of the talk will provide an overview of degradation models, describe a class of non-homogeneous Gaussian processes and some inference issues. This talk is based on joint work with current and former students Yang Yang and Xiao Wang.


Bio: Vijay Nair is the Donald A. Darling Professor of Statistics and Professor of Industrial and Operations Engineering at the University of Michigan, Ann Arbor. He has been Chair of the Department of Statistics since 1998. Previously, he was a Research Scientist in the Mathematical Sciences and Operations Research Centers at Bell Laboratories for fifteen years.
His research interests include engineering statistics, information technology, quality and process improvement, industrial experiments, reliability and risk analysis, neuro-informatics, and statistical methods in behavioral intervention research. He also has extensive practical experience in the automotive, semiconductor, and telecommunications industries.
Vijay has a Bachelor’s degree in Economics from the University of Malaya (Malaysia) and a Ph.D. in Statistics from the University of California, Berkeley. He has been editor of Technometrics and the International Statistical Review, coordinating editor of the Journal of Statistical Planning and Inference, and has served on the editorial boards of many other leading statistics and quality journals. He is currently Vice President of the International Statistical Institute, President-elect of the International Society for Business and Industrial Statistics, and Chair of the Statistics Division of the American Society for Quality. He has chaired or co-chaired several panels of the National Academies and is a former chair of the Board of Trustees of the National Institute of Statistical Sciences. He is a Fellow of the American Association for the Advancement of Science, the American Statistical Association, the American Society for Quality, and the Institute of Mathematical Statistics.

]]> Anita Race 1 1271153687 2010-04-13 10:14:47 1475891478 2016-10-08 01:51:18 0 0 event Some recent developments in multi-state and degradation models for reliability inference

]]>
2010-04-15T12:00:00-04:00 2010-04-15T13:00:00-04:00 2010-04-15T13:00:00-04:00 2010-04-15 16:00:00 2010-04-15 17:00:00 2010-04-15 17:00:00 2010-04-15T12:00:00-04:00 2010-04-15T13:00:00-04:00 America/New_York America/New_York datetime 2010-04-15 12:00:00 2010-04-15 01:00:00 America/New_York America/New_York datetime <![CDATA[]]>
<![CDATA[Warranty Prediction Based on Auxiliary Use-rate Information]]> 27187 TITLE:  Warranty Prediction Based on Auxiliary Use-rate Information

SPEAKER: Professor William Q. Meeker

ABSTRACT:

Usually the warranty data response used to make predictions of future failures is the number of weeks (or another unit of real time) in service. Use-rate information usually is not available (automobile warranty data are an exception, where both weeks in service and number of miles driven are available for units returned for warranty repair). With new technology, however, sensors and smart chips are being installed in many modern products ranging from computers and printers to automobiles and aircraft engines. Thus the coming generations of field data for many products will provide information on how the product has been used and the environment in which it was used. This paper was motivated by the need to predict warranty returns for a product with multiple failure modes. For this product, cycles-to-failure/use-rate information was available for those units that were connected to the network. We show how to use a cycles-to-failure model to compute predictions and prediction intervals for the number of warranty returns. We also present prediction methods for units not connected to the network. In order to provide insight into the reasons that use-rate models provide better predictions, we also present a comparison of asymptotic variances comparing the cycles-to-failure and time-to-failure models. 

Bio: William Q. Meeker is a Professor of Statistics and Distinguished Professor of Liberal Arts and Sciences at Iowa State University. He is a Fellow of the American Statistical Association (ASA) and the American Society for Quality (ASQ) and a past Editor of Technometrics. He is co-author of the books Statistical Methods for Reliability Data with Luis Escobar (1998) and Statistical Intervals: A Guide for Practitioners with Gerald Hahn (1991), six book chapters, and numerous publications in the engineering and statistical literature. He has won the ASQ Youden Prize four times and the ASQ Wilcoxon Prize three times. He was recognized by the ASA with their Best Practical Application Award in 2001 and by the ASQ Statistics Division with their W.G. Hunter Award in 2003. In 2007 he was awarded the ASQ Shewhart Medal. He has done research and consulted extensively on problems in reliability data analysis, warranty analysis, reliability test planning, accelerated testing, nondestructive evaluation, and statistical computing.

]]> Anita Race 1 1270458473 2010-04-05 09:07:53 1475891473 2016-10-08 01:51:13 0 0 event Warranty Prediction Based on Auxiliary Use-rate Information

]]>
2010-04-08T12:00:00-04:00 2010-04-08T13:00:00-04:00 2010-04-08T13:00:00-04:00 2010-04-08 16:00:00 2010-04-08 17:00:00 2010-04-08 17:00:00 2010-04-08T12:00:00-04:00 2010-04-08T13:00:00-04:00 America/New_York America/New_York datetime 2010-04-08 12:00:00 2010-04-08 01:00:00 America/New_York America/New_York datetime <![CDATA[]]>
<![CDATA[Performance Bounds for Large Scale Queueing Systems]]> 27187 TITLE:  Performance Bounds for Large Scale Queueing Systems

SPEAKER:  David Goldberg

ABSTRACT:

Parallel server queues arise in many applications, ranging from call centers to national security and health care. Understanding how these systems behave when the number of servers is large and the service distribution is non-Markovian is a difficult problem. In this talk, we resolve several open questions related to a certain heavy traffic scaling regime (Halfin-Whitt) for parallel server queues, which has recently been used in the modeling of call centers. In particular, we derive the asymptotics of the steady-state queue length for a very general class of service distributions. We also bound the large deviations behavior of the limiting steady-state queue length, and prove that the associated critical exponent takes a particularly simple form in certain cases. Our main proof technique involves bounding the multiserver queue between two simpler systems. These systems exhibit an interesting duality, and yield bounds of a very general nature, which may be useful in answering a range of questions related to the modeling and optimization of queues.
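For background, the Halfin-Whitt regime corresponds to square-root staffing: with offered load R, one staffs n = R + beta*sqrt(R) servers, and the probability of delay converges to a constant strictly between 0 and 1. A quick illustration using standard Erlang-C arithmetic for the Markovian case (my own sketch, not material from the talk, which treats non-Markovian service):

```python
import math

def erlang_c(n, a):
    """Erlang-C probability of delay for M/M/n with offered load
    a = lambda/mu, via the numerically stable Erlang-B recursion."""
    b = 1.0
    for k in range(1, n + 1):
        b = a * b / (k + a * b)
    return n * b / (n - a * (1.0 - b))

beta = 1.0
delays = []
for lam in (25.0, 100.0, 400.0):                 # offered load R (mu = 1)
    n = math.ceil(lam + beta * math.sqrt(lam))   # square-root staffing
    delays.append(erlang_c(n, lam))
# As the system grows, the delay probability neither vanishes nor
# tends to 1; it stabilizes near a constant determined by beta.
```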

]]> Anita Race 1 1270458766 2010-04-05 09:12:46 1475891473 2016-10-08 01:51:13 0 0 event ]]> 2010-04-20T12:00:00-04:00 2010-04-20T13:00:00-04:00 2010-04-20T13:00:00-04:00 2010-04-20 16:00:00 2010-04-20 17:00:00 2010-04-20 17:00:00 2010-04-20T12:00:00-04:00 2010-04-20T13:00:00-04:00 America/New_York America/New_York datetime 2010-04-20 12:00:00 2010-04-20 01:00:00 America/New_York America/New_York datetime <![CDATA[]]>
<![CDATA[Hereditary Portfolio Optimization with Memory]]> 27187 TITLE: Hereditary Portfolio Optimization with Memory

SPEAKER: Dr. Mou-Hsiung (Harry) Chang

ABSTRACT:

In this talk, we consider an infinite time horizon portfolio optimization problem in a market that consists of one savings account and one stock account whose unit price satisfies a nonlinear stochastic functional differential equation. Within the solvency region the investor is allowed to consume from the savings account and can make transactions between the two assets subject to paying capital-gain taxes as well as a fixed plus proportional transaction cost. The main objective is to seek an optimal consumption-investment strategy in order to maximize the expected utility from the total discounted consumption over the infinite time horizon. The portfolio optimization problem is formulated as a stochastic control problem that involves both the classical and impulse controls. A quasi-variational HJB inequality for the value function is derived and the verification theorem for the optimal investment consumption strategy is obtained. The value function is also shown to be the unique viscosity solution of the HJB inequality.

Bio: Dr. M. H. Chang is currently the manager of the Probability & Statistics Program and acting division chief of the Mathematical Sciences Division at the U. S. Army Research Office (ARO). Prior to joining ARO, Dr. Chang had been a tenured professor in the Department of Mathematical Sciences at the University of Alabama in Huntsville (UAH) for twenty-eight years and had served as Chair of the Mathematical Sciences Department for eight years. He received his B.S. in Applied Mathematics from National Chung-Hsing University (in Taiwan) and his M.S. and Ph.D. in Mathematics from the University of Rhode Island.

Dr. Chang publishes extensively in stochastic analysis, stochastic control, mathematical finance, and quantum Markov processes. He has published one research monograph, “Stochastic Control of Hereditary Systems and Applications”, Volume 59 of the Stochastic Modeling and Applied Probability Series, Springer, New York, January 2008, ISBN 978-0-387-75805-3, more than 60 refereed journal articles, and has made more than 80 invited presentations at professional conferences and universities. He is currently an associate editor for “Journal of Applied Mathematics and Stochastic Analysis” and “Stochastic Analysis and Applications”. He has also served as a referee for numerous mathematics and applied mathematics journals.

]]> Anita Race 1 1269429209 2010-03-24 11:13:29 1475891469 2016-10-08 01:51:09 0 0 event Hereditary Portfolio Optimization with Memory

]]>
2010-03-30T12:00:00-04:00 2010-03-30T13:00:00-04:00 2010-03-30T13:00:00-04:00 2010-03-30 16:00:00 2010-03-30 17:00:00 2010-03-30 17:00:00 2010-03-30T12:00:00-04:00 2010-03-30T13:00:00-04:00 America/New_York America/New_York datetime 2010-03-30 12:00:00 2010-03-30 01:00:00 America/New_York America/New_York datetime <![CDATA[]]>
<![CDATA[Efficiency of random search methods on huge-scale optimization problems]]> 27187 TITLE: Efficiency of random search methods on huge-scale optimization problems

SPEAKER:  Yurii Nesterov

ABSTRACT:

In this talk we describe new methods for solving huge-scale optimization problems. For problems of this size, even the simplest full-dimensional vector operations are very expensive. Hence, we suggest applying an optimization technique based on random partial updates of the decision variables. For these methods, we prove global estimates for the rate of convergence. Surprisingly enough, for certain classes of objective functions, our results are better than the standard worst-case bounds for deterministic algorithms. We present constrained and unconstrained versions of the method, and its accelerated variant. Our numerical tests confirm the high efficiency of this technique on problems of very big size.
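A minimal sketch of a random partial-update method of this flavor, assuming a least-squares objective (generic randomized coordinate descent, not the speaker's algorithm): each iteration touches one random coordinate and maintains the residual incrementally, so no full-dimensional vector operation is ever needed.

```python
import numpy as np

rng = np.random.default_rng(1)

# Least-squares objective f(x) = 0.5 * ||A x - b||^2.
m, n = 200, 50
A = rng.standard_normal((m, n))
b = rng.standard_normal(m)

L = (A ** 2).sum(axis=0)   # coordinate-wise curvature ||A[:, j]||^2
x = np.zeros(n)
r = A @ x - b              # residual, maintained incrementally
for _ in range(5000):
    j = rng.integers(n)            # pick a random coordinate
    g = A[:, j] @ r                # partial gradient: one column only
    step = g / L[j]                # exact minimization along coordinate j
    x[j] -= step
    r -= step * A[:, j]            # O(m) residual update, no full recompute

grad_norm = np.linalg.norm(A.T @ r)   # near zero at the least-squares solution
```

Each iteration costs O(m) rather than O(mn), which is exactly the economy the abstract points to for huge-scale problems.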

]]> Anita Race 1 1269429038 2010-03-24 11:10:38 1475891469 2016-10-08 01:51:09 0 0 event Efficiency of random search methods on huge-scale optimization problems

]]>
2010-04-01T12:00:00-04:00 2010-04-01T13:00:00-04:00 2010-04-01T13:00:00-04:00 2010-04-01 16:00:00 2010-04-01 17:00:00 2010-04-01 17:00:00 2010-04-01T12:00:00-04:00 2010-04-01T13:00:00-04:00 America/New_York America/New_York datetime 2010-04-01 12:00:00 2010-04-01 01:00:00 America/New_York America/New_York datetime <![CDATA[]]>
<![CDATA[QCF Day Symposium]]> 27187  

The 8th Annual QCF Day Symposium will be held April 2, 2010. The symposium will take place in Atlanta, Georgia. It is hosted by the Quantitative and Computational Finance program at the Georgia Institute of Technology.

The purpose of this symposium is to showcase the newest and most innovative approaches to quantitative finance used today. In previous years, several hundred people have attended this symposium to listen to both academic and non-academic invited experts. By and large the talks concentrate on specific topics related to how Quantitative and Computational Finance is being used today. The speakers in previous years have included Senior Executives, Managing Directors, Vice Presidents, Group Directors, Quantitative Associates, Senior Economists and others from a wide range of organizations - Bank of America-Merrill Lynch, Deutsche Bank, Goldman Sachs, INVESCO, ING Investment Management, Morgan Stanley, NYFRB, Risk Metrics, S&P, SunGard, SunTrust Robinson Humphrey, UBS and Vanguard - as well as professors from Columbia, Toronto, Chicago, Princeton, and New York University.

There will be many opportunities to speak with students and professionals about the work environment and activities in the field of quantitative finance. The symposium is attended by students from the Georgia Tech MS QCF Program and students from educational programs strongly overlapping with the QCF area; by researchers and practitioners working within the QCF area who want to share their experiences and learn more from others in the QCF area; and by other individuals seeking information about the content of the very diverse QCF area.

 

To register for QCF Day, click HERE

For more information on the QCF program at Georgia Tech, please download our program brochure.

 

]]> Anita Race 1 1269593893 2010-03-26 08:58:13 1475891469 2016-10-08 01:51:09 0 0 event QCF Day Symposium

]]>
2010-04-02T09:30:00-04:00 2010-04-02T18:00:00-04:00 2010-04-02T18:00:00-04:00 2010-04-02 13:30:00 2010-04-02 22:00:00 2010-04-02 22:00:00 2010-04-02T09:30:00-04:00 2010-04-02T18:00:00-04:00 America/New_York America/New_York datetime 2010-04-02 09:30:00 2010-04-02 06:00:00 America/New_York America/New_York datetime <![CDATA[]]>
<![CDATA[PRODUCTION SYSTEMS ENGINEERING: Problems, Solutions, and Applications]]> 27187 TITLE: PRODUCTION SYSTEMS ENGINEERING: Problems, Solutions, and Applications

SPEAKER: Professor S.M. Meerkov

ABSTRACT:

Production Systems Engineering (PSE) is an emerging branch of Engineering intended to uncover fundamental principles that govern production systems and utilize them for the purposes of analysis, continuous improvement, and design. In PSE, the machines are assumed to be unreliable and the buffers are finite. Under these assumptions, production lines are nonlinear stochastic systems. The study of their statics and dynamics is the goal of PSE. In this talk, the main problems of PSE and their solutions will be described along with several applications. In addition, the so-called PSE Toolbox, which implements the methods and algorithms developed, will be discussed. A demo of the toolbox can be found at http://www.ProductionSystemsEngineering.com

BIO: Semyon M. Meerkov received his MSEE degree from the Polytechnic of Kharkov, Ukraine, in 1962 and Ph.D. in Systems Science from the Institute of Control Sciences, Moscow, Russia, in 1966. He was with the Institute of Control Sciences until 1977. From 1979 to 1984 he was with the Department of Electrical and Computer Engineering, Illinois Institute of Technology, Chicago, IL. Since 1984 he has been a Professor at the Department of Electrical Engineering and Computer Science of the University of Michigan, Ann Arbor, MI. He has held visiting positions at UCLA (1978-1979), Stanford (1991), Technion, Israel (1997-1998 and 2008) and Tsinghua, China (2008). He was the Editor-in-Chief of Mathematical Problems in Engineering, Department Editor for Manufacturing Systems of IIE Transactions and Associate Editor of several other journals. His research interests are in Systems and Control with applications to Production Systems, Communication Networks, and Theory of Rational Behavior.

]]> Anita Race 1 1269953980 2010-03-30 12:59:40 1475891469 2016-10-08 01:51:09 0 0 event 2010-04-02T13:00:00-04:00 2010-04-02T14:00:00-04:00 2010-04-02T14:00:00-04:00 2010-04-02 17:00:00 2010-04-02 18:00:00 2010-04-02 18:00:00 2010-04-02T13:00:00-04:00 2010-04-02T14:00:00-04:00 America/New_York America/New_York datetime 2010-04-02 01:00:00 2010-04-02 02:00:00 America/New_York America/New_York datetime <![CDATA[]]>
<![CDATA[Semi-algebraic optimization theory]]> 27187 TITLE: Semi-algebraic optimization theory

SPEAKER: Adrian lewis

ABSTRACT:

Concrete optimization problems, while often nonsmooth, are not pathologically so. The class of "semi-algebraic" sets and functions - those arising from polynomial inequalities - nicely exemplifies nonsmoothness in practice. Semi-algebraic sets (and their generalizations) are common, easy to recognize, and richly structured, supporting powerful variational properties. In particular I will discuss a generic property of such sets - partial smoothness - and its relationship with a proximal algorithm for nonsmooth composite minimization, a versatile model for practical optimization.
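As one concrete instance of a proximal algorithm for nonsmooth composite minimization (a generic ISTA sketch for an l1-regularized least-squares model, not the partial-smoothness machinery of the talk): minimize 0.5*||Ax - b||^2 + lam*||x||_1 by alternating a gradient step on the smooth part with the soft-thresholding prox of the nonsmooth part.

```python
import numpy as np

def soft_threshold(v, t):
    """Prox of t * ||.||_1: shrink each entry toward zero by t."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

rng = np.random.default_rng(2)
A = rng.standard_normal((40, 80))
x_true = np.zeros(80)
x_true[:5] = rng.standard_normal(5)   # sparse ground truth
b = A @ x_true

lam = 0.5
step = 1.0 / np.linalg.norm(A, 2) ** 2   # 1 / Lipschitz constant of the gradient
x = np.zeros(80)
for _ in range(5000):
    grad = A.T @ (A @ x - b)
    x = soft_threshold(x - step * grad, step * lam)   # gradient step + prox

obj = 0.5 * np.linalg.norm(A @ x - b) ** 2 + lam * np.abs(x).sum()
```

The iterates land exactly on the nonsmooth "edges" of the l1 ball (many coordinates exactly zero), which is the partial-smoothness phenomenon the talk connects to such proximal methods.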

Bio:
Adrian S. Lewis was born in England in 1962. He is a Professor at Cornell University in the School of Operations Research and Industrial Engineering. Following his B.A., M.A., and Ph.D. degrees from Cambridge, and Research Fellowships at Queens' College, Cambridge and Dalhousie University, Canada, he worked in Canada at the University of Waterloo (1989-2001) and Simon Fraser University (2001-2004). He is an Associate Editor of the SIAM Journal on Optimization, Mathematics of Operations Research, and the SIAM/MPS Book Series on Optimization, and is a Co-Editor for Mathematical Programming. He received the 1995 Aisenstadt Prize, from the Canadian Centre de Recherches Mathematiques, the 2003 Lagrange Prize for Continuous Optimization from SIAM and the Mathematical Programming Society, and an Outstanding Paper Award from SIAM in 2005. He co-authored "Convex Analysis and Nonlinear Optimization" with J.M. Borwein.

Lewis' research concerns variational analysis and nonsmooth optimization, with a particular interest in optimization problems involving eigenvalues.

]]> Anita Race 1 1269415422 2010-03-24 07:23:42 1475891469 2016-10-08 01:51:09 0 0 event Semi-algebraic optimization theory

]]>
2010-04-06T12:00:00-04:00 2010-04-06T13:00:00-04:00 2010-04-06T13:00:00-04:00 2010-04-06 16:00:00 2010-04-06 17:00:00 2010-04-06 17:00:00 2010-04-06T12:00:00-04:00 2010-04-06T13:00:00-04:00 America/New_York America/New_York datetime 2010-04-06 12:00:00 2010-04-06 01:00:00 America/New_York America/New_York datetime <![CDATA[]]>
<![CDATA[Supply Disruptions and the Reverse Bullwhip Effect]]> 27187 TITLE:  Supply Disruptions and the Reverse Bullwhip Effect

SPEAKER: Zuo-Jun Max Shen (Faculty Candidate)

ABSTRACT:

We postulate the existence of a “reverse bullwhip effect” (RBWE) that occurs during and immediately after supply disruptions. Whereas the classical bullwhip effect (BWE) describes an increase in demand/order volatility as one moves upstream in the supply chain, the RBWE describes the opposite. We motivate our analysis with an example involving gasoline-buying patterns following Hurricane Katrina in 2005. We then present theoretical and empirical evidence for a RBWE that occurs in a variety of situations involving supply uncertainty. The RBWE has significant implications for service system design and management, so we examine its causes, discuss its impact, and suggest strategies for mitigating it.

Bio:
Zuo-Jun (Max) Shen received his Ph.D. from Northwestern University. He has been active in the following research areas: integrated supply chain design and management, market mechanism design, applied optimization, and decision making with limited information. He is currently on the editorial/advisory boards of several leading journals. He received the CAREER award from the National Science Foundation in 2003.

]]> Anita Race 1 1269853168 2010-03-29 08:59:28 1475891469 2016-10-08 01:51:09 0 0 event 2010-04-07T12:00:00-04:00 2010-04-07T13:00:00-04:00 2010-04-07T13:00:00-04:00 2010-04-07 16:00:00 2010-04-07 17:00:00 2010-04-07 17:00:00 2010-04-07T12:00:00-04:00 2010-04-07T13:00:00-04:00 America/New_York America/New_York datetime 2010-04-07 12:00:00 2010-04-07 01:00:00 America/New_York America/New_York datetime <![CDATA[]]>
<![CDATA[Statistical Shape Analysis of Manufacturing Data]]> 27187 TITLE: Statistical Shape Analysis of Manufacturing Data

SPEAKER: Professor Enrique del Castillo

ABSTRACT:

We show how Statistical Shape Analysis, a set of techniques used to model the shapes of biological and other kinds of objects in the natural sciences, can also be used to model the geometric shape of a manufactured part. We first review Procrustes-based methods and emphasize possible solutions to the basic problem of having corresponding, or matching, labels in the measured "landmarks", the locations of the measured points on each part acquired with a CMM or similar instrument. The analysis of experiments with shape responses is discussed next. The usual approach in practice is to estimate the form error of the part and conduct an ANOVA on the form errors. Instead, an F ANOVA test due to Goodall and a new permutation ANOVA test for shapes are presented. Real data sets as well as simulated shape data of interest in manufacturing were used to perform power comparisons for two- and three-dimensional shapes. The ANOVA on the form errors was found to have poor performance in detecting mean shape differences in circular and cylindrical parts, while the ANOVA F test and the permutation ANOVA test provide the highest power to detect differences in the mean shape. It is shown how these tests can also be applied to general "free-form" shapes of parts for which no standard definition of form error exists in manufacturing practice. New visualization tools, including main effect and interaction plots for shapes and deviation-from-nominal plots, are presented to help interpret the results of experiments where the response is a shape.

Bio: 

Dr. del Castillo is a Distinguished Professor of Industrial Engineering and holds a joint appointment with the Department of Statistics. He is an author of over 85 journal papers and 2 textbooks, a former NSF CAREER awardee and Fulbright Scholar, and the Editor-in-Chief of the Journal of Quality Technology (2006-2009).

]]> Anita Race 1 1268807146 2010-03-17 06:25:46 1475891465 2016-10-08 01:51:05 0 0 event Statistical Shape Analysis of Manufacturing Data

]]>
2010-03-19T13:00:00-04:00 2010-03-19T14:00:00-04:00 2010-03-19T14:00:00-04:00 2010-03-19 17:00:00 2010-03-19 18:00:00 2010-03-19 18:00:00 2010-03-19T13:00:00-04:00 2010-03-19T14:00:00-04:00 America/New_York America/New_York datetime 2010-03-19 01:00:00 2010-03-19 02:00:00 America/New_York America/New_York datetime <![CDATA[]]>
<![CDATA[Workshop on Computer Experiments]]> 27187 1. Opening: Overview on Computer Experiments

   Speaker: Professor Jeff Wu (ISyE, Gatech)

 

2. Title: Multi-Layer Designs for Computer Experiments

   Speaker: Professor Roshan J. Vengazhiyil (ISyE, Gatech)

Abstract: Computer experiments play a major role in the modern era of scientific and technological development. In designing computer experiments, Latin hypercube designs (LHDs) are widely used. However, finding an optimal LHD is computationally cumbersome. On the other hand, although many optimal designs are well known for physical experiments, the redundancy of design points makes them undesirable for computer experiments. In this work, we present a new class of space-filling designs developed by splitting two-level full or fractional factorial designs into multiple layers. The method takes advantage of many available results in designing physical experiments, and therefore the proposed multi-layer designs (MLDs) are easy to generate. Moreover, our numerical study shows that MLDs can have better space-filling properties than optimal LHDs.
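For readers unfamiliar with the baseline design mentioned above, a minimal Latin hypercube sampler is sketched below. This is the standard randomized LHD construction, not the multi-layer construction of the talk; the function name is illustrative.

```python
import numpy as np

def latin_hypercube(n, d, rng=None):
    """n-point Latin hypercube design in [0, 1]^d: each dimension has
    exactly one point in each of the n equal-width strata."""
    rng = np.random.default_rng(rng)
    # place one point per stratum in every dimension...
    pts = (np.arange(n)[:, None] + rng.random((n, d))) / n
    # ...then break the diagonal pairing by permuting each column
    for j in range(d):
        pts[:, j] = pts[rng.permutation(n), j]
    return pts
```

Unlike a full factorial grid, the n points project onto n distinct strata in every coordinate, which is the space-filling property the talk's MLDs aim to improve upon.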

 

3. Title: Some New Advances in Design and Modeling of Computer Experiments

Speaker: Professor Peter Z. G. Qian (Statistics, Wisconsin)

Abstract: Computer models are now becoming ubiquitous in nearly all fields of sciences and engineering. Design and modeling are two key aspects of computer experiments. In this talk, I will report some recent advances in both aspects. Specific topics include a new approach for emulation of computer models with qualitative and quantitative factors; sequential space-filling designs; Sudoku based space-filling designs; and sliced Latin hypercube designs for ensembles of computer models. 

 

4. Title: Analysis of Computer Experiments with Functional Response

   Speaker: Professor Ying Hung  (Statistics, Rutgers)

Abstract: Most existing methods for analyzing computer experiments with single outputs such as kriging cannot be easily applied to functional outputs due to the computational problems caused by high-dimensionality of the response. In this paper, we develop an efficient implementation of kriging for analyzing functional responses. Our methodology uses a two-stage model building procedure with Kronecker products and an improved EM algorithm for estimation. The methodology is illustrated using a computer experiment conducted for optimizing residual stresses in machined parts. This is a joint work with V. Roshan Joseph and Shreyes N. Melkote.
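As background for the single-output case that this work extends, a zero-mean kriging (Gaussian-process) predictor with a Gaussian correlation function can be sketched as follows; the function names and the fixed correlation parameter `theta` are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def kriging_predict(Xtr, ytr, Xte, theta=10.0, nugget=1e-8):
    """Simple kriging: predict via r(x)^T K^{-1} y with a Gaussian
    correlation function exp(-theta * ||x - x'||^2)."""
    def corr(A, B):
        d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
        return np.exp(-theta * d2)
    # small nugget on the diagonal for numerical stability
    K = corr(Xtr, Xtr) + nugget * np.eye(len(Xtr))
    return corr(Xte, Xtr) @ np.linalg.solve(K, ytr)
```

The predictor interpolates the training responses (up to the nugget), which is why kriging is the default emulator for deterministic computer experiments.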

]]> Anita Race 1 1269248881 2010-03-22 09:08:01 1475891465 2016-10-08 01:51:05 0 0 event Workshop on Computer Experiments

]]>
2010-03-29T11:00:00-04:00 2010-03-29T13:00:00-04:00 2010-03-29T13:00:00-04:00 2010-03-29 15:00:00 2010-03-29 17:00:00 2010-03-29 17:00:00 2010-03-29T11:00:00-04:00 2010-03-29T13:00:00-04:00 America/New_York America/New_York datetime 2010-03-29 11:00:00 2010-03-29 01:00:00 America/New_York America/New_York datetime <![CDATA[]]>
<![CDATA[Statistics in Service to the Nation]]> 27187 TITLE:   Statistics in Service to the Nation

SPEAKER: Professor Stephen Fienberg

ABSTRACT:

All too often academic statisticians think of their role as the production of new theory and methods. But at least as important is the role we can fulfill in supporting national activities and projects requiring statistical insights and rigor. Many (but far from all) of these are based at the National Research Council and its committees and panels. I will illustrate the value and creativity of statistical thinking in a diverse set of projects spanning the past half century, from the "National Halothane Study" through "The Polygraph and Lie Detection." A crucial feature of several of these activities has been the impetus for development of new research. I will also emphasize the role I believe statisticians should be playing in such activities in the future.

]]> Anita Race 1 1268986223 2010-03-19 08:10:23 1475891465 2016-10-08 01:51:05 0 0 event
Statistics in Service to the Nation

]]>
2010-04-02T12:00:00-04:00 2010-04-02T13:00:00-04:00 2010-04-02T13:00:00-04:00 2010-04-02 16:00:00 2010-04-02 17:00:00 2010-04-02 17:00:00 2010-04-02T12:00:00-04:00 2010-04-02T13:00:00-04:00 America/New_York America/New_York datetime 2010-04-02 12:00:00 2010-04-02 01:00:00 America/New_York America/New_York datetime <![CDATA[]]>
<![CDATA[BIC Applied to Model Selection of a Large Number of Change-points]]> 27187 TITLE: BIC Applied to Model Selection of a Large Number of Change-points

SPEAKER: Professor David Siegmund

ABSTRACT:

In a previous paper (Biometrics, 2006, pp. 22-32) we derived a Bayes Information Criterion (BIC) for determining the number of change-points in a sequence of independent observations when the number $m$ of change-points is assumed to remain bounded as the number of observations increases. Here we generalize that result to include multiple aligned sequences with intervals of simultaneous change that occur in a fraction of the sequences and a total number of change-points $m$ that can increase with the sample size; and we include in the criterion terms that increase at rate $m$. Stochastic terms that enter into the new criterion involve integrals and maxima of two-sided Brownian motion with negative drift. Examples involve segmenting aligned DNA sequences according to copy number variations that occur at the same position in a fraction of the sequences.
This is joint research with N. Zhang.
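The flavor of a BIC change-point criterion can be seen in a toy version: a single sequence with unit variance and at most one mean change-point, where the penalized fits with zero and one change-point are compared. This is only a sketch of the general idea, not the multi-sequence criterion of the paper; the penalty bookkeeping below is an assumption.

```python
import numpy as np

def bic_single_changepoint(x):
    """Compare BIC of a constant-mean model vs. one mean change-point
    (variance known to be 1); return the estimated change-point or None."""
    n = len(x)
    rss0 = np.sum((x - x.mean()) ** 2)
    best_tau, best_rss = None, np.inf
    for tau in range(1, n):
        rss = (np.sum((x[:tau] - x[:tau].mean()) ** 2)
               + np.sum((x[tau:] - x[tau:].mean()) ** 2))
        if rss < best_rss:
            best_rss, best_tau = rss, tau
    bic0 = rss0 + 1 * np.log(n)       # one mean parameter
    bic1 = best_rss + 3 * np.log(n)   # two means + change location
    return best_tau if bic1 < bic0 else None
```

Extending this comparison to $m$ change-points across multiple aligned sequences is exactly where the paper's penalty terms growing at rate $m$ come in.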

]]> Anita Race 1 1268909814 2010-03-18 10:56:54 1475891465 2016-10-08 01:51:05 0 0 event BIC Applied to Model Selection of a Large Number of Change-points

]]>
2010-04-07T13:00:00-04:00 2010-04-07T14:00:00-04:00 2010-04-07T14:00:00-04:00 2010-04-07 17:00:00 2010-04-07 18:00:00 2010-04-07 18:00:00 2010-04-07T13:00:00-04:00 2010-04-07T14:00:00-04:00 America/New_York America/New_York datetime 2010-04-07 01:00:00 2010-04-07 02:00:00 America/New_York America/New_York datetime <![CDATA[]]>
<![CDATA[Designing compact and maximally permissive liveness-enforcing supervisors for complex resource allocation systems through classification theory]]> 27187 TITLE: Designing compact and maximally permissive liveness-enforcing supervisors for complex resource allocation systems through classification theory

SPEAKER: Professor Spiridon Reveliotis

ABSTRACT:

The problem of liveness-enforcing supervision -- or, deadlock avoidance -- for complex resource allocation systems (RAS) is a well-documented problem in supervisory control theory. Most of the past research on it has acknowledged the fact that the maximally permissive liveness-enforcing supervisor (LES) possesses super-polynomial complexity for most RAS classes, and therefore, it has resorted to solutions that trade off maximal permissiveness for computational tractability. In the presented work, we distinguish between the "off-line" and the "on-line" computation that is required for the effective implementation of the maximally permissive LES, and we seek to develop representations of the maximally permissive LES that require "minimal" on-line computation.

The particular representation that we adopt is that of a compact classifier that will effect the underlying dichotomy of the reachable state space into safe and unsafe subspaces. Through a series of reductions of the derived classification problem, we are able to attain extensive reductions in, both, (i) the computational complexity of the off-line task of the construction of the sought classifier, and (ii) the complexity involved in the on-line classification process itself. We formally establish completeness and optimality properties for the proposed design procedures. We also offer heuristics that, if necessary, can alleviate the computational effort that is necessary for the construction of the sought classifier. Finally, we demonstrate the efficacy of the developed approaches through a series of computational experiments. To the best of our knowledge, these experiments also establish the ability of the proposed methodology to effectively compute tractable implementations of the maximally permissive LES for problem instances significantly beyond the capacity of any other approach currently available in the literature.

]]> Anita Race 1 1267445043 2010-03-01 12:04:03 1475891457 2016-10-08 01:50:57 0 0 event Designing compact and maximally permissive liveness-enforcing supervisors for complex resource allocation systems through classification theory

]]>
2010-03-05T12:00:00-05:00 2010-03-05T13:00:00-05:00 2010-03-05T13:00:00-05:00 2010-03-05 17:00:00 2010-03-05 18:00:00 2010-03-05 18:00:00 2010-03-05T12:00:00-05:00 2010-03-05T13:00:00-05:00 America/New_York America/New_York datetime 2010-03-05 12:00:00 2010-03-05 01:00:00 America/New_York America/New_York datetime <![CDATA[]]>
<![CDATA[Mean-variance portfolio optimization when means and covariances are unknown]]> 27187 TITLE: Mean-variance portfolio optimization when means and covariances are unknown

SPEAKER: Dr. Haipeng Xing

ABSTRACT:

Markowitz's celebrated mean-variance portfolio optimization theory assumes that the means and covariances of the underlying asset returns are known. In practice, they are unknown and have to be estimated from historical data. Plugging the estimates into the efficient frontier that assumes known parameters has led to portfolios that may perform poorly and have counter-intuitive asset allocation weights; this has been referred to as the "Markowitz optimization enigma." After reviewing different approaches in the literature to address these difficulties, we explain the root cause of the enigma and propose a new approach to resolve it. Not only is the new approach shown to provide substantial improvements over previous methods, but it also allows flexible modeling to incorporate dynamic features and fundamental analysis of the training sample of historical data, as illustrated in simulation and empirical studies. This is joint work with Tze Leung Lai (Stanford University) and Zehao Chen (Bosera Funds).
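The naive plug-in rule that the talk critiques is easy to state: estimate the mean vector and covariance matrix from the sample, then use them as if they were the true parameters. A minimal sketch (the function name is illustrative; the talk's proposed method replaces this rule):

```python
import numpy as np

def plugin_mv_weights(returns):
    """Plug-in Markowitz weights: w proportional to Sigma^{-1} mu,
    with sample estimates substituted for the unknown parameters,
    normalized to be fully invested (weights sum to 1)."""
    mu = returns.mean(axis=0)               # sample mean of asset returns
    sigma = np.cov(returns, rowvar=False)   # sample covariance matrix
    w = np.linalg.solve(sigma, mu)
    return w / w.sum()
```

Because estimation error in mu and sigma propagates directly into w, these weights can be unstable and counter-intuitive in moderate samples, which is the "enigma" the talk addresses.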

Short bio: Haipeng Xing graduated from the Department of Statistics at Stanford University in 2005 and then joined the Department of Statistics at Columbia University. In 2008, he moved to the Department of Applied Mathematics and Statistics at SUNY Stony Brook. His research interests include financial econometrics and engineering, time series modeling and adaptive control, and change-point problems.

]]> Anita Race 1 1267704948 2010-03-04 12:15:48 1475891457 2016-10-08 01:50:57 0 0 event Mean-variance portfolio optimization when means and covariances are unknown

]]>
2010-03-05T13:00:00-05:00 2010-03-05T14:00:00-05:00 2010-03-05T14:00:00-05:00 2010-03-05 18:00:00 2010-03-05 19:00:00 2010-03-05 19:00:00 2010-03-05T13:00:00-05:00 2010-03-05T14:00:00-05:00 America/New_York America/New_York datetime 2010-03-05 01:00:00 2010-03-05 02:00:00 America/New_York America/New_York datetime <![CDATA[]]>
<![CDATA[Likelihood Ratio Methods for Outbreak Detection in Spatial and Spatiotemporal Surveillance]]> 27187 TITLE: Likelihood Ratio Methods for Outbreak Detection in Spatial and Spatiotemporal Surveillance

SPEAKER: Prof. Kwok Tsui

ABSTRACT:

For public health surveillance, timely detection of a rate increase in disease incidence is very important. This talk reviews some popular methods for temporal surveillance and proposes a general framework for spatial and spatiotemporal surveillance based on likelihood ratio statistics over windows of tests. We show that the CUSUM and other popular likelihood ratio statistics are special cases under such a general framework. We compare the efficiency of these surveillance methods in spatial and spatiotemporal cases for detecting clusters of incidence using both Monte Carlo simulations and a real example. We will also discuss the generalization of weighted likelihood ratio tests for detecting different shift magnitudes under homogeneous and non-homogeneous populations.
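The temporal special case mentioned above, the CUSUM, has a one-line recursion: it is the likelihood ratio statistic for a sustained positive mean shift in a unit-variance Gaussian stream. A minimal sketch (reference value `k` and the function name are illustrative):

```python
import numpy as np

def cusum(x, k=0.5):
    """One-sided CUSUM recursion S_t = max(0, S_{t-1} + x_t - k).
    An alarm is raised when S_t crosses a control limit h."""
    s, path = 0.0, []
    for xt in x:
        s = max(0.0, s + xt - k)
        path.append(s)
    return np.array(path)
```

Before a shift the statistic hovers near zero; after a shift of size larger than k it drifts upward roughly linearly, which is what makes the crossing time a quick detector.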

]]> Anita Race 1 1268044150 2010-03-08 10:29:10 1475891457 2016-10-08 01:50:57 0 0 event Likelihood Ratio Methods for Outbreak Detection in Spatial and Spatiotemporal Surveillance

]]>
2010-03-09T11:00:00-05:00 2010-03-09T12:00:00-05:00 2010-03-09T12:00:00-05:00 2010-03-09 16:00:00 2010-03-09 17:00:00 2010-03-09 17:00:00 2010-03-09T11:00:00-05:00 2010-03-09T12:00:00-05:00 America/New_York America/New_York datetime 2010-03-09 11:00:00 2010-03-09 12:00:00 America/New_York America/New_York datetime <![CDATA[]]>
<![CDATA[Manifold Learning: Discovering Nonlinear Variation Patterns in Complex Data Sets]]> 27187 TITLE: Manifold Learning: Discovering Nonlinear Variation Patterns in Complex Data Sets

SPEAKER: Professor Daniel Apley

ABSTRACT:

In statistical analysis and data mining of multivariate data sets, many problems can be viewed as discovering variation patterns in a set of N observations of n variables. The term "variation pattern" refers to the structured, interdependent manner in which the n variables may vary over the N observations. In a very general mathematical representation we view each multivariate observation as a vector in n-dimensional space. Then over the set of N observations, we assume the data consist of a structured component plus noise, where the structured component lies on a p-dimensional manifold with p << n. The objective is to learn, or discover, the manifold based only on the set of data, with no prior knowledge of what to expect. Discovery of the manifold is useful in many different contexts: denoising noisy images and other multivariate data; dimensionality reduction of large data sets; extraction of important features for enhancing subsequent analyses; exploratory analyses for identifying and understanding relationships between variables; etc. In this talk, I will discuss the manifold learning problem, applications, and algorithms. Linear structured manifolds can be easily discovered with standard principal components and factor analyses. Consequently, this talk will focus on discovering nonlinear manifolds, which is a much more challenging and nuanced problem.
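The linear special case noted above, discovering a p-dimensional linear manifold with principal components, amounts to projecting the centered data onto its top-p principal subspace. A minimal sketch (the function name is illustrative):

```python
import numpy as np

def pca_denoise(X, p):
    """Project N x n data onto its top-p principal subspace: the
    structured linear component plus the sample mean."""
    mu = X.mean(axis=0)
    Xc = X - mu
    # right singular vectors = principal directions of the centered data
    U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
    return mu + (Xc @ Vt[:p].T) @ Vt[:p]
```

When the structured component really is linear of dimension p, this projection recovers it exactly in the noise-free case; the nonlinear problem the talk focuses on has no such closed-form solution.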

Bio:  Daniel W. Apley is an Associate Professor of Industrial Engineering & Management Sciences at Northwestern University. His research interests lie at the interface of engineering modeling, statistical analysis, and data mining, with particular emphasis on manufacturing variation reduction applications in which very large amounts of data are available. His research has been supported by numerous industries and government agencies. He received the NSF CAREER award in 2001, the IIE Transactions Best Paper Award in 2003, and the Wilcoxon Prize for best practical application paper appearing in Technometrics in 2008. He currently serves as Editor-in-Chief for the Journal of Quality Technology and has served as Chair of the Quality, Statistics & Reliability Section of INFORMS, Director of the Manufacturing and Design Engineering Program at Northwestern, and Associate Editor for Technometrics.

 

]]> Anita Race 1 1268047149 2010-03-08 11:19:09 1475891457 2016-10-08 01:50:57 0 0 event Manifold Learning: Discovering Nonlinear Variation Patterns in Complex Data Sets

]]>
2010-03-12T12:00:00-05:00 2010-03-12T13:00:00-05:00 2010-03-12T13:00:00-05:00 2010-03-12 17:00:00 2010-03-12 18:00:00 2010-03-12 18:00:00 2010-03-12T12:00:00-05:00 2010-03-12T13:00:00-05:00 America/New_York America/New_York datetime 2010-03-12 12:00:00 2010-03-12 01:00:00 America/New_York America/New_York datetime <![CDATA[]]>
<![CDATA[Modeling of Travelocity's On-line Travel Distribution Business]]> 27187 TITLE: Modeling of Travelocity's On-line Travel Distribution Business

SPEAKER: Barry Smith

ABSTRACT:

Beginning in 2002, Travelocity and the Sabre Research Group collaborated  to develop the Enterprise Network Model (ENM).  The ENM combines discrete choice customer modeling with simulation and large-scale optimization to improve Travelocity's management of its on-line travel distribution business.  The ENM helped Travelocity become a more effective retailer and contributed $43 million per year to Travelocity's bottom line.  This project was recognized in 2005 as an Edelman competition finalist.

]]> Anita Race 1 1266498279 2010-02-18 13:04:39 1475891442 2016-10-08 01:50:42 0 0 event 2010-02-24T16:00:00-05:00 2010-02-24T17:00:00-05:00 2010-02-24T17:00:00-05:00 2010-02-24 21:00:00 2010-02-24 22:00:00 2010-02-24 22:00:00 2010-02-24T16:00:00-05:00 2010-02-24T17:00:00-05:00 America/New_York America/New_York datetime 2010-02-24 04:00:00 2010-02-24 05:00:00 America/New_York America/New_York datetime <![CDATA[]]>
<![CDATA[Mean-variance portfolio optimization when means and covariances are unknown]]> 27187 TITLE: Mean-variance portfolio optimization when means and covariances are unknown

SPEAKER: Haipeng Xing

ABSTRACT:

Markowitz's celebrated mean-variance portfolio optimization theory assumes that the means and covariances of the underlying asset returns are known. In practice, they are unknown and have to be estimated from historical data. Plugging the estimates into the efficient frontier that assumes known parameters has led to portfolios that may perform poorly and have counter-intuitive asset allocation weights; this has been referred to as the "Markowitz optimization enigma." After reviewing different approaches in the literature to address these difficulties, we explain the root cause of the enigma and propose a new approach to resolve it. Not only is the new approach shown to provide substantial improvements over previous methods, but it also allows flexible modeling to incorporate dynamic features and fundamental analysis of the training sample of historical data, as illustrated in simulation and empirical studies. This is joint work with Tze Leung Lai (Stanford University) and Zehao Chen (Bosera Funds).

Short bio: Haipeng Xing graduated from the Department of Statistics at Stanford University in 2005 and then joined the Department of Statistics at Columbia University. In 2008, he moved to the Department of Applied Mathematics and Statistics at SUNY Stony Brook. His research interests include financial econometrics and engineering, time series modeling and adaptive control, and change-point problems.

]]> Anita Race 1 1266399987 2010-02-17 09:46:27 1475891442 2016-10-08 01:50:42 0 0 event 2010-02-26T11:00:00-05:00 2010-02-26T12:00:00-05:00 2010-02-26T12:00:00-05:00 2010-02-26 16:00:00 2010-02-26 17:00:00 2010-02-26 17:00:00 2010-02-26T11:00:00-05:00 2010-02-26T12:00:00-05:00 America/New_York America/New_York datetime 2010-02-26 11:00:00 2010-02-26 12:00:00 America/New_York America/New_York datetime <![CDATA[]]>
<![CDATA[Health Systems Reunion]]> 27187 Health Systems Reunion
Saturday, February 27, 2010, 6:30 PM to 9:30 PM

Hosted by Health Systems Institute
828 W Peachtree Street NE, Atlanta, GA

]]> Anita Race 1 1266498042 2010-02-18 13:00:42 1475891442 2016-10-08 01:50:42 0 0 event 2010-02-27T18:30:00-05:00 2010-02-27T21:30:00-05:00 2010-02-27T21:30:00-05:00 2010-02-27 23:30:00 2010-02-28 02:30:00 2010-02-28 02:30:00 2010-02-27T18:30:00-05:00 2010-02-27T21:30:00-05:00 America/New_York America/New_York datetime 2010-02-27 06:30:00 2010-02-27 09:30:00 America/New_York America/New_York datetime <![CDATA[]]> Lauren Calvert  hsialumni@gmail.com

 

]]>
<![CDATA[Monitoring and Diagnosis of Complex Systems with Multi-stream High Dimensional Sensing Data]]> 27187 TITLE: Monitoring and Diagnosis of Complex Systems with Multi-stream High Dimensional Sensing Data

SPEAKER: Dr. Qingyu Yang, Research Fellow

ABSTRACT:

The wide deployment and application of distributed sensing and computer systems have resulted in multi-stream sensing data, leading to temporally and spatially dense, data-rich environments that provide unprecedented opportunities for improving the operation of complex systems in both manufacturing and healthcare applications. However, they also raise new research challenges in data analysis due to the high-dimensional, temporally and spatially correlated data structure. In this talk, as an example of my research work, I will discuss a critical research issue: how to separate immeasurable embedded individual source signals from indirect mixed sensor measurements. In this research, a hybrid analysis method is proposed by integrating Independent Component Analysis and Sparse Component Analysis. The proposed method can efficiently estimate individual source signals, including both independent signals and dependent signals that have dominant components in the time or linear transform domains. With source signals identified, it is feasible to monitor each source signal directly and provide explicit diagnostic information.

Bio:

Dr. Qingyu Yang is currently a postdoctoral research fellow with the Department of Industrial & Operations Engineering at the University of Michigan, Ann Arbor. He received an M.S. degree in Statistics and a Ph.D. degree in Industrial Engineering from the University of Iowa in 2007 and 2008, respectively. He also holds a B.S. degree in Automatic Control (2000) and an M.S. degree in Intelligent Systems (2003) from the University of Science and Technology of China (USTC). His research interests include distributed sensor systems, information systems, and applied statistics. He was the recipient of the Best Paper Award at the Industrial Engineering Research Conference (IERC) 2009.

]]> Anita Race 1 1266231682 2010-02-15 11:01:22 1475891417 2016-10-08 01:50:17 0 0 event 2010-02-19T12:00:00-05:00 2010-02-19T13:00:00-05:00 2010-02-19T13:00:00-05:00 2010-02-19 17:00:00 2010-02-19 18:00:00 2010-02-19 18:00:00 2010-02-19T12:00:00-05:00 2010-02-19T13:00:00-05:00 America/New_York America/New_York datetime 2010-02-19 12:00:00 2010-02-19 01:00:00 America/New_York America/New_York datetime <![CDATA[]]>
<![CDATA[Op Ed by Ozlem Ergun, Pinar Keskinocak, and Julie Swann in the Atlanta Journal-Constitution]]> 27328 The earthquake in Haiti, the tsunami in Indonesia, and Hurricane Katrina are all events that inspire us to help. In a rush of support, we make donations and expect quick results. With frustration, however, we see that there is a gap between the event and when supplies actually reach people in need. Why? In the case of Haiti, the infrastructure was completely destroyed. The main port and airport were not operational for the first 48 hours, making it impossible for aid to enter the country. Once the airport’s runway was operational, there was chaos in prioritizing which planes should land. Even after aid was on land, debris blocked roads with no available equipment to clear them. This is the conundrum that’s plaguing relief workers and frustrating and confusing those donating money for recovery. To read the full text, click on the link below:
http://www.ajc.com/opinion/logistics-ignored-in-disaster-289549.html

]]> Edie Cohen 1 1265206942 2010-02-03 14:22:22 1475891375 2016-10-08 01:49:35 0 0 event 2010-02-03T00:00:00-05:00 2010-02-03T00:00:00-05:00 2010-02-03T00:00:00-05:00 2010-02-03 05:00:00 2010-02-03 05:00:00 2010-02-03 05:00:00 2010-02-03T00:00:00-05:00 2010-02-03T00:00:00-05:00 America/New_York America/New_York datetime 2010-02-03 12:00:00 2010-02-03 12:00:00 America/New_York America/New_York datetime <![CDATA[]]> Barbara Christopher

Stewart School of Industrial and Systems Engineering

404.385.3102

]]>
<![CDATA[A probabilistic comparison of split, triangle and quadrilateral cuts for two-row mixed-integer programs]]> 27187 TITLE: A probabilistic comparison of split, triangle and quadrilateral cuts for two-row mixed-integer programs

SPEAKER: Qie He

ABSTRACT:

Despite the elegant theory of cuts for mixed-integer programs with two rows and two integer variables derived from triangle and quadrilateral lattice-free bodies, there has been only limited computational evidence that these cuts are capable of giving better results than those derived from simple splits, i.e., Gomory cuts, which are derived from a single row. In this paper, we show that under a certain probabilistic model, a cut coefficient derived from a simple split is likely to be stronger than one derived from a triangle and as strong as one derived from a quadrilateral. Furthermore, we use our probabilistic model to compare the volume cut off from the linear relaxation by a split cut and a type 1 triangle cut. We empirically demonstrate that the probability that the split cut cuts off more volume than the triangle cut is quite high when the number of continuous variables is large.

This is a joint work with Shabbir Ahmed and George Nemhauser.

]]> Anita Race 1 1264692589 2010-01-28 15:29:49 1475891375 2016-10-08 01:49:35 0 0 event 2010-02-03T13:30:00-05:00 2010-02-03T14:30:00-05:00 2010-02-03T14:30:00-05:00 2010-02-03 18:30:00 2010-02-03 19:30:00 2010-02-03 19:30:00 2010-02-03T13:30:00-05:00 2010-02-03T14:30:00-05:00 America/New_York America/New_York datetime 2010-02-03 01:30:00 2010-02-03 02:30:00 America/New_York America/New_York datetime <![CDATA[]]>
<![CDATA[Partial Correlation Estimation by Joint Sparse Regression Models]]> 27187 TITLE: Partial Correlation Estimation by Joint Sparse Regression Models

SPEAKER: Ji Zhu

ABSTRACT:

In this talk, we propose a computationally efficient approach for selecting non-zero partial correlations under the high-dimension-low-sample-size setting. This method assumes the overall sparsity of the partial correlation matrix and employs sparse regression techniques for model fitting. We illustrate the performance of our method by extensive simulation studies. It is shown that our method performs well in both non-zero partial correlation selection and the identification of hub variables, and also outperforms two existing methods. We then apply our method to a microarray breast cancer data set and identify a set of "hub genes" which may provide important insights on genetic regulatory networks. Finally, we prove that, under a set of suitable assumptions, the proposed procedure is asymptotically consistent in terms of model selection and parameter estimation. This is joint work with Jie Peng, Pei Wang and Nengfeng Zhou.
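The quantity being selected above, the partial correlation between two variables given all the others, has a dense closed-form baseline via the inverse sample covariance (precision) matrix; the talk's contribution is replacing this baseline, which fails when dimension exceeds sample size, with joint sparse regressions. A minimal sketch of the baseline (the function name is illustrative):

```python
import numpy as np

def partial_correlations(X):
    """Partial correlations from the inverse sample covariance:
    rho_ij = -p_ij / sqrt(p_ii * p_jj), where P is the precision matrix.
    Requires more samples than variables; the sparse-regression approach
    in the talk is designed for the opposite regime."""
    prec = np.linalg.inv(np.cov(X, rowvar=False))
    d = np.sqrt(np.diag(prec))
    pc = -prec / np.outer(d, d)
    np.fill_diagonal(pc, 1.0)
    return pc
```

A zero entry of the precision matrix corresponds to a zero partial correlation, i.e., conditional independence under a Gaussian model, so "selecting non-zero partial correlations" amounts to recovering the sparsity pattern of this matrix.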

]]> Anita Race 1 1264693529 2010-01-28 15:45:29 1475891375 2016-10-08 01:49:35 0 0 event 2010-02-04T11:00:00-05:00 2010-02-04T12:00:00-05:00 2010-02-04T12:00:00-05:00 2010-02-04 16:00:00 2010-02-04 17:00:00 2010-02-04 17:00:00 2010-02-04T11:00:00-05:00 2010-02-04T12:00:00-05:00 America/New_York America/New_York datetime 2010-02-04 11:00:00 2010-02-04 12:00:00 America/New_York America/New_York datetime <![CDATA[]]>
<![CDATA[Equivalency of Accelerated Life Testing Plans under Different Stress Loadings]]> 27187 TITLE: Equivalency of Accelerated Life Testing Plans under Different Stress Loadings

SPEAKER: Professor E. A. Elsayed

ABSTRACT:

Accelerated Life Testing (ALT) is designed and conducted to obtain failure observations in a short time by subjecting test units to severer than normal operating conditions and use the data for reliability prediction. Many types of stress loadings such as constant-stress, step-stress and cyclic-stress can be utilized when conducting ALT. Extensive research has been conducted on the analysis of ALT data obtained under constant stress loading. However, the equivalency of ALT experiments involving different stress loadings has not been investigated. In this paper, we provide definitions for the equivalency of test plans, general equivalent ALT plans and some special types of equivalent ALT plans are explored. For demonstration, a constant-stress ALT and a ramp-stress ALT for miniature lamps are presented and their equivalency is investigated.

 

 

]]> Anita Race 1 1264693212 2010-01-28 15:40:12 1475891375 2016-10-08 01:49:35 0 0 event 2010-02-05T12:00:00-05:00 2010-02-05T13:00:00-05:00 2010-02-05T13:00:00-05:00 2010-02-05 17:00:00 2010-02-05 18:00:00 2010-02-05 18:00:00 2010-02-05T12:00:00-05:00 2010-02-05T13:00:00-05:00 America/New_York America/New_York datetime 2010-02-05 12:00:00 2010-02-05 01:00:00 America/New_York America/New_York datetime <![CDATA[]]>
<![CDATA[Adversarial Risk Analysis: Games and Auctions]]> 27187 TITLE: Adversarial Risk Analysis:  Games and Auctions

SPEAKER: David Banks

ABSTRACT:

Classical game theory has proven an unreasonable description of human behavior, and traditional analyses make strong assumptions about common knowledge and fixed payoffs. Classical risk analysis has assumed that the opponent is non-adversarial (i.e., "Nature") and is thus inapplicable to many situations. This work explores Bayesian approaches to adversarial risk analysis, in which each opponent must model the decision process of the other, but there is the opportunity to use human judgment and subjective distributions. The approach is illustrated in the analysis of two important applications, sealed-bid auctions and simple poker; some related work on counter-bioterrorism is also covered. The results in these three applications are interestingly different from those found in previous work.

]]> Anita Race 1 1265199611 2010-02-03 12:20:11 1475891375 2016-10-08 01:49:35 0 0 event 2010-02-11T12:00:00-05:00 2010-02-11T13:00:00-05:00 2010-02-11T13:00:00-05:00 2010-02-11 17:00:00 2010-02-11 18:00:00 2010-02-11 18:00:00 2010-02-11T12:00:00-05:00 2010-02-11T13:00:00-05:00 America/New_York America/New_York datetime 2010-02-11 12:00:00 2010-02-11 01:00:00 America/New_York America/New_York datetime <![CDATA[]]>
<![CDATA[A Sparse Signomial Model for Classification and Regression]]> 27187 TITLE: A Sparse Signomial Model for Classification and Regression

SPEAKER: Professor Myong K. (MK) Jeong

ABSTRACT:

Support Vector Machine (SVM) is one of the most popular data mining tools for solving classification and regression problems. Due to its high prediction accuracy, SVM has been successfully used in various fields. However, SVM has the following drawbacks.  First, it is not easy to get an explicit description of the discrimination (or regression) function in the original input space, or to make a variable selection decision in the input space. Second, depending on the magnitude and numeric range of the given data points, the resulting kernel matrices may be ill-conditioned, so learning algorithms may suffer from numerical instability; data scaling generally helps with this kind of issue but is not always effective. Third, the selection of an appropriate kernel type and its parameters can be complex, and it heavily influences the performance of the resulting functions.

To overcome these drawbacks, this talk presents the sparse signomial classification and regression (SSCR) model. SSCR seeks a sparse signomial function by solving a linear program to minimize the weighted sum of the 1-norm of the coefficient vector of the function and the 1-norm of the violation (or loss) caused by the function. SSCR can explore very high-dimensional feature spaces with less sensitivity to the numerical values or numeric ranges of the given data. Moreover, this method gives an explicit description of the resulting function in the original input space, which can be used for prediction as well as interpretation purposes. We present a practical implementation of SSCR based on column generation and explore some theoretical properties of the proposed formulation. A computational study shows that SSCR achieves competitive or even better performance compared to other widely used learning methods for classification and regression.
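The full SSCR formulation involves signomial terms and column generation, which are beyond a short sketch; the hypothetical snippet below illustrates only the core LP idea on plain linear features: minimize a weighted 1-norm of the coefficients plus the 1-norm of the margin violations. The `sparse_l1_classifier` helper and the toy data are invented for illustration, not taken from the talk.

```python
import numpy as np
from scipy.optimize import linprog

def sparse_l1_classifier(X, y, lam=1.0):
    """Fit a sparse linear classifier via the LP
       min lam*||w||_1 + sum(xi)  s.t.  y_i (w.x_i + b) >= 1 - xi_i, xi >= 0.
    Variables: w = u - v with u, v >= 0; b = p - q with p, q >= 0; slacks xi."""
    n, d = X.shape
    # variable order: u (d), v (d), p, q, xi (n)
    c = np.concatenate([lam * np.ones(2 * d), [0.0, 0.0], np.ones(n)])
    Yx = y[:, None] * X
    # rewrite the margin constraint as: -y_i(w.x_i + b) - xi_i <= -1
    A_ub = np.hstack([-Yx, Yx, -y[:, None], y[:, None], -np.eye(n)])
    b_ub = -np.ones(n)
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=(0, None), method="highs")
    w = res.x[:d] - res.x[d:2 * d]
    b = res.x[2 * d] - res.x[2 * d + 1]
    return w, b

# toy separable data: only the first feature is informative
X = np.array([[2.0, 0.1], [3.0, -0.2], [-2.0, 0.3], [-3.0, 0.0]])
y = np.array([1.0, 1.0, -1.0, -1.0])
w, b = sparse_l1_classifier(X, y, lam=0.1)
```

Because the 1-norm penalty acts like an LP-friendly lasso, the uninformative second coefficient is driven exactly to zero, which is the sparsity effect the abstract emphasizes.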

]]> Anita Race 1 1265200218 2010-02-03 12:30:18 1475891375 2016-10-08 01:49:35 0 0 event 2010-02-12T12:00:00-05:00 2010-02-12T13:00:00-05:00 2010-02-12T13:00:00-05:00 2010-02-12 17:00:00 2010-02-12 18:00:00 2010-02-12 18:00:00 2010-02-12T12:00:00-05:00 2010-02-12T13:00:00-05:00 America/New_York America/New_York datetime 2010-02-12 12:00:00 2010-02-12 01:00:00 America/New_York America/New_York datetime <![CDATA[]]>
<![CDATA[Dynamic policies to learn and earn in a customized pricing context]]> 27187 TITLE: Dynamic policies to learn and earn in a customized pricing context

SPEAKER: J. Michael Harrison

ABSTRACT:

Motivated by applications in financial services, we consider the following customized pricing problem.  A seller of some good or service (like auto loans or small business loans) confronts a sequence of potential customers numbered 1, 2, … , T.  These customers are drawn at random from a population characterized by a price-response function r(p).  That is, if the seller offers price p, then the probability of a successful sale is r(p).  The profit realized from a successful sale is π(p) = p − c, where c > 0 is known.

 If the price-response function r(·) were also known, then the problem of finding a price p* to maximize r(p)π(p) would be simple, and the seller would offer price p* to each of the T customers.  We consider the more complicated case where r(·) is fixed but initially unknown: roughly speaking, the seller wants to choose prices sequentially so as to maximize the total profit earned from the T potential customers; each successive choice involves a trade-off between refined estimation of the unknown price-response function (learning) and immediate profit (earning).

 * Joint work with Bora Keskin and Assaf Zeevi
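The talk's actual pricing policies are not spelled out in the abstract; as a hedged illustration of the learn-and-earn trade-off only, the sketch below cycles through a hypothetical price grid to estimate the response curve, then prices greedily on the estimates. The logistic response r(p), the grid, and all parameters are invented.

```python
import numpy as np

rng = np.random.default_rng(0)

def r(p):
    # hypothetical true price-response curve, unknown to the seller
    return 1.0 / (1.0 + np.exp(p - 5.0))

c, T = 1.0, 10_000
prices = np.linspace(1.5, 9.0, 16)      # candidate price grid
offers = np.zeros(len(prices))
sales = np.zeros(len(prices))

profit = 0.0
for t in range(T):
    if t < 50 * len(prices):            # learning phase: cycle the grid
        j = t % len(prices)
    else:                               # earning phase: greedy on estimates
        est = sales / np.maximum(offers, 1.0)
        j = int(np.argmax(est * (prices - c)))
    offers[j] += 1
    if rng.random() < r(prices[j]):     # customer buys with probability r(p)
        sales[j] += 1
        profit += prices[j] - c
```

A fixed exploration budget is the crudest way to resolve the trade-off; the point of the talk is precisely that smarter sequential policies can do better.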

]]> Anita Race 1 1265025538 2010-02-01 11:58:58 1475891375 2016-10-08 01:49:35 0 0 event 2010-02-18T11:00:00-05:00 2010-02-18T12:00:00-05:00 2010-02-18T12:00:00-05:00 2010-02-18 16:00:00 2010-02-18 17:00:00 2010-02-18 17:00:00 2010-02-18T11:00:00-05:00 2010-02-18T12:00:00-05:00 America/New_York America/New_York datetime 2010-02-18 11:00:00 2010-02-18 12:00:00 America/New_York America/New_York datetime <![CDATA[]]>
<![CDATA[Cutting planes: A convex analysis perspective]]> 27187 TITLE: Cutting planes: A convex analysis perspective

SPEAKER: Dr. Gerard Cornuejols

ABSTRACT:

This talk will be based on joint work with Borozan, Basu, Conforti and Zambelli. We extend a theorem of Lovasz characterizing maximal lattice-free convex sets. This result has implications in integer programming. In particular we show a one-to-one correspondence between these sets and minimal inequalities.

]]> Anita Race 1 1263999032 2010-01-20 14:50:32 1475891372 2016-10-08 01:49:32 0 0 event 2010-03-02T11:00:00-05:00 2010-03-02T12:00:00-05:00 2010-03-02T12:00:00-05:00 2010-03-02 16:00:00 2010-03-02 17:00:00 2010-03-02 17:00:00 2010-03-02T11:00:00-05:00 2010-03-02T12:00:00-05:00 America/New_York America/New_York datetime 2010-03-02 11:00:00 2010-03-02 12:00:00 America/New_York America/New_York datetime <![CDATA[]]>
<![CDATA[Terror Queues]]> 27187 TITLE: Terror Queues

SPEAKER: Professor Edward Kaplan

ABSTRACT:

This article presents the first model developed specifically for understanding the infiltration and interdiction of ongoing terror plots by undercover intelligence agents, and does so via novel application of ideas from queueing theory and Markov population processes. The resulting "terror queue" models predict the number of undetected terror threats in an area from agent activity/utilization data, and also estimate the rate with which such threats can be interdicted.  The models treat terror plots as customers and intelligence agents as servers. Agents spend all of their time either detecting and infiltrating new terror plots (in which case they are "available"), or interdicting already detected terror plots (in which case they are "busy"). Initially we examine a Markov model assuming that intelligence agents, while unable to detect all plots, never err by falsely detecting fake plots.  While this model can be solved numerically, a simpler Ornstein-Uhlenbeck diffusion approximation yields some results in closed form while providing nearly identical numerical performance.  The transient behavior of the terror queue model is discussed briefly along with a sample sensitivity analysis to study how model predictions compare to simulated results when using estimated versus known terror plot arrival rates. The diffusion model is then extended to allow for the false detection of fake plots. Such false detection is a real feature of counterterror intelligence given that intelligence agents or informants can make mistakes, as well as the proclivity of terrorists to deliberately broadcast false information. The false detection model is illustrated using suicide bombing data from Israel.
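The abstract does not give the model's exact transition rates, so the Gillespie-style simulation below is only a guessed mass-action variant in the same spirit: plots arrive at rate lam, each available agent detects each undetected plot at rate mu, and busy agents complete interdictions at rate delta. All rate values and the agent count are illustrative, not from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
lam, mu, delta = 2.0, 1.0, 4.0   # arrival, per-agent-per-plot detection, interdiction rates
m = 5                             # number of intelligence agents (assumed)
x, busy, t = 0, 0, 0.0            # undetected plots, busy agents, simulation clock

samples = []
while t < 1000.0:
    avail = m - busy
    rates = np.array([lam,              # a new plot arrives undetected
                      mu * avail * x,   # an available agent detects a plot
                      delta * busy])    # a busy agent finishes an interdiction
    total = rates.sum()
    t += rng.exponential(1.0 / total)   # time to next event
    ev = rng.choice(3, p=rates / total) # which event fires
    if ev == 0:
        x += 1
    elif ev == 1:
        x -= 1
        busy += 1
    else:
        busy -= 1
    samples.append(x)

mean_undetected = float(np.mean(samples))
```

The long-run average of `samples` plays the role of the model's predicted number of undetected threats; the paper's Ornstein-Uhlenbeck approximation replaces this simulation with a diffusion in closed form.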

 

]]> Anita Race 1 1263896121 2010-01-19 10:15:21 1475891372 2016-10-08 01:49:32 0 0 event 2010-03-03T11:00:00-05:00 2010-03-03T12:00:00-05:00 2010-03-03T12:00:00-05:00 2010-03-03 16:00:00 2010-03-03 17:00:00 2010-03-03 17:00:00 2010-03-03T11:00:00-05:00 2010-03-03T12:00:00-05:00 America/New_York America/New_York datetime 2010-03-03 11:00:00 2010-03-03 12:00:00 America/New_York America/New_York datetime <![CDATA[]]>
<![CDATA[Communication Based Train Control (CBTC) System and Future Cyber-Physical Rail Systems]]> 27187 TITLE: Communication Based Train Control (CBTC) System and Future Cyber-Physical Rail Systems

SPEAKER: Dr. Tao Tang

ABSTRACT:

The global demand for rail services continues to outpace the available capacity and the infrastructure.  At the same time, aging systems and traditional business practices significantly limit our ability to address these challenges. In order to accommodate the significant increase in demand for rail services, it is important to increase the capacity of the railway network.  Another paramount issue in rail service is safety: it is essential to design and operate a system that can provide reliable and efficient communication and control of railway signals and eliminate any potential dangers in the system.

 The Communication Based Train Control (CBTC) system uses radio signals to transmit real-time control information between the train and the wayside. With the help of the CBTC system, the train continually monitors its position and operation status, which significantly improves the dependability and increases the capacity of a railway network. Currently, the CBTC system is being adopted by more and more railway networks in the world.  Cyber-physical rail systems based on CBTC, with enhanced information collection, Internet-based communications, and improved safety and efficiency, will be the future trend in railway communication and safety.

 This presentation will introduce some key technologies in the CBTC system. The framework, research challenges, and opportunities of cyber-physical rail systems will also be discussed.

]]> Anita Race 1 1263465351 2010-01-14 10:35:51 1475891372 2016-10-08 01:49:32 0 0 event 2010-01-15T12:00:00-05:00 2010-01-15T13:00:00-05:00 2010-01-15T13:00:00-05:00 2010-01-15 17:00:00 2010-01-15 18:00:00 2010-01-15 18:00:00 2010-01-15T12:00:00-05:00 2010-01-15T13:00:00-05:00 America/New_York America/New_York datetime 2010-01-15 12:00:00 2010-01-15 01:00:00 America/New_York America/New_York datetime <![CDATA[]]>
<![CDATA[Stochastic dynamic predictions using Gaussian process models for nanoparticle synthesis]]> 27187 TITLE: Stochastic dynamic predictions using Gaussian process models for nanoparticle synthesis

SPEAKER: Andres Felipe Hernandez Moreno and Professor Martha Grover

ABSTRACT:

The Gaussian process model is an empirical modeling approach that has been widely applied in engineering for the approximation of deterministic functions, due to its flexibility and ability to interpolate observed data. Despite their attractive statistical properties, Gaussian process models (GPMs) have not been employed to describe the dynamics of complex stochastic systems such as nanoscale phenomena. This presentation describes a methodology for constructing approximate models of multivariate stochastic dynamic simulations using GPMs, combining ideas from design of experiments, spatial statistics and dynamic systems modeling. In particular, the effect of sampling strategies on the identification and prediction of the GPM is analyzed in detail. The methodology is applied to the prediction of a dynamic size distribution during the synthesis of platinum nanoparticles under supercritical CO_2 conditions.
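As background for readers unfamiliar with Gaussian process models, the minimal numpy sketch below interpolates a deterministic response with a squared-exponential kernel and a small jitter term. It is generic GP regression on an invented one-dimensional toy function, not the talk's multivariate stochastic-dynamics methodology.

```python
import numpy as np

def rbf(A, B, ell=1.0):
    # squared-exponential (RBF) kernel matrix between point sets A and B
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-0.5 * d2 / ell ** 2)

X = np.linspace(0.0, 10.0, 30)[:, None]   # design points from the "experiment"
y = np.sin(X).ravel()                      # deterministic simulator response (toy)

K = rbf(X, X) + 1e-8 * np.eye(len(X))      # small jitter for numerical stability
alpha = np.linalg.solve(K, y)              # kernel weights of the posterior mean

Xs = np.linspace(0.0, 10.0, 200)[:, None]
mean = rbf(Xs, X) @ alpha                  # GP posterior mean (zero prior mean)
```

The interpolation property mentioned in the abstract shows up here directly: at the design points the posterior mean reproduces the observed responses up to the jitter.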

]]> Anita Race 1 1263895707 2010-01-19 10:08:27 1475891372 2016-10-08 01:49:32 0 0 event 2010-01-22T12:00:00-05:00 2010-01-22T13:00:00-05:00 2010-01-22T13:00:00-05:00 2010-01-22 17:00:00 2010-01-22 18:00:00 2010-01-22 18:00:00 2010-01-22T12:00:00-05:00 2010-01-22T13:00:00-05:00 America/New_York America/New_York datetime 2010-01-22 12:00:00 2010-01-22 01:00:00 America/New_York America/New_York datetime <![CDATA[]]>
<![CDATA[Analysis of Large-Scale Computer Experiments]]> 27187 TITLE: Analysis of Large-Scale Computer Experiments

SPEAKER: Lulu Kang
PhD Candidate  (in the Statistics Program)
School of Industrial and Systems Engineering
Georgia Tech

ABSTRACT:

Computer experiments simulate engineering systems by implementing the mathematical models governing the systems in computers. Recently, experiments with large numbers of input variables and experimental runs have started to emerge. In the existing literature, kriging has been commonly used for approximating complex computer models, but it has limitations for dealing with large-scale experiments due to its computational complexity and numerical instability. In this work, I propose a new modeling approach known as regression-based inverse distance weighting (RIDW). The new predictor is shown to be computationally more efficient than kriging while producing comparable prediction performance. We also develop a heuristic method for constructing confidence intervals for prediction. I will also discuss extensions of RIDW and my future research directions on this exciting topic.
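The regression-based extension in RIDW is not detailed in the abstract; the sketch below implements only the classical inverse-distance-weighting predictor it builds on, with invented toy data. Unlike kriging, no matrix factorization is needed, which is the source of the computational advantage.

```python
import numpy as np

def idw_predict(X, y, Xnew, power=2.0):
    """Plain inverse-distance-weighting prediction: each new point gets a
    weighted average of observed responses, weights = distance^(-power)."""
    d = np.linalg.norm(Xnew[:, None, :] - X[None, :, :], axis=-1)
    d = np.maximum(d, 1e-12)              # guard against division by zero
    w = d ** (-power)
    return (w @ y) / w.sum(axis=1)

rng = np.random.default_rng(0)
X = rng.uniform(0, 1, (50, 3))            # toy design in 3 input variables
y = X.sum(axis=1)                         # toy deterministic response
Xnew = rng.uniform(0, 1, (5, 3))
pred = idw_predict(X, y, Xnew)
```

Like kriging, the predictor interpolates: as a new point approaches a design point, its weight dominates and the prediction converges to the observed response there.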
Bio:
Lulu Kang is a Ph.D. candidate in the Statistics Program of the School of Industrial and Systems Engineering at Georgia Institute of Technology. She is working with Professor Roshan J. Vengazhiyil. Her research interests are in developing statistical theories and methodologies, as well as their applications in physical science and engineering.

]]> Anita Race 1 1264493424 2010-01-26 08:10:24 1475891372 2016-10-08 01:49:32 0 0 event 2010-01-28T11:00:00-05:00 2010-01-28T12:00:00-05:00 2010-01-28T12:00:00-05:00 2010-01-28 16:00:00 2010-01-28 17:00:00 2010-01-28 17:00:00 2010-01-28T11:00:00-05:00 2010-01-28T12:00:00-05:00 America/New_York America/New_York datetime 2010-01-28 11:00:00 2010-01-28 12:00:00 America/New_York America/New_York datetime <![CDATA[]]>
<![CDATA[Bahar Biller, Carnegie Mellon University]]> 27318 Speaker
Bahar Biller
Tepper School of Business
Carnegie Mellon University

Abstract
(To be announced)

]]> Shuangchi He 1 1268145165 2010-03-09 14:32:45 1475891372 2016-10-08 01:49:32 0 0 event 2010-03-09T11:00:00-05:00 2010-03-09T12:00:00-05:00 2010-03-09T12:00:00-05:00 2010-03-09 16:00:00 2010-03-09 17:00:00 2010-03-09 17:00:00 2010-03-09T11:00:00-05:00 2010-03-09T12:00:00-05:00 America/New_York America/New_York datetime 2010-03-09 11:00:00 2010-03-09 12:00:00 America/New_York America/New_York datetime <![CDATA[]]> Shuangchi He
H. Milton Stewart School of Industrial and Systems Engineering
Contact Shuangchi He

]]>
<![CDATA[Small Systems Biology]]> 27187 TITLE: Small Systems Biology

SPEAKER: Professor Eberhard O. Voit

ABSTRACT:

The combination of high-throughput methods of molecular biology with advanced mathematical and computational techniques has propelled the emergent field of systems biology into a position of prominence. Unthinkable only a decade ago, it has become possible to screen and analyze the expression of entire genomes, simultaneously assess large numbers of proteins and their prevalence, and characterize in detail the metabolic state of a cell population. While very important, the focus on comprehensive networks of biological components is only one side of systems biology. Complementing large-scale assessments, and sometimes at risk of being forgotten, are more subtle analyses that rationalize the design and functioning of biological modules in exquisite detail. This intricate side of systems biology aims at identifying the specific roles of processes and signals in smaller, fully regulated systems by computing what would happen if these signals were lacking or organized in a different fashion. I will exemplify this type of approach with two examples. The first is a detailed analysis of the regulation of glucose utilization in Lactococcus lactis. This organism is exposed to alternating periods of glucose availability and starvation. During starvation, it accumulates an intermediate of glycolysis, which allows it to take up glucose immediately upon availability. This notable accumulation poses a non-trivial control task that is solved with an unusual, yet ingeniously designed and timed feedforward activation system. The elucidation of this control system required high-precision in vivo data on the dynamics of intracellular metabolite pools, combined with methods of nonlinear systems analysis, and may serve as a paradigm for multidisciplinary approaches to fine-scaled systems biology. The second example describes our attempts to understand signal transduction in the human brain, along with perturbations in diseases like Parkinson's disease and schizophrenia.


References:

Voit, E.O.: Computational Analysis of Biochemical Systems. A Practical Guide for Biochemists and Molecular Biologists, xii + 530 pp., Cambridge University Press, Cambridge, U.K., 2000.

Voit, E.O., A.R. Neves, and H. Santos. The Intricate Side of Systems Biology. PNAS, 103(25), 9452-9457, 2006.

Qi, Z., G. W. Miller, and E. O. Voit: Computational analysis of determinants of dopamine dysfunction. Synapse 63: 1133-1142, 2009.

Wu, Jialiang, Z. Qi, and E.O. Voit: Investigation of delays and noise in dopamine signaling with hybrid functional Petri nets. In Silico Biol. 10, 0005 (2010).

]]> Anita Race 1 1264493158 2010-01-26 08:05:58 1475891372 2016-10-08 01:49:32 0 0 event 2010-01-29T12:00:00-05:00 2010-01-29T13:00:00-05:00 2010-01-29T13:00:00-05:00 2010-01-29 17:00:00 2010-01-29 18:00:00 2010-01-29 18:00:00 2010-01-29T12:00:00-05:00 2010-01-29T13:00:00-05:00 America/New_York America/New_York datetime 2010-01-29 12:00:00 2010-01-29 01:00:00 America/New_York America/New_York datetime <![CDATA[]]>
<![CDATA[Business Analytics and Optimization]]> 27187 TITLE: Business Analytics and Optimization

SPEAKER: Dr. William Pulleyblank

ABSTRACT:

 In 2009, IBM launched a major activity in business analytics and optimization.  This includes operations research, data analytics, statistical analysis, business optimization and risk management.  I will review this and discuss how it fits with our Smarter Planet strategic initiative.  I will focus on the types of client problems that are being attacked as well as the types of activities that are taking place.  I will also discuss the disciplines that are being combined in this activity as well as some of the major challenges we face.

]]> Anita Race 1 1264063984 2010-01-21 08:53:04 1475891372 2016-10-08 01:49:32 0 0 event 2010-02-02T11:00:00-05:00 2010-02-02T12:00:00-05:00 2010-02-02T12:00:00-05:00 2010-02-02 16:00:00 2010-02-02 17:00:00 2010-02-02 17:00:00 2010-02-02T11:00:00-05:00 2010-02-02T12:00:00-05:00 America/New_York America/New_York datetime 2010-02-02 11:00:00 2010-02-02 12:00:00 America/New_York America/New_York datetime <![CDATA[]]>
<![CDATA[Directed Regression]]> 27187 TITLE: Directed Regression

SPEAKER: Professor Ben Van Roy

ABSTRACT:

When used to guide decisions, linear regression analysis typically  involves estimation of regression coefficients via ordinary least  squares and their subsequent use in an optimization problem. When  features are not chosen perfectly, it can be beneficial to account for  the decision objective when computing regression coefficients.  Empirical optimization does so but sacrifices performance when  features are well-chosen or training data are insufficient. We propose  directed regression, an efficient algorithm that combines merits of  ordinary least squares and empirical optimization. We demonstrate  through computational studies that directed regression generates  performance gains over either alternative. We also develop a theory  that motivates the algorithm.

]]> Anita Race 1 1263996331 2010-01-20 14:05:31 1475891372 2016-10-08 01:49:32 0 0 event 2010-02-09T11:00:00-05:00 2010-02-09T12:00:00-05:00 2010-02-09T12:00:00-05:00 2010-02-09 16:00:00 2010-02-09 17:00:00 2010-02-09 17:00:00 2010-02-09T11:00:00-05:00 2010-02-09T12:00:00-05:00 America/New_York America/New_York datetime 2010-02-09 11:00:00 2010-02-09 12:00:00 America/New_York America/New_York datetime <![CDATA[]]>
<![CDATA[A Dynamic Near-Optimal Algorithm for Online Linear Programming]]> 27187 TITLE: A Dynamic Near-Optimal Algorithm for Online Linear Programming

SPEAKER: Professor Yinyu Ye

ABSTRACT:

A natural optimization model that formulates many online resource allocation and revenue management problems is the online linear program (LP), where the constraint matrix is revealed column by column along with the objective function. We provide a near-optimal algorithm for this surprisingly general class of online problems under the assumption of random order of arrival and some mild conditions on the size of the LP right-hand-side input. Our learning-based algorithm works by dynamically updating a threshold price vector at geometric time intervals, where the dual prices learned from revealed columns in the previous period are used to determine the sequential decisions in the current period. Our algorithm has a feature of learning by doing, and the prices are updated at a carefully chosen pace that is neither too fast nor too slow. In particular, our algorithm does not assume any distribution information on the input itself, and thus is robust to data uncertainty and variations thanks to its dynamic learning capability. Applications of our algorithm include many online multi-resource allocation and multi-product revenue management problems such as online routing and packing, online combinatorial auctions, adwords matching, inventory control and yield management.

Joint work with Shipra Agrawal and Zizhuo Wang.
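A one-resource caricature of the threshold-price idea can be sketched as follows: solve the LP on an initial fraction of the columns with a proportionally scaled budget, read off the budget constraint's dual value as a threshold price, and accept later columns whose profit beats that price. The actual algorithm re-estimates prices at geometric intervals and handles multiple resources; this one-shot version, with invented data and ignoring the budget consumed during the learning phase, is only illustrative.

```python
import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(1)
T, B = 1000, 200.0                   # horizon and total budget (illustrative)
pi = rng.uniform(0.0, 1.0, T)        # profit of column t, revealed online
a = rng.uniform(0.5, 1.0, T)         # resource use of column t

# Learning phase: solve the LP restricted to the first eps*T columns with a
# proportionally scaled budget, and read the budget row's dual price.
eps = 0.1
k = int(eps * T)
res = linprog(-pi[:k], A_ub=a[:k][None, :], b_ub=[eps * B],
              bounds=(0, 1), method="highs")
p_hat = -res.ineqlin.marginals[0]    # shadow price of the budget constraint

# Earning phase: accept a column iff its profit beats the threshold price.
used, profit = 0.0, 0.0
for t in range(k, T):
    if pi[t] >= p_hat * a[t] and used + a[t] <= B:
        used += a[t]
        profit += pi[t]
```

The "neither too fast nor too slow" pacing in the abstract refers to how often `p_hat` would be recomputed as more columns arrive; here it is computed once, which is the simplest possible policy.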

]]> Anita Race 1 1263981607 2010-01-20 10:00:07 1475891372 2016-10-08 01:49:32 0 0 event 2010-02-23T11:00:00-05:00 2010-02-23T12:00:00-05:00 2010-02-23T12:00:00-05:00 2010-02-23 16:00:00 2010-02-23 17:00:00 2010-02-23 17:00:00 2010-02-23T11:00:00-05:00 2010-02-23T12:00:00-05:00 America/New_York America/New_York datetime 2010-02-23 11:00:00 2010-02-23 12:00:00 America/New_York America/New_York datetime <![CDATA[]]>
<![CDATA[Gerard Cornuejols, Carnegie Mellon Univeristy]]> 27279 Speaker
Gerard Cornuejols
Carnegie Mellon University

Abstract
(To be announced)

]]> Barbara Christopher 1 1266503591 2010-02-18 14:33:11 1475891368 2016-10-08 01:49:28 0 0 event 2010-03-02T11:00:00-05:00 2010-03-02T12:00:00-05:00 2010-03-02T12:00:00-05:00 2010-03-02 16:00:00 2010-03-02 17:00:00 2010-03-02 17:00:00 2010-03-02T11:00:00-05:00 2010-03-02T12:00:00-05:00 America/New_York America/New_York datetime 2010-03-02 11:00:00 2010-03-02 12:00:00 America/New_York America/New_York datetime <![CDATA[]]> Renato Monteiro
ISyE
Contact Renato Monteiro
404-894-2300

]]>
<![CDATA[2010 Health and Humanitarian Logistics Conference]]> 27328 The Second Annual Conference on Health and Humanitarian Logistisc will be held on March 4-5, 2010 on the campus of the Georgia Institute of Technology in Atlanta, Georgia.

For program and registration information, visit the conference website at www.scl.gatech.edu/humlog2010/.

The conference will bring together speakers and attendees from non-government organizations (NGOs), industry, government, military, and academia focusing on various topics, including planning, preparing, and responding to disasters; recovery and mitigation; addressing major societal problems related to health; and short- and long-term humanitarian response. The main objectives of the conference are to articulate the opportunities and challenges in humanitarian response and world health, both from a humanitarian and a corporate/economic perspective, to identify important research issues, to create academic awareness for the research opportunities, and to establish priorities for NGOs, corporations, and the government in terms of their strategies, policies, and investments.

The Conference is hosted by the Center for Humanitarian Logistics (http://www.scl.gatech.edu/research/humanitarian/), which is under the umbrella of Georgia Tech's Supply Chain and Logistics Institute (SCL).

To access presentation slides, speaker biographies, and other conference information for the 2009 Conference, visit: http://www2.isye.gatech.edu/humlog09/

]]> Edie Cohen 1 1265126192 2010-02-02 15:56:32 1475891368 2016-10-08 01:49:28 0 0 event 2010-03-04T09:00:00-05:00 2010-03-05T18:00:00-05:00 2010-03-05T18:00:00-05:00 2010-03-04 14:00:00 2010-03-05 23:00:00 2010-03-05 23:00:00 2010-03-04T09:00:00-05:00 2010-03-05T18:00:00-05:00 America/New_York America/New_York datetime 2010-03-04 09:00:00 2010-03-05 06:00:00 America/New_York America/New_York datetime <![CDATA[]]> Humanitarian Logistics Team
ISyE
Contact Humanitarian Logistics Team
404-894-2325

]]>
<![CDATA[Jose Blanchet, Columbia University]]> 27279 Speaker
Jose Blanchet
Columbia University

Abstract
Our focus is on the development of provably efficient simulation algorithms for estimating large deviations probabilities (such as overflow probabilities) in the context of many server queues. These types of systems, which have been the subject of much investigation in recent years, pose interesting challenges from a rare event simulation standpoint, given their measure valued state descriptor. We shall explain a technique that has the following elements. First, it introduces a pivotal set that is suitably chosen to deal with boundary-type behavior, which is common in the analysis of queueing systems. Second, it takes advantage of Central Limit Theorem approximations that have been developed recently for these types of systems, and third, it uses a novel bridge-sampling approach in order to describe an asymptotically optimal (in a certain sense) importance sampling scheme. This work provides the first systematic approach to develop provably efficient rare-event simulation methodology for these types of systems.

Bio
Jose Blanchet is a faculty member of the IEOR at Columbia University. Jose holds a Ph.D. in Management Science and Engineering from Stanford University. Prior to joining Columbia he was a faculty member in the Statistics Department at Harvard University. Jose is a recipient of the 2009 Best Publication Award given by the INFORMS Applied Probability Society and a CAREER award in Operations Research given by NSF in 2008. He worked as an analyst in Protego Financial Advisors, a leading investment bank in Mexico. He has research interests in applied probability and Monte Carlo methods. He serves in the editorial board of Advances in Applied Probability, Journal of Applied Probability, QUESTA and TOMACS.

]]> Barbara Christopher 1 1266503396 2010-02-18 14:29:56 1475891368 2016-10-08 01:49:28 0 0 event 2010-01-26T11:00:00-05:00 2010-01-26T12:00:00-05:00 2010-01-26T12:00:00-05:00 2010-01-26 16:00:00 2010-01-26 17:00:00 2010-01-26 17:00:00 2010-01-26T11:00:00-05:00 2010-01-26T12:00:00-05:00 America/New_York America/New_York datetime 2010-01-26 11:00:00 2010-01-26 12:00:00 America/New_York America/New_York datetime <![CDATA[]]> Ton Dieker
ISyE
Contact Ton Dieker
404-385-3140

]]>
<![CDATA[Sheldon Ross, University of Southern California]]> 27279 Speaker
Sheldon Ross
Epstein Chair Professor
Industrial and Systems Engineering
University of Southern California

Abstract
(To be announced)

]]> Barbara Christopher 1 1266503817 2010-02-18 14:36:57 1475891368 2016-10-08 01:49:28 0 0 event Joint Statistics/OR Colloquium (To be announced)

]]>
2010-03-16T12:00:00-04:00 2010-03-16T13:00:00-04:00 2010-03-16T13:00:00-04:00 2010-03-16 16:00:00 2010-03-16 17:00:00 2010-03-16 17:00:00 2010-03-16T12:00:00-04:00 2010-03-16T13:00:00-04:00 America/New_York America/New_York datetime 2010-03-16 12:00:00 2010-03-16 01:00:00 America/New_York America/New_York datetime <![CDATA[]]> Ton Dieker
ISyE
Contact Ton Dieker
404-385-3140

]]>
<![CDATA[Ben Van Roy, Stanford University]]> 27279 Speaker
Ben Van Roy
Stanford University

Abstract
When used to guide decisions, linear regression analysis typically involves estimation of regression coefficients via ordinary least squares and their subsequent use in an optimization problem. When features are not chosen perfectly, it can be beneficial to account for the decision objective when computing regression coefficients. Empirical optimization does so but sacrifices performance when features are well-chosen or training data are insufficient. We propose directed regression, an efficient algorithm that combines merits of ordinary least squares and empirical optimization. We demonstrate through computational studies that directed regression generates performance gains over either alternative. We also develop a theory that motivates the algorithm.

]]> Barbara Christopher 1 1266503490 2010-02-18 14:31:30 1475891368 2016-10-08 01:49:28 0 0 event 2010-02-09T11:00:00-05:00 2010-02-09T12:00:00-05:00 2010-02-09T12:00:00-05:00 2010-02-09 16:00:00 2010-02-09 17:00:00 2010-02-09 17:00:00 2010-02-09T11:00:00-05:00 2010-02-09T12:00:00-05:00 America/New_York America/New_York datetime 2010-02-09 11:00:00 2010-02-09 12:00:00 America/New_York America/New_York datetime <![CDATA[]]> Ton Dieker
ISyE
Contact Ton Dieker
404-385-3140

]]>
<![CDATA[Mike Harrison, Stanford University]]> 27279 Speaker
Michael Harrison
Adams Distinguished Professor of Management
Stanford University

Abstract
Motivated by applications in financial services, we consider the following customized pricing problem. A seller of some good or service (like auto loans or small business loans) confronts a sequence of potential customers numbered 1, 2, … , T. These customers are drawn at random from a population characterized by a price-response function ρ(p). That is, if the seller offers price p, then the probability of a successful sale is ρ(p). The profit realized from a successful sale is π(p) = p − c, where c > 0 is known.

If the price-response function ρ(·) were also known, then the problem of finding a price p* to maximize ρ(p)π(p) would be simple, and the seller would offer price p* to each of the T customers. We consider the more complicated case where ρ(·) is fixed but initially unknown: roughly speaking, the seller wants to choose prices sequentially so as to maximize the total profit earned from the T potential customers; each successive choice involves a trade-off between refined estimation of the unknown price-response function (learning) and immediate profit (earning).

* Joint work with Bora Keskin and Assaf Zeevi

]]> Barbara Christopher 1 1266503446 2010-02-18 14:30:46 1475891368 2016-10-08 01:49:28 0 0 event 2010-02-18T11:00:00-05:00 2010-02-18T12:00:00-05:00 2010-02-18T12:00:00-05:00 2010-02-18 16:00:00 2010-02-18 17:00:00 2010-02-18 17:00:00 2010-02-18T11:00:00-05:00 2010-02-18T12:00:00-05:00 America/New_York America/New_York datetime 2010-02-18 11:00:00 2010-02-18 12:00:00 America/New_York America/New_York datetime <![CDATA[]]> Ton Dieker
ISyE
Contact Ton Dieker
404-385-3140

]]>
<![CDATA[Operations Research Seminar: Adrian Lewis, Cornell University]]> 27279 Speaker
Adrian Lewis
Cornell University

Abstract
(To be announced)

]]> Barbara Christopher 1 1266503859 2010-02-18 14:37:39 1475891368 2016-10-08 01:49:28 0 0 event (To be announced)

]]>
2010-04-06T12:00:00-04:00 2010-04-06T13:00:00-04:00 2010-04-06T13:00:00-04:00 2010-04-06 16:00:00 2010-04-06 17:00:00 2010-04-06 17:00:00 2010-04-06T12:00:00-04:00 2010-04-06T13:00:00-04:00 America/New_York America/New_York datetime 2010-04-06 12:00:00 2010-04-06 01:00:00 America/New_York America/New_York datetime <![CDATA[]]> Renato Monteiro
ISyE
Contact Renato Monteiro
404-894-2300

]]>
<![CDATA[Yinyu Ye, Stanford University]]> 27279 Speaker
Yinyu Ye, Stanford University

Abstract
(To be announced)

]]> Barbara Christopher 1 1266503539 2010-02-18 14:32:19 1475891368 2016-10-08 01:49:28 0 0 event 2010-02-23T11:00:00-05:00 2010-02-23T12:00:00-05:00 2010-02-23T12:00:00-05:00 2010-02-23 16:00:00 2010-02-23 17:00:00 2010-02-23 17:00:00 2010-02-23T11:00:00-05:00 2010-02-23T12:00:00-05:00 America/New_York America/New_York datetime 2010-02-23 11:00:00 2010-02-23 12:00:00 America/New_York America/New_York datetime <![CDATA[]]> Renato Monteiro
ISyE
Contact Renato Monteiro
404-894-2300

]]>
<![CDATA[ISyE Advisory Board Meeting]]> 27187 TITLE: ISyE Advisory Board Meeting

The H. Milton Stewart Advisory Board comprises distinguished professionals and community leaders, who serve as a sounding board for the school chair and assist with the School's development goals.

The meeting will be held at the Georgia Tech Hotel and Conference Center from 11:00 am - 5:00 pm, with a working lunch and dinner to follow.

]]> Anita Race 1 1266934428 2010-02-23 14:13:48 1475891363 2016-10-08 01:49:23 0 0 event 2010-04-22T12:00:00-04:00 2010-04-22T18:00:00-04:00 2010-04-22T18:00:00-04:00 2010-04-22 16:00:00 2010-04-22 22:00:00 2010-04-22 22:00:00 2010-04-22T12:00:00-04:00 2010-04-22T18:00:00-04:00 America/New_York America/New_York datetime 2010-04-22 12:00:00 2010-04-22 06:00:00 America/New_York America/New_York datetime <![CDATA[]]> Lisa Johnson
ISyE
Contact Lisa Johnson
404-894-2303

]]>
<![CDATA[Rare Event Simulation for Many Server Queues]]> 27187 TITLE:  Rare Event Simulation for Many Server Queues

SPEAKER: Dr. Jose Blanchet

ABSTRACT:

Our focus is on the development of provably efficient simulation algorithms for estimating large deviations probabilities (such as overflow probabilities) in the context of many-server queues. These types of systems, which have been the subject of much investigation in recent years, pose interesting challenges from a rare-event simulation standpoint, given their measure-valued state descriptor. We shall explain a technique that has the following elements. First, it introduces a pivotal set that is suitably chosen to deal with boundary-type behavior, which is common in the analysis of queueing systems. Second, it takes advantage of Central Limit Theorem approximations that have been developed recently for these types of systems. Third, it uses a novel bridge-sampling approach in order to describe an asymptotically optimal (in a certain sense) importance sampling scheme. This work provides the first systematic approach to developing provably efficient rare-event simulation methodology for these types of systems.

(Joint work with P. Glynn and H. Lam)
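As a much simplified illustration of the importance-sampling idea, consider a single-server M/M/1 overflow problem rather than the measure-valued many-server setting the talk addresses: the classic asymptotically optimal change of measure there simply swaps the arrival and service rates. All parameter values below are illustrative, not from the talk.

```python
import random

lam, mu, level = 0.3, 0.7, 15   # arrival rate, service rate, overflow level

def is_estimate(n_runs, rng):
    """Estimate P(queue reaches `level` before emptying | one customer present)."""
    p_tilted = mu / (lam + mu)            # tilted up-step probability (rates swapped)
    weight = (lam / mu) ** (level - 1)    # likelihood ratio on every overflow path
    hits = 0
    for _ in range(n_runs):
        x = 1
        while 0 < x < level:
            x += 1 if rng.random() < p_tilted else -1
        if x == level:
            hits += 1
    return weight * hits / n_runs

rng = random.Random(2024)
est = is_estimate(20_000, rng)

# Exact value from the gambler's-ruin formula, for comparison.
r = mu / lam
exact = (1 - r) / (1 - r ** level)
```

Because the likelihood ratio is constant on overflow paths, the estimator has bounded relative error here; the many-server case requires the far more delicate machinery the abstract describes.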

Bio:

Jose Blanchet is a faculty member of the IEOR Department at Columbia University. Jose holds a Ph.D. in Management Science and Engineering from Stanford University. Prior to joining Columbia he was a faculty member in the Statistics Department at Harvard University. Jose is a recipient of the 2009 Best Publication Award given by the INFORMS Applied Probability Society and a CAREER award in Operations Research given by NSF in 2008. He worked as an analyst at Protego Financial Advisors, a leading investment bank in Mexico. His research interests are in applied probability and Monte Carlo methods. He serves on the editorial boards of Advances in Applied Probability, the Journal of Applied Probability, QUESTA, and TOMACS.

]]> Anita Race 1 1262609232 2010-01-04 12:47:12 1475891363 2016-10-08 01:49:23 0 0 event 2010-01-26T11:00:00-05:00 2010-01-26T12:00:00-05:00 2010-01-26T12:00:00-05:00 2010-01-26 16:00:00 2010-01-26 17:00:00 2010-01-26 17:00:00 2010-01-26T11:00:00-05:00 2010-01-26T12:00:00-05:00 America/New_York America/New_York datetime 2010-01-26 11:00:00 2010-01-26 12:00:00 America/New_York America/New_York datetime <![CDATA[]]> Dr. Ton Dieker

]]>
<![CDATA[ISyE Advisory Board Meeting]]> 27187 TITLE: ISyE Advisory Board Meeting

The H. Milton Stewart Advisory Board is composed of distinguished professionals and community leaders who serve as a sounding board for the school chair and assist with the School's development goals.

]]> Anita Race 1 1263218301 2010-01-11 13:58:21 1475891363 2016-10-08 01:49:23 0 0 event 2010-10-22T09:00:00-04:00 2010-10-22T16:30:00-04:00 2010-10-22T16:30:00-04:00 2010-10-22 13:00:00 2010-10-22 20:30:00 2010-10-22 20:30:00 2010-10-22T09:00:00-04:00 2010-10-22T16:30:00-04:00 America/New_York America/New_York datetime 2010-10-22 09:00:00 2010-10-22 04:30:00 America/New_York America/New_York datetime <![CDATA[]]> Lisa Johnson
ISyE
Contact Lisa Johnson
404-894-2303

]]>
<![CDATA[Sheldon M. Ross - Some Multiple Player Gambler's Ruin Problems]]> 27187 TITLE: Some Multiple Player Gambler’s Ruin Problems

SPEAKER: Sheldon M. Ross

ABSTRACT:

Suppose there are r gamblers, with gambler i initially having a fortune of n_i. In our first model we suppose that at each stage two of the gamblers are chosen to play a game, equally likely to be won by either player, with the winner of the game receiving 1 from the loser. Any gambler whose fortune becomes 0 leaves, and this continues until there is only a single gambler left. We are interested in the probability that player i is the one left, and in the mean number of games played between specified players i and j. In our second model we suppose that all remaining players contribute 1 to a pot, which is equally likely to be won by each of them. The problem here is to determine the expected number of games played until one player has all the funds.
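The first model is easy to simulate directly, and a martingale argument (each gambler's fortune is a martingale) suggests that player i survives with probability n_i divided by the total wealth. A minimal sketch, with illustrative fortunes:

```python
import random

def simulate_once(fortunes, rng):
    """One play of the first model: two random players bet 1 on a fair game."""
    chips = list(fortunes)
    active = [i for i, c in enumerate(chips) if c > 0]
    while len(active) > 1:
        a, b = rng.sample(active, 2)                      # choose two gamblers
        winner, loser = (a, b) if rng.random() < 0.5 else (b, a)
        chips[winner] += 1
        chips[loser] -= 1
        if chips[loser] == 0:
            active.remove(loser)                           # ruined gambler leaves
    return active[0]                                       # index of the survivor

def win_probability(fortunes, player, n_trials, seed=7):
    rng = random.Random(seed)
    wins = sum(simulate_once(fortunes, rng) == player for _ in range(n_trials))
    return wins / n_trials

# With fortunes (1, 2, 3), player 2 holds 3 of 6 chips, so the martingale
# argument predicts a win probability of 1/2.
p_hat = win_probability([1, 2, 3], player=2, n_trials=4000)
```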

]]> Anita Race 1 1266926977 2010-02-23 12:09:37 1475891348 2016-10-08 01:49:08 0 0 event Some Multiple Player Gambler's Ruin Problems

]]>
2010-03-16T13:00:00-04:00 2010-03-16T14:00:00-04:00 2010-03-16T14:00:00-04:00 2010-03-16 17:00:00 2010-03-16 18:00:00 2010-03-16 18:00:00 2010-03-16T13:00:00-04:00 2010-03-16T14:00:00-04:00 America/New_York America/New_York datetime 2010-03-16 01:00:00 2010-03-16 02:00:00 America/New_York America/New_York datetime <![CDATA[]]>
<![CDATA[Hard Disk Drive Prognostics for Holistic Lifecycle Management of Personal Computers via Embedded Sensors]]> 27187 TITLE: Hard Disk Drive Prognostics for Holistic Lifecycle Management of Personal Computers via Embedded Sensors

SPEAKER: Associate Professor Sagar Kamarthi

ABSTRACT:

This talk presents a vision for holistic product lifecycle management via embedded sensors. The main goal of the proposed product monitoring framework is to cut down product repair/service costs, boost the recovery of end-of-life products, and curtail the product disposal rate. The talk focuses on personal computers to demonstrate the approach; however, the concepts and methods presented herein are equally applicable to a wide variety of consumer products. It is in this context that the talk presents the technical details of the hard disk drive prognostics and a novel feature extraction method through discrete wavelet transforms.

]]> Anita Race 1 1266937651 2010-02-23 15:07:31 1475891348 2016-10-08 01:49:08 0 0 event 2010-02-26T12:00:00-05:00 2010-02-26T13:00:00-05:00 2010-02-26T13:00:00-05:00 2010-02-26 17:00:00 2010-02-26 18:00:00 2010-02-26 18:00:00 2010-02-26T12:00:00-05:00 2010-02-26T13:00:00-05:00 America/New_York America/New_York datetime 2010-02-26 12:00:00 2010-02-26 01:00:00 America/New_York America/New_York datetime <![CDATA[]]>
<![CDATA[Ed Kaplan, Yale University]]> 27279 Speaker
Edward H. Kaplan

William N. and Marie A. Beach Professor of Management Sciences,
Yale School of Management;

Professor of Public Health,
Yale School of Public Health;

Professor of Engineering,
Yale School of Engineering and Applied Science

Abstract
This article presents the first model developed specifically for understanding the infiltration and interdiction of ongoing terror plots by undercover intelligence agents, and does so via novel application of ideas from queueing theory and Markov population processes. The resulting "terror queue" models predict the number of undetected terror threats in an area from agent activity/utilization data, and also estimate the rate with which such threats can be interdicted. The models treat terror plots as customers and intelligence agents as servers. Agents spend all of their time either detecting and infiltrating new terror plots (in which case they are "available"), or interdicting already detected terror plots (in which case they are "busy"). Initially we examine a Markov model assuming that intelligence agents, while unable to detect all plots, never err by falsely detecting fake plots. While this model can be solved numerically, a simpler Ornstein-Uhlenbeck diffusion approximation yields some results in closed form while providing nearly identical numerical performance. The transient behavior of the terror queue model is discussed briefly along with a sample sensitivity analysis to study how model predictions compare to simulated results when using estimated versus known terror plot arrival rates. The diffusion model is then extended to allow for the false detection of fake plots. Such false detection is a real feature of counterterror intelligence given that intelligence agents or informants can make mistakes, as well as the proclivity of terrorists to deliberately broadcast false information. The false detection model is illustrated using suicide bombing data from Israel.
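A toy continuous-time Markov sketch of the plots-as-customers, agents-as-servers dynamic can be simulated with Gillespie's method. The transition rates below are illustrative guesses, not the rates of the paper: plots arrive at rate lam, each available agent detects each undetected plot at rate d, and interdicting a detected plot keeps an agent busy for an Exp(mu) time.

```python
import random

lam, d, mu, n_agents = 2.0, 0.05, 1.0, 10   # illustrative parameters

def simulate(horizon, rng):
    t, undetected, busy = 0.0, 0, 0
    detections = interdictions = 0
    area = 0.0                                # time-integral of undetected plots
    while t < horizon:
        avail = n_agents - busy
        rates = [lam, d * undetected * avail, mu * busy]
        total = sum(rates)                    # total is at least lam > 0
        dt = rng.expovariate(total)
        area += undetected * dt
        t += dt
        u = rng.random() * total
        if u < rates[0]:
            undetected += 1                   # a new plot begins
        elif u < rates[0] + rates[1]:
            undetected -= 1                   # plot detected; an agent turns busy
            busy += 1
            detections += 1
        else:
            busy -= 1                         # an interdiction completes
            interdictions += 1
    return area / horizon, detections, interdictions, busy

rng = random.Random(11)
mean_undetected, detections, interdictions, busy = simulate(5000.0, rng)
```

The long-run average of `undetected` is the kind of quantity the terror-queue models predict in closed form via the diffusion approximation; here it is only estimated by brute-force simulation.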

Bio
Professor Kaplan's research has been reported on the front pages of the New York Times and the Jerusalem Post, editorialized in the Wall Street Journal, recognized by the New York Times Magazine's Year in Ideas, and discussed between the covers of Time, Newsweek, US News and World Report, Consumer Reports, and the New Yorker, and in person on NBC's Today Show, the Cronkite Report, and National Public Radio.

The author of more than 100 research articles, Professor Kaplan received both the Lanchester Prize and the Edelman Award, the two top honors in the operations research field. An elected member of both the National Academy of Engineering and the Institute of Medicine of the US National Academies, he has also twice received the prestigious Lady Davis Visiting Professorship at the Hebrew University of Jerusalem, where he has investigated AIDS policy issues facing the State of Israel.

Kaplan’s current research focuses on the application of operations research to problems in counterterrorism and homeland security.

]]> Barbara Christopher 1 1266504768 2010-02-18 14:52:48 1475891319 2016-10-08 01:48:39 0 0 event Joint IE/OR Colloquium Terror Queues

]]>
2010-03-03T11:00:00-05:00 2010-03-03T12:00:00-05:00 2010-03-03T12:00:00-05:00 2010-03-03 16:00:00 2010-03-03 17:00:00 2010-03-03 17:00:00 2010-03-03T11:00:00-05:00 2010-03-03T12:00:00-05:00 America/New_York America/New_York datetime 2010-03-03 11:00:00 2010-03-03 12:00:00 America/New_York America/New_York datetime <![CDATA[]]> Ton Dieker
ISyE
Contact Ton Dieker
404-385-3140

]]>
<![CDATA[ISyE Distinguished Lecture: 'Learning from the Experiences of Others']]> 27187 The School of Industrial and Systems Engineering welcomes Bradley Efron, Max H. Stein Professor of Statistics and Biostatistics in the School of Humanities and Sciences at Stanford University, as the featured guest for its 2010 Distinguished Lecture.

Abstract:
Familiar statistical estimates such as batting averages, political polls, and medical trial results are obtained by direct observation of cases of interest. Sometimes, though, we can learn from the experience of "others": for instance there may be information about Player A's batting ability in the observed averages of Players B, C, and D. In his presentation, Professor Efron will present several examples showing how this works in practice, indicating some of the surprising theoretical ideas involved. The talk is mainly descriptive in nature, and is intended for a general scientific audience.
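The batting-average setting can be illustrated with the classic positive-part James-Stein estimator, which shrinks each player's early-season average toward the grand mean (in the spirit of Efron and Morris's well-known baseball example; the numbers below are made up, not the lecture's data):

```python
def james_stein(averages, at_bats):
    """Positive-part James-Stein shrinkage of proportions toward the grand mean."""
    k = len(averages)
    grand = sum(averages) / k
    sigma2 = grand * (1 - grand) / at_bats          # binomial sampling variance
    ss = sum((z - grand) ** 2 for z in averages)    # spread of the observed averages
    shrink = max(0.0, 1 - (k - 3) * sigma2 / ss)    # data-driven shrinkage factor
    return [grand + shrink * (z - grand) for z in averages]

# Hypothetical early-season averages for eight players after 45 at-bats each.
observed = [0.400, 0.378, 0.356, 0.333, 0.311, 0.289, 0.244, 0.222]
shrunk = james_stein(observed, at_bats=45)
```

Each player's estimate moves toward the grand mean: Player A's "ability" is estimated partly from the experience of Players B, C, and D, which is exactly the phenomenon the talk describes.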

About Dr. Efron:
Bradley Efron is the Max H. Stein Professor of Statistics and Biostatistics at Stanford University's School of Humanities and Sciences and the Department of Health Research and Policy with the School of Medicine. He completed his undergraduate work in mathematics at the California Institute of Technology, and earned his doctorate in statistics from Stanford in 1964, joining the Stanford faculty that same year. He was Associate Dean for the School of Humanities and Sciences from 1987 to 1990, served a term as Chair of the Faculty Senate as well as three terms as Chair of the Department of Statistics, and continues as Chairman of the Mathematical and Computational Sciences Program. He has served as president of the American Statistical Association and of the Institute of Mathematical Statistics. He is a past editor of the Journal of the American Statistical Association and is presently the founding editor of the Annals of Applied Statistics.

Among the numerous honors that Efron has received are Fellowships of the American Academy of Arts and Sciences, the American Statistical Association, the Institute of Mathematical Statistics, the Royal Statistical Society, the International Statistical Institute and the MacArthur Fellows Program of the John D. and Catherine T. MacArthur Foundation. He is a member of the U.S. National Academy of Sciences, a recipient of the Ford Prize of the Mathematical Association of America and of both the Wilks Medal and the Noether Prize of the American Statistical Association. Efron was awarded the 1998 Parzen Prize for Statistical Innovation by Texas A&M University, and the first-ever Rao Prize for outstanding research in statistics by Pennsylvania State University in 2003. He received the 2005 National Medal of Science "for his contributions to theoretical and applied statistics, especially the bootstrap sampling technique; for his extraordinary geometric insight into nonlinear statistical problems; and for applications in medicine, physics and astronomy."

]]> Anita Race 1 1263902984 2010-01-19 12:09:44 1475891144 2016-10-08 01:45:44 0 0 event The School of Industrial and Systems Engineering welcomes Bradley Efron, Max H. Stein Professor of Statistics and Biostatistics in the School of Humanities and Sciences at Stanford University, as the featured guest for its 2010 Distinguished Lecture.

]]>
2010-09-23T17:00:00-04:00 2010-09-23T18:30:00-04:00 2010-09-23T18:30:00-04:00 2010-09-23 21:00:00 2010-09-23 22:30:00 2010-09-23 22:30:00 2010-09-23T17:00:00-04:00 2010-09-23T18:30:00-04:00 America/New_York America/New_York datetime 2010-09-23 05:00:00 2010-09-23 06:30:00 America/New_York America/New_York datetime <![CDATA[]]> <![CDATA[Dr. Bradley Efron]]>
<![CDATA[Some recent developments in multi-state and degradation models for reliability inference]]> 27187 TITLE:  Some recent developments in multi-state and degradation models for reliability inference

SPEAKER:  Vijay Nair

ABSTRACT:

Traditional reliability analysis is based on time-to-failure data. In this talk, we discuss two different directions that lead to more informative reliability inference. The first is multi-state models. Challenges in statistical inference based on interval-censored data will be described, and methods for doing likelihood-based inference in semi-Markov models will be discussed. The second part of the talk will provide an overview of degradation models, describe a class of non-homogeneous Gaussian processes and some inference issues. This talk is based on joint work with Yang Yang, Yves Atchade and Xiao Wang.
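A minimal degradation sketch, assuming the simplest special case rather than the talk's more general non-homogeneous Gaussian processes: a Wiener process with drift crosses a fixed failure threshold, so the failure time is inverse Gaussian with mean threshold/drift. All parameter values are illustrative.

```python
import random

drift, sigma, threshold, dt = 0.5, 1.0, 10.0, 0.01   # illustrative parameters

def first_passage(rng):
    """Euler simulation of a drifted Wiener degradation path until failure."""
    x, t = 0.0, 0.0
    while x < threshold:
        x += drift * dt + sigma * rng.gauss(0.0, dt ** 0.5)
        t += dt
    return t

rng = random.Random(3)
times = [first_passage(rng) for _ in range(2000)]
mean_failure = sum(times) / len(times)   # theory: threshold / drift = 20
```

Degradation data of this kind are more informative than bare failure times because every inspection of the path, not just the crossing time, carries information about the drift and volatility parameters.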

*Bio*: Vijay Nair is the Donald A. Darling Professor of Statistics and Professor of Industrial and Operations Engineering at the University of Michigan, Ann Arbor. He has been Chair of the Department of Statistics since 1998. Previously, he was a Research Scientist in the Mathematical Sciences and Operations Research Centers at Bell Laboratories for fifteen years.

His research interests include engineering statistics, information technology, quality and process improvement, industrial experiments, reliability and risk analysis, neuro-informatics, and statistical methods in behavioral intervention research. He also has extensive practical experience in the automotive, semiconductor, and telecommunications industries.

Vijay has a Bachelor’s degree in Economics from the University of Malaya (Malaysia) and a Ph.D. in Statistics from the University of California, Berkeley. He has been editor of Technometrics and the International Statistical Review, coordinating editor of the Journal of Statistical Planning and Inference, and has served on the editorial boards of many other leading statistics and quality journals. He is currently Vice President of the International Statistical Institute, President-elect of the International Society for Business and Industrial Statistics, and Chair of the Statistics Division of the American Society for Quality. He has chaired or co-chaired several panels of the National Academies and is a former chair of the Board of Trustees of the National Institute of Statistical Sciences. He is a Fellow of the American Association for the Advancement of Science, the American Statistical Association, the American Society for Quality, and the Institute of Mathematical Statistics.

]]> Anita Race 1 1288877126 2010-11-04 13:25:26 1475891144 2016-10-08 01:45:44 0 0 event 2010-11-18T11:00:00-05:00 2010-11-18T12:00:00-05:00 2010-11-18T12:00:00-05:00 2010-11-18 16:00:00 2010-11-18 17:00:00 2010-11-18 17:00:00 2010-11-18T11:00:00-05:00 2010-11-18T12:00:00-05:00 America/New_York America/New_York datetime 2010-11-18 11:00:00 2010-11-18 12:00:00 America/New_York America/New_York datetime <![CDATA[]]>