looking for ways to differentiate themselves, such as by being

responsive in fulfilling demand while keeping the delivery promises.

We look at the dynamic due date quotation problem under base-stock

inventory holding, where the demand is lead-time sensitive and unmet

due dates are penalized. We examine several facets of the due date

quotation problem, including quoting reliable due-dates based on workload

status, maximizing profit considering the lateness cost incurred

due to late deliveries, and deciding on the level of inventory. We develop

a structural analysis of the optimal due-date quotation policy under

a given base-stock level and we show an optimal policy exists and

is monotone in the number of customers. We also obtain insights on

the optimal base-stock level. We conduct experiments to identify when

it is more profitable to operate in a pure make-to-order environment,

and when it is more profitable to keep inventory; how much inventory

should be kept; and how utilization levels affect profits.
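To make the workload-based quotation idea concrete, here is a toy sketch (not the thesis model): with a base stock of s units and a FIFO production queue of rate mu, an arriving customer's due-date quote is the time to clear the jobs ahead of it that are not already covered by on-hand inventory. All parameters are invented, and the monotonicity in the number of customers mirrors the structural result described above.

```python
# Toy due-date quote under a base-stock policy (illustrative parameters).
def due_date_quote(jobs_in_system, base_stock, mu):
    # This arriving order waits only for jobs not covered by inventory.
    uncovered = max(jobs_in_system + 1 - base_stock, 0)
    return uncovered / mu

# Quotes grow monotonically with the number of customers in the system.
quotes = [due_date_quote(n, base_stock=3, mu=2.0) for n in range(8)]
```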

Industrial and Systems Engineering

Contact Barbara Christopher

from various fields. This thesis investigates data mining problems

in tree-based models and large-scale contingency tables. The first

half of the thesis pertains to the tree-based models for the

classification problem, which have been very popular in various

fields because of their interpretability and flexibility. Tree

modeling involves two major steps: tree growing and tree pruning.

Tree growing searches over the whole data set to find the

splitting point that leads to the greatest improvement in a

specified score function. Once the trees are grown, tree pruning

pursues the right sized tree that provides the best estimate of

error when the tree is applied to unseen data. In this thesis, we

propose a novel algorithm for tree pruning, called frontier-based

tree pruning (FBP). The new method has an order of computational

complexity comparable to cost-complexity pruning (CCP). Regarding

tree pruning, FBP provides a full spectrum of information: namely,

(1) given the value of the penalization parameter $\lambda$, it

gives the decision tree specified by the complexity-penalization

approach; (2) given the size of a decision tree, it provides the

range of the penalization parameter $\lambda$ within which the

complexity-penalization approach renders this tree size; and (3)

it finds the tree sizes that are {\it inadmissible}: regardless

of the value of the penalty parameter, the

resulting tree, based on a complexity-penalization framework, will

never have these sizes. Simulations on real data sets reveal

surprising results: in the complexity-penalization approach, most

of the tree sizes are inadmissible. FBP facilitates a more

faithful implementation of cross validation (CV), which is favored

by simulations. As an extension of the FBP algorithm, we study how

CV performs in tree-based models. While abundant results are

available on applying CV to regression models, there is little

research on the effects of CV in classification models, owing to

their nonlinear structure. The main purpose of this study is to

explore the behavior of CV in tree-based models. We report

simulation studies that compare a cross-validated tree classifier

with an oracle classifier that is ideally derived from knowledge

of the underlying distributions. The main observation of this study

indicates that the differences between testing and training

errors from a cross-validated tree classifier and from an oracle

classifier empirically exhibit a linear regression relationship. The

``slope'' and the ``$R^2$'' of regression models are employed as

the performance measures of a cross-validated tree classifier.

Moreover, simulation reveals that the performance of a

cross-validated tree classifier depends on the geometry, the

parameters of the underlying distributions, and the sample sizes.

Such observations can explain and justify the behavior of CV in

tree-based models.
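The complexity-penalization approach that FBP analyzes can be explored with off-the-shelf tools. The sketch below uses scikit-learn's standard cost-complexity pruning (CCP, not the FBP algorithm itself) on the iris data to enumerate the penalty values and the tree size each one yields; any size absent from the attained sequence is inadmissible in the sense discussed above.

```python
# Standard cost-complexity pruning: enumerate penalty breakpoints and the
# pruned tree sizes they produce (illustrative data, not from the thesis).
from sklearn.datasets import load_iris
from sklearn.tree import DecisionTreeClassifier

X, y = load_iris(return_X_y=True)
path = DecisionTreeClassifier(random_state=0).cost_complexity_pruning_path(X, y)

sizes = []
for alpha in path.ccp_alphas:
    tree = DecisionTreeClassifier(random_state=0, ccp_alpha=alpha).fit(X, y)
    sizes.append(tree.tree_.node_count)

# Larger penalties prune to smaller (nested) subtrees; sizes missing from
# this sequence are never produced by any penalty value.
print(sorted(set(sizes), reverse=True))
```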

The second half of the thesis presents multiple testing in

large-scale contingency tables and its application to pattern

recognition of protein structures. One of the most common test

procedures using two-way contingency tables is the test of

independence between two categorizations. Standard significance

tests such as the $\chi^2$ or likelihood ratio test assess overall

independence but provide limited information about the nature of the

association in the contingency tables. The main purpose of this

study is to develop a follow-up method to $\chi^2$ or likelihood

ratio tests that identifies the significantly associated

individual cells in the contingency table. We propose a framework

of multiple testing procedures for testing independence of the

cell categories in contingency tables. In the simulation study, we

compare the power, type I error, and false discovery rate of five

different testing procedures in the contingency table. We observe

that no single procedure is superior for every scenario examined.

In addition, we record the relationships among the proportion of

true null hypotheses, power, type I error, and false discovery

rate. Finally, we employ the proposed method to identify the

patterns of pair-wise associations between amino acids involved in

$\beta$-sheet bridges of proteins. We identify a number of amino

acid pairs that exhibit either strong or weak association. These

patterns provide useful information for algorithms that predict

secondary and tertiary structures of proteins.
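A minimal sketch of the follow-up idea is shown below: after an overall chi-square test of independence, each cell is tested via its adjusted (standardized) residual, with the false discovery rate controlled by the Benjamini-Hochberg step-up procedure, which is one of several multiple-testing procedures of the kind compared above. The table is illustrative, not the amino-acid data from the thesis.

```python
# Follow-up testing of individual cells after an overall chi-square test.
import numpy as np
from scipy.stats import chi2_contingency, norm

table = np.array([[30.0, 10.0, 10.0],
                  [10.0, 30.0, 10.0],
                  [10.0, 10.0, 30.0]])

chi2, p_overall, dof, expected = chi2_contingency(table)

n = table.sum()
row = table.sum(axis=1, keepdims=True) / n
col = table.sum(axis=0, keepdims=True) / n
# Adjusted residuals are approximately N(0, 1) under independence.
resid = (table - expected) / np.sqrt(expected * (1 - row) * (1 - col))
pvals = 2 * norm.sf(np.abs(resid)).ravel()

# Benjamini-Hochberg step-up procedure at FDR level q.
q = 0.05
order = np.argsort(pvals)
m = len(pvals)
passed = pvals[order] <= q * (np.arange(1, m + 1) / m)
k = np.max(np.nonzero(passed)[0]) + 1 if passed.any() else 0
significant_cells = set(order[:k])  # flattened indices of associated cells
```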


storage/pick zones in a warehouse to minimize picking effort. Unlike

previous work which has focused on minimizing travel distance, our

objective is to minimize the number of zones that must be visited to fill

the orders. This problem is NP-complete, so heuristic methods are developed

to find solutions. We present a Lagrangian relaxation approach as well as

several other construction heuristics. Improvement methods discussed

include 2-exchanges and cyclic exchanges. We also consider problem

variations such as different product sizes, stock splitting, and

rewarehousing. Computational results are presented for problems containing

up to 10664 products and 40 zones. In particular, our results show that

cyclic exchanges are very powerful and can be used to obtain solutions 15%

better than those obtained using popularity, a standard approach.


stochastic programming problems that satisfy the following properties:

(1) The objective function can only be evaluated with some error,

for example by simulation, and at high computational cost.

(2) The error can be decreased with more computational effort.

(3) The higher order derivatives of the objective function are unavailable.

Though our work is motivated by problems arising in stochastic simulation

optimization (e.g., revenue management), such optimization problems also commonly arise in

engineering design (e.g., helicopter rotor blade design).

Due to the high cost of evaluating the objective function, our aim is

to develop convergent algorithms that can solve such problems

while requiring the fewest possible number of evaluations of the

objective function.

When the objective function and its gradient can be evaluated easily

and exactly, a typical trust region algorithm works as follows.

At each iteration, we first construct a polynomial model function, typically a

truncated Taylor series expansion of the objective at the

current iterate, that approximates the objective function

in a certain neighborhood of the current iterate called

the trust region. We then optimize this model function within the trust

region. Depending on whether the resulting

optimal solution has a lower objective function value or not, we

either set this as the next iterate or conversely, alter the trust

region size and/or the model function appropriately and try again.

Since in our case, the objective function and its higher order

derivatives cannot be evaluated exactly, we cannot use a Taylor series

based model function. Accordingly, we propose

alternative linear or quadratic polynomial

model functions that are constructed by linear regression using only

sample average approximations of the objective, evaluated at points

in the neighborhood of the current iterate. We describe the various

changes that have to be made to the traditional trust region framework

in order to successfully construct and control the accuracy of such regression

based model functions and present the convergence theory for the

resulting modified trust region algorithm. Finally, we provide

computational results for such an algorithm run on selected problems

from the CUTE test set.
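The regression-based model idea can be sketched in a few lines. The toy loop below (an illustration of the general approach, not the thesis algorithm or its convergence safeguards) fits a linear model by least squares to noisy sample-average evaluations on a stencil around the iterate, steps to the model minimizer on the trust-region boundary, and adjusts the radius on acceptance or rejection. The objective, noise level, and control parameters are all invented.

```python
# Regression-based trust-region sketch on a noisy toy objective.
import numpy as np

rng = np.random.default_rng(0)

def noisy_f(x, reps=30):
    """Sample-average approximation of f(x) = ||x||^2 observed with noise."""
    return float(np.mean([x @ x + 0.01 * rng.standard_normal() for _ in range(reps)]))

x = np.array([2.0, 2.0])
delta = 0.5  # trust-region radius
for _ in range(40):
    # Sample on a stencil around x and fit the model f(x + s) ~ c + g @ s.
    stencil = delta * np.vstack([np.eye(2), -np.eye(2), np.zeros((1, 2))])
    vals = np.array([noisy_f(x + s) for s in stencil])
    A = np.hstack([np.ones((len(stencil), 1)), stencil])
    coef, *_ = np.linalg.lstsq(A, vals, rcond=None)
    g = coef[1:]
    if np.linalg.norm(g) < 1e-8:
        break
    # Candidate: minimizer of the linear model on the trust-region boundary.
    cand = x - delta * g / np.linalg.norm(g)
    if noisy_f(cand) < noisy_f(x):
        x, delta = cand, min(2 * delta, 1.0)  # accept and possibly expand
    else:
        delta *= 0.5                          # reject and shrink

print(x, x @ x)
```

The real algorithm must also control the accuracy of the sample averages and of the regression model; this sketch simply fixes the replication count.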


Engineering from the University of Florida, and his M.S. and Ph.D. in

Industrial Engineering from the University of South Florida. He

currently works for the Centers for Disease Control and Prevention,

National Immunization Program in Atlanta, GA. Dr. Washington has worked

on simulation projects related to urban and rural public health clinics,

emergency rooms, and warehouses. He also developed simulations of mass

vaccination/prophylaxis clinics to help local officials plan for massive

bio-terrorist attacks. Additional past projects include cost analysis

of extra-vaccinating children, cost analysis of vaccine wastage, vaccine

forecasting, and complex survey analyses of vaccine acceptance. Dr.

Washington was a consultant for the World Health Organization (WHO), and

Ghana's Ministry of Health. While in Ghana, he provided consultation

expertise in data management, statistics, and geographic information

systems concerning about seven vaccine-preventable diseases. He has also

consulted with WHO South East Asia Regional Office and the National

Polio Surveillance Unit (NPSU), New Delhi, India to evaluate

surveillance software, and to assist in training on vaccine-preventable

disease surveillance, data management, and mapping. He visited 5 out of

7 regions in India to evaluate NPSU's data managers' skills, tools, and

capabilities. Because of Dr. Washington's accomplishments, National

Engineering Week selected him as one of the top 16 young engineers in

the nation in 2003, and he was profiled in USA Today and other media

outlets throughout the year. Recently, he was nominated for a Service to

America Medal for his Homeland Security efforts in creating a mass

smallpox vaccination computer model to help local officials prepare for

a bio-terrorist attack.


is called a chance constraint on the decision variable x. Optimization problems with chance constraints are notoriously hard to solve.

A simple strategy for solving chance constrained problems is to generate N samples Y_1, ..., Y_N according to the distribution P and then impose the constraints f(x,Y_i) < 0 for all i = 1, ..., N. (*)

Since Y_i are random samples, there is no hope that decision

variables x that are feasible for (*) are all feasible for the chance

constraint with probability 1. Therefore, one has to allow a probability of error delta. Question: how large should N be as a function of eps and

delta?

Recently, Calafiore and Campi showed that when f(x,Y) is a convex function of x for every fixed Y, we only need N = O((n/eps) log(1/delta)). Nemirovski and Shapiro have shown that if the function f(x,Y) is bi-affine and the distribution P has a "concentration-of-measure" property, the number of samples required drops to N = O(n log(1/(eps*delta))).
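The sampled problem (*) is easy to illustrate in one dimension. The sketch below maximizes x subject to the chance constraint P(x > Y) <= eps by drawing N scenarios and imposing x <= Y_i for each; the sample-size formula used is one common scenario-approach bound of the form N = ceil((2/eps)(ln(1/delta) + n)), whose exact constants vary across references. The distribution and parameter values are illustrative.

```python
# Scenario-approach sketch for a one-dimensional chance constraint.
import math
import numpy as np

eps, delta, n = 0.1, 0.01, 1
N = math.ceil((2.0 / eps) * (math.log(1.0 / delta) + n))

rng = np.random.default_rng(1)
scenarios = rng.uniform(0.0, 1.0, size=N)  # samples Y_1, ..., Y_N

# Sampled problem: max x subject to x <= Y_i for all i; closed-form solution.
x_star = scenarios.min()

# Empirical violation probability P(x_star > Y) on fresh samples.
fresh = rng.uniform(0.0, 1.0, size=100_000)
violation = float(np.mean(x_star > fresh))
```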

In many applications of chance constrained problems the distribution P is not completely known, i.e. the distribution is ambiguous. The natural

constraint to impose in this setting is the ambiguous chance constraint:

max_{P in M} {P(f(x,Y) > 0)} < eps, where M is an uncertainty set of distributions. In this talk we discuss how to extend many results known for chance constrained problems to this more general setting. Robust deterministic optimization naturally arises

in these extensions by way of the Strassen-Dudley representation theorem.

(Joint work with Emre Erdogan)

Brief Bio: Garud Iyengar received his PhD in Electrical Engineering from Stanford University in 1998. Since then he has been with the Industrial Engineering and Operations Research (IEOR) department at Columbia

University where he is currently an Associate Professor.


technique for pricing American-style contingent claims. The

second part details a statistical arbitrage model using

statistical process control approaches.

We propose a novel simulation approach for pricing American-style

contingent claims. We develop an adaptive policy search algorithm

for obtaining the optimal policy in exercising an American-style

option. The option price is first obtained by estimating the

optimal option exercising policy and then evaluating the option

with the estimated policy through simulation. Both high-biased and

low-biased estimators of the option price are obtained. We show

that the proposed algorithm leads to convergence to the true

optimal policy with probability one. This policy search algorithm

requires little knowledge about the structure of the optimal

policy and can be naturally implemented using parallel computing

methods. As illustrative examples, computational results on

pricing regular American options and American-Asian options are

reported and they indicate that our algorithm is faster than

certain alternative American option pricing algorithms reported in

the literature.
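For reference, a regular American put can be priced by a standard lattice method (backward induction on a Cox-Ross-Rubinstein binomial tree, not the simulation-based policy search of the thesis), comparing continuation and exercise values at each node. The parameters below are illustrative.

```python
# American put on a Cox-Ross-Rubinstein binomial tree.
import math

def american_put_crr(S0, K, r, sigma, T, steps=500):
    dt = T / steps
    u = math.exp(sigma * math.sqrt(dt))
    d = 1.0 / u
    q = (math.exp(r * dt) - d) / (u - d)  # risk-neutral up probability
    disc = math.exp(-r * dt)
    # Option values at maturity.
    values = [max(K - S0 * u**j * d**(steps - j), 0.0) for j in range(steps + 1)]
    # Backward induction: hold value versus immediate exercise.
    for i in range(steps - 1, -1, -1):
        values = [
            max(disc * (q * values[j + 1] + (1 - q) * values[j]),
                K - S0 * u**j * d**(i - j))
            for j in range(i + 1)
        ]
    return values[0]

price = american_put_crr(S0=100.0, K=100.0, r=0.05, sigma=0.2, T=1.0)
```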

Secondly, we investigate arbitrage opportunities arising from

continuous monitoring of the price difference of highly correlated

assets. By taking the difference between two asset prices, we can separate

common macroeconomic factors that influence the asset price

movements from an idiosyncratic component that can be monitored

very closely by itself. Since price movements are in line with

macroeconomic conditions such as interest rates and economic

cycles, we can easily detect abnormal behavior in the price

changes. We apply a statistical process control approach for

monitoring time series with serially correlated data. We use

various variance estimators to establish trading

strategy thresholds.
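A simplified version of the monitoring idea can be sketched as follows: track the spread of two correlated price series and flag signals when the standardized spread crosses control limits. The series, in-control window, and limit are all invented; under serial correlation, the naive standard deviation below would be replaced by one of the variance estimators mentioned above.

```python
# Spread monitoring for two correlated assets (illustrative data).
import numpy as np

rng = np.random.default_rng(2)

# Two assets sharing a common factor plus idiosyncratic noise.
common = np.cumsum(rng.standard_normal(1000))
a = common + 0.3 * rng.standard_normal(1000)
b = common + 0.3 * rng.standard_normal(1000)

spread = a - b
# Standardize using an in-control estimate of the spread's mean and sd.
mu, sigma = spread[:200].mean(), spread[:200].std(ddof=1)
z = (spread - mu) / sigma

L = 3.0  # control limit, in standard deviations
signals = np.flatnonzero(np.abs(z) > L)  # indices where a signal fires
```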


process. The new estimators are linear combinations of estimators formed from overlapped

versions of standardized time series and/or batch means estimators using different batch sizes.

These "overlapping" estimators have both lower bias and variance than their original

overlapping (without linear combinations) counterparts. The work is joint with Tuba Aktaran.
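For context, here is a sketch of a plain overlapping batch means (OBM) estimator, the kind of base estimator from which the linear combinations above are built. The data and batch size are illustrative.

```python
# Overlapping batch means estimator of the variance parameter.
import numpy as np

def obm(x, m):
    """OBM estimate of sigma^2 = lim n * Var(sample mean)."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    # Means of all n - m + 1 overlapping batches of size m, via cumsums.
    csum = np.concatenate(([0.0], np.cumsum(x)))
    batch_means = (csum[m:] - csum[:-m]) / m
    grand = x.mean()
    k = n - m + 1
    return n * m / (k * (n - m)) * np.sum((batch_means - grand) ** 2)

rng = np.random.default_rng(3)
x = rng.standard_normal(20_000)  # i.i.d. N(0,1), so sigma^2 = 1
est = obm(x, m=200)
```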


energy investing. Energy markets present a unique opportunity for

financial analysis, since the underlying commodity economics cause many traditional finance assumptions to be violated. Instead, practical research solutions must address the fundamental economics and account for the shortcomings of established equity and fixed income theory. During

this talk, we will attempt to highlight some of these issues and provide a discussion framework for application of textbook theory to the trading desk reality. Further, this talk will present a brief description of the

overall hedge fund environment in which a research professional must

operate.

Founded in 1990, Citadel Investment Group is a world leader in alternative investments, with a team of over 800 people in five offices worldwide. Our research philosophy is to apply a systematic process driven approach

to investing in order to advance the reliability and repeatability of high risk-adjusted returns.

Speaker Bio:

Dr. Byrns is the Director of Energy Research at Citadel Investment Group in Chicago, IL. He received his PhD in Engineering (GA Tech 1991), as

well as an MS in Economics (GA Tech 1991), an MSE in Aerospace Engineering (GA Tech 1988), and a BSE in Mechanical and Aerospace Engineering (Princeton 1985). He has been actively involved in commodity research for almost 8 years, holding various staff and management positions at Williams

Energy, Merchant Energy Group of the Americas, and Koch Industries. Prior to entering the commodity field, he worked as a consultant in Washington

D.C.


Motivated by these remarks, we consider the problem of performance modeling, analysis and control of capacitated, flexibly automated re-entrant lines. Specifically, we develop an analytical framework for the modeling, analysis and control of capacitated re-entrant lines, based on the Generalized Stochastic Petri net (GSPN) framework. Furthermore, the underlying scheduling problem is transformed to a Markov Decision Process (MDP) problem and, finally, we suggest a systematic, efficient and scalable approximation scheme, based on Neuro-Dynamic Programming (NDP) theory, for the optimal scheduling policy characterized in the GSPN / MDP framework. The quality of the obtained approximations is experimentally assessed by


In this talk, it will first be shown that even in simple networks, commonly used operational policies like first-in-first-out sequencing may perform badly, failing to achieve even "throughput optimality." We then present two families of policies, called discrete-proportional-processor-sharing and maximum pressure policies, that are always throughput optimal, regardless of the processing network's topology or parameter values. These policies have other attractive features as well, including distributed implementations that use only local or semi-local congestion information. A simulation study has been undertaken that evaluates these policies in a wafer fabrication setting, using SEMATECH data sets. The results of that study will be discussed, along with other attractive theoretical properties of the two policy families.


supply decisions of manufacturers. Companies are looking for ways

to decrease their procurement costs, which account for a large

percentage of the supply chain costs. We study the effects of

demand aggregation and collaborative procurement on buyers'

profitability. First, we make a high-level analysis and consider a

market with multiple buyers and suppliers where multi-unit

transactions for multiple items take place. The procurement costs

are affected by economies of scale in the suppliers' production

costs and by economies of scope in transportation. We design buyer

strategies that model different collaboration levels and assess

the role of collaboration under varying market conditions. Next,

we analyze the procurement process on a lower level and identify

benefits of inter-firm collaboration among buyers who are

potential competitors in the end market. We adopt a game-theoretic

approach to explore the economics of the basic mechanism

underlying collaborative procurement, and determine the conditions

that make it beneficial to the participants.

Besides low procurement costs, important considerations in

supplier selection are responsiveness and the reliability of the

suppliers in meeting demand. Hence, manufacturers face

pressure to quote short and reliable lead-times. We cover

several aspects of the manufacturer's problem, such as quoting

reliable due-dates based on the workload status in the system,

maximizing profit considering the lateness cost incurred due to

late deliveries, and deciding on the level of inventory to

increase responsiveness. We employ a model where demand arrival

and manufacturing processes are stochastic, and obtain insights on

the optimal due-date policy and on the optimal inventory level.


References:

Bala MV, Zarkin GA. Pharmacogenomics and the evolution of healthcare: is it time for cost-effectiveness analysis at the individual level? Pharmacoeconomics. 2004;22(8):495-8.

Flowers CR, Veenstra D. The role of cost-effectiveness analysis in the era of pharmacogenomics. Pharmacoeconomics. 2004;22(8):481-93.

Danzon P, Towse A. The economics of gene therapy and of pharmacogenetics. Value Health. 2002 Jan-Feb;5(1):5-13.

Dr. Christopher Flowers, MD, is an Assistant Professor in Hematology and Oncology and the Clinical Director of the Oncology Informatics Program at the Winship Cancer Institute, Emory University School of Medicine.


and components. Our objective is to maximize infinite horizon expected

discounted profit. We show that optimal product prices and component

production capacity result in utilization near 100%, so the system is in

heavy traffic. We further show that heavy traffic remains the optimal

operating regime when customer orders must be assembled within a maximum

delay, and component production can be expedited at some additional cost.

In heavy traffic, the system exhibits a reduction in problem

dimensionality. The limiting diffusion approximation has dimension equal

to the number of components (rather than the number of components plus the

number of products). We use this insight to propose discrete review

policies for sequencing product orders for assembly in both of the

aforementioned models. When delay constraints are present, we

additionally provide a policy for expediting components at discrete review

time points. We show that our discrete review policies are asymptotically

optimal in heavy traffic.


Sequential resource allocation systems (RAS) constitute a pertinent modeling abstraction for the operational dynamics of a broad range of contemporary technological applications, including production systems, material handling and railway / monorail systems, e-commerce and other service-related processes, and even computational environments like those emerging in internet-based computing. In all these environments, a set of concurrently executing processes contest for the sequential exclusive acquisition of a finite set of re-usable resources that are necessary to support the execution of their various processing stages. The resulting resource allocation process must be controlled for (i) operational efficiency, a requirement giving rise to scheduling problems in the context of these environments, but also, for (ii) logical correctness and inherent consistency, a requirement addressed by an emerging logical control theory for these systems. The effective logical control of the aforementioned applications becomes an even more important problem as these environments migrate to extensively automated operational modes.

This talk will survey the state-of-the-art in RAS logical control. More specifically, the first part of the talk will provide a general description of the problem and a formal characterization of it in the Discrete Event Systems (Ramadge & Wonham


system from a number of alternatives, where the best system is defined by

the given problem. The primary focus of this thesis is on experiments where

the data are from simulated systems. In simulation ranking and selection

procedures, four classes of comparison problems are typically encountered.

We focus on two of them: Bernoulli and multinomial selection. Therefore, we

wish to select the best system from a number of simulated alternatives where

the best system is defined as either the one with the largest probability of

success (Bernoulli selection) or the one with the greatest probability of

being the best performer (multinomial selection). We focus on procedures

that are sequential and use an indifference-zone formulation wherein the

user specifies the smallest practical difference he wishes to detect between

the best system and other contenders.

We apply fully sequential procedures due to Kim and Nelson (2004) to

Bernoulli data for terminating simulations, employing common random numbers.

We find that significant savings in total observations can be realized for

two to five systems when we wish to detect small differences between

competing systems. We also study the multinomial selection problem. We offer

a Monte Carlo simulation of the Bechhofer and Kulkarni (1984) MBK

multinomial procedure and provide extended tables of results. In addition,

we introduce a multi-factor extension of the MBK procedure. This procedure

allows for multiple independent factors of interest to be tested

simultaneously from one data source (e.g., one person will answer multiple

independent surveys) with significant savings in total observations compared

to the factors being tested in independent experiments (each survey is run

with separate focus groups and results are combined after the experiment).

Another multi-factor multinomial procedure is also introduced, which is an

extension to the MBG procedure due to Bechhofer and Goldsman (1985, 1986).

This procedure performs better than any other procedure to date for the

multi-factor multinomial selection problem and should always be used

whenever table values for the truncation point are available.
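The curtailment idea behind such sequential multinomial procedures can be sketched simply: take one multinomial observation at a time up to a truncation point, and stop early once the current leader can no longer be caught. This is an illustrative sketch in the spirit of the Bechhofer-Kulkarni procedure, not an implementation of MBK or MBG; the cell probabilities and truncation point are made up.

```python
# Curtailed sequential selection of the most probable multinomial cell.
import numpy as np

def curtailed_select(p, n_max, rng):
    counts = np.zeros(len(p), dtype=int)
    for t in range(1, n_max + 1):
        counts[rng.choice(len(p), p=p)] += 1
        sorted_counts = np.sort(counts)
        lead = sorted_counts[-1] - sorted_counts[-2]
        if lead > n_max - t:  # the runner-up can no longer catch up
            return int(np.argmax(counts)), t
    return int(np.argmax(counts)), n_max

rng = np.random.default_rng(4)
winner, n_obs = curtailed_select([0.6, 0.25, 0.15], n_max=51, rng=rng)
```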


from the University of Florida, and his M.S. and Ph.D. in Industrial Engineering

from the University of South Florida. He currently works in the National Immunization

Program at the Centers for Disease Control and Prevention in Atlanta, specializing

in applying OR techniques to public health issues. Dr. Washington has served

as a consultant for several foreign governments, and National Engineering Week

selected him as one of the top 16 young engineers in the nation in 2003. He was

nominated for a Service to America Medal for his Homeland Security efforts in

creating a mass smallpox vaccinations computer model to help local officials

prepare for a bio-terrorist attack.

A reception will precede the meeting at 5:30. The meeting is open to all interested

parties, and is free of charge. Refreshments will be served, and there will

also be time to network with fellow OR professionals in the Atlanta community.

The meeting will be in the Executive Classroom (Room 228) of the ISyE Main

Entrance Bldg. (755 Ferst Drive, formerly the Dupree School of Management Building

- same location as our previous meetings this year.) There is plenty of free

parking next to the building after 5:00 PM.

Map

http://www.isye.gatech.edu/visitors/maps/

Informs Chapter Website

http://www.isye.gatech.edu/~informs/atl_chapter/index.html


with robust design (RD), and on-line procedures, also known as statistical

process control (SPC). Most research in both areas of quality control has

dealt with single variables. Since most complex systems are multivariate

in nature, there is an increasing need for user-friendly multivariate

techniques.

The multivariate quadratic loss function (MQL) is a popular multivariate

technique in static RD and has occasionally been applied to multivariate

SPC. In both areas we integrate the contents of the MQL into specially

constructed principal components called loss-scaled principal components

(LSPC). We examine how well a subset of these LSPCs approximates the

expected value of the MQL and apply them to a RD problem featuring six

responses and eight predictor variables. We also show when

LSPCs can quickly detect and accurately diagnose shifts in location and

dispersion in multivariate SPC.
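The construction can be roughly sketched as principal components computed after scaling centered multivariate responses by a square root of the loss function's cost matrix, so directions are weighted by their contribution to expected quadratic loss. This is an illustrative sketch of that general idea, not the exact LSPC definition from the talk; the cost matrix and data are invented.

```python
# Principal components of loss-scaled responses (illustrative).
import numpy as np

rng = np.random.default_rng(5)
Y = rng.standard_normal((500, 3)) @ np.array([[1.0, 0.5, 0.0],
                                              [0.0, 1.0, 0.2],
                                              [0.0, 0.0, 1.0]])
C = np.diag([4.0, 1.0, 0.25])  # per-response quadratic loss weights

# Scale centered responses by a Cholesky square root of the cost matrix.
Ys = (Y - Y.mean(axis=0)) @ np.linalg.cholesky(C)
# Principal components of the scaled data via SVD.
_, svals, Vt = np.linalg.svd(Ys, full_matrices=False)
explained = svals**2 / np.sum(svals**2)  # loss-weighted variance shares
```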


peripherals) in large quantities. In the face of rapid equipment changes, current tax laws and disposal challenges, leasing or procurement contracts with take-back considerations are attractive for electronic equipment.

For a large electronic equipment leasing company, optimal management of assets supported by good logistics decisions is crucial and may provide a significant competitive advantage. The leasing company tries to maximize operating profits through key decisions associated with the length of leases, efficient utilization of logistics facilities for material flow to and from customer sites, and equipment reuse, refurbishment and disposal actions.

In this research, a mixed integer linear programming (MILP) model is developed to facilitate better decisions from the perspective of an electronic equipment leasing company. A case study with representative industry data validates the approach and demonstrates the utility of the model in answering key research questions. Next, important problem uncertainties are identified and prioritized. The effects of these key uncertainties on optimal lease length and product flow decisions are examined in detail via an extended case study. It is also shown how the leasing company can make near-robust leasing decisions in the face of these uncertainties.

The computational research results also have implications for policy formulation on electronic waste. The important insights include an understanding of the potential impacts and expected effectiveness of alternative environmental legislation in different geographic areas, and the imposition of negative externalities on other policy realms as a result of this non-uniform approach. Therefore, this research contributes new models and understanding to the intersection of the fields of reverse logistics and equipment replacement, and provides valuable insights to both business asset managers and environmental policy makers.



Assume the networks satisfy a so-called resource pooling condition. We prove a maximum pressure policy asymptotically minimizes the workload processes in heavy traffic. A key to the proof is to show the network processes exhibit state space collapse.

(joint with Jim Dai)
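As a toy illustration of how a maximum pressure policy chooses among activities, the sketch below scores each activity a server can run by its service-rate-weighted queue differential (queue drained minus queue fed) and picks the largest. The network, rates, and queue lengths are invented.

```python
# Maximum pressure decision rule for a toy network (illustrative).
def max_pressure_choice(queues, activities):
    """activities: list of (source, dest, rate); dest None means exit."""
    def pressure(act):
        src, dst, mu = act
        downstream = queues[dst] if dst is not None else 0
        return mu * (queues[src] - downstream)
    return max(activities, key=pressure)

queues = [10, 2, 7]
activities = [(0, 1, 1.0),    # serve queue 0, feed queue 1: pressure 8.0
              (1, 2, 2.0),    # serve queue 1, feed queue 2: pressure -10.0
              (2, None, 1.5)] # serve queue 2, exit: pressure 10.5
chosen = max_pressure_choice(queues, activities)
```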


The conference program covers a multitude of topics relevant to assistive technologies and universal accessibility and is structured around technical papers, poster sessions, demonstrations, panels, the doctoral consortium, and a host of social events.

ASSETS 2004 will be held at the Georgia Tech Global Learning Center at Technology Square, a state-of-the-art meeting and accommodation facility that combines all the amenities and modern technologies of tomorrow with a prime midtown Atlanta location. The hotel and conference center is fully accessible to all participants.


logistics network systems. The linkage across different levels of spatial models is specified through estimating equations to utilize information considered at different stages. The multi-level models are applied in decision analysis of logistics systems, such as facility location-allocation and demand forecasting problems. System reliability of logistics service is introduced to characterize the uncertainty of supply chain disruptions. In particular, a sum-of-disjoint-products (SDP) method is presented to evaluate the system reliability. Real-life examples show the potential use of the models in characterizing logistics networks, assisting logistics planning processes, and evaluating network designs for dealing with unexpected supply chain disruptions. (Joint work with J.-C. Lu and P. H. Kvam.)


Authors: Sigrun Andradottir, Hayriye Ayhan, and Douglas G. Down


the uniformly most powerful test for the detection of linear trends. This Cusum chart is compared with several alternatives based on the likelihood ratio test and on transformations of standardized recursive residuals, on which, for instance, the Q-chart methodology is based. It turns out that the proposed Cusum chart is superior not only in detecting linear-trend out-of-control conditions, but also in detecting the other out-of-control situations considered in this paper. Approximate control limits, determined from simulation, and an example of its use in practice are given for the proposed Cusum chart. (Joint work with Ronald Does, University of Amsterdam.)
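For readers unfamiliar with Cusum recursions, a generic one-sided chart accumulates deviations above a reference value and signals once the sum crosses a control limit. This is not the recursive-residual chart of the talk, and the constants below are arbitrary:

```python
def cusum(xs, target=0.0, k=0.5, h=4.0):
    """One-sided upper CUSUM: S_t = max(0, S_{t-1} + x_t - target - k).
    Returns the first index at which S_t exceeds the control limit h,
    or None if the chart never signals."""
    s = 0.0
    for t, x in enumerate(xs):
        s = max(0.0, s + x - target - k)
        if s > h:
            return t
    return None

# A linear upward trend eventually trips the chart; a flat series does not.
trend = [0.1 * t for t in range(40)]
print(cusum(trend), cusum([0.0] * 40))
```

Because the statistic accumulates, a slow drift that never exceeds a Shewhart-style limit on any single observation is still detected, which is why Cusum-type charts are natural for linear trends.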


(Joint with Amy Ward)


This event is FREE FOR ISyE Ph.D. STUDENTS, FACULTY, AND STAFF. We encourage people to bring their families and close friends, as one of our goals is to have a "family-oriented" event. However, to cover expenses, there is a nominal $3 fee for all adult guests. There is no cost for children.

The menu will include, but not be limited to: hamburgers, veggie burgers, beef hot dogs, grilled chicken, vegetarian baked beans, and potato salad.

Water and soda will be provided as well. ALCOHOL IS PROHIBITED.

We are also planning to have a volleyball setup and, if enough interested faculty attend, a faculty vs. student volleyball match.

If you would like to attend this event, please respond to this email no later than Thursday, September 30, at 5 P.M. with the following

information:

Name:

Year in Program (2nd, 3rd, 4th, etc.):

Area of Concentration or Specialty (EDA, Optimization, Statistics, etc):

Number of Adult Guests:

Number of Children Guests:

Names of Adult Guests:

Names of Children Guests:

We also could use some volunteers to help with the following:

1. Setup of tables and chairs at approximately 3:00 P.M.

2. Volleyball setup at approximately 3:30 P.M.

3. Clean-up from approximately 7:30-9:00 P.M.

If you are available and are willing to volunteer for one or more of these, please indicate that in your reply as well. Your help would be greatly appreciated.

Thank you, and we look forward to enjoying your fellowship on Friday!!! :)


We assume that there are finitely many environment states and that the changing environment is modeled as a general stochastic process taking discrete values. At each state of the environment, the network operates as a queueing network in which each server may serve multiple classes of customers. In this study, we establish a framework to search for asymptotically optimal scheduling policies for such queueing networks. We first show that open queueing networks in a slowly changing environment can be approximated by their fluid analog, stochastic fluid models, as the network speed increases. Given a solution of the stochastic fluid model, we provide a method to derive suitable scheduling policies for the original queueing networks. We further show that the queueing networks operating under the derived policies converge to the corresponding stochastic fluid model. This result implies that the derived scheduling policies are asymptotically optimal if the given stochastic fluid model solution is optimal. We also study a stochastic fluid model to investigate the optimal resource allocation policies of Web servers serving heterogeneous customer classes, where the servers may be overloaded and operate under Quality of Service contracts.
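The fluid-approximation idea can be caricatured on a single queue rather than a network (the rates, horizon, and scaling factor below are invented): speed arrivals and services up by a factor n, scale the queue length by 1/n, and the scaled path approaches the deterministic fluid line q0 + (lam - mu)t.

```python
import random

def scaled_queue_endpoint(n, lam=1.0, mu=2.0, q0=5.0, horizon=2.0, seed=0):
    """Simulate an M/M/1 queue with rates n*lam and n*mu, started with
    n*q0 jobs, run to time `horizon`; return the queue length divided by n."""
    rng = random.Random(seed)
    q, t = int(q0 * n), 0.0
    while True:
        total = n * lam + (n * mu if q > 0 else 0.0)
        t += rng.expovariate(total)
        if t >= horizon:
            return q / n
        if q > 0 and rng.random() < (n * mu) / total:
            q -= 1
        else:
            q += 1

# Fluid prediction: q0 + (lam - mu) * horizon = 5 - 1 * 2 = 3.
print(scaled_queue_endpoint(n=2000))
```

As n grows, the random fluctuations around the fluid line shrink like 1/sqrt(n), which is the sense in which the fluid model is the right first-order object to optimize.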


a solution neighborhood, and this neighborhood is almost always tailored to the structure of the particular problem being solved. A MIP model typically conveys little information about the underlying problem structure. This talk will consider two new approaches to exploring interesting, domain-independent neighborhoods in MIP. The more effective of the two, which we call Relaxation Induced Neighborhood Search (RINS), constructs a promising neighborhood using information contained in the continuous relaxation of the MIP model. Neighborhood exploration is then formulated as a MIP model itself and solved recursively. The second, which we call guided dives, is a simple modification of the MIP tree traversal order. Loosely speaking, it guides the search towards nodes that are close neighbors of the best known feasible solution. Extensive computational experiments on very difficult MIP models show that both approaches outperform default CPLEX MIP and a previously described approach for exploring MIP neighborhoods (local branching) with respect to several different metrics. The metrics we consider are the quality of the best integer solution produced within a time limit, the ability to improve a given integer solution (of both good and poor quality), and the time required to diversify the search in order to find a new solution.
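The variable-fixing step at the heart of RINS can be sketched without a solver: fix every variable on which the incumbent and the continuous relaxation agree, and let the remaining variables span the sub-MIP neighborhood that would then be solved recursively. The variable names and values below are made up:

```python
# RINS fixing step (sketch only; the sub-MIP solve itself needs a real
# MIP solver such as CPLEX and is not reproduced here).
def rins_fixing(incumbent, relaxation, tol=1e-6):
    """incumbent: var -> value in the best known integer solution;
    relaxation: var -> value in the continuous relaxation at this node.
    Returns (variables to fix at their incumbent value, free variables)."""
    fixed, free = {}, []
    for var, x_inc in incumbent.items():
        if abs(x_inc - relaxation[var]) <= tol:
            fixed[var] = x_inc
        else:
            free.append(var)
    return fixed, free

incumbent  = {"x1": 1.0, "x2": 0.0, "x3": 1.0}
relaxation = {"x1": 1.0, "x2": 0.4, "x3": 1.0}
print(rins_fixing(incumbent, relaxation))  # x1, x3 fixed; x2 left free
```

The intuition is that agreement between the incumbent and the relaxation is evidence a variable is "settled," so the search effort is concentrated on the disagreeing variables.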


tool to reduce process variability by appropriate selection of control factors to make the process insensitive to noise. However, when strong noise factors exist in the process, robust parameter design alone may not be effective, and an on-line control strategy can be used to compensate for the effect of noise. In this paper, a parameter design methodology in the presence of feedback control is developed. Systems with a pure-gain dynamic model are considered, and the best proportional-integral (PI) and minimum mean squared error (MMSE) control strategies are developed using robust parameter design. The proposed method is illustrated with a real-life example from a urea packing plant.
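A caricature of the setting (the gain, noise model, and controller tuning below are invented, and the MMSE strategy is not shown): a pure-gain process disturbed by drifting noise, run with and without PI compensation, so the variance reduction from feedback is visible.

```python
import random

# Toy pure-gain process y_t = g * u_t + d_t with a random-walk disturbance
# d_t. A PI controller adjusts the input u from the observed deviation;
# returning the mean squared deviation lets the two modes be compared.
def run(pi_control, g=2.0, kp=0.3, ki=0.1, n=500, seed=1):
    rng = random.Random(seed)
    d, u, integral = 0.0, 0.0, 0.0
    devs = []
    for _ in range(n):
        d += rng.gauss(0.0, 0.1)        # disturbance drifts over time
        y = g * u + d                    # pure-gain dynamics
        devs.append(y)                   # deviation from target 0
        if pi_control:
            integral += y
            u = -(kp * y + ki * integral) / g
    return sum(v * v for v in devs) / n

print(run(False), run(True))  # uncontrolled MSE versus PI-controlled MSE
```

The integral term is what cancels the drift; a purely proportional controller would track the random walk with a persistent offset.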


(Joint work with Liwei Bai (Georgia Tech), Liming Liu and Weixin Shang (Hong Kong Univ. of Sci. and Tech.)). Based on the paper, L. Bai, B. Fralix, L. Liu, and W. Shang. Inter-Departure Times in Base-Stock Inventory-Queues. Queueing Systems 47 (2004) 345-361.


in the generalized linear mixed models to describe "inter-location" population variation in variance components for modeling complicated data obtained from applications such as antenna manufacturing. Our distribution studies lead to a complicated integrated extended quasi-likelihood (IEQL) for parameter estimation and large-sample inference derivations. Laplace's expansion and several approximation methods are employed to simplify the IEQL estimation procedures. Asymptotic properties of the approximate IEQL estimates are derived for general structures of the covariance matrix of random scales. Focusing on a few special covariance structures in simpler forms, the authors further simplify the IEQL estimates so that commonly used software tools, such as weighted regression, can compute the estimates easily. Moreover, these special cases allow us to derive interesting asymptotic results in much more compact expressions. Finally, numerical simulation results show that the IEQL estimates perform very well in the several special cases studied.
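Laplace's expansion, invoked above, can be illustrated in one dimension (a generic textbook example, not the IEQL integrals of the talk): an integral of exp(-n g(x)) is approximated by a Gaussian centered at the minimizer of g.

```python
import math

# Laplace approximation: for g minimized at x0,
#   integral exp(-n*g(x)) dx  ~  exp(-n*g(x0)) * sqrt(2*pi / (n * g''(x0))).
def laplace(n, g0, gpp):
    return math.exp(-n * g0) * math.sqrt(2.0 * math.pi / (n * gpp))

def riemann(n, steps=100000, a=-10.0, b=10.0):
    # Midpoint rule for the exact integral with g(x) = x**2 / 2.
    h = (b - a) / steps
    return sum(math.exp(-n * (a + (i + 0.5) * h) ** 2 / 2.0) * h
               for i in range(steps))

# With g(x) = x**2/2 (so g(x0) = 0 and g''(x0) = 1) the Gaussian
# approximation happens to be exact.
print(laplace(50, 0.0, 1.0), riemann(50))
```

For non-quadratic g the approximation carries an O(1/n) relative error, which is why it is attractive precisely for the large-sample likelihood integrals the abstract describes.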


useful in various applied fields. The method is now utilized in problems arising from business intelligence, systems engineering, and public health, in addition to those in the behavioral/social sciences. In this talk, a general class of latent variable statistical models is considered in the framework of structural equation systems. In such a system, observed variables are assumed to be related to latent factors, and the relationships among the unobservable factors are to be studied. Recent results in handling various kinds of nonlinearity are presented. Model fitting, estimation, and inference procedures are discussed.


introduced the Gamma-Minimax, or Bayes-Minimax, paradigm as a compromise between the Minimax and Bayes paradigms. Gamma-minimax actions rely on min-max-type theorems and are often hard or impossible to find. In this talk we overview linear approximations to Gamma-minimax actions and demonstrate that in many decision-theoretic problems linear approximations do not substantially affect the risk of the decision maker. The talk is a mixture of educational and research overviews and does not assume prior exposure to minimaxity and Gamma-minimaxity.


risk management. This involves extrapolation of an unknown distribution function beyond the observations. Under consideration is constructing confidence intervals for high quantiles of a heavy-tailed distribution. In this talk we introduce three methods: the normal approximation method based on Hill's estimator, the likelihood ratio method, and the data tilting method. Our simulation study shows that the data tilting method performs best in terms of the accuracy of coverage probabilities.
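The first ingredient above can be sketched end to end: Hill's estimator built from the k largest order statistics, together with its usual normal-approximation interval. For simplicity the interval below is for the tail index itself rather than a high quantile, and the sample, k, and level are arbitrary choices:

```python
import math, random

def hill_ci(data, k, z=1.96):
    """Hill's estimator of the tail index gamma = 1/alpha from the k largest
    observations, with the normal-approximation confidence interval
    gamma -+ z * gamma / sqrt(k)."""
    xs = sorted(data, reverse=True)
    gamma = sum(math.log(xs[i] / xs[k]) for i in range(k)) / k
    half = z * gamma / math.sqrt(k)
    return gamma, (gamma - half, gamma + half)

rng = random.Random(7)
# Pareto(alpha = 2) sample: true tail index gamma = 1/alpha = 0.5.
sample = [rng.paretovariate(2.0) for _ in range(5000)]
gamma, (lo, hi) = hill_ci(sample, k=200)
print(gamma, lo, hi)
```

The choice of k is the delicate part in practice: too small and the estimate is noisy, too large and observations from outside the tail bias it, which is one reason alternatives such as the likelihood ratio and data tilting methods are of interest.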


Modeling and simulation is widely regarded as one of the breakthrough technologies that will accelerate progress in addressing the grand challenges facing manufacturing in the next decade. In this seminar, new research challenges arising from the increased criticality of inter-factory dependencies in today



(1990) suggested the SEL method to find an optimal setting. Genetic algorithms (GA) can be used to improve upon this method. To make the search procedure more efficient, the new ideas of a forbidden array and weighted mutation are introduced. By relaxing the condition of orthogonality, the GA can accommodate a variety of design points, which allows more flexibility and enhances the chance of finding the best setting in fewer runs, particularly in the presence of interactions. The search procedure is enriched by a Bayesian method for identifying the important main effects and two-factor interactions. Illustrations are given with the optimization of three functions, one of which is from Shekel's family.
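A stripped-down GA conveys the flavor: the role of the forbidden array (avoiding re-evaluation of designs already tried) is played here by a plain set of visited designs, while weighted mutation and the Bayesian effect screening of the talk are omitted. The onemax objective is a stand-in test function, not one of the talk's three:

```python
import random

def toy_ga(f, bits=16, pop=20, gens=30, seed=3):
    """Maximize f over 0/1 strings with truncation selection, one-point
    crossover, single-bit mutation, and a visited-set check standing in
    for the forbidden array."""
    rng = random.Random(seed)
    seen = set()
    population = [tuple(rng.randint(0, 1) for _ in range(bits))
                  for _ in range(pop)]
    best = max(population, key=f)
    for _ in range(gens):
        seen.update(population)
        parents = sorted(population, key=f, reverse=True)[:pop // 2]
        children, attempts = [], 0
        while len(children) < pop:
            attempts += 1
            a, b = rng.sample(parents, 2)
            cut = rng.randrange(1, bits)
            child = list(a[:cut] + b[cut:])
            i = rng.randrange(bits)
            child[i] = 1 - child[i]                    # mutation
            child = tuple(child)
            if child not in seen or attempts > 10 * pop:  # forbidden check
                children.append(child)
        population = children
        best = max([best] + population, key=f)
    return best

best = toy_ga(lambda x: sum(x))   # optimum is the all-ones string
print(sum(best))
```

Forbidding revisits matters most when function evaluations are expensive, which is exactly the experimental-design setting of the talk.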


For those of you who came to our ISyE tailgate party this past October, this event will be very similar. We will again have a picnic-style setup with GRILLED MEATS, CHIPS, AND DRINKS. However, this time we will have FRISBEES AND A VOLLEYBALL SET-UP in addition to other games and activities for your enjoyment.

With regard to food, there will be NON-MEAT OPTIONS available for vegetarians. In addition, there will be NO PORK among the meat products.

PLEASE RSVP NO LATER THAN THURSDAY, APRIL 15 USING THE FORM BELOW if you plan to attend. Like last time, THERE IS A $2 CHARGE FOR ALL ADULT GUESTS, BUT NO CHARGE FOR CHILDREN. So, for those of you with families and/or significant others, we encourage you to bring them, if possible.

Name:

Years in ISyE:

Number of Guests:

Names of Guests (Kids in Parentheses):

If Date is Moved to 4/30 Due To Rain, Can You Attend?

(The rest of the form is for Ph.D. students only.)

Concentration:

Advisor:

In case of rain, the alternate date for the event has been set for Friday, April 30.

This event is being sponsored by the Georgia Tech Chapter of INFORMS, with approval from its faculty advisor Dr. Dave Goldsman.

Thank you.

W. Brad Jones

_________________________________


W. Brad Jones, Ph.D. Student

Georgia Institute of Technology

School of Industrial and Systems Engineering

310 ISyE Main Building

Atlanta, GA 30332


Once again, our festival will feature a Mexican Fajita & Taco Bar.



practice of stochastic programming, namely the development of Monte Carlo simulation-based optimization techniques and the mathematical theory of risk measures.


The modeling and the methodological part of this talk are motivated by the following three scientific questions:

(a) How does one "statistically quantify" the unknown amount of gene in a "sample" using PCR amplification methods?

(b) What are the factors affecting the amplification rate of a PCR experiment?

(c) How does one efficiently design experiments to study the mutations induced by PCR experiments?

Statistical quantification can lead to an accurate estimate of the HIV-1 viral load in HIV-1 infected patients. This not only helps with disease diagnosis but also with disease prognosis. Answers to questions (b) and (c) will facilitate a better understanding of the dynamics of a PCR process.
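The quantification in question (a) rests, in its simplest textbook idealization (not the speaker's statistical model), on exponential amplification: after k cycles at per-cycle efficiency p, the expected copy number is N_k = N_0 (1 + p)^k, so a naive estimate of the initial quantity inverts this growth.

```python
# Naive PCR quantification under the exponential-growth idealization:
# invert N_k = N_0 * (1 + p)**k to recover the initial copy number N_0.
def estimate_initial(n_k, cycles, efficiency):
    return n_k / (1.0 + efficiency) ** cycles

n0 = 50.0
grown = n0 * (1.0 + 0.9) ** 30            # simulate 30 cycles at 90% efficiency
print(estimate_initial(grown, 30, 0.9))   # recovers 50.0 up to rounding
```

The statistical difficulty the talk addresses is that the efficiency p is unknown, varies across cycles, and is itself random at low copy numbers, so this deterministic inversion is only a starting point.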


stationary Gaussian time series by Bayesianly induced shrinkage of empirical wavelet coefficients is studied. A model in the wavelet domain is proposed that accounts for the distributional properties of the log-periodogram at levels of fine detail and for approximate normality at coarse levels of the wavelet decomposition. The smoothing procedure, called BAMS-LP (Bayesian Adaptive Multiscale Shrinker of Log-Periodogram), ensures that the reconstructed log-spectrum is as noise-free as possible. It is also shown that the resulting Bayes estimators are asymptotically optimal (in the frequentist sense). Comparisons with non-wavelet and wavelet-non-Bayesian methods are discussed. This is joint work with Marianna Pensky from UCF.
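The object being smoothed is the log-periodogram, which needs only a discrete Fourier transform to compute; this sketch stops before the wavelet decomposition and Bayesian shrinkage steps:

```python
import cmath, math, random

def log_periodogram(x):
    """Log of I(w_j) = |sum_t x_t exp(-i w_j t)|^2 / (2*pi*n) at the
    Fourier frequencies w_j = 2*pi*j/n, j = 1, ..., n//2."""
    n = len(x)
    out = []
    for j in range(1, n // 2 + 1):
        s = sum(x[t] * cmath.exp(-2j * math.pi * j * t / n) for t in range(n))
        out.append(math.log(abs(s) ** 2 / (2 * math.pi * n)))
    return out

# White noise has the flat spectrum sigma^2 / (2*pi); its log-periodogram
# fluctuates around (roughly) the log of that level.
rng = random.Random(0)
x = [rng.gauss(0.0, 1.0) for _ in range(256)]
lp = log_periodogram(x)
avg = sum(lp) / len(lp)
print(avg)
```

The heavy, skewed fluctuations of the log-periodogram at individual frequencies are exactly why the fine-detail wavelet coefficients need a non-Gaussian model, as the abstract indicates.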


The meeting is open to all interested parties and is free of charge. Please pass this information along to any associates who you think might be interested.

The meeting will be in the Executive Classroom (Room 228) of the ISyE Main Entrance Bldg. (755 Ferst Drive, formerly the Dupree School of Management Building).

Here's a link to a map of the campus: http://www.isye.gatech.edu/visitors/maps/

There should be plenty of free parking in the area after 5:00 PM.

Kevin Geraghty (kevin@revenueresearch.com) is organizing an informal dinner after the meeting. Please contact him if you are interested in participating so he can make an appropriate reservation at a nearby restaurant.

We look forward to seeing you there. Please contact me if you have questions or suggestions, or if you wish to be added to the mailing list for future meetings.


chi-square). To test H_0: mu = mu_0 one computes or bounds P-value(mu_0) = F_T(observed T(mu_0, mu^)); it is a distribution function of mu_0 (whose probability density one could derive). Define its inverse mu^(u) by Q_T(u) = T(mu^(u), mu^); mu^(u), called the parameter with P-value u, is a quantile function which has a pseudo-Bayesian interpretation as the conditional quantile of mu given the data. The conventional confidence-level 1-a confidence interval can be shown to be (mu^(a/2), mu^(1-a/2)). Define (and plot on the same graph with the exponential and normal) the informative quantile/quartile function Q/Q(u) = (Q(u) - midquartile)/(2 IQR). The talk could also discuss confidence Q-Q plots, conditional quantiles, comparison distributions, mid-distributions, and the definitions of sample quantiles, linear rank statistics, and the sample variance.
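In the Gaussian-mean special case (a hypothetical illustration with known sigma and a one-sided P-value convention, not the general setup above), the parameter with P-value u has a closed form, and the conventional interval falls out of it:

```python
import math

def phi_inv(u, lo=-10.0, hi=10.0):
    """Standard normal quantile by bisection on the CDF (stdlib only)."""
    cdf = lambda z: 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))
    for _ in range(80):
        mid = (lo + hi) / 2.0
        lo, hi = (mid, hi) if cdf(mid) < u else (lo, mid)
    return (lo + hi) / 2.0

def mu_at_pvalue(u, xbar, sigma, n):
    """mu^(u): the value of mu_0 whose (one-sided) P-value equals u."""
    return xbar + phi_inv(u) * sigma / math.sqrt(n)

xbar, sigma, n, a = 10.0, 2.0, 25, 0.05
lo, hi = mu_at_pvalue(a / 2, xbar, sigma, n), mu_at_pvalue(1 - a / 2, xbar, sigma, n)
print(lo, hi)  # the usual interval xbar -+ 1.96 * sigma / sqrt(n)
```

Reading confidence limits as quantiles of mu^(u) is what gives the P-value function its pseudo-Bayesian flavor.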


(SMLE) is more flexible than traditional methods such as parametric maximum likelihood estimation, Cox's proportional hazards model, the accelerated failure time model, quasi-likelihood, and generalized estimating equations, with far fewer restrictions on distributions and regression models. The needed information about the distribution and regression structures is incorporated in the estimating equations of the SMLE to improve the estimation quality of nonparametric methods. The likelihood of the SMLE in censored-data cases involves several complicated implicit functions without closed-form expressions, and the first derivatives of the log-profile-likelihood cannot be expressed as sums of independent and identically distributed random variables.

For group-censored data and continuous data, it is verified that all the implicit functions are well defined, and the asymptotic distributions of the SMLE for model parameters and lifetime distributions are obtained.

A real-life example with HIV data is presented to illustrate the application of the SMLE method.


If you are not able to make this meeting, but have comments to share, please contact Ellis Johnson.


coefficient estimation in linear regression models. The method is based on a particular hierarchical Bayes formulation, and the estimator is shown to be closely related to the LASSO estimator. This connection allows us to take advantage of the recently developed quick LASSO algorithm to compute the empirical Bayes estimate, and it provides new ways to select the tuning parameter in the LASSO method. Unlike previous empirical Bayes variable selection methods, which in most practical situations can only be implemented through a greedy stepwise algorithm, our method gives a global solution efficiently. Simulations show that the proposed method compares favorably with other variable selection and estimation methods in terms of variable selection, estimation accuracy, and computation speed. This is joint work with Professor Yi Lin.
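The LASSO connection invoked above is easiest to see in the orthonormal-design special case, where the LASSO solution is exactly soft-thresholding of the least-squares coefficients (a standard fact; the hierarchical Bayes formulation and empirical Bayes tuning of the talk are not reproduced here):

```python
# Soft-thresholding: the closed-form LASSO solution for an orthonormal
# design. Each least-squares coefficient z_j is shrunk toward zero by lam
# and truncated at zero, combining shrinkage with variable selection.
def soft_threshold(z, lam):
    return [(abs(v) - lam) * (1 if v > 0 else -1) if abs(v) > lam else 0.0
            for v in z]

print(soft_threshold([3.0, -0.4, 1.2, 0.1], lam=0.5))
```

Small coefficients are set exactly to zero while large ones survive (shrunk by lam), which is why a prior inducing this shape links Bayes estimation to variable selection.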


inference problems and vital for many network traffic engineering tasks, such as dynamic routing optimization and Quality of Service (QoS) guarantees. We propose a pseudo-likelihood approach for estimating the parameters of these problems based on the principle of divide-and-conquer. The multicast link delay estimation problem is used to motivate the concept. We then apply the pseudo-likelihood approach to the problem of estimating the origin-destination matrix from link traffic counts, which is one of the core problems in network traffic engineering. This is joint work with Bin Yu.
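In the same divide-and-conquer spirit, though far simpler than the pseudo-likelihood machinery itself, low-order statistics of the link counts can identify origin-destination flows. The toy network below (three independent Poisson OD flows sharing two links, with made-up rates) has link1 = x1 + x3 and link2 = x2 + x3, so cov(link1, link2) = m3 identifies the shared flow and the link means then give the rest:

```python
import math, random

def poisson(rng, lam):
    # Knuth's method, adequate for small rates.
    limit, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= rng.random()
        if p <= limit:
            return k
        k += 1

def estimate_od(link1, link2):
    """Moment-based estimates (m1, m2, m3) of the three OD flow rates."""
    n = len(link1)
    mean1, mean2 = sum(link1) / n, sum(link2) / n
    cov = sum((a - mean1) * (b - mean2) for a, b in zip(link1, link2)) / n
    return mean1 - cov, mean2 - cov, cov

rng = random.Random(11)
m1, m2, m3 = 4.0, 6.0, 2.0
link1, link2 = [], []
for _ in range(20000):
    x1, x2, x3 = poisson(rng, m1), poisson(rng, m2), poisson(rng, m3)
    link1.append(x1 + x3)
    link2.append(x2 + x3)
print(estimate_od(link1, link2))  # close to (4, 6, 2)
```

On a real network the routing matrix makes the full joint likelihood intractable, which is the gap pieces-at-a-time criteria such as pseudo-likelihood are designed to fill.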


methods proposed for diagnosis, forecasting, and classification using multivariate data. The primary proponent of the MTS is Genichi Taguchi, who is very well known for his controversial ideas and methods for using design of experiments. The MTS was claimed to be a groundbreaking new philosophy for multivariate data analysis and diagnosis, and it has been popularized in major companies such as Ford, GE, Nissan, Sharp, and Xerox. In this talk, we review the methods of the MTS and use a case study based on medical data to illustrate them. We also compare common classification tree methods with the MTS in the case study and illustrate the differences.
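At the heart of the MTS is the Mahalanobis distance of a new observation from a "normal" reference group. A two-variable sketch with the covariance matrix inverted by hand (the mean, covariance, and test point are made up):

```python
# Squared Mahalanobis distance in two dimensions:
#   d^2 = (x - mean)' * inv(cov) * (x - mean),
# with the 2x2 inverse written out explicitly.
def mahalanobis2(x, mean, cov):
    (a, b), (c, d) = cov
    det = a * d - b * c
    inv = ((d / det, -b / det), (-c / det, a / det))
    v = (x[0] - mean[0], x[1] - mean[1])
    return (v[0] * (inv[0][0] * v[0] + inv[0][1] * v[1])
            + v[1] * (inv[1][0] * v[0] + inv[1][1] * v[1]))

mean = (0.0, 0.0)
cov = ((1.0, 0.0), (0.0, 4.0))
print(mahalanobis2((0.0, 4.0), mean, cov))  # 4.0: two "standard deviations" out
```

Unlike Euclidean distance, the Mahalanobis distance discounts directions in which the reference group varies widely, which is what makes it a natural multivariate "abnormality" score.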


manufacturing processes. They are very informative for process monitoring and control in nano-machining, ultra-thin semiconductor fabrication, and antenna, steel-stamping, or chemical manufacturing processes, as seen throughout the literature. In this talk, we present wavelet-based statistical process control (SPC) procedures and evaluate their performance using simulation studies. Unlike the recent SPC research on linear profile data for monitoring global changes in data patterns, our methods focus on local changes in data segments. In contrast to most SPC procedures developed for detecting a known type of process change, our idea of updating the selected parameters enables the handling of all types of process changes, whether known or unknown. (Joint work with Jye-Chyi Lu.)
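The local sensitivity that wavelet coefficients provide can be shown with a single Haar level (the profile and threshold below are arbitrary, for illustration only):

```python
import math

# One level of the Haar transform: each detail coefficient is a scaled
# difference of a neighboring pair, so it reacts to *local* changes in a
# profile while ignoring a globally flat level.
def haar_details(x):
    return [(x[2 * i] - x[2 * i + 1]) / math.sqrt(2.0)
            for i in range(len(x) // 2)]

profile = [0.0] * 16
profile[9] = 3.0                   # a local spike in an otherwise flat profile
details = haar_details(profile)
flagged = [i for i, d in enumerate(details) if abs(d) > 1.0]
print(flagged)  # only the pair containing the spike is flagged
```

A chart on raw profile averages would dilute this spike across the whole profile; monitoring the detail coefficients localizes it to one segment, which is the behavior the procedures above exploit.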


Please mark this date and time on your calendars.
