All tutorials take place in Room 215 of the Convention Center.
Model Uncertainty, Robust Optimization and Learning
George J. Shanthikumar, Purdue University; Andrew E.B. Lim and Max Z. Shen, University of California-Berkeley
We will present approaches to addressing possible statistical and structural errors in modeling. Statistical errors can become significant because classical modeling under uncertainty assumes a full probabilistic characterization; the learning needed to implement the policies derived from such models is accomplished either through (1) classical statistical estimation procedures or (2) subjective Bayesian priors. The standard approaches to addressing statistical errors are to develop better statistical estimation procedures or to use robust Bayesian models, while model validation and model ranking are used to minimize the effect of structural (and statistical) errors. Different models of model uncertainty will be discussed, and different approaches to robust optimization, with and without benchmarking, will be presented. Two alternative learning approaches, Objective Bayesian Learning and Operational Statistical Learning, will also be discussed. To address structural errors and avoid death-spiral effects, we will introduce Objective Operational Learning approaches that incorporate exploration and exploitation phases. Throughout the tutorial we will use queueing control, inventory control, product portfolio selection, call center staffing, financial asset allocation and medical decision making problems as examples to illustrate these ideas.
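To make the contrast between classical and robust modeling concrete, here is a minimal sketch (not drawn from the tutorial itself; all numbers and model names are illustrative) of a toy inventory decision under model uncertainty. A nominal policy optimizes expected profit under a single assumed demand model, while a robust policy maximizes the worst-case expected profit across several plausible demand models.

```python
from statistics import mean

# Toy newsvendor economics; values are illustrative only.
price, cost = 5.0, 3.0

def expected_profit(q, demand_scenarios):
    """Expected profit of order quantity q under one candidate demand model,
    represented here as a small list of equally likely demand scenarios."""
    return mean(price * min(q, d) - cost * q for d in demand_scenarios)

# Three plausible demand models; a robust policy guards against all of them.
models = {
    "low":  [70, 80, 90],
    "base": [90, 100, 110],
    "high": [110, 120, 130],
}

candidates = range(60, 141, 10)
# Classical approach: optimize against the single "base" model.
nominal_q = max(candidates, key=lambda q: expected_profit(q, models["base"]))
# Robust approach: maximize the worst-case expected profit across all models.
robust_q = max(candidates,
               key=lambda q: min(expected_profit(q, m) for m in models.values()))
print(nominal_q, robust_q)  # the robust order is more conservative
```

The robust order quantity hedges against the possibility that the base model is structurally wrong, at the cost of some expected profit when the base model happens to be correct.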
Unlocking the Hidden Value in Your Data with SAS Analytics
Udo Sglavo and Jinxin Yi, SAS Institute Inc.
Organizations have faced the challenge of dealing with data for decades, but the enormous growth in the amount of data the global economy now generates (transactional data, demographic data, data from social media and web sources, and more) has become overwhelming. More than ever, it is critical to understand how the information locked in that data can be leveraged to drive actionable business decisions. Analytics is the key that can unlock that hidden value. The power of analytical models is integral to organizational dynamics and has a direct impact on company performance. Companies across the globe are using analytics to explore and analyze data, uncovering patterns and insights that can drive evidence-based decision making. This tutorial provides an overview of analytical methods, such as inventory optimization and statistical forecasting, with a focus on applying those methods to solve specific business problems.
Recent Linear Programming Developments for Operations Research and Management Science
Yinyu Ye, Stanford University
Linear programming (LP) has been a core operations research and management science model since 1947. Thanks to relentless research on LP algorithms, a linear program can be solved today one million times faster than it could be twenty years ago. Businesses, large and small, now use LP models to control manufacturing inventories, price commodities, design civil and communication networks, and plan investments. LP has even become a popular subject in undergraduate and MBA curricula, advancing human knowledge and promoting science education. Many new and important computational problems are now emerging or reemerging. In particular, there is a growing body of models, theories and algorithms for problems arising in Internet economics, information networks, auctions and games, stochastic and online decision making, and social organization issues enabled by the Web. The aim of this tutorial is to describe several modern linear programming developments and applications in today's service economy, such as auctions, pricing, mechanism design and information aggregation. We will also present a few recent algorithmic developments and future research directions for linear programming.
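For readers new to LP, here is a minimal sketch (illustrative only; this is not how modern solvers work) that solves a tiny two-variable linear program by enumerating the vertices of its feasible polygon, using the fact that a bounded LP attains its optimum at a vertex. The example maximizes 3x + 5y subject to x &lt;= 4, 2y &lt;= 12, 3x + 2y &lt;= 18, and x, y &gt;= 0.

```python
from itertools import combinations

# Constraints a*x + b*y <= c, with non-negativity written as -x <= 0, -y <= 0.
constraints = [
    (1, 0, 4),    # x        <= 4
    (0, 2, 12),   # 2y       <= 12
    (3, 2, 18),   # 3x + 2y  <= 18
    (-1, 0, 0),   # x        >= 0
    (0, -1, 0),   # y        >= 0
]

def intersect(c1, c2):
    """Solve the 2x2 system where both constraints hold with equality."""
    a1, b1, d1 = c1
    a2, b2, d2 = c2
    det = a1 * b2 - a2 * b1
    if abs(det) < 1e-9:          # parallel lines: no vertex
        return None
    return ((d1 * b2 - d2 * b1) / det, (a1 * d2 - a2 * d1) / det)

def feasible(pt):
    return all(a * pt[0] + b * pt[1] <= c + 1e-9 for a, b, c in constraints)

# The optimum of a bounded LP lies at a vertex of the feasible region.
vertices = [p for c1, c2 in combinations(constraints, 2)
            if (p := intersect(c1, c2)) and feasible(p)]
best = max(vertices, key=lambda p: 3 * p[0] + 5 * p[1])
print(best, 3 * best[0] + 5 * best[1])
```

Vertex enumeration grows combinatorially with problem size; the simplex and interior-point (barrier) methods that production solvers use exploit the same vertex/polyhedral structure far more efficiently.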
Optimization Software: Algorithms, Computation and Applications
Zonghao Gu, Gurobi Optimization
We will focus on linear, quadratic and integer programming, briefly discuss the history of mathematical programming software, and review important algorithms, such as the primal and dual simplex and barrier methods for linear and quadratic programming and the branch-and-cut algorithm for mixed integer programming. Computation is always a crucial part of software development; this talk will discuss the different technologies used in commercial optimization solvers and how they determine the degree of algorithmic progress. We will also give computational results indicating the current state of the art. Finally, we will discuss several applications that show the benefits this progress in software brings to real-world problems.
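To give a feel for the branch-and-cut idea, here is a minimal sketch (illustrative numbers of our own choosing) of its branch-and-bound skeleton on a tiny 0/1 knapsack problem: branch on include/exclude decisions and prune any subtree whose fractional (LP-style) relaxation bound cannot beat the best integer solution found so far. Real branch-and-cut additionally tightens these bounds with cutting planes, which this sketch omits.

```python
# Items as (value, weight), already sorted by value/weight ratio
# (required for the greedy fractional bound below); knapsack capacity.
items = [(60, 10), (100, 20), (120, 30)]
capacity = 50

def relaxation_bound(i, value, room):
    """Upper bound from the fractional relaxation: fill greedily by
    value/weight ratio, allowing one fractional item at the end."""
    for v, w in items[i:]:
        if w <= room:
            value, room = value + v, room - w
        else:
            return value + v * room / w   # fractional piece of the next item
    return value

best = 0
def branch(i, value, room):
    """Depth-first branch and bound over include/exclude decisions."""
    global best
    if value > best:
        best = value                        # new incumbent
    if i == len(items) or relaxation_bound(i, value, room) <= best:
        return                              # prune: bound cannot beat incumbent
    if items[i][1] <= room:                 # branch: include item i
        branch(i + 1, value + items[i][0], room - items[i][1])
    branch(i + 1, value, room)              # branch: exclude item i

branch(0, 0, capacity)
print(best)   # best total value found
```

The pruning test is what makes the method practical: whole regions of the search tree are discarded as soon as their relaxation bound falls below the incumbent.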
Marriage of Simulation and Optimization: Theories and Examples
Jeff Hong, Hong Kong University of Science and Technology
Simulation and optimization are two widely used operations research tools. Combining them, in what is often known as optimization via simulation (OvS), provides a unique and powerful approach that can solve many problems previously considered difficult. Based on the information available about the simulation model, we categorize OvS problems into two categories: white-box and black-box OvS. This tutorial introduces the difficulties and the latest theoretical developments for both types of problems, provides examples of problem formulations as well as solution algorithms, shows how to translate black-box problems into white-box ones in certain situations, and offers my view on possible future research directions for this exciting area.
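As a concrete (and deliberately simplified) illustration of black-box OvS, the sketch below picks an order quantity for a noisy newsvendor-style simulation by estimating each candidate's expected profit from replications and returning the apparent best. All function names and parameter values here are our own, not from the tutorial.

```python
import random
from statistics import mean

random.seed(42)

def simulate_profit(order_qty):
    """One noisy replication of a newsvendor day: demand is random,
    unmet demand is lost, leftovers are salvaged below cost."""
    price, cost, salvage = 5.0, 3.0, 1.0
    demand = max(random.gauss(100, 20), 0)
    sold = min(order_qty, demand)
    return price * sold + salvage * (order_qty - sold) - cost * order_qty

def ovs_search(candidates, reps=2000):
    """Black-box OvS over a finite candidate set: estimate each candidate's
    expected profit by averaging replications, then return the apparent best."""
    estimates = {q: mean(simulate_profit(q) for _ in range(reps))
                 for q in candidates}
    return max(estimates, key=estimates.get)

best_q = ovs_search(range(80, 131, 10))
print(best_q)
```

Because the objective is only estimated, the "best" candidate is itself a random outcome; ranking-and-selection procedures and variance-reduction devices such as common random numbers are standard ways to control that selection error.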
Challenges in Modern Data Analysis
Ming Yuan, Georgia Institute of Technology
The way science is done today is very different from the past, and big data is becoming the new microscope for big science. Immediate access to copious amounts of interesting and important data presents unprecedented opportunities, but also creates unique challenges. A distinctive characteristic that often sets these massive data sets apart from the usual subjects of traditional statistical analysis is their high dimensionality, which means that information can be abundant yet at the same time elusive. The difficulty in dealing with high dimensional data stems from the apparent intractability of approximating, integrating, optimizing and, consequently, estimating a high dimensional function. Developing statistical theory to understand the nature of this characteristic, and methodology to address the associated issues, will advance our intellectual exploration and knowledge, and will undoubtedly benefit a multitude of scientific and technological fields. In this tutorial, I will discuss some of the recent advances and the challenges ahead in tackling these problems, and their potential impact in other areas.
The Recent Financial Crisis and Two Related Financial Engineering Research Problems
Xuedong He, Columbia University; Xianhua Peng, Hong Kong University of Science & Technology
Two main causes of the recent financial crisis were excessive risk taking, driven by the limited liability of fund managers and corporations (profits are shared, but losses are not), and the inability to value the housing market fairly, which partly led to the housing bubble. In this tutorial we will present some broad research questions in financial engineering related to these causes. In particular, we will discuss how to design a better hedge fund performance fee scheme that improves the satisfaction of regulators, fund managers and fund investors simultaneously, and how to model housing prices. No finance background is assumed for this tutorial.
Our thanks to these organizations for their generous support of INFORMS International Beijing.
Operations Research Society
Systems Engineering Society
Chinese Society for Optimization, Overall Planning and Economic Mathematics
Chinese Society for Management Modernization
May 5, 2012
Authors’ deadline for final abstract changes
May 14, 2012
Early registration deadline