INFORMS Practice Conference

Poster Presentations
Hilton Waterfront Beach Resort

Sixty poster presentations will offer an inside look at analytics and O.R. across a wide range of industries and organizations, dealing with a variety of problems and solutions.  Poster sessions take place on Monday and Tuesday afternoons, during special dessert receptions.

Monday, April 16

1. Cash Logistics Optimization
Ahu Akdemir, Graduate Student, Bogazici University.
Idle money in branches and ATMs is an important cost for a bank. On the other hand, customer satisfaction also depends on the availability of money. Deciding how much money to hold in branches and in ATMs is therefore vital. Moreover, as gas prices increase, each money transfer becomes an important component of the bank's cost. The main objective is to adjust cash levels so that total cost is minimized, and the main constraint is customer satisfaction, which is strongly affected by cash-out situations. Since money withdrawal is a random process, it must be forecast from historical data before the planning process. Many commercial solutions for this problem focus on the forecasting step and solve the planning part heuristically. In this project, we propose an autoregressive forecasting methodology followed by an integer programming formulation for the cash logistics optimization problem.
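
A minimal two-stage sketch of the approach described, fitting an AR(1) model to an invented withdrawal history and feeding the forecast into a small integer program built with the PuLP library; the data, cost figures and model structure here are illustrative assumptions, not the authors' actual formulation.

```python
# Illustrative sketch: AR(1) withdrawal forecast feeding an integer program
# for ATM replenishment. All data and cost figures are invented.
import numpy as np
import pulp

# Stage 1: autoregressive forecast of daily withdrawals, y_t = c + phi*y_{t-1}
history = np.array([52, 48, 60, 55, 58, 62, 57, 61, 59, 64], dtype=float)
phi, c = np.polyfit(history[:-1], history[1:], 1)   # least-squares AR(1) fit

horizon, last, forecast = 5, history[-1], []
for _ in range(horizon):
    last = c + phi * last
    forecast.append(float(last))

# Stage 2: integer program choosing delivery quantities and truck visits
hold_cost, trip_cost, truck_cap, safety = 0.02, 40.0, 200, 10.0
prob = pulp.LpProblem("cash_logistics", pulp.LpMinimize)
q = [pulp.LpVariable(f"q_{t}", lowBound=0, cat="Integer") for t in range(horizon)]
z = [pulp.LpVariable(f"z_{t}", cat="Binary") for t in range(horizon)]  # truck visit?
inv, level = [], 80.0                     # opening cash position
for t in range(horizon):
    level = level + q[t] - forecast[t]    # cash balance after day t
    inv.append(level)
prob += pulp.lpSum(hold_cost * inv[t] + trip_cost * z[t] for t in range(horizon))
for t in range(horizon):
    prob += inv[t] >= safety              # service constraint: avoid cash-outs
    prob += q[t] <= truck_cap * z[t]      # deliveries require a (costly) trip
prob.solve(pulp.PULP_CBC_CMD(msg=False))
print("deliveries:", [int(v.value()) for v in q])
```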

2. Data Envelopment Analysis of the First Year Courses at Appalachian State University
William C. Bauldry, Professor of Mathematics, Appalachian State University
We will present an adaptation and application of Data Envelopment Analysis (DEA) to analyze four large-enrollment, first-year, general education course areas at Appalachian State University. DEA, developed in the 1970s, is normally used to compare dissimilar departments within a division or similar departments across different organizations. We are treating the separate activities (our courses) as decision-making units for DEA. Our analysis will produce a relative efficiency index for each activity. The project is being conducted by a team of graduate and undergraduate students from our Operations Research course. That a team of analytics neophytes can easily engage in such a project demonstrates the power and utility of DEA.
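
For readers new to the technique, the standard input-oriented CCR model (after the Charnes-Cooper linearization) computes the relative efficiency of a decision-making unit $o$ with inputs $x_{io}$ and outputs $y_{ro}$; this is textbook background rather than the authors' specific formulation:

$$\max_{u,v}\ \sum_r u_r\,y_{ro}\quad\text{s.t.}\quad \sum_i v_i\,x_{io}=1,\qquad \sum_r u_r\,y_{rj}-\sum_i v_i\,x_{ij}\le 0\ \ \forall j,\qquad u_r,\,v_i\ge 0.$$

The optimal objective value is the unit's relative efficiency index in $[0,1]$; units scoring 1 define the efficient frontier against which the others are compared.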

3. Analytics for City Services and Safety
Mehmet Ferhat Candas, Research Engineer, IBM TJ Watson Research Center
With increasing urban population and aging infrastructure, city operators are challenged to provide better service. IBM is working with three cities on developing solutions for supporting operations & maintenance, including field service optimization, predictive maintenance, usage & revenue optimization, coordinated capital planning and operations management. The core challenge for city organizations is to do "more with less." This translates into making efficient, coordinated decisions across agencies within the city. Furthermore, each agency needs to perform a long-term sustainability analysis to ensure that it is not postponing key infrastructure replacement, rehabilitation, and repair decisions in ways that create large future costs. The solutions we have developed target these questions and bring descriptive, predictive and prescriptive analytics into a unified framework to help city agencies make better decisions.

4. Optimization for a Client with Large-Scale Constrained Problems: A Case Study
Jeremy Walton, Senior Technical Consultant, Numerical Algorithms Group
We describe the solution we provided to a marketing analytics organization that was searching for optimal points in a number of large-scale constrained problems sharing the same structure.  We found that sequential quadratic programming (SQP) active set methods were most appropriate for this class of problems, and also discovered that the number of constraints in the client's original problem could be reduced, thereby increasing the stability of the solver and decreasing the time to solution by a factor of ~30.  In addition, we tested a solver from an alternative source and discovered that it was well suited to this problem structure.  We were able to implement a set of consistent interfaces to this solver, which allowed it to be easily plugged into the client's existing application structure.  We continue to support the solver, which is being actively used by the client.
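
The abstract does not disclose the client's objective or constraints, so the following is only a generic illustration of solving a smooth constrained problem with an SQP-type method, here SciPy's SLSQP solver; the objective and constraints are invented.

```python
# Generic constrained minimization with an SQP-type method (SciPy's SLSQP).
# The quadratic objective and linear inequality constraints are invented.
import numpy as np
from scipy.optimize import minimize

def objective(x):
    return (x[0] - 1.0) ** 2 + (x[1] - 2.5) ** 2

constraints = [                                   # each fun(x) >= 0 when feasible
    {"type": "ineq", "fun": lambda x: x[0] - 2 * x[1] + 2},
    {"type": "ineq", "fun": lambda x: -x[0] - 2 * x[1] + 6},
    {"type": "ineq", "fun": lambda x: -x[0] + 2 * x[1] + 2},
]
res = minimize(objective, x0=np.array([2.0, 0.0]), method="SLSQP",
               bounds=[(0, None), (0, None)], constraints=constraints)
print(res.x, res.fun)                             # constrained optimum
```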

5. Digging into Dealership Data
Mark Colosimo, Global Director of Analytics, Urban Science
In the automotive world, a substantial amount of data is accumulated and purchased by OEMs and dealers.  Sales, service, parts, financing, customer satisfaction and other sources are readily available for analysis and application to improving performance.  Unfortunately, very few of these datasets are actually combined to perform any analysis on the operations and performance of dealerships.  Our goal is to use these datasets in combination to statistically and optimally uncover best practices amongst dealerships to assist in: (a) improving dealership performance (profitability), and (b) selling more OEM products (generally vehicles and parts).  We will first identify the key drivers of performance and then apply these findings to optimize performance scenarios in dealerships to obtain a “win-win” outcome.  It is rare that optimizing both dealership and OEM goal-based metrics is performed.  This will provide a concise method by which both can achieve goals without the sacrifice of one for the other.

6. Use of Predictive Analytics and Optimization Technique to Improve Traffic Flow in Bangalore
Rajdeep Debnath, Student, Indian Institute of Management Bangalore
Infrastructure is scaled up and resources are optimized to meet a city's growing demand. But when demand follows a steep growth curve, predictive analytics and optimization techniques are needed for a long-term solution. Predictive analytics applied to historical and real-time demand data provides insight into which factors drive the growing demand for resources and what the characteristics of that demand are. Based on those findings, operations research techniques help arrive at an optimized scenario.

7. ERP Integrated Supply Network Design Using Milk Run: A Case Study in the Automotive Industry
Burak Erkayman, PhD Student, Ataturk University
In the automotive industry, the supply network for picking up parts from automotive suppliers should be designed for minimum transportation cost and WIP (work in process) level. An efficient design requires an ERP system as the infrastructure, and a technique suited to the company's supply structure must be used in the design. The objective of this study is to manage a 2 million USD/year transportation cost using a milk-run system instead of traditional methods, reducing transportation costs and minimizing WIP and main-warehouse inventory at a company manufacturing light commercial vehicles in Turkey. We present a supply structure that optimizes these KPIs using milk runs at an automotive company manufacturing midibuses and trucks in Turkey, with the aim of controlling transportation cost per unit. With the commercial software VSrm, we managed routes, transportation operations and loading amounts using data extracted from the ERP system, on a platform built among third-party logistics firms, suppliers and the main industry. The conditions after adopting the technique are explained in detail as a case study.
Co-authors: Emin Gundogar, Professor, Sakarya University; Gokay Akkaya, Assistant Professor, Ataturk University; Gökhan Tesdeviren, Leanfact Consulting

8. Robust Risk Matrix Definition Driving Qualitative Assessments for Better Quantitative Results
Antonio Ferrel, Project Risk Engineer, Centrica Energy Upstream
In many companies worldwide, there is an increasing need for Project Risk Management. This is increasingly true today, as many companies are expected to invest in major capital projects worth billions of dollars within the next decade, particularly in the oil & gas industry. A well-defined Project Risk Matrix is the first step in managing risks to a project. Whatever the industry, project scope, or impact categories, the matrix works best when there is a robust rationale behind all of its elements. It not only drives the initial qualitative risk assessment; it also sets a guide for managing risks, feeds our Monte Carlo simulations, helps indicate potential cost and schedule overruns and, ultimately, is a stepping stone in the decision-making process for capital investments. This poster will present how the Project Risk Matrix draws concepts from probability & statistics, project management and decision science, and how the scoring, chance, severity and impact definitions and general risk management have been aligned to provide better results in the risk analysis.

9. Implementing Optimization and Analytics to Achieve Efficient Supply Chain in a Stochastic Semiconductor Manufacturing Environment
Anurag Joshi, Customer Optimization Engineer, Philips Lumileds Lighting
The LED (light emitting diode) business is characterized by a high degree of manufacturing variation and fluctuating demand requirements. In such a dynamic environment, delivering LEDs to customers becomes a difficult task. The challenge is to minimize the gap between stochastic supply distribution and variable customer demand. From this challenge sprouted a concept of software which could automatically build and solve mathematical models by analyzing process and customer data. The software resulted in efficient supply chains characterized by reduction in slow moving inventory and enhanced data driven decision making. The software was also used to perform scenario and risk analysis during a new product introduction. The presenter worked in a team of two which formulated the problem, gathered data requirements and developed the entire software.

10. A Bayesian Assessment Methodology for Rail Corridor Risk
Robert Love, Associate, Booz Allen Hamilton
The rail industry is following a process of continuous improvement to better enable rail operations planning; however, one inhibitor to progress in this area is the need to better understand the causal factors associated with incidents and security risks along or in proximity to routes and railroad facilities.  Currently, the industry relies heavily on causal factors that are not fully quantified, leading to subjective determinations of risk and broad safeguards to protect against future incidents. In the face of increasing consequences associated with rail incidents, regulators, firms, and other interested parties must be able to quickly align resources in order to prevent and/or mitigate harmful consequences. The purpose of this work is to demonstrate a Bayesian mathematical technique that informs stakeholders of which rail operations and entities are at higher risk, enabling more efficient route and operations planning and less costly risk mitigation strategies.

11. Solving the Military Theater Distribution Problem Using Planning Factor and Integer Programming Approaches
David Longhorn, Operations Research Analyst, Transportation Engineering Agency, Military Surface Deployment and Distribution Command
Josh Kovich, Operations Research Analyst, Transportation Engineering Agency, Military Surface Deployment and Distribution Command
This research presents two approaches to solve the military theater distribution problem, which in this context involves determining the number of military transportation vehicles (trucks, trains, and aircraft) needed to deliver unit equipment and sustainment cargo from theater airports or seaports to destinations. The first approach uses a high-level, planning factor formula generalized for different transportation modes such as road, rail, or air.  The second approach uses an integer program to minimize the transportation cost and occurrences of late delivery subject to meeting delivery requirements and daily installation outloading and unloading constraints for each transportation mode.  The authors developed these approaches to provide analytic insights into typical military distribution problems studied by analysts and planners at the Military Surface Deployment and Distribution Command, which is the Army Component of the United States Transportation Command.

12. Using a Hybrid Model to Examine Inventory Shrinkage
David Makovoz, PhD, Deloitte and Touche LLP
Patricia Higgins, Specialist Senior Consultant, Deloitte and Touche LLP
We report on work in progress toward application of computational modeling to the problem of rule breaking and shrinkage. We focus on inventory theft using integrated agent-based modeling, system dynamics modeling, and social network analysis. We apply the Belief-Desire-Intention agent paradigm to develop cognitive agents with modifiable behavior. Once validated, the model is used to explore how the problem worsens when no action is taken, due both to deterioration of the original perpetrators' behavior and to spread throughout the organization. We also investigate the effectiveness of various controls to mitigate the problem. To our knowledge, this is the first attempt to apply agent-based modeling to this problem. This work has direct applications to the retail industry, corporate and government procurement, workplace theft, and government auctions. With some modifications it can also be applied to fraud, waste, and abuse in the public and private sectors, as well as the general pattern of rule breaking.

13. To What Extent Do Recent MBA Graduates Employ Business Analytics Practices in their Decision Making and How Much Does it Factor in their Success?
Mouwafac Sidaoui, PhD, Professor, University of San Francisco
This study is designed to explore the use of business analytics to improve decision making by recent MBA graduates (1-3 years out) and the benefits of using various analytical tools to achieve strong results across functions in an organization. A multiple regression model will be constructed to determine the extent to which MBA graduates incorporate business analytics into the decision making of their job function. Data obtained from the online survey will be analyzed using statistical software (SPSS or Microsoft Excel). Data analysis will incorporate various statistical procedures aimed at interpreting the raw data as it relates to the research questions.

14. Application of an Economic-Probabilistic Model to Conduct Risk Analytics in an IT Project
Rogério Feroldi Miorando, Postdoctoral Research Fellow, Federal University of Rio Grande do Sul
This work presents the application of an economic-probabilistic model to conduct risk analysis in an IT project: the development and implementation of an ERP system in a higher education institution. The model's main contribution is to quantify the economic impact of risk factors, which, together with their probabilities of occurrence, allows integrating risk analysis with economic analysis using Monte Carlo simulation. As a result, the model generates: (1) the risk-adjusted project cash flow with the associated probability distribution of its NPV, and (2) the variability that each risk factor generates in the project return, highlighting the greatest dangers and opportunities for the project. The model proved to be robust in the project risk analysis, providing a thorough analysis of the risk factors involved and of how project profitability behaves depending on these factors. The model also offers the possibility of real options analysis that can reduce project risk or increase its return.
Co-authors: José Luis Duarte Ribeiro, Professor, Federal University of Rio Grande do Sul; Camila Costa Dutra, PhD Student, Federal University of Rio Grande do Sul; Maria A. C. Tinoco, Postdoctoral Research Fellow, Federal University of Rio Grande do Sul.
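
As a hedged illustration of the economic-probabilistic idea (not the authors' model), the sketch below samples invented risk factors, applies their cost impacts to a project cash flow, and summarizes the resulting NPV distribution.

```python
# Monte Carlo risk-adjusted NPV sketch: sample risk-factor occurrences,
# perturb the cash flow, discount, and summarize. All figures are invented.
import numpy as np

rng = np.random.default_rng(7)
n, rate = 100_000, 0.10
base_cf = np.array([-500.0, 180.0, 200.0, 220.0, 240.0])   # years 0..4

# Each risk factor: probability of occurrence and impact on one year's cash flow
scope_creep  = rng.random(n) < 0.30          # hits year 1 if it occurs
vendor_delay = rng.random(n) < 0.15          # hits year 2 if it occurs
cf = np.tile(base_cf, (n, 1))
cf[scope_creep, 1]  -= rng.normal(60, 15, scope_creep.sum())
cf[vendor_delay, 2] -= rng.normal(90, 25, vendor_delay.sum())

discount = (1 + rate) ** -np.arange(len(base_cf))
npv = cf @ discount                           # risk-adjusted NPV per scenario
print(f"mean NPV {npv.mean():.1f}, P(NPV < 0) {(npv < 0).mean():.3f}")
```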

15. Hospital Excellence Operation Model: an Approach to Lean Healthcare in Mexican Hospitals
Miguel Angel Moreno, Director, Universidad Panamericana
In recent years, Mexican hospitals have faced important challenges: obtaining certification from the General Health Council, complying with Joint Commission International standards, seeking cost reductions, and improving service levels. This first study of hospitals in Jalisco demonstrates a great need for tools that improve operations while supporting the pursuit of certification by the General Health Council. We propose a structured framework, the Hospital Excellence Operation Model (HEOM), which aims to help hospitals manage their efforts toward these goals and long-term sustainability. The proposed framework is grounded in lean philosophy and adapts lean concepts, traditionally applied to manufacturing, to the specific requirements and conditions of hospitals in the state of Jalisco.

16. Creating a Data-Driven and Quantitative Score for Optimal Site Selection in Clinical Trials
Elizabeth Nielsen, Senior Operational Effectiveness Specialist, Quintiles
In pharmaceutical clinical trials, sites with principal investigators must be chosen where the trial will be conducted.  One of the most obvious variables in this decision is the number of patients the site can be expected to recruit for the study. Data from earlier clinical trials tell us how many patients each site enrolled in each previous trial, but this is not the complete picture. How consistently has the site enrolled patients across trials?  How did the site compare with other sites running the same trial?  How much experience does the site have in running clinical trials?  This poster will show how an algorithm based on multiple factors is constructed to grade a site. A simple score leads to optimal choices of clinical trial sites.

17. Constraint Integer Programming (CIP): A Must-Have in any O.R. Practitioner’s Toolkit
Puneet Piplani, Senior Vice President, Mu Sigma
Prescriptive analytics is an important enabler of executive and managerial decision making in most modern, analytically driven organizations. While integer programming (IP) has been the primary workhorse of the O.R. community for prescriptive analytics, constraint programming (CP) is a relatively new methodology in this area. The modeling phase (converting the business problem into a mathematical formulation) is the biggest bottleneck in IP-based approaches, whereas CP-based approaches offer ease of modeling but lack the robustness and speed of IP-based solvers.  This paper discusses how a combination of IP and CP, in the form of an integrated problem-solving methodology, can lessen the burden on the analyst and enable higher analytic throughput with faster turnaround. This is especially important at a time when organizations are short of O.R. talent. The O.R. community is already reaping the benefits of this integrated approach: hitherto intractable problems as varied as Major League Baseball scheduling and nurse scheduling have become tractable, while others, such as retail assortment optimization, can now be solved in reasonable time thanks to this integrated methodology. CIP will evolve and play a major role in shaping the prescriptive analytics landscape in the coming years.

18. Network Risk Analysis – Framework and Modeling of Linkage Associated Risk
Vivek Sinha, Senior Consultant, Booz Allen Hamilton
Aided by technology, modern companies and organizations have become highly interconnected. The banking system, for example, has become highly interdependent because of intertwined holdings. To supplement entity-based risk analytics, we propose the concept of linkage-associated risk based on connectedness, and devise a broad methodology to define the two components of risk: susceptibility (the ability to withstand an adverse event) and infectability (the ability to propagate influence/risk). Susceptibility can be a function of internal mechanisms to prevent adverse events, while infectability can be a function of influencing ability and of the number of linkages and distance from other entities based on degrees of separation. We plan to use Susceptible-Infected-Susceptible (SIS) and Susceptible-Infected-Removed (SIR) concepts for risk management in financial institutions. This susceptibility and infectability framework can also be customized, based on domain understanding, for the propagation of risk or influence in other domains such as marketing and/or advertising.
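
A minimal sketch of the SIS dynamic the abstract invokes, on an invented five-entity network with per-node susceptibility and infectability; every parameter here is an assumption chosen for illustration.

```python
# Toy SIS (susceptible-infected-susceptible) propagation over linked entities,
# with per-node susceptibility and infectability. Topology/rates are invented.
import random

random.seed(1)
links = {0: [1, 2], 1: [0, 2, 3], 2: [0, 1, 4], 3: [1], 4: [2]}
susceptibility = {0: 0.2, 1: 0.5, 2: 0.4, 3: 0.7, 4: 0.3}  # chance of catching risk
infectability  = {0: 0.6, 1: 0.3, 2: 0.5, 3: 0.2, 4: 0.4}  # chance of passing it on
recovery, infected = 0.25, {0}          # entity 0 suffers the initial adverse event

for step in range(20):
    new_state = set(infected)
    for i in infected:
        for j in links[i]:              # transmission along each linkage
            if j not in infected and random.random() < infectability[i] * susceptibility[j]:
                new_state.add(j)
        if random.random() < recovery:
            new_state.discard(i)        # recovers but remains susceptible (SIS)
    infected = new_state
    print(step, sorted(infected))
```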

19. A Simple, Practical Prioritization Scheme to Minimize Cycle Time and Maximize Throughput of All Jobs in a Job Shop Environment with Multiple Job Types
Mandyam (“Srini”) Srinivasan, Pilot Corporation Chair of Excellence, University of Tennessee
Practitioners manage the flow of multiple products through back-shops in the repair and overhaul industry that support disassembly/reassembly operations in the locomotive, off-shore oil drilling, aircraft, ship, and heavy equipment repair industries.  Original equipment is disassembled into component parts; parts are then routed to back-shops for repair and returned for reassembling the equipment.  Back-shop scheduling necessitates prioritizing repair of component parts from different original assemblies at different machines.  We model the back-shop as a multi-class queueing network with a ConWIP execution system and introduce a new priority scheme to maximize system performance.  In this scheme, we identify bottleneck machine(s) based on overall workload and classify machines into two categories: bottleneck and non-bottleneck. Assemblies with the lowest cycle time receive highest priority on bottleneck machine(s) and lowest priority on non-bottleneck machines. Simulation results show that this priority scheme increases system performance by lowering average cycle times without dramatically impacting total throughput.
Co-authors: Shuping Zhang, PhD Candidate, University of Tennessee; Melissa R. Bowers, Associate Professor of Management Science, University of Tennessee

20. Valuation of a New Technology: A Case of Remote Presence Robotics in Deep-Water Offshore Wind Farms
Darijus Strasunskas, PhD, Norwegian University of Science and Technology
A new technology typically requires new ways of operating, possibly extending to new business models. Systematic analysis of the associated risks and benefits helps build a good business case for faster diffusion of the technology. We have developed a pragmatic decision-analytic framework to assess the economic value of new technology development for offshore wind farms, where remote presence robotics is one of the critical capabilities for successfully operating deep-water offshore wind farms. The framework builds on contemporary literature on assessing technology in the broader context of organizational structures and work processes. The novelty lies in its efficiency and its methodological guidelines for relating qualitative assessment of changes in competencies to quantitative decision analysis. The framework's built-in qualitative assessment of competencies and work processes provides indispensable insight into the risks associated with introducing new technologies, while still allowing assessment of expected value through an integrated formal decision analysis.

21. Case Study Efficiency Benefits Analysis of Weather and Radar Processor (WARP)
Jim Sunderlin, Senior Associate, MCR, LLC
A case study analysis comparing pre-implementation and post-implementation operational data was conducted to estimate aviation efficiency impacts and associated fuel savings, based on verifiable changes in user decision making when using the Weather and Radar Processor (WARP) system, developed by the Federal Aviation Administration (FAA), instead of its predecessor, the Long Range Radar (LRR) system. Expected decision-making inputs were obtained via user interviews, and flight delay benefits were quantified based on these inputs. The quantified benefits were then compared to graphical movie loops of weather and flight-traffic images depicting actual pilot decision making during selected case study events. The extent to which the delay benefits claimed by the users could be supported was determined from these images.

22. Price Optimization for a Retail Category of Merchandise Items
Andrew Vakhutinsky, Senior Principal Scientist, Oracle Retail Global Business Unit
In this presentation we describe several approaches to solving a problem of finding the optimal price vector  to maximize a performance indicator function such as total revenue, profit margin, or sales volume of merchandise items in a retail category subject to various inter-item price constraints and other business rules.  The demand for each item is generally a non-linear function of all item prices in the category.  We propose different methodologies that were developed to solve this class of problems.  We also discuss how implementation of these algorithms is motivated by practical scenarios and compare the results of the computations.  In our presentation, we describe generalizations of this problem to multi-period price optimization and promotion planning where future price changes and associated demand lifts bear certain costs and are subject to various business rules. 
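
As a toy illustration of this problem class (not Oracle's algorithms), the sketch below enumerates discrete price points for two substitutable items under an invented cross-price demand model and a single inter-item business rule.

```python
# Brute-force category pricing sketch: two items, a discrete price ladder,
# a cross-price demand model, and one inter-item rule. All numbers invented.
from itertools import product

ladder = [1.99, 2.49, 2.99, 3.49]                 # allowed price points

def demand(p1, p2):
    # Own price lowers demand; the other item's price raises it (cross-price).
    d1 = max(0.0, 100 - 30 * p1 + 10 * p2)
    d2 = max(0.0, 120 - 35 * p2 + 12 * p1)
    return d1, d2

best = None
for p1, p2 in product(ladder, repeat=2):
    if p1 < p2 + 0.50:                            # rule: item 1 is the premium brand
        continue
    d1, d2 = demand(p1, p2)
    revenue = p1 * d1 + p2 * d2
    if best is None or revenue > best[0]:
        best = (revenue, p1, p2)
print("best revenue %.2f at prices (%.2f, %.2f)" % best)
```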

23. A Simulation Model for Assessing Supply Chain Risk
Ryan J. Warr, Senior Operations Research Engineer, Supply Chain Modeling and Solutions, Intel Corporation
Supply chains are vulnerable to disruption risk due to natural disasters, internal site interruptions and various supplier issues. Assessing the risk level of a supply chain is a prerequisite to making the best risk reduction and business continuity plans. At Intel we have built a model that tests the resilience of our global supply chain under varying product, capacity, inventory and disruption environments. This model comprehends complex supply chain dynamics by simulating the weekly supply planning and production decisions at both supplier and internal manufacturing locations. The model is highly flexible, enabling decision makers to test multiple product, node and disruption scenarios. Model implementation is in Excel with extensive Visual Basic for Applications code and a linear programming add-in. The model is part of Intel’s manufacturing risk management process, helping to guide revenue concentration decisions, revealing key chokepoints in the supply chain, and helping to form better disruption mitigation strategies.     

24. Process Evaluation & System Analysis for Electrical Equipment Systems
Reginald (Ray) Wilson, Industrial Operations Specialist, Seattle City Light
Equipment installation process predicaments induce cost overruns and unnecessarily extended electrical outages, as well as extraneous strain on the bulk transmission system. Decision theory, alternative evaluation, and process optimization disciplines are required as a front-end analysis mechanism to ensure constructive integrity in the electrical infrastructure. The poster series investigates the equipment system integration problem specifically encountered by electrical utilities, process industries, and society as a whole. The value-added study is designed as a decision support mechanism for senior management in front-end analysis of high-expenditure projects that reinforce the utility grid. The venture promotes financial accountability, green stewardship, and detailed assessment while maintaining a safe, high-quality work process to protect the current infrastructure. The poster series focuses specifically on principal summer critical ventures, process analysis, and project alternative assessments to supply a comprehensive analysis of the assets for front-end decision support.

25. Common Pitfalls in “Common Practice”: Understanding the Value of Pilot Information
Chang Yan, Decision Process and Risk Management Consultant, Decision Strategies
It is not uncommon for an organization to make a decision based on pilot information. However, we have found that mistakes can be easily made in our “common practice,” where conventional thinking is applied to framing a problem or assessing new information. Using a case study, we demonstrate that:

  • The option value of a pilot is not intuitive for most people to understand.
  • People have a tendency to believe that if a pilot is successful, a good project outcome will follow.
  • When it comes to assessing new information, Bayesian logic is frequently ignored.

This paper challenges our conventional thinking in processing and deciphering information, points out some easily made but hard-to-spot mistakes in project decisions, and provides a new way of thinking to understand the often confusing and counterintuitive concepts in value-of-information analyses. Ultimately it aims to arm us against common pitfalls in our decision making.
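
A small worked example of these points, with invented probabilities and payoffs: the pilot is an imperfect test, so Bayes' rule, not the raw pilot outcome, should drive the go/no-go call.

```python
# Value of (imperfect) pilot information via Bayes' rule. Numbers invented.
p_good = 0.4                  # prior: the project is fundamentally good
sens, spec = 0.8, 0.7         # P(pilot passes | good), P(pilot fails | bad)
payoff_good, payoff_bad = 100.0, -60.0

ev_no_pilot = max(0.0, p_good * payoff_good + (1 - p_good) * payoff_bad)

p_pass = p_good * sens + (1 - p_good) * (1 - spec)
p_good_given_pass = p_good * sens / p_pass                  # Bayes' rule
p_good_given_fail = p_good * (1 - sens) / (1 - p_pass)

ev_pass = max(0.0, p_good_given_pass * payoff_good + (1 - p_good_given_pass) * payoff_bad)
ev_fail = max(0.0, p_good_given_fail * payoff_good + (1 - p_good_given_fail) * payoff_bad)
ev_with_pilot = p_pass * ev_pass + (1 - p_pass) * ev_fail

print(f"P(good | pilot passed) = {p_good_given_pass:.2f}")  # only 0.64, not 1.0
print(f"option value of pilot  = {ev_with_pilot - ev_no_pilot:.1f}")
```

Here a passed pilot raises the probability of a good project only from 0.40 to 0.64, which is exactly the second pitfall listed above.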

26.  Modeling Price-Based Decisions in Advanced Electricity Markets
Eugene J. Zak, Alstom Grid Inc.
In advanced electricity markets, some bids and offers extend over a block of several consecutive time periods, so the block bid/offer has to be cleared entirely for all time periods comprising the block. According to a typical market rule, a block bid can be cleared only if its price is not lower than the average market price. Similarly, a block offer can be cleared only if its price is not higher than the average market price. The average market price is the arithmetic mean of the market prices over the block's time periods (assuming identical time period lengths). The block bid/offer selection depends on the market prices, which are part of the same optimization problem. A dilemma arises: block bid/offer selection, as a primal solution, cannot be properly exercised without knowing the prices, a dual solution, while the prices depend on the selection decisions. We propose a model harmonizing such complex “primal-dual” market rules. The model ties together the primal and dual variables so that the “primal-dual” market rules become part of the overall model. The model is a non-linear mixed integer program (MIP). We have implemented an exact algorithm to solve this model. The computational results demonstrate the adequacy of the modeling and algorithmic approaches and their practical value for several European electricity markets.
Co-authors: Sami Ammari, Kwok W. Cheung, Alstom Grid Inc
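
One common way to linearize such a conditional price rule, shown here purely as an illustration (the authors' exact formulation may differ), uses the binary clearing variable $u_b$ of block bid $b$ and a big-M constant:

$$p_b \;\ge\; \frac{1}{|T_b|}\sum_{t\in T_b}\pi_t \;-\; M\,(1-u_b),$$

where $p_b$ is the block bid's price, $T_b$ its time periods, and $\pi_t$ the market prices; the inequality binds only when the bid is cleared ($u_b=1$), and a mirrored constraint with the opposite sense handles block offers.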

27. Covariance Contracting: Performance Incentives for Regulated Hydro-Electric Generation
Hans Tuenter, Senior Model Developer, Ontario Power Generation
In Ontario, most hydro-electric generation is regulated, and receives a fixed price for its production, thereby providing more predictable earnings on these assets. However, it also eliminates the economic driver to shape production to match market prices. This aspect is important, as additional generation made available at higher priced hours lowers the market-clearing price, and benefits the rate-payers of Ontario. In 2008 Ontario Power Generation (OPG) proposed an incentive mechanism to the Ontario Electricity Board (OEB) to provide a market-based incentive to time-shift electricity at our regulated hydro plants. The Sir Adam Beck (SAB) generating station near Niagara Falls is the most notable plant covered by the incentive mechanism. The proposal was successfully defended in a public hearing and endorsed by the OEB in November 2008. The mechanism resulted in additional, net revenues of $21M and $14M for 2009 and 2010, respectively. The presentation will cover the analytic work that was done to arrive at the current incentive mechanism, the process of convincing OPG management of its merit, and the preparations involved for the rate hearing.

28. A Large U.S. Retailer Selects Transportation Carriers under Diesel Price Uncertainty
John Turner, Assistant Professor, University of California-Irvine
In this paper, we describe how a team of researchers from Carnegie Mellon University developed a decision support tool to help a large U.S. retailer pick cost-effective transportation carriers in the face of diesel price volatility and negotiate more favorable fuel surcharge schedules.  I am the corresponding author for this paper.

29. Foresight Driven Analytics Enabled Marketing
Lana Klein, Managing Partner at 4i, Inc
Marketing is currently going through a fundamental transformation: moving from one-to-many to one-on-one interactions with consumers.  This transformation has created the need to re-examine all traditional methods and approaches to marketing. Marketers can use past consumer behavior and advanced analytics to drive brand and marketing decisions.  This poster will demonstrate, with a client case study, a foresight-driven marketing framework that addresses key marketing issues by leveraging client behavior data. The framework follows these steps: identify need states; profile categories, markets and brands; identify trends; forecast need states; develop a growth strategy. The presentation will explain how to develop marketing (consumer communication, brand and product development) that takes into account future changes in consumers' needs and behaviors, along with a step-by-step process for developing foresight-driven marketing.

30. Cooperative Game Theory: Understanding the Shapley Value and Its Applications in Supply Chain Management
Diana Gineth Ramirez, Chief Science Officer, Fundación Centro de Investigación en Modelación Empresarial del Caribe (FCIMEC)
Many researchers, academics and practitioners around the world have heard of cooperative game theory and the Shapley value as a solution concept for such games, but most do not really grasp the essence of cooperative games: it is not just about calculating the Shapley value; it is about proving that the players have no incentive to leave a coalition once it is formed and will not seek to compete with the players in that coalition. Indeed, many questions have been raised in this area, and confusion remains about how coalitions are formed and how the Shapley value is calculated.
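
For reference, the standard definitions at issue here: the Shapley value of player $i$ in a cooperative game $(N, v)$ with $n = |N|$ players is

$$\phi_i(v) \;=\; \sum_{S \subseteq N\setminus\{i\}} \frac{|S|!\,(n-|S|-1)!}{n!}\,\bigl(v(S\cup\{i\}) - v(S)\bigr),$$

while the "no incentive to leave" requirement is the core condition: an allocation $x$ with $\sum_{i\in N} x_i = v(N)$ must satisfy $\sum_{i\in S} x_i \ge v(S)$ for every coalition $S\subseteq N$. The Shapley value need not lie in the core, which is precisely the distinction the abstract emphasizes.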

31. Coordination of Pricing and Inventory Control for Perishable Products Across Products
Yue Wu, PhD, University of Southampton
This research examines a joint pricing and inventory control problem for perishable products in a monopolistic supermarket that orders, stocks and sells similar products under different brands. Cross-price effects are considered, in which demand for a product depends on its own price and the prices of competing products. Products compete across brands and across age groups, so demand is price sensitive, depending on the full and discounted prices of all products. A dynamic programming model is proposed to determine an optimal pricing and inventory control strategy for perishable products. Computational results demonstrate the effectiveness of the model. We find that the supermarket can improve its profits considerably by managing similar products jointly rather than independently.

 


Tuesday, April 17

1. Practitioner Insights on Estimating Queue Performance
Jim Grayson, Professor, Augusta State University
Significant estimation errors, especially in high-utilization or small-sample situations, can occur when sample data are plugged into standard queueing formulas to estimate measures such as time in queue.  Perhaps surprisingly, these issues are rarely addressed.  Most uniquely, this approach is data (analytics) driven rather than model driven.  Queueing studies were simulated by generating arrival and service data based on an experimental design.  An innovative error scale was developed to provide a meaningful measure of errors.  We establish error bounds for the single-server system with Poisson arrivals and exponential service times when estimating average time in queue, and we provide practical guidance for practitioners.  Non-linear regression identified key predictors of estimation errors, leading to tractable calculation of error bounds.
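
To see the effect concretely, the sketch below (invented rates and sample size, not the authors' experimental design) plugs sample-based rate estimates into the M/M/1 formula $W_q = \lambda/(\mu(\mu-\lambda))$ at utilization 0.9 and examines the spread of the resulting estimates.

```python
# Estimate arrival/service rates from small samples, plug them into the
# M/M/1 time-in-queue formula, and measure the error. Parameters invented.
import numpy as np

rng = np.random.default_rng(0)
lam_true, mu_true, n = 9.0, 10.0, 50       # utilization rho = 0.9, 50 observations

def wq(lam, mu):
    return lam / (mu * (mu - lam)) if mu > lam else np.inf

true_wq = wq(lam_true, mu_true)
ratios = []
for _ in range(10_000):
    lam_hat = 1.0 / rng.exponential(1.0 / lam_true, n).mean()  # from interarrivals
    mu_hat  = 1.0 / rng.exponential(1.0 / mu_true,  n).mean()  # from service times
    ratios.append(wq(lam_hat, mu_hat) / true_wq)

ratios = np.array(ratios)
finite = ratios[np.isfinite(ratios)]
print(f"estimates implying an unstable queue: {np.mean(~np.isfinite(ratios)):.1%}")
print(f"median ratio {np.median(finite):.2f}, 95th percentile {np.percentile(finite, 95):.1f}")
```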

2. Identifying Robust Portfolios of Interventions to Reduce HIV Incidence in LA County: A Proof of Concept
Evan Bloom, Doctoral Candidate, the Pardee RAND Graduate School; and Assistant Policy Analyst, RAND Corporation
The Los Angeles County Division of HIV and STD Programs (DHSP) faces difficult decisions allocating limited resources across many interventions with the goal of reducing new HIV infections. To assist LA County in improving its impact, researchers at RAND worked with DHSP to create a system dynamics simulation model to forecast annual infection rates in LA County upon implementing alternative portfolios of interventions. Faced with uncertainty in future conditions, we used innovative robust decision making techniques to identify portfolios of interventions that would meet policy objectives across wide ranges of conditions. Vulnerabilities of these robust portfolios were identified and visualizations were used to interpret tradeoffs between portfolios based on performance, future conditions, and stakeholders’ beliefs about likelihood of future conditions occurring.

3. Multi-objective Decision Analysis: Managing Trade-offs and Uncertainty
Clinton Brownley, Ph.D., founder and President, Eclectic Analytics
In today’s complex, uncertain business environment, people can find it very challenging to structure their decisions to make trade-offs among conflicting objectives, deal with uncertainties, and clearly communicate with others.  A structured decision making process for multi-objective decisions under uncertainty addresses all of these challenges by providing a framework.  This framework helps decision makers model their decision problem, focus information search, assess the impact of value trade-offs, risk preferences, and uncertainties on the preferred alternative, and generate consensus and group commitment to action.  I developed a set of spreadsheet templates using Excel, an inexpensive, generally-available resource, to guide individuals through a simple yet powerful process for making multi-objective decisions under uncertainty.  By using this process to structure their decision making, individuals can measure their objectives, articulate and weigh their values, incorporate probabilities, and perform sensitivity analyses to ensure the decisions they make are consistent with their objectives and values.

4. Merlin – Predictive Analytics for Employee Engagement
Arturo Castillo, MSc , Research Consultant, MidlandHR
Employee engagement and retention matter to organizations aiming to secure their most valuable asset: the knowledge within their employees. Monitoring and evaluating employees' performance, which relates directly to their engagement, is time consuming and generates extra work for managers. Merlin is a predictive analytics tool that gathers information from sources already kept in HR systems, including absence patterns, engagement pulse surveys and managers' appraisals. Merlin's components include a survey engine, a graphics generator, an organizational structure model and a machine learning mechanism. Merlin generates alerts when organizationally defined absence patterns are detected or when analysis of historical answers shows employees' satisfaction/engagement levels declining. Some alerts are false, and managers can override them; Merlin incorporates such exceptions going forward by re-training its decision-tree-based machine learning algorithm. Merlin allows managers to act when employees' engagement declines, thereby preventing high turnover in organizations.

5.  Modeling Forest Fire Initial Attack Airtanker Operations
Nick Clark, Masters Student, University of Toronto; David Martell, Professor, University of Toronto

6. An Economic-Probabilistic Model for Projects Selection and Prioritization
Camila Costa Dutra, PhD Student, Federal University of Rio Grande do Sul
Much research has been devoted to project selection and prioritization methods. However, existing proposals have limitations that prevent their practical use. This work presents an economic-probabilistic model for project selection and prioritization that quantifies investments, benefits and their possible deviations, providing an analysis of the expected return of projects. The main contribution is an alternative model for project selection and prioritization that combines economic and probabilistic methods, following relatively simple procedures while still accounting for uncertainty in projects. A practical test conducted on the project portfolio of an electricity company shows that: (1) the criteria are sufficiently complete; (2) the combined economic and probabilistic approach improves the information available to decision makers; and (3) the financial language is more easily understood and has concrete meaning for both the management and technical areas.
Co-authors: Rogério Feroldi Miorando, Postdoctoral Research Fellow, Federal University of Rio Grande do Sul; José Luis Duarte Ribeiro, Professor, Federal University of Rio Grande do Sul

7. A Goal Programming Based Tool for the Schedule Change Request Process in Progressive Contact Centers
Mengying Fu, Sr. Operations Research Analyst, Progressive Corporation
Progressive offers customized shifts for thousands of customer service representatives (CSRs) located in several large contact centers in the US. CSRs can file schedule change requests for new schedules if they are unhappy with their current ones. This process is important both for offering CSRs flexibility and for meeting highly seasonal call volume. However, with thousands of requests containing customized shifts, the current review process takes an extensive amount of time and yields only myopic decisions. A tool based on a goal programming model was developed to integrate with multiple databases and automate the review process. With goal programming, each CSR's goal is optimized individually, with priority based on tenure, while the business need to meet answer-performance targets is also considered. In addition, the model takes full advantage of the flexibility offered by the large number of requests.

8. Smarter Transportation Analytics Using the DSS (Decision Support System) Optimizer: Incident Management
Bharat Gera, Line Manager, IBM Corporation
Transportation command centers today are largely not equipped to determine response plans based upon large volumes of data and analytic methods. Typically, today, some real-time data is visualized, but the expected outcomes of potential responses are generally not computed. It is widely accepted that the “Command Center of the Future” should leverage the massive amounts of transport data for more effective response plan generation. This is the motivation of the Decision Support System (DSS) Optimizer module. 

9. A Rolling Horizon Planning Approach to Shop Production Control
Eric Gross, PhD, Associate, Xerox Corporation
For the more than 100 billion dollar printing industry in the United States, Xerox Corporation has developed, tested, and implemented a set of operations-research-based productivity improvement offerings, trademarked as Lean Document Production® solutions.  Our solutions have been implemented in over 100 customer sites.  To improve upon this set, Xerox is developing real time print production control methodologies to aid in operation control decisions.  In particular, we have modified a Rolling Horizon Planning (RHP) framework that incorporates feedback and makes explicit use of a system model and constraints to predict process behavior over a future time horizon.  A sequence of actuations, in this case production decisions of what and how much to produce, is computed that minimizes a quadratic cost function over the horizon.  The first actuation is applied, the horizon is displaced in time, and the process is repeated.  The methodology has been extended to account for multi-functional equipment, the cost of overtime, and the cost of job lateness.  In this manner RHP provides a comprehensive framework to achieve tradeoffs among labor costs, work in progress levels, cycle times, and equipment utilization.  
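
A hedged sketch of the rolling-horizon mechanics described above, on an invented single-product WIP model with a quadratic cost; the production system model and cost terms in the actual offering are naturally richer.

```python
# Rolling horizon planning: minimize a quadratic cost over a short window,
# apply only the first decision, shift the window. Model and data invented.
import numpy as np
from scipy.optimize import minimize

H, q, r = 5, 1.0, 0.2                     # horizon length, WIP and effort weights
demand = [12, 15, 9, 20, 14, 11, 16, 13]  # forecast demand per period
wip_target, wip = 10.0, 10.0

def horizon_cost(u, wip0, d):
    cost, x = 0.0, wip0
    for t in range(len(u)):
        x = x + u[t] - d[t]               # WIP balance over the window
        cost += q * (x - wip_target) ** 2 + r * u[t] ** 2
    return cost

plan = []
for t in range(len(demand) - H + 1):
    window = demand[t:t + H]
    res = minimize(horizon_cost, x0=np.full(H, float(window[0])),
                   args=(wip, window), bounds=[(0, None)] * H)
    u0 = res.x[0]                         # apply only the first actuation
    wip = wip + u0 - window[0]            # then displace the horizon in time
    plan.append(round(u0, 1))
print(plan)
```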

10. Mayo Clinic Framework for Internal Business Consulting
Amy Donahoe-Anshus, Unit Head in Systems and Procedures, Mayo Clinic
The Mayo Clinic Internal Business Consulting Framework details the methodologies, analytical tools, soft skills and professional development mechanisms necessary to be an effective business consultant. The framework also incorporates project management, knowledge management and change management to ensure excellence, consistency and completeness of services delivered. The framework has been widely used and validated at Mayo Clinic for three years. Use of the framework has resulted in higher client and staff consultant satisfaction. It has also enabled a large, diverse business and management engineering consulting team to successfully work across a broad spectrum of projects and clients in a multisite, multidisciplinary organization. The framework will serve as a practical guide to business consultants in various industries, to effectively leverage organizational culture, build relationships, identify and engage key stakeholders, facilitate change, implement sustainable solutions and assure ongoing operational ownership.

11. Balancing the Costs and Public Benefits of a Vaccination Program using a Supply Chain Approach
Eugene Levner, Professor, Ashkelon Academic College
Annual influenza epidemics incur great losses in both human and financial terms, and vaccination is a main weapon for fighting influenza outbreaks. A key question in a large-scale vaccination program is how to balance the program's costs and public benefits. The goal is to decrease losses in the vaccination program while taking into account the benefits of vaccinating high-risk subgroups of the population. We suggest an operations research approach for decreasing vaccine shortages and increasing vaccination rates among high-risk subgroups of citizens. Our model is a version of the constrained lot-sizing problem, which we solve with a novel minimum-cost network flow algorithm. A case study of a nationwide vaccination program and prospects for enhancing it are discussed.
Co-authors: Hanan Tell, Bar Ilan University; Sharon Hovav, Manager, Clalit Health Service; Dmitry Tsadikovich, Bar Ilan University
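
As a toy illustration of the minimum-cost network flow idea (invented numbers, not the authors' model), the sketch below routes doses from a plant through a depot to two clinics, prioritizing the high-risk clinic through a cheaper arc, using the networkx solver.

```python
# Min-cost flow toy: supply at the plant, demands at two clinics, and a
# cheaper arc prioritizing the high-risk group. All numbers are invented.
import networkx as nx

G = nx.DiGraph()
G.add_node("plant", demand=-900)              # negative demand acts as supply
G.add_node("clinic_high_risk", demand=500)
G.add_node("clinic_general", demand=400)
G.add_edge("plant", "depot", capacity=1000, weight=2)
G.add_edge("depot", "clinic_high_risk", capacity=600, weight=1)   # prioritized
G.add_edge("depot", "clinic_general", capacity=600, weight=4)

flow = nx.min_cost_flow(G)
print(flow["depot"])                          # doses routed to each clinic
print("total cost:", nx.cost_of_flow(G, flow))
```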

12. Applying SNA Techniques to Telecom Subscriber Calling Data
Veena Mendiratta, Practice Leader, Bell Labs CTO
Subscriber churn in telecommunication networks is a major problem and cost for service providers. Given the large number of subscribers involved – tens of millions – even a small improvement in the churn prediction algorithm can reap huge economic benefits. Typical churn prediction algorithms use calling metrics and demographic and CRM data as variables in supervised machine learning algorithms.  In this work we include additional variables in these algorithms based on the results from Social Network Analysis (SNA) of the calling data. A novel algorithm was developed to compute tie strengths between subscribers based on several calling attributes.  The tie strengths are used to compute the influence of churners on other subscribers using a diffusion algorithm.  Preliminary results demonstrate that using SNA variables for churn prediction can improve the top-decile lift.

13.  Randomized Search Algorithm for Solving Large-Scale Complex Optimization Problems
Kresimir Mihic, Senior Member of Technical Staff, Oracle Labs
This poster presents Randomized Search (RS), a novel algorithm for solving hard combinatorial problems. The algorithm is similar in spirit to Simulated Annealing (SA), but it performs a more structured exploration of the search space and, as a result, finds good solutions very quickly. RS consists of two phases, exploitation and exploration, that alternate until the algorithm converges to a locally optimal solution or a maximum runtime is reached.  In the exploitation phase, the algorithm seeks to improve the current solution vector, while the exploration phase serves as a means to "escape" locally optimal points. RS is especially suited to smooth-surface problems with complex constraints among the function's input and output variables, and to non-linear, non-convex problems where a provably optimal solution is not required. We successfully applied the algorithm to two revenue management problems: a regular price optimization problem and a shelf-space optimization problem. Experimental results show that RS outperforms commercial solvers and internal heuristic solutions in runtime and yields good solutions across a wide range of problem configurations without tuning.
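
RS itself is the authors' algorithm; the sketch below merely mirrors the two-phase structure the abstract describes (greedy exploitation alternating with randomized exploration) on an invented one-dimensional non-convex test function.

```python
# Two-phase search sketch: exploitation = greedy local moves, exploration =
# random jumps to escape local optima. Test function and settings invented.
import math
import random

random.seed(3)

def f(x):                                # non-convex function with many minima
    return x * x + 10 * math.sin(x)

def exploit(x, step=0.05, iters=200):
    for _ in range(iters):               # greedy descent on a shrinking step
        cand = min((x - step, x, x + step), key=f)
        if cand == x:
            step /= 2
        x = cand
    return x

best = x = random.uniform(-10, 10)
for _ in range(30):                      # alternate the two phases
    x = exploit(x)                       # exploitation phase
    if f(x) < f(best):
        best = x
    x = best + random.gauss(0, 3.0)      # exploration phase: random jump
print(best, f(best))
```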

14. Maximizing the Utilization of Manpowered Systems through Optimized Crew Rotation Schedules
Nicholas Nahas, Lead Associate, Booz Allen Hamilton
With mounting Federal debt, growing political pressure to cut government spending, and increasing demands on the military, the armed forces are challenged more than ever to “do more with less.” Analytics can help by intelligently - through computational techniques - scheduling crew resources to maximize the utilization of manpowered systems. However, these problems can have complicated business requirements: manpowered systems need periodic maintenance, a fixed number of systems must be productive at all times, specialized crew must be assigned to their appropriate system, and equity in crew assignments must be maintained.  This poster will present a decision support tool designed to solve this resource scheduling problem.  Ultimately the poster will present a computational solution to large scheduling problems that applies not only to managers of military resources, but to business managers and chief operating officers interested in how analytics can be used to “squeeze” more productivity from existing resources with optimal scheduling.

15. Stacking Optimization of Thermoformed Plastic Packaging
Kyle Naumann, Master’s Student, Western Michigan University
Fabri-Kal is a leading manufacturer of thermoformed packaging solutions.  A problem Fabri-Kal faces is product loss caused by non-optimized stacking configurations on distribution pallets.  The objective is to minimize crushed product while using pallet space efficiently to maximize overall profit.  The product line of focus is thermoformed drink cups.  Given relevant SKU parameters, weight thresholds, production rates and associated costs, an Excel-based heuristic model generates optimized stacking configurations.  Other considerations include whether the product is sleeved, automatically bagged and/or manually counted.  The heuristic model is programmed with Excel macros in VBA so as to provide a friendly user interface for immediate implementation.

16. Fleet Purchase Planner
Daniel Reich, Operations Research Analyst, Ford Motor Company
Sustainability and environmental impact are areas of growing importance to many of Ford’s fleet customers. In recent years, many new green vehicle technologies have emerged. These present organizations with an opportunity to reduce their emissions and lower their fuel costs, by increasing the fuel economy of their fleets. We are developing the “Fleet Purchase Planner (FPP)” (patent pending) software tool to be used by Ford’s sales managers to assist our fleet customers in planning their purchases. Given a customer’s current fleet and an estimate for annual miles traveled in the fleet vehicles, FPP provides a baseline report including statistics on the fleet’s current carbon footprint, annual fuel usage and operating costs. FPP then leverages this information to provide purchase recommendations that help our customers achieve both their sustainability and financial goals.

17. Using a Decision Component Registry to Ensure Optimal Analytic Maturity of Modeling and Simulation Tools for U.S. Marine Corps Logistics use in a Tight Budget Environment
Norman Reitter, Advisor for Information Technology, Concurrent Technologies Corporation
Supporting a global war on terror while reducing IT costs by $49M over the next 5 years requires the U.S. Marine Corps (USMC) to apply the right type of analytics for decision support. The USMC Logistics Modeling and Simulation (M&S) community must ensure validated capabilities are developed to support decisions on the myriad critical issues that face senior leaders.  The M&S decision support tools being used and developed must be managed to ensure the appropriate level of validation and accreditation across the spectrum of decisions.  Data systems that support M&S tools must also be minimized and must provide authoritative data to the M&S community.  We have begun using a novel approach to develop a Decision Component Registry that will improve M&S and data maturity across the community.  It provides a multi-echelon mapping between data systems, tools, and decisions, using maturity-based improvements to make smart budget decisions.
Co-authors: Captain Aaron Burciaga, Director, Logistics Operations Analyses, Headquarters, USMC; Jeffery P. Eaton, Director, Operations Research Solutions, Concurrent Technologies Corporation

18. An Analytical Approach to Item Planning and Inventory Management at Starbucks Coffee Company
Stephen Stoyan, Analytics Manager, Starbucks Coffee Company
We present the analytical tools developed for the Starbucks item planning and inventory management project.  A number of items in the Starbucks supply chain have sourcing, demand and, at times, substitutions or new products that challenge the planning organization. To address this, we developed analytical tools that employ various statistical measures, integer programming and stochastic modeling techniques to manage the challenges effectively.  The first tool comprises an algorithm that evaluates an item's demand history and classifies it into an appropriate planning method.  Another tool examines an item's inventory level, providing optimal upper/lower bounds and accounting for safety stock values.   Finally, we have created tools that monitor the data processes involved with inventory levels and safety stock quantities.  In combination, these analytical tools have produced significant improvements in planning and inventory management, and they allow the process to be more agile.

19. Risk Intelligent Modeling: Avoiding the Pitfalls of Black Box Analytics
Ann Thornton, BSc (Hons), FCA, Director, Accounting Valuation & Analytics, Deloitte & Touche LLP; Bob Torongo, MA, Manager, Accounting Valuation & Analytics, Deloitte & Touche LLP
Modern statistical software packages have immense amounts of computational power made accessible by “intelligent” and “user friendly” interfaces.  Such easy-to-use interfaces often obscure assumptions and hide the limitations of the statistical models they produce. As a result, the models generated can easily become “black boxes,” mathematical mountains that are impossible to understand and/or critically review, even for the statistically minded.  Such models, especially when insufficiently documented, are difficult to justify to a third party, such as a regulator. 
We offer practical advice to help avoid or remediate statistical models that have inadequate documentation or non-transparent assumptions. The proposed strategies help limit invalid results and/or misinformed decision making and are designed to separate the statistics from the judgments, such that the statistics are academically supportable while the judgments can be challenged based on expert business knowledge. We bring the two together to help provide transparent, maintainable, and effective statistical models.

20. Using Web Crawling to Augment Databases for Customer Acquisition Modeling
Dirk Van den Poel, Professor, Ghent University
The sales process is generally a stressful undertaking for sales representatives; lead conversion rates (i.e., the rate at which potential clients are turned into “profitable” customers) are typically very low (below 1%). To make this process more efficient, customer acquisition modeling helps identify which prospects have the highest probability of becoming profitable customers. The goal of this paper is to assist sales representatives by predicting which potential customers will be profitable. This is done by finding the optimal combination of data source and data mining technique. The data sources under investigation are commercially available data and web data (i.e., publicly available data drawn from prospects’ websites). We use logistic regression, decision trees, and bagged decision trees. Results show that bagged decision trees are consistently more accurate than the other techniques. Web data is better at predicting profitability than commercially available data, but combining both is better still. Co-authors: Jeroen D’Haen, Ghent University; Dirk Thorleuchter, Fraunhofer INT
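
A minimal sketch of this style of model comparison, using scikit-learn on synthetic data in place of the actual prospect database; the feature count and class balance are illustrative assumptions (the skewed weights mimic the low conversion rate).

    from sklearn.datasets import make_classification
    from sklearn.ensemble import BaggingClassifier
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import cross_val_score
    from sklearn.tree import DecisionTreeClassifier

    # Stand-in for a prospect table built from commercial and crawled web data.
    X, y = make_classification(n_samples=2000, n_features=20,
                               weights=[0.95], random_state=0)

    models = {
        "logistic regression": LogisticRegression(max_iter=1000),
        "decision tree": DecisionTreeClassifier(max_depth=5),
        "bagged trees": BaggingClassifier(DecisionTreeClassifier(),
                                          n_estimators=100),
    }
    for name, model in models.items():
        auc = cross_val_score(model, X, y, cv=5, scoring="roc_auc").mean()
        print(f"{name}: AUC = {auc:.3f}")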

21. Long Term Strategic Workforce Planning for Contact Centers Using Mixed Integer Programming Approaches and Simulation-Based Forecasting Techniques
David Woo, Operations Research Practitioner, Bay Bridge Decision Technologies
Traditional planning for contact centers involves both long-term strategic planning and short-term tactical staff scheduling. Long-term strategic planning is particularly challenging because the lead time for hiring and training agents must be taken into account to ensure high service performance levels. In addition, with growing customer bases and technological advancements, contact centers are becoming increasingly sophisticated: employing different routing strategies, adding multi-channel or multi-media capabilities, and leveraging outsourced services and virtual agents. We describe a methodology that provides forecasting, capacity planning, and what-if analysis capabilities for developing long-term strategic plans for contact center management. Discrete event simulations provide predictive modeling of the queuing systems within contact centers and account for variations in customer patience and abandonment behavior, resulting in accurate prediction of key performance parameters. Mixed integer programming algorithms determine hiring strategies to meet long-term demand, overtime and undertime strategies to meet fluctuating seasonal demand, and the balance of workload and staff allocation across the contact center enterprise.
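
A minimal sketch of the hiring side of such a model, using the open-source PuLP library: hires become productive only after a training lead time, and total hires are minimized subject to meeting forecast demand. The demand figures, lead time, and the simplified no-attrition assumption are illustrative, not the actual formulation.

    import pulp

    demand = [80, 85, 95, 110, 120, 115]   # agents required each month (hypothetical)
    lead = 2                                # months to hire and train an agent
    initial_staff = 90                      # agents on hand today (no attrition modeled)

    prob = pulp.LpProblem("hiring_plan", pulp.LpMinimize)
    hires = [pulp.LpVariable(f"h_{t}", lowBound=0, cat="Integer")
             for t in range(len(demand))]

    def staff(t):
        # Headcount in month t: initial staff plus hires that finished training.
        return initial_staff + pulp.lpSum(hires[k] for k in range(max(0, t - lead + 1)))

    for t in range(len(demand)):
        prob += staff(t) >= demand[t]       # meet forecast demand every month
    prob += pulp.lpSum(hires)               # objective: minimize total hires
    prob.solve(pulp.PULP_CBC_CMD(msg=False))
    print([int(h.value()) for h in hires])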

22. Applying OR in Real-time Railway Operations
Erwin Abbink, Managing Consultant Innovation, Netherlands Railways
We will describe the kinds of organizational problems encountered while implementing the very promising OR-based systems we are developing to support real-time railway operations. We will present our approach to solving these organizational issues: building a new team that is responsible for testing and evaluating the prototypes and for using the system in real life. Problems we address include centralizing dispatching, phasing out the old dispatching system, and creating urgency for introducing the new system. In summary, we give a broad presentation on developing OR systems to support real-time operations in the complex railway system of the Netherlands. We present some technical aspects but also describe our findings from research & development and from the implementation process.

23. Designing for "Consumability": Novel Analytics for Next Generation Health Records
Rema Padman, Professor, Carnegie Mellon University
The digitization of healthcare has resulted in healthcare environments that are rich in data but lacking in analytics that can provide cognitively guided, real-time decision support at the point of use. The Electronic Health Record (EHR) has provided an unprecedented level of access to clinical data in real time; however, EHRs are considered neither user-friendly nor designed for "consumability," the quality that facilitates the use of data and technologies to improve the quality and efficiency of the healthcare delivery system. Additionally, current initiatives rely exclusively on computers and the internet as the platform for EHR solutions, limiting their potential use among a number of critical demographics, such as functionally illiterate, underprivileged, elderly, and disabled populations. Drawing on current research at the Heinz College at Carnegie Mellon University, this talk addresses these challenges by highlighting three aspects of "consumability" in the healthcare setting: those related to data, technology, and a new channel of care delivery, respectively.

24. Applying Stochastic Simulation Methods to Reduce Business Risks in the Airline Industry
Joel Tollefson, Modeling and Simulation Analyst, The Boeing Company
Boeing is developing a simulation tool called CEM (Customer Engagement Modeling) to help airline customers determine how to grow their networks, select the best airplane for a given route, improve profitability, perform competitive analysis, and much more. The current analysis methods are based on a "deterministic" approach: a single set of inputs determines a single set of outputs. The problem is that "real-world" variability is not captured. To better account for variability in airline economics and performance, stochastic simulation and analysis methods are being implemented in the CEM program. This presentation shows how Monte Carlo methods are used to define distributions over the input data and to generate probabilistic output data that account for variability. The result is more realistic data and analysis, enabling airlines to improve their business decisions and reduce risk.
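
A minimal sketch of the Monte Carlo pattern described, using NumPy: inputs become distributions rather than point estimates, and the outputs become a profit distribution from which risk measures can be read. The route economics and distributions below are invented for illustration and are not CEM inputs.

    import numpy as np

    rng = np.random.default_rng(42)
    n = 100_000
    # Hypothetical input distributions for one candidate route.
    load_factor = rng.beta(8, 2, n)              # fraction of seats filled
    fuel_price = rng.normal(3.0, 0.4, n)         # $/gallon
    fare = rng.triangular(250, 320, 450, n)      # $ per passenger

    seats, fuel_burn, fixed_cost = 180, 5_000, 20_000
    profit = load_factor * seats * fare - fuel_price * fuel_burn - fixed_cost

    print(f"mean profit: ${profit.mean():,.0f}")
    print(f"P(loss): {np.mean(profit < 0):.1%}")
    print(f"5th percentile: ${np.percentile(profit, 5):,.0f}")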

25. Affordable Readiness Model
Philip A. Fahringer, Systems Engineer, Lockheed Martin Corporation
The Lockheed Martin-developed Dynamic Comparative Analysis Methodology is fundamentally a decision support approach that produces tailored decision support applications. Once developed, these applications enable decision makers to interactively compare alternatives, account for uncertainty in assumptions, and make analytically based decisions about which courses of action will yield the best results under the widest range of likely scenarios. The methodology is rapid and completely adaptable to any program or budget allocation problem, any set of alternatives, and any set of results that need to be considered, with the ultimate aim of providing insight into which specific alternatives will achieve the best outcomes in terms of overall costs and results.
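
The abstract does not disclose the methodology's internals; as a generic illustration of picking the alternative that performs best across a range of scenarios, the following NumPy sketch applies a minimax-regret rule to a hypothetical payoff matrix (the numbers are invented).

    import numpy as np

    # Hypothetical payoffs: rows = courses of action, cols = scenarios.
    payoffs = np.array([
        [100, 60, 30],
        [ 80, 75, 55],
        [ 60, 65, 70],
    ])
    # Regret: shortfall from the best achievable result in each scenario.
    regret = payoffs.max(axis=0) - payoffs
    # The minimax-regret alternative performs acceptably everywhere.
    robust_choice = regret.max(axis=1).argmin()
    print(f"most robust alternative: {robust_choice}")  # -> 1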

26. Information Architecture for Consumer Healthcare Analytics
George Konstantinow, PhD, Information Technology Consultant
An Information Architecture is a set of information sources and connections among those sources for delivery of information with a purpose. In healthcare information technology, Information Architecture routinely refers to systems architecture for electronic medical records, clinical patient management systems, and billing systems. We propose an architecture for consumer healthcare informatics that centers medical and financial information sources on patients’ requirements as recipients of medical services. This architecture is based on collaboration for medical decision making, information exchange through aggregation of structured and unstructured data, and data connectors based on formal data models as well as generalized user access models (such as social media). Such “data coordination services” connect individuals’ health data within “virtual cohorts,” targeting individuals’ specific health, medical, and lifestyle attributes for decision support. Basing healthcare analytics on such an Information Architecture provides common guidance for analyzing medical information, treatment outcomes, and financial aspects of treatment effectiveness.
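
As a toy illustration of the "virtual cohort" idea, the sketch below matches an individual's health attributes against cohort criteria; all attribute names, cohort names, and matching rules are hypothetical.

    # Hypothetical individual record assembled by a data coordination service.
    individual = {"age": 54, "condition": "type2_diabetes", "activity": "low"}

    # Hypothetical virtual cohorts defined by attribute criteria.
    cohorts = [
        {"name": "T2D_midlife_sedentary",
         "criteria": {"condition": "type2_diabetes", "activity": "low"}},
        {"name": "active_seniors",
         "criteria": {"activity": "high"}},
    ]

    # An individual joins every cohort whose criteria they satisfy.
    matches = [c["name"] for c in cohorts
               if all(individual.get(k) == v for k, v in c["criteria"].items())]
    print(matches)  # ['T2D_midlife_sedentary']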

27. Developing Synthetic Populations and their Use in Agent-Based Models
Charles M. Macal, Director, Argonne National Laboratory
There are many problems in business in which we want to understand how populations will respond to strategies and policies. Examples include how people will respond to marketing campaigns, public health appeals and interventions, and changes in government regulations. One solution is to develop a what-if tool called a computational agent-based model, in which populations composed of individuals are explicitly modeled; individuals have diverse characteristics and behaviors and interact with other agents, possibly through social networks. Agent-based modeling is a new approach to modeling systems comprised of autonomous, interacting agents. Example applications include understanding future requirements on the healthcare system, consumer adoption of new products, projecting tax revenues, demands on the transportation system, epidemics and pandemics, and evacuation and emergency preparedness requirements. The population of agents in an agent-based model is referred to as a synthetic population. The characteristics of the agent population, including demographic data, daily activity data, and behavioral data, must be assembled from the ground up from many data sets. This poster describes the process of developing synthetic populations, their use in agent-based models, and several applications in health care, energy, and market adoption.
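
A minimal sketch of synthetic population construction, sampling agents from census-style marginals with NumPy and pandas. Real pipelines typically fit against multiple data sets (e.g., via iterative proportional fitting) rather than sampling attributes independently, and the proportions below are invented.

    import numpy as np
    import pandas as pd

    rng = np.random.default_rng(0)
    n = 10_000
    # Hypothetical census-style age marginals for one region.
    age_groups = rng.choice(["0-17", "18-39", "40-64", "65+"], size=n,
                            p=[0.22, 0.30, 0.32, 0.16])
    # Employment probability conditioned (crudely) on age group.
    employed = rng.random(n) < np.where(age_groups == "18-39", 0.75, 0.45)

    # Each row is one synthetic agent; behaviors (daily activities, social
    # ties) would be attached from activity surveys in the same way.
    agents = pd.DataFrame({"age_group": age_groups, "employed": employed})
    print(agents.groupby("age_group")["employed"].mean())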
