Speakers & Tracks

Bahadir Aral
Zahir Balaporia, CAP
Cody Baldwin
Jim Bander
Michael Bentley
Anuradha Bhamidipaty
Barnali Bhattacharjee
BNSF Railway
Charles Brandon
Jon Breece
Alan Briggs, CAP
Melissa Bowers
Muge Capan
Cavan Capps
James Cochran
China National Petroleum Corporation
John Cuffe
Bill Danker
Joseph Dery
Ameya Dhaygude
Ciro Donalek
Martin Ellingsworth
Ray Ernenwein
Europcar
Federal Communications Commission
Gary Godding
Bill Griffin
Yael Grushka-Cockayne
Steven Hamlen
Kristian Hammond
Warren Hearnes
Mary Helander
IBM
Dan Iliescu
Noyan Ilk
Tasha Inniss
Intel
Christopher Jerde
Anssi Kaki
Ity Kanoria
Nick Kastango
Pavan Korada
Ahmet Kuyumcu
Gertjan de Lange
Marcial Lapp
Simon Lee
Jay Liebowitz
Bing Liu
Marco Lübbecke
Macys
Freeman Marvin
Richard McGrath
Connor McLemore
Polly Mitchell-Guthrie
Robert Moakler
Wendy Moe
Mike Morello
David Morrison
Anna Nagurney
Scott Nestler
Northwestern University
Arne Owens
Randy Paffenroth
Gregory Parnell
Pediatric Heart Network
Lea Pica
Christina Phillips
Robert Phillips
Jennifer Priestley
Stuart Price
Brian Pujanauski
Nancy Pyron
Shanshan Qiu
Anu Raman
Eva Regnier
Bill Roberts
Schneider
Michael Schuldenfrei
Bhavna Sharma
Mona Siddiqui
Manjeet Singh
Giridhar Tatavarty
Douglas Thomas
Luke Thompson
Noha Tohamy
Cenk Tunasar
Turner
United States Air Force
Sridhar Vaithianathan
Lalit Wadhwa
Vijay Wadhwa
The Walt Disney Company
Tim Wilson
Sallamar “Sally” Worrell, CAP
Nick Wzientek, CAP
Tauhid Zaman
Michael Zargham
Mei Zhang

Bahadir Aral

SAS Institute Inc.

Improving Forecasting Process Using Machine Learning

Forecasting is an essential part of the sales and operations planning process. Statistical forecasts are the starting point for demand planners, who incorporate their intuition, business knowledge, and biases to arrive at the final planning forecast. The value of planners’ overrides is typically measured using the forecast value-add (FVA) metric. In this talk, we walk through a business case based on real-world experience to show how machine learning (ML) can be used to improve forecast value-add scores. ML is used to support the override process by recommending when to override (or not) and in which direction. Our results show that we can improve the performance of planners with poorer forecast value-add scores and detect, with high accuracy, the time periods in which overrides to the statistical forecast will be successful.
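As a rough illustration of the kind of override recommendation the abstract describes (a hedged sketch, not SAS’s implementation; the file names, feature columns, and model choice below are assumptions), the problem can be framed as learning from historical forecast/override outcomes:

```python
import pandas as pd
from sklearn.ensemble import GradientBoostingClassifier

# Historical planning data: one row per product/period with the statistical
# forecast, the planner's final forecast, and realized demand.
# File name and column names are illustrative assumptions, not a real schema.
hist = pd.read_csv("forecast_history.csv")

# Forecast value-add (FVA): how much the planner's override reduced absolute
# error relative to the statistical baseline (positive means it helped).
stat_err = (hist["actual"] - hist["stat_forecast"]).abs()
plan_err = (hist["actual"] - hist["final_forecast"]).abs()
hist["fva"] = stat_err - plan_err

# Label each period by whether overriding was, in hindsight, the right call.
hist["should_override"] = (hist["fva"] > 0).astype(int)

# Simple numeric features a recommender might use (again, assumptions).
features = ["stat_forecast", "recent_bias", "demand_volatility", "promo_flag"]

clf = GradientBoostingClassifier(random_state=0)
clf.fit(hist[features], hist["should_override"])

# For the current planning cycle, flag periods where an override is likely to
# add value; the override direction could be modeled analogously.
current = pd.read_csv("current_cycle.csv")
current["recommend_override"] = clf.predict_proba(current[features])[:, 1] > 0.7
```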

  • Bahadir Aral is an expert in advanced analytics and optimization. He has 10 years of experience leading and delivering advanced analytics projects in pricing & revenue management, supply chain, predictive modeling, simulation, and optimization for Fortune 500 customers. Before joining the SAS Advanced Analytics and Services division, he was a member of the Zilliant science team and led science efforts for a variety of price segmentation and optimization projects in multiple industries, including chemicals, electrical, power and wiring manufacturing, auto parts, high tech, and building materials. He was also actively involved in conducting internal research projects on demand modeling and price optimization. Education: Texas A&M (Ph.D.).
Cody Baldwin

Assistant Professor of Business Management
Brigham Young University-Hawaii

Bringing Competition to the Analytics Classroom: A Practical Guide to get Started

There is now extensive research on the value of competition (or “gamification”) across education; it can dramatically increase student motivation and engagement. In the world of analytics education, the rise of open data science competitions makes it easier than ever to add “gamification” to the curriculum. Students can compete against top talent across the globe on real-world business problems. As we have incorporated competitions into our analytics coursework at Brigham Young University-Hawaii, student feedback shows an increase in motivation to learn beyond the classroom, among other benefits. This presentation will provide educators with practical advice and materials, such as learning outcomes, assignment guidelines, and grading rubrics, for implementing competitions in their analytics classes.

  • Cody Baldwin, MBA, PMP, CSM is an assistant professor of business management at Brigham Young University-Hawaii and currently teaches courses in business analytics, operations, project management, and process improvement. Prior to joining the faculty at BYU-Hawaii, Mr. Baldwin worked at HP, where he was a Senior Project Manager leading several big data and analytics initiatives. While there, he managed teams working across 6 countries and 3 continents.
Jim Bander

Analytics Lead
Experian

What do Credit Scores, Blockchains, and Vehicle Navigation Systems Have in Common? Profiting from Optimization with Imperfect Data

Modern analytics technology empowers companies to make automated and/or optimal decisions with large amounts of data. Yet real-world data have limitations in terms of completeness, timeliness, and accuracy. An emerging topic is to make optimal decisions while accounting for such data quality issues. This talk will: (a) illustrate the problems that arise when treating incomplete or delayed data as though they were completely accurate and timely; (b) demonstrate that new machine learning technology will allow decisioning systems to approach data more skeptically, with the algorithmic “grain of salt” that has been intractable until recently; and (c) explore the potential benefits of such an approach in applications including consumer finance, cryptocurrency, and mobility.

  • Jim Bander is the new Analytics Lead for Experian Advisory Services’ Go To Market team. He serves as Experian’s subject matter expert for the North American financial services industry, focused on Analytics.

    Until recently, Jim was responsible for decision science in the risk management department at the world’s largest auto lender. Among other projects, Jim’s teams there implemented an award-winning Collections Treatment Optimization system that kept thousands of customers from losing their cars and tens of thousands more from getting markers on their credit bureau reports after the Great Recession. The company received 6 awards for that project; Information Week rated it the best Analytics project of 2016.

    Jim has been involved with INFORMS since he was a part of the University of Virginia student chapter in the 1990s. He was chairman of INFORMS’s financial services section (now the section on Finance) from 2010 through 2014. Jim received his PhD from the University of Michigan in Ann Arbor and taught at the University of Florida before leaving academia to focus on the practice of analytics.

    As a practitioner, he has managed optimization applications for a number of problem domains including routing passengers and freight in transportation networks, capital planning, and consumer lending. His work on prescriptive analytics has been cited by sources including Harvard Business Review/SAS, Gartner, FICO, and CIO.com. Jim has also served as president of a charity that raised golden retrievers and other service dogs to alert when children had adverse events related to their juvenile diabetes.

     

Michael Bentley

Partner
Revenue Analytics

Aligning Analytics & Culture: Is Your Organization Ready for the Industry’s Digital Transformation?

The supply chain and logistics industry is all abuzz with talk of the industry’s digital transformation. New and better-quality data is being created every day. From real-time availability and demand forecasting, to competitive pricing and market share data, technology and new sources of information are enabling a radical transformation of the commercial decision-making within the shipping and logistics industry. While the opportunity is massive, and many shippers are already beginning to make investments in predictive analytic capabilities, the most advanced tools in the world will get little traction unless your organization, pricing processes, data, and incentives are aligned.

Based on the speaker’s recent experience implementing an end-to-end RM system for a global container shipping company, this presentation will focus on identifying the key challenges to implementing predictive commercial analytics in shipping and ways to mitigate the risk of these hurdles endangering the success of your program. From dirty data resulting from bad business practices to misaligned incentives for margin and volume growth, this talk will highlight the unforeseen challenges associated with implementing a Revenue Management program in shipping and logistics and battle-tested ways to address these challenges. Finally, this presentation will outline a plan for successful implementation of predictive commercial analytics in the shipping industry, while providing key best practices and takeaways that should be considered when embarking on an analytics journey of your own.

  • As a Partner at Revenue Analytics, Michael Bentley manages client relationships and leads engagements with Fortune 500 clients on pricing and Revenue Management strategy, analytics and business process issues for the Airline and Transportation practice area. During his tenure, he has managed strategic and tactical engagements for clients developing new capabilities to improve pricing and Revenue Management and to measure forecast accuracy and pricing performance. Michael has experience across multiple industries including hospitality, ocean shipping, air cargo, rail, airlines, retail, gaming, food service and manufacturing.

    Prior to joining Revenue Analytics, Michael was Director of Analytics in InterContinental Hotels Group’s (IHG) Global Revenue Management organization. In this role, he led a variety of initiatives to enhance the forecasting and optimization capabilities of IHG’s pricing and Revenue Management systems. Prior to IHG, Michael spent eight years with Delta Air Lines providing business analysis and strategy in a variety of areas including International Pricing, e-Commerce Revenue Management and Network Development.

Anuradha Bhamidipaty

Using Unstructured Data Analytics To Find Corporate Acquisition Targets

Identifying the right targets at the right time is essential for successful acquisitions. The process involves comprehensive search and discovery, and efficiently creating lists of “ideal” targets from among millions of candidate companies. In response to IBM’s own mergers and acquisitions requirements, IBM Research has developed capabilities to support its in-house M&A teams. These capabilities are centered around unstructured data analytics, leveraging text-similarity algorithms to discover companies that meet user-specified profiles. Interactive visualizations that accompany the data analytics aid business users in exploring the recommendations and interactively refining the search space. Initial user feedback suggests that the solution enables users to quickly create lists of companies with better candidates. We believe the approach is generic and can be used for discovering candidates not just for acquisitions but also for prospective clients, partners, etc.
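A minimal sketch of the kind of text-similarity matching described above (the IBM system’s actual algorithms, data, and thresholds are not public; the company descriptions and profile below are invented for illustration):

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

# Free-text descriptions of candidate companies (invented examples).
companies = {
    "Acme Analytics": "cloud platform for supply chain demand forecasting",
    "DataWeave Labs": "natural language processing tools for contract review",
    "Foobar Robotics": "warehouse automation and robotic picking systems",
}

# Target profile written in plain language by an M&A analyst (also invented).
profile = "machine learning software for forecasting and planning"

names = list(companies)
corpus = [companies[n] for n in names] + [profile]

# Embed candidates and profile in the same TF-IDF space, then rank candidates
# by cosine similarity to the profile.
tfidf = TfidfVectorizer(stop_words="english")
vectors = tfidf.fit_transform(corpus)
scores = cosine_similarity(vectors[len(names)], vectors[:len(names)]).ravel()

for name, score in sorted(zip(names, scores), key=lambda x: -x[1]):
    print(f"{name}: {score:.2f}")
```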

  • Anuradha Bhamidipaty is with IBM Research, IBM T.J. Watson Research Center, New York (anubham@us.ibm.com). Ms. Bhamidipaty has been with IBM Research for over 14 years. She currently manages the AI Solutions group, which applies technologies that operate on unstructured text to client problems. Her interests include cognitive discovery and similarity analysis over data aggregated from a multitude of systems. Previously she co-led the Global Technology Outlook, IBM Research’s annual outlook on the future of information technology. Prior to that, she led services delivery research at IBM Research in Bangalore, India, where her work focused on processes and techniques to automate and streamline service execution in IBM’s large-scale IT delivery organization. She has published papers in leading conferences, including KDD and VLDB, and has over 20 patent applications. She is a member of the IBM Academy of Technology and has been an IBM Master Inventor.
Barnali Bhattacharjee

Data Scientist and Deployment Lead
The Dow Chemical Company

Hierarchical Statistical Demand Forecasting at Dow Chemical: Business Challenge, Data, Methods, Tools, and Value (Co-Speaker with Ameya Dhaygude)

The Dow Chemical Company is one of the world’s leading chemical manufacturers. Dow manufactures plastics, chemicals, and agriculture products. With a presence in over 150 countries, it employs over 50,000 people worldwide. Dow serves customers in different market segments, with a wide variety of products made for each one. Dow’s scale makes demand forecasting a challenge, and the company is using hierarchical statistical demand forecasting to accurately forecast demand for key products and customers. In this talk, you’ll learn about the demand forecasting challenges at Dow, the data and statistical methods/tools used to implement the forecasts, and the value it’s delivering to the company.
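For readers unfamiliar with the term, here is a minimal sketch of bottom-up hierarchical forecasting, one common way to keep forecasts consistent across a product/customer hierarchy (a generic illustration, not Dow’s method; the file name, columns, and model settings are assumptions):

```python
import pandas as pd
from statsmodels.tsa.holtwinters import ExponentialSmoothing

# Monthly demand history at the lowest level of the hierarchy
# (product x customer). File and column names are illustrative assumptions;
# the seasonal model below assumes at least two full years of history.
demand = pd.read_csv("demand_history.csv", parse_dates=["month"])

horizon = 6
bottom = {}

# Forecast each bottom-level series independently.
for (product, customer), grp in demand.groupby(["product", "customer"]):
    series = grp.set_index("month")["quantity"].asfreq("MS").fillna(0)
    fit = ExponentialSmoothing(series, trend="add", seasonal="add",
                               seasonal_periods=12).fit()
    bottom[(product, customer)] = fit.forecast(horizon)

fcst = pd.DataFrame(bottom)  # columns form a (product, customer) MultiIndex

# Bottom-up reconciliation: aggregating the detail-level forecasts guarantees
# that product-level and total forecasts add up to the detail level.
product_level = fcst.T.groupby(level=0).sum().T   # sum over customers
total = fcst.sum(axis=1)                          # sum over everything
```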

  • Barnali Bhattacharjee joined The Dow Chemical Company as a Data Scientist and Deployment Lead in January 2018, after graduating from Michigan State University with a Master’s in Business Analytics. Her focus areas are Supply Chain, Commercial, and Procurement. She also has a Master’s degree in Statistics from the Indian Institute of Technology, Kanpur. Barnali has 5 years of experience in data science in the retail and corporate banking domains, with expertise in predictive modelling, data mining, and strategic decisioning.
BNSF Railway

Automatic Train Identification: Multidisciplinary Approach to Improve Safety and Efficiency

BNSF Railway has invested significant capital in various trackside detectors that monitor the condition of critical mechanical parts, including wheels and bearings, to enable proactive maintenance of assets. Among others, we have 1,200 Hot Box Detectors (HBDs), which measure wheel bearing temperature as trains pass by and generate 25,000 messages daily. Many HBDs are in locations that lack Automatic Equipment Identification (AEI) devices. This presents the unique challenge of accurately matching HBD measurements with the corresponding train, car, and axle. The Operations Research team at BNSF developed a suite of descriptive, predictive, and prescriptive analytics tools that significantly improved train matching efficiency and accuracy. Compared to the legacy system, the new system has improved train matching rates from 75% to 98% and reduced processing time by 70 seconds. Additionally, these algorithms directly eliminated the need to install AEI devices (approximately $150 million) near HBDs across our network.

Dr. Charles T. Brandon, III

Director for Continuous Process Improvement
Headquarters Department of the Army
Office of Business Transformation

Army Data Analytics: Moving from a Coalition of the Willing to a Total Army Approach

Leaders in industry, academia, and government broadly agree on the incredible potential of big data, data analytics, and data science in identifying opportunities for innovation, driving operational improvement, and advancing organizations toward their strategic goals. The availability of enormous quantities of data in the Army, combined with tools for rapidly analyzing data on a massive scale, carries the potential to answer previously unanswerable questions or to ask questions that, today, we don’t know we should even be asking.

To optimize this potential, we are working to develop an enterprise-level data analytics capability that will allow us to analyze, extract meaningful information from, and make decisions and discoveries based upon large, diverse, and real-time datasets. This foundation for innovative analysis can generate new insights and inform strategic decisions regarding matters that are fundamental to the Army’s Title 10 United States Code responsibilities, including readiness, modernization, and related planning, programming, and budgeting. Ultimately, and most importantly, it will facilitate the resource-informed decision making necessary to support optimized efficiency in manning, training, and equipping the force.

The Enterprise Data Analytics Strategy (EDAS) for the Army builds on the framework of the Army Data Strategy, which established the foundation for sharing data, information, and IT services to make data visible, accessible, understandable, trusted, and interoperable. The EDAS will advance this data management framework to set the doctrinal, organizational, and personnel-related conditions for innovative analysis that provides actionable insights into areas of strategic importance.

A significant amount of work has been done across the Army’s 24 distributed communities of practice in using data analytics to solve Army problems and in elevating it, through this strategy, as a means of achieving competitive advantage in a complex world. However, full realization of the benefits associated with an Army-wide data analytics capability will require leadership engagement at all levels, increasing leaders’ awareness, understanding, and use of its great potential.

  • Dr. Charles “Chuck” Brandon proudly serves as the Director for Continuous Process Improvement in the Army’s Office of Business Transformation. In this assignment he is tasked with revitalizing process improvement across the Army. Dr. Brandon feels that, at this moment, the Army’s program represents perhaps the greatest opportunity to positively impact our ability to improve the processes and programs that support our service members around the world. There is a clarion call from all levels to increase our efficiencies, especially with respect to our back-office processes. As such, now is the right time to take a fresh approach to developing a culture of continuous process improvement.

    He joins the Army team after an assignment as the Technical Director for Business Optimization in the Office of the Deputy Chief Management Officer, where he was responsible for identifying best practices from within the federal, private, and academic arenas and assessing how they could be applied to improving the business of Defense. Prior to that appointment he led the Department of Defense Continuous Process Improvement (CPI)/Lean Six Sigma Program Office, reporting to the Acting Deputy Chief Management Officer (DCMO), the Honorable Dave Tillotson, Principal Staff Assistant to the Deputy Secretary of Defense. He was responsible for leading the performance management activities of the Department of Defense (DoD), the largest organization in the world, with a $700B+ annual budget and over three million personnel dispersed in over 140 countries. As the DoD CPI/LSS Director he directed arguably the largest and most complex deployment of performance management ever attempted: driving DoD-wide performance improvement activities, rigorously tracking results, providing training, assisting the Department in establishing and growing its program, and ensuring the office captured best business practices Enterprise-wide. Dr. Brandon works with Office of the Secretary of Defense Principal Staff Assistants, Service Deputy Chief Management Officers (DCMOs), Service and Defense Agency Chief Information Officers (CIOs), and Program Executive Officers (PEOs) to promote collaboration across the Department on Enterprise-level initiatives.

    Prior to this assignment, Dr. Brandon served as Training Manager, Planning Performance and Management (PPM). In this role, he was responsible for Lean Six Sigma (LSS) and Business Process Reengineering (BPR) training for the Office of the Secretary of Defense (OSD). His responsibilities included curriculum development, management and oversight of LSS Master Black Belt (MBB) instructors, and certification of LSS students. Prior to serving in this capacity, he oversaw OSD’s Warfighter Support portfolio, managing Enterprise-level projects across Iraq, Kuwait, and Afghanistan. While in theater, Dr. Brandon provided direct support to the Warfighter by streamlining operational processes and finding efficiencies in activities directly affecting combat and non-combat operations.

    Before his assignment to OSD, Dr. Brandon served as Chief, Business Transformation and LSS Programs Office for the United States Army Training and Doctrine Command (TRADOC). In this assignment, he served as Deputy Deployment Director, MBB, and Primary Trainer for LSS/Continuous Process Improvement (CPI), where he exercised central management and oversight of all performance improvement training, education, certification, and related activities.

    Dr. Brandon is a retired Army Officer, certified LSS MBB, certified Training Developer, certified Project Management Professional (PMP), Data Scientist, and certified Army Instructor. He has served as a change agent within the Federal space for over 15 years. Dr. Brandon holds a Bachelor’s Degree in Economics, a Master of Business Administration, a Master’s in Information Technology, and a Doctorate of Business Administration in Quality Systems Management from the National Graduate School of Quality Management, where he also serves as an adjunct professor and is currently working on his post-doctorate in Industrial and Organizational Psychology.

Brooks ‘Jon’ Breece

Geospatial Analyst
National Geospatial-Intelligence Agency

Advanced Analytics at the National Geospatial-Intelligence Agency

A combat support organization and a member of the US Intelligence Community, the National Geospatial-Intelligence Agency (NGA) recognizes that the volume and diversity of spatiotemporal data flowing into analytic units has moved “beyond the limits of human interpretation and explanation” (GEOINT CONOPS 2022). The organization is evolving technologically and culturally to provide analyses to policymakers and warfighters that utilize, for example, data feeds on real-time conditions, topography, and atmospherics. Harnessing these data will move the organization from anecdote-based analysis to evidence-based understanding and toward models of adversary behavior and other phenomena that are data-driven, dynamic, and anticipatory. The scale of data and our excitement for technology, however, are not reasons to jettison theory and other elements of rigorous investigation. Data and technology are only part of the ecosystem, which also includes the skilled workforce required to turn what has been termed the “tsunami of data” into timely, actionable decision aids. Advanced analytics is a team activity that involves front-line personnel; data scientists and geospatial analysts; GIS and other technologies; and data. Like military leaders, business decision-makers must mobilize all these forces in a multi-pronged campaign to build a successful data-driven organization.

  • Brooks “Jon” Breece is a geospatial analyst with the National Geospatial-Intelligence Agency. Jon completed his undergraduate studies at the University of Virginia and graduate school at the University of North Carolina at Chapel Hill, where he earned a Master of Science degree in library science, a Master of Public Administration, and a graduate certificate in geospatial information sciences. In 2011, Jon received the Pratt-Severn Best Student Research Paper Award from the now-named Association for Information Science and Technology (ASIS&T) for his master’s paper, “Local Government Use of Web GIS in North Carolina,” in which he highlighted the increased use of Web mapping services by all levels of government and how technology use can lead to an evolution in agencies’ workflows and cultures.
Melissa Bowers

Associate Professor and the Beaman Professor of Business in the Haslam College of Business
University of Tennessee

The History and Current State of Analytics Education

The past decade has witnessed explosive growth in the number and types of analytics educational programs. We review the history of the development of analytics education and discuss the breadth of current offerings in various forms. We compare and contrast the various curricula and their impact on the practice of analytics in business, government, nonprofits, and other organizations.

  • Melissa R. Bowers, Ph.D., is an Associate Professor and the Beaman Professor of Business in the Haslam College of Business at the University of Tennessee, Knoxville, where she is the Director of the Master’s Program in Business Analytics. She is currently the chairperson of the INFORMS University Analytics Programs Committee. Her teaching and research interests include scheduling, operations, and discrete optimization models. Dr. Bowers has worked with organizations such as Lockheed, Delta Air Lines, Air New Zealand, Embraer, Boeing, and the University of Tennessee Medical Center. She has published in Interfaces, MIT Sloan Management Review, European Journal of Operational Research, and several other academic and professional journals.
Muge Capan, PhD

Using Advanced Analytics In Healthcare To Predict Sepsis Risk & Patient Outcome

This session will present a unique case study of how advanced analytics are deployed to tackle complex health care problems. The goals are to diagnose and risk-stratify sepsis, a life-threatening organ dysfunction resulting from the body’s response to infection, and to inform treatment decisions. A variety of methods will be presented, including descriptive analysis, regression, hierarchical variable cluster analyses, and semi-Markov decision process models. Key findings and lessons learned for addressing critical decision problems using information technology and subject matter expertise will be shared.

  • Muge Capan, Ph.D., received her Ph.D. in Industrial and Systems Engineering from North Carolina State University. She has expertise in predictive analytics and optimization applied in health care, e.g., decision-analytical modeling, forecasting, and mathematical modeling with applications to medical decision making and optimization of health care systems. Dr. Capan’s current research projects include analytical models for risk stratification in inpatient settings, early-warning-score-based clinical decision support systems, development and implementation of analytical models for personalization of sepsis diagnosis and treatment in inpatient settings, and patient flow optimization. She is currently the Principal Investigator on a multi-institutional National Science Foundation grant and serves as a Co-Investigator on an NIH National Library of Medicine R01 grant. Both grants focus on sepsis identification and management by integrating industrial engineering and computer science into patient care and advancing the scientific knowledge to predict and prevent sepsis-induced health deterioration.
Cavan Capps

United States Census Bureau

Secure Distributed Computational Processing For Industry Statistical Data

Secure distributed technology for computing statistics is being developed by DARPA and the Census Bureau to process real-time big data from businesses, encrypted in such a way that only the providing business has the means to decrypt and read the data. Census cannot read the data, but will have the ability to generate various statistics about the state of the entire industry reliably.

This is all done using open source software and the Intel SGX secure enclave chip. Our goal is to greatly improve industry-wide statistics and their usefulness to industry while significantly reducing respondent burden.
A Shipment Survey has been selected as the use case for the pilot, using simulated data from that survey. Machine learning is used to standardize product codes before the data is encrypted for transmission, enabling consistent definitions across a given industry.

When received, the data is parsed from standard electronic transaction records. It is tabulated from multiple company donors at scale using the cloud, and formally private, industry-wide aggregated results will be released to participants.

We have also explored linking data from different data donors, which simulates the linkage needed to track packages from shippers (such as Walmart) to carriers (such as UPS) in order to provide better transportation data and information on aggregate supply chain data. Simple business intelligence queries can be run on the final detailed formally private aggregates to inform business decisions in the shipping and logistics industry as well as national economic policy.
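The “formally private aggregates” mentioned above typically refer to differential privacy. A minimal sketch of a Laplace-mechanism release (not the Census Bureau’s actual mechanism; the epsilon, clipping bounds, and shipment values are invented):

```python
import numpy as np

rng = np.random.default_rng(0)

def dp_sum(values, lower, upper, epsilon):
    """Differentially private sum via the Laplace mechanism.

    Each contribution is clipped to [lower, upper], so a single donor can
    change the true sum by at most (upper - lower); that bound is the
    sensitivity that calibrates the noise.
    """
    clipped = np.clip(values, lower, upper)
    sensitivity = upper - lower
    noise = rng.laplace(loc=0.0, scale=sensitivity / epsilon)
    return clipped.sum() + noise

# Illustrative shipment values (in dollars) reported by several companies.
shipments = np.array([120_000, 95_500, 310_250, 87_300, 144_900])

# Release an industry-wide total with a formal privacy guarantee.
print(dp_sum(shipments, lower=0, upper=500_000, epsilon=1.0))
```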

  • Mr. Capps is focusing on new big data sources for use in official statistics, best-practice private sector processing techniques, and software/hardware configurations that may be used to improve statistical processes and products. He is designing and implementing the Big Data Lab at Census to explore these new possibilities with statistical and machine learning analysts in the Bureau. Previously, Mr. Capps initiated, designed, and managed a multi-enterprise, fully distributed statistical network called the DataWeb, one of the first and largest government open-data platforms. Currently, the DataWeb is the source of the official API to Census data products.
James Cochran

Professor of Applied Statistics and the Rogers-Spivey Faculty Fellow in the Department of Information Systems, Statistics, and Management Science
University of Alabama’s Culverhouse College of Commerce and Business Administration

ABOK (Analytics Body of Knowledge)

In this talk we provide an overview of the process of creating and publishing the INFORMS Analytics Body of Knowledge (ABOK), review its contents, and discuss the industry/academia coordination that was necessary to bring the ABOK to fruition. We will also discuss ways that industry/academia collaborations can enhance future editions of the ABOK and the practice of analytics.

  • James J. Cochran is Professor of Applied Statistics and the Rogers-Spivey Faculty Fellow in the Department of Information Systems, Statistics, and Management Science at the University of Alabama’s Culverhouse College of Commerce and Business Administration. He earned a B.S. in Economics (1982), an MS in Economics (1984), and an MBA (1985) from Wright State University, and a PhD in Statistics (1997) from the University of Cincinnati. He has been a Visiting Scholar with Stanford University, the University of South Africa, the Universidad de Talca, and Pôle Universitaire Léonard De Vinci.

    Professor Cochran’s research interests include statistical methods (particularly general linear models), statistical learning, sample based and Bayesian optimization, and applications of statistics and operations research to real problems from a wide variety of disciplines. He has published over thirty articles in peer-reviewed academic journals and seven textbooks in statistics, analytics, and operations research. Professor Cochran has served as a consultant for many companies, government agencies, and NPOs.

    Professor Cochran was a founding co-chair of Statistics Without Borders, and he established and has organized INFORMS’ Teaching Effectiveness Colloquium series and annual Case Competition. Professor Cochran also established an annual International OR & Statistics Education Workshop series and has co-chaired workshops through this initiative in Uruguay, South Africa, Colombia, India, Argentina, Kenya, Cameroon, Croatia, Tanzania, Cuba, Mongolia, and Moldova. In 2008 he organized and chaired the ORPA Conference on Using Operations Research to Address Urban Transport and Water Resource Management Issues in Africa.

    In 2006 Professor Cochran was elected to the International Statistical Institute, in 2008 he received the INFORMS Prize for the Teaching of OR/MS Practice, and in 2010 he received the Mu Sigma Rho Statistical Education Award. In 2011 Professor Cochran was named a Fellow of the American Statistical Association, in 2014 he received the Founders Award from the American Statistical Association, and in 2015 he received the Karl E. Peace Award for Outstanding Statistical Contributions for the Betterment of Society. In 2017 he received the Waller Distinguished Teaching Career Award and was named a Fellow of the Institute for Operations Research and the Management Sciences. Professor Cochran is the founding Editor-in-Chief of the Wiley Encyclopedia of Operations Research and the Management Sciences and the Wiley Series in Operations Research and Management Science. He has served as the Editor-in-Chief of INFORMS Transactions on Education and serves on the editorial boards of several other journals and publications, including Significance and Interfaces.

China National Petroleum Corporation

Natural Gas Pipeline Transmission Optimization for China National Petroleum Corporation

China National Petroleum Corporation (CNPC) is China’s largest oil and natural gas producer and supplier and controls 75 percent of the country’s natural gas resources and pipeline network. Over the past five years China’s natural gas consumption has nearly doubled, and demand for natural gas is expected to grow at a steady rate. To better serve the increasing demand, CNPC partnered with researchers from the University of California, Berkeley and Tsinghua University (Beijing) to apply innovative operations research in order to develop and implement new software that optimizes the operation of its natural gas pipeline network. Previously, all annual production and construction planning for CNPC was conducted manually using spreadsheets. However, the increasing size and complexity of China’s natural gas pipeline network caused the manual method to result in excess costs and wasted resources. Since the implementation of the new optimization software at the end of 2014, CNPC has realized approximately $530 million in direct savings and extra revenue. Meanwhile, the increased efficiency of the existing pipeline network has postponed the need for new pipelines, leading to an estimated savings of over $18 billion in construction costs.

John Cuffe

Researcher in the Center for Big Data Research and Applications
U.S. Bureau of the Census

Using Public Data to Improve Accuracy of North American Industry Classification System Codes at the Census Bureau

The North American Industry Classification System (NAICS) code is fundamental for sample development and statistical product rigor in estimating and forecasting the health and direction of the US economy. However, the current process of NAICS assignment (which dates back to the mid-2000s) is not efficient and places a high burden on manual classification. These weaknesses can be quickly and inexpensively overcome with modern text mining and machine learning techniques. By combining information from IRS tax forms with publicly available information on businesses, we are able to generate a new coding model that assigns NAICS classifications more accurately and at lower cost than previous methods. This research has significant cross-agency implications for our ability to improve NAICS and other classification models.
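A minimal sketch of the text-classification idea described above (the Census Bureau’s actual features, training data, and model are not public; the file name, columns, and model choice here are assumptions):

```python
import pandas as pd
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline

# Labeled examples: a free-text business description and its NAICS code.
# File name and columns are illustrative assumptions.
data = pd.read_csv("business_descriptions.csv")  # columns: description, naics

X_train, X_test, y_train, y_test = train_test_split(
    data["description"], data["naics"], test_size=0.2, random_state=42)

# Character n-grams are reasonably robust to the abbreviated, noisy text
# found in administrative records.
model = make_pipeline(
    TfidfVectorizer(analyzer="char_wb", ngram_range=(3, 5), min_df=2),
    LogisticRegression(max_iter=1000),
)
model.fit(X_train, y_train)

print("holdout accuracy:", model.score(X_test, y_test))
print(model.predict(["retail sale of womens clothing and accessories"]))
```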

  • John Cuffe is a Researcher in the Center for Big Data Research and Applications at the U.S. Census Bureau, focusing on the use of text data and new record linkage techniques to generate new, or improve existing, public statistical products.
Bill Danker

Head of Seeds Research and Product Development Domain
Syngenta

Assembling Structured And Unstructured Data To Advance Next Generation Analytics In Agriculture

The agriculture industry is faced with the challenge of feeding a rising population. As a business serving the agriculture industry, Syngenta is committed to helping farmers meet that challenge in a sustainable way. The world will need to produce more food in the next 50 years than it has in the previous 10,000, while using resources far more efficiently. Simply put, crop production must increase. We recognized that in order to meet this challenge, we needed to leverage more advanced computer science methods that could provide predictive and prescriptive analytics and, in turn, drive improvements in crop production and optimize our breeding programs. Our early learnings were that there were no easy solutions and that we had many challenges related to data. Plant breeding is complex. A plant breeder’s job is to create stronger plants by creating offspring and then selecting the best offspring over time to provide the best products to growers. The performance of a plant is determined by three major factors: genes, environment, and the interaction between genes and the environment. In order to develop robust models that identify patterns in our experimental data, we had to integrate data from multiple sources: performance and genotypic data for our products, and environmental data from where products are tested and grown. While our models and applications are proprietary, this discussion will focus on our experiences building our training sets, as well as the business challenges and the methods/approaches used to meet those challenges.

  • As Head of Seeds Research and Product Development Domain at Syngenta, Bill Danker is responsible for leading the delivery of tools and applying methods focused on process improvement, optimization and improved decision making in Syngenta’s breeding programs.

    Bill joined Syngenta in 2005, after more than 20 years of experience in R&D, Computer Science, Marketing and Agriculture. He served as CEO of Inventive Communications (a Dotcom startup in the late 1990s). During his 13-year tenure at Syngenta, Bill has held a variety of IT roles spanning commercial and R&D functions that include platform and application development of high-throughput systems, Genetic/Genomic research tools, next-generation breeding tools and new Gene Editing technology implementations. As head of IT architecture he was responsible for the development of new high-performance computing environments for R&D and drove the current expansion and adoption of high-redundancy, high-distribution parallel processing technologies that are enabling Syngenta’s next-generation analytics.

    Most recently, Bill has led global IT projects to bring innovative approaches to application development, agile project delivery and rapid application development to Syngenta. In 2017, Bill led the establishment of Syngenta’s new digital innovation labs in Europe and North America.

    Bill received his Bachelor of Science degree in computer science, with a minor in business administration, from Morningside College. He earned his Master of Science degree in Business Administration Sciences from Iowa State University.

    Bill is married and has five children now spread across the U.S.; he resides in the Raleigh-Durham, North Carolina, area.

Joseph Dery

PhD student and Adjunct Lecturer
Bentley University

Towards Strategic “Jingle” Success: An Interplay of Business & Music Analytics

Ever have a catchy jingle stuck in your head? Ever wonder why? Well, in marketing, jingles are intended to communicate both information and emotions to you, the consumer, in concise, packaged bursts of sound. In many ways, these jingles are intended to trigger the recall of a brand or a particular message in your head – which, if you answered the original question with a resounding “YES!”, means it probably worked. HOWEVER, given their short duration, jingles rely on you, the consumer, to make quick AND correct associations… a truth effect. So, in an attempt to unpack the secret sauce of what drives “jingle success”, researchers from Bentley University are leveraging an interdisciplinary approach to business analytics that draws from philosophy, music theory, statistics and the emerging field of music analytics to extract key “jingle features” that allow businesses to strategically design jingles for maximum consumer familiarity, our defined measure of success.

  • Joe Dery is a PhD student and Adjunct Lecturer at Bentley University in Waltham, MA where he teaches Customer Data Analysis and Quantitative Methods for Business Decisions. Joe also works full-time as the Director of Decision Sciences for the Enterprise Analytics Group at Dell EMC. During his tenure with Dell EMC, Joe has utilized cross-functional big data, industry-leading technologies, and the newest data science practices to solve some of Dell EMC’s most complex challenges. Joe received his Bachelor’s from Babson College with concentrations in Statistical Modeling and Marketing, his Master’s from Bentley University in Marketing Analytics, and is currently working on his PhD in Business Analytics at Bentley University. His current research interests focus on theories of reference & association with applications in marketing analytics (specifically around Brand Genericide and strategic jingle success).
Ameya Dhaygude

Data Scientist
Dow Chemical Company

Hierarchical Statistical Demand Forecasting at Dow Chemical: Business Challenge, Data, Methods, Tools, and Value

The Dow Chemical Company is one of the world’s leading chemical manufacturers. Dow manufactures plastics, chemicals, and agriculture products. With a presence in over 150 countries, it employs over 50,000 people worldwide. Dow serves customers in different market segments, with a wide variety of products made for each one. Dow’s scale makes demand forecasting a challenge, and the company is using hierarchical statistical demand forecasting to accurately forecast demand for key products and customers. In this talk, you’ll learn about the demand forecasting challenges at Dow, the data and statistical methods/tools used to implement the forecasts, and the value it’s delivering to the company.

  • Ameya Dhaygude has been working with The Dow Chemical Company as a Data Scientist for the past seven years. His focus areas are building scalable statistical and analytical solutions in the Supply Chain and Purchasing domains. Ameya has expertise in time series analysis, operations research, and machine learning methods. He is leading development of data science and analytics tools for the demand planning, inventory management, and logistics management processes at Dow. Ameya has a Master’s in Industrial Engineering from Oklahoma State University.
Ciro Donalek

CTO & Co-Founder
Virtualitics

The Future of Data Analytics: Coupling AI with Immersive Environments

“Our ability to capture data exceeds our ability to make meaning of it” – this is especially true nowadays, when most companies face the challenge of huge amounts of data coming in from different sources. Of course, all this data has no value if we don’t have the ability to extract useful information from it. Turning big and complex data into practical insights requires new ways to analyze and visualize it. People interacting with 3D visualizations while working in the same virtual space will naturally and intuitively discover more valuable insights. Thus, the analytics tools of the future will be AI-driven, immersive, and collaborative by default, to help users gain a deeper level of understanding of the stories told by the raw data.

  • Ciro Donalek is CTO & Co-Founder at Virtualitics. Prior to that, he spent over ten years as a Staff Scientist at Caltech, where he successfully applied machine learning techniques to many different scientific fields, co-authoring over a hundred scientific and technical publications (e.g., Nature, Neural Networks, IEEE Big Data, Bioinformatics). He has also pioneered some of the uses of virtual reality for immersive data visualization and artificial intelligence (work that led to a patent). He has a minor planet named after him in recognition of his work on the automatic classification of celestial bodies, and he was part of the small group that built the Big Picture, the single largest real astronomical image in the world (152 feet wide, 20 feet tall), currently installed at the Griffith Observatory in Los Angeles. Dr. Donalek has a PhD in Computational Science and an MS in Computer Science/Artificial Intelligence.

Ray Ernenwein

Director, e-Commerce Supply Chain Strategy & Analytics
Walmart Inc.

Using E-commerce Site ‘Click’ Data & Machine Learning To Better Understand Stock-out Costs & Improve Inventory Policy

Optimizing a firm’s investment in inventory given the competing forces of stock-out vs. holding costs is a timeless supply chain management problem. For a B2C e-commerce mass merchant like Walmart.com, this challenge is compounded by the great number of SKUs involved, the variety of stocking locations utilized, and the heaps of relevant data available. Accurately measuring the true cost of a stock-out in this environment is particularly complex, as the effects of item substitution, store switching, and lost affinity purchases cannot be measured through traditional in-store customer survey methods. In this presentation we summarize a solution to this challenge that uses large-scale analytics of website traffic and customer ‘click’ data. Specifically, we’ve measured customer purchase behavior during comparable in-stock and out-of-stock time periods on a diverse set of SKUs. Our findings confirm that a stock-out’s economic cost to an internet retailer can be quantified with such empirical data and that these costs vary considerably according to a SKU’s attributes. Moreover, our analytical results have served to train a supervised machine learning algorithm that recommends cost-of-stock-out factors for new items. These findings prove very valuable when informing inventory policy, guiding the internet retailer to carry more stock where the cost of a stock-out is demonstrably high and less inventory where the stock-out cost is lower. This, in turn, allows the company’s precious capital to be allocated so that profits and service levels are maximized in a balanced way.
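A minimal sketch of the two-step idea described above: estimate each SKU’s stock-out cost empirically from comparable in-stock and out-of-stock periods, then train a model to predict that cost factor for new items from their attributes (Walmart’s actual data, features, and algorithm are not public; every name below is an illustrative assumption):

```python
import pandas as pd
from sklearn.ensemble import RandomForestRegressor

# Daily SKU-level web data: sessions that viewed the item, units sold,
# margin per unit, and an in-stock flag. Illustrative schema.
clicks = pd.read_csv("sku_daily_clicks.csv")

def stockout_cost(sku_df):
    """Lost margin per session, from comparable in-stock vs. out-of-stock days."""
    mask = sku_df["in_stock"].astype(bool)
    instock, oos = sku_df[mask], sku_df[~mask]
    if instock.empty or oos.empty:
        return None
    conv_in = instock["units_sold"].sum() / instock["sessions"].sum()
    conv_oos = oos["units_sold"].sum() / oos["sessions"].sum()
    return max(conv_in - conv_oos, 0) * sku_df["margin_per_unit"].mean()

costs = clicks.groupby("sku").apply(stockout_cost).dropna().rename("oos_cost")

# Train a model that predicts the stock-out cost factor from item attributes,
# so new items without out-of-stock history still get a sensible estimate.
items = pd.read_csv("item_attributes.csv").set_index("sku")
train = items.join(costs, how="inner")
features = ["price", "category_code", "substitute_count"]

reg = RandomForestRegressor(n_estimators=200, random_state=0)
reg.fit(train[features], train["oos_cost"])

new_items = pd.read_csv("new_items.csv")
new_items["predicted_oos_cost"] = reg.predict(new_items[features])
```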

  • Ray Ernenwein is Director of e-Commerce Supply Chain Strategy & Analytics at Walmart Inc. In his current role, Ray is directing the use of advanced data science and modeling to deliver asset efficiency, network optimization, and strategic operational improvements in Walmart’s omni-channel value chain. Ray has been a supply chain management professional in the high-tech industry for twenty years. His industry experience includes semiconductors, consumer electronics, and IT hardware. He has held roles in strategic planning, logistics, business operations, and program management. Prior to his career in supply chain management, Ray earned an undergraduate degree in mechanical & aerospace engineering and a graduate business degree, both from Cornell University. He has served as a U.S. Air Force Officer in Japan. Ray has been a member, Vice President, and committee chair on the Board of Directors of the APICS-Supply Chain Council – a global non-profit dedicated to promoting supply chain education, research, and talent development.
Martin Ellingsworth

XR, Exploratory Research, for Property & Casualty Insurance
USAA

OR inside P&C – Sizzling Business Applications of Analytics in Insurance

Meet an information factory where over $600B a year goes into managing the risks of daily living at your work and home. This session will provide an overview on the Insurance Value Chain and a brief history of analytics in property and casualty insurance. Along with actual use cases of synergistic data and modeling approaches, you may find this hidden industry is an active employer of OR skills.

  • Marty Ellingsworth leads XR, Exploratory Research, for Property & Casualty Insurance at USAA. In this role he enjoys the challenges of creating membership value out of data, analytics, and decision support systems. He brings together collaborative efforts across USAA as well as from cutting edge research from adjacent industries, top academic minds, world class companies with synergistic technologies, government laboratories and entities, and the many brilliant professional services firms and vendor solutions in play across customer and insurance journeys.

    Marty has seen service in the Air Force as a scientific and applied research officer, spent nearly a decade in analytics, product, and information leadership in healthcare, and has focused on P&C insurance since 1998, rotating between carriers, startups, vendors, and consulting, doing largely innovation-focused predictive analytics across the enterprise – Underwriting, Claims, Marketing, Operations, and Distribution. He has a passion for customer-centric design over product-specific performance, but is savvy to the needs of the businesses he has run and supports.

    A career-long member of INFORMS, Marty graduated with academic distinction from the United States Air Force Academy in 1984 with a BS in Operations Research and earned his MS in the same field in 1988.

Europcar

Europcar Integrates Forecast, Simulation and Optimization Techniques in a Capacity and Revenue Management System

Europcar, the leading European car rental company, partnered with ACT Operations Research to create Opticar, a complex decision support system. Opticar combines forecasting, discrete-event simulation, and optimization techniques, providing an integrated approach to revenue and capacity management. Opticar anticipates future demand for Europcar’s vehicle fleet up to six months in advance, improving capacity management. In addition, Opticar enables Europcar to optimize its approach to revenue management and rental pricing, taking into account competitor information, the currently available fleet, and expected demand for vehicles. Opticar provides a shared mathematical approach used as a starting point for all daily operations in Europcar’s nine corporate countries.

Federal Communications Commission

Unlocking the Beachfront: Using Operations Research to Repurpose Wireless Spectrum

The Federal Communications Commission (FCC) recently completed the world’s first two-sided spectrum auction, reclaiming spectrum from TV broadcasters to meet exploding demand for wireless services. Operations research tools – including optimization, simulation, and SAT solvers – were essential to both the design and implementation of the auction. The auction was one of the most successful in the FCC’s history, repurposing 84 MHz of spectrum and generating revenue of nearly $20 billion, including more than $10 billion in new capital for the broadcast TV industry and over $7 billion to pay down the U.S. deficit.

Michael Feindt

Blue Yonder

Using Artificial Intelligence to Optimize Retail Operations

AI is now being used to transform the retail industry, enabling retailers to be better positioned to provide customers with the experience that they both demand and expect. Technology advancements enable machines to make billions of automated everyday decisions in replenishment and pricing, reducing the risk of human error or bias and ensuring that decisions are made objectively and in real time. Using demand forecasts in the form of probability distributions and linking them with business goals, stock levels, and delivery schedules, AI-based solutions can calculate optimal orders for each product and location. Through standardized interfaces, orders can be integrated directly into ERP systems. Additionally, fallback orders are provided to support operational safety margins (e.g., in case of a temporary data loss or ERP downtime). As a result, machine learning algorithms allow replenishment to run with the highest degree of automation possible. The presentation will show you new approaches with AI, retail-specific use cases, and real customer cases. It should inspire you to see how AI can help you work smarter.
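A minimal sketch of how a probabilistic demand forecast can be turned into an order quantity, using the classic newsvendor critical-ratio rule (a textbook illustration, not Blue Yonder’s algorithm; the demand distribution and cost figures are invented):

```python
import numpy as np

rng = np.random.default_rng(1)

# Probabilistic demand forecast for one product/store/day, represented as
# Monte Carlo samples (e.g., drawn from a fitted predictive distribution).
demand_samples = rng.negative_binomial(n=8, p=0.4, size=10_000)

# Business economics (illustrative): margin lost per unit of unmet demand
# vs. cost of holding or wasting one unsold unit.
underage_cost = 1.80   # lost margin per missed sale
overage_cost = 0.40    # cost per leftover unit

# Newsvendor rule: order the demand quantile at the critical ratio.
critical_ratio = underage_cost / (underage_cost + overage_cost)
order_quantity = int(np.quantile(demand_samples, critical_ratio))

print(f"critical ratio = {critical_ratio:.2f}, order {order_quantity} units")
```

The critical ratio pushes the order toward higher service levels when the margin lost on a missed sale outweighs the cost of leftover stock, which is how a distributional forecast and business goals combine into a single replenishment decision.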

  • Prof. Dr. Michael Feindt is the brain behind Blue Yonder, market leaders in AI in retail. Blue Yonder is powered by Michael’s NeuroBayes algorithm, developed during his many years of scientific research at CERN, which enables retailers to automate complex decisions across the entire value chain. With AI embedded into their supply chain and merchandising processes, retailers can respond quicker to changing market conditions and customer dynamics, boosting revenues and increasing margins.
Gary Godding

Principal Engineer
Intel

Supply Chain Analytics at Intel

Intel’s vision of building a smart and connected world through the virtuous cycle of growth enables an environment where computing is mobile and ubiquitous and the depth and power of analytics are rapidly growing. Intel has advanced analytics teams within its supply chain organizations that enable areas such as advanced visualization to help supply chain operations, cognitive computing for better market intelligence, and advanced simulation for supply/demand analysis. We also show how these learnings can be transferred from traditional supply chain areas to domains such as construction. Finally, to enable these types of capabilities, you need to build a team with a diverse set of skills and be able to combine them effectively to come up with the most robust solutions.

  • Gary is currently a Principal Engineer within Global Supply Management. He provides technical leadership for the analytics team and helps deliver advanced analytics solutions across Intel’s supply chain, with a current focus on the construction supply chain, end-to-end supply chain modeling, and AI technologies that enable the next level of autonomous operations. Gary has a PhD in Computer Science from ASU. He has held a variety of positions at Intel over the past 25 years, including factory automation, scheduling/dispatching, advanced analytics, simulation, factory planning, and supply chain.
Yael Grushka-Cockayne

Associate Professor
Darden School of Business

Forecasting Airport Transfer Passenger Flow Using Machine Learning and Real-time Data

Passengers missing their connection at an airport can have a major impact on passenger satisfaction and airline delays. Accurate forecasts of the flow of passengers and their journey times through an airport can help improve the experience of connecting passengers and support airline, airport, and air space punctuality. In collaboration with the Airport Operation Centre at London Heathrow Airport, we utilize real-time data to develop a predictive system based on a regression tree and Copula-based simulations. The system generates three outputs: (1) mean and quantiles of passengers’ journey times through the airport; (2) expected number of late passengers for each outbound flight; (3) mean and quantiles of the transfer passenger arrivals at the immigration and security areas. These real-time predictions can be used to inform target off-block time adjustments and determine resourcing levels at security and immigration. Our predictive system has been implemented at Heathrow airport, one of the busiest airports in the world handling more than 75 million passengers per year, with more than a quarter of all passengers making a flight transfer.
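A minimal sketch of the flavor of model described above: a regression tree for journey times, with empirical quantiles read off within each leaf (the Heathrow system’s actual features, copula structure, and data are not public; the file names, columns, and settings here are assumptions):

```python
import numpy as np
import pandas as pd
from sklearn.tree import DecisionTreeRegressor

# Historical transfer-passenger records with the observed journey time
# (minutes) from arrival gate to the departure process. File name and the
# numeric feature columns are illustrative assumptions.
hist = pd.read_csv("transfer_journeys.csv")
features = ["scheduled_connect_min", "arrival_delay_min", "hour_of_day",
            "walk_distance_m", "bags_checked"]

tree = DecisionTreeRegressor(min_samples_leaf=200, random_state=0)
tree.fit(hist[features], hist["journey_minutes"])

# Conditional quantiles: pool the training journey times that land in the
# same leaf as a new passenger and read off empirical quantiles.
hist["leaf"] = tree.apply(hist[features])

def journey_quantiles(passengers, q=(0.5, 0.8, 0.95)):
    leaves = tree.apply(passengers[features])
    return np.array([
        np.quantile(hist.loc[hist["leaf"] == leaf, "journey_minutes"], q)
        for leaf in leaves
    ])

# Passengers currently arriving (a real-time feed in the deployed system).
arriving = pd.read_csv("arriving_transfers_now.csv")
print(journey_quantiles(arriving.head(3)))
```

The deployed system described in the abstract goes further, presumably using copula-based simulation to capture dependence across passengers so that flight-level outputs, such as the expected number of late passengers per outbound flight, reflect correlated delays; that step is omitted in this sketch.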

  • Associate Professor Yael Grushka-Cockayne’s research and teaching activities focus on data science, forecasting, project management, and behavioral decision making. Her research is published in numerous academic and professional journals, and she is a regular speaker at international conferences in the areas of decision analysis, project management, and management science. She is also an award-winning teacher, winning the Darden Morton Leadership Faculty Award in 2011, the University of Virginia’s Mead-Colley Award in 2012, and the Darden Outstanding Faculty Award and Faculty Diversity Award in 2013. In 2015 she won the University of Virginia All University Teaching Award. Yael teaches the core Decision Analysis course, an elective she designed on Project Management, and an elective on Data Science. She is the leader of the open enrollment courses “Project Management for Executives” and “The Women’s Leadership Program”. Before starting her academic career, she worked in San Francisco as a marketing director of an Israeli ERP company. As an expert in project management, she has served as a consultant to international firms in the aerospace and pharma industries. She is a UVA Excellence in Diversity fellow and a member of INFORMS, the Decision Analysis Society, the Operational Research Society, and the Project Management Institute (PMI). She is an Associate Editor at Management Science and Operations Research and the Secretary/Treasurer of the INFORMS Decision Analysis Society. In 2014, Yael was named one of “21 Thought-Leader Professors” in Data Science. Yael’s recent “Fundamentals of Project Planning and Management” Coursera MOOC had over 100,000 learners enrolled across 200 countries worldwide.
steve hamlen image
Steven Hamlen

Managing Director
KROMITE consulting

Influencing The New Technology Adoption Curve Through Strategic Decision Making – A Pharmaceutical Drug Development Case Study For Cancer

Launching or repositioning any new technology platform must answer three key questions, regardless of the industry:

  1. Where should the innovation first be positioned in the market?
  2. How should the innovation be further developed and positioned over the short and long term to maximize its value?
  3. What actions can enhance the value of the innovation or mitigate risk?

Whether you are looking to develop a new drug treatment or an alternative energy supply, provide a new software platform, or convert the market from fossil-fuel to electric transportation, answering these questions is essential to your business.

Deciding the sequence in which a technology is applied in any market is critical. Initial decisions can either enable or inhibit an asset from reaching its full market potential.

This presentation will provide a methodology to answer these key questions utilizing a case study from pharmaceutical drug development, sequencing the potential diseases a new drug could treat. The discussion will then expand into how this strategic decision making methodology applies across all new technology introductions, regardless of industry.

  • Steve Hamlen is a managing director at KROMITE consulting. He has over 20 years of pharmaceutical and life sciences commercial assessment and marketing management experience including in-line Brand Management, Global Strategic Marketing, and Global Commercial Assessment roles. His experience includes identification and optimization of investments for life cycle strategies, creative in-line marketing campaigns, indication sequencing of internal pipeline investments and external licensing opportunities to maximize business value.

    In these roles, Steve has led and translated development, commercial and regulatory strategies into actionable investment decisions for numerous established and pipeline products, some of which were over $1 billion/year in sales.

    Steve has a BS in Chemical Engineering and an MBA in marketing from Lehigh University, and has multiple publications. Prior to KROMITE Consulting, he has held senior positions at Johnson & Johnson, Catalent Pharma Solutions, and Roche.

kristian hammond image
Kristian Hammond

Co-Founder
Narrative Science

Communicating with the New Machine: Human Insight at Machine Scale

The world of data analytics is at an inflection point. We have crafted a rich collection of methods for gathering, managing, and analyzing massive data sets in business, government, public policy, and our day-to-day lives. On top of this, our analytic capabilities make it possible for us to discover powerful correlations, trends, and predictions in the data we have. And, more recently, the rise of machine learning has given us even greater power to mine our data for information.
But this is not the end of the game. The numbers alone simply do not provide us with what we really need: information and insight. The data are only the first step in making these insights available and useful to the decision makers who need them.
In this talk, I will outline where we are with regard to data analytics and how the technology of automatic narrative generation from data plays the crucial role of bridging the gap between the Big Data world of numbers and symbols and our need for understandable insights. I will dive into use cases from business, education, and everyday life to show how automatically generated narratives can provide us all with the insights that are still trapped in the wealth of data we now control.

  • Kristian Hammond is a professor of computer science at Northwestern University and co-founder of the artificial intelligence company Narrative Science. At Narrative Science, Kris works on Natural Language Generation (NLG) and its use in the democratization of information. At Northwestern, he is focused on the development of models of artificial intelligence in which human and machine reasoning are integrated to make the best use of each, with the aim of an interaction that is better than either alone. Since the fall of 2016, he has been the faculty lead of the University’s CS + X initiative, where he has been exploring how computational thinking can be used to transform fields such as the law, medicine, and education. Most recently, he has been leading the development of Northwestern University’s new Master of Science in Artificial Intelligence program.

    He believes in humanizing computers so we can stop the process of mechanizing people. Kris received his PhD from Yale.

warren hearnes image
Warren Hearnes

VP of Analytics & Data Science
Cardlytics

Using Purchase Intelligence To Target And Measure Advertising Campaigns

Cardlytics helps make marketing more relevant and measurable, enabling advertisers to make smarter business decisions and more meaningful customer connections. We partner with more than 2,000 financial institutions (FIs) to run their online and mobile banking rewards programs, which gives us a robust view into where and when consumers are spending their money. We see debit, credit, automated clearing house (ACH), and online bill pay spend for tens of millions of individual consumers in the US and UK, without sharing any personally identifiable information. Cardlytics drives value for three important audiences. First, our solutions offer actionable insights and allow marketers to identify, reach, and influence likely buyers at scale and measure the true online and in-store impact of marketing campaigns. Second, financial institutions bring their online and mobile banking customers a targeted rewards program to help them save on the things they like to buy, increasing engagement, loyalty, and revenue from their customer base. Third, consumers receive relevant advertising through their bank rewards programs and earn cash back on the things they like to buy. During this talk, we discuss approaches to use purchase intelligence in all parts of the process to plan for a measurable campaign, identify customers to target, and calculate the impact of the campaign on customers’ purchase behavior. In our pay-for-performance network, the analytics and data science team needs to accurately predict the budget required to cover all rewards and fees in an advertiser’s campaign, sometimes weeks or months prior to the campaign launching. We also use simulation of past engagement and purchase behavior to determine control group size or create a synthetic control group for campaigns not run on our platform. Once a campaign is running, we use various forecasting techniques to predict whether we are tracking to the budgeted metrics, making changes as needed. Lastly, once a campaign is over, we can run standard reporting, measure incremental return on advertising spend, and produce other business insights on the effectiveness of the advertising.
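
One step of the measurement described above, sketched under simplified assumptions (this is not Cardlytics' proprietary methodology): incremental return on advertising spend estimated by comparing targeted customers' spend against a control group of similar, untargeted customers. The data and campaign cost are made up.

    import numpy as np

    def incremental_roas(test_spend, control_spend, campaign_cost):
        """Incremental sales per advertising dollar, test group vs. control group."""
        lift_per_customer = test_spend.mean() - control_spend.mean()
        return lift_per_customer * len(test_spend) / campaign_cost

    rng = np.random.default_rng(2)
    test = rng.gamma(2.0, 30.0, 50_000) * 1.05     # targeted customers spend ~5% more
    control = rng.gamma(2.0, 30.0, 50_000)         # matched customers not shown the offer
    print(round(incremental_roas(test, control, campaign_cost=40_000.0), 2))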

  • Warren Hearnes is an analytics and data science leader with over 20 years of experience. He is currently VP of Analytics & Data Science at Cardlytics, an Atlanta-based advertising and technology company. Cardlytics partners with more than 2,000 financial institutions to run their online and mobile banking rewards programs and provides solutions, allowing marketers to reach likely buyers at scale across media channels. Warren leads a team of 30+ analysts and data scientists that generates insights for banking and advertising partners using Tableau, SAS, R, and Python on both Vertica and Hadoop. Prior to Cardlytics, Warren held progressive analytics and data science roles in Atlanta. Most recently he led the Marketing Sciences team at The Home Depot, responsible for the insights and analytics of their direct mail and e-mail campaigns. Other previous roles include creating fraud detection, optimization, and pricing models in the Data Mining & Advanced Analytics group at UPS, and developing mixed-integer programming models to manage fiber inventory as a Member of Technical Staff at Lucent Technologies. Warren earned his BS in Mathematics from the United States Military Academy at West Point, and his MS in Operations Research and PhD in Industrial Engineering degrees from Georgia Tech. His academic research in machine learning combined the areas of dynamic programming and fuzzy sets to create reinforcement learning controllers for robotic systems. His work on fuzzy models for asset replacement won the 1999 Best Transactions Paper award from the IEEE Systems, Man, and Cybernetics Society.

mary helander image
Mary Helander

IBM TJ Watson Research Center

Findings from Modeling and Analyzing a Non-profit Organization’s Emergency Food and Community Social Service Operations

In 2017, IBM Research partnered with St. John’s Bread & Life (SJBL), a non-profit agency working to alleviate hunger and poverty in Brooklyn and Queens, in a year-long non-commercial endeavor to model and analyze SJBL’s supply chain and services operations. Completed in December 2017, the project involved direct observation, hands-on participation, mining and pattern analysis of more than 4 years of detailed historical data comprising millions of event records, development of a discrete event simulation of the SJBL supply and services chains, and stochastic inventory optimization for the entire warehouse supporting SJBL’s digital food pantry. The project objectives were to explore and help address various challenges of SJBL’s non-profit mission, namely SJBL’s efforts to educate the general public, potential donors, and granting agencies about the patterns and workings of its operations, and the evaluation of its operational performance. This presentation will describe the data science used to reveal product and client demand and supply patterns, as well as the unique and sometimes surprising experiences of applying OR/MS analytics within a non-profit operation. A key finding from this study was coined “the non-profit self-optimization phenomenon”: that the combination of a mission to meet a steady and voluminous basic human need (i.e., hunger among the urban impoverished) with an extremely constrained resource condition (i.e., an operating budget limited by private fundraising and grant-giving foundations) is the likely explanation for operational efficiencies (evidenced by metrics such as inventory turnover, service level attainment, and minimized cost of working capital) that rival those of “for-profit” organizations. The members of the INFORMS Analytics community most likely to be interested in this talk are those focused on applying MS/OR/analytics modeling and analysis to real-world problems with social implications. Although performed by OR/MS research practitioners from IBM, the work was not a commercial project and this talk is not intended to “sell” any IBM product or service. The work is innovative in its treatment and operations analysis of an organization in an understudied industry, the non-profit and voluntary sector. Further, this appears to be the first documented practice case to provide a comprehensive comparison of a non-profit’s efficiency with published benchmarks from “like” for-profits. The technical work is adapted from rigorous applied models and methods developed by IBM Research for McKesson’s wholesale supply chain, published in Interfaces in Jan/Feb 2014. The speaker has over 30 years of experience in teaching, researching, and especially applying OR/MS and analytics in practice. She is a Franz Edelman Laureate (class of 2013) and the author of Chapter 5, “Solution Methodologies,” of the INFORMS Analytics Body of Knowledge (ABOK) book, edited by James Cochran and aimed at the practitioner audience. A few of the examples in the ABOK chapter were derived from the SJBL project.
Some recent invited speaking experiences include:

  • Invited key speaker, October 21, 2017: “Ties That Bind and Bridges to Nowhere.” Joint Meeting of the Seaway Section of the Mathematical Association of America (MAA) and the New York State Mathematics Association of Two-Year Colleges (NYSMATYC), 2017 Fall Meeting, SUNY Broome, Binghamton, NY
  • Invited speaker, September 9, 2017: “Issues and Consequences of Using Data Science to Detect Patterns of Bias.” 4th Annual Lesbians Who Tech + Allies, New York University Law School, New York, NY
  • Invited speaker, April 19, 2016: “Application of Statistics to Mycotoxin Testing.” Mars Global Food Safety Center (GFSC) Science Symposium, Beijing, China
  • Invited speaker, November 22, 2013: “OR/MS in Practice.” University of Massachusetts INFORMS Student Chapter, Amherst, MA
  • Invited speaker, June 2, 2009: “Technology Vision in Traceability.” Trace R&D Conference, University of Manitoba, Winnipeg, Manitoba, Canada
  • Invited speaker, September 18, 2009: “Food Safety in a Global Supply Chain.” University of Massachusetts, lecture sponsored by the Isenberg School of Management and the School of Public Policy, Amherst, MA

  • Dr. Mary Helander is a computer scientist with expertise in network algorithms, discrete event transaction processing and stochastic simulation, as well as related data structures and OR problem solving. She received a B.A. (CIS; Math) from SUNY Potsdam, and M.S. and Ph.D. (Industrial Engineering & Operations Research) from Syracuse University and the University at Buffalo respectively. Prior to her current position as a senior research scientist and software architect in IBM Research, Mary was a professor (Dept. of Computer and Information Science) and research fellow (Dept. of Mechanical Engineering/Quality Technology) at Linköping University, in Sweden. She has researched and worked on algorithms to study HIV transmission dynamics; hazardous material routing; geographically distributed team coordination; the intrinsic value of administrative assistants in the workforce; sustainability and greenhouse gas emission minimization; tracking food ingredients; testing raw materials for contaminants; and a few other more standard topics in transportation, manufacturing, and distribution.

    Throughout her career, Mary has sought out research topics that leverage computer science and math to solve difficult problems. Her current project work involves modeling and revealing best practices of emergency food aid and assistance to the impoverished, a partnership between the IBM Research’s Science for Social Good Program and St. John Bread and Life, a non-profit agency working to alleviate hunger and poverty in Brooklyn and Queens. Mary has published in the Journal of Food Protection, Interfaces, Networks, Transportation Science, IBM Journal of R&D, IEEE Transactions on Software Engineering, Software Quality Journal, Empirical Software Engineering, Journal of Revenue and Pricing Management, Computers and Operations Research and Computers and Industrial Engineering. She is an INFORMS Franz Edelman Laureate, as well as the 2013 recipient of SUNY Potsdam Alumni Association’s Minerva Award–the highest honor that a SUNY Potsdam graduate can receive from the association.

    A native of central New York and graduate of John C. Birdlebough High School in Phoenix, in her free time, Mary is a percussionist in the Lesbian & Gay Big Apple Corps Band, in New York City, and plays competitive tennis on various USTA league teams.

ibm logo image
IBM

Analytics to Reduce Costs and Improve Quality in Wastewater Treatment

Wastewater treatment is carried out in a complex sequence of steps, in which the wastewater is treated by means of biological, physical, and chemical processes. Today, plants are often operated in a conservative and inefficient risk-averse mode, without the ability to quantify the risk or truly minimize costs. An innovative operational control system was developed, combining descriptive analytics (historical data analysis for simulation model design and plant state estimation), predictive analytics (wastewater process behavior modeled by a transition probability matrix), and prescriptive analytics (a Markov Decision Process). The system was deployed in Lleida (Spain). Use of the system resulted in a dramatic 13.5 percent overall reduction in the plant’s electricity consumption, a 14 percent reduction in the amount of chemicals needed to remove phosphorus from the water, and a 17 percent reduction in sludge production.
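
The prescriptive layer named above is a Markov Decision Process. The sketch below shows only the generic value-iteration recursion on a toy transition matrix; the states, actions, and costs are invented and bear no relation to the Lleida plant model.

    import numpy as np

    # toy MDP: 3 plant states x 2 actions (e.g. low / high treatment intensity)
    P = np.array([[[0.8, 0.2, 0.0], [0.6, 0.3, 0.1]],      # P[s, a, s']
                  [[0.3, 0.6, 0.1], [0.1, 0.7, 0.2]],
                  [[0.1, 0.4, 0.5], [0.0, 0.3, 0.7]]])
    cost = np.array([[5.0, 9.0],                            # energy/chemical cost of (s, a)
                     [6.0, 8.0],
                     [12.0, 10.0]])
    gamma = 0.95                                            # discount factor

    V = np.zeros(3)
    for _ in range(500):                                    # value iteration
        Q = cost + gamma * (P @ V)                          # Q[s, a]
        V = Q.min(axis=1)
    print(V.round(2), Q.argmin(axis=1))                     # values and cheapest action per state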

ibm logo image
IBM

MPA SAFER System

With the expected increase in vessel traffic and port capacity, the Singapore Maritime and Port Authority (MPA) has been working to ensure that the future Port of Singapore is safe, secure, efficient, and sustainable. Project SAFER, “Sense-making Analytics For maritime Event Response”, is an important component in this effort. A collaboration between MPA and IBM Research, Project SAFER aims to design and develop new analytics capabilities for dramatically increasing the efficiency of maritime operations. The system uses novel cognitive-based analytics leveraging machine learning and entity resolution to provide full situational awareness capability, accurate prediction, and intelligence for improving maritime decision-making. Using the SAFER machine-learning-based analytics and vessel prediction models, abnormal and suspicious behaviour is instantly discovered. Based on the extent to which the observed activity of individual or multiple interacting vessels deviates from the modelled behaviour, the event is instantly geo-localized and sent in the form of an alert. MPA can thus address infringements across all 1,000 vessels in real time. The SAFER system’s automated movement detection leads to a significant accuracy improvement of 34%. Vessel movement information is needed not only for ensuring safety and security but also for many other functions, including billing: the accuracy improvement achieved by the SAFER system thus has direct implications for revenue and for reducing disputes.

dan Iliescu image
Dan Iliescu

Senior Director of Operations Research
Revenue Analytics

Revenue Management And Pricing Analytics On The Cloud: Innovative Revenue Management Applications That Drive Organic Revenue Uplift For Your Organization

The application of pricing and revenue management principles from traditional on-premises revenue management systems to cloud-hosted services is a big transformation for any organization. To capture the full potential of cloud-based applications and generate organic revenue growth opportunities for your company: build and integrate cloud capabilities one piece at a time with a firm that is expert in cloud application development, strike the right balance between the frequency and timing of required computing capacity and data storage needs, select modelling techniques and algorithms that scale well on cloud-based architectures, and establish control parameters and business rules to ensure the outcome is in line with corporate objectives and well positioned to gain user acceptance. This presentation will give a high-level overview of the evolution of cloud-hosted services and recent developments in this area. It will also highlight the challenges associated with data integration between on-premises and cloud applications, and compare core predictive analytics and modeling techniques between cloud-hosted services and on-premises systems. Finally, this presentation will outline successful cloud application business cases, while providing key best practices and takeaways that should be considered when developing cloud-based solutions.

  • Dr. Dan Iliescu is a Senior Director of Operations Research at Revenue Analytics. His work with the firm covers the development and testing of large-scale customer-centric pricing, yield management, and forecasting solutions. He has particular expertise in statistical modeling (regression, customer behavioral segmentation, time series forecasting, and time-to-event models), applied mathematical programming (resource allocation and pricing optimization problems), and market response models.

    Prior to joining Revenue Analytics, Dan’s research focused on exploring determinants of airline passenger cancellation behavior using time-to-event models and quantifying the impact of survival forecasts on airline revenue streams. As part of his graduate studies, he conducted extensive data analysis for Boeing Commercial Airplanes and Airline Reporting Corp.

    In his current role, Dan has experience with multiple industries including pharmaceutical, consumer packaged goods, travel, hospitality, and automotive and has presented to INFORMS and Revenue Management and Price Optimization Conferences on topics related to the implementation of optimization and forecasting algorithms. Dan holds a M.S. in Civil Engineering (Transportation) from University of Maryland, and a Ph.D. in Civil and Environmental Engineering from the Georgia Institute of Technology.

noyan ilk image
Noyan Ilk

Assistant Professor of Business Analytics in the College of Business
Florida State University

A Text Analytic Approach to Match Customer Inquiries with Agent Specialties in Online Service Centers

Customer–service agent mismatch is a common problem in many service centers, leading to service rework (i.e., customer transfers), operational waste, and customer dissatisfaction that collectively cost firms millions of dollars each year. We propose a text-analytic framework that leverages problem description texts to improve customer routing accuracy and reduce transfer rates in online service centers. Grounded in computational linguistics and machine learning methods, the framework helps extract signal cues from text that can be used as modeling inputs to identify the true nature of a problem. To demonstrate the usefulness of the framework, we conduct a comprehensive case study on a data set collected from an S&P 500 company. Our results indicate a 19% accuracy improvement from the framework over menu-based routing. To assess the broader managerial implications of this improvement, we estimate potential reductions in agent service time and customer waiting time, and increases in customer satisfaction.
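
A minimal sketch of the kind of text-based routing model the framework builds on (the vectorizer, classifier, and toy examples below are illustrative stand-ins, not the authors' exact pipeline): cues extracted from the problem description feed a classifier that predicts the matching agent specialty.

    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.linear_model import LogisticRegression
    from sklearn.pipeline import make_pipeline

    # tiny invented training set: problem description -> agent specialty
    texts = ["my bill is higher than last month",
             "cannot connect my router to wifi",
             "upgrade my plan to add another line",
             "internet keeps dropping every evening",
             "charged twice for the same order",
             "want to add international calling"]
    labels = ["billing", "tech_support", "sales", "tech_support", "billing", "sales"]

    router = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)),
                           LogisticRegression(max_iter=1000))
    router.fit(texts, labels)
    print(router.predict(["why was my card charged two times"]))  # likely 'billing' on this toy data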

  • Noyan Ilk is an assistant professor of Business Analytics in the College of Business at Florida State University. His research addresses analytics problems at the intersection of the service operations and information systems domains. Specifically, he seeks to develop novel methods and policies to effectively manage service systems in electronic mediums. Noyan has taught courses in business analytics, business intelligence, and operations management at both the undergraduate and graduate levels.

intel logo image
Intel

Analytics Makes Inventory Planning A Lights-Out Activity at Intel Corporation

Intel, which employs more than 100,000 people in over 70 countries around the world and has an annual revenue of $60 billion, implemented a fully automated Multi-Echelon Inventory Optimization (MEIO) based inventory target-setting system, managing $1 billion in finished goods inventory daily and representing over $40B a year in sales. Algorithm-derived inventory targets at Intel are accepted by planners more than 99.5 percent of the time and have simultaneously driven higher customer service and lower inventory levels, resulting in over $1.3B in gross profit since 2014. In addition, customers are delighted: since MEIO was implemented at all of Intel’s vendor-managed inventory hubs in 2012, customer satisfaction has never been higher and Intel has landed in the top 10 of Gartner’s Supply Chain Top 25 every year. Faculty in the Department of Business Analytics and Statistics at the University of Tennessee, Knoxville, and the supply chain software company Logility also contributed to this project.
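
The target-setting logic in such a system can be illustrated, very roughly and not as Intel's actual multi-echelon formulation, with a single-echelon order-up-to target: safety stock sized for demand variability over the replenishment lead time at a chosen service level. All inputs below are invented.

    from math import sqrt
    from statistics import NormalDist

    def base_stock_target(mean_daily_demand, sd_daily_demand, lead_time_days, service_level):
        """Order-up-to target = expected lead-time demand + safety stock (normal approximation)."""
        z = NormalDist().inv_cdf(service_level)
        lead_time_demand = mean_daily_demand * lead_time_days
        safety_stock = z * sd_daily_demand * sqrt(lead_time_days)
        return lead_time_demand + safety_stock

    # illustrative numbers for one SKU at one vendor managed inventory hub
    print(round(base_stock_target(mean_daily_demand=900, sd_daily_demand=250,
                                  lead_time_days=10, service_level=0.98)))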

christopher jerde image
Christopher Jerde

Senior Associate, Senior Portfolio Analyst and Data Scientist
Gensler

Application of Data Science in Corporate Real Estate and Workplace Design

Within the architecture industry, workplace planning has typically been a guessing game, where space requirements are determined based on headcount levels and growth. Data streams – such as badge security data, device pairing data, and occupancy sensor data – coupled with readily available statistical techniques in hedonic regression and machine learning, allow CRE professionals to plan with much greater efficiency and precision. This planning can manifest in an improved end-user experience and bottom-line savings for an organization. My presentation will focus on an approach to collecting and exploring proprietary client data, merging it with other data sources, leveraging statistics to identify correlations, and applying findings to real estate strategies. I will present two real-world applications where data science transformed real estate strategy for clients, both Fortune 10 technology companies. Finally, I will address future initiatives in leveraging data science for CRE.

  • As a data scientist, Chris excels in exploring how data can inform design of the built environment. He believes that data, big and small, pulled from multiple sources, can provide valuable and often unexpected insight into project programming (upstream) and performance (downstream). One of Chris’s most recent projects entails working with data scientists at Microsoft on leveraging big data and machine learning to optimize workplace programming. Using a variety of advanced statistical techniques and experimentation, Chris made a significant impact on how Microsoft approaches its global portfolio planning by developing data-driven strategies that have saved the company millions of dollars annually (by client estimation). Chris is also involved in many research initiatives at Gensler, including the study of rent premium of signature architecture firms, the use of big data to inform workplace satisfaction, and the application of sensor technology in the workplace. His formal training in financial analysis and statistics allows him to add client value through data exploration of multiple data streams, and often leads him and his clients to exciting new ways to unlock hidden value embedded in their data.

anssi kaki image
Anssi Käki

Manager, Advanced Analytics
UPM-Kymmene Corp.

Enabling State-of-the-art Time-series Forecasting for Everyone

Regardless of company size and industry, time-series forecasting is a cornerstone of many business processes, from procurement and sales & operations planning to budgeting, financial planning, and strategy. In statistical forecasting, proprietary forecasting software has recently been challenged by open source tools developed for Python or R by academics, individuals, and companies alike. But taking these new tools into operational use is not easy: business analysts and planners do not have the required programming skills, and corporate IT functions are not always supportive of open source tools. In this talk, I will outline an all-purpose forecasting system that was built in a corporate environment in just a few months and at close to zero technology cost. The new system gives everyone in the company easy access to both traditional statistical forecasting tools, such as ARIMA and state space models, and machine learning methods tailored for time series forecasting. It is in use in various business processes around the company, and the talk will outline the key success factors of the project. After presenting the forecasting system, more general remarks are made about the role of technology in analytics and how the implementation of Operations Research based tools is changing due to new technologies. These insights are relevant also to less tech-savvy companies, as the learnings come mostly from a 150-year-old pulp and paper producer.
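
A sketch of the kind of thin wrapper such a system can expose so planners never touch modeling code; the function name, defaults, and toy series are hypothetical, and the underlying pieces are standard open source components (pandas and statsmodels).

    import numpy as np
    import pandas as pd
    from statsmodels.tsa.statespace.sarimax import SARIMAX

    def forecast(series: pd.Series, horizon: int = 12, order=(1, 1, 1),
                 seasonal_order=(0, 1, 1, 12)) -> pd.DataFrame:
        """Fit a seasonal ARIMA and return point forecasts with 95% intervals."""
        model = SARIMAX(series, order=order, seasonal_order=seasonal_order).fit(disp=False)
        pred = model.get_forecast(horizon)
        out = pred.conf_int(alpha=0.05)
        out["forecast"] = pred.predicted_mean
        return out

    # toy monthly demand series with trend, seasonality, and noise
    rng = np.random.default_rng(3)
    idx = pd.date_range("2014-01-01", periods=48, freq="MS")
    y = pd.Series(100 + 0.5 * np.arange(48)
                  + 10 * np.sin(np.arange(48) * 2 * np.pi / 12)
                  + rng.normal(0, 2, 48), index=idx)
    print(forecast(y, horizon=6).round(1))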

  • Anssi Käki manages a team dedicated to analytics and operations research at Finland-based UPM-Kymmene Corporation. At UPM, he has been working on production planning, scheduling, material flow optimization, forecasting, and market bidding, for instance. Before joining UPM, Anssi worked in supply chain consulting projects related to logistics process management, supply chain planning and demand forecasting in consumer electronics and technical wholesales industries. He has a D.Sc. degree in Operations Research from the Aalto University. His research has been published in scientific journals such as Journal of Business Logistics, IEEE Transactions on Engineering Management, and Energy.

Ity Kanoria image
Ity Kanoria

Data Scientist
Hewlett Packard Enterprise

Winning Small and Medium Businesses Through Digital Insights

Today’s Small and Medium Businesses (SMBs) are more technologically cognizant than ever. It has become critical for vendors to build a detailed understanding of the small and midmarket segments and to align resources and strategies to cater to the needs of this fast-growing market. There are many inherent challenges in doing this: short-lived purchase cycles, wide variations in buying patterns, and the large size of the SMB market. These make traditional targeting methods difficult to apply, and there is a definite need for a more dynamic and innovative insights system. We introduced the digital footprint as a differentiator to gain insight into a customer’s intent to buy. Digital intent is a score of customer activity on Hpe.com and third-party websites. We collect and curate this data, map these activities to 3,000+ technology topics, and map them back to customers. This insight based on a customer’s intent to buy is termed a “Digital Lead”. Further, we overlay historical and firmographic information about customers using Account Scoring (AS) and Lifetime Value (LTV) predictive models. Combining both, we provide a robust recommendation for account targeting called the “Propensity To Buy” (PTB) models. Our prioritization into Priority 1 (P1), Priority 2 (P2), and Priority 3 (P3) this year has led Inside Sales Representatives (ISRs) worldwide (WW) to overachieve their significantly enhanced revenue target by 110% even before the end of the third quarter. Additionally, the average deal size in the top segment, P1, has increased by 127% over the last year.
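
How a propensity-to-buy score might combine a digital-intent signal with account-level features can be sketched as below; the features, model choice, and cut-offs are illustrative assumptions, not HPE's production PTB models.

    import numpy as np
    from sklearn.ensemble import GradientBoostingClassifier

    rng = np.random.default_rng(4)
    n = 3_000
    # invented account features: digital intent score, account score (AS), lifetime value (LTV)
    X = np.column_stack([rng.uniform(0, 1, n),
                         rng.uniform(0, 1, n),
                         rng.lognormal(10, 1, n)])
    bought = (rng.uniform(0, 1, n) < 0.15 + 0.5 * X[:, 0] * X[:, 1]).astype(int)

    ptb = GradientBoostingClassifier().fit(X, bought)
    scores = ptb.predict_proba(X)[:, 1]

    # bucket accounts into P1 / P2 / P3 by score tertiles for sales prioritization
    hi, lo = np.quantile(scores, [2 / 3, 1 / 3])
    priority = np.where(scores >= hi, "P1", np.where(scores >= lo, "P2", "P3"))
    print(dict(zip(*np.unique(priority, return_counts=True))))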

  • Ity Kanoria is a data scientist with Hewlett Packard Enterprise and holds a master’s degree in Quantitative Economics (MSQE) from the Indian Statistical Institute (ISI). Ity has 4+ years of experience in the CPG domain and analytics, is adept at machine learning, and was adjudged one of the top 6 speakers on the main stage at Parasparam, HPE’s Indian technology conference, after being selected from 300 paper submissions and a round of poster booth sessions.
Nick Kastango image
Nick Kastango

Analytics Manager
Memorial Sloan Kettering Cancer Center

Embedding Analytics in a Healthcare Organization

Memorial Sloan Kettering Cancer Center was honored to receive the 2012 INFORMS Prize for early work in healthcare analytics. Nick will share insights from the team’s journey growing from just two people to a group of 16 analysts with deep ties to seven operational departments.

The talk will outline: the tradeoffs of a hybrid centralized-distributed talent model; how to find and evaluate operational business partners; the technology stack that has enabled us to scale; challenges and successes in partnering with IT and informatics; spending time on data governance vs. data analytics; and examples of two comprehensive analytics portfolios developed for the perioperative system and the Patient Access call center.
The intended audience for this talk is leaders responsible for building or scaling analytics functions in their institution, particularly healthcare organizations.

  • Nick Kastango is an Analytics Manager at Memorial Sloan Kettering Cancer Center in New York City, the country’s oldest and largest private cancer center. He and his team are responsible for the deployment of analytical solutions throughout the hospital. Nick holds an MS in Operations Research from Columbia University and a BS in Industrial Engineering from Lehigh University. Besides healthcare, he also has a background in nonprofit and manufacturing organizations.
Pavan Korada image
Pavan Korada

Data Science & Analytics
Zeta Global

Marketing Language Optimization Using Natural Language Processing Techniques: Deploying High Impact Copy that Maximizes Return-on-investment

Advertisers have 3 key optimization levers in their marketing campaigns: (1) who to target, (2) which marketing channel to use (email, social, etc.), and (3) what message to send. All levers are important in driving marketing ROI, but more attention is given to targeting and channel-mix decisions than to messaging copy and content. The realization that marketing language can be improved using AI techniques is starting to take root in the advertising industry, with outstanding results. In this talk, I demonstrate how Fortune 100 advertisers are successfully using cutting-edge NLP techniques to optimize their marketing language and significantly improve ROI.

  • Pavan is a consultative data science leader who leverages multi-industry expertise and statistical rigor to deliver data-driven solutions for a variety of business problems. Pavan currently heads up Data Science & Analytics for a leading agency + martech firm.

ahmet kuyumcu image
Ahmet Kuyumcu

Cofounder and CEO
Prorize

Revenue Management in the Self-storage Industry

Revenue Management (RM) is a new frontier for the self-storage industry and offers many analytical challenges. On the supply side, the self-storage product is relatively complex and can be described by unit size, unit location, and unit attributes. A typical mid-size company offers thousands of product combinations, and the availability of each product varies greatly. On the demand side, move-ins and move-outs are highly uncertain and depend on many factors, including market, site, product, calendar, customer type, population density, competitive climate, and selling channels. Demand seasonality is acute, with 40 percent more rentals in the peak month of May than in the slowest month of December. New customer rents can vary by more than 100% for the same product based on season or other market conditions. Moreover, the industry makes extensive use of incentives and promotions, which need to be calibrated with an optimal pricing strategy. In addition, existing customer rates must be optimally balanced against churn rates, new customer rents, and other market conditions.

This presentation explores revenue management challenges and opportunities for the self-storage industry. It also discusses an RM solution framework that has been successfully implemented by more than half a dozen operators and won the 2017 Franz Edelman Award, the “Super Bowl of Operations Research”.
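
A toy illustration of one of the trade-offs described above (not the Edelman-winning Prorize system): choose the new-customer street rate that maximizes expected revenue, given an assumed price-response curve for the move-in probability of one unit type at one site.

    import numpy as np

    def move_in_probability(rate, reference_rate=120.0, sensitivity=2.5):
        """Assumed logit-shaped price-response curve (illustrative parameters)."""
        return 1.0 / (1.0 + np.exp(sensitivity * (rate / reference_rate - 1.0)))

    rates = np.arange(80, 181, 5)                     # candidate monthly street rates ($)
    expected_revenue = rates * move_in_probability(rates)
    print("revenue-maximizing street rate:", rates[np.argmax(expected_revenue)])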

  • Ahmet is the cofounder and CEO of Prorize and is responsible for running all facets of the business. A recognized expert in the field of pricing and revenue management, he ensures that Prorize delivers a best-in-class pricing solution and continuously exceeds clients’ expectations. Ahmet led the revenue management application that won the 2017 Franz Edelman Award. He has directed and built profit-generating pricing systems across a broad range of sectors. Previously Chief Scientist at Zilliant, a provider of price optimization solutions, Ahmet led and pioneered the price optimization solution for the Zilliant product suite. Before Zilliant, he served as a Senior Scientist at Talus (now JDA). At Talus, he engineered the scientific processes for the most successful pricing products in the travel-transportation, media, automotive, and apartment industries. Ahmet also taught graduate-level classes in Pricing and Revenue Management at the University of Texas at Austin and the Indian School of Business. He has served on the boards of the Pricing and Revenue Management section of INFORMS and the Journal of Revenue and Pricing Management. Ahmet earned M.S. and Ph.D. degrees in Operations Research from Texas A&M University.
Gertjan de Lange image
Gertjan de Lange

SVP Connecting Business & Optimization
AIMMS

The Intersection of OR and Data Science – Opportunities, Challenges, and Innovation

Over our almost 30 years in the market, AIMMS has executed some pivotal changes that were needed to ensure the value of optimization could be fully realized by our customers. Driven by market requirements, customer feedback, and innovation initiatives, we remain laser focused on continuing to bring more value each day and on broadening awareness of the benefits of optimization. Looking at the development of the data science industry, we realize there is a big opportunity for all of us and our customers. However, it also creates challenges, as the modeling paradigms might conflict and the conversations shift. As things evolve, there are likely to be significant implications and questions as new developments in data science, the consumerization of technology, and software as a service force us to think differently about how we serve our community of partners and customers. Time will tell what exciting innovations lie ahead of us.

  • Gertjan de Lange is a member of the leadership team at AIMMS and has been with the company since 1995. He has worked with many different customers and partners in distinct roles to enable the successful use of AIMMS optimization technology. After being in charge of sales for almost 15 years, Gertjan took on the role of SVP Connecting Business & Optimization at AIMMS in July 2014. This new role was created to promote and discuss the use of analytics, and specifically optimization, with potential users and research analysts in and outside the typical Operations Research community. Gertjan has been working on the overall AIMMS product strategy since 2010 to enhance the user experience and support new developments that continuously increase the value customers can gain from AIMMS.

    Gertjan holds an MSc degree in Applied Mathematics (OR) from the University of Twente and currently resides, for AIMMS, in the Seattle area (WA, USA).

marcial lapp image
Marcial Lapp

Director of Operations Research in Revenue Management
American Airlines

Maximizing the Total Travel Experience at American Airlines

At its core, Revenue Management is synonymous with customer segmentation. In this talk we provide an overview of the science and systems used to maximize passenger revenue at American Airlines. We begin with an overview of the distribution systems used in the airline/travel industry to highlight current system limitations. Next, we provide insight into current and common forms of customer segmentation built on current technology platforms. While distribution systems act as global gateways to American Airlines, we highlight differences that exist between geographic regions, including the United States and various international points of sale. We then review ongoing efforts to improve revenue optimization within this current infrastructure through the application of new tools, science, and data. Finally, we provide an overview of the upcoming changes in the distribution ecosystem, including the New Distribution Capability (NDC) and the potential opportunities this standard will unlock for American Airlines.

  • Marcial is the Director of Operations Research in Revenue Management at American Airlines. He is responsible for leading a team in charge of the development of yield management, pricing, and ancillary science systems. This includes prototyping, building, and implementing large-scale revenue management systems within American Airlines based on foundations such as traditional network revenue management modeling (LP, stochastic LP, probabilistic BP, DAVN), network-aware demand forecasting, overbooking forecasting and optimization, and traffic (spill) modeling.

no speaker image
Simon Lee

Vice President of Analytics
Homer Logistics

The Challenges of Optimization in a Start Up

Logistics optimization is already challenging under the ideal circumstances of abundant and consistent historical data, a relatively fixed set of business problems, and minimal variability in staffing and operations. Unfortunately, these conditions have not existed at Homer Logistics, a startup specializing in urban same-day logistics. At its core, however, Homer is a data-driven company, and this talk will examine how the startup built effective analytic models dealing with staffing, operations, and black swans. The major role of Homer’s analytics group in changing its business model under an increasingly unfavorable regulatory environment will also be discussed. Finally, the impact of the business change on staffing, priorities, forecasting, and operational models will be described.

  • Simon Lee’s career has been built on using data to make companies more efficient. He has contributed in a variety of industries, including finance, transportation (rail, trucking, air, ocean), manufacturing, and mobile gaming, with a broad array of models ranging from immediate operational models to important strategic models. He earned a master’s degree in mathematics from Columbia University and is currently the Vice President of Analytics at Homer Logistics.
jay liebowitz image
Jay Liebowitz

Distinguished Chair of Applied Business and Finance
Harrisburg University of Science and Technology

Intuition Versus Analytics: The Role in Executive Decision Making

In this age of “big data”, knowledge gained from experiential learning may be taking a back seat to analytics. But the use of intuition and trust in executive decision making should play an important role in the decision process. In fact, a 2016 KPMG study found that just one-third of CEOs trust data analytics, mainly due to concerns about internal data quality. In a second study, KPMG found that most business leaders believe in the value of using data and analytics, but say they lack confidence in their ability to measure the effectiveness and impact of data and analytics, and mistrust the analytics used to help drive decision making. Unfortunately, in the data analytics community, intuition typically hasn’t been discussed in terms of its application to executive decision making. This session will highlight a Fulbright multi-country research study looking at how executives trust intuition versus analytics.

  • Dr. Jay Liebowitz is the Distinguished Chair of Applied Business and Finance at Harrisburg University of Science and Technology. He previously was the Orkand Endowed Chair of Management and Technology in the Graduate School at the University of Maryland University College (UMUC). He served as a Professor in the Carey Business School at Johns Hopkins University. He was ranked one of the top 10 knowledge management researchers/practitioners out of 11,000 worldwide, and was ranked #2 in KM Strategy worldwide according to the January 2010 Journal of Knowledge Management. At Johns Hopkins University, he was the founding Program Director for the Graduate Certificate in Competitive Intelligence and the Capstone Director of the MS-Information and Telecommunications Systems for Business Program, where he engaged over 30 organizations in industry, government, and not-for-profits in capstone projects. Prior to joining Hopkins, Dr. Liebowitz was the first Knowledge Management Officer at NASA Goddard Space Flight Center. Before NASA, Dr. Liebowitz was the Robert W. Deutsch Distinguished Professor of Information Systems at the University of Maryland-Baltimore County, Professor of Management Science at George Washington University, and Chair of Artificial Intelligence at the U.S. Army War College. Dr. Liebowitz is the Founding Editor-in-Chief of Expert Systems With Applications: An International Journal (published by Elsevier), which is ranked #1 worldwide for AI journals according to the h5 index of Google Scholar journal rankings (March 2017). ESWA was ranked third worldwide for OR/MS journals (out of 83 journals), according to the 2016 Thomson impact factors. He is a Fulbright Scholar, IEEE-USA Federal Communications Commission Executive Fellow, and Computer Educator of the Year (International Association for Computer Information Systems). He has published over 40 books and a myriad of journal articles on knowledge management, analytics, intelligent systems, and IT management. His most recent books are Knowledge Retention: Strategies and Solutions (Taylor & Francis, 2009), Knowledge Management in Public Health (Taylor & Francis, 2010), Knowledge Management and E-Learning (Taylor & Francis, 2011), Beyond Knowledge Management: What Every Leader Should Know (Taylor & Francis, 2012), and Knowledge Management Handbook: Collaboration and Social Networking, 2nd ed. (Taylor & Francis, 2012), Big Data and Business Analytics (Taylor & Francis, 2013), Business Analytics: An Introduction (Taylor & Francis, January 2014), Bursting the Big Data Bubble: The Case for Intuition-Based Decision Making (Taylor & Francis, August 2014), A Guide to Publishing for Academics: Inside the Publish or Perish Phenomenon (Taylor & Francis, 2015), Successes and Failures of Knowledge Management (Morgan Kaufmann/Elsevier, 2016), and Actionable Intelligence in Healthcare (Taylor & Francis, 2017). Dr. Liebowitz served as the Editor-in-Chief of Procedia-CS (Elsevier). He is also the Series Book Editor of the new Data Analytics Applications book series (Taylor & Francis). In October 2011, the International Association for Computer Information Systems named the “Jay Liebowitz Outstanding Student Research Award” for the best student research paper at the IACIS Annual Conference. Dr. Liebowitz was the Fulbright Visiting Research Chair in Business at Queen’s University for the Summer 2017. He has lectured and consulted worldwide.

bing liu image
Bing Liu

TPD Modeling & Analytics Lead
Monsanto

Applying Operation Research in Trait Introgression to Improve the Success of Monsanto Traited Products

At Monsanto, we sell elite germplasm with important biotech traits added. The biotech traits are transferred into conventionally bred lines through a process called Trait Introgression (TI). To maintain a competitive advantage, the introgressed lines need to be ready at the same time as the conventional lines so that we can bring yield gain to market as quickly as possible. One of the most important factors for finishing the TI process on time is selecting the right parents – the parents need to be closely matched on various genotypic and phenotypic characteristics. Out of millions of possible combinations, the best possible subsets need to be selected to control cost. Using Operations Research methods, we were able to optimize the selection of parental lines while satisfying all of the constraints. This optimization strategy starts our TI process on the best terms possible and increases the success rate of on-time delivery, bringing millions of dollars of value from increased genetic gain. This is a great success story of applying operations research in the world’s #1 seed company, impacting the majority of our traited products.
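
As a rough, generic illustration of this kind of selection problem (not Monsanto's actual model or data), an integer program can pick the best subset of candidate parent pairs under capacity and cost constraints. The sketch uses the open source PuLP library with made-up match scores and costs.

    import pulp

    # hypothetical candidate parent pairs: pair id -> (genetic match score, crossing cost)
    pairs = {"A": (0.92, 3.0), "B": (0.88, 2.0), "C": (0.75, 1.0),
             "D": (0.71, 1.5), "E": (0.66, 0.5)}
    budget, max_pairs = 4.0, 3

    prob = pulp.LpProblem("parent_selection", pulp.LpMaximize)
    x = {p: pulp.LpVariable(f"x_{p}", cat="Binary") for p in pairs}

    prob += pulp.lpSum(pairs[p][0] * x[p] for p in pairs)              # total match score
    prob += pulp.lpSum(pairs[p][1] * x[p] for p in pairs) <= budget    # cost constraint
    prob += pulp.lpSum(x[p] for p in pairs) <= max_pairs               # crossing capacity

    prob.solve(pulp.PULP_CBC_CMD(msg=False))
    print([p for p in pairs if x[p].value() == 1])                     # selected pairs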

  • Dr. Bing Liu is the TPD Modeling & Analytics Lead at Monsanto. During her eleven-year tenure at Monsanto, she has held various positions across the breeding and biotech organizations with increasing responsibility. Her team of data scientists drives predictions, process optimization, and decision-making metrics for Monsanto’s TPD organization. Dr. Liu holds a Ph.D. in Statistics from Virginia Tech.
Marco Lübbecke image
Marco Lübbecke

Professor and Chair of Operations Research
RWTH Aachen University

Prescriptive Analytics for Political Districting: An Example from Germany

The design, and in particular the re-design, of electoral districts is a regular task in preparation for elections. Each district should contain the same number of voters (within tolerances), so population shifts often trigger this re-design, which is enforced and restricted by laws and rules. The general task is similar in different countries, but we focus on the case of federal elections in Germany. We present a tool that provides the (current standard of) descriptive analytics for supporting the decision maker. More importantly, we offer prescriptive analytics to suggest an optimal (re-)districting. This is based on integer programming optimization models and algorithms, which is why the tool is flexible enough to be rather easily adapted to different elections and different countries. We take into account that there are several conflicting objectives, so we in fact face a multi-criteria optimization problem. We discuss the entire analytics process, from the idea, data gathering, and convincing officials, to model and algorithm development, implementation, and what decision makers have to say about this tool. We will not discuss any political implications (at least not during the talk).
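
The core assignment model behind such a tool can be sketched as a small integer program; the instance below is invented and uses a single objective (population balance), whereas the real tool handles several conflicting criteria and the applicable legal rules. The sketch again uses the open source PuLP library.

    import pulp

    # toy instance: assign municipalities (with populations, in thousands) to 2 districts
    population = {"m1": 90, "m2": 70, "m3": 60, "m4": 50, "m5": 40}
    districts = [0, 1]
    target = sum(population.values()) / len(districts)    # ideal district size

    prob = pulp.LpProblem("districting", pulp.LpMinimize)
    x = pulp.LpVariable.dicts("assign", (population, districts), cat="Binary")
    dev = pulp.LpVariable.dicts("dev", districts, lowBound=0)

    for m in population:                                  # each municipality in exactly one district
        prob += pulp.lpSum(x[m][d] for d in districts) == 1
    for d in districts:
        size = pulp.lpSum(population[m] * x[m][d] for m in population)
        prob += size - target <= dev[d]                   # |size - target| <= dev[d]
        prob += target - size <= dev[d]
    prob += pulp.lpSum(dev[d] for d in districts)         # minimize total population imbalance

    prob.solve(pulp.PULP_CBC_CMD(msg=False))
    for d in districts:
        print(d, [m for m in population if x[m][d].value() == 1])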

  • Marco Lübbecke is a professor and chair of operations research at RWTH Aachen University, Germany, where he has a double affiliation with the school of business and economics and the mathematics department. Marco is an applied mathematician by education with strong links to computer science, business/economics, and engineering. He is an expert in modeling and solving large-scale and complex practical optimization problems. His technical contributions to the field as well as successful applications have been published in premium journals of the OR/Analytics profession. Marco has been delivering decision support via prescriptive analytics for 20+ years, often in academia-practice cooperation projects. He currently serves as the INFORMS VP of Information Technology.
macys logo image
Macys

A Model Driven Approach to Store Selling Space Optimization

Until now, store locations’ sales performance by merchandise business was compared against benchmarks formulated from averages of locations of similar scale. A new approach has been developed integrating Exploratory Analytics (co-clustering) and Prescriptive Analytics (a non-linear spline regression optimization model and a Seasonal (Random Walk) Autoregressive Integrated Moving Average (SARIMA) model). We have developed a new workflow to integrate all three models in recommending optimal store layouts and merchandise mix for new store locations and major remodels of existing ones. In this three-tiered process, the analyst first identifies the statistical cluster membership of the location under analysis and formulates a plan based on that benchmark. Then he/she invokes the optimization model, which provides the space adjustment recommendations that maximize the location’s sales potential based on existing cross-sectional data (for remodel stores). In the final step, the forecasting model is used to validate whether the recommendations made from cross-sectional (historical) data hold true in the time frame when these changes (projected store opening or remodel completion) are expected to take place.
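
The first, exploratory step of this workflow (finding the benchmark cluster a location belongs to) can be sketched with scikit-learn's co-clustering on a store-by-business sales matrix; the matrix, planted structure, and number of clusters below are invented.

    import numpy as np
    from sklearn.cluster import SpectralCoclustering

    rng = np.random.default_rng(5)
    # toy sales matrix: 30 store locations x 8 merchandise businesses
    sales = rng.gamma(2.0, 50.0, size=(30, 8))
    sales[:10, :3] *= 3.0        # planted block: stores 0-9 over-index in businesses 0-2
    sales[10:20, 3:6] *= 3.0     # planted block: stores 10-19 over-index in businesses 3-5

    model = SpectralCoclustering(n_clusters=3, random_state=0).fit(sales)
    print("store cluster (benchmark group) per location:", model.row_labels_)
    print("business cluster per merchandise business:   ", model.column_labels_)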

r mcgrath image
Captain Richard McGrath

United States Naval Academy

Concepts for Optimal Resource Management in Command and Control

The future of battlespace dominance will depend on mastery of the electromagnetic spectrum. As part of that shift, military assets performing this mission can no longer exist to serve siloed and narrowly defined tasking. Instead, electromagnetic effects must be served by a centralized method for dynamically allocating resources that are shared across task and mission areas. This requires careful stochastic modeling to understand the capacity of systems under random process demand, as well as to understand the potential, possibilities, and limitations of the concepts of operations induced by this new resource allocation paradigm. This talk will focus on the fundamental modeling choices necessary to lay the groundwork for that analysis, and will provide specific examples of robust, real-time scheduling methods in a dynamic, resource-constrained operating environment.
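
One elementary building block for this kind of capacity analysis (a generic queueing calculation, not the models discussed in the talk): the Erlang-C formula gives the probability that a tasking request must wait when a pool of shared assets serves a Poisson demand process. The arrival and service rates below are invented.

    from math import factorial

    def erlang_c(arrival_rate, service_rate, servers):
        """P(wait) for an M/M/c queue with `servers` shared assets."""
        a = arrival_rate / service_rate                     # offered load (Erlangs)
        rho = a / servers                                   # utilization, must be < 1
        tail = a**servers / (factorial(servers) * (1 - rho))
        p0_inv = sum(a**k / factorial(k) for k in range(servers)) + tail
        return tail / p0_inv

    # e.g. 12 requests/hour, each occupying an asset for 10 minutes, 3 assets shared
    print(round(erlang_c(arrival_rate=12, service_rate=6, servers=3), 3))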

  • Captain Richard McGrath has served on active duty in the U.S. Navy for the past 27 years. He holds a Bachelor of Science degree from the Massachusetts Institute of Technology, a Master of Science degree from Stanford University, a Master of Arts degree from the Naval War College, and a Doctor of Philosophy degree in Operations Research from the Naval Postgraduate School. As a career Naval Aviator, he completed extended deployments onboard USS John F. Kennedy (CV 67), USS Constellation (CV 64), and USS Theodore Roosevelt (CVN 71) flying the S-3B Viking and F/A-18 Hornet jet aircraft. He has flown several combat missions over Iraq and Afghanistan in support of Operations Iraqi Freedom and Enduring Freedom, including the only overland combat strike mission flown by the S-3B Viking. Ashore, he attended the United States Naval Test Pilot School and served as a developmental test pilot for the Naval Air Systems Command in Patuxent River, Maryland. He also completed tours as a flight instructor in San Diego, California, and as an aviation personnel distribution officer for the Navy Personnel Command in Millington, Tennessee. From March 2009 to June 2010, he served as the Commanding Officer of the “Golden Warriors” of Strike Fighter Squadron Eight Seven (VFA-87) in Virginia Beach, Virginia. He is currently assigned to the United States Naval Academy as a Permanent Military Professor in the Mathematics Department and previously served as the Deputy Director of the Division of Mathematics and Science. His research interests include Optimization, Stochastic Models, and Decision Theory. He has extensive speaking experience that includes technical presentations at three Society of Experimental Test Pilots symposia, three INFORMS Annual Meetings, and four presentations at the Institute of Industrial and Systems Engineers annual research conference. Captain McGrath’s personal awards include two Meritorious Service Medals, two Strike/Flight Air Medals, seven Navy and Marine Corps Commendation Medals, the Navy and Marine Corps Achievement Medal, and various campaign and unit awards. He has accumulated over 3,400 flight hours in 28 different aircraft and over 640 aircraft carrier arrested landings.
conor mclemore image
Lieutenant Commander Connor McLemore

United States Navy

Analytic Practice in the Assessment Division (N81) at the Office of the Chief of Naval Operations

The Assessment Division (N81) at the Office of the Chief of Naval Operations enables the Naval Service’s inputs to the government budget process through the determination of requirements, allocation of scarce resources, and responsive decision-making support. A primary objective of the assessment process is to develop a thorough understanding of how naval forces contribute to the nation’s joint force capabilities. For example, how might N81 determine the most cost-effective system of systems to enhance the find, fix, track, target, and engage sequence during anti-submarine operations for the P-8A aircraft in the 2025-2030 timeframe? This presentation will describe the current state of analytic practice in the Navy Assessment Division at the Office of the Chief of Naval Operations.

  • Lieutenant Commander Connor S. McLemore graduated from the U.S. Naval Academy in 2000 with a Bachelor of Science in Mechanical Engineering. Lieutenant Commander McLemore was designated a Naval Flight Officer in 2002. Upon completion of flight training, he reported to his first fleet assignment with the “Sun Kings” of Carrier Command and Control Squadron 116 (VAW-116) aboard USS Constellation (CV 64). While at VAW-116, he deployed to the Persian Gulf in support of Operations Southern Watch and Iraqi Freedom, accumulating over 150 flight hours during major combat operations. In 2004, he deployed aboard USS Abraham Lincoln (CVN 72) to the Indian Ocean and Western Pacific in support of the humanitarian Operation Unified Assistance. In 2007, he returned to VAW-116 as Weapons and Tactics Instructor and was designated a CVW-2 Dynamic Strike Lead deployed aboard USS Abraham Lincoln (CVN 72). While at VAW-116, he deployed to the Arabian Gulf and Gulf of Oman in support of Operations Iraqi Freedom and Enduring Freedom. In 2010, Lieutenant Commander McLemore completed an Operations Research Masters Degree at the Naval Postgraduate School in Monterey, California. His thesis was awarded the Military Operations Research Society Stephen A. Tisdale Graduate Research Award. In January 2011, Lieutenant Commander McLemore assumed duties as Plans Officer on the staff of Commander, Combined Joint Special Operations Air Component (CJSOAC) in Al-Udied, Qatar. While there, he completed a National Security and Strategic Studies Masters Degree with distinction from the Naval War College in Newport, Rhode Island. In October 2011, Lieutenant Commander McLemore joined Tactical Air Control Squadron (TACRON) 12, Detachment Alfa, in Okinawa, Japan deployed aboard USS Bonhomme Richard (LHD 6). While at TACRON 12, he deployed regularly to Korea, Australia, and the Philippines for major exercises and was the lead Navy Air Officer in the Joint Task Force 505 Headquarters in support of Philippine Typhoon relief; the humanitarian Operation Damayan. In May 2014, Lieutenant Commander McLemore returned to the Naval Postgraduate School as a Military Assistant Professor of Operations Research and the Operations Research Program Officer. In June 2017, Lieutenant Commander McLemore joined the Office of the Chief of Naval Operations (OPNAV) Assessments Division (N81) as the Integrated Fires Section Head. Lieutenant Commander McLemore is a graduate of the Navy Fighter Weapons School (Topgun), Naval Aviation Safety Officer School, and Naval Strike and Air Warfare Center’s Advanced Mission Commander Course (AMCC).
polly mitchell guthrie image
Polly Mitchell-Guthrie

System Director, Analytical Consulting Services
UNC Health Care System

Built from Scratch: How We Built an Enterprise Analytics Function (almost) Overnight

At the UNC Health Care System, we started our Enterprise Analytics and Data Sciences team in 2016 and will have grown from zero to 40 employees in less than two years. Our mission is to enable the transformation of health care decision making through data sciences and analytical methods. We have functions dedicated to building reusable data and analytical assets, data governance, and analytical consulting services. We had the benefit of starting from scratch, versus the common practice of rebranding and expanding. So how and why did we design our organization and build it so fast? This presentation will explain what we did and illustrate our work with a case study on patient throughput, showcasing the walk from descriptive to predictive to prescriptive analytics.

  • Polly is the Director of Analytical Consulting Services within Enterprise Analytics and Data Sciences at the University of North Carolina Health Care System. Previously she was Senior Manager of the Advanced Analytics Customer Liaison Group in SAS’ Research and Development Division, where her team served as a bridge between R&D and external customers and internal SAS divisions. Before that she was Director of the SAS Global Academic Program, leading SAS’ outreach to colleges and universities worldwide to incorporate SAS into their teaching. Polly began her career at SAS in Strategic Investments and later served in Alliances, after working in the nonprofit sector in philanthropy and social services. She has an MBA from the Kenan-Flagler Business School of the University of North Carolina at Chapel Hill, where she also received her BA in Political Science as a Morehead Scholar. Within INFORMS, she has served as the Chair and Vice Chair of the Analytics Certification Board, Secretary of the Analytics Society, and on the ad hoc committee planning the Executive Forum.
robert moakler image
Robert Moakler

Quantitative Researcher
Facebook

Methods For Measuring Ad Effectiveness: Experimental And Observational Methods With Validation Techniques

Causal models are essential for drawing real-world inferences about cause and effect relationships. There are two main methods used to estimate causal effects: randomized experiments and observational methods. Randomized experiments are often considered ideal, but can be expensive or difficult to set up. The alternative, observational methods, often require an extensive knowledge of statistics to implement and are subject to potentially large sources of bias that are difficult to quantify. In this talk we will discuss these two methods, how they are used in practical settings, how they compare to each other in real-world settings, and how they can be evaluated independently through the use of negative control testing.
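As an illustration of the observational side of this comparison (not Facebook's pipeline), the sketch below estimates an exposure effect with inverse-propensity weighting and then re-runs the same estimator on a negative-control outcome measured before exposure, where any sizable "effect" signals residual confounding. All data, column names, and coefficients are synthetic.

    import numpy as np
    import pandas as pd
    from sklearn.linear_model import LogisticRegression

    # Hypothetical data: one row per user, with pre-treatment covariates, an ad-exposure
    # indicator, the outcome of interest (conversion), and a negative-control outcome
    # measured BEFORE exposure (it cannot have been caused by the ad).
    rng = np.random.default_rng(0)
    n = 20_000
    df = pd.DataFrame({
        "age": rng.normal(35, 10, n),
        "past_spend": rng.gamma(2.0, 50.0, n),
    })
    propensity_true = 1 / (1 + np.exp(-(0.02 * df.age + 0.005 * df.past_spend - 1.5)))
    df["exposed"] = rng.binomial(1, propensity_true)
    df["pre_period_spend"] = 0.8 * df.past_spend + rng.normal(0, 10, n)   # negative control
    conv_prob = np.clip(0.05 + 0.03 * df.exposed + 0.001 * df.past_spend / 50, 0, 1)
    df["conversion"] = rng.binomial(1, conv_prob)

    # Fit a propensity model on observed covariates and form inverse-propensity weights.
    X = df[["age", "past_spend"]]
    p_hat = LogisticRegression(max_iter=1000).fit(X, df.exposed).predict_proba(X)[:, 1]
    w = np.where(df.exposed == 1, 1 / p_hat, 1 / (1 - p_hat))

    def ipw_effect(outcome):
        treated = np.average(outcome[df.exposed == 1], weights=w[df.exposed == 1])
        control = np.average(outcome[df.exposed == 0], weights=w[df.exposed == 0])
        return treated - control

    print("estimated lift on conversion:", round(ipw_effect(df.conversion.values), 4))
    # Negative-control check: the same estimator applied to a pre-exposure outcome should
    # return roughly zero; a large "effect" here is a red flag for residual confounding.
    print("placebo 'effect' on pre-period spend:",
          round(ipw_effect(df.pre_period_spend.values), 2))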

  • Robert Moakler received his Ph.D. in Information Systems from the NYU Stern School of Business in 2017 and is currently working as a Quantitative Researcher at Facebook. At Facebook Robert focuses on questions around the creative aspects of advertising and the effectiveness of different ad formats. Before that, Robert received his B.S. in Applied Physics and M.S. in Service Oriented Computing from the Stevens Institute of Technology and worked as a data scientist at BuzzFeed and Integral Ad Science. Robert’s research interests include causal inference methodology and large-scale distributed machine learning.
wendy moe image
Wendy Moe

Professor of Marketing and Director of the Masters of Science in Marketing Analytics
University of Maryland’s Robert H. Smith School of Business

Measuring Brand Favorability Using Large-scale Social Media Data

Social media listening is the practice of collecting and analyzing user comments on social media in an effort to assess consumer sentiment surrounding a particular brand. Many marketers have noticed that popular social media metrics do not always align with traditional brand tracking metrics obtained from long-standing surveys. Researchers have identified several factors that influence when and what consumers post on social media, causing their posted comments to deviate from their underlying opinion toward the brand. These individual level deviations are systematic and predictable and, when aggregated, generate metrics that do not accurately represent the customers’ assessment of the brand. In this session, I will discuss research that has (1) documented factors that influence consumers’ social media posting behavior, (2) identified deviations between social media metrics (specifically, average sentiment) and traditional brand tracking metrics and (3) proposed methodologies to correct for these deviations. In one study, a co-author and I examine differences in sentiment across social media venues for a global technology brand. We develop a method that extracts the common sentiment expressed toward the brand regardless of context and show how an adjusted brand sentiment metric correlates with brand tracking surveys while simple average sentiment metrics do not. In another study, a co-author and I examine social media comments for multiple brands and identify user-level positivity/negativity biases based on their posting behavior across brands. Again, these user-level biases lead to biased metrics, and if these biases vary systematically across brands, it can be difficult to compare one brand to another. Thus, we propose a method to adjust for user positivity and offer an adjusted brand favorability metric that aligns with benchmarking studies based on brand tracking surveys.
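A deliberately simplified sketch of the bias-correction idea (not the published methodology): estimate each user's positivity bias from their posts across brands and remove it before aggregating sentiment by brand. The toy table and sentiment scores are invented; a full treatment would estimate user and brand effects jointly in a hierarchical model.

    import pandas as pd

    # Toy posts table (hypothetical): one row per social-media comment, with a
    # sentiment score in [-1, 1] produced by some upstream sentiment model.
    posts = pd.DataFrame({
        "user":  ["u1", "u1", "u1", "u2", "u2", "u3", "u3", "u3", "u3"],
        "brand": ["A",  "B",  "C",  "A",  "B",  "A",  "B",  "C",  "C"],
        "sentiment": [0.9, 0.8, 0.7, -0.2, -0.4, 0.1, 0.0, 0.2, 0.3],
    })

    # Naive metric: average sentiment per brand (mixes brand favorability with user bias).
    naive = posts.groupby("brand")["sentiment"].mean()

    # Simple correction: estimate each user's positivity bias as their deviation from the
    # overall mean across all brands they discuss, then remove it before aggregating.
    overall = posts["sentiment"].mean()
    user_bias = posts.groupby("user")["sentiment"].mean() - overall
    posts["adjusted"] = posts["sentiment"] - posts["user"].map(user_bias)
    adjusted = posts.groupby("brand")["adjusted"].mean()

    print(pd.DataFrame({"naive": naive, "adjusted": adjusted}).round(3))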

  • Wendy Moe is Professor of Marketing and Director of the Masters of Science in Marketing Analytics at the University of Maryland’s Robert H. Smith School of Business. She is an expert in online and social media marketing with a focus on analytics and intelligence. Professor Moe is a highly published academic with her research appearing in numerous leading business journals. She is also the author of Social Media Intelligence (Cambridge: 2014). Professor Moe has been recognized by the American Marketing Association and the Marketing Science Institute as a leading scholar in her field with the Howard Award, the Young Scholar Award, the Erin Anderson Award and the Buzzell Award. She is the co-editor of the Journal of Interactive Marketing and serves on the Board of Trustees for the Marketing Science Institute, the advisory board for the Wharton Customer Analytics Initiative, and the editorial boards of Journal of Marketing Research, Marketing Science, Journal of Marketing, and International Journal of Research in Marketing. Professor Moe consults for numerous corporations and government agencies, helping them develop and implement state-of-the-art statistical models in the area of web analytics, social media intelligence and forecasting. Her research in web analytics was the foundation for NetConversions, Inc., an early innovator in the area of online data collection and analysis. She was part of the founding team that brought the company from start-up to acquisition in 2004. Professor Moe has also been an expert witness and consultant for litigation related to online retailing, advertising and branding issues. Professor Moe has been on the faculty at the University of Maryland since 2004. Prior to that, she was on the faculty at the University of Texas at Austin. She holds a PhD, MA and BS from the Wharton School at the University of Pennsylvania as well as an MBA from Georgetown University.

mike morello image
Mike Morello

Director
Governor’s Office of Performance Improvement, State of Maryland

Applications of Maryland Open Data Portal

Mike specializes in performance management and business intelligence. In his role as Director of the Governor’s Office of Performance Improvement, Mike helps agencies and the Governor’s Coordinating Offices improve accountability, transparency, performance, and results in alignment with Governor Hogan’s priorities for Changing Maryland for the Better. Mike will present the Maryland Open Data Portal, covering best practices, highlighting state datasets such as demographic data by county, and demonstrating how to use the site’s APIs.

  • Mike Morello, Director, Governor’s Office of Performance Improvement, State of Maryland.
no speaker image
David Morrison

Software Engineer
Yelp

Using Simulation to Predict and Improve Autoscaling Behavior for Yelp’s Distributed Systems

At Yelp, we use distributed computation on many large clusters to improve performance; everything from unit and integration tests to the production services powering Yelp.com runs on such clusters. Each of these clusters is powered by machines running in the cloud, and each has different workloads and resource requirements. In order to ensure that we have sufficient capacity for running jobs, while also keeping costs down during periods of low usage, cluster-wide autoscaling is necessary. This talk will describe 1) Our in-house cluster autoscaling tool, called ClusterMan; 2) How ClusterMan’s autoscaling has saved ~50% on our distributed clusters, while still ensuring sufficient compute capacity for tasks; 3) How we built a simulation environment for ClusterMan to test changes before they go live; 4) How the simulator helped us understand recent changes to the pricing model for cloud computation; and 5) How we are using the simulator and machine learning models to build predictive autoscaling signals.
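To make the simulation idea concrete, here is a generic, self-contained sketch of simulating a threshold autoscaler against a synthetic workload and comparing its cost to statically provisioning for the peak. The demand trace, thresholds, and cost model are made up and are not ClusterMan's.

    import numpy as np

    rng = np.random.default_rng(1)

    # Hypothetical hourly demand (in "required instances") over two weeks: a daily
    # cycle plus noise, standing in for the workload traces a real simulator would replay.
    hours = np.arange(24 * 14)
    demand = 80 + 40 * np.sin(2 * np.pi * hours / 24) + rng.normal(0, 8, hours.size)
    demand = np.clip(demand, 10, None)

    def simulate(scale_up_at=0.8, scale_down_at=0.4, step=10, start=100):
        """Simple target-utilization autoscaler; returns (instance-hours, unmet demand-hours)."""
        capacity, cost, unmet = start, 0.0, 0.0
        for d in demand:
            util = d / capacity
            if util > scale_up_at:            # add capacity when running hot
                capacity += step
            elif util < scale_down_at:        # shed capacity when mostly idle
                capacity = max(step, capacity - step)
            cost += capacity                  # 1 cost unit per instance-hour
            unmet += max(0.0, d - capacity)   # demand we could not serve this hour
        return cost, unmet

    static_cost = demand.size * max(demand)   # never scaling down, sized for the peak
    for thresholds in [(0.9, 0.5), (0.8, 0.4), (0.7, 0.3)]:
        cost, unmet = simulate(*thresholds)
        print(thresholds, f"cost={cost:.0f}",
              f"({cost / static_cost:.0%} of static peak)", f"unmet={unmet:.1f}")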

  • David R. Morrison is a software engineer working in scheduling and optimization on the Distributed Systems team at Yelp, where he has developed auto-scaling code for Yelp’s most expensive compute clusters. Previously, David worked in research and development at Inverse Limit, where he received federal funding from DARPA and Google’s ATAP program. David received his PhD in computer science from the University of Illinois, Urbana-Champaign under the supervision of Dr. Sheldon Jacobson. David has spoken at the INFORMS Business Analytics conference in 2017 and at AWS re:Invent 2016, and has given multiple presentations at the INFORMS Annual Meetings and other venues.
anna nagurney image
Anna Nagurney

John F. Smith Memorial Professor at the Isenberg School of Management
University of Massachusetts-Amherst

Predictive and Prescriptive Models of Cybercrime and Cybersecurity Investments Under Network Vulnerability

The effects of cyberattacks are being felt across the globe in multiple sectors and industries. The damages incurred include direct financial damages as well as reputation issues, the loss of business, the inability to provide the expected services, opportunity costs, and the loss of trust. The world economy sustained over $460 billion in losses from cyberattacks in 2016 alone. In this talk, I will first describe a predictive analytical multiproduct network economic model of cybercrime in financial services in which the hacked products are perishable in that their “value” deteriorates over time. I will then discuss our research on prescriptive analytical models for cybersecurity investments and network vulnerability when firms compete and when they cooperate in terms of information sharing. Algorithms and computational results for both classes of models will be presented and discussed, along with case studies in the retail and energy sectors and extensive sensitivity analysis results that demonstrate the benefits of cooperation. I will conclude with recent research on cybersecurity and supply chains.

  • Anna Nagurney is the John F. Smith Memorial Professor at the Isenberg School of Management at the University of Massachusetts Amherst and the Director of the Virtual Center for Supernetworks, which she founded in 2001. She holds ScB, AB, ScM and PhD degrees from Brown University in Providence, RI. She is the author/editor of 12 books and has written more than 185 refereed journal articles and over 50 book chapters. She presently serves on the editorial boards of a dozen journals and two book series and is the editor of another book series. Professor Nagurney has been a Fulbrighter twice (in Austria and Italy), was a Visiting Professor at the School of Business, Economics and Law at the University of Gothenburg in Sweden for the past 4 years, and was a Distinguished Guest Visiting Professor at the Royal Institute of Technology (KTH) in Stockholm. She was also a Visiting Fellow at All Souls College at Oxford University during the 2016 Trinity Term. Anna has held visiting appointments at MIT (at the Center for Transportation and the Sloan School of Management) and at Brown University and was a Science Fellow at the Radcliffe Institute for Advanced Study at Harvard University in 2005-2006. She has been recognized for her research on networks with the Kempe prize from the University of Umea, the Faculty Award for Women from the US National Science Foundation, the University Medal from the University of Catania in Italy, and was elected a Fellow of the RSAI (Regional Science Association International) as well as INFORMS (Institute for Operations Research and the Management Sciences) among other awards. She has also been recognized with several awards for her mentorship of students and her female leadership with the WORMS Award. Her research has garnered support from the AT&T Foundation, the Rockefeller Foundation through its Bellagio Center programs, the Institute for International Education, and the National Science Foundation. She has given plenary/keynote talks and tutorials on 5 continents. Anna’s research focuses on network systems from transportation and logistical ones, including supply chains, to financial, economic, social networks and their integration, along with the Internet. She studies and models complex behaviors on networks with a goal towards providing frameworks and tools for understanding their structure, performance, and resilience and has contributed also to the understanding of the Braess paradox in transportation networks and the Internet. She has advanced methodological tools used in game theory, network theory, equilibrium analysis, and dynamical systems. She was a Co-PI on a multi-university NSF grant with UMass Amherst as the lead: Network Innovation Through Choice, which was part of the Future Internet Architecture (FIA) program and is presently a Co-PI on an NSF EAGER grant.
scott nestler image
Scott Nestler, PhD, CAP

Associate Teaching Professor in the Department of IT, Analytics and Operations, at the Mendoza College of Business
University of Notre Dame

“Should We?” Not Just “Can We?”: Ethical Considerations In Data Science And Business Analytics

Data-informed decision making creates new opportunities, but also expands the set of possible risks to organizations when technical capabilities get too far ahead of ethical considerations. Concerns should extend beyond individual privacy to issues of identity, ownership, and reputation. In this presentation, motivating examples of ethical dilemmas and algorithmic bias are explored using data from behavioral science, social media, wearable devices, health care, and human resources. Roles of public laws (to include regional/national differences), government regulations, professional codes, organizational approaches, and individual ethics are presented as ways to address such issues when performing and managing analytic activities. Key questions considered include: (1) Just because I can do something (with data and analytics), does that mean that I should?; (2) How can I help guide ethical decision making at my organization, while still accomplishing business objectives?; and (3) What extra precautions should I consider when dealing with human (medical and behavioral) data?

  • Scott Nestler, PhD, CAP is an Associate Teaching Professor in the Department of IT, Analytics and Operations, in the Mendoza College of Business, at the University of Notre Dame. Previously, he served as an operations research analyst in the U.S. Army, to include teaching at the Naval Postgraduate School and the U.S. Military Academy at West Point. Nestler has a Ph.D. in management science from the University of Maryland – College Park. He is currently the Vice-Chair for Programs of the INFORMS SpORts (Operations Research in Sports) Section and has served as the Chair and Vice-Chair of the Analytics Certification Board.

northwestern logo image
Northwestern University

SAFE (Situational Awareness for Events): A Data Visualization System

Marathons and other endurance events are growing in popularity, and thus require significant resources to ensure safety and success. Event management tools have not grown to meet this need. A team of Northwestern University faculty and students and staff members of the Bank of America Chicago Marathon has developed a data visualization system that incorporates critical data into a user-friendly dashboard to provide a centralized source of information at mass gathering events. This system uses descriptive, predictive and prescriptive analytics to help race organizers and relevant stakeholders effectively manage and oversee all participants, monitor the dynamic location of race participants, and manage health and safety resources throughout the event should any emergency issues arise. Our system is the first comprehensive dashboard for endurance event management. The system provides a dynamic representation of the flow of people and resources. The system integrates real-time dynamic data from tracking devices and predictive algorithms developed by the research team, and presents the information on a summary visual device, both as a large screen in an incident command facility for group monitoring and a desktop/mobile version for individual monitoring. The system has become an integral component in the management of the Bank of America Chicago Marathon and Shamrock Shuffle 8K and the Chevron Houston Marathon and Aramco Half Marathon.

arne owens image
Arne Owens

Health Care Policy Advisor for U.S. Senator Bob Corker (R-Tennessee) and Global Health Policy Advisor for the Senate Committee on Foreign Relations

Analytics for Public Good: How to Get Legislators to Understand (and Act On!) Your Policy Analysis

Operations Research and analytics are driving advancements that touch nearly every part of our lives. From improving disaster relief efforts following a storm, to criminal justice reform and real-time traffic reporting, to reforming the American health care system, analytics is saving lives, reducing costs and improving productivity across the private and the public sector. Yet, when our elected officials draft policy they may lack access to, or simply ignore, the data, models, and analysis that would help them understand the economic and social implications of proposed legislation. Members of Congress and their staff members may have access to “think tank” studies to inform and shape vital policy questions, but typically don’t receive a detailed analysis of a bill until after it has been written, and after they have sought support for it. How can analysts get the attention of legislators and help them understand a policy analysis?

  • Arne Owens is the health care policy advisor for U.S. Senator Bob Corker (R-Tennessee) and global health policy advisor for the Senate Committee on Foreign Relations. Prior to this he served as health care policy advisor to Senator David Vitter (R-Louisiana) and the Committee on Small Business and Entrepreneurship. Previous executive branch experience includes service as Chief Deputy Director, Virginia Department of Health Professions; acting Deputy Administrator and Senior Advisor to the Administrator, Substance Abuse and Mental Health Services Administration, U.S. Department of Health and Human Services; and Deputy Commissioner, Virginia Department of Mental Health, Mental Retardation and Substance Abuse Services. He has also supported Federal government agencies as a contractor. Until 1997, Mr. Owens was a career Army officer, serving in a variety of executive and staff assignments throughout the world, including the Persian Gulf and Iraq during Operation Desert Storm. He completed his military service in the Office of the Secretary of Defense, at the Pentagon in Washington, D.C., with the rank of lieutenant colonel. He is a graduate of the U. S. Military Academy at West Point, and holds a Masters Degree in Organizational Systems Management from the University of Southern California.

randy paffenroth image
Randy Paffenroth

Associate Professor of Mathematical Sciences, Associate Professor of Computer Science, and Associate Professor of Data Science
Worcester Polytechnic Institute

Applications of Robust Principal Component Analysis for Dimension Reduction and Anomaly Detection

Robust principal component analysis (RPCA) is an extension of classic principal component analysis that aims to recover low dimensional subspaces corrupted by sparse outliers, and in this talk, we will demonstrate how such methods can be applied in a number of domains for dimension reduction and anomaly detection. RPCA can be used to detect weak, distributed patterns in many types of data, and we will provide a number of illustrative examples that illuminate the promise of such techniques. In particular, one important and recent application of RPCA is in detecting weak patterns in computer networks where the nodes (terminals, routers, servers, etc.) are sensors that provide measurements (of packet rates, user activity, CPU usage, IDS logs, etc.) and we will show how RPCA can be used to detect distributed patterns indicative of attacks that are not discernible at the level of individual sensors. The approaches we propose have many applications, including in social networks and financial data, where anomalous phenomena are of interest.
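For readers who want to experiment, below is a minimal numpy sketch of principal component pursuit, one standard formulation of RPCA (minimize the nuclear norm of L plus lambda times the l1 norm of S subject to M = L + S), solved with a basic ADMM loop using the common default lambda = 1 / sqrt(max(n, m)). It is an illustration, not the speaker's implementation.

    import numpy as np

    def svt(X, tau):
        # Singular value thresholding: proximal operator of the nuclear norm.
        U, s, Vt = np.linalg.svd(X, full_matrices=False)
        return U @ np.diag(np.maximum(s - tau, 0.0)) @ Vt

    def shrink(X, tau):
        # Soft thresholding: proximal operator of the l1 norm.
        return np.sign(X) * np.maximum(np.abs(X) - tau, 0.0)

    def rpca(M, max_iter=500, tol=1e-7):
        """Principal component pursuit via ADMM: M ~ L (low rank) + S (sparse)."""
        n, m = M.shape
        lam = 1.0 / np.sqrt(max(n, m))
        mu = n * m / (4.0 * np.abs(M).sum() + 1e-12)
        L = np.zeros_like(M); S = np.zeros_like(M); Y = np.zeros_like(M)
        for _ in range(max_iter):
            L = svt(M - S + Y / mu, 1.0 / mu)
            S = shrink(M - L + Y / mu, lam / mu)
            resid = M - L - S
            Y = Y + mu * resid
            if np.linalg.norm(resid) <= tol * np.linalg.norm(M):
                break
        return L, S

    # Toy check: a rank-2 matrix corrupted by a few large sparse spikes.
    rng = np.random.default_rng(0)
    M = rng.normal(size=(100, 2)) @ rng.normal(size=(2, 80))
    M[rng.integers(0, 100, 40), rng.integers(0, 80, 40)] += 10.0
    L, S = rpca(M)
    print("recovered rank:", np.linalg.matrix_rank(L, tol=1e-4),
          " nonzeros in S:", int((np.abs(S) > 1e-4).sum()))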

  • Dr. Paffenroth graduated from Boston University with degrees in both mathematics and computer science and he was awarded his Ph.D. in Applied Mathematics from the University of Maryland in June of 1999. After attaining his Ph.D., Dr. Paffenroth spent seven years as a Staff Scientist in Applied and Computational Mathematics at the California Institute of Technology. In 2006, he joined Numerica Corporation where he held the position of Computational Scientist and Program Director. Dr. Paffenroth is currently an Associate Professor of Mathematical Sciences, Associate Professor of Computer Science, and Associate Professor of Data Science at Worcester Polytechnic Institute. His current technical interests include machine learning, signal processing, large-scale data analytics, compressed sensing, and the interaction between mathematics, computer science, and software engineering, with a focus on applications in cyber-defense.
gregory parnell image
Gregory Parnell

Research Professor, Department of Industrial Engineering
University of Arkansas

Trade-off Analytics

The design of new systems involves complex decisions with conflicting stakeholder objectives, dynamic systems requirements, new technologies, and major adversary/competition uncertainties. System decisions involve trade-offs between performance, cost, and time. To provide effective decision support, trade-off analytics requires the integration of descriptive, predictive and prescriptive analytics. Descriptive analytics provides data about the performance of current systems in past environments. Predictive analytics provides data on the performance of current systems and potential system designs in future environments. Prescriptive analytics provides evaluations of the potential designs using stakeholder values. Decision analysis provides a sound mathematical foundation for prescriptive analytics and a set of techniques for framing the decision, identifying stakeholders’ objectives, identifying system performance measures, identifying the tradespace, and evaluating the potential designs in uncertain future environments. To be efficient, trade-off analytics requires the integration of innovative design techniques (e.g., Set-Based Design), model-based systems engineering using models and simulations, probabilistic analysis, and decision analysis. This presentation reports on a multi-year effort to develop best practices for trade-off analytics to support systems decision makers in near real-time using Probability Management. We use the Analytics Hierarchy to convey the use of descriptive, predictive, and prescriptive analytics to decision makers.
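As a toy illustration of the prescriptive step described above, the sketch below evaluates a small tradespace with a weighted-sum (swing-weight) value model; the alternatives, measures, and weights are invented for illustration and are not from the presentation.

    import numpy as np

    # Hypothetical tradespace: three candidate designs scored on raw performance measures
    # (range in km, cost in $M, schedule in months). Higher is better only for range.
    alternatives = ["Design A", "Design B", "Design C"]
    raw = np.array([
        [450.0, 120.0, 30.0],
        [600.0, 180.0, 42.0],
        [520.0, 150.0, 36.0],
    ])
    higher_is_better = np.array([True, False, False])
    weights = np.array([0.5, 0.3, 0.2])   # swing weights elicited from stakeholders (made up)

    # Normalize each measure to a 0-1 value scale over the tradespace (linear value functions).
    lo, hi = raw.min(axis=0), raw.max(axis=0)
    value = (raw - lo) / (hi - lo)
    value[:, ~higher_is_better] = 1 - value[:, ~higher_is_better]

    total = value @ weights
    for name, v in sorted(zip(alternatives, total), key=lambda t: -t[1]):
        print(f"{name}: overall value {v:.3f}")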

  • Dr. Gregory S. Parnell is a Research Professor in the Department of Industrial Engineering at the University of Arkansas and Director of the M.S. in Operations Management (the university’s largest graduate program) and Engineering Management programs. He is also a principal and board member with Innovative Decisions Inc. His research focuses on decision and risk analysis. He was lead editor of Decision Making for Systems Engineering and Management (2nd Ed., Wiley and Sons, 2011), lead author of the Handbook of Decision Analysis, Wiley Operations Research/Management Science Series (Wiley and Sons, 2013), and editor of Trade-off Analytics: Creating and Exploring the System Tradespace (Wiley and Sons, 2017). He is a fellow of the Institute for Operations Research and the Management Sciences, the International Council on Systems Engineering, the Military Operations Research Society, and the Society for Decision Professionals. He has won numerous awards including the 2014 Frank P. Ramsey Medal for distinguished contributions to the field of decision analysis. He previously taught at West Point (Professor Emeritus), the U.S. Air Force Academy (Distinguished Visiting Professor), Virginia Commonwealth University, and the Air Force Institute of Technology. He has a PhD from Stanford University and is a retired Air Force Colonel.
phn logo image
Pediatric Heart Network

Collaborative Systems Analytics: Establishing Effective Clinical Practice Guidelines for Advancing Congenital Cardiac Care

The Pediatric Heart Network enlisted researchers with the Georgia Institute of Technology to create clinical practice guidelines (CPG) for pre-, intra-, and post-surgical care of patients with congenital heart defects (CHDs), the most common birth defect, impacting nearly 1 million children and 1.4 million adults in the U.S. Substantial variances in surgical practices to treat patients with CHDs among different healthcare centers were reflected in inconsistent surgical outcomes, some of which resulted in negative consequences for patients. By studying the nine leading U.S. pediatric centers, the researchers identified seven significant factors for influencing surgical outcome, and implemented a CPG that enables patients to be removed from breathing apparatuses earlier, lowered the rate of reintubation, and decreased the time patients need to remain in the intensive care unit. These guidelines also realized a cost savings of 27 percent, which translates to $13,500 per patient. 

lea pica image
Lea Pica

Founder
LeaPica.com

Starting Strong and Finishing Effectively: Techniques for Driving Actionable Results (Co-speaker with Tim Wilson)

You have the data. You have the tools to clean, integrate, and analyze the data. Yet, your analytics organization is struggling to effectively engage the business users it supports, and some of your team’s best analyses seem to die quietly after they’re presented rather than lead to action and business impact. This session tackles some of the likely culprits when this happens: a failure to adequately articulate and qualify hypotheses at the outset, and failing to effectively communicate the results of the analysis once it is completed. The practical tips and examples covered in this session are implementation-ready: you will be putting them to use as soon as you return to the office after the conference!

  • Lea Pica is the founder of LeaPica.com. She is a seasoned professional speaker, digital analytics practitioner, social media marketer, and blogger with over 13 years of experience building search marketing and digital analytics practices for companies like Scholastic, Victoria’s Secret, and Prudential. Today she trains hundreds of analysts and marketers each year, hosts the popular Present Beyond Measure Podcast (http://leapica.com/podcast), and blogs about stellar data presentation at leapica.com. Lea’s greatest passion is the stage, which is her platform for empowering digital practitioners and analytics consultants to present information in a way that inspires action and creates indispensability.
christina phillips image
Christina Phillips

University of Leeds

Facilitating the Shift Toward a Data Driven Culture Using Human Centric Analytics

In 2015, Forbes predicted that embedded analytics could save US companies $60B by 2020, but in order for companies to leverage the full capabilities of analytics in their organisations, the human-analytics interface must be robust and fit for purpose. This is especially so in uncertain environments where the reliance on human decision making and management is high. Models which only capture process and data can fall short of expectations and miss nuances which are held implicitly by the organisation. Action Research together with contextualised explanatory models derived through grounded analysis can enhance engagement and create new organisational pathways to understanding. In this talk you will hear about:
  • Multiple analytics interventions in a complex healthcare environment over 4.5 years
  • Successful ways to couple soft and hard methods so that contextualised models can be developed
  • A cultural shift in an organisation around demand, production planning and forecast use

  • The presenter has an eclectic background from tutoring in Physics and Statistics to running her own art and design company. Her specialism has always been in Mathematical Modelling but through recent in-industry research this has been extended to include ways to facilitate and maximise benefit from participative modelling and design. Her work has helped to improve operations and promote cultural change in a multinational organisation. This facilitated achievement of substantial savings across the organisation and moved it toward a more questioning data driven culture.

    She also worked as a Business Analyst and Researcher at Siemens Healthineers, providing senior management and staff with Business Intelligence and improving their data management. In this work she gained an understanding of the analytics needs of businesses and experienced first-hand many of the difficulties faced by organisations in moving toward a data driven culture.

robert phillips image
Robert Phillips

Director of Marketplace Optimization Data Science
Uber

Dynamic Pricing In A Two-sided Marketplace

We discuss setting prices in a two-sided marketplace (such as Uber) in which the platform can set prices for both the buyer and the seller. A key question in such marketplaces is how to measure the health of the underlying market and how to specify an objective function consistent with maintaining a healthy market. Pricing in a two-sided market is also a challenge because buyers and sellers operate on different time frames: demand reacts immediately to prices while supply requires more time to adjust. We discuss these issues and different approaches that can be used to set and update prices to balance supply and demand.
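A stylized sketch of the supply-demand imbalance idea, not any production pricing algorithm: demand responds to price immediately, supply adjusts toward its price-dependent level with a lag, and a multiplicative update nudges the price toward balance. All elasticities and constants are invented for illustration.

    # Stylized two-sided market: demand reacts to price immediately, supply adjusts
    # slowly toward its price-dependent target.
    def demand(price):            # buyers requesting service at this price (made-up curve)
        return 1000 * price ** -0.7

    def supply_target(price):     # sellers who would eventually come online at this price
        return 400 * price ** 0.9

    price, supply = 1.0, supply_target(1.0)
    for t in range(30):
        d = demand(price)
        imbalance = (d - supply) / max(supply, 1e-9)
        price *= 1 + 0.3 * imbalance                     # raise price when demand outstrips supply
        price = min(max(price, 0.5), 5.0)                # keep the multiplier in a sane band
        supply += 0.2 * (supply_target(price) - supply)  # supply catches up with a lag
        if t % 5 == 0:
            print(f"t={t:2d} price={price:.2f} demand={d:.0f} supply={supply:.0f}")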

  • Dr. Robert Phillips is Director of Marketplace Optimization Data Science at Uber where he leads a group of data scientists who develop and implement the analytics that empower Uber’s core businesses. Prior to joining Uber, Dr. Phillips was Professor of Professional Practice at Columbia Business School and Director of the Columbia Center for Pricing and Revenue Management. He has also served as a lecturer at the Stanford Business School. Dr. Phillips is founder of Nomis Solutions, a software and consulting company that helps financial service companies better manage pricing and profitability.

    Dr. Phillips has also served as CEO of Decision Focus Incorporated, a consulting company specializing in the application of business analytics in business and government. During his 25-year career in industry, he was a pioneer in the application of pricing and revenue optimization in many different industries including passenger airlines, cruise lines, rental cars, automotive, hotels, air cargo, trucking, container shipping, and financial services.

    Dr. Phillips is author of the book Pricing and Revenue Optimization and co-editor of The Oxford Handbook of Pricing Management as well as author of the forthcoming Pricing for Consumer Lending. In 2014 he was elected a Fellow of the International Federation of Operations Research and Management Science and in 2016 he was awarded the INFORMS Impact Award for his work as a pioneer in dynamic pricing and revenue optimization. 
    Dr. Phillips holds a Ph.D. in Engineering-Economic Systems from Stanford University and B.A. degrees in Economics and Mathematics from Washington State University.

jennifer priestley image
Jennifer Priestley

Associate Dean of The Graduate College and the Director of the Analytics and Data Science Institute
Kennesaw State University

Four Things that Women in Data Science Can Learn from Game of Thrones

Studies consistently find that women are underrepresented in most computational disciplines – particularly in Analytics and Data Science. And although events and organizations that encourage girls K-12 to learn to code have increased over the last few years, the number of college-age women in computational disciplines has not increased. Nor has the proportion of women in analytical leadership positions.

In this talk, one of the few female directors of a Ph.D. in Data Science will provide perspective on how the discipline can attract (and retain) more female talent. These points will be framed through the popular HBO Series Game of Thrones.

  • Dr. Priestley is the Associate Dean of The Graduate College and the Director of the Analytics and Data Science Institute at Kennesaw State University. In 2012, the SAS Institute recognized Dr. Priestley as the 2012 Distinguished Statistics Professor of the Year. She served as the 2012 and 2015 Co-Chair of the National Analytics Conference. Datanami recognized Dr. Priestley as one of the top 12 “Data Scientists to Watch in 2016.” Dr. Priestley has been a featured speaker at SAS Analytics, Big Data Week, Technology Association of Georgia, Data Science ATL, The Atlanta Chief Data Officer Summit, The Atlanta CEO Council, and dozens of corporate events addressing issues related to advanced analytics and the challenges and opportunities of “Big Data”. She is a member of the Advisory Board for the Southeastern Data Science Conference. She has authored dozens of articles on Binary Classification, Risk Modeling, Sampling, Statistical Methodologies for Problem Solving and Applications of Big Data Analytics as well as several textbook manuals for Excel, SAS, JMP and Minitab. Prior to receiving a Ph.D. in Statistics, Dr. Priestley worked in the Financial Services industry for 11 years. Her positions included Vice President of Business Development for VISA EU in London, where she was responsible for developing the consumer credit markets for Irish and Scottish banks. She also worked for MasterCard International as a Vice President for Business Development, where she was responsible for banking relationships in the Southeastern US. She also held positions with AT&T Universal Card and with Andersen Consulting. Dr. Priestley received an MBA from The Pennsylvania State University, where she was president of the graduate student body, and a BS from Georgia Tech. She also received a certification from the ABA Bankcard School in Norman, OK, a Certification in Base SAS Programming, and a Business Analyst Certification from the SAS Institute.
stuart price image
Stuart Price

Data Scientist
Elder Research

Unstructured Data Analysis For Classification And Anomaly Detection

The talk will explore the use of network and text analysis for classification and anomaly detection. Fraud, waste, and abuse of prescription drugs is a serious problem with both financial and health implications. When drugs are prescribed, a network is created between patients, doctors, and pharmacies. This network can be used to find anomalous behavior: patients exhibiting drug-seeking behavior by visiting multiple doctors, doctors writing excessive prescriptions, and pharmacies filling a skewed distribution of prescriptions. Text is another form of unstructured data that can augment understanding. Text analysis can be used to identify topics in product discussion threads, helping manufacturers understand key consumer issues.
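As one concrete, hypothetical illustration of the network signals described, the pandas sketch below builds per-patient features from prescription records and flags patients whose number of distinct prescribers is an outlier. The data are synthetic and the z-score rule is a simple stand-in for the richer models used in practice.

    import numpy as np
    import pandas as pd

    # Hypothetical claims extract: one row per filled prescription.
    rng = np.random.default_rng(2)
    claims = pd.DataFrame({
        "patient":  rng.integers(0, 500, 5000),
        "doctor":   rng.integers(0, 80, 5000),
        "pharmacy": rng.integers(0, 40, 5000),
    })
    # Inject a few synthetic "doctor shoppers" who see many distinct prescribers.
    shoppers = pd.DataFrame({"patient": np.repeat([900, 901], 25),
                             "doctor": rng.integers(0, 80, 50),
                             "pharmacy": rng.integers(0, 40, 50)})
    claims = pd.concat([claims, shoppers], ignore_index=True)

    # Per-patient features derived from the patient-doctor-pharmacy network.
    feats = claims.groupby("patient").agg(
        n_scripts=("doctor", "size"),
        n_doctors=("doctor", "nunique"),
        n_pharmacies=("pharmacy", "nunique"),
    )
    # Simple anomaly score: z-score of distinct prescribers; real work would combine many
    # such network signals (and text features) in a proper anomaly-detection model.
    z = (feats.n_doctors - feats.n_doctors.mean()) / feats.n_doctors.std()
    print(feats.assign(z_doctors=z.round(2))
               .sort_values("z_doctors", ascending=False).head())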

  • Dr. Stuart Price is a data scientist with Elder Research where he has worked applying machine learning and optimization to problems in insurance fraud, text analysis, and healthcare. Stuart earned a BA in Physics and Mathematics from Hendrix College, and an MS in Applied Mathematics and a PhD in Operations Management from the University of Maryland, College Park.
Pujanauski image
Brian Pujanauski

Vice President of Client Services
Applied Predictive Technologies

Decisions with Confidence: Using Experiments and Predictive Modeling to Make Roll-Out Decisions

Data analytics has been inundated with a multitude of technical jargon and buzzwords. At the end of the day, when it comes to analytics there are two key steps: Diagnosis and Prescription. Are you (1) diagnosing, that is, understanding and forecasting the situation you are in or will be in, or (2) prescribing, that is, deciding what to do? Many of the most popular machine learning techniques are fantastic for diagnosing what is happening or predicting what will happen. What can be difficult in certain circumstances is figuring out what to do given a current (or future) situation. This talk will focus on how to move analytics from Diagnosis to Prescription in order to make more confident decisions.
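A minimal, hypothetical illustration of moving from diagnosis to prescription with a test-vs-control experiment: estimate the lift and its confidence interval, then compare the conservative end of the interval to a break-even threshold before recommending a roll-out. The data and the threshold are invented.

    import numpy as np

    rng = np.random.default_rng(3)
    # Hypothetical experiment: 50 test stores ran the initiative, 150 matched controls did not.
    test = rng.normal(1020, 180, 50)      # weekly sales ($) in test stores
    control = rng.normal(1000, 180, 150)  # weekly sales ($) in control stores

    lift = test.mean() - control.mean()
    se = np.sqrt(test.var(ddof=1) / test.size + control.var(ddof=1) / control.size)
    lo, hi = lift - 1.96 * se, lift + 1.96 * se   # approximate 95% CI on the lift

    required_lift = 15.0   # break-even lift per store per week given roll-out cost (made up)
    print(f"estimated lift: {lift:.1f}  (95% CI {lo:.1f} to {hi:.1f})")
    if lo > required_lift:
        print("Prescription: roll out; even the conservative end of the CI clears break-even.")
    elif hi < required_lift:
        print("Prescription: do not roll out; the initiative cannot pay for itself.")
    else:
        print("Prescription: inconclusive; consider a larger or longer test before deciding.")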

  • Brian Pujanauski is a Vice President of Client Services at Applied Predictive Technologies. Brian joined APT after completing his graduate work in Organic Chemistry at the University of California-Berkeley. At APT, Brian has specialized in analytic development and applications, including the development of new software products and novel use-cases for APT’s Test & Learn software platform across multiple industries such as retail, insurance, pharma, and politics.

nancy pyron image
Nancy Pyron

Director of Operations Research and Revenue Management
Starwood Hotels

Predictive Modeling for Contract Pricing

For many companies, annual contracts consume a significant portion of company assets; whether it’s a room in a hotel, a seat on an airplane, or space on a long-haul truck. Often these customers are perceived as a blessing during low demand periods and a curse during high demand. Setting the right combination of price and contract conditions for each potential customer is critical to ensuring the customer is an overall benefit to the company. What are the core components required to build and maintain a rigorous contract pricing system? What data is really needed … Historical? Customer specific? Capacity? Competitive? What predictive analytic methods work best … Simple regression? Complex machine learning? Or something in between? What about pricing models? Finally, what type of system requirements will be needed for various combinations of data, predictive complexity, and pricing logic?
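One common ingredient of such a system, sketched here with entirely synthetic data, is a win-probability model by quoted rate combined with expected contribution to choose the rate to offer a prospective account. The model form, features, and cost figures are assumptions for illustration only.

    import numpy as np
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(4)
    # Hypothetical history of contract quotes: quoted nightly rate, account size (room
    # nights per year), and whether the account accepted the contract.
    n = 3000
    rate = rng.uniform(90, 220, n)
    room_nights = rng.integers(200, 5000, n)
    true_p = 1 / (1 + np.exp(0.05 * (rate - 150) - 0.0002 * room_nights))
    won = rng.binomial(1, true_p)

    X = np.column_stack([rate, room_nights])
    model = LogisticRegression(max_iter=1000).fit(X, won)

    # Price one prospective account: sweep candidate rates and pick the one maximizing
    # expected contribution = P(win at rate) * (rate - variable cost) * annual volume.
    variable_cost, volume = 60.0, 1500
    candidates = np.arange(100, 221, 5.0)
    p_win = model.predict_proba(
        np.column_stack([candidates, np.full_like(candidates, volume)]))[:, 1]
    expected = p_win * (candidates - variable_cost) * volume
    best = expected.argmax()
    print(f"recommended rate: ${candidates[best]:.0f}  "
          f"(P(win)={p_win[best]:.2f}, expected contribution=${expected[best]:,.0f})")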

  • Nancy was a Senior Director of Revenue Management and Operations Research at Marriott International and Starwood Hotels from 2010 through 2017. At Starwood, she built an analytics team to create, design, and implement a new revenue management system. The team’s focus was to employ the most recent advances in predictive analytics, machine learning, and statistics to drive revenue and respond to user needs. Nancy began her Revenue Management journey in 1997 with Aeronomics as an Operations Research analyst for Revenue Management Systems. Her career includes user training, implementation consulting, blending the needs of business, technical, and analytics teams, and creating long term team strategy and vision. Nancy received an MS in Operations Research from Stanford and a BS in Applied Mathematics from the University of North Texas.
shanshan qiu image
Shanshan Qiu

Analytics Scientist
Ford Motor Company

A Quality Analytics Decision Tool

In this project, we developed a quality analytics decision tool to reduce scrap in a plant by following the descriptive-predictive-prescriptive approach. In the plant's original process, production shifts that cause scrap were identified weeks later based on cost performance rather than on signals from the production process. We first built a near-real-time visualization tool that displays the time series plots, control charts, distributions, etc. of all the process parameters of all departments in the plant, so that engineers can watch for trends or shifts in the data. We then developed machine learning models to identify the key parameters that cause product failures and scrap. This tool allows the engineers to identify problems in the production line in a timely and efficient manner. In this work, we were able to make a seamless connection between data stored in a Hadoop system, robust analytics models, and solution deployment.
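As a small illustration of the descriptive layer, the sketch below computes an individuals control chart (center line and 3-sigma limits estimated from the average moving range) for one process parameter and flags out-of-control readings; the data and the injected shift are synthetic and not from the plant described.

    import numpy as np
    import pandas as pd

    rng = np.random.default_rng(5)
    # Hypothetical hourly readings of one process parameter; a shift is injected near the
    # end to mimic the kind of drift that eventually produces scrap.
    values = np.concatenate([rng.normal(50.0, 1.2, 160), rng.normal(53.0, 1.2, 20)])
    readings = pd.Series(values, name="parameter")

    # Individuals chart: estimate sigma from the average moving range (MR-bar / 1.128).
    baseline = readings.iloc[:100]                 # assume the first 100 points are in control
    center = baseline.mean()
    mr_bar = baseline.diff().abs().mean()
    sigma = mr_bar / 1.128
    ucl, lcl = center + 3 * sigma, center - 3 * sigma

    out_of_control = readings[(readings > ucl) | (readings < lcl)]
    print(f"center={center:.2f}  UCL={ucl:.2f}  LCL={lcl:.2f}")
    first = out_of_control.index.min() if len(out_of_control) else "n/a"
    print(f"{len(out_of_control)} out-of-control points, first at index {first}")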

  • Shanshan Qiu is an Analytics Scientist at Ford Motor Company. She obtained her Ph.D. in Industrial Engineering in 2014. Since joining Ford, she has worked on Manufacturing Analytics, Pricing Optimization, and Dealer Risk Analytics using OR, statistical, and machine learning models.
anu raman image
Anu Raman

Monsanto

You Have A Model, So What?

Simulation and optimization models are important tools for improving our seed processing business process. At Monsanto, recommendations from such models have resulted in business investment decisions ranging from equipment upgrades and re-design of shop floors to improved scheduling and resource utilization. Models in early stages require a continuous cycle of visualization, validation and improvement and this testing cycle is typically done by groups that support our Production operations such as Production Research and Innovation Engineering. As these models mature, multiple teams and functions come together to coalesce around the commitment to transforming the model into a script. Agreement on this tipping point requires not only confidence in the model but also in our ability to execute the model at a commercially significant scale with a clear Return on Investment. Putting the pieces together into a clear business value proposition is critical for leadership approval and investment in the technical ecosystem needed to operationalize the script. Some components of this technical ecosystem include automatic script generation, real-time data management and integration into our operational suite of tools. Finally, true value comes from our production operations teams understanding the mechanics of the model, how they will be responsible for its execution and a clear understanding of the cost savings or quality improvements due to their operational commitment. It truly requires the entire Monsanto village to make the model “real” and to ultimately deliver a better product to Monsanto’s farmer customer.

  • Anu Raman is a domain expert on Monsanto’s seed advancement pipeline and has 20 years of experience designing, deploying and supporting an increasingly sophisticated landscape of IT systems to the Breeding and Production organizations globally. As an IT Strategist for the Global Seed Production and Quality organizations, she is responsible for the global alignment of IT and Business strategies in these teams. An important aspect of her role is to evaluate and recommend new technologies, solutions and products to deliver value to her business partners. Her passion for agriculture and the promise of what new technologies can deliver is shared with her colleagues and results in a commitment to deliver the best product possible to Monsanto’s farmer customers. Anu has a B.S. in Zoology from Michigan State University and an M.S. in Biochemistry from Case Western Reserve University.
eva regnier image
Eva Regnier

Associate Professor in the Graduate School of Business and Public Policy
Naval Postgraduate School

Prepare or Wait? The Marine Forces Reserve Hurricane Decision Simulator

U.S. Marine installations in coastal locations face a difficult problem – maintaining mission readiness and personnel safety while under threat from hurricanes. Preparing for a hurricane is the archetypal multi-stage decision under uncertainty. For leaders, it has very high stakes – even early preparation steps may incur $10 M in direct, unrecoverable cost. But failing to prepare can be even worse. This challenge is compounded by another common analytics problem – a gap between the people who generate information and the people who use it, and a resulting mismatch between the information provided and the decision processes that use it.

To help Marines train for hurricane preparation, NPS developed the Hurricane Decision Simulator (HDS), an online choose-your-own-adventure training tool for U.S. Marine Forces Reserve. It has so far been used successfully at Marine Forces Reserve headquarters for two seasons, and at a small training center for one year, including a command handover just a few months before Hurricane Irma.
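The underlying prepare-or-wait trade-off can be illustrated with a tiny expected-cost comparison; all probabilities and costs below are invented and are not those used by Marine Forces Reserve or the HDS.

    # Toy two-stage "prepare or wait" comparison. All probabilities and costs are invented;
    # the real simulator works from forecast products and installation-specific timelines.
    P_HIT = 0.25                 # forecast probability the storm affects the installation
    COST_PREP_EARLY = 10.0       # $M: full preparation started now (mostly unrecoverable)
    COST_PREP_LATE = 14.0        # $M: rushed preparation if we wait and the threat holds
    COST_UNPREPARED_HIT = 60.0   # $M: damage/readiness loss if hit without full preparation
    P_LATE_PREP_FAILS = 0.3      # chance rushed preparation cannot finish in time

    expected_prepare_now = COST_PREP_EARLY
    expected_wait = P_HIT * (P_LATE_PREP_FAILS * COST_UNPREPARED_HIT
                             + (1 - P_LATE_PREP_FAILS) * COST_PREP_LATE)
    print(f"E[cost | prepare now] = ${expected_prepare_now:.1f}M")
    print(f"E[cost | wait]        = ${expected_wait:.2f}M")
    print("Decision:", "prepare now" if expected_prepare_now < expected_wait else "wait")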

  • Eva Regnier is an Associate Professor in the Graduate School of Business and Public Policy at the Naval Postgraduate School, where she teaches analytics to full-time MBA students and working professionals in the Navy and Marine Corps. Her research integrates information in multi-stage decision processes, helping to bridge the gap between professionals who generate and analyze data and forecasts and those who use that information. She has trained and helped develop products with Navy meteorology and oceanography organizations, as well as operational organizations. Her research has been funded by the National Science Foundation, Office of Naval Research, the Joint Typhoon Warning Center, the Marine Forces Reserve, and other Department of Defense organizations. Eva Regnier holds a Ph.D. in Industrial Engineering and an M.S. in Operations Research from the Georgia Institute of Technology, and a B.S. from the Massachusetts Institute of Technology.
bill roberts image
Bill Roberts

Managing Director
Deloitte Consulting LLP

Cognitive Data Science: the Correct Algorithm Makes All the Difference

The talk discusses data science for strategic business insight and for the solution of new business problems using advanced cognitive algorithms. The importance of using the right algorithm for a given business challenge is explored through real-life examples.

  • Bill specializes in data science, algorithms and advanced analytics. He has over 20 years of experience in industry, government, and academia in the application of machine learning and stochastic modeling to real-world problems in a variety of industries. He has developed and deployed algorithmic solutions in the oil and gas industry for well resource optimization, warning detection, text processing, and hydraulic fracturing impact estimation.

    He has held teaching and visiting academic positions at research and academic entities including George Mason University, George Washington University, and Tokyo Institute of Technology. He has authored or co-authored over 20 academic publications on machine learning, stochastic modeling, and acoustics. He has received awards for innovation, a best paper award from the international Speech Communication journal, and has notable performances in analytics competitions including runner-up in the Netflix movie recommendation competition.

    Notable industry and government achievements include algorithms for voice recognition, sales performance estimation, opioid abuse detection, retail loss prevention estimation, and customer attrition reduction.
    Bill has B.E. (Hons) and B.Sc.(hons) degrees from the University of Adelaide, Adelaide, Australia, and a Ph.D. degree from George Mason University, Fairfax VA.

Schneider logo image
Schneider

Chassis Leasing and Selection Policy for Port Operations

Port cargo drayage operations manage the movement of shipping containers that arrive and depart on ocean-going container vessels and are transported over the road to and from inland trans-loading facilities. While containers are on land they are placed on wheeled chassis until they return to the port facility. A significant operational challenge is the acquisition and management of these chassis. While many port drayage operators simply lease chassis on a per-day basis as demand warrants, Schneider National has determined that an analytics-driven policy that combines long-term leasing with daily rental leads to significant cost savings while improving both service and reliability. We present and implement a solution methodology that addresses the two decision problems that arise with this dual sourcing approach: 1) the optimal fleet size for leased chassis and 2) a real-time decision policy for selecting between rental and leased chassis as containers are received. As we demonstrate, our solution represents an integrated approach that combines the three general areas of analytics methodology and incorporates a particularly novel interplay of optimization, simulation, and predictive modeling. We conclude with an analysis of the financial benefit that has been achieved and a discussion of the applicability of our methodology to other problem settings.
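A Monte Carlo sketch of the first decision, choosing the long-term leased fleet size under a simple dual-sourcing cost model: leased chassis incur a daily cost whether used or not, and any overflow demand is covered by more expensive daily rentals. The demand distribution and rates are hypothetical, not Schneider's.

    import numpy as np

    rng = np.random.default_rng(6)

    # Hypothetical daily chassis demand at one port (containers on wheels that day),
    # drawn with extra day-to-day variability (over-dispersed relative to Poisson).
    def sample_daily_demand(days=365):
        return rng.poisson(lam=rng.gamma(shape=40, scale=3, size=days))

    LEASE_PER_DAY = 12.0    # $/chassis/day for long-term leased units (paid whether used or not)
    RENT_PER_DAY = 35.0     # $/chassis/day for spot daily rentals covering any shortfall

    def annual_cost(fleet_size, n_sims=200):
        costs = []
        for _ in range(n_sims):
            demand = sample_daily_demand()
            rentals = np.maximum(demand - fleet_size, 0)          # overflow served by rentals
            costs.append(365 * fleet_size * LEASE_PER_DAY + rentals.sum() * RENT_PER_DAY)
        return np.mean(costs)

    results = {f: annual_cost(f) for f in range(60, 201, 10)}
    for f, c in results.items():
        print(f"  {f:4d} leased -> expected annual cost ${c:,.0f}")
    print("recommended leased fleet size:", min(results, key=results.get))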

Michael Schuldenfrei image
Michael Schuldenfrei

Senior Member of Executive Management Team
Optimal+

Building a Digital Thread with Machine Learning & IoT Analytics in Complex Manufacturing Environments

In this session, Michael will discuss creating a multi-industry and value-chain digital thread to give brand owners “cradle to grave” visibility into their manufactured semiconductor and electronics products.

  • Michael is a senior member of the Optimal+ executive management team and contributes more than 30 years of software and information technology experience. He leads the definition of the company’s strategic product roadmap, evangelizes its solutions to customers, partners and industry forums and has published numerous visionary articles on its products. In his current and previous roles at Optimal+, Michael designed many of the technologies at the heart of the company’s solutions.

    Previously, Michael was a Senior Software Architect at SAP, where he led the development of Duet, a joint venture with Microsoft to enable seamless access to SAP data via Microsoft Office. Before joining SAP, Michael was a Software Architect at Microsoft, where he consulted with the company’s major customers. He was also Vice President R&D, at ActionBase, a company providing business-management enterprise solutions to enhance internal organizational workflow and collaboration.

Bhavna Sharma image
Bhavna Sharma

Post-doctoral Research Associate in the Environmental Sciences Division
Oak Ridge National Lab

Simulation Modeling for Reliable Biomass Supply Chain Design Under Operational Disruptions

Cellulosic biofuels made from wood, grasses, or non-edible parts of plants are promising and sustainable replacements for petroleum-based fuels and chemicals. Currently, the cellulosic biofuel industry relies on a conventional system where feedstock is harvested, baled, stored locally, and then delivered in a low-density format to the biorefinery. However, the conventional supply chain system causes operational disruptions at the biorefinery, mainly due to seasonal availability, handling problems, and quality variability in biomass feedstock. Operational disruptions decrease facility uptime and production efficiencies and increase maintenance costs. In the bioenergy industry, for a low-value, high-volume product where margins are very tight, system disruptions are especially problematic. In this work we evaluate an advanced system strategy in which a network of biomass processing centers (depots) is used to store and preprocess biomass into stable, dense, and uniform material in order to reduce feedstock supply disruptions and facility downtime, and thus boost economic returns to the bioenergy industry. A database-centric discrete event supply chain simulation model was developed, and the impact of operational disruptions on supply chain cost, inventory and production levels, and uptime, downtime, ramp-up time, and starved time for the operations was evaluated. The variation in uptime at the depot and biorefinery directly affects the processing operations at both entities and thus impacts the reliability of the biomass supply chain system. The talk will outline:
  • Conventional biomass supply chain challenges and strategies to address those challenges
  • Implementation of a dynamic multi-biomass, multi-product, multi-form, database-centric discrete event supply chain simulation model to evaluate the impact of process failures and strategies to cope with them
  • An industrial case study for a cellulosic biorefinery in Iowa offering insights on the impacts of operational disruptions and on managing operational risks
The audience will come away understanding the challenges and strategies involved in designing a reliable supply chain so that the system can hedge against potential operational disruptions. Lessons learned from this work will serve other industries in managing endogenous uncertainties and gaining a legitimate competitive advantage against such disruptions.
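To give a flavor of the discrete-event approach, below is a minimal sketch (using the third-party simpy package) of a single processing line subject to random disruptions, tracking uptime and throughput. The failure and repair rates are invented, and the actual model is far richer: multi-biomass, multi-product, multi-form, and database-driven.

    import random
    import simpy

    random.seed(7)

    MTBF_HOURS = 60.0          # mean time between process disruptions (invented)
    MTTR_HOURS = 8.0           # mean time to recover from a disruption (invented)
    RATE_TONS_PER_HOUR = 25.0
    SIM_HOURS = 24 * 90        # one 90-day processing campaign

    class Line:
        def __init__(self, env):
            self.env = env
            self.tons = 0.0
            self.downtime = 0.0
            env.process(self.run())

        def run(self):
            while True:
                up = random.expovariate(1.0 / MTBF_HOURS)        # time until the next failure
                yield self.env.timeout(up)
                self.tons += RATE_TONS_PER_HOUR * up             # material processed while up
                repair = random.expovariate(1.0 / MTTR_HOURS)    # disruption and recovery
                yield self.env.timeout(repair)
                self.downtime += repair

    env = simpy.Environment()
    line = Line(env)
    env.run(until=SIM_HOURS)
    uptime = 1 - line.downtime / SIM_HOURS
    print(f"uptime {uptime:.1%}, throughput {line.tons:,.0f} tons over {SIM_HOURS / 24:.0f} days")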

  • Dr. Sharma is a Post-doctoral research associate in the Environmental Sciences Division at the Oak Ridge National Laboratory. The primary focus of her work is developing mathematical, simulation, and spatial models for agriculture supply chain management, environmental conservation, biomass assessment, siting analysis, and agricultural risk analysis. Dr. Sharma received Master of Science degrees in Food Engineering and in Industrial Engineering and Management from Punjab Agricultural University and Oklahoma State University (OSU), respectively. She received her Ph.D. in Biosystems and Agricultural Engineering from OSU.
mona siddiqui image
Mona Siddiqui

Chief Data Officer
Department of Health and Human Services

Creating Value from Data at HHS

The Department of Health and Human Services collects an enormous amount of data on the health of individuals and on the performance of providers, health systems, and states. While this data has been used for its primary reporting purposes, capturing value from it in the form of evidence-based decision making has been elusive. This talk will describe the journey that the Department has embarked on to change this culture.

  • Dr. Siddiqui is currently serving as the Chief Data Officer for the Department of Health and Human Services, where she is focused on building an enterprise-wide data governance structure and streamlined data use agreements, creating a business analytics unit, and developing strategic partnerships for data sharing with external stakeholders. Prior to this role, Dr. Siddiqui was at the Center for Medicare and Medicaid Innovation working towards implementing a rapid cycle testing platform for payment models. She has also previously served with the White House Social and Behavioral Sciences Team (“nudge” unit), leading work in the health space. She has an MD from the Johns Hopkins School of Medicine and a master’s in quantitative methods from the Harvard School of Public Health.
manjeet singh image
Manjeet Singh

DHL Supply Chain

Connected Marketplace Dynamics

The emergence of rapid fulfillment models has caused major disruption to the retail space. Consumers have access to a wide range of products that are delivered faster, but at costs similar to physically shopping at retail stores. These same-day models have high costs of entry and are dominated by a few large players. The increasing popularity of these models, and the consequently increasing size of these big players, has created strong turbulence in the retail landscape. Left unchecked, this will ultimately render much of traditional retail obsolete. Technology is the key behind the metamorphosis of the shopping experience. Therefore, on the bright side, with the help of a good technology partner, retailers have the opportunity to leverage existing assets and reap the benefits of this new wave. We propose a novel solution, V3, that could be leveraged to tap this market. V3 intelligently gives consumers visibility into local inventories and optimizes crowd-sourced agents to pick and deliver products at a fraction of the cost of current models. V3 orchestrates a natural connected marketplace by building density and leveraging underutilized capacity based on a shared-sourcing principle that no current model can compete with.
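
The agent-to-order matching that V3 is described as performing can be illustrated, in a heavily simplified form, as a minimum-cost assignment problem. The sketch below is not DHL's V3 system; it simply matches a few made-up orders to made-up crowd-sourced couriers using SciPy's linear_sum_assignment, minimizing total travel distance.

```python
import numpy as np
from scipy.optimize import linear_sum_assignment

# Hypothetical data: x/y coordinates of open orders and available crowd agents.
rng = np.random.default_rng(0)
orders = rng.uniform(0, 10, size=(5, 2))   # 5 orders to fulfill
agents = rng.uniform(0, 10, size=(8, 2))   # 8 idle crowd-sourced couriers

# Cost matrix: travel distance from each agent to each order.
cost = np.linalg.norm(agents[:, None, :] - orders[None, :, :], axis=2)

# Minimum-cost matching of agents to orders (each order gets one agent).
agent_idx, order_idx = linear_sum_assignment(cost)
for a, o in zip(agent_idx, order_idx):
    print(f"agent {a} -> order {o} (distance {cost[a, o]:.2f})")
print("total travel distance:", cost[agent_idx, order_idx].sum().round(2))
```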

  • Manjeet specializes in complex data analytics and mathematical modeling across DHL’s Global Center of Design Excellence. He leads the Operations Science team, which has worked on numerous research projects leading to the development of various OR tools and processes used at DHL Supply Chain. Prior to joining DHL in February 2015, he worked at Netjets in their Business Insights & Analytics department, developing mathematical and heuristic tools for pilot scheduling, capacity planning, and network modeling for spare parts inventory. He has extensive research experience, culminating in a PhD in Industrial and Systems Engineering from Ohio University. Manjeet has published papers in distinguished technical journals and frequently presents at industry and academic conferences.
giri tatavarty image
Giridhar Tatavarty

Director of Data Science
84.51°

Doing Data Science at Scale

The deluge of big data and open source machine learning tools creates both opportunities and challenges for scalability. Doing data science at scale involves a radically different approach to picking the right processes and tools for data preparation, data management, analysis, visualization, automation, and execution of models. There are many problems where adding large datasets to model training leads to impressive results and better models (such as spell check, image recognition, and recommender systems). However, there are also cases where any amount of additional data yields only insignificant improvement. It is important to understand the nature of the problem and then decide what scale and data are needed to solve it in a meaningful and efficient manner. The talk will mainly cover three types of scalability: the amount of data, the number of models, and, finally, organizational scale with respect to the number of data scientists and problems being solved. The talk will also touch upon real-life case studies in the retail industry where such scalable processes touch millions of customers.

  • Giri Tatavarty is a Director of Data Science at 84.51° (a wholly owned subsidiary of The Kroger Co). He has 15 years of experience in retail analytics and data science and has led teams across Data Engineering, Architecture, and Data Science. He has extensive experience with data science problems for some of the world’s largest retailers, such as Kroger, Tesco, Macys, Home Depot, and others, and has designed and implemented machine learning pipelines for these customers. His areas of focus are scalable machine learning, time series forecasting, streaming analytics, natural language processing, and visualization tools. Giri has previously worked for dunnhumby and Oracle Corporation. Apart from his work at 84.51°, Giri also teaches Cloud Computing and Big Data courses at the University of Cincinnati. He has a master’s degree in Computer Science and Engineering from the University of Cincinnati.
douglas thomas image
Douglas Thomas

Research Economist for the Engineering Laboratory’s Applied Economics Office
National Institute of Standards and Technology

Economic Decision Making for Sustainable Manufacturing

The manufacturing sector accounts for 17% of GDP, making it a critical component of the US economy. Additionally, the sector accounts for 35% of the US economy’s environmental impact; therefore, manufacturers are increasingly focused on minimizing their portion of the environmental impact while also maintaining their industrial competitiveness. Although there are accepted practices for economic decision making, there are currently no economic standards that incorporate environmental life-cycle assessment. Economists at the National Institute of Standards and Technology (NIST), who have developed widely adopted software tools and economic standards for buildings, have proposed a methodology that can be used by firms to make economic decisions regarding sustainable manufacturing. The methodology provides a procedure for evaluating investments by coupling environmental life-cycle assessment (LCA) with economic methods for investment analysis. This presentation will discuss the following topics regarding the proposed method:

  • Types of economic decisions in sustainable manufacturing
  • Standards and accepted practices for economic decision making
  • Environmental life-cycle assessment
  • Methods for assessing tradeoffs between cost and sustainability
  • Sensitivity analysis
  • Case study example in sustainable manufacturing

The economic methods discussed in this presentation include net present value, internal rate of return, payback period, and hurdle rates. Six methods are presented for examining the tradeoffs between sustainability and expenditures. Monte Carlo techniques are presented as a method for sensitivity analysis and the presentation will conclude by illustrating the method using an example case study. Audience members should walk away with knowledge on sustainable economic decision making to answer questions such as: which of five proposed investments should my firm select; what is the tradeoff between expenditures and sustainability for two investments; and is an investment a sustainable and cost-effective endeavor. This dual methodology will equip individuals with the tools to evaluate the tradeoffs between competing economic and environmental priorities.
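
To make the economic methods named above concrete, the sketch below computes net present value, internal rate of return, and payback period for a hypothetical investment and runs a simple Monte Carlo sensitivity analysis on the NPV. The cash flows, hurdle rate, and uncertainty range are invented for illustration and are not part of the NIST methodology.

```python
import numpy as np

# Hypothetical investment: -$100k up front, then five years of net savings.
cash_flows = np.array([-100_000, 28_000, 28_000, 28_000, 28_000, 28_000])
hurdle_rate = 0.08

def npv(rate, flows):
    """Net present value of a series of end-of-year cash flows."""
    years = np.arange(len(flows))
    return float(np.sum(flows / (1 + rate) ** years))

def irr(flows, lo=-0.99, hi=1.0, tol=1e-6):
    """Internal rate of return via bisection on the NPV sign change."""
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if npv(mid, flows) > 0:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

def payback_period(flows):
    """First year in which cumulative (undiscounted) cash flow turns positive."""
    cum = np.cumsum(flows)
    positive = np.nonzero(cum >= 0)[0]
    return int(positive[0]) if positive.size else None

print("NPV at hurdle rate:", round(npv(hurdle_rate, cash_flows), 2))
print("IRR:", round(irr(cash_flows), 4))
print("Payback period (years):", payback_period(cash_flows))

# Monte Carlo sensitivity: let annual savings vary +/-20% around the estimate.
rng = np.random.default_rng(42)
sims = []
for _ in range(10_000):
    savings = 28_000 * rng.uniform(0.8, 1.2, size=5)
    sims.append(npv(hurdle_rate, np.concatenate(([-100_000], savings))))
sims = np.array(sims)
print("P(NPV > 0):", round(float((sims > 0).mean()), 3))
```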

  • Douglas S. Thomas is a research economist for the Engineering Laboratory’s Applied Economics Office at the National Institute of Standards and Technology (NIST). Currently, his activities are focused in two areas of research: 1) manufacturing industry costs and resource consumption and 2) methods for economic decision making in the adoption of technologies and processes in manufacturing. The first area includes measuring and tracking the US manufacturing supply chain using methods such as economic input-output analysis. The second area of research studies barriers to technology and process adoption in manufacturing as well as identifies methods for economic decision making in the adoption of technologies and processes. Thomas has published journal articles, book chapters, and technical reports on a variety of economic topics. In the course of his research, he has examined the cost effectiveness of a number of investments, including additive manufacturing. In addition to presenting at multiple INFORMS conferences, he has given more than 30 professional presentations, including those at the Louisiana Natural Resources Symposium; PDES Inc. Workshop on Additive Manufacturing; ASTM E06 Symposium: Idea to Impact- How Building Economic Standards keep you on Track; SAVE Value Summit; Production Operations Management Society Annual Conference; International Input-Output Conference; International Symposium on Fire Economics; International Association of Wildland Fire: Fire Behavior and Fuels Conference; and IAWF’s Human Dimensions of Wildland Fire Conference, to name a few. Thomas has also given a number of presentations at the National Institute of Standards and Technology, including seminars on the economics of manufacturing, courses on investment analysis, and talks to motivate health and safety in the workplace. He has led multiple NIST projects on the economics of manufacturing, and is a member/participant in the International Input-Output Association, ASTM, Construction Industry Institute, and the Production Operations Management Society.
luke thompson image
Luke Thompson

VP of Politics and Advocacy
Applecart

Beyond Beer Preferences: New Data Sources for Targeting

Machine Learning and other advances in automating model selection, combined with increased processing speed, have reduced the edge once offered by domain expertise and clever model selection. When everyone is running the same models on the same data, how can firms distinguish themselves? Find new data. Some efforts have tried to explore intrapersonal sources of data, typically construed as “psychographics”. But the key to finding real, meaningful, durable, and actionable divisions in the population – the latent heterogeneity at the heart of the entire exercise – actually lies in interpersonal relationships. Finding, mapping, and measuring that data can unlock new analytic opportunities.

  • Luke Thompson is VP of Politics and Advocacy at Applecart. He previously worked for Right to Rise USA and the National Republican Senatorial Committee. He has a PhD in political science from Yale University.
cenk tunasar image
Cenk Tunasar

Principal in Data & Analytics focused on Artificial Intelligence, Machine Learning, Optimization and Simulation
KPMG

L3adersh1p: Influencing the D&A Community

Analytics is undeniably at center stage for decision making. As demand increases for data- and analytics-driven solutions, organizations have pivoted toward establishing new roles and have gotten quite creative in naming them, with CDO winning the race by a landslide. Scientist, data scientist, data archaeologist, data engineer, software engineer, modeler, economist, operations researcher, and many more such titles describe a group of us doing sometimes quite similar work, yet somehow we get quite sensitive and defend the differences. What is the best way to recruit, train, grow, and keep this community of analytics people? Of course there is the age-old functional-versus-business divide. But do we need a third dimension, introducing another concept as a catalyst to bond these seemingly similar but rather different subgroups within analytics? This talk describes lessons learned through multiple organizational constructs and offers suggestions for keeping this community intact, relevant, and happy.

  • Cenk is a Principal in KPMG’s Data & Analytics practice focused on Artificial Intelligence, Machine Learning, Optimization and Simulation. Cenk has 20 years of experience across multiple industries including financial services, health, defense and transportation – utilizing advanced analytics to solve ridiculously interesting problems! His functional expertise includes the development of decision support solutions utilizing predictive analytics, artificial intelligence, combinatorial decision theory and optimization. One of Cenk’s particular interests is the concept of trust in analytics, specifically the accountability in algorithms that make decisions for us. Cenk has a Ph.D. in Operations Research from the University of Pittsburgh.

turner logo image
Turner

Turner Blazes a Trail for Audience Targeting on Television with Operations Research and Advanced Analytics

Turner has designed and implemented innovative and integrated forecasting and optimization models that power audience targeting solutions disrupting decades-old paradigms and business processes in the media industry, and producing significant sales and advertisement efficiencies for Turner and its clients. Turner is on track to sell 50 percent of its inventory through audience targeting by 2020, representing billions in ad revenue.

usaf logo image
United States Air Force
The Secretary of the Air Force signed our winning nomination for the INFORMS Prize. We highlighted three exemplar topics of aircraft repair, nuclear deterrence, and testing. We had 17 additional topics in operational effectiveness, logistics, manpower, acquisitions, and cost analysis. Leaders from Congress, U.S. and allied defense forces, industry, professional societies, and academia endorsed the Air Force. We summarized recent Air Force research, publications, and awards. We discussed the history of operations research in the Air Force along with contributions to the foundations of the field. We conclude with how the Air Force develops, trains, and organizes analysts, and with suggestions on submitting for the INFORMS Prize.
sridhar vaithianathan image
Sridhar Vaithianathan

Associate Professor and Heads the Business Analytics Department
Institute of Management Technology (IMT)

Predictive Model to Proliferate Spend Propensity Among Freemium Mobile Gamers

Within the freemium mobile game industry, there is a significant difference between engaging players and monetizing them. At present, marketing and player loyalty efforts focus more on engaging players than on converting them into spenders.

We have recently developed a predictive modelling system that studies players’ spending behaviour in a game by analysing the gaming data captured by telemetry systems. Using a machine learning algorithm, players are classified into three segments based on weekly player behaviour correlations and engagement/disengagement snapshots. The system also suggests specific strategies to deploy based on the recommended player category. This proof of concept (POC) is both scalable and platform (iOS, Android) independent across the mobile portfolio.

Our system predicts spend propensity with more than 85% accuracy and automatically highlights areas for improvement in design and economy, helping a game team create a road map of the best actions for various player segments.
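
As a loose sketch of the kind of segment classifier described above (not the authors' proof of concept), the example below trains a scikit-learn gradient boosting model on synthetic weekly-behaviour features and reports per-segment performance; all features, labels, and thresholds are made up.

```python
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import classification_report

# Synthetic weekly telemetry features per player (all hypothetical).
rng = np.random.default_rng(7)
n = 2000
X = np.column_stack([
    rng.poisson(5, n),            # sessions per week
    rng.exponential(30, n),       # minutes per session
    rng.poisson(2, n),            # levels completed
    rng.binomial(1, 0.3, n),      # engaged with a promotional offer
])

# Toy labels: 0 = non-spender, 1 = occasional spender, 2 = high spender,
# driven loosely by engagement so the model has signal to learn.
score = 0.3 * X[:, 0] + 0.02 * X[:, 1] + 0.5 * X[:, 3] + rng.normal(0, 1, n)
y = np.digitize(score, bins=np.quantile(score, [0.6, 0.9]))

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=0, stratify=y)

model = GradientBoostingClassifier(random_state=0).fit(X_train, y_train)
print(classification_report(y_test, model.predict(X_test),
                            target_names=["non-spender", "occasional", "high"]))
```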

  • Sridhar Vaithianathan is an Associate Professor and heads the Business Analytics Department at the Institute of Management Technology (IMT), Hyderabad, India. He completed his Ph.D. (Determinants and Impact of E-Commerce Adoption in India) in Management at ICFAI University, Dehradun. He was a visiting scholar at the University of Toledo, Ohio, USA (Aug 2007 – June 2008). He received the “Highly Commendable Paper Award” at the Business Analytics and Intelligence Conference (11-13 Dec 2017) at IIM Bangalore. He has participated in various data contests hosted by Kaggle, KDnuggets, Analyticsvidhya, Crowdanalytix, etc. He has published in international and national journals of repute. He is a trainer in Structural Equation Modeling (SEM), Machine Learning Algorithms (MLA), and Data Visualization. He is highly proficient in statistical software such as Python, R, SPSS/PASW, AMOS, SMART PLS, SAS MINER, and XL MINER. He has been in the field of education, research, and training for fifteen years. He received his B.E. degree from M.S. University, Tamil Nadu, and his MBA degree from the National Institute of Technology (NIT), Warangal. His areas of research and teaching interest include Business Analytics, Quantitative Methods, Technology Adoption, and Data Visualization.
lalit wadhwa image
Lalit Wadhwa

Vice President of Advanced Analytics
Avnet

Building an Analytics Culture at Avnet

Based on Avnet’s analytics journey, the presentation discusses four major challenges that Avnet incrementally addresses in order to develop and sustain an analytics-driven culture. Specific deliverables and initiatives are used as examples to illustrate challenges and their resolution.

  • Lalit Wadhwa is vice president of Advanced Analytics at Avnet. Based in Phoenix, his team is responsible for continually improving customer experience and revenue & margin opportunities by embedding pervasive analytics into business processes. His responsibilities also include implementing Big Data infrastructure, building vendor partnerships, and delivering promised benefits. Mr. Wadhwa is a 28-year Avnet veteran with more than two decades of global supply chain operations, procurement, materials management and hardware product design experience. He is a trained and certified SCOR-Professional, having undergone the educational practicum offered by the Supply Chain Operations Reference model (SCOR), the world’s leading supply chain framework. Mr. Wadhwa is passionate about leveraging the R, Python and Octave languages for advanced analytics, visualization and machine learning on large datasets. Mr. Wadhwa holds a Bachelor of Engineering in Electronics and Communications from Delhi University in India.
unknown speaker image
Vijay Wadhwa

Sr Manager of Advanced Analytics & Data Science
Southwest Airlines

Applying NLP to Improve Customer Experience (CX) at Southwest Airlines

Southwest currently collects in excess of 6 million text-based data points from Customers each year, reflecting both solicited and unsolicited feedback. Historically we have focused on analyzing structured data; however, we have lacked the capability to analyze text data, and synthesizing and analyzing all of this text-based data in a consistent and efficient manner is challenging. We invested in software that specializes in text analytics and focuses on extracting themes/topics and other insights from open-ended comments. Building our formalized Voice of the Customer (VoC) program has helped Southwest Airlines systematically gather, interpret, react to, and monitor what we hear from our Customers in ways that drive measurable business value.
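
Southwest's program relies on a commercial text-analytics product; purely as an open-source illustration of extracting themes/topics from open-ended comments, the sketch below applies TF-IDF and non-negative matrix factorization from scikit-learn to a handful of invented comments.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.decomposition import NMF

# A handful of made-up customer comments standing in for survey free text.
comments = [
    "The boarding process was slow and the gate was crowded",
    "Flight attendants were friendly and helpful the whole trip",
    "My bag was delayed and baggage claim took forever",
    "Great crew, very friendly service on this flight",
    "Boarding was disorganized and delayed at the gate",
    "Lost luggage again, the baggage office never called back",
]

# TF-IDF features, then non-negative matrix factorization into 3 "topics".
tfidf = TfidfVectorizer(stop_words="english")
X = tfidf.fit_transform(comments)
nmf = NMF(n_components=3, random_state=0)
doc_topics = nmf.fit_transform(X)

# Show the top terms defining each extracted theme.
terms = tfidf.get_feature_names_out()
for k, weights in enumerate(nmf.components_):
    top = [terms[i] for i in weights.argsort()[::-1][:4]]
    print(f"topic {k}: {', '.join(top)}")
```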

  • Vijay Wadhwa is the Sr Manager of Advanced Analytics & Data Science at Southwest Airlines. His portfolio includes Customer Analytics, Marketing Mix and being the champion of the Customer.
    He has an M.S. and Ph.D. in Industrial Engineering and Operations Research from The Pennsylvania State University.

disney logo image
The Walt Disney Company
This presentation will include a summary of the prize-winning application submitted by The Walt Disney Company for the INFORMS Prize. The presentation provides an overview of analytics at Disney as well as how analytics organizations collaborate to drive significant value for the company. With four business segments and constantly changing products and offerings, Disney presents a world of opportunities for pioneering, impactful, and innovative analytics. What sets analytics at Disney apart is the diversity of applications that could only be found by combining a variety of different industries. Even though these applications are diverse, the core operations research principles transcend multiple lines of business. We will also provide an overview of three impactful projects across different segments of the organization that leverage a fusion of operations research, statistics, econometrics, machine learning, big data analytics, and even techniques borrowed from computer science and other science disciplines.
tim wilson image
Tim Wilson

Senior Director of Analytics
Search Discovery

Starting Strong and Finishing Effectively: Techniques for Driving Actionable Results

You have the data. You have the tools to clean, integrate, and analyze the data. Yet your analytics organization is struggling to effectively engage the business users it supports, and some of your team’s best analyses seem to die quietly after they’re presented rather than lead to action and business impact. This session tackles some of the likely culprits when this happens: a failure to adequately articulate and qualify hypotheses at the outset, and a failure to effectively communicate the results of the analysis once it is completed. The practical tips and examples covered in this session are implementation-ready: you will be putting them to use as soon as you return to the office after the conference!

  • Tim Wilson is a Senior Director of Analytics at Search Discovery, a digital intelligence company that empowers organizations to make transformative business decisions. He has been working with digital data full-time since 2001 in a variety of roles: from managing a web analytics platform migration and developing analytics processes as the head of the business intelligence department at a $500 million high tech B2B company; to creating and growing the analytics practices at three different agencies that worked with a range of large consumer brands; to consulting with the digital analytics teams at Fortune 500 companies on their strategies, processes, and tactics for effectively putting their digital data to actionable use. Tim is a long-time creator of pragmatic content for analysts and marketers, including co-hosting the bi-weekly Digital Analytics Power Hour podcast (analyticshour.io) and co-creating dartistics.com, a site dedicated to encouraging analysts to learn the R programming language and apply statistical methods to their data.

tauhid zaman image
Tauhid Zaman

KDD Career Development Professor in Communications and Technology
MIT Sloan School of Management

Picking Winners: A Framework for Venture Capital Investment

How should a venture capitalist pick startup companies to invest in? In this talk, we will discuss a data-driven framework for picking companies which are “winners”, meaning they achieve an exit (IPO or acquisition). We will propose a model for startups exiting based on first passage times of Brownian motions. We will then present an optimization formulation where the objective is to build a portfolio of startups to maximize the probability of having at least one winner, and show how to build such portfolios efficiently. Finally, we will show that our portfolios perform well in real settings. In particular, we find that we can build portfolios of startup companies which outperform top venture capital firms.
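
As a simplified illustration of the portfolio idea (assuming independent, already-estimated exit probabilities, which the talk's Brownian-motion framework does not require), the sketch below builds a fixed-size portfolio that maximizes the probability of having at least one winner.

```python
import numpy as np

# Hypothetical estimated exit probabilities for candidate startups
# (in the talk these would come from the Brownian-motion model).
rng = np.random.default_rng(1)
exit_prob = rng.beta(2, 18, size=50)   # mostly small probabilities
portfolio_size = 10

# Under an independence assumption,
#   P(at least one winner) = 1 - prod(1 - p_i) over the chosen startups,
# which is maximized by simply taking the largest probabilities.
chosen = np.argsort(exit_prob)[::-1][:portfolio_size]
p_win = 1.0 - np.prod(1.0 - exit_prob[chosen])

print("chosen startups:", sorted(chosen.tolist()))
print("P(at least one exit):", round(float(p_win), 3))

# For comparison: a same-size portfolio chosen at random.
random_pick = rng.choice(len(exit_prob), size=portfolio_size, replace=False)
print("random portfolio P(at least one exit):",
      round(float(1.0 - np.prod(1.0 - exit_prob[random_pick])), 3))
```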

  • Tauhid Zaman is the KDD Career Development Professor in Communications and Technology at the MIT Sloan School of Management. He received his BS, MEng, and PhD degrees in electrical engineering and computer science from MIT. His research interest is in behavioral analytics, with a focus on solving operational problems using behavioral models, modern statistical methods, and network algorithms. His work has been featured in The Wall Street Journal, Wired, Mashable, the LA Times, and Time Magazine.
michael zargham image
Michael Zargham

Founder
BlockScience

A Blockchain and Business Analytics Case Study

This talk will start by introducing the core concept of the blockchain, focusing on its role in business as a shared data source, including enabling and enforcing coordination in multiparty workflows. A brief overview of the current state of blockchain technology will highlight the areas of greatest promise and the major outstanding challenges.

The body of this talk will cover a case study from Sweetbridge, a global, supply-chain-focused fintech firm using Ethereum smart contracts to offer self-banking services to individuals and businesses.

The Sweetbridge Liquidity protocol allows Sweetbridge members to place tokenized assets into vaults, locking those assets as collateral for loans. Borrowing limits for the loans, also called UOUs, are strictly less than the total value of the assets. UOUs do not have a counterparty independent of the borrower; rather, the smart contracts track repayments, interest accruals, and changes in asset values, allowing the contract itself to liquidate assets to cover the debts if the loan would otherwise fall into default. The mathematical details are presented in our working white paper, available on the Sweetbridge website.
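
The mathematical details live in the Sweetbridge white paper; the sketch below is only a toy model of the collateral-and-liquidation mechanics described above, with invented thresholds and prices, and is not the actual smart contract logic.

```python
from dataclasses import dataclass

@dataclass
class Vault:
    """Toy model of a collateralized vault; all thresholds are hypothetical."""
    collateral_units: float      # tokenized assets locked in the vault
    asset_price: float           # current market value per unit
    debt: float = 0.0            # outstanding borrowing ("UOU" balance)
    min_collateral_ratio: float = 1.5

    @property
    def collateral_value(self) -> float:
        return self.collateral_units * self.asset_price

    def borrow(self, amount: float) -> None:
        # Borrowing limit stays strictly below the collateral value.
        if (self.debt + amount) * self.min_collateral_ratio > self.collateral_value:
            raise ValueError("borrow would breach the collateral ratio")
        self.debt += amount

    def repay(self, amount: float) -> None:
        self.debt = max(0.0, self.debt - amount)

    def mark_price(self, new_price: float) -> None:
        """Update the asset price; liquidate if the vault is undercollateralized."""
        self.asset_price = new_price
        if self.debt > 0 and self.collateral_value < self.debt * self.min_collateral_ratio:
            # Sell just enough collateral to cover the outstanding debt.
            units_sold = min(self.collateral_units, self.debt / self.asset_price)
            self.collateral_units -= units_sold
            self.debt -= units_sold * self.asset_price
            print(f"liquidated {units_sold:.2f} units to cover debt")

vault = Vault(collateral_units=100, asset_price=10.0)
vault.borrow(600)          # 1000 of collateral supports up to ~666 of debt
vault.mark_price(8.0)      # price drop: 800 < 600 * 1.5, triggers liquidation
print(f"remaining collateral: {vault.collateral_units:.2f}, debt: {vault.debt:.2f}")
```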

Our case study will cover our methodology for economic system engineering, the use of iterative stochastic process simulations to evaluate the design, and the derivation of KPIs for a new kind of business, and will conclude with a review of the current state of the project using data retrieved from the public Ethereum blockchain.

  • Dr. Zargham holds a Systems Engineering PhD from the University of Pennsylvania, with a focus on optimization and control of decentralized systems, and has over a decade of experience applying data to business decision-making applications. His career started as a data scientist, deriving algorithms to isolate professional network effects in enterprise software adoption for the technology market research firm Techtel Corp in 2005.

    In 2017, Dr. Zargham founded BlockScience, a technology research and analytics firm specializing in the design and evaluation of decentralized economic systems. We are defining and practicing the emerging field of Economic Systems Engineering by applying the mathematical engineering technologies associated with the decades-old Systems Engineering field, along with Game Theory and Behavioral Economics, to the economic networks being instantiated via blockchain and smart-contract-enabled applications. Our work includes pre-launch design and evaluation based on real analysis and simulation, as well as post-launch monitoring and maintenance via reporting, analytics, and decision support software development supporting economic health KPIs. BlockScience is actively working with projects in the Education Tech, Ad Tech, and Supply Chain industries.

    Formerly, he was the Director of Data Science at Cross MediaWorks; working with the CTO/COO, he founded the firm’s data science team in January 2015 after spending a few months in 2014 consulting with one of their subsidiaries regarding their data strategy. The technology and team he built remain a driver of data-driven decision-making at the firm.

Panel Discussion: Drive Better Analytics Practice through Professional Development

INFORMS develops and offers many professional development and career growth opportunities for analytics professionals. These opportunities fall into three main categories: Courses, Colloquia, and Committees (the “3Cs”). Working with well-established and highly regarded INFORMS members who are experts in the fields of analytics and O.R., INFORMS has created products such as the Analytics Body of Knowledge (ABOK) and the Essential Practice Skills for High-Impact Analytics Projects course. In this session, we will highlight some of the other available resources and opportunities, including the CAP Prep course, the Early Career Professionals’ Network (ECPN), and the Industry Outreach and Engagement Committee.

tasha inniss image
Tasha Inniss, PhD

Director of Education and Industry Outreach
INFORMS

  • Tasha R. Inniss, Ph.D. currently serves as the inaugural Director of Education and Industry Outreach at INFORMS. As a member of the INFORMS staff and Senior Leadership Team, Inniss is responsible for the overall vision, strategic direction, and implementation of all INFORMS education-related, professional development, and practice (industry) activities and outreach.

    Prior to assuming this role, she served as the Acting Deputy Division Director of the Division of Human Resource Development in the Directorate of Education and Human Resources at the National Science Foundation where she did a rotation. Concurrent with that, she was a tenured Associate Professor of Mathematics at Spelman College, a liberal arts college for women in Atlanta, Georgia. Her first faculty position was as a Clare Boothe Luce Professor of Mathematics at Trinity Washington University (formerly Trinity College) in Washington, D.C.

    Originally from New Orleans, Inniss graduated summa cum laude from Xavier University of Louisiana with a Bachelor of Science degree in mathematics. She earned a Master of Science degree in applied mathematics from the Georgia Institute of Technology and a Ph.D. in applied mathematics from the University of Maryland, College Park. Her dissertation advisor at UMD was Michael O. Ball.

bill griffin image
Bill Griffin

Continuing Education Program Manager
INFORMS

  • Bill Griffin currently serves as Continuing Education Program Manager at INFORMS and is responsible for professional development initiatives, including continuing education course development and delivery, professional colloquia, and engagement and development of academic and corporate partners. Griffin has 20+ years of experience working at the intersection of education and business, with stints at Sylvan Learning Systems, Prometric, and Laureate Education.

    Prior to joining INFORMS in 2015, he developed and managed Masters and Doctoral programs at the Richard W. Riley College of Education and Leadership at Walden University. From 1998 to 2009, he oversaw international, high-stakes test delivery for programs like TOEFL, GRE, GMAT, PMI and MCAT.

    Born, raised, and currently residing in Catonsville, MD (home to INFORMS) with his wife and three boys, Griffin is a proud graduate of the first college chartered after American independence, Washington College.

freeman marvin image
Freeman Marvin

Executive Principal
Innovative Decisions, Inc.

  • Freeman Marvin has over 25 years of experience as a decision analyst, group facilitator, and process consultant in the Federal government sector. His expertise is in the integration and application of organization development, operations research, and electronic collaboration technologies. He has facilitated numerous decision conferences and electronic meetings, developed decision models using a variety of software, and taught courses on group decision support for traditional facilitators, managers, and analysts. Freeman is a Certified Professional Facilitator (CPF) with The International Association of Facilitators. Freeman has worked as an operations research analyst for Rockwell International and Decision Science Consortium, and was a senior member of the technical staff for Northrop-Grumman TASC, a systems engineering and professional services company. He holds a Master in Public Policy degree from the Kennedy School of Government at Harvard University and a Bachelor of Science degree from the U.S. Military Academy at West Point.
james cochran image
James Cochran

Professor of Applied Statistics and the Rogers-Spivey Faculty Fellow in the Department of Information Systems, Statistics, and Management Science
University of Alabama’s Culverhouse College of Commerce and Business Administration

  • James J. Cochran is Professor of Applied Statistics and the Rogers-Spivey Faculty Fellow in the Department of Information Systems, Statistics, and Management Science at the University of Alabama’s Culverhouse College of Commerce and Business Administration. He earned a B.S. in Economics (1982), an MS in Economics (1984), and an MBA (1985) from Wright State University, and a PhD in Statistics (1997) from the University of Cincinnati. He has been a Visiting Scholar with Stanford University, the University of South Africa, the Universidad de Talca, and Pôle Universitaire Léonard De Vinci.

    Professor Cochran’s research interests include statistical methods (particularly general linear models), statistical learning, sample based and Bayesian optimization, and applications of statistics and operations research to real problems from a wide variety of disciplines. He has published over thirty articles in peer-reviewed academic journals and seven textbooks in statistics, analytics, and operations research. Professor Cochran has served as a consultant for many companies, government agencies, and NPOs.

    Professor Cochran was a founding co-chair of Statistics Without Borders, and he established and has organized INFORMS’ Teaching Effectiveness Colloquium series and annual Case Competition. Professor Cochran also established an annual International IR & Statistics Education Workshop series and has co-chaired workshops through this initiative in Uruguay, South Africa, Colombia, India, Argentina, Kenya, Cameroon, Croatia, Tanzania, Cuba, Mongolia, and Moldova. In 2008 he organized and chaired the 2008 ORPA Conference on Using Operations Research to Address Urban Transport and Water Resource Management Issues in Africa.
    In 2006 Professor Cochran was elected to the International Statistics Institute, in 2008 he received the INFORMS Prize for the Teaching of OR/MS Practice, and in 2010 he received the Mu Sigma Rho Statistical Education Award. In 2011 Professor Cochran was named a Fellow of the American Statistical Association, in 2014 he received the Founders Award from the American Statistical Association, and in 2015 he received the Karl E. Peace Award for Outstanding Statistical Contributions for the Betterment of Society. In 2017 he received the Waller Distinguished Teaching Career Award and was named a Fellow of the Institute for Operations Research and the Management Sciences. Professor Cochran is the founding Editor-in-Chief of the Wiley Encyclopedia of Operations Research and the Management Sciences and the Wiley Series in Operations Research and Management Science. He has served as the Editor-in-Chief of INFORMS Transactions on Education and serves on the Editorial Board for several other journals and publications including Significance and Interfaces.

Panel Discussion: Competing with CAP

The Certified Analytics Professional (CAP®) certification is the only global, ANSI-accredited certification for analytics professionals. During this panel session, you will have the opportunity to hear from, and ask questions of, individuals who have either earned the CAP/aCAP credential or who train or supervise individuals pursuing the certification. They will talk frankly about how CAP can be used as a career differentiator for individuals or as a talent development tool by an organization’s leadership.

balaporia image
Moderator
Zahir Balaporia, CAP

Solutions Partner
FICO

  • Zahir creates customer focused solutions within FICO’s Optimization Suite. With over 20 years of business and IT operations experience, he brings thought leadership to the development of advanced analytics solutions across analytical domains and industry verticals. He also specializes in change management associated with deploying advanced analytics. Before joining the FICO team in 2015, Zahir was Director of Advanced Planning and Decision Sciences at a large US transportation and logistics company. His team specialized in the application of advanced analytics techniques for decision support across the enterprise. His work is profiled in the new book Only Humans Need Apply – Winners and Losers in the Age of Smart Machines by Tom Davenport and Julia Kirby.

briggs image
Alan Briggs, CAP

Director of Machine Intelligence
Ayasdi

  • Alan Briggs earned the Certified Analytics Professional designation in 2014 and is actively involved in INFORMS in a variety of ways. Since attending the INFORMS Professional Colloquium in 2011, he has gone on to present at both the Analytics and Annual Meetings; has served on a variety of INFORMS planning committees, including Information Technology, Analytics Maturity Model, and Subdivisions Council; and served as President of the Maryland Chapter. At this year’s conference, he is co-chair of the Early Career Practitioners’ Network and enjoys mentoring through the Coffee with a Member program.

    As Director of Machine Intelligence at Ayasdi Government Service, Alan Briggs oversees client delivery on AI/ML projects, and technical engineering support on projects across the US Government. Alan brings a strong National Security vantage point, having led or supported a multitude of efforts throughout the Defense and Intelligence community. He has previously worked at SAS Institute as a Senior Technical Resource for Account Executives, as well as a Data Scientist and Project Manager for Elder Research, conducting ground-breaking projects to advance national security practices. He holds a Master’s Degree in Systems Engineering from North Carolina State University and a BS in Business Management from Emory & Henry College.

mitchel-guthrie image
Polly Mitchell-Guthrie

System Director, Analytical Consulting Services
UNC Health Care System

  • Polly is the Director of Analytical Consulting Services within Enterprise Analytics and Data Sciences at the University of North Carolina Health Care System. Previously she was Senior Manager of the Advanced Analytics Customer Liaison Group in SAS’ Research and Development Division, where her team served as a bridge between R&D and external customers and internal SAS divisions. Before that she was Director of the SAS Global Academic Program, leading SAS’ outreach to colleges and universities worldwide to incorporate SAS into their teaching. Polly began her career at SAS in Strategic Investments and later served in Alliances, after working in the nonprofit sector in philanthropy and social services. She has an MBA from the Kenan-Flagler Business School of the University of North Carolina at Chapel Hill, where she also received her BA in Political Science as a Morehead Scholar. Within INFORMS, she has served as the Chair and Vice Chair of the Analytics Certification Board, Secretary of the Analytics Society, and on the ad hoc committee planning the Executive Forum.

worrell image
Sallamar “Sally” Worrell, CAP

Operations Research Analyst
US Food and Drug Administration

  • Sallamar “Sally” Worrell is an Operations Research Analyst with the US Food and Drug Administration. Her tenure of more than nine years with the federal government has given her experience in the fields of business analysis, strategy development, and policy implementation. She has led projects such as the Prescription Drug User Fee Act Performance Report, Government Accountability Office inquiries, and congressional requests for data. In her current role within the Center for Drug Evaluation and Research, Office of Program & Strategic Analysis, she is working collaboratively to create predictive models for the capacity planning needs of various user fee programs supporting the Center. She finds this work the most interesting because it allows her to integrate her interests in systems engineering and applied mathematics to provide holistic solutions to complex business process modeling and reengineering problems.

    Sally also serves as an adjunct professor and mentor. She enjoys helping students overcome their fear of math by adapting technical content to their learning styles and encouraging them to explore STEM fields.

    Sally received her bachelor’s degree in applied mathematics and statistics from the Johns Hopkins University and her master’s in systems engineering and doctorate in engineering management from the George Washington University. She is also a Certified Analytics Professional and a Project Management Professional.

Wzientek image
Nick Wzientek, CAP

Data Scientist
Independent Pet Partners

  • Nick Wzientek, CAP is currently a Data Scientist for Independent Pet Partners where he is responsible for Marketing, Customer, and Merchandising Analytics.

    Prior to Independent Pet Partners, Nick served in a variety of leadership roles including Vice President of Operations Research and Analytics at Rocky Mountain Resources, a resource holding company where he was responsible for the application and validation of advanced analytics techniques to drive profitability across all assets. Nick also established and led the Enterprise Analytics team at Sports Authority, where his team provided analytical services across all domains of retail including operations, merchandising, finance, marketing, and eCommerce. His team specialized in forecasting, longitudinal data analysis, experimental design, mathematical programming, and customer analysis.

    Prior to Sports Authority, Nick worked for Booz Allen Hamilton as an Operations Research Analyst, where he supported the United States Air Force in a variety of capacities including planning and operations, personnel, foreign affairs, and foreign language development. Nick initially started his career as an officer in the United States Air Force, serving as a B-2 Flight Test Analyst and employing experimental design techniques to evaluate and assess the mission effectiveness of the B-2.

    Nick holds the CAP designation, a M.S. in Operations Research from Southern Methodist University, and a B.S. in Operations Research from the United States Air Force Academy.

Panel Discussion: Advancing Women’s Excellence In The Field Of Data & Analytics

Career choice, professional development, networking and work-life integration are only a few of the important topics facing women in the field of data and analytics. Come to this panel to hear from fellow women in analytics about their professional experiences, challenges, opportunities and tips on continuing development and advancement in this ever growing field.

tohamy image
Moderator
Noha Tohamy

Vice President & Distinguished Analyst
Gartner

  • Ms. Tohamy is a Vice President & Distinguished Analyst at Gartner, Inc. She helps organizations build strong supply chain analytics competency to support the digital transformation. This spans identifying the strategy and roadmap, organizational structure, and talent requirements. She advises organizations on how to adopt leading-edge analytics technologies to optimize supply chain performance. Ms. Tohamy has written extensively about best-in-class supply chains, successful adoption of advanced technologies, and the evolution of digital supply chains. For over 20 years, she has helped organizations improve their supply chain performance through process redesign, talent development, and technology adoption. She has been a frequent speaker at global Gartner events, the MIT Sloan School of Management, Georgia Tech’s Supply Chain and Logistics Institute, APICS, and INFORMS. Ms. Tohamy received her B.S., with high honors, in Mathematics from Emory University and her master’s in Operations Research from the Georgia Institute of Technology.

mitchel-guthrie image
Polly Mitchell-Guthrie

System Director, Analytical Consulting Services
UNC Health Care System

  • Polly is the Director of Analytical Consulting Services within Enterprise Analytics and Data Sciences at the University of North Carolina Health Care System. Previously she was Senior Manager of the Advanced Analytics Customer Liaison Group in SAS’ Research and Development Division, where her team served as a bridge between R&D and external customers and internal SAS divisions. Before that she was Director of the SAS Global Academic Program, leading SAS’ outreach to colleges and universities worldwide to incorporate SAS into their teaching. Polly began her career at SAS in Strategic Investments and later served in Alliances, after working in the nonprofit sector in philanthropy and social services. She has an MBA from the Kenan-Flagler Business School of the University of North Carolina at Chapel Hill, where she also received her BA in Political Science as a Morehead Scholar. Within INFORMS, she has served as the Chair and Vice Chair of the Analytics Certification Board, Secretary of the Analytics Society, and on the ad hoc committee planning the Executive Forum.

no speaker image
Mei Zhang

American Airlines

  • Mei Zhang is Senior Manager of Operations Research and Advanced Analytics at American Airlines. Mei and her team support the Technical Operations (Tech Ops) organization on various business objectives using analytical solutions including machine learning, optimization, data mining, simulation, and process modeling. Mei builds and maintains relationships with different business groups within Tech Ops and assists in developing data and analytics strategies. Mei works with business users to identify process improvement opportunities through applying analytics, helps business groups evaluate vendor offerings, and promotes analytics through demonstrations and training. Within the OR team, Mei oversees projects and provides guidance and support to team members. Mei received her Ph.D. degree in Industrial and Systems Engineering from Georgia Institute of Technology, M.S. in Geographic Information Systems and B.S. in Mathematics from University of Tennessee, Knoxville. Mei worked briefly at Sabre and i2 Technology before joining AA.

pyron image
Nancy Pyron

Director of Operations Research and Revenue Management
Starwood Hotels

  • Nancy was a Senior Director of Revenue Management and Operations Research at Marriott International and Starwood Hotels from 2010 through 2017. At Starwood, she built an analytics team to create, design, and implement a new revenue management system. The team’s focus was to employ the most recent advances in predictive analytics, machine learning, and statistics to drive revenue and respond to user needs. Nancy began her Revenue Management journey in 1997 with Aeronomics as an Operations Research analyst for Revenue Management Systems. Her career includes user training, implementation consulting, blending the needs of business, technical, and analytics teams, and creating long-term team strategy and vision. Nancy received an MS in Operations Research from Stanford and a BS in Applied Mathematics from the University of North Texas.