Join the conference exhibitors as they discuss innovations and best practices in the field. All attendees are welcome to join these tutorials onsite. Descriptions below.
Monday, April 15, 3:40-4:30pm, Room 304
Model-Based Optimization + Application Programming = Streamlined Deployment in AMPL
Presented by: Robert Fourer, Filipe Brandão
AMPL offers the advantages of modeling in a specialized optimization environment combined with the power of application development via general-purpose programming. Optimization problems are formulated concisely and naturally in AMPL’s modeling language, promoting rapid development, reliable maintenance, and evaluation of multiple solvers and data sources. APIs for popular full-featured programming languages facilitate embedding of AMPL models and scripts into complex applications, with access to data management and interface development libraries. We illustrate using AMPL’s Python API and new AMPL features that leverage Python for optimization application development.
Monday, April 15, 10:30-11:20am, Room 306
Using AnyLogic Simulations as a Training Environment for Deep Reinforcement Learning
Presented by: Arash Mahdavi, Simulation Modeling Consultant
In this tutorial, we will discuss how you can leverage unique features of AnyLogic simulation software and AnyLogic Cloud to solve complex business challenges. We will demonstrate state-of-the-art technologies including the use of simulation models as the training environment for deep reinforcement learning that can take your simulation models to the next level in terms of sophistication and usefulness.
Monday, April 15, 10:30-11:20am, Room 305
Power Systems Optimization and Price Forecast with Artelys Crystal
Presented by: Violette Berge, Vice President, Artelys Canada Inc.
Artelys Crystal is a high-level modeler for energy systems: it enables the description of multi-energy systems at large scale (one or several states), taking into account generation assets (thermal, renewables, hydro, etc.), storage assets, networks, demand (load, demand-response, etc.), and interties within a given market structure. Thanks to detailed models and hourly time steps, Artelys Crystal can be used to carry out techno-economic analyses of the future of the power grid (benefits of RES, storage, network developments, peak load reduction, etc.) for several scenarios and to derive long-term energy prices. The talk will introduce Artelys Crystal, its models, and its capabilities through an application case.
Monday, April 15, 1:50-2:40pm, Room 304
Using Artelys Knitro with Julia/JuMP
Presented by: Richard Waltz, Senior Scientist, Artelys Corp, firstname.lastname@example.org
Artelys Knitro is the premier solver for nonlinear optimization problems. Julia is a free, open-source, high-level programming language for technical computing that also comes with the optimization modeling language JuMP. This software demonstration will highlight the newly released Knitro interface to Julia/JuMP. This interface leverages the new C API in Knitro to exploit problem structures such as quadratic and conic constraints. The Knitro C API can be called directly through Julia to access all Knitro features. Alternatively, Knitro may be used through the JuMP modeling language, which provides automatic differentiation for nonlinear functions. We will demonstrate both approaches as well as benchmarking features provided through this interface.
Tuesday, April 16, 10:30-11:20am, Room 306
Storytelling for Machine Learning and Advanced Analytics
Storytelling and depiction of information has existed for hundreds of years in various forms and formats. In today’s era of artificial intelligence and machine-assisted analytics, accurately interpreting and effectively communicating findings is becoming a crucial skill to bridge the growing data literacy gap. To help decision makers get the most value from analytics projects to drive better outcomes, you need to help them make sense of the results.
Machine learning and advanced analytics can be difficult to understand and explain. The problem, the model, the relationships among variables, and the findings are often subtle, surprising, and technically complex. Effectively translating quantitative insights into a compelling story requires planning and careful design and visualization choices. Successful analytical communicators don’t wait until the end of the analysis but rather use the entire process as a vehicle to communicate with stakeholders. Please join me in this session to learn the essence of storytelling for advanced analytics.
Tuesday, April 16, 4:40-5:30pm, Room 306
From Strategic to Real-Time Workforce Optimization: Improve Productivity and Service Levels through Advanced Analytics
Presented by: Filippo Focacci, Co-founder, CEO
For industries that rely on a vast workforce to deliver services, it is critical to ensure that tasks are delivered on time and that the workforce is highly productive. To achieve this, market leaders have embedded advanced analytics techniques into their systems.
In this tutorial, we present the DecisionBrain suite of workforce optimization solutions that support key business decisions at the operational, tactical, and strategic levels. These deployments have improved workforce productivity by approximately 20-30%. The solutions are currently used by market leaders in the Facility Management industry such as ISS, JLL, and Serco.
Before founding DecisionBrain, Filippo worked for ILOG and IBM for 20 years where he held several leadership positions in Consulting, R&D, Product Management and Product Marketing in the areas of Supply Chain, Logistics and Optimization. He received a Ph.D. in Operations Research (OR) from the University of Modena (Italy) and has over 20 years of experience applying OR techniques in industrial applications in several optimization domains. He has published several Supply Chain and Optimization articles for international conferences and journals. He has been granted a patent for Optimization Models.
Tuesday, April 16, 11:30am-12:20pm, Room 306
DTrio: The next generation of Decision Framing
Presented by: Jeremy Walker, V.P.
Poised to tackle a complex decision problem? Walking into a team facilitation with “stickies” and colored pencils? Take a look at the new DTrio – Decision Framing tool, a .Net application. It features the major capabilities that made the original DTrio Excel add-in the most widely used Decision Framing tool. The new DTrio offers you the ability to collect and categorize issues, automatically generate Decision Hierarchies, design Strategy Tables, weave together different thematic choices, compare your Strategies, and build Influence Diagrams and Decision and Risk Timelines. Come see the next big thing in Decision Quality and leave your next meeting with more than just a photo of your work.
Monday, April 15, 1:50-2:40pm, Room 306
End-to-End FICO® Xpress Insight Tutorial: From Data to Decisions for Non-Technical Business Users
Presented by: Jim Williams
You have a team with a great analytics background. They’ve developed advanced analytical tools using Python, R, or your current optimization solver. They’ve derived crucial insights from your data and figured out how your decisions shape your customers’ behaviors. Now it’s time to put these critical analytical insights into the hands of your non-technical business users.
In this tutorial, you’ll learn how FICO’s Xpress Optimization solutions (including Xpress Mosel, Xpress Workbench, Xpress Solver and Xpress Insight) make it possible to embed your analytic models in business user-friendly applications. See how to supercharge your analytic models with simulation, optimization, reporting, what-if analysis, and agile extensibility for your ever-changing business. Plus, you’ll discover how to use the new View Editor to reduce GUI development times from minutes to seconds.
Tuesday, April 16, 11:30am-12:20pm, Room 305
Keep it Simple: Getting Analytics Results with Less Cost, Time and Risk
Presenter: Daniel Fylstra, President, Frontline Systems Inc.
Many organizations today are trying to invest in analytics, hire data scientists, deal with “big data”, and get results. Some roads ahead involve big commitments, high costs, and complexity. But another road is possible: Start small, keep it simple, and recognize you have more in-house expertise than you thought. This session will show how you can use skills you have and tools like Excel and C# to build optimization, simulation, and data mining applications; Tableau and Power BI to connect models to data (even “big data”) and deploy them widely; and resources like Solver.Academy to build your team’s expertise.
Monday, April 15, 11:30am-12:20pm, Room 305
Switching to Gurobi
Presented by: Dan Jeffrey (Sr. Support Engineer)
Have your optimization needs outgrown your optimization tools? We’ll discuss the pros and cons of different solvers, including open-source and commercial offerings, and look at how to recognize when the time has come to switch. We’ll talk about the best way to migrate your existing work to Gurobi, and discuss common situations we have seen from customers who have switched.
Monday, April 15, 11:30am-12:20pm, Room 306
Optimization and Machine Learning – A Match Made in Data Science Heaven!
Presented by: Virginie Grandhaye – Decision Optimization Offering Manager, IBM Hybrid Cloud, Sumeet Parashar – Technical Sales Specialist – Decision Optimization / CPLEX, Data Science, IBM Watson AI Platform
Data science teams are increasingly challenged to build innovative solutions using a combination of techniques like machine learning and optimization. To meet this need, we have added new capabilities to our data science platform, IBM Watson Studio, to help analytics teams experience the benefits of machine learning and optimization within the same tool. IBM Decision Optimization is well known for its optimization engine, CPLEX, and by delivering these engines on the Watson Studio platform, we are helping data scientists create powerful applications and deliver tangible business outcomes. This capability will help organizations across industries maximize the value of their analytics investments.
With this offering, users can develop, train, and deploy their models in a single platform while supporting hybrid cloud (private cloud, public cloud, or desktop). We will also feature real customer use cases across finance, energy, and production planning.
Monday, April 15, 11:30am-12:20pm, Room 304
Optimization Modeling Tools from LINDO Systems
Presented by: Mark Wiley and Gautier Laude
Exceptional ease of use, wide range of capabilities, and flexibility have made LINDO software the tool of choice for thousands of Operations Research professionals across nearly every industry for over 30 years. LINDO offers solvers to cover all your optimization needs. The Linear Programming solvers handle million variable/constraint problems fast and reliably. The Quadratic/SOCP/Barrier solver efficiently handles quadratically constrained problems. The Integer solver works fast and reliably with LP, QP and NLP models. The Global NLP solver finds the guaranteed global optimum of nonconvex models. The Stochastic Programming solver has a full range of capabilities for planning under uncertainty.
Get the tools you need to get up and running quickly. LINDO provides a set of intuitive interfaces to suit your modeling preference.
- What’s Best! is an add-in to Excel that you can use to quickly build models that managers can use and understand.
- LINDO has a full featured modeling language for expressing complex models clearly and concisely, and it has links to Excel and databases that make data handling easy.
- LINDO API is a callable library that allows you to seamlessly embed the solvers into your own applications.
Pick the best tool for the job based upon who will build the application, who will use it, and where the data reside. Technical support at LINDO is responsive and thorough – whether you have questions about the software or need some modeling advice. Get started today. Visit our booth or www.lindo.com to get more information and pick up full-capacity evaluation licenses.
Tuesday, April 16, 9:10-10am, Room 305
Predictive and Prescriptive Analytics with MATLAB
Presented by: Mary Fenelon, Product Marketing Manager
MATLAB makes it easy to build applications that include both predictive and prescriptive analytics. In this tutorial you will learn how to access data from the web, preprocess it, build predictive models, and formulate optimization problems using a natural syntax. We will show you how to perform these steps by building a web app for selecting sites for disaster relief.
Monday, April 15, 9:10-10am, Room 306
Reinforcement Learning & the Beer Game: An Online Version of the Classic Beer Game – Now Brewed with Artificial Intelligence
Presented by: Larry Snyder, Professor, Lehigh University & Senior Research Fellow-Optimization, Opex Analytics
Reinforcement learning (RL) is a branch of machine learning that has been successfully applied to play games (chess, Go, Atari), control robots, and target marketing campaigns. But RL has only just begun to be used as a decision-making tool in the supply chain.
In this talk, we present a semi-technical overview of RL and discuss how RL can be used for inventory management (and why it should be). Finally, we present an RL algorithm that we developed to play the beer game, a well-known classroom exercise that demonstrates the difficulties inherent in multi-agent inventory management. Our RL algorithm achieves costs similar to those achieved by the optimal inventory policy (a base-stock policy) when the other players on the team also use a base-stock policy. When playing with “human-like” teammates (computerized players whose ordering decisions mimic human decisions), our RL algorithm performs better than a base-stock inventory policy, and usually better than human beer game players. Our RL algorithm serves as the “AI player” in the Opex Analytics online beer game, which we demonstrate during the talk.
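The base-stock benchmark mentioned above is easy to state: each period, order enough to bring the inventory position back up to a fixed target level S. A minimal, self-contained Python sketch follows; the target level, unit costs, and demand stream are invented for illustration and are not from the talk:

```python
def base_stock_order(inventory_position, S):
    """Base-stock policy: order up to target level S (orders are never negative)."""
    return max(0, S - inventory_position)

def simulate(demands, S, start_inventory=0):
    """Simulate one supply-chain stage under a base-stock policy with zero lead time.
    Returns total cost, with illustrative unit costs of 1 for holding and 2 for backorders."""
    inventory = start_inventory
    total_cost = 0
    for d in demands:
        inventory += base_stock_order(inventory, S)  # order arrives immediately
        inventory -= d                               # satisfy (or backorder) demand
        total_cost += max(inventory, 0) * 1 + max(-inventory, 0) * 2
    return total_cost

demands = [4, 6, 3, 8, 5]
print(simulate(demands, S=6))  # prints 10
```

In the actual beer game, each of the four stages faces nonzero lead times and sees only local information, which is what makes the multi-agent setting hard and motivates the RL approach described in the talk.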
Beer Game Link (Reference): https://beergame.opexanalytics.com/#/
Monday, April 15, 3:40-4:30pm, Room 305
A DOCplex and ODH|CPLEX Python primer
This short tutorial shows participants how to build a basic model using the DOCplex API in Python. The session covers setting up the Python environment, reading data from a CSV file or spreadsheet, creating variables, objective functions, and constraints, solving the model, and returning the results. Additionally, the session points participants to further reading so that they may expand their capabilities. Finally, we will present the brand-new ODH|CPLEX API for Python, which improves solution times for large models.
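The workflow described here — create variables, add constraints, set an objective, solve, read back results — is common to most Python optimization APIs. As a library-agnostic sketch of that pattern (using SciPy's `linprog` rather than DOCplex, since CPLEX requires a license; the data values are invented):

```python
from scipy.optimize import linprog

# Illustrative data: maximize x + 2y subject to x + y <= 4, 0 <= x <= 2, 0 <= y <= 3.
# linprog minimizes, so the objective coefficients are negated.
c = [-1, -2]
A_ub = [[1, 1]]   # one linear constraint row: x + y
b_ub = [4]        # right-hand side: x + y <= 4
bounds = [(0, 2), (0, 3)]  # variable bounds stand in for x <= 2, y <= 3

res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds)
x, y = res.x
print(x, y, -res.fun)  # optimal solution: x = 1, y = 3, objective value 7
```

In DOCplex the same steps map onto a `Model` object with explicit variable, constraint, and objective methods, which is what the tutorial walks through.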
Tuesday, April 16, 10:30-11:20am, Room 305
Cost Benefit Analyses for Optimization Projects
Presented by: Patricia Randall, PhD, Director
Although practitioners often must prove and quantify the potential benefits of an optimization project, there is little guidance to prepare a rigorous Cost Benefit Analysis (CBA). Princeton Consultants Director Patricia Randall, PhD will describe how to conceptualize, construct and validate a CBA, based on Princeton Consultants’ development of CBAs over 30 years. Using examples, Dr. Randall will walk through the steps and principles of building the CBA, and discuss how to monetize benefits, how to identify and use major leverage points, and how to divide a project into independently cost-justifiable phases. This tutorial will address issues including:
- Business purposes of a CBA
- Outcomes that client executives are seeking
- Key questions the CBA must answer
- Why a CBA should evaluate and compare a spectrum of solution scenarios
- Why extensive CBA validation is so important
Dr. Randall specializes in the design, development, and implementation of large-scale, high impact systems that help businesses optimize their decision-making at the strategic, tactical, and operational levels. In July 2018, she published the Analytics Magazine article, “How to Avoid Chaos in the Field by Combining Simulation and Optimization.” Dr. Randall holds a Ph.D. in Industrial Engineering from Clemson University.
Monday, April 15, 9:10-10am, Room 305
Introduction to Text Analytics Approaches used for Business Analytics to Quickly Extract Insights
Business analytics and operations research involve researching incident reports, corporate reports, social media, customer reviews, and much more. The volume of available text has exploded in the digital age. It is extremely time-consuming, expensive, and in many cases impossible to read each and every document related to one’s research. Text analytics makes it possible to quickly import and analyze very large volumes of text documents. This presentation will showcase the different text analytics approaches used for business analytics, such as computer-assisted qualitative coding, text mining, content analysis with dictionaries or taxonomies, and supervised and unsupervised machine learning. We will discuss when one technique may be more appropriate than another and how they can work together to analyze text data.
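A toy illustration of the first step underlying several of these approaches — turning raw documents into term counts that downstream methods (dictionaries, clustering, supervised learning) consume — can be written in a few lines of Python; the example reviews below are invented:

```python
import re
from collections import Counter

def term_counts(documents):
    """Tokenize each document into lowercase word terms and count them."""
    counts = []
    for doc in documents:
        tokens = re.findall(r"[a-z']+", doc.lower())
        counts.append(Counter(tokens))
    return counts

docs = [
    "The delivery was late and the package was damaged.",
    "Great product, fast delivery!",
]
counts = term_counts(docs)
print(counts[0]["delivery"], counts[1]["delivery"])  # both reviews mention "delivery" once
```

Real text-analytics tools layer much more on top of this (stemming, stop-word removal, dictionaries, statistical models), but term counting is the shared starting point.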
Tuesday, April 16, 1:50-2:40pm, Room 306
Machine Learning with H2O for R Users
Presented by: Matthew A. Lanham, Purdue University, Department of Management
This tech workshop provides R users and enthusiasts a deep dive into the available functions and resources that H2O provides to perform machine learning and predictive analytics at scale. The motivation for this workshop is that R continues to be one of the top languages used by data scientists and analytics professionals. However, R does not scale well to massive data sets. H2O provides a user-friendly REST API that allows R (as well as Python, Scala, and Java) users to perform machine learning at scale, all within the environment of their choice. For the newbie or R lover wanting to get up to speed on how to incorporate H2O into your workflow, this workshop provides an end-to-end case that demonstrates the functions, the reasons for their use, and the typical order in which you would use them.
Matthew is a Clinical Assistant Professor at Purdue University’s Krannert School of Management. His primary focus is serving as Academic Director for the M.S. in Business Analytics & Information Management (BAIM) program, coordinating and teaching Krannert’s Data Mining, Predictive Analytics, Using R for Analytics, and Industry Practicum courses, as well as interfacing these activities with Purdue’s Business Information and Analytics Center (BIAC), where he serves as Associate Director of Student Engagements.
Tuesday, April 16, 9:10-10am, Room 306
Combining Technology and Advanced Analytics to Master Risk
Presented by: Yvon Pho, Director, Digital Risk Solutions, PwC
Join our technology tutorial to learn how a single platform can deliver holistic enterprise-wide insights into strategic and operational risks. Our enterprise-level software fuses enterprise risk platform data and external data (both structured and unstructured), with advanced data analytics to deliver business intelligence and insights. Workflow capabilities and data visualization accelerate speed to insight and corresponding action. PwC is on a journey to digitally upskill our people and harness technology to create innovative software-as-a-service solutions that turn the next threat into an advantage.
Tuesday, April 16, 4:40-5:30pm, Room 305
Data Visualization and Machine Learning in SAS Viya
Presented by: André de Waal
SAS Visual Analytics is a web-based product that leverages SAS high-performance analytics technologies to empower organizations to explore huge volumes of data very quickly to identify patterns, trends, and opportunities for further analysis. SAS Visual Data Mining and Machine Learning adds machine learning functionality to the SAS Visual Analytics web client. Paired with SAS Visual Analytics, Visual Data Mining and Machine Learning enables users to experience powerful statistical modeling and machine learning techniques running on SAS Viya through an easy-to-use, drag-and-drop visual interface. In this presentation I will give an overview of SAS Viya, explore a large data set using SAS Visual Analytics, and build a machine learning model (e.g., a recommender system) in SAS Visual Data Mining and Machine Learning. For SAS Viya users wanting more control over the modeling process (e.g., the data scientist), I will also demonstrate how the same machine learning model can be built in SAS Studio using the built-in tasks or through SAS programming. Lastly, I will show how to build the same recommender system in Python using a Jupyter Notebook.
André de Waal received his Ph.D. in theoretical computer science from the University of Bristol in 1994. He spent the next year in Germany and Belgium continuing his research in logic programming and automated theorem proving. In 1996 he returned to South Africa to take up a position as lecturer at the School of Computer Science and Information Systems at the then Potchefstroom University for Christian Higher Education (which later became North-West University), where he was later promoted to Associate Professor. In 1999 he became one of the founding members of the Centre for Business Mathematics and Informatics at the same university. He became responsible for the Data Mining Program in the Centre and shifted his research focus to include neural networks and predictive modeling. He is the co-developer of the AutoGANN modeling node in SAS Enterprise Miner and has published several research papers on the automation and use of generalized additive neural networks. He joined SAS Institute in Cary, NC in December 2010 to take up the position of Analytical Consultant in the Global Academic Program.
Monday, April 15, 10:30-11:20am, Room 304
Building and Solving Optimization Models with SAS
Presented by: Rob Pratt, Senior R&D Manager, and Ed Hughes, Principal Product Manager
SAS provides comprehensive data and analytic capabilities, including statistics, data/text mining, forecasting, and operations research methods: mathematical optimization, discrete-event simulation, and project and resource scheduling. The OPTMODEL procedure from SAS provides a powerful and intuitive algebraic optimization modeling language, with unified support for linear programming, mixed integer linear programming, quadratic programming, nonlinear programming, constraint programming, local search optimization, and network-oriented optimization models. We’ll demonstrate PROC OPTMODEL, highlighting its newer capabilities and its support for standard and customized solution approaches. We’ll also show how you can access SAS optimization capabilities from other programming languages like Python, Lua, Java, and R, thanks to the open, cloud-enabled architecture of SAS® Viya®.
Monday, April 15, 1:50-2:40pm, Room 305
Using JMP: Mining Unstructured Text Data for Meaningful Insights
Presented by: Ruth Hummel
JMP makes it very simple to take unstructured text data and extract meaning. The point-and-click interface allows us to easily pull out clusters of co-occurring words (Which words often show up together in product reviews? What topics emerge?), clusters of similarly-themed documents (Can we find groups of people with similar opinions?), and to impose a structured format on the data to allow use of these features in a descriptive or predictive statistical model. In this talk, we will demonstrate how easy it is to glean insight from text data using JMP.
Monday, April 15, 3:40-4:30pm, Room 306
Simulation and Scheduling Software All in One!
Presented by: Renee Thiesing and Katie Prochaska (Simio LLC)
Simio is a premier simulation and scheduling software that allows you to expand the traditional benefits of simulation to improve daily operations. In this tutorial, we will demonstrate Simio’s 3D rapid modeling capability to effectively solve real problems. Explore how a single tool can be used to not only optimize your system design, but also provide effective planning and scheduling. Come explore the Simio difference and see why so many professional and novice simulationists are switching to Simio.
Tuesday, April 16, 1:50-2:40pm, Room 305
16 Tips for Teaching a Marketing Analytics Course
Presented by: Bryce Johnson, Course Consultant
Are you struggling to teach your marketing analytics course? In this session, Bryce will share the top tips for teaching a more complete course, including which platforms you should use, available resources, and ways to keep your course up-to-date.