Technology Workshops – Sunday

Prescriptive Analytics in Action: Turning Ideas into Results and Improving Decision Making Across Your Organization

Gertjan de Lange, Deanne Zhang, AIMMS, 3:00-4:45pm

Are you trying to improve planning, reduce costs, optimize your operations/supply chain or increase innovation? Looking for a way to update your operational processes and tools? Leading companies are already getting significant value and crafting a competitive edge from applying advanced analytics in their operations. Don’t risk falling behind.

This workshop offers a pragmatic approach to getting started with Prescriptive Analytics. We will demonstrate how to create a first application from start to finish and start generating value for your business.

We will share:

How a business challenge or problem can be transformed into business-value-generating apps

How these smart apps can provide your users with recommended actions on an ongoing basis

How to replicate this in your own organization

How the new AIMMS Cloud offering can accelerate your attainment of results

Adding Optimization to your Business:
Fast Development and Dissemination with AMPL and QuanDec

AMPL Optimization, 1:00-2:45pm

Optimization is the most widely adopted technology of Prescriptive Analytics, but also the most challenging to implement. How can you quickly prototype an optimization application and then just as efficiently provide it to your company’s decision makers? In this presentation, you’ll learn about a pair of tools that together address both development and dissemination. The AMPL modeling system builds an optimization model using general and powerful algebraic notation. QuanDec turns an AMPL model into an interactive, distributed decision-making tool.

The workshop begins with an introduction to how an operations research analyst uses AMPL to tailor an optimization model to an application. The analyst focuses on the form of the model — decision variables, objective functions, constraints — while AMPL takes care of the low-level work of problem generation and solution. AMPL’s design promotes speed and reliability in the development phase of optimization modeling, and provides access to all of the most powerful large-scale solver packages.

Our presentation then introduces QuanDec, newly offered by AMPL in collaboration with Cassotis Consulting. QuanDec automates the dissemination of an optimization model for use within an organization; it moves the AMPL model’s complexity to the background, so as to let the decision maker focus on the analysis and results. QuanDec offers an intuitive and user-friendly graphical interface where decision makers, even without any knowledge of modeling and operations research, are able to analyze and visualize the results of optimization; compare several scenarios; share work with colleagues; recalibrate part of the model by running regressions on external data; and much more. Several live examples will be presented to demonstrate the capabilities of QuanDec and to illustrate answers to questions from workshop participants.
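The model structure the workshop describes — decision variables, an objective function, and constraints, with the tooling handling problem generation and solution — can be sketched in any optimization toolchain. Here is a minimal linear program in Python using scipy.optimize.linprog, purely an illustrative stand-in for the formulation style, not AMPL or QuanDec themselves:

```python
from scipy.optimize import linprog

# Maximize 3x + 2y subject to x + y <= 4 and x + 3y <= 6, with x, y >= 0.
# linprog minimizes, so the objective coefficients are negated.
res = linprog(
    c=[-3, -2],                      # objective: minimize -3x - 2y
    A_ub=[[1, 1], [1, 3]],           # left-hand sides of the <= constraints
    b_ub=[4, 6],                     # right-hand sides
    bounds=[(0, None), (0, None)],   # x >= 0, y >= 0
    method="highs",
)
print(res.x, -res.fun)  # optimal point and maximized objective value
```

An algebraic modeling language such as AMPL expresses the same three ingredients declaratively and hands the generated problem to a large-scale solver; the point here is only the shape of the formulation.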

Applying AnyLogic Simulation to Solve Various Business Challenges

Andrei Borshchev, CEO & Arash Mahdavi, Simulation Modeling Consultant, AnyLogic North America, 3:00-4:45pm

Take a journey through model demonstrations in various domains such as Manufacturing, Healthcare, Supply Chain, Railways, Oil and Gas, Human Resources, Pedestrian Movement, and Road Traffic. We’ll also showcase how AnyLogic 8 can provide you with better ways to USE your models: Run, Optimize, Compare, Deliver, Share, Discuss, and Store your models in the cloud!

Building A Data Science Capability

Booz Allen Hamilton, 11:00am-12:45pm

Are you ready to work smarter, not harder? It’s time to embrace the power of data to make informed, strategic business decisions in every aspect of your organization. Join Booz Allen as we explain how to build a data-driven culture, as well as the tools and technology to support your success.

How to Model, Solve, and Deploy Optimization Across Industries with FICO Xpress

FICO, 1:00-2:45pm

At the core of FICO Xpress Optimization Suite are its solver libraries. In this session, hear about the latest enhancements to our linear, mixed-integer and nonlinear solvers. Besides an update on the latest performance improvements, you will learn about the new tuning capabilities as well as the recently added Python interface.

We are launching a new development environment. Be one of the first to experience how quickly you can develop optimization models in Xpress-Mosel, or even complete cloud-based optimization solutions.

To complete the workshop, we will show a number of case studies from various industries which combine predictive analytics with optimization models as well as show how to connect to and manage data from various sources.

Creating And Publishing Interactive Online Analytics Applications

Forio, 11:00am-12:45pm

Forio’s web platform makes your analytic model available to hundreds of people within your organization through the browser. We will start with an introduction to the platform and example analytics applications. Then we’ll divide the workshop into two parts. In the first part, we will teach you how to get your analysis on a server so it can be shared. The second part will focus on creating a user interface for your model.

Everything You Need for Analytics Success: Learn How, Build Models, Deploy Applications

Frontline Systems, Inc., 3:00-4:45pm

Find the easiest, fastest path to real analytics results for you and your team at this workshop, where business analysts, developers new to analytics, and experts are all welcome. Save learning time by leveraging the skills you already have in Excel or your favorite programming language. Use Help, Wizards, Guided Mode, Live Chat, Proactive Support, examples, videos, and full-scale online courses to enhance your analytic skills while you build real models.

Use a comprehensive set of tools for forecasting, data mining, text mining, simulation and risk analysis, decision analysis, and conventional and stochastic optimization, on your desktop or in the cloud. Easily build models three ways: “point-and-click” in your browser or Excel, in our high-level RASON® modeling language, or via an object API in C#, Java, R, Python or C++. Pull data into your analytic models from spreadsheets, databases, BI systems, cloud resources, and Apache Spark Big Data clusters. You’ll see why over 8,000 organizations have used Frontline Solvers to get results over more than 25 years, and why over 200,000 users are already using our cloud analytics apps.

Recent Developments in the Gurobi Optimizer

Dr. Edward Rothberg, CEO, & Dr. Daniel Espinoza, Senior Developer
Gurobi Optimization, 3:00-4:45pm

Learn about the latest advancements in Gurobi Optimizer. In this workshop, we’ll give an overview of our recent 7.0 release, which includes Python modeling enhancements, performance improvements on real-world models, and several major new features like support for multiple objectives. We’ll also discuss the new intuitive interface and enhanced API support for the Gurobi Instant Cloud.

Discover How to get the Most out of Predictive and Prescriptive Analytics with IBM Decision Optimization and SPSS Modeler

Ted Fischer & Xavier Nodet
IBM, 11:00am-12:45pm

Get a brief overview of what’s new in IBM Advanced Analytics
Learn how to use the new features in CPLEX Optimization Studio V12.7, including performance improvements, working with the automated Benders decomposition algorithm, using the interactive program for CP Optimizer, using Modeling Assistance, and evaluating the variability of your models
Learn how to use SPSS Modeler for several business applications
Learn when and how to combine CPLEX and SPSS to best leverage both Predictive and Prescriptive Analytics, including guidelines for practical usage from within IBM Decision Optimization, SPSS Modeler, and the new IBM Data Science Experience (DSX)


Exploiting Optimization in Multi-Period Planning Models

Linus Schrage, LINDO Systems, Inc., 1:00-2:45pm

Multi-period planning is concerned with decision-making over time, ranging from one day ahead in managing a collection of electrical generation units, to 50 years ahead if managing a set of oil fields, a forest, or a set of mines. We will provide an introduction to the variety of applications, what is special or complicating about each class of problems, and what works and what does not, with examples of solution experience using LINDO Systems optimization software.

Application areas considered are:

Resource extraction, e.g., Mining and Petroleum where a complicating factor is that the production rate and the cost per unit extracted changes depending upon the stage;
Crop/Forest Planning, where the interesting features are the long planning horizon and the decision of when to harvest, given an age related growth rate, and perhaps interactions between species;
Electricity Generation unit commitment, where interesting features are start-up and shut-down costs, and output rate related efficiencies as with hydro;
Workforce staffing, especially in the health industry;
Scheduling & Sequencing, e.g., machine scheduling, tank scheduling, pipeline scheduling, where the interesting complications are precedence constraints among tasks, due dates, ready times, and more.

The most common tool we will consider is integer programming, although we will briefly look at analytical tools such as eigenvalue analysis.
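As a concrete illustration of the period-linking structure that makes these models interesting, here is a toy three-period production/inventory LP in Python using scipy.optimize.linprog — hypothetical data for illustration, not LINDO Systems software:

```python
from scipy.optimize import linprog

# Three periods; variables are [p1, p2, p3, I1, I2, I3]:
# p_t = production in period t (capacity 18), I_t = ending inventory.
demand = [10, 20, 15]
prod_cost = [2.0, 3.0, 2.5]   # production is expensive in period 2
hold_cost = 0.5               # cost to carry one unit into the next period

c = prod_cost + [hold_cost] * 3
# Inventory balance links the periods: I_t = I_{t-1} + p_t - d_t, with I_0 = 0.
A_eq = [
    [1, 0, 0, -1,  0,  0],    # p1 - I1 = d1
    [0, 1, 0,  1, -1,  0],    # I1 + p2 - I2 = d2
    [0, 0, 1,  0,  1, -1],    # I2 + p3 - I3 = d3
]
res = linprog(c=c, A_eq=A_eq, b_eq=demand,
              bounds=[(0, 18)] * 3 + [(0, None)] * 3, method="highs")
# The optimizer pre-builds 8 units in cheap period 1 (2.0 + 0.5 holding
# beats period 2's 3.0) — exactly the kind of cross-period trade-off
# these models capture.
print(res.x, res.fun)
```

Replacing the continuous bounds with integrality requirements (start-up decisions, harvest-or-not choices) turns the same structure into the integer programs the workshop discusses.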


How to Rapidly Deploy Analytics on the Opalytics Cloud Platform

Opalytics, 11:00am-12:45pm
Dr. David Simchi-Levi, Chairman and Co-founder, Opalytics
Peter Cacioppi, Chief Scientist and Co-founder, Opalytics

Data analytics is a hot topic, and while there are off-the-shelf solutions available, many of the most exciting opportunities are tailored to a specific or unique challenge. Data scientists use tools such as Python or R libraries for machine learning and Gurobi or CPLEX for optimization to build models. But what happens when they need to deploy these to business users for ongoing use? Historically, this has meant e-mailing spreadsheets or presentations or, possibly, a lengthy IT project to build an interface, followed by long rounds of testing and validation. Eventually, a user-ready solution may be deployed, but probably long after it was needed or requested.

The Opalytics Cloud Platform (OCP) deploys advanced analytics models instantly. The OCP delivers faster and more reliable model development through a layer that defines the schema and data requirements and enables testing with real data formats. The result is then immediately deployed to business users in a form they can readily understand and use. In this workshop, we will show how to deploy an optimization model on the OCP in three steps:

Step 1: Develop Analytics: Define the data and solver leveraging OCP support for schema, keys, types, validation rules.
Step 2: Deploy the Advanced Analytics Model on the Opalytics Cloud Platform: Use Instant Applications to load your model. OCP will recognize the schema and perform data validation. Business users automatically enjoy a rich, easy-to-use, off-the-shelf application.
Step 3: Go live! Integrate into the system of record. Use your data definitions as a target for data connectivity scripting. Configure your OCP solver to automate simple data cleaning tasks, and leverage the OCP GUI for result analysis and diagnostics.
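The schema-and-validation idea in Step 1 is generic enough to sketch. The snippet below is plain Python with hypothetical field names and rules, purely to illustrate the pattern — it is not the OCP API:

```python
# Hypothetical schema for a plants table: field name -> expected type.
SCHEMA = {"plant": str, "capacity": float}
# Row-level validation rules: (field, predicate, error message).
RULES = [("capacity", lambda v: v >= 0, "capacity must be non-negative")]

def validate(rows, schema=SCHEMA, rules=RULES):
    """Return a list of (row_index, field, problem) tuples; empty means clean."""
    errors = []
    for i, row in enumerate(rows):
        for field, ftype in schema.items():
            if field not in row:
                errors.append((i, field, "missing"))
            elif not isinstance(row[field], ftype):
                errors.append((i, field, "wrong type"))
        for field, ok, message in rules:
            if isinstance(row.get(field), schema[field]) and not ok(row[field]):
                errors.append((i, field, message))
    return errors

problems = validate([
    {"plant": "A", "capacity": 100.0},   # clean
    {"plant": "B", "capacity": -5.0},    # violates the rule
    {"plant": "C"},                      # missing a field
])
print(problems)
```

Declaring the schema once and checking every incoming row against it is what lets a platform surface data errors to business users before the solver ever runs.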

CPLEX Optimization Studio Modeling, Theory, Best Practices and Case Studies

Optimization Direct, 9:00-10:45am

Recent advancements in linear and mixed-integer programming give us the capability to solve larger optimization problems. CPLEX Optimization Studio solves large-scale optimization problems and enables better business decisions and resulting financial benefits in areas such as supply chain management, operations, healthcare, retail, transportation, logistics, and asset management. In this workshop, using CPLEX Optimization Studio, we will discuss modeling practices and case studies, and demonstrate good practices for solving hard optimization problems. We will also discuss recent CPLEX performance improvements and recently added features.

Risk & Decision Assessment using @RISK

Palisade Corporation, 1:00-2:45pm

The course starts with a brief review of the core principles, best practices, and tools involved in building good Excel models. The next part covers how Palisade’s tools add value to models built in Excel and how users can get the most out of them. Basic simulation models will be used to show the complete lifecycle of an @RISK model and how to quickly get tangible value from the tool. Percentiles and risk profiles will be touched on, as well as the basic probability concepts necessary to transfer the knowledge to decision makers.

After the workshop, attendees will:

Have some knowledge of key principles and best practices in general Excel modeling, including the ability to recognize where pure Excel-based approaches are insufficient;
Recognize situations where @RISK could be of use;
Have sufficient knowledge of @RISK to build simple yet powerful models for realistic applications in risk modeling, decision analysis, and optimization.
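The percentile and risk-profile ideas above can be sketched in a few lines of plain Python — a bare-bones Monte Carlo illustration with made-up numbers, not @RISK itself:

```python
import random
import statistics

random.seed(42)  # reproducible draws

# Toy risk model: uncertain revenue (normal, mean 100, sd 15) minus a fixed cost of 80.
outcomes = [random.gauss(100, 15) - 80 for _ in range(10_000)]

# Cut the distribution into 100 slices; indices 4/49/94 are the 5th/50th/95th percentiles.
cuts = statistics.quantiles(outcomes, n=100)
p5, p50, p95 = cuts[4], cuts[49], cuts[94]
prob_loss = sum(o < 0 for o in outcomes) / len(outcomes)
print(f"P5={p5:.1f}  median={p50:.1f}  P95={p95:.1f}  P(loss)={prob_loss:.1%}")
```

The spread between P5 and P95, together with the probability of loss, is the risk profile a decision maker actually reads off such a simulation.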

pandas for Analytics Practitioners, with Applications in Optimization

Dr. Irv Lustig, Optimization Principal, Princeton Consultants, 3:00-4:45pm

The Python library pandas is popular with data scientists, who use it to carry out an entire data analysis workflow in Python. When building analytics models, we often work with data in tables that are sourced from databases, CSV files, and spreadsheets. pandas provides a uniform environment for working with data tables, with a large number of methods for manipulating tabular data, many of which are directly applicable to building large-scale optimization models. In this workshop, Irv Lustig will present an introduction to pandas and illustrate some of its powerful features that can accelerate optimization model development and deployment.
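As a taste of the kind of manipulation involved, the snippet below aggregates and joins two small tables into a structure ready to feed an optimization model — illustrative column names, not material from the talk:

```python
import pandas as pd

# Two source tables, as they might arrive from a CSV file or database.
orders = pd.DataFrame({"customer": ["A", "A", "B"], "qty": [10, 5, 7]})
costs = pd.DataFrame({"customer": ["A", "B"], "unit_cost": [2.0, 3.0]})

# Aggregate demand per customer, then join in the cost data.
demand = orders.groupby("customer", as_index=False)["qty"].sum()
model_input = demand.merge(costs, on="customer")

# Index by customer — the shape an optimization model typically consumes.
coeffs = {row.customer: (int(row.qty), float(row.unit_cost))
          for row in model_input.itertuples()}
print(coeffs)  # {'A': (15, 2.0), 'B': (7, 3.0)}
```

A groupby, a merge, and a comprehension replace what would otherwise be nested loops over raw records, which is exactly why pandas shows up in optimization data pipelines.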

Text Analytics Software

Provalis Research, 1:00-2:45pm
Normand Peladeau

QDA Miner is easy-to-use qualitative and mixed-methods software that meets the needs of researchers who perform qualitative data analysis and would like to code larger amounts of documents more quickly and more consistently. It offers high-level computer assistance for qualitative coding, with innovative text-search tools that help users speed up the coding process, as well as advanced statistical and visualization tools. Users with even bigger text data can also take advantage of WordStat. This add-on module to QDA Miner can be used to analyze huge amounts of unstructured information, quickly extract themes, find trends over time, and automatically identify patterns and references to specific concepts using categorization dictionaries.

Want SAS Skills? Get SAS University Edition, a Powerful and Free Analytical Tool from SAS.

James Harroun, SAS, 9:00-10:45am

SAS University Edition is a powerful tool for data integration and manipulation, reporting, and analysis. SAS University Edition provides an intuitive interface that gives users the ultimate flexibility: code using the programming interface, use the graphical user interface and simply drag and drop, or combine coding and tasks to build an entire program as a graphical process flow. SAS University Edition allows users to combine SAS syntax with other supported languages such as SQL and the SAS Macro Language, and also allows users to access SAS through the Jupyter Notebook. This workshop will provide guidance on how to download and install SAS University Edition, how to load data, and how to navigate and use the SAS University Edition interfaces, along with a variety of demonstrations of the functionality of this free software tool. Attendees will also learn about free e-learning and online resources from SAS that provide prospective users with training and support.

Solving Business Problems with SAS Analytics and OPTMODEL

SAS, 3:00-4:45pm

SAS offers diverse analytic capabilities, including data integration, statistical analysis, data and text mining, forecasting, optimization, and simulation. The OPTMODEL procedure from SAS provides you with a full-featured optimization modeling language, access to a diverse set of solvers, and the ability to create and use customized solution algorithms.

We’ll explore analytical and optimization case studies drawn directly from our work with SAS users in government, pharmaceuticals, and transportation. These case studies demonstrate PROC OPTMODEL’s power and versatility in building and solving optimization models, and in integrating with the full array of analytics provided by SAS.

Data Discovery and Analysis with JMP 13 Pro

SAS JMP®, 11:00am-12:45pm
Mia Stephens, Academic Ambassador
JMP Academic Programs

JMP Statistical Discovery Software is visual and interactive desktop software for Windows and Mac, with a complete array of integrated graphical and statistical features.

In this workshop we use JMP 13 Pro to demonstrate tools for data preparation, visualization, and exploration, including recode, Graph Builder®, the data filter, and geographic mapping. We’ll see how to analyze univariate, bivariate, and multivariate data, and will demonstrate tools for building and interacting with predictive models. Finally, we’ll see how to share results using HTML output and interactive web reports.

Modern Analytics: Scaling Data Science with Statistica, Open Source, and Reusability

Statistica, 9:00-10:45am

As you are keenly aware, many organizations struggle with recruiting and retaining key analytic talent. For statisticians and data scientists, there never seems to be enough time in the day to finish all of the analytic projects on their desks. Some organizations rely on commercial software, some on open source, and some use a combination of both. Not sure which algorithm to use? No problem: let Statistica recommend the best, or use ensemble modeling to combine all models. Need a niche algorithm? Learn how to use the algorithm economy and integrate with algorithm marketplaces like Algorithmia, Apervita, and AzureML. This hands-on workshop will share best practices on how to scale data science within the organization so you can work smarter, not harder.

In the workshop you will learn how to:

Scale data science by creating reusable analytic and data-prep workflow templates
Empower business users with interactive dashboards and visualizations
Incorporate Python and R scripts within your workflows
Incorporate algorithms from marketplaces like Algorithmia, Apervita, and AzureML
Understand best practices for deployment and monitoring of data prep and analytic workflows

Solving the Puzzle: How to Build a Big Data Analytics Capability complete with People, Process, and Technology

WebbMason, 1:00-2:45pm

There are three topics that are dominating today’s analytic discussion: Big Data, Data Science, and Agile Development. Remove one and the risk of failure increases significantly. In this workshop, you will receive a combination of best-practices for structuring your analytic teams and processes for success, as well as a hands-on deep dive on the technologies that will allow you to change Big Data into Analytic-Friendly Data. This workshop is divided into three parts:
Part 1: Learn how other companies have transformed their legacy data warehouse, reporting, and BI teams into Big Data engineers and data scientists.
Part 2: Learn how to run a Big Data analytics project by implementing an agile development process in JIRA.
Part 3: Learn how to handle Big Data and leverage different technologies so your team can deliver insights fast. Get hands-on experience using the latest Big Data data science tool – Dataiku – by implementing a Big Data data science use case.