Technology Tutorials

Effectively Leveraging a Small Optimization Team Across a Large Organization

Frans van Helden, Scott Nicholas, Luuk Besselink
ORTEC
Monday, April 3, 2:10-3pm

Are you part of a small but growing optimization team that struggles to find ways to integrate with the business? Do roadblocks exist within your organization that prevent bringing more analytics into operations? This workshop covers successful approaches that have been used to help strengthen the relationship between optimization teams and the rest of their organization.

We will share:
Methodologies to improve engagement across departments and teams
How to nurture your team from startup to scale-out phase
How to deliver business value within 3 months
Methods to extend the scale of your team using AIMMS

AMPL in the Cloud: Using Online Services to Develop and Deploy Optimization Applications through Algebraic Modeling

Robert Fourer, President
AMPL Optimization Inc.
4er@ampl.com
Monday April 3, 2:10-3pm

Optimization modeling systems first appeared online almost 20 years ago, not long after web browsers came into widespread use. This presentation describes the evolution of optimization alternatives in what has come to be known as cloud computing, with emphasis on the role of the AMPL modeling language in making models easy to develop and deploy. We start with the pioneering free NEOS Server, and then compare more recent commercial offerings such as Gurobi Instant Cloud; the benefits of these solver services are readily leveraged through their use with the AMPL modeling tools. We conclude by introducing QuanDec, which creates web-based collaborative applications from AMPL models.
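
A minimal sketch of what developing and deploying such a model from Python can look like, assuming the amplpy interface and a locally installed solver; the tiny production model and its data are hypothetical, not taken from the presentation.

```python
from amplpy import AMPL

ampl = AMPL()

# A hypothetical two-product production model, written in AMPL.
ampl.eval("""
    set PROD;
    param profit {PROD};
    param hours {PROD};
    param avail;
    var Make {PROD} >= 0;
    maximize TotalProfit: sum {p in PROD} profit[p] * Make[p];
    subject to Capacity: sum {p in PROD} hours[p] * Make[p] <= avail;
""")

# Hypothetical data; in practice this would come from files or a database.
ampl.eval("data; set PROD := bands coils; "
          "param profit := bands 25 coils 30; "
          "param hours := bands 0.5 coils 0.8; "
          "param avail := 40; model;")

ampl.setOption("solver", "gurobi")  # any installed or cloud-backed solver
ampl.solve()

print(ampl.getObjective("TotalProfit").value())
print(ampl.getVariable("Make").getValues())
```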

anyLogistix: Integrating Analytical and Dynamic Simulation Methods for Precise Supply Chain Design and Analysis

Derek Magilton, Director of Business Development
AnyLogic North America
Monday, April 3, 10:30-11:20am

Although companies have had some success with the current supply chain network optimization tools available, there is one large component missing. You guessed it: simulation. The power of simulation modeling is apparent in many sectors of an organization. This presentation will concentrate on how simulation modeling benefits end-to-end supply chain analysis, including: the ability to observe how your supply chain will perform over time, incorporating and gaining visibility into dynamic interactions between supply chain elements, analyzing real-world stochasticity in various supply chain inputs and processes, simulating behavior that occurs inside the ‘four walls’, and confirming and validating the adoption of supply chain policies.

Solving Large Nonlinear Least-Squares Models with the Artelys Knitro Optimization Solver

Richard Waltz, Senior Scientist
Artelys Corp
richard.waltz@artelys.com
Monday April 3, 9:10-10am

Artelys Knitro is the premier solver for nonlinear optimization problems. This software demonstration will highlight the latest Knitro developments, including a new specialized API, as well as enhanced algorithms, for large-scale nonlinear least-squares models. Nonlinear least-squares is an increasingly important class of problems with many applications in engineering, statistics and machine learning. We will demonstrate how to solve nonlinear least-squares models using Knitro through a variety of interfaces such as R, MATLAB and C/C++. We will also provide benchmark results against other least-squares solvers, including Google Ceres. In addition, we will summarize some of the other recent developments in Knitro.
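
For readers unfamiliar with the problem class, the following is a generic nonlinear least-squares fit in Python using SciPy rather than Knitro's specialized API (which the demonstration itself covers); the exponential model and synthetic data are made up for illustration.

```python
import numpy as np
from scipy.optimize import least_squares

# Synthetic data from a hypothetical exponential model y = a * exp(b * x) plus noise.
rng = np.random.default_rng(0)
x = np.linspace(0.0, 2.0, 50)
y = 2.0 * np.exp(0.8 * x) + 0.05 * rng.standard_normal(x.size)

def residuals(theta):
    # Residual vector whose sum of squares the solver minimizes.
    a, b = theta
    return a * np.exp(b * x) - y

result = least_squares(residuals, x0=[1.0, 1.0])
print(result.x)  # estimated (a, b), expected to be close to (2.0, 0.8)
```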

Artelys Crystal Energy Planner: A Software for Power Generation Assets Optimization

Violette Berge, Optimization Consultant
Artelys Canada, Inc.
Tuesday April 4, 9:10-10am

This presentation focuses on the use of Artelys Crystal Energy Planner, a software platform within the Artelys Crystal Suite dedicated to short- and medium-term optimization of power generation assets (thermal, gas, hydro, etc.). Our users are mostly energy producers seeking to increase their profits or reduce their generation costs. Artelys Crystal Energy Planner integrates an accurate and complete model of their energy system (production assets, supply and sales contracts, markets, stocks). It takes all the constraints into consideration to generate a detailed optimal production plan using state-of-the-art combinatorial optimization techniques. The ergonomic user interface is designed to ease data input and to visualize and export production plans.
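
As a rough, hypothetical illustration of the kind of production-plan optimization described above (not the Artelys Crystal API), here is a two-asset dispatch model sketched in Python with PuLP; all figures are invented.

```python
import pulp

hours = range(4)
demand = [90, 120, 150, 110]          # hypothetical load per hour (MW)
thermal_cost, hydro_cost = 45.0, 5.0  # hypothetical marginal costs (per MWh)
thermal_cap, hydro_cap = 100.0, 60.0  # hypothetical capacities (MW)
hydro_energy = 150.0                  # hypothetical stored-energy budget (MWh)

m = pulp.LpProblem("dispatch", pulp.LpMinimize)
t = pulp.LpVariable.dicts("thermal", hours, lowBound=0, upBound=thermal_cap)
h = pulp.LpVariable.dicts("hydro", hours, lowBound=0, upBound=hydro_cap)

# Minimize total generation cost.
m += pulp.lpSum(thermal_cost * t[i] + hydro_cost * h[i] for i in hours)
for i in hours:
    m += t[i] + h[i] == demand[i]                         # meet the load each hour
m += pulp.lpSum(h[i] for i in hours) <= hydro_energy      # limited stored water

m.solve()
print([(t[i].value(), h[i].value()) for i in hours])      # hourly production plan
```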

How to Deploy Your Analytic Models to Empower Non-Technical Business Users

Jim Williams
FICO
Monday, April 3, 10:30-11:20am

You have a team with a strong analytics background. They have developed advanced analytical tools using Python, R, or your current traditional optimization solver. They have derived crucial insights from your data and figured out how your decisions shape your customers’ behaviors. Now it’s time to put these critical analytical insights in the hands of your non-technical business users.

In this tutorial, we will cover how FICO’s Optimization Suite (including Xpress and Optimization Modeler) makes it possible to embed your analytic models in user-friendly, business-facing applications. Learn how you can supercharge your analytic models with simulation, optimization, reporting, what-if analysis and agile extensibility.

Creating Interactive Analytics on the Web with Forio Epicenter

Forio
Monday, April 3, 3:50-4:40pm

Forio Epicenter makes your model available to hundreds of people within your organization through the browser. Forio Epicenter supports R, Python, Julia and other languages for optimization, machine learning, simulation, and other analytics techniques. The platform is enterprise-compatible with the ability to integrate with an organization’s existing IT infrastructure and tiered control for thousands of users. We will start with an introduction to Epicenter and sample interactive online models. Then we’ll divide the workshop into two parts. In the first part, we will teach you how to get your analysis on a server so it can be shared. In the second part, we’ll focus on creating a user interface for your model.

AnalyticSolver.com: Data Mining, Simulation and Optimization in Your Web Browser

Daniel Fylstra, President
Frontline Systems, Inc
Tuesday, April 4, 11:30am-12:20pm

AnalyticSolver.com is the new, simple, point-and-click way to create and run analytic models using only your web browser – and it also works interchangeably with your spreadsheet. Whether you need forecasting, data mining and text mining, Monte Carlo simulation and risk analysis, or conventional and stochastic optimization, you can “do it all” in the cloud. We’ll show how you can upload and download Excel workbooks, pull data from SQL Server databases and Apache Spark Big Data clusters, solve large-scale models, and visualize results – without leaving your browser. If you’re more comfortable working on your own laptop or server, we’ll show how you can do that, too. Bonus: “Learn it all” in the cloud, at your own pace with our web-based, self-service courses – complete with quizzes, midterm, final, and certificate of completion.

Combining Predictive and Prescriptive Models in Gurobi: A Simple Case Study

Dr. Daniel Espinoza, Senior Developer
Gurobi Optimization
Monday, April 3, 11:30am-12:20pm

Companies now typically gather fine-grained historical and real-time data that tracks sales, operations, and many other aspects of the business or industry. Applying readily available statistical modeling tools to this data has led to big improvements in many industries. However, when using those statistical models as the basis for making better decisions, companies often couple sophisticated statistical techniques with very simple, often ad-hoc decision-making approaches, resulting in systems that either ignore important aspects of the overall business environment or are difficult to adapt over time as business conditions change. In this tutorial, we’ll illustrate how general-purpose optimization tools like Gurobi can be used to tackle a well-known business problem. From there, we’ll incorporate several different special business situations and conditions into the model. The result is an easy-to-understand application that delivers superior results quickly. We’ll illustrate our example with Python implementations of the statistical model and the optimization model.
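
The following is one possible sketch of that predictive-plus-prescriptive pattern (the tutorial's own example may differ), assuming gurobipy and scikit-learn and using invented price/sales history: a linear demand model is fitted, then Gurobi chooses the revenue-maximizing price subject to a hypothetical supply limit.

```python
import numpy as np
from sklearn.linear_model import LinearRegression
import gurobipy as gp
from gurobipy import GRB

# Hypothetical historical data: price charged vs. units sold.
prices = np.array([[8.0], [9.0], [10.0], [11.0], [12.0]])
sales = np.array([120.0, 105.0, 95.0, 80.0, 70.0])

# Predictive step: fit a simple linear demand model, demand = a + b * price.
reg = LinearRegression().fit(prices, sales)
a, b = float(reg.intercept_), float(reg.coef_[0])  # b should come out negative

# Prescriptive step: choose the price that maximizes revenue = price * demand.
m = gp.Model("pricing")
p = m.addVar(lb=8.0, ub=12.0, name="price")
m.setObjective(a * p + b * p * p, GRB.MAXIMIZE)     # concave quadratic objective
m.addConstr(a + b * p >= 0, name="nonnegative_demand")
m.addConstr(a + b * p <= 110, name="capacity")      # hypothetical supply limit
m.optimize()

print(f"optimal price: {p.X:.2f}, predicted demand: {a + b * p.X:.1f}")
```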

Develop a Data Science Project Using Predictive Analyses and Decision Optimization

Ted Fischer & Vincent Beraudier
IBM
Tuesday, April 4, 11:30am-12:20pm

In this tutorial, you will learn how to develop a data science project for a marketing campaign planning use case, leveraging the IBM Decision Optimization (CPLEX) Python API and cloud service, SPSS Predictive Analytics, and the IBM Data Science Experience.
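
A minimal sketch of the prescriptive piece, assuming the docplex Python API; the customers, costs, budget, and response scores below are hypothetical stand-ins for output from the predictive (SPSS) step.

```python
from docplex.mp.model import Model

scores = [0.42, 0.10, 0.77, 0.55, 0.31]  # hypothetical predicted response probabilities
costs = [5.0, 5.0, 8.0, 8.0, 5.0]        # hypothetical cost of contacting each customer
budget = 18.0                            # hypothetical campaign budget

mdl = Model(name="campaign_planning")
contact = mdl.binary_var_list(len(scores), name="contact")

# Spend at most the budget, and maximize the expected number of responses.
mdl.add_constraint(mdl.sum(costs[i] * contact[i] for i in range(len(scores))) <= budget)
mdl.maximize(mdl.sum(scores[i] * contact[i] for i in range(len(scores))))

solution = mdl.solve()
print([i for i in range(len(scores)) if contact[i].solution_value > 0.5])
```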

Optimization Modeling Tools from LINDO Systems

Mark Wiley
LINDO Systems, Inc.
Monday, April 3, 11:30am-12:20pm

Exceptional ease of use, the widest range of capabilities, and flexibility have made LINDO software the tool of choice for thousands of Operations Research professionals across nearly every industry for over 30 years. LINDO offers a full range of solvers to cover all your optimization needs. The Linear Programming solvers handle million-variable/constraint problems fast and reliably. The Quadratic/SOCP/Barrier solver efficiently handles quadratically constrained problems. The Integer solver works fast and reliably with LP, QP and NLP models. The Global NLP solver finds the guaranteed global optimum of nonconvex models. The Stochastic Programming solver has a full range of capabilities for planning under uncertainty.

Get all the tools you need to get up and running quickly. LINDO provides a set of versatile intuitive interfaces to suit your modeling preference.

What’sBest! is an add-in to Excel that you can use to quickly build spreadsheet models that managers can use and understand.

LINGO provides a full-featured modeling language for expressing complex models clearly and concisely, and it has links to Excel and databases that make data handling easy.

LINDO API is a callable library that allows you to seamlessly embed the solvers into your own applications.

You can pick the best tool for the job based upon who will build the application, who will use it, and where the data reside. Technical support at LINDO is responsive and thorough – whether you have questions about the software or need some guidance on handling a particular application. Get started today. Visit our booth or www.lindo.com to get more information and pick up full capacity evaluation licenses to try them out on your toughest models.

Integrated Business Planning (IBP) Drives Decision Making Across the Extended Business

Henry Canitz, Director of Product Marketing & Business Development
Logility, Inc.
Tuesday, April 4, 11:30am-12:20pm

Logility Voyager Integrated Business Planning™ combines volumetric and financial information with powerful analysis capabilities and collaborative workflow to align and synchronize your strategic and tactical planning processes. In this session, we show how Voyager IBP revolutionizes business planning by uniting volumetric and financial information into one flexible planning and decision support system to accelerate, direct and optimize business decisions.

Span traditional S&OP, SIOP, financial and strategic planning

Synchronize strategic and tactical analysis over multiple time horizons

Perform fast planning simulations, comparisons and what-if scenarios

Identify and assess risk to ensure mitigation and prompt response to disruptions

Monitor, measure and report based on Key Performance Indicators (KPIs)

Enable descriptive, diagnostic, predictive, and prescriptive analytics

Deploying Machine Learning and Optimization on the Opalytics Cloud Platform

Dr. David Simchi-Levi, Chairman and Co-founder
Opalytics
Monday, April 3, 3:50-4:40pm

Most analytics applications in supply chain and operations involve some form of forecasting and optimization. More generally, these are called predictive and prescriptive analytics. We will show how to rapidly deploy predictive and prescriptive analytics on the Opalytics Cloud Platform. One example we will show is the deployment of price optimization using machine learning and optimization.

An Introduction to ODH-CPLEX and Recent Computational Results

Optimization Direct
Monday, April 3, 9:10-10am

ODHeuristics is a general-purpose program built on CPLEX for obtaining good feasible solutions to MIPs. It is intended for use on large-scale MIP models, many of which are so computationally onerous that it is not possible to raise the best bound at all beyond the root solve.

Whilst these good solutions are useful they do not provide the optimality guarantee that many users require. ODH-CPLEX is the CPLEX optimizer in which ODHeuristics is embedded using the standard CPLEX API. On computers with many cores, it delivers the benefits of ODHeuristics whilst using CPLEX to provide optimality measures (the “gap”). Providing good solutions early can accelerate the CPLEX solve.

On small scale test sets such as MIPLIB2010, ODH-CPLEX performs on average as well as CPLEX alone, i.e. the benefits of ODHeuristics compensate for the resources spent on it.

On large scale MIPs it provides good solutions and optimality measures that are often beyond the reach of traditional optimization methods.

It is designed for scheduling problems but works for any MIP that has a reasonable number of integer feasible solutions. It has been deployed effectively on packing, supply chain, and telecom problems as well as scheduling applications.

This talk reviews ODH-CPLEX performance on standard test sets and on large scale user MIPs. Features of models which suggest that ODH-CPLEX might work well are identified and the benefits of parallelism explored.

Choosing the Right Granularity: How Deep Should I Model?

Palisade Corporation
Tuesday, April 4, 9:10-10am

Gustavo Vinueza
Trainer, Consultant
798 Cascadilla Street, Ithaca, NY 14850
+1 607 277 8000
gvinueza@palisade.com

Imagine you are in a plane: sometimes you need a 10,000-foot view, sometimes a 30,000-foot view, and sometimes you literally need to touch the ground. You need to target the right audience with the right numbers so that the information gap is minimal. You need to show the model to an operations manager, a financial analyst, and the owner of the company. Choosing the right granularity is key to answering the same questions at different levels of aggregation. In this presentation, @RISK models are shown in which distribution definitions, correlation matrices, outputs, and graphical reports are used accordingly, in order to send the right message to the right people. Recommendations and tips will be shared on how to categorize, how to generate partial results, and how to get the most out of them in order to get right to the point at any level.

Analytics Model Review and Validation

Princeton Consultants
Irv Lustig, PhD
Optimization Principal
Tuesday, April 4, 9:10-10am

Acting as an independent third party, Princeton Consultants reviews analytics models and how they are deployed in a business. Through our Advanced Analytics Model Review and Validation service, we ask questions such as: What is a correct model? What data is being integrated and how? How are solutions published and used in the business? How sensitive are the answers to the inputs? Did the implemented model reflect the intentions of the practitioner? In this tutorial, Irv Lustig will illustrate the importance of addressing these questions in the context of deploying advanced analytics models in practice.

Text Analytics Software

Normand Peladeau
Provalis Research
Monday, April 3, 11:30am-12:20pm

QDA Miner is easy-to-use qualitative and mixed-methods software that meets the needs of researchers who perform qualitative data analysis and would like to code larger amounts of documents more quickly and more consistently. It offers high-level computer assistance for qualitative coding, with innovative text search tools that help users speed up the coding process, as well as advanced statistical and visualization tools. Users with even bigger text data can also take advantage of WordStat. This add-on module to QDA Miner can be used to analyze huge amounts of unstructured information, quickly extract themes, find trends over time, and automatically identify patterns and references to specific concepts using categorization dictionaries.

Building and Solving Optimization Models with SAS

Rob Pratt, Senior R&D Manager
SAS Institute, Inc
Monday, April 3, 10:30-11:20am

SAS provides comprehensive data and analytic capabilities, including statistics, data/text mining, forecasting, and operations research methods: optimization, simulation, and scheduling. The OPTMODEL procedure from SAS provides a powerful and intuitive algebraic optimization modeling language, with unified support for linear programming, mixed integer linear programming, quadratic programming, nonlinear programming, constraint programming, and network-oriented optimization models. We’ll demonstrate PROC OPTMODEL, highlighting its newer capabilities and its support for standard and customized solution approaches.

SAS Visual Statistics

James Harroun
SAS Institute, Inc, Education/GAP
Tuesday, April 4, 1:50-2:40pm

This tutorial teaches the basics of how to build predictive statistical models using SAS Visual Statistics. These models are built using the drag-and-drop functionality of SAS Visual Statistics. Diagnostic and fit statistics are produced to enable the user to evaluate the models built. These models can be decision tree, regression, and general linear models. SAS Visual Statistics provides you with the capability to compare these models. SAS Visual Statistics also allows you to perform cluster analysis and use these clusters or segments to build models for each segment.

Analyzing Unstructured Text Data with the JMP 13 Text Explorer

SAS JMP®
Mia Stephens, Academic Ambassador
JMP Academic Programs
Monday, April 3, 9:10-10am

In the era of big data, a majority of the data captured by organizations is unstructured. Much of this unstructured data is in the form of text – from customer feedback, survey results, emails and texts, web reports, social media and other channels. Analyzing this text-based information is particularly challenging, but the new Text Explorer platform in JMP 13 makes it easy. This platform provides an efficient and interactive tool for analyzing unstructured text data, allowing us to easily extract information and transform unstructured text data into structured information.

In this session, we’ll use case studies to demonstrate how to use the JMP Text Explorer platform to analyze text data. We’ll use a word cloud to visualize word frequency, use latent class analysis to cluster words, and apply other tools to understand underlying themes in unstructured text data. We’ll also see how to create a document term matrix (DTM), and will use the resulting structured data in predictive modeling.
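
For readers who prefer code to pictures, here is a generic sketch of what a document term matrix looks like, built with scikit-learn in Python rather than in JMP; the example documents are invented.

```python
from sklearn.feature_extraction.text import CountVectorizer

docs = [
    "shipping was fast and the product works great",
    "product arrived late and the packaging was damaged",
    "great product, fast shipping, will buy again",
]

# Each row is a document, each column a term, each entry a term count.
vectorizer = CountVectorizer(stop_words="english")
dtm = vectorizer.fit_transform(docs)

print(vectorizer.get_feature_names_out())
print(dtm.toarray())
```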

Learn & Discuss Tips and Tricks for Teaching Analytics to Business Students

Stukent, Inc.
Jeff Dotson, PhD – Associate Professor of Marketing – BYU
Monday, April 3, 3:50-4:40pm

The analytics world evolves at a rapid pace and the ability to keep up with emerging technologies, platforms, tools, and strategies is vital for any instructor.

Jeff is the author of Developing Data Products: A Practical Introduction to Business Data and Analytics, which is published through Stukent, Inc.

Jeff will be speaking on how to effectively use industry-leading tools such as Microsoft’s Power BI, Azure Machine Learning Studio, and Excel to construct a course in business analytics.

Come catch up on all of the latest tips and tricks to teach your students, plus get FREE instructor access to the textbook.

Operational Analytics In The Age of Big Data

Todd Jones, SVP Analytics, WebbMason Analytics
Brad Donovan, Director of Operational Strategy, Blue Cross Blue Shield North Carolina
WebbMason
Monday, April 3, 2:10-3pm

Traditional operational analysis relied on samples of data extracted from machines and processes at fixed time intervals. In the world of Big Data, accurate, complete, second-by-second process data can be extracted, stored, and analyzed to improve our understanding of bottlenecks and opportunities for optimization. In this new era, traditional business strategies, technologies, and organizational structures must be modified in order to keep pace with technology and stay ahead of the competition. With changes in regulations and the rising cost of healthcare, Blue Cross Blue Shield of North Carolina is leveraging Big Data to improve operational analysis and drive efficiency. In this talk, you will see examples of how Blue Cross Blue Shield of North Carolina is leveraging Big Data technologies to provide better visibility into operational processes, data engineering to support self-service analysis, and data science to drive operational efficiency.