Professor; Head of Research Center; Co-Founder
FEUP; INESC; LTPlabs
Finding The Best Product-Delivery Modes For Grocery Retail Stores
Sonae MC is the leading grocery retailer in Portugal. Its store network is divided into three segments: hypermarkets, supermarkets and convenience stores. To supply its 210 stores with about 60,000 products, it relies on two hubs and two specialized warehouses (for fish and meat). Stores are supplied every day with a dedicated, heterogeneous fleet, and transportation-related annual costs amount to around 30 million.
Today, thanks to an optimization-driven approach, Sonae MC has improved the way it supplies its stores and has been able to cut transportation-related costs by more than 4%, while keeping a similar service level to stores. After several iterations, Sonae MC opted for a solution in which the type of products consolidated is kept stable throughout the year (instead of changing daily, for example), because it is easier to operationalize in the warehouses. The stores whose type of supply was changed now receive more products together and at an earlier hour, which allows shelves to be replenished sooner. In this talk we present the decision framework responsible for shaping and tuning the new decisions, as well as the change management challenges faced. At the core of this framework are three linked optimization models that use both mathematical programming and metaheuristics.
In the food retail sector, maintaining food quality across the supply chain is of vital importance. Product quality depends on storage and transportation conditions. This peculiarity increases supply chain complexity relative to other types of retailers, leading to three intertwined food supply chains: frozen, chilled and ambient. Although products with different temperature requirements need to be stored separately, they can be transported together (consolidation) as long as the required transportation conditions are ensured. In fact, choosing which consolidations to perform is a combinatorial problem, as each store can be supplied separately for each category of products (temperature) or via a consolidation. However, these consolidations have to take several business restrictions into consideration: stores have pre-defined time-windows for the products, according to their replenishment policies, that need to be respected; preparation times at the different storage temperatures may differ, and some consolidation combinations may not be possible; and warehouses have finite capacities, which may constrain the consolidation of products as it increases the flow of materials passing through them. In this context, finding the best product-delivery mode is a hard problem that needs to be tackled with the right analytical tools, with the practical side-constraints embedded.
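As an illustration of the combinatorial choice described above, the sketch below enumerates the five possible ways of grouping the three temperature flows for a single store and picks the cheapest feasible one. The trip and handling costs are purely hypothetical, not Sonae MC figures, and the cost model is a deliberate simplification.

```python
# The three temperature-controlled flows described in the text.
# Each partition is one product-delivery mode: a singleton block is a
# separate supply, a larger block is a consolidation.
PARTITIONS = [
    (("frozen",), ("chilled",), ("ambient",)),  # fully separate supply
    (("frozen", "chilled"), ("ambient",)),
    (("frozen", "ambient"), ("chilled",)),
    (("chilled", "ambient"), ("frozen",)),
    (("frozen", "chilled", "ambient"),),        # full consolidation
]

def delivery_cost(block, trip_cost, handling_cost):
    """One delivery = a fixed trip plus a handling surcharge for every
    extra category consolidated onto the same vehicle (toy model)."""
    return trip_cost + handling_cost * (len(block) - 1)

def best_mode(trip_cost=100.0, handling_cost=30.0, feasible=None):
    """Cheapest feasible product-delivery mode for a single store;
    `feasible` can exclude partitions ruled out by business restrictions."""
    candidates = PARTITIONS if feasible is None else feasible
    return min(
        candidates,
        key=lambda p: sum(delivery_cost(b, trip_cost, handling_cost) for b in p),
    )
```

When consolidation handling is cheap relative to an extra trip, full consolidation wins; when handling becomes expensive, separate deliveries do, which is exactly the trade-off the combinatorial problem explores.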
The applied approach makes two innovative contributions: (1) an optimization approach to finding the best product-delivery mode that is suitable for real-world settings; and (2) clear evidence of the considerable savings that can be achieved with such a practical approach to product-delivery modes in grocery retailing.
The optimization approach comprises three linked models:
- Consolidation Model – A mixed-integer programming model is formulated to identify the best type of consolidation for each store. The objective is to maximize the number of direct shipments to each store, taking into consideration the operational characteristics defined, such as the minimum utilization of the vehicles and the flexibility to change the consolidation over time.
- Distribution Centers Capacity Model – The stores with potential to be supplied directly with consolidated products are ranked to define the priority of allocation to the hub. The stores are then allocated to the hub's time slots according to their preparation time, which depends on the pre-defined time-window, travel time and freight-loading time.
- Solution Evaluation Model – This step considers three cost components: the transportation cost of supplying the consolidating stores directly, the transportation and warehouse cost of moving the products to the consolidation hub, and the transportation cost of supplying the remaining stores with routes. To quantify this last term, a fast Adaptive Large Neighborhood Search algorithm is used.
The quality of the solutions provided, the user-friendly outputs and the automatically generated inputs motivated wide use of the new solutions by the company's practitioners, namely the transportation director. To ensure a smooth transition, adoption was progressive: the groups of stores that promised the greatest savings changed their product-delivery mode first, and only later did the remaining stores follow.
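The talk's actual Adaptive Large Neighborhood Search is not public, but a minimal ALNS skeleton on a single-route instance, with a random-removal and a worst-removal destroy operator, greedy-insertion repair, and adaptive operator weights, could look like the sketch below. All coordinates and parameters are illustrative.

```python
import math
import random

def route_cost(route, coords, depot=(0.0, 0.0)):
    """Length of the tour depot -> stores in order -> depot."""
    pts = [depot] + [coords[s] for s in route] + [depot]
    return sum(math.dist(a, b) for a, b in zip(pts, pts[1:]))

def destroy_random(route, k, coords, rng):
    """Remove k random stores from the route."""
    removed = rng.sample(route, k)
    return [s for s in route if s not in removed], removed

def destroy_worst(route, k, coords, rng):
    """Remove the k stores whose removal shortens the route the most."""
    base = route_cost(route, coords)
    def saving(s):
        return base - route_cost([t for t in route if t != s], coords)
    removed = sorted(route, key=saving, reverse=True)[:k]
    return [s for s in route if s not in removed], removed

def repair(route, removed, coords):
    """Greedy cheapest insertion of each removed store."""
    for s in removed:
        pos = min(
            range(len(route) + 1),
            key=lambda i: route_cost(route[:i] + [s] + route[i:], coords),
        )
        route.insert(pos, s)
    return route

def alns(coords, iters=300, seed=0):
    rng = random.Random(seed)
    ops = [destroy_random, destroy_worst]
    weights = [1.0, 1.0]  # adaptive operator weights
    best = list(coords)
    for _ in range(iters):
        i = rng.choices(range(len(ops)), weights)[0]
        partial, removed = ops[i](best[:], 2, coords, rng)
        cand = repair(partial, removed, coords)
        if route_cost(cand, coords) < route_cost(best, coords):
            best = cand
            weights[i] += 0.5  # reward the operator that just improved the tour
    return best, route_cost(best, coords)
```

A production version would add acceptance of worsening moves, weight decay, and capacity and time-window constraints; the destroy/repair/adapt loop above is only the bare mechanism.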
The implications of the framework go beyond better decisions: they extend to a reinforced engagement in looking at operations from a different perspective, with an attitude focused on cost optimization that may yield solutions not easily devised empirically.
This talk focuses on the following topics:
- How to efficiently find the best product-delivery modes for grocery retail stores.
- How to account for operational details in tactical supply chain decisions, both in terms of demand patterns and detailed costing.
- How to design a linked optimization framework to fine-tune the consolidation parameters, test tactical decisions and smooth the transition to a real-world implementation.
- The advantages of an approach that goes deep into retail operations over traditional optimal solution methods that require several theoretical assumptions.
- An approach to change management that provides accurate, detailed and targeted information about future indicators through the operational and managerial outputs of the optimization framework.
Pedro Amorim holds both a PhD and an MSc in Industrial Engineering and Management from the University of Porto. He is now a Professor at the Faculty of Engineering of the University of Porto and was a visiting scholar at Carnegie Mellon University. He co-founded the analytics consultancy LTPlabs. Pedro Amorim develops and applies advanced analytical models and methods to help make better decisions, solving managerial problems in various domains (telecom, manufacturing, pharma, retail and logistics). He has conducted over 20 industry-based research and consulting projects with various companies, also involving the science and technology foundations of several countries. Pedro Amorim has given over 50 oral presentations at international conferences and seminars, including the Production and Operations Management Society Conferences, the European Conferences on Operational Research, and INFORMS Conferences.
AL International Expert, Applied Mathematics R&D
A Suite Of Supply Chain Optimization Tools: Insights Gained From Academic Research To Worldwide Industrial Deployment
In recent years, Air Liquide has developed and deployed a suite of supply chain optimization tools supporting tactical and strategic decision-making throughout its worldwide operations. To accelerate this development, we sponsored academic research on customer tank allocation and fleet sizing at Virginia Tech through the Center for Excellence in Logistics and Distribution (CELDi). In-house, we developed a sourcing tool to assign customers to product sources and distribution depots.
Based upon our experience, this presentation will offer insights and specific industrial examples for the successful deployment of tools considering the following key elements: a clear business case with buy-in from management; establishment of an effective academic research collaboration, if appropriate; proactive, effective resolution of challenges as they arise; buy-in from the actual end-users of the tool; promotional and training materials for end-users and other stakeholders; an effective, sustainable user interface; and a sustainable platform for enterprise-wide deployment and support.
Jeff Arbogast is an Air Liquide International Expert in the Applied Mathematics R&D Group at Air Liquide’s Delaware Research and Technology Center. Jeff led an R&D project focused on the development of decision support tools for optimization of the bulk distribution supply chain. Through Air Liquide’s collaboration with Virginia Tech on this topic, Jeff has served as a leader of the industrial advisory board of the Center for Excellence in Logistics and Distribution (CELDi) since 2014. Previously, he managed a Process Control & Logistics R&D group and projects related to regulatory control loop tuning, alarm management, human-machine interface design, and the application of analytics for plant reliability. Jeff received his BS and Ph.D. degrees in Chemical Engineering from Virginia Tech and the University of Connecticut, respectively.
A Solution To Reduce Manufacturing Lead Time And Working Capital Using Monte Carlo Simulation And Predictive Analytics
The core technology of today’s leading material requirements planning (MRP) software was developed in the 1960s and is based on a deterministic approach using point estimation of average behavior. Such tools cannot effectively handle lead time uncertainties that originate from the supply and manufacturing processes, leading to unnecessary inventories and a prolonged production span. Traditionally, lead time compression has focused on decreasing the lead time of individual parts based on deterministic characteristics. In this paper, we introduce a string concept to identify the root cause of lead time inefficiency within a risk-adjusted view of the total end-to-end lead time. We use this method for complex products, where tens of thousands of unique strings are prioritized to show the true critical paths of the product. Inventory levels are maintained for parts over time specifically to cover lead time risks at the string level and at the platform level, preventing material shortages and improving the on-time completion of the end products.
Many previous studies of planning under uncertainty have been conducted in the fields of artificial intelligence (AI) and operations research (OR). In AI, uncertainties are modeled with Bayesian statistics and Markov chains. In OR, the algorithms developed involve stochastic programming and fuzzy mathematical programming. However, many of these models come up short in solving real-world problems, as they not only assume analytical models of finite states but also typically use smaller data sets that are clean and complete. Such models can easily become intractable even if only a few hundred nodes are involved. As we have seen time and time again, real-world planning problems in the manufacturing industry are large-scale and complex, and real-world data, such as data collected from legacy systems and fragmentary information for new parts and suppliers, is often inconsistent and incomplete. The key contribution of this project is a novel solution to both the data and computational complexity issues associated with planning under uncertainty. Our approach combines machine learning models and Monte Carlo simulation, implemented on the latest big-data platform. Our solution employs a massive Monte Carlo simulation to estimate the risks at the platform, string and part levels in an SAP HANA in-memory database. We have built predictive models to derive distributions for elements with missing or incomplete data using R, a powerful analytics language built into SAP HANA.
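As a hedged sketch of the string-level Monte Carlo idea (not the implementation described above, which runs in SAP HANA with R-fitted distributions), the following simulates end-to-end lead times for a platform whose strings are serial chains of lognormally distributed part lead times, and reports the median and 95th-percentile risk. All distribution parameters are invented for illustration.

```python
import random

def simulate_platform(strings, n=20000, seed=1):
    """Monte Carlo estimate of end-to-end lead-time risk.

    strings maps a string name to a list of (mu, sigma) lognormal
    parameters, one pair per part on that path; a string's lead time is
    the sum of its parts, and the platform finishes when its slowest
    string does.  Returns (median, 95th percentile) of platform span."""
    rng = random.Random(seed)
    totals = sorted(
        max(
            sum(rng.lognormvariate(mu, sigma) for mu, sigma in parts)
            for parts in strings.values()
        )
        for _ in range(n)
    )
    return totals[n // 2], totals[int(0.95 * n)]
```

The gap between the median and the 95th percentile is the lead-time risk that deterministic MRP point estimates cannot see, and it is this tail that the string-level inventory buffers described above are sized against.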
We successfully applied our solution at one of the largest OEM manufacturers in the A&D industry. The client has been able to realize many short-term and long-term benefits, including:
– A reduction in total product span time by more than 30%
– The elimination of 90% of annual long lead financial exposure from one aircraft program
– A reduction in total inventory (raw/WIP) by 15-25% over the delivery lifecycle
Besides its industrial application, the success of this combined approach can also help the academic community rethink the way large-scale planning problems in uncertain domains have been studied in the past.
Kevin Brannon is a Senior Manager within the Manufacturing practice of Deloitte Consulting LLP. He has over 15 years of industry experience in supply chain, focusing primarily on discrete, highly engineered sectors such as Aerospace & Defense. Kevin’s client work typically leads with an analytics-driven approach to diagnosing operational issues, followed by delivery of an operational improvement program. His most recent work has focused on lead time reduction, working capital management and direct labor cost reduction. He received an MBA in finance from the Wharton School of the University of Pennsylvania and his BS in Business Management from the Georgia Institute of Technology.
Leder & Schuh AG
Fashion Retail Assortment Planning: Methods And Solutions
A retailer’s assortment is defined by the set of products carried in each store at each point in time. The goal of assortment planning is to specify an assortment that maximizes sales or gross margin subject to various constraints. Assortment planning is a challenge for an organization running hundreds of stores in several countries, especially when its stores greatly vary in size and location type. In this talk, analytical, technical and organizational aspects of a shoe retailer’s assortment planning process are explored. The assortment planning method incorporates the individual stores’ assortment strengths and weaknesses, aiming at maximizing the stores’ key performance indicators. A complementary clustering approach is used to implement overall assortment strategy. The talk will also outline how planning lead-time was significantly improved and how the organization dealt with cultural, organizational and change management issues.
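The assortment objective stated above (maximize sales or gross margin subject to constraints) can be illustrated, in a much-simplified single-store form, as a 0/1 knapsack over shelf space. The retailer's actual process is far richer, and the products and numbers below are hypothetical.

```python
def plan_assortment(products, capacity):
    """0/1 knapsack: choose the product set that maximizes gross margin
    subject to an integer shelf-space capacity (toy model).

    products: list of (name, space, margin) tuples."""
    # best[c] = (best margin, chosen products) using at most c space units
    best = [(0.0, ())] * (capacity + 1)
    for name, space, margin in products:
        # Iterate capacity downwards so each product is used at most once.
        for c in range(capacity, space - 1, -1):
            m, chosen = best[c - space]
            if m + margin > best[c][0]:
                best[c] = (m + margin, chosen + (name,))
    return best[capacity]
```

Even this toy form shows why per-store assortments differ: the same product list yields different optimal sets as the capacity parameter changes, mirroring stores that greatly vary in size.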
Markus Kohlbacher has more than 10 years of expertise in the Analytics, Decision and Management Science fields. Prior to joining Leder und Schuh AG, he worked as a management consultant primarily on large-scale reorganization programs, strategy development, business analytics, predictive analytics, and process optimization projects. He has published his work in academic journals such as the Business Process Management Journal, Competitiveness Review, International Journal of Productivity and Performance Management, Process and Knowledge Management, and The Service Industries Journal. He also frequently speaks at international management and technology conferences. He holds a PhD in Industrial Management and an MSc in Telematics, both from Graz University of Technology.
Predictive Analyst, Researcher
Product Lifecycle Extraction And Clustering Based On Ingram Micro's Multi-Level Electronics Distribution Data
Good characterizations of product lifecycles, including stages of growth, maturity and decline, are important for economical purchasing, pricing, marketing and investment decisions in the supply chain industry. Ingram Micro, a leader in B2B electronics distribution, has developed a tool that infers lifecycle phases directly from internal transactional data, panel data and Ingram Micro market share data. The approach uses (a) structural decomposition techniques based on scatterplot smoothing and (b) empirical mode decomposition. The extracted trends are then categorized via clustering methods (including k-means and hierarchical clustering) to detect similar sales lifecycle patterns across products. We also provide average profiles of such lifecycles that can be used to detect crucial phases in a product’s timeline, for example trigger points of growth, maturity and decline. Given the wide variety of products that Ingram Micro carries in its portfolio, we have an enriched database in which we can identify not only product lifecycle differences among categories, but also the related impact on adjacent product categories. Lastly, product lifecycle descriptions can also be a critical piece of information to help drive pricing and marketing decisions, for example when starting to sell a new product or service in popular technology domains in this competitive market.
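A toy version of the pipeline (trend extraction followed by clustering) might smooth each sales series with a moving average, a crude stand-in for the scatterplot smoothing and empirical mode decomposition named above, and then run plain k-means on the smoothed curves. The sales series below are synthetic, and the deterministic seeding is a simplification of real k-means initialization.

```python
import math

def smooth(series, window=3):
    """Centered moving average: a stand-in for the trend-extraction step."""
    half = window // 2
    out = []
    for i in range(len(series)):
        seg = series[max(0, i - half):i + half + 1]
        out.append(sum(seg) / len(seg))
    return out

def kmeans(curves, k=2, iters=20):
    """Plain k-means on equal-length trend curves (k >= 2); centers are
    seeded deterministically with curves spread across the input."""
    n = len(curves)
    centers = [list(curves[i * (n - 1) // (k - 1)]) for i in range(k)]
    labels = [0] * n
    for _ in range(iters):
        # Assign each curve to its nearest center (Euclidean distance).
        labels = [min(range(k), key=lambda j: math.dist(c, centers[j]))
                  for c in curves]
        # Recompute each center as the mean of its member curves.
        for j in range(k):
            members = [c for c, lab in zip(curves, labels) if lab == j]
            if members:
                centers[j] = [sum(v) / len(members) for v in zip(*members)]
    return labels, centers
```

The resulting cluster centers play the role of the "average lifecycle profiles" mentioned above: curves in one cluster share a growth shape, curves in another a decline shape.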
Olaf Menzer is a Predictive Analyst at Ingram Micro in Irvine, CA, United States. He received an M.Sc. in Bioinformatics from Friedrich Schiller University in Jena, Germany, and a Ph.D. in Geographic Information Science from the University of California, Santa Barbara. His dissertation applied statistical modeling to ecosystem science and was entitled “Temporal and Spatial Modeling of Urban Carbon Dioxide Fluxes Using a Data Based Approach”. Olaf is responsible for providing data-driven solutions to Ingram Micro’s Global Business Intelligence and Analytics department. Projects he has recently focused on include demand forecasting for inventory stocking optimization and synthesizing large contact databases. Previously, Olaf was a teaching assistant at the University of California, Santa Barbara for courses such as Environmental Decision Making and Spatial Reasoning. As a graduate student intern in the Data Science and Technology Department at Lawrence Berkeley National Lab, he supported the generation of new data synthesis products from a large data set covering ecosystems in the Americas. He received the 2014-2015 Dean’s fellowship at the University of California, Santa Barbara.
Speakers organized by Track
2016 Franz Edelman Award Competition
Analytics Leadership & Soft Skills
Decision & Risk Analysis
Fraud Detection & Cyber Security
Health Care & Life Sciences
INFORMS Prizes & Special Sessions
Internet of Things
Revenue Management & Pricing
Sports & Entertainment
Supply Chain Analytics
Technology Workshops – Sunday