Operations research and analytics are driving advancements in government that touch nearly every part of our lives. From improving disaster relief efforts after a storm, to enhancing access to healthcare, to criminal justice and immigration reform, to ensuring our national security, analytics is saving lives, reducing costs, and improving productivity across the private and public sectors. Just as business leaders have used O.R. and analytics to make smart business decisions, policymakers in government have increasingly turned to these modern tools to analyze important policy questions. Come see how the latest applications of analytics are solving public policy problems.
Interactive Simulations in Support of Warfighters, Intel Analysts and Policy Makers
The Open SIPmath™ Standard from 501(c)(3) nonprofit ProbabilityManagement.org allows simulations in any environment to be networked by communicating uncertainties as arrays of Monte Carlo realizations called SIPs. This presentation will show how to roll up operational risk in native Excel or in other computing environments that support arrays. This sort of analysis is particularly applicable to infrastructure such as roads, bridges, communications networks, and pipelines. Examples will include portfolios of mitigations for gas pipeline risk, military communications networks, and protection against flooding of coastal regions. The presentation is for all Excel users who make decisions under uncertainty, so bring your laptop. No statistical background is assumed, but for those with extensive training in the area, this session should repair the damage. We encourage all participants to download some of the companion models to our presentation in advance; they are available at https://www.probabilitymanagement.org/models. We also encourage you to read our ORMS Today article, "Probability Management: Rolling up operational risk at PG&E," which can be downloaded at https://www.informs.org/ORMS-Today/Public-Articles/December-Volume-43-Number-6/Probability-Management-Rolling-up-operational-risk-at-PG-E.
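The roll-up idea travels beyond Excel: a SIP is simply an array of trial values, and a portfolio SIP is the trial-by-trial sum of its components, which preserves any dependence across trials. The sketch below, with invented pipeline-segment names and lognormal loss distributions, illustrates the principle rather than the SIPmath standard itself:

```python
import numpy as np

rng = np.random.default_rng(seed=1)
N = 10_000  # number of Monte Carlo trials per SIP

# Hypothetical SIPs: annual loss (in $M) for three pipeline segments.
# Each SIP holds N realizations; the same trial index across SIPs
# represents one coherent scenario, so sums are taken trial-by-trial.
segment_a = rng.lognormal(mean=0.0, sigma=0.5, size=N)
segment_b = rng.lognormal(mean=0.2, sigma=0.7, size=N)
segment_c = rng.lognormal(mean=-0.3, sigma=0.4, size=N)

# Roll up: the element-wise sum is itself a SIP of portfolio loss.
portfolio = segment_a + segment_b + segment_c

# Chance the portfolio loss exceeds a (hypothetical) $6M budget.
p_exceed = (portfolio > 6.0).mean()
print(f"P(loss > $6M) = {p_exceed:.3f}")
```

Because the roll-up is just array addition, any environment that supports arrays (spreadsheets, R, Python, databases) can participate in the same networked simulation.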
Using Multiattribute Decision Analysis for Public-Sector Decisions
Since businesses exist to make money, the primary criterion for business decisions usually comes down to money (or occasionally proxies for future streams of money, such as market share). In the public sector this is not the case: there are typically many stakeholders with diverse and divergent values that must be taken into account. The techniques of multiattribute decision analysis (MADA) may not make such problems actually easy, but they can help avoid certain common pitfalls. MADA can make value differences explicit so they can be understood and dealt with directly, identifying the necessary tradeoffs. This talk will lay out a technically sound and easy-to-apply approach for multiattribute problems, based on an additive value model. The emphasis will be on sound, practical methods that are understandable by clients without special training in analytics or operations research, and on clear presentation of results. Several common technical errors, some of them surprisingly popular, will be pointed out. Other topics addressed will include costs, uncertainty, portfolio decisions, and sensitivity analysis.
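An additive value model of the kind the talk is based on scores each alternative as a weighted sum of single-attribute values. The sketch below scores two hypothetical alternatives; the attributes, swing weights, and scores are invented for the example:

```python
# Swing weights for each attribute, normalized to sum to 1.
# (Attributes and numbers are illustrative only.)
weights = {"cost": 0.40, "safety": 0.35, "public_acceptance": 0.25}

# Single-attribute value scores on a common 0-100 scale.
scores = {
    "Option A": {"cost": 80, "safety": 60, "public_acceptance": 70},
    "Option B": {"cost": 50, "safety": 90, "public_acceptance": 65},
}

def additive_value(attr_scores, weights):
    """Additive value model: V(x) = sum_i w_i * v_i(x_i)."""
    return sum(weights[a] * attr_scores[a] for a in weights)

for name, s in scores.items():
    print(f"{name}: {additive_value(s, weights):.2f}")
# Option A scores 70.50, Option B scores 67.75
```

A sensitivity analysis then amounts to varying the weights and watching whether the ranking of alternatives changes.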
Starting from Scratch – An Army “start-up”
LTC Cade Saie
New organizations and companies are created every day in the private sector; in the public sector this is much rarer. The creation of Army Futures Command was the most significant Army reorganization since 1973. The concept of the command is simple: build a better Army for years to come by harnessing artificial intelligence and big data analysis to quickly process information and identify trends that will shape modernization efforts. This presentation will share lessons learned from starting from scratch and address the pitfalls of developing an enduring strategy and capability for a command managing a $30-plus billion modernization portfolio from day one.
Process Mining: The Capability Every Organization Needs
Process mining is an emerging AI/ML technique that may be thought of as an x-ray capability for your organization's processes. It allows you to see where process challenges reside, simulate change assumptions, make corrections with confidence, and quickly re-measure the upgraded ecosystem, capturing return on investment every step of the way. Your organization has next-level strategic advantage hidden within your IT systems. All systems create "data exhaust" rich with process activity trails documenting the actions of users or machines performing business activities. When process ecosystems are not optimized toward meaningful goals, your organization hemorrhages costs unnecessarily. Failing to adopt cutting-edge artificial intelligence to optimize your processes places you at a competitive disadvantage. In this session, you will learn process mining fundamentals, hear impactful cross-industry use cases, and understand why process mining is the capability your organization needs to compete and transform continually.
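One core process-mining operation, discovering a directly-follows graph from an event log, can be sketched briefly. The event log below is a toy example, not data from any real system:

```python
from collections import Counter

# Toy event log: (case_id, activity, timestamp). In practice these rows
# come from a system's "data exhaust"; this log is invented.
event_log = [
    (1, "Receive", 1), (1, "Review", 2), (1, "Approve", 3),
    (2, "Receive", 1), (2, "Review", 2), (2, "Rework", 3),
    (2, "Review", 4), (2, "Approve", 5),
    (3, "Receive", 1), (3, "Reject", 2),
]

# Group events into per-case traces, ordered by timestamp.
cases = {}
for case_id, activity, ts in sorted(event_log, key=lambda e: (e[0], e[2])):
    cases.setdefault(case_id, []).append(activity)

# Count directly-follows relations: the skeleton of a discovered process map.
dfg = Counter()
for trace in cases.values():
    for a, b in zip(trace, trace[1:]):
        dfg[(a, b)] += 1

for (a, b), n in sorted(dfg.items()):
    print(f"{a} -> {b}: {n}")
```

Rework loops and other deviations surface immediately as extra edges (here, Review -> Rework -> Review), which is where cost leakage tends to hide.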
Most commercial and government organizations have massive collections of useful data spanning decades. With that quantity, however, comes the inability to quickly analyze patterns and discern insights. Performing a holistic assessment may require checking and resolving entities across siloed systems, which can take days. In today's technology-driven era, strong solutions on the market leverage advances in big data to help uncover hidden or complex relationships, visualizing and making connections within seconds.
In addition to reviewing the principles and use cases for rapid data discovery, this session takes a deep dive into a successful case study: the Texas Department of Public Safety's Intelligence and Counterterrorism (ICT) Division. ICT tackled these challenges head-on by developing a state-of-the-art analytical data mart. The solution provides ICT analysts a data platform with sophisticated data visualization capabilities, enabling them to obtain results from a number of large data sets in a user-friendly, accurate, and expedient manner on a continuous (24/7) basis.
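The entity resolution mentioned above can be illustrated with a minimal blocking-key sketch; the record sets, sources, and matching rule here are invented for the example, not ICT's actual method:

```python
import re

# Toy records from two "siloed" systems with inconsistent name formats.
dmv_records = [
    {"name": "John Q. Smith", "dob": "1985-03-02", "source": "DMV"},
    {"name": "Maria Lopez",   "dob": "1990-11-17", "source": "DMV"},
]
court_records = [
    {"name": "SMITH, JOHN",   "dob": "1985-03-02", "source": "Court"},
    {"name": "Ana Garcia",    "dob": "1978-07-09", "source": "Court"},
]

def key(rec):
    # Blocking key: sorted name tokens (dropping initials) plus birth year,
    # so "John Q. Smith" and "SMITH, JOHN" normalize to the same key.
    tokens = [t for t in re.findall(r"[a-z]+", rec["name"].lower()) if len(t) > 1]
    return " ".join(sorted(tokens)) + "|" + rec["dob"][:4]

# Index all records by key; keys shared across systems are candidate matches.
index = {}
for rec in dmv_records + court_records:
    index.setdefault(key(rec), []).append(rec)

matches = {k: v for k, v in index.items() if len(v) > 1}
for k, recs in matches.items():
    print(k, "->", [r["source"] for r in recs])
```

Real systems layer fuzzier comparisons on top of blocking, but the key step is what turns a days-long cross-silo search into an indexed lookup that returns in seconds.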