The Decision & Risk Analysis track describes effective ways to aid those who must make complex decisions. In particular, the talks reference systematic, quantitative, and interactive approaches to address choices, considering the likelihood and impact of unexpected, often adverse consequences.
The Art of Decision Framing and Uncertainty Analysis for Clarity of Action
How can we ensure that teams operate at peak efficiency while enabling managers to make high-quality, informed decisions? Across global industries, three powerful ingredients combine to achieve this goal, known as Decision Quality: decision framing, insightful uncertainty analysis, and timely dialogue between decision makers and project teams.
So, what is a decision frame and why is it important? A decision frame is a group’s bounded viewpoint of a decision problem. It matters because while all projects have their share of issues and complexity, not all issues are created equal. Structured framing achieves clarity and consensus quickly, which is critical to defining the range of alternatives that should be considered.
Practical uncertainty analysis brings scenario thinking to the evaluation of alternatives, allowing teams to imagine different possible futures, gain insight, and brainstorm hybrid solutions to consider. The result is well-thought-out clarity of action.
Precision Medicine vs Accurate Medicine? A Critical Decision Fraught With Risk
While scientists can rigorously define the terms accuracy and precision, in healthcare they are used colloquially, and this affects us personally through clinical decisions that involve friends, family, and ourselves. To appropriately use these concepts to guide critical risk management and decision making, we must consider the diverse perspectives and priorities of patients, physicians, payers, pharma, and regulators. We develop comprehensive models of this complexity that can be objectively applied to any disease, implement the model in a “learning environment,” and then use it to identify, prioritize, and quantify elements of risk to improve personal and system-based decision making. We utilize a broad range of analytical tools, e.g., graph theory, stochastic modeling, and signal processing, which we apply as appropriate to the specific question that needs to be addressed and the data that is available, across multiple scales. This approach should be transferable to the analysis of other complex networks and system-level problem areas.
Enrich Your Data with Better Questions
What problem are you trying to solve, and why? What question really needs to be answered by your data analytics initiative? If you can’t answer this, your data analytics initiative will be yet another in a long line of company initiatives, such as TQM, LEAN, and the Decision Sciences, that failed to deliver real value because they were a solution looking for a problem. Avoid this common pitfall by learning to ask leaders better questions, questions that frame opportunities for creating real value. Don’t settle for ad hoc efficiency gains: become a skilled asker!
Analyzing Social Media Data To Identify Cybersecurity Threats: Decision Making With Real-time Data
Theodore “Ted” Allen
In 2018, 27.9% of businesses experienced a cybersecurity breach, losing over 10,000 documents and $3M, according to the Ponemon Institute. Of breaches known to Ponemon, 77% involve the exploitation of existing bugs or vulnerabilities. In our work, we found that incidents occur in narrow time windows around when vulnerabilities are publicized. Can you optimally adjust your cybersecurity policies and decisions to address emerging threats? Analyzing social media will help you preemptively identify important medium-level vulnerabilities, which managers often ignore but which contribute to a large fraction of incidents and warnings. Success requires transforming textual information into numbers, and I present a method, called K-means latent Dirichlet allocation, that identified the Heartbleed vulnerability. I will describe a Bayesian approach as well, and with both methods you can adjust your cybersecurity as social media identifies new hazards. Related opportunities for closed-loop control using Fast Bayesian Reinforcement Learning are also briefly described. Finally, the experimental nature of these methods offers a qualitative benefit: improved maintenance options.
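The core "text to numbers" step the abstract describes can be illustrated with a plain topic model. This is a minimal sketch only, using a tiny made-up corpus and standard latent Dirichlet allocation from scikit-learn; it is not the speaker's K-means LDA pipeline, and the posts, topic count, and feature settings are all illustrative assumptions.

```python
# Sketch: converting social-media posts about vulnerabilities into numeric
# topic features. Corpus and parameters are hypothetical, for illustration.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation

posts = [
    "openssl heartbeat bug leaks server memory",
    "patch your openssl servers now heartbeat exploit active",
    "new phishing campaign targets payroll employees",
    "phishing email spoofs payroll login page",
]

# Bag-of-words counts: the basic transformation of text into numbers.
vectorizer = CountVectorizer()
counts = vectorizer.fit_transform(posts)

# Fit a small LDA model; each post becomes a vector of topic weights.
lda = LatentDirichletAllocation(n_components=2, random_state=0)
topic_weights = lda.fit_transform(counts)  # shape: (4 posts, 2 topics)

# Posts about the same emerging threat tend to load on the same topic,
# giving a numeric signal that a vulnerability is trending.
print(topic_weights.shape)
```

Once posts are numeric topic vectors, spikes in a topic's weight over time can serve as the kind of early-warning signal the talk describes.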
Can We Do Better than Garbage-In – Garbage-Out?
Can an analytics approach that receives poor-quality (aggregated) data produce useful outputs, outputs that have low mean squared error and are calibrated? A team from IDI confronted this question as part of a research project with the Intelligence Advanced Research Projects Activity (IARPA) to mitigate insider threats. Here, insider threats are people who are driven by rage, national loyalty, or profit to steal, destroy, or sabotage data from an organization. This talk describes the motives and behaviors of insider threats and details of our multi-modeling solution, which includes data elicitation activities to address missing data (e.g., correlations). The modeling techniques used range from discrete event simulation to copulas to stochastic optimization for simulation populations, and from random forests to support vector machines to naïve Bayesian networks to neural networks for down-selecting the potential threats.
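The down-selection step the abstract mentions can be sketched with one of the classifiers it lists, a random forest. This is a hedged illustration only: the behavioral features, labels, and shortlist size below are synthetic assumptions, not IDI's actual data or model.

```python
# Sketch: ranking candidates by predicted threat probability and keeping a
# shortlist. All data here is synthetic, purely for illustration.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)

# Hypothetical behavioral features per employee, e.g., after-hours logins,
# bulk downloads, failed access attempts (all invented for this sketch).
n = 200
X = rng.normal(size=(n, 3))
# Label a minority as "insider threat" when the combined signals run high.
y = (X.sum(axis=1) > 1.0).astype(int)

model = RandomForestClassifier(n_estimators=50, random_state=0)
model.fit(X, y)

# Down-select: rank by predicted threat probability, keep the top 10.
scores = model.predict_proba(X)[:, 1]
shortlist = np.argsort(scores)[::-1][:10]
print(len(shortlist))
```

In a real multi-modeling pipeline, such a shortlist would feed the simulation and optimization stages rather than stand alone.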