A practical guide to adopting an accurate risk analysis methodology
The Failure of Risk Management provides effective solutions to significant faults in current risk analysis methods. Conventional approaches to managing risk lack accurate quantitative analysis methods, yielding strategies that can actually make things worse. Many widely used methods have no systems to measure performance, resulting in inaccurate selection and ineffective application of risk management strategies. These fundamental flaws propagate unrealistic perceptions of risk in business, government, and the general public. This book provides an expert examination of essential areas of risk management, including risk assessment and evaluation methods, risk mitigation strategies, common errors in quantitative models, and more. Guidance on topics such as probability modelling and empirical inputs emphasizes the efficacy of appropriate risk methodology in practical applications.
Recognized as a leader in the field of risk management, author Douglas W. Hubbard combines science-based analysis with real-world examples to present a detailed investigation of risk management practices. This revised and updated second edition includes updated data sets and checklists, expanded coverage of innovative statistical methods, and new examples of current risk management issues such as data breaches and natural disasters.
- Identify deficiencies in your current risk management strategy and take appropriate corrective measures
- Adopt a calibrated approach to risk analysis using up-to-date statistical tools
- Employ accurate quantitative risk analysis and modelling methods
- Keep pace with new developments in the rapidly expanding risk analysis industry
Risk analysis is a vital component of government policy, public safety, banking and finance, and many other public and private institutions. The Failure of Risk Management: Why It’s Broken and How to Fix It is a valuable resource for business leaders, policy makers, managers, consultants, and practitioners across industries.
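To make the book's core prescription concrete: Part One (see chapter 4, "Getting Started: A Simple Straw Man Quantitative Model") argues for replacing ordinal likelihood-and-impact scores with explicit event probabilities and calibrated loss ranges, then simulating them. The sketch below is a minimal, hypothetical illustration of that general idea in Python; it is not code from the book, and every risk name, probability, dollar range, and tolerance threshold in it is an invented placeholder.

```python
import math
import random

# Hypothetical risk register: name -> (annual event probability,
# 90% CI lower bound on loss, 90% CI upper bound on loss), in dollars.
# All entries are illustrative placeholders, not figures from the book.
RISKS = {
    "data breach":        (0.10, 200_000, 5_000_000),
    "key supplier fails": (0.05, 500_000, 8_000_000),
    "extended outage":    (0.25,  50_000, 1_000_000),
}

Z90 = 1.645  # z-score spanning a 90% interval of a normal distribution


def sample_annual_loss(p_event, lo, hi):
    """One simulated year for a single risk: the event either does not occur
    (loss 0) or its loss is drawn from a lognormal whose 5th/95th percentiles
    match the expert's 90% confidence interval (lo, hi)."""
    if random.random() >= p_event:
        return 0.0
    mu = (math.log(lo) + math.log(hi)) / 2.0
    sigma = (math.log(hi) - math.log(lo)) / (2.0 * Z90)
    return random.lognormvariate(mu, sigma)


def simulate(trials=10_000):
    """Total annual loss across all risks for each simulated year."""
    return [
        sum(sample_annual_loss(p, lo, hi) for p, lo, hi in RISKS.values())
        for _ in range(trials)
    ]


if __name__ == "__main__":
    losses = sorted(simulate())
    threshold = 2_000_000  # hypothetical risk-tolerance threshold
    exceedance = sum(1 for x in losses if x > threshold) / len(losses)
    print(f"Median annual loss:   ${losses[len(losses) // 2]:,.0f}")
    print(f"95th percentile loss: ${losses[int(0.95 * len(losses))]:,.0f}")
    print(f"P(loss > ${threshold:,}): {exceedance:.1%}")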
Table of Contents
About the Author xi
Preface xiii
Acknowledgments xvii
Part One An Introduction To The Crisis 1
Chapter 1 Healthy Skepticism for Risk Management 3
A “Common Mode Failure” 5
Key Definitions: Risk Management and Some Related Terms 8
What Failure Means 14
Scope and Objectives of This Book 17
Chapter 2 A Summary of the Current State of Risk Management 21
A Short and Entirely-Too-Superficial History of Risk 21
Current State of Risk Management in the Organization 25
Current Risks and How They Are Assessed 26
Chapter 3 How Do We Know What Works? 35
Anecdote: The Risk of Outsourcing Drug Manufacturing 36
Why It’s Hard to Know What Works 40
An Assessment of Self-Assessments 44
Potential Objective Evaluations of Risk Management 48
What We May Find 57
Chapter 4 Getting Started: A Simple Straw Man Quantitative Model 61
A Simple One-for-One Substitution 63
The Expert as the Instrument 64
A Quick Overview of “Uncertainty Math” 67
Establishing Risk Tolerance 72
Supporting the Decision: A Return on Mitigation 73
Making the Straw Man Better 75
Part Two Why It’s Broken 79
Chapter 5 The “Four Horsemen” of Risk Management: Some (Mostly) Sincere Attempts to Prevent an Apocalypse 81
Actuaries 83
War Quants: How World War II Changed Risk Analysis Forever 86
Economists 90
Management Consulting: How a Power Tie and a Good Pitch Changed Risk Management 96
Comparing the Horsemen 103
Major Risk Management Problems to Be Addressed 105
Chapter 6 An Ivory Tower of Babel: Fixing the Confusion about Risk 109
The Frank Knight Definition 111
Knight’s Influence in Finance and Project Management 114
A Construction Engineering Definition 118
Risk as Expected Loss 119
Defining Risk Tolerance 121
Defining Probability 128
Enriching the Lexicon 131
Chapter 7 The Limits of Expert Knowledge: Why We Don’t Know What We Think We Know about Uncertainty 135
The Right Stuff: How a Group of Psychologists Might Save Risk Analysis 137
Mental Math: Why We Shouldn’t Trust the Numbers in Our Heads 139
“Catastrophic” Overconfidence 142
The Mind of “Aces”: Possible Causes and Consequences of Overconfidence 150
Inconsistencies and Artifacts: What Shouldn’t Matter Does 155
Answers to Calibration Tests 160
Chapter 8 Worse Than Useless: The Most Popular Risk Assessment Method and Why It Doesn’t Work 163
A Few Examples of Scores and Matrices 164
Does That Come in “Medium”?: Why Ambiguity Does Not Offset Uncertainty 170
Unintended Effects of Scales: What You Don’t Know Can Hurt You 173
Different but Similar-Sounding Methods and Similar but Different-Sounding Methods 183
Chapter 9 Bears, Swans and Other Obstacles to Improved Risk Management 193
Algorithm Aversion and a Key Fallacy 194
Algorithms versus Experts: Generalizing the Findings 198
A Note about Black Swans 203
Major Mathematical Misconceptions 209
We’re Special: The Belief That Risk Analysis Might Work, but Not Here 217
Chapter 10 Where Even the Quants Go Wrong: Common and Fundamental Errors in Quantitative Models 223
A Survey of Analysts Using Monte Carlos 224
The Risk Paradox 228
Financial Models and the Shape of Disaster: Why Normal Isn’t So Normal 236
Following Your Inner Cow: The Problem with Correlations 243
The Measurement Inversion 248
Is Monte Carlo Too Complicated? 250
Part Three How to Fix It 255
Chapter 11 Starting with What Works 257
Speak the Language 259
Getting Your Probabilities Calibrated 266
Using Data for Initial Benchmarks 272
Checking the Substitution 280
Simple Risk Management 285
Chapter 12 Improving the Model 293
Empirical Inputs 294
Adding Detail to the Model 305
Advanced Methods for Improving Experts’ Subjective Estimates 312
Other Monte Carlo Tools 315
Self-Examinations for Modelers 317
Chapter 13 The Risk Community: Intra- and Extra-organizational Issues of Risk Management 323
Getting Organized 324
Managing the Model 327
Incentives for a Calibrated Culture 331
Extraorganizational Issues: Solutions beyond Your Office Building 337
Practical Observations from Trustmark 339
Final Thoughts on Quantitative Models and Better Decisions 341
Additional Calibration Tests and Answers 345
Index 357
About the Author
DOUGLAS W. HUBBARD is the inventor of Applied Information Economics (AIE). His methodology has earned him critical praise from Gartner and Forrester Research. He is also the author of How to Measure Anything: Finding the Value of Intangibles in Business and How to Measure Anything in Cybersecurity Risk. His articles have appeared in Nature, The American Statistician, The IBM Journal of R&D, Information Week, and many other publications. He has over 30 years of experience in management consulting, focusing on the application of quantitative methods in decision making.