Risk is something we deal with all the time, and humans are good at assessing some risks and bad at assessing others. Still, we need to assess risk and decide what it is worth to us (or our organisation): that is a risk assessment. The trouble with risk is that we only ever see the outcomes (something happened, or something didn't), and we tend to attribute value to the outcomes, not to the risk itself.
For example, a driver reversing over kids walking to school: bad. A driver reversing without looking when no one happens to be there: meh. What most people don't consider is that taking the risk is the bad thing, not the outcome; the outcome is just luck.
Are outcomes a good proxy for assessing risk? It depends on the risk. For high-probability risks they are fine: we are familiar with them and there is a lot of data, so the estimate is reliable. For low-probability risks, not so much: the lack of data leads us to underestimate their likelihood, and past performance is a bad guide.
And this is a challenge we face in security: when we talk about risks people have not encountered or even thought of, it is difficult to instigate change.
Risk Quadrant
Key (vertical axis: probability, rising upwards; horizontal axis: impact, rising to the right):
- Top left (high probability, low impact): covered by BAU
- Top right (high probability, high impact): covered and planned for
- Bottom left (low probability, low impact): not too worried about these
- Bottom right (low probability, high impact): Hollywood events; we are not good at understanding these.
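The quadrant above can be sketched as a simple lookup. This is a hypothetical illustration: the axis names, the 0-to-1 scales, and the 0.5 cut-off are my assumptions, not part of the original diagram.

```python
def quadrant(probability, impact, threshold=0.5):
    """Place a risk (both inputs assumed in [0, 1]) into one of the
    four quadrants from the key above. The 0.5 threshold is arbitrary."""
    high_p = probability >= threshold  # top half of the quadrant
    high_i = impact >= threshold       # right half of the quadrant
    if high_p and not high_i:
        return "covered by BAU"            # top left
    if high_p and high_i:
        return "covered and planned for"   # top right
    if not high_p and not high_i:
        return "not too worried"           # bottom left
    return "Hollywood event"               # bottom right: low probability, high impact
```

For example, `quadrant(0.1, 0.9)` lands in the Hollywood corner: unlikely, but devastating if it happens.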
Errors
Measurements and detection systems typically exhibit two types of error: Type I (a false positive: flagging something that is fine) and Type II (a false negative: missing something that is not). The two are usually in tension, so reducing one tends to increase the other.
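The trade-off can be sketched with a toy alert threshold. The scores below are made-up illustrative numbers, not data from any real system; the point is only that moving the threshold trades one error type for the other.

```python
# Toy data: suspicion scores for events we know to be benign or malicious.
benign_scores = [1, 2, 3, 4, 5, 6]      # should not raise alerts
malicious_scores = [4, 5, 6, 7, 8, 9]   # should raise alerts

def error_rates(threshold):
    """Return (type_i, type_ii) rates when alerting on score >= threshold."""
    # Type I: false positive -- a benign event raises an alert.
    type_i = sum(s >= threshold for s in benign_scores) / len(benign_scores)
    # Type II: false negative -- a malicious event slips through.
    type_ii = sum(s < threshold for s in malicious_scores) / len(malicious_scores)
    return type_i, type_ii

strict = error_rates(7)  # high bar to alert: no false alarms, but misses half
lax = error_rates(4)     # low bar to alert: catches everything, but cries wolf
```

Here `error_rates(7)` gives `(0.0, 0.5)` and `error_rates(4)` gives `(0.5, 0.0)`: tightening the threshold eliminates Type I errors at the cost of Type II, and vice versa.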
Reflection
I think I'm not in the usual camp on risk assessment. I do value risk; I'm always boring my wife with risks as I observe them. I found the two error types quite striking: when designing or building a product, system, or process, it often feels like we come down to binary 'this or that' decisions, and here it turns out there really are two distinct error types to trade off.
Things to remember:
- Risk can be invisible
- Incurring risk is bad, whether you get away with it or not
- It's hard to quantify low-probability, high-impact Hollywood risks, but it is our duty to quantify and assess them using engineering principles (consulting other experts, best practice and standards, case studies).
- Be aware of a posteriori hubris (being overconfident after the event)