Shifting Cyber Risk from Probability to Priority

Published by Axio

What kinds of risks are worth ignoring?

Think about purchasing homeowner’s insurance. A $10,000 policy is definitely enough coverage, right? Sure, really bad things like fires, tornadoes, floods, and lightning strikes can happen, but any of these events is extremely improbable. An annualized loss expectancy (ALE) calculation would therefore likely predict that you’ll suffer well under $10,000 of losses to your home this year, so $10,000 of coverage should be more than enough. There’s a good argument to be made that this amount is enough, if you think of your risk of loss from a purely probabilistic perspective.

This way of thinking about risk is obviously flawed. For most people, a home is their most expensive purchase and most prized asset. It makes sense to protect it against unforeseen circumstances, even ones that are low in frequency. It also makes sense to look at risk in terms more granular than the simple probability of an event happening: knowing your susceptibility and having the proper protective measures in place. Few individuals have the resources or capabilities to bounce back from catastrophe. The same goes for businesses hit by a low-frequency cyber attack.
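The insurance arithmetic above can be made concrete with a short sketch. All probabilities and dollar figures here are illustrative assumptions, not actuarial data:

```python
# Annualized loss expectancy for a hypothetical home: each event's
# probability times the loss it would cause, summed across events.
home_value = 400_000  # hypothetical replacement cost of the home

# Hypothetical annual probabilities of each catastrophic event
annual_probability = {
    "fire": 0.003,
    "tornado": 0.001,
    "flood": 0.002,
    "lightning strike": 0.004,
}

ale = sum(p * home_value for p in annual_probability.values())
print(f"Expected annual loss: ${ale:,.0f}")       # well under $10,000
print(f"Loss if any one event occurs: ${home_value:,}")
```

The expected annual loss comes out tiny compared with the actual loss a single fire would inflict, which is exactly why the probabilistic view understates the need for coverage.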

The Allure of Probability

Conversations about cyber risk management often center on estimating both the probability and the impact of a risk event. Probability-centered cyber risk analysis is alluring because we all want to know the future. However, a focus on probability can be misleading, and even perilous, when analyzing high-impact, low-frequency events such as a large data breach or a data destruction event.

Probability Traps

Treating probability as the panacea for your organization’s understanding of cyber risk carries multiple traps. It is also not grounded in the kind of forward thinking that provides a true sense of security.

In this article, we explain the five probability traps to avoid when managing your cybersecurity risk program.

1. ALE: Powerful for High-Frequency Events, Powerless for Others

Annual Loss Expectancy only works for risks with historical predictability, such as those caused by natural events. ALE works when you’re talking about high-frequency events (multiple occurrences per year) but is perilous for low-frequency, catastrophic events. Some types of risk, such as fraudulent transactions, can be measured against a large data set of precedent attacks. That is not the case for major, infrequent attacks such as those experienced by the German steel mill, Equifax, and Honda. Infrequent attacks like NotPetya cannot be amortized over a decade, and hence can’t be accurately depicted by Annual Loss Expectancy. For example, if a utility company is hit with $90M in losses from a cyber event, it must incur that cost in the year it happens; it cannot amortize the loss over 10 years.
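The amortization trap can be sketched in a few lines. The figures follow the $90M utility example above; the once-a-decade frequency is an illustrative assumption:

```python
# A catastrophic event that statistically occurs once a decade looks
# small when annualized, but the loss lands all at once.
loss_if_hit = 90_000_000      # $90M loss from the cyber event
annual_frequency = 1 / 10     # "once every ten years" on paper

ale = annual_frequency * loss_if_hit
print(f"ALE suggests budgeting: ${ale:,.0f}/year")
print(f"Loss actually incurred in the year it happens: ${loss_if_hit:,}")
```

The $9M/year figure is a statistical fiction: no year costs $9M. Nine years cost nothing, and one year costs $90M, which is the year that matters.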

2. Probability is not Actionable

Knowing the probability of an event is not actionable: you don’t get a vote on what an attacker is going to try, only on how prepared you are for an attack that may happen. Susceptibility, by contrast, is something your organization can control. It’s important to maintain a core set of events that your organization is constantly thinking about, for example business interruption, theft of funds, ransomware events, and device bricking. Probability can lead to a dead end when it comes to actionable approaches to mitigating risk.

3. False Sense of Confidence

Communicating probability to your executive team suggests that you know more about the frequency of major cyber events than you actually do. Once a cyber event occurs, its probability is 100%. Using probability to communicate risk can therefore give your executive team a false sense of confidence in their cyber risk management, leaving them caught off guard when a catastrophic cyber event occurs. Such events are often only understood by looking through a business context and determining impact.

4. Precision: The Search that Never Ends

Precision is a very slow pursuit that may cost more than it’s worth. As Phil Venables famously said, “If the cost of doing the risk analysis exceeds the cost of implementing the control then just implement the damn control.” Organizations with tunnel vision on precision can end up spending more to model a risk than the event itself would cost. Per the 80/20 rule, roughly 80% of the impact comes from 20% of the events. Rapid analysis lets you determine an impact range, and prioritization lets you prepare against the events with the highest impact.
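Range-based prioritization can be sketched as follows. Instead of modeling precise probabilities, each event gets a coarse impact range and the list is ranked by worst case; the event names and dollar ranges here are hypothetical:

```python
# Hypothetical events mapped to rough (low, high) impact ranges in dollars
events = {
    "ransomware outage": (5_000_000, 40_000_000),
    "theft of funds": (100_000, 2_000_000),
    "device bricking": (1_000_000, 15_000_000),
    "data breach": (10_000_000, 90_000_000),
}

# Rank by the high end of the range: worst plausible outcome first
ranked = sorted(events.items(), key=lambda kv: kv[1][1], reverse=True)
for name, (low, high) in ranked:
    print(f"{name}: ${low:,} to ${high:,}")
```

A ranking like this is fast to produce and immediately tells leadership where the 20% of events carrying most of the impact sit, without waiting on a precise probability model.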

5. Planning in a Vacuum

Being armed with an Annual Loss Expectancy number is only one part of evolving your cybersecurity strategy and making more prudent cyber investments. With ALE, you won’t have the granularity to understand which controls to invest in. Paul Proctor said in his Urgency to Treat Cybersecurity as a Business Decision report, “Doing cyber risk quantification is only good for insurance purchase decisions…if you are using it wrong.” A holistic understanding of your cyber risk program requires being able to quickly model the effect of control changes on one or more loss numbers and prioritize accordingly.

There’s a Better Way to Prepare for your Cyber Future

If you could predict your cyber future, it would be easy to prioritize which risks require more attention. Since most organizations have limited resources, one magic number can give leaders confidence that their cybersecurity programs are optimized and make them look good to leadership across the enterprise. However, it’s not enough…

Preparing for the Most Impactful Cyber Events

An approach grounded in susceptibility allows leaders from every business unit to weigh in on which operations and outcomes the company needs to prioritize, and to determine the plausible cyber incidents that could disrupt business operations and assets. The resulting financial impacts help inform business decisions such as insurance purchases, investments in controls, and more. These costs are categorized into quadrants depending on who is affected and what type of impact it is. Afterwards, companies can optimize the entire portfolio of controls by playing out how changing one or more controls would affect exposure.