March 10, 2026
The hidden variable in cyber risk decisions: The decision environment
James Hanbury
Global Lead Director, Co-founder

Whenever anyone asks me for book recommendations, one of my go-tos is Thinking Fast and Slow by Daniel Kahneman. Amongst many achievements, he notably won the Nobel Prize for proving we’re not as rational as we think. That’s a deliberate simplification, but it’s a useful starting point for the purpose of this article.

He sadly passed away at the ripe old age of 90 in March 2024. However, in summer 2025 The Knowledge Project re-released an excellent interview with him, which I finally got around to after marking it as a must-listen many months ago. It reminded me just how relevant and timeless his work, advice, and frameworks are, both to individuals and to businesses.

Predictably, I can’t help but apply it to the world of cyber risk. Here are three of my reflections, oriented towards senior leaders who chair or contribute to group decisions about cyber.

Remove the restraining forces

First, stop adding “driving forces”, and start removing “restraining forces”.

Kahneman’s point is not to add more and more “driving forces” to push change (e.g. incentives, training, mandates) but instead to focus on the “restraining forces” that are holding things where they are (e.g. friction, fears, habits, situational barriers).
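The driving/restraining distinction comes from Lewin’s force-field model, which Kahneman draws on. As a minimal sketch (the forces listed here are hypothetical examples, not taken from any specific organisation), the exercise is simply to write both lists down and then work only on the restraining one:

```python
# A minimal force-field sketch, after Lewin's model. The forces listed are
# hypothetical examples for illustration only.
forces = {
    "driving": ["incentives", "training", "mandates"],
    "restraining": [
        "friction in reporting workflows",
        "fear of being contradicted by the data",
        "habit of qualitative RAG reporting",
    ],
}

def where_to_focus(forces: dict) -> list[str]:
    # Kahneman's advice: work on the restraining list, not the driving list.
    return forces["restraining"]

for force in where_to_focus(forces):
    print(f"Ask: what would it take to remove '{force}'?")
```

The value is less in the code than in the discipline: the group is forced to name what is holding things in place before proposing yet another push.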

When I think about the measurement and reporting of cyber risk, this feels very familiar. We spend a lot of time explaining why quantitative measurement and leading indicators are better, but far less time understanding what’s blocking a move away from qualitative, subjective reporting, even when the balance of evidence makes a compelling case that the shift is worth it.

I touched on a number of the underlying concerns at play here in another article last year. They were about credibility, confidence, readiness, consistency, and timing.

For leaders who chair decision forums, the practical question becomes: what is actually stopping the group from trusting a more evidence‑based approach to decision making? I genuinely believe that asking that question more (and the second-order questions that arise from it) will help us collectively move our industry on from risk matrices and, as David White aptly puts it, colouring crayons in the boardroom.

Delay the verdict

Second, delay intuition by forcing decisions to be made dimension by dimension.

Kahneman’s advice is very practical here. It emphasises the importance of breaking a decision into dimensions, assessing each dimension separately, and then, once the picture is complete, making the decision.

For cyber investment decisions, for example, my take is that those dimensions might include how an initiative reduces risk to key scenarios like ransomware, how it defends against prevalent attacker techniques, what it does for critical business services, or how it supports regulatory expectations.
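The mechanics can be sketched in a few lines. This is an illustrative toy, not a scoring methodology: the dimensions and weights are hypothetical, and the point is only that aggregation is refused until every dimension has been assessed independently.

```python
from statistics import mean

# Hypothetical dimensions for a cyber investment decision; illustrative only.
DIMENSIONS = [
    "risk reduction for key scenarios (e.g. ransomware)",
    "coverage of prevalent attacker techniques",
    "protection of critical business services",
    "support for regulatory expectations",
]

def assess(initiative: str, scores: dict) -> dict:
    """Collect a 0-10 score per dimension; aggregate only once all
    dimensions are scored, so no single early impression drives the verdict."""
    missing = [d for d in DIMENSIONS if d not in scores]
    if missing:
        raise ValueError(f"Assess every dimension before deciding: {missing}")
    overall = mean(scores[d] for d in DIMENSIONS)
    return {"initiative": initiative, "overall": round(overall, 1), "detail": scores}

result = assess("EDR upgrade", {
    DIMENSIONS[0]: 8,
    DIMENSIONS[1]: 7,
    DIMENSIONS[2]: 6,
    DIMENSIONS[3]: 5,
})
print(result["overall"])  # 6.5
```

In a real forum the equivalent discipline is procedural rather than technical: the chair does not allow an overall recommendation onto the table until each dimension has been discussed on its own.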

I think the fundamental objective of Kahneman’s points on this topic is to ensure a decision-making group is given a genuine opportunity to change its mind before early impressions harden into a “position”. If we don’t do this, we risk confirmation bias on a potentially incorrect decision.

Make dissent safe

Finally, protect dissenters, then institutionalise dissent through a pre-mortem.

My final reflection is focused on Kahneman’s comment: “If you’re head of a group that makes decisions, then protect the dissenters, because they’re very valuable.”

He makes the point about how hard and costly it can sometimes be to be the person who disagrees. People often do not raise concerns because it is uncomfortable and because it can carry social or career risk. If leaders want better decisions, they need to protect dissenters and make disagreement as painless as possible.

This links nicely to an excellent article by Laura Cristiana Voicu on critical thinking, where she writes, in a line that stuck with me: “Critical thinking requires a genuine willingness to be wrong. Publicly. In front of people whose opinion you care about.”

One useful tool Kahneman referenced was the “pre-mortem”, an idea Gary Klein developed and Kahneman strongly endorsed. The concept is simply to assume it is two years from now, the decision in question has turned into a disaster, and then everyone writes the history of how it failed.

In cyber, that is a strong fit for big bets like a major tooling or programme investment, a significant risk acceptance, a cloud or identity transformation, or an outsourcing decision.
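The structure of the exercise is simple enough to sketch. This is a hypothetical illustration of Klein’s pre-mortem, not a tool the article prescribes: each participant writes their failure history independently, and nothing is read out until everyone has contributed.

```python
from dataclasses import dataclass, field

@dataclass
class PreMortem:
    """Hypothetical sketch of Klein's pre-mortem: assume the decision has
    already failed, and collect independent histories of how it happened."""
    decision: str
    horizon: str = "two years from now"
    narratives: dict = field(default_factory=dict)

    def submit(self, participant: str, failure_history: str) -> None:
        # One narrative per participant, written before seeing anyone else's.
        self.narratives[participant] = failure_history

    def read_out(self) -> list[str]:
        # Only surface the narratives once everyone has contributed.
        return list(self.narratives.values())

pm = PreMortem("Outsource the SOC")
pm.submit("CISO", "The provider's SLAs never matched our incident severity model.")
pm.submit("CFO", "Exit costs were underestimated and locked us in for five years.")
print(len(pm.read_out()))  # 2
```

The independence of the narratives is the whole point: written before discussion, they surface objections that would otherwise be socially expensive to voice.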

Framed this way, dissent stops being personal and becomes part of the process. Leaders can make it clear that they expect alternative views, not as an obstacle to progress but as a way of stress-testing it.

Closing thought

I don’t think any of this is about finding the perfect method or the perfect model. It’s about creating the conditions where good judgement is more likely: removing the restraining forces that keep you anchored to the status quo, forcing decisions to be made dimension by dimension before a verdict is reached, and making it safe to disagree early on, while it is still useful.


