April 15, 2025
From Insight to Action: Making CRQ Results Actually Useful
James Hanbury
Global Lead Director, Co-founder

Why Cyber Risk Quantification (CRQ) Must Lead to Better Decisions, Not Just Better Analysis

For all the energy that organisations invest in CRQ, a frustrating truth remains: many results don't actually lead to better decisions.

The analysis is rarely the issue. It’s often technically sound and well intentioned — but somewhere between the CRQ output and the decision-making table, something breaks down.

This blog explores why that happens — and what CRQ practitioners can do to close the gap between insight and action.

When Insight Doesn’t Lead to Action

Quantification is a powerful tool. But like any tool, its value lies in how it’s used.

Imagine a scenario: a cyber risk team has just completed a detailed quantification of ransomware risk. They’ve built a model, validated assumptions, run simulations — and produced a 40-page slide deck. But when they share it with senior stakeholders, the reaction is muted. There are questions, concerns, maybe even appreciation for the effort — but no clear next step. No shift in decision. No budget reallocation. No change in direction.

This is a familiar moment in many CRQ journeys.

Why?

Because the output wasn’t designed to enable a decision. It was designed to show the analysis.

To make CRQ useful, we need to shift our mindset. The goal isn't just to quantify risk. It’s to influence how that risk is understood, prioritised, and managed.

And to do that, the results need to be more than technically correct. They need to be actionable.

Three Reasons CRQ Outputs Often Miss the Mark

1. Framing without business context

Results are too often framed in ways that make sense to the analyst, not the audience. I’ve made that mistake myself — burying the business relevance beneath layers of complexity. An executive wants to understand the risk to the business — not necessarily the inner workings of a Monte Carlo simulation.

2. Analysis without a narrative

Even when the framing is right, many outputs fail to tell a story. They jump straight to figures, ranges, and probabilities. There’s no tension, no progression, no clear ‘so what?’.

3. One-size-fits-all communication

What works for the CISO might not land with the CFO — and certainly not with the board. Each audience needs a tailored lens that speaks their language.

What Makes CRQ Results Actionable?

In my experience, the most impactful CRQ outputs have three characteristics in common:

1. They’re clear.

They tell a story in plain language, free from jargon or unnecessary technical detail. The structure is simple, the takeaway is obvious, and the audience doesn’t have to work hard to understand what matters.

2. They’re relevant.

They speak directly to a real business decision or concern: Should we fund this control? How much cyber insurance do we need? What could a ransomware incident cost us? Are we operating within our risk appetite?

3. They’re timely.

They arrive when the decision is being made — not, for example, six weeks after the budget was finalised.

Use Cases That Demand Action

Across dozens of CRQ engagements, I’ve seen certain use cases consistently lead to real decision-making impact. Here are a few:

💵 Cyber Budget Planning

Quantification can help shift cyber investment conversations from “how much are we spending?” to “what are we getting for it?”.

A loss exceedance curve — or even a simple comparison of annualised loss exposure (ALE) estimates — can show your current exposure and how it changes under different investment options.

Example:

"Spending £500K on enhanced endpoint detection reduces our 90th percentile ransomware ALE by £2M. Compared with other options, it delivers the best risk reduction per £ spent — and brings our exposure within the target range set by the Risk Committee, supporting their goal of keeping cyber losses within appetite without over-investing."

This kind of output enables prioritisation across competing options, grounded in cost-benefit analysis and aligned to appetite. It's especially powerful during budget season, when choices need to be justified.
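To make the mechanics concrete, here is a minimal sketch of how an exceedance comparison like the one above might be produced. The distributions (Poisson event frequency, lognormal severity) and every parameter value are illustrative assumptions for the sketch, not the author's model or real figures:

```python
import numpy as np

rng = np.random.default_rng(42)
N_YEARS = 50_000  # simulated years per scenario

def simulate_annual_losses(rng, freq, sev_median, sev_sigma, n_years):
    """Sketch of an annual loss simulation: Poisson event frequency,
    lognormal severity per event. All parameters are illustrative."""
    losses = np.zeros(n_years)
    n_events = rng.poisson(freq, n_years)
    for i, k in enumerate(n_events):
        if k:
            losses[i] = rng.lognormal(np.log(sev_median), sev_sigma, k).sum()
    return losses

# Hypothetical scenarios: current state vs. enhanced endpoint detection (EDR)
baseline = simulate_annual_losses(rng, freq=0.6, sev_median=2e6,
                                  sev_sigma=1.0, n_years=N_YEARS)
with_edr = simulate_annual_losses(rng, freq=0.35, sev_median=1.5e6,
                                  sev_sigma=1.0, n_years=N_YEARS)

def loss_exceedance(losses, threshold):
    """One point on the loss exceedance curve: P(annual loss > threshold)."""
    return float((losses > threshold).mean())

p90_baseline = np.quantile(baseline, 0.90)
p90_with_edr = np.quantile(with_edr, 0.90)
print(f"90th percentile annual loss, baseline: £{p90_baseline / 1e6:.1f}M")
print(f"90th percentile annual loss, with EDR: £{p90_with_edr / 1e6:.1f}M")
print(f"P(annual loss > £10M), baseline: {loss_exceedance(baseline, 10e6):.0%}")
```

Evaluating `loss_exceedance` across a range of thresholds traces the full loss exceedance curve for each scenario; the gap between the two curves is the risk reduction the investment buys.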

Going back to that earlier scenario: imagine if, instead of delivering a 40-page technical deck, the team had walked into that room with this kind of analysis. The response and outcome would likely have been quite different.

📈 Board Risk Reporting

CRQ enables accurate, business-relevant insights that go beyond vague red-amber-green heatmaps or opaque technical metrics. The best reports not only quantify exposure but also show movement over time and tie back to strategic priorities.

Example:

"We estimate that a ransomware incident could cost us between £3M and £12M, with a most likely loss of £5M — up from £3.5M last year due to accelerated migration to cloud-based platforms. This risk intersects directly with our digital transformation programme — highlighting a temporary reduction in our ability to recover quickly from a major incident."

The best board reports don’t just quantify risk, they show how it’s evolving, explain why, and connect it to strategic priorities. In this case, CRQ flagged an unintended consequence of transformation — prompting a board-level discussion on sequencing and risk treatment.
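One simple way to turn a three-point estimate like the one above into a distribution that supports such statements is a triangular sketch. The figures below are the illustrative ones from the example quote, and the choice of a triangular distribution is an assumption for the sketch (PERT or lognormal fits are common alternatives):

```python
import numpy as np

rng = np.random.default_rng(7)

# Hypothetical three-point estimate from the example:
# £3M low, £5M most likely, £12M high
low, mode, high = 3e6, 5e6, 12e6
incident_cost = rng.triangular(low, mode, high, 100_000)

print(f"Mean incident cost: £{incident_cost.mean() / 1e6:.1f}M")
print(f"P(cost > £10M):     {(incident_cost > 10e6).mean():.0%}")
print(f"90th percentile:    £{np.quantile(incident_cost, 0.90) / 1e6:.1f}M")
```

The point of the exercise is that "between £3M and £12M, most likely £5M" stops being a caption and becomes a distribution you can query: for any loss level the board cares about, you can state the chance of exceeding it.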

🧮 Measurement Against Risk Appetite

CRQ can turn abstract risk appetite statements into measurable financial thresholds. That’s critical — because if you can’t measure against your appetite, you can’t manage to it.

Yet in many organisations, these thresholds don't exist in a meaningful way. I still see risk appetite statements that say things like "we have zero appetite for cyber risk", or that rely on vague language that's impossible to measure against. That makes it hard to know when action is needed or whether risk is truly being managed to an acceptable level.

CRQ helps change that. It turns something vague into something clear and actionable.

Example:

"Our board-approved risk appetite is no more than a 10% chance of a cyber event exceeding £10M. Our current ransomware exposure sits at 18% — down from 23% last quarter but still above threshold."

This simple statement translates abstract appetite into measurable terms. It shows progress, reveals trajectory, and prompts a clear question: Do we need additional controls, or are we willing to accept this deviation?
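The appetite check itself is a one-line computation once you have simulated losses. In this sketch the lognormal distribution and its parameters are purely illustrative stand-ins for a real loss model; only the threshold and probability mirror the appetite statement in the example:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical simulated annual cyber losses (illustrative lognormal)
annual_losses = rng.lognormal(mean=np.log(4e6), sigma=1.2, size=100_000)

APPETITE_THRESHOLD = 10e6  # board-approved loss level (£10M)
APPETITE_PROB = 0.10       # max acceptable chance of exceeding it in a year

p_exceed = float((annual_losses > APPETITE_THRESHOLD).mean())
within_appetite = p_exceed <= APPETITE_PROB

print(f"P(annual loss > £10M) = {p_exceed:.0%}")
print(f"Within appetite: {within_appetite}")
```

Rerunning the same check each quarter against a refreshed loss model is what produces the "18%, down from 23%" trajectory in the example.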

Structuring CRQ for Decision-Making

As mentioned, one of the most effective ways to make results actionable is to structure your analysis around the decision it’s meant to support.

Below is a visual we often use to guide this thinking. It captures three common decision types that CRQ can inform: governance, prioritisation, and investment.

Each of these lenses requires a different communication approach — and CRQ, used in the right way, is flexible enough to support them all.

Think back to that team with the ransomware model. Had they asked, “what decision are we trying to inform?” before they built their deck, their output might have looked quite different — and far more likely to drive action.

By explicitly asking that question, practitioners can anchor their efforts — and avoid the trap of doing analysis for analysis’ sake.

Summary: Top Tips for Actionable CRQ

Think of CRQ as a translation engine. Its real value lies in converting analysis into language that decision-makers understand — and can act on.

Here are a few tips to keep that front of mind:

  • Start with the question, not the model. Anchor your analysis in a real decision someone needs to make.
  • Lead with the key takeaway in the report. Then show your working. Include detail for those who want it — but don't make it the headline.
  • Use comparisons to create context. "This risk is 2x greater than last year" or "This control would deliver a 5:1 cost-benefit ratio."
  • Bring timing into the picture. Insights must land before key decisions are made. Map your CRQ roadmap to planning cycles and governance processes.

Next Up: The Art and Science of CRQ

In the next post, we'll explore what may be the most important ingredient for CRQ success — the practitioners behind it.

I'll write about the evolving role of the CRQ practitioner as a change agent — and why analytical and storytelling skills are only part of the puzzle. Being bold, empathetic, and resilient matters just as much.

And, in the meantime, if you'd like support making your CRQ insights more actionable, my team and I would love to help.


