Everything you need to know about measuring IT and cyber risks
CISOs and IT security professionals are grappling with more cyber threats than ever. From malware and ransomware to DDoS attacks and zero-day exploits, the risks just keep increasing. So, how do you know which risks to tackle first? Or where to focus your cybersecurity investments?
The traditional approach would be to rank all your risks as high, medium, and low. But these categorizations can be interpreted differently by different people. You might think a medium risk needs to be mitigated, but the management team might argue that it can be accepted. Defending your point of view can be tough because 'medium risk' is an inherently ambiguous term.
It gets more challenging when you have two or three different risks that are all ranked medium. Which one do you focus on first? Do you spend the same amount of time and resources managing all three risks? It’s difficult to know for sure.
But what if you were told that a malware attack on your organization could cost you $3 million in losses? And that there’s a 60% chance of that loss occurring? Now, things become clearer, both for your IT security team and the business. You can quickly come up with a response, get consensus, and take action to protect your business.
Simply put, cyber risk quantification is the process of measuring IT and cyber risk exposure in monetary terms. It helps you determine which risks to focus on first, and where to allocate your cybersecurity resources for maximum impact.
Typically, cyber risk quantification uses sophisticated modeling techniques like Monte Carlo simulations to estimate the value at risk (VaR) or expected loss from risk exposure.
By quantifying the dollar impact of a risk event, you can confidently answer questions like “How much should we invest in cybersecurity?”, “What will be the return on investment?”, and “Do we have enough cyber insurance coverage?”
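The return-on-investment question is often answered with a return on security investment (ROSI) calculation, which compares the expected loss a control avoids against what the control costs. The sketch below is a minimal illustration; every figure in it is hypothetical.

```python
# Return on security investment (ROSI), a commonly used formula:
#   ROSI = (loss avoided - control cost) / control cost
# All figures below are hypothetical examples.

annual_loss_expectancy = 3_000_000  # expected yearly loss without the control ($)
mitigation_ratio = 0.60             # fraction of that loss the control prevents
control_cost = 500_000              # yearly cost of the control ($)

loss_avoided = annual_loss_expectancy * mitigation_ratio
rosi = (loss_avoided - control_cost) / control_cost

print(f"Loss avoided: ${loss_avoided:,.0f}")
print(f"ROSI: {rosi:.0%}")  # positive ROSI: the control avoids more loss than it costs
```

A ROSI of 260% here means every dollar spent on the control avoids $2.60 of expected loss beyond its own cost; a negative ROSI would suggest the budget is better spent elsewhere.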
Risk quantification can benefit multiple stakeholders. CISOs gain a deeper understanding of risk impact which helps them make data-driven decisions. Boards have more visibility into what’s at stake for the business in terms of dollar value. And executives can effectively prioritize cybersecurity investments, driving alignment between cyber programs and business goals.
50% of C-level executives use risk quantification tools to track and evaluate their cybersecurity investment decisions
Risk quantification isn’t a new practice. But it’s receiving more attention these days because:
1. Cyber-attacks are getting more complex and aggressive: The UN reported a 600% increase in malicious emails during the pandemic. Cisco predicts that DDoS attacks will reach 15.4 million by 2023. Cybersecurity Ventures estimates that cybercrime will cost the world $10.5 trillion annually by 2025. All this means that we need to get smarter about how we assess, measure, and respond to cyber risks.
2. Attack surfaces are expanding: Businesses are increasingly adopting AI, IoT, robotic process automation, cloud apps, and other digital technologies to achieve their business goals. But that creates more entry points for cyber criminals to breach sensitive networks. If we want to stay ahead, we have to build a more accurate understanding of risk impact and likelihood.
3. Cybersecurity budgets and resources are limited: Organizations face thousands of IT and cyber risks. The challenge is to figure out which risks to deal with first. Likewise, there may be hundreds of possible security controls. Which one will yield the most benefits for the least cost? These are questions that CISOs have to answer because their budgets are finite. Investments have to be allocated as efficiently as possible. That starts with quantifying the financial loss of a potential cyber risk. When you know how much the risk will cost you, and how much a particular control can help lower that cost, it becomes easier to decide where to direct security investments.
4. Qualitative measurements aren’t always sufficient: Cyber risks have historically been communicated in qualitative terms like “probably likely to occur” or “somewhat likely to impact the business”. But these terms often raise more questions than they answer. What does “probably likely” mean? How is it different from “somewhat likely”? If resources are applied to a “probably likely” risk, how much risk reduction will be achieved? To answer these questions, we need more quantitative data.
By measuring and communicating cyber risks in monetary terms, you can:
Make better-informed decisions: No longer do you have to guess which IT and cyber risks to prioritize based simply on intuition or judgement. With properly quantified risk data, you understand the true impact and probability of a risk. You know where to focus your cyber investments, and how to reduce your risk exposure in line with business objectives. You’re less likely to over-react or under-react to potential risk events. Instead, you’re able to make calculated IT and cyber risk management decisions that yield optimal value.
Strengthen the objectivity and accuracy of your risk assessments: When you express cyber risk exposure in clear and precise terms, you minimize uncertainty. There’s much less debate and confusion about what the top three cyber risks are, or why they’ve been ranked that way, or which controls are most relevant to mitigate those risks. The data is there for everyone to see.
Demystify cybersecurity for the board and management: Cybersecurity presentations to the board and leadership team can be filled with confusing technical jargon. Or, they fan the flames of FUD (fear, uncertainty, and doubt). But that doesn’t help with effective business analysis or decision-making. Quantification, by contrast, provides a more nuanced and easy-to-comprehend view of cybersecurity risks. Boards and executives can quickly understand the most critical and costly cyber threats facing their business. CISOs, in turn, can better justify the need for cybersecurity investments.
Understand the effectiveness of risk mitigation strategies: When you invest in a security control, you want to know how effective it is. Cyber risk quantification can help you understand how much risk reduction has been achieved with each control. If you find your risk exposure is still high, you can quickly re-direct your investments to another, better control. This way, your cyber risk mitigation efforts become more proactive and productive.
Gain a competitive advantage: Cyber risk quantification helps you strengthen your cyber maturity and resilience. It gives you the insights to respond to cyber threats in a more targeted and cost-efficient way. That translates into improved customer trust and credibility. In fact, a report from The Cybersecurity Imperative found that companies using, or planning to use, quantitative risk assessment models are ahead in digital transformation, and have overall higher cybersecurity performance.
Factor Analysis of Information Risk (FAIR™) is a useful framework to understand, analyze, and quantify cyber risks in financial terms. Originally created by risk practitioner Jack Jones, it has been standardized as Open FAIR by The Open Group, a global consortium that enables the achievement of business objectives through IT standards.
With Open FAIR, you can quantify your security risk exposure in terms of the dollar value at risk. The framework helps you challenge and defend your risk decisions using an advanced risk model, while also determining how security investments will impact your risk profile.
Open FAIR can be used in tandem with other risk assessment frameworks such as NIST, ISO, and OCTAVE. While many of them rely on qualitative color charts or numerical weighted scales to assess risks, Open FAIR adds a quantitative dimension that makes risk assessments more holistic.
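At its core, FAIR decomposes risk into loss event frequency (how often losses occur) and loss magnitude (how much each loss costs). The sketch below shows a heavily simplified point estimate in that spirit; a full FAIR analysis uses calibrated ranges and distributions rather than single numbers, and all inputs here are hypothetical.

```python
# Simplified FAIR-style point estimate (hypothetical inputs).
# FAIR decomposes risk roughly as:
#   loss event frequency (LEF) = threat event frequency (TEF)
#                                x vulnerability (probability a threat
#                                  event becomes a loss event)
#   annualized loss exposure   = LEF x loss magnitude

tef = 10                  # threat events per year (e.g. targeted phishing waves)
vulnerability = 0.2       # probability a threat event becomes a loss event
loss_magnitude = 250_000  # average cost per loss event, in dollars

lef = tef * vulnerability
annualized_loss = lef * loss_magnitude

print(f"Expected loss events per year: {lef}")
print(f"Annualized loss exposure: ${annualized_loss:,.0f}")
```

Even this toy version shows the framework's appeal: each factor can be challenged, measured, and improved independently, and the output is a dollar figure the business can act on.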
ISO 27005 acts as a guideline for information security risk assessments. It doesn’t outline a specific methodology, but it does imply continuous risk management based on the following components: context establishment, risk assessment, risk treatment, risk acceptance, risk communication and consultation, and risk monitoring and review.
NIST SP 800-53 was developed by the US National Institute of Standards and Technology (NIST) to establish common control assessment procedures for federal organizations. But many private organizations also use NIST to determine if their security controls are implemented correctly, operating as intended, and producing the desired outcome.
OCTAVE or the Operationally Critical Threat, Asset, and Vulnerability Evaluation was developed by Carnegie Mellon University for the Department of Defense. The new version, OCTAVE FORTE, helps organizations evaluate their security risks, and use ERM principles to bridge the gap between executives and practitioners. OCTAVE Allegro – which serves as a complement to OCTAVE FORTE – helps streamline and optimize security risk assessments.
COBIT® 5 was created by the Information Systems Audit and Control Association (ISACA) for enterprise IT governance. It enables a consistent and accurate assessment of IT risks and their impact on an organization.
A Monte Carlo analysis is a powerful tool to help you model the probability and impact of different risk exposures in quantitative terms. It simulates a cyber risk event like a ransomware attack multiple times over, so that you can predict the financial losses that could result from each scenario – ranging from best-case, to most likely, to worst-case scenarios. Based on these insights, you can decide on the best approach to risk mitigation.
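A minimal sketch of such a simulation is shown below, using only the Python standard library. The scenario and every parameter (probability of occurrence, best-case, most-likely, and worst-case loss) are hypothetical; a real model would calibrate them from incident data or structured expert estimates.

```python
import random
import statistics

random.seed(42)  # fixed seed so the demo is reproducible

# Hypothetical scenario: a ransomware attack with a 60% chance of occurring
# in a given year, and a per-event loss between $0.5M (best case) and
# $8M (worst case), most likely around $3M.
P_EVENT = 0.60
BEST, MOST_LIKELY, WORST = 500_000, 3_000_000, 8_000_000
TRIALS = 100_000

annual_losses = []
for _ in range(TRIALS):
    if random.random() < P_EVENT:
        # Triangular distribution spans best to worst case, peaking at most likely
        annual_losses.append(random.triangular(BEST, WORST, MOST_LIKELY))
    else:
        annual_losses.append(0.0)  # no attack this simulated year

annual_losses.sort()
expected_loss = statistics.fmean(annual_losses)
var_95 = annual_losses[int(0.95 * TRIALS)]  # 95th percentile = value at risk

print(f"Expected annual loss: ${expected_loss:,.0f}")
print(f"95% value at risk:    ${var_95:,.0f}")
```

Reading the output: the expected loss is what you would budget for on average, while the 95% VaR says "in 19 years out of 20, losses stay below this figure" – a framing boards tend to grasp quickly.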
While many of the risk assessment frameworks covered above provide clear guidelines and procedures on how to measure cyber risks, here are a few best practices to get you started:
1. Build a comprehensive profile of your information assets. Know where they’re stored, transported, and processed.
2. Identify the threats that could compromise the security and privacy of your assets. Determine which of these assets are most vulnerable to the identified threats.
3. Analyze the controls that are in place to minimize the probability of those threats or vulnerabilities being realized.
4. Capture the financial consequences of a threat being realized. For instance, a data breach could result in multiple financial losses – be it legal liabilities, regulatory penalties, reputational costs, or customer damage claims. Use industry data or insights from past cybersecurity incidents within the organization to estimate the cost and scale of risk impact.
5. Determine the most likely loss outcomes using Monte Carlo simulation models.
6. Prioritize risks based on their financial impact and probability. Select a mitigation approach for each.
7. Document and report the results to help management decide on cybersecurity budgets, policies, and procedures.
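The prioritization step above boils down to ranking scenarios by expected annual loss (annual probability times estimated financial impact). A minimal sketch, with entirely hypothetical scenarios and figures:

```python
# Rank hypothetical risk scenarios by expected annual loss
# (annual probability x estimated financial impact, both illustrative).
risks = [
    {"scenario": "Ransomware outbreak",    "probability": 0.30, "impact": 4_000_000},
    {"scenario": "Cloud misconfiguration", "probability": 0.50, "impact": 1_200_000},
    {"scenario": "Insider data theft",     "probability": 0.10, "impact": 7_000_000},
]

for r in risks:
    r["expected_loss"] = r["probability"] * r["impact"]

# Highest expected loss first: where mitigation budget should go first
risks.sort(key=lambda r: r["expected_loss"], reverse=True)

for r in risks:
    print(f'{r["scenario"]:<22} ${r["expected_loss"]:>12,.0f}')
```

Note how the ranking can be counter-intuitive: the scenario with the largest single impact (insider data theft) does not top the list, because its annual probability is low. That is exactly the kind of trade-off that qualitative high/medium/low labels obscure.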
Establish a common risk language: If everyone in the organization has a different definition for IT asset, threat, or vulnerability, you’ll find it difficult to communicate and defend your risk decisions. Standardize the risk nomenclature as much as possible.
Involve other functions: Cyber risk quantification is a collaborative exercise that goes beyond the IT security department. Engage other divisions in identifying critical risk scenarios. The more perspectives you have at the table, the more comprehensive your risk data will be.
Revisit risk results periodically: Cyber risks and threats are always evolving. A risk that was critical a year ago may not be so anymore. The only way to know is to re-quantify your risks at regular intervals – maybe once or twice annually.
Don’t try to boil the ocean: It’s neither efficient nor effective to cover all possible threats and risk scenarios at once. Start small. Pick one important use case and work on that first.
Automate wherever possible: Manual cyber risk quantification processes can be both complex and time-consuming. Find a solution that can help you automate workflows, and measure risks faster.
Remember, quantification isn’t a panacea: Cyber risk quantification should enhance, not replace other IT and cyber risk management processes. Its value is best realized when complemented with risk monitoring, qualitative assessments, internal audits, and issue management processes.