In the spring of 2017, the New York State Department of Financial Services enacted 23 NYCRR 500, which created a set of stringent cybersecurity requirements for financial services companies doing business in New York State. “Doing business” in this context means providing financial services to anyone living in the state, regardless of whether the provider has an office or even representatives there. An example would be a life insurance policy issued by Lincoln National in Indiana and held by someone living in Syracuse.
This set of regulations was the first of its kind and was certain to become the model for other states over the following few years.
It was also the first to advocate a risk-based approach to cybersecurity.
The foundation of the regulation is its requirement for a complete risk assessment, which makes the regulation more adaptable to process and technology changes and allows entities to focus on their actual security programs rather than on just another compliance mandate. A rigorous and quantitative risk assessment also provides CISOs and CFOs with a common language and useful information that they can integrate into their financials, risk transference strategies and cost/benefit analyses for new technology investments that might improve their overall enterprise risk.
Asset Visibility
Complying with the regulation’s required risk assessments, and managing cybersecurity in general, requires companies to have visibility into their customers, shareholders and assets, especially those that, if compromised, would most impact the business. As Equifax painfully demonstrated, information asset management at most companies is not very mature. It is hard to believe that, had Equifax management understood that a breach would cost the business $4 billion, they would not have done whatever it took to apply that Apache Struts patch.
A risk assessment begins with an inventory of a company’s information assets and their associated business value, yet even businesses that have a solid understanding of their information assets have likely not quantified the impact of losing those assets to a cyber breach.
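As an illustration, even a very simple inventory that pairs each asset with an estimated dollar loss impact is a usable starting point. The sketch below assumes nothing beyond standard Python; every asset name, owner and dollar figure is invented:

```python
# A minimal sketch of an information asset inventory that pairs each
# asset with an estimated dollar loss impact if it were compromised.
# All asset names, owners and dollar figures are hypothetical.
assets = [
    {"name": "customer_pii_db",  "owner": "CRM team", "loss_impact_usd": 12_000_000},
    {"name": "policy_admin_app", "owner": "Ops",      "loss_impact_usd": 4_500_000},
    {"name": "internal_wiki",    "owner": "IT",       "loss_impact_usd": 150_000},
]

# Sorting by loss impact surfaces the assets that matter most first.
for asset in sorted(assets, key=lambda a: a["loss_impact_usd"], reverse=True):
    print(f"{asset['name']:<18} ${asset['loss_impact_usd']:>12,}")
```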
Traditional Dither and Debate
Traditional risk frameworks, while useful in providing guidance toward setting up a risk assessment, are actually part of the problem. The risk framework process is tedious, and the journey through the myriad corporate subject matter experts can be difficult on the best of days. There is also a heavy reliance on practitioner intuition and experience, industry lore and best practices. The tendency is to dither and debate not just the valuations themselves but the process required to get there.
Due to these obstacles, most companies attempt a risk assessment only once a year, or when they are forced to by regulations, audit requirements, compliance issues or a third party.
FAIR Enough
FAIR (Factor Analysis of Information Risk) is a risk framework that has gained acceptance as an international standard for establishing information risk and is designed to address some of these weaknesses.
The FAIR framework aims to allow organizations to speak the same language about risk, apply risk assessment to any object or asset, view organizational risk in total, defend or challenge risk determination using advanced analysis and understand how time and money will affect the organization’s security profile.
All good stuff.
FAIR uses dollar estimates for losses and probability values for threats and vulnerabilities. Combined with ranges of values and levels of confidence, this allows for true mathematical modeling of loss exposures.
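To make that concrete, here is a minimal sketch of this kind of modeling: loss event frequency and loss magnitude are each expressed as a (minimum, most likely, maximum) range rather than a point value, and a Monte Carlo simulation turns those ranges into a distribution of annualized loss. The ranges, the triangular distribution and the single-draw simplification below are all illustrative assumptions, not prescriptions of the FAIR standard:

```python
import random

random.seed(7)  # fixed seed so the illustration is reproducible

# Hypothetical calibrated ranges: (minimum, most likely, maximum).
freq_range = (0.1, 0.5, 2.0)               # loss events per year
loss_range = (50_000, 400_000, 3_000_000)  # dollars per loss event

def draw(lo, mode, hi):
    # Triangular draws stand in for the PERT-style distributions
    # commonly used in FAIR-based tooling.
    return random.triangular(lo, hi, mode)

# Simplification: multiply one frequency draw by one magnitude draw.
# A fuller model would draw an event count and sum per-event losses.
trials = 100_000
annual_losses = sorted(draw(*freq_range) * draw(*loss_range) for _ in range(trials))

mean = sum(annual_losses) / trials
p95 = annual_losses[int(0.95 * trials)]
print(f"mean annualized loss exposure: ${mean:,.0f}")
print(f"95th percentile exposure:      ${p95:,.0f}")
```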
No Fair
The downside is that FAIR can be difficult to use, and it approaches risk analysis by deconstructing a large problem into small problems whose solutions rely upon guesses, presuming that this will solve the large problem more accurately. This is obviously a fallacy.
The sum of many small guesses usually results in a larger aggregated guess. Doug Hubbard is not a fan.
A Distorted View
FAIR additionally defines risk as the probable frequency multiplied by the probable magnitude of future loss. “Probable” is synonymous with possible, plausible, feasible and presumed. Probability is a determination supported by evidence strong enough to establish presumption, but not proof. Probability invites fallacy: fallacies of presumption begin with a false (or at least unwarranted) assumption, and so fail to establish their conclusion.
If you look at FAIR’s Risk Assessment Guide, you will see that 7 of its 10 steps call for an “estimate” (of probable threat frequency, control strength, vulnerability, probable loss, worst-case loss, etc.). Every time you see the word “estimate,” you are accepting a best guess that may not be backed by any objective criterion or metric and may be driven by an agenda. Because all the derived values are based on those estimates, the FAIR risk model results in a great deal of work and, by its very definition, a distorted view of actual risk.
Tying Threats to Assets
We prefer the equation Risk = Vulnerability × Threat × Impact, which produces a result with higher precision, reliability and relevance. Even if you start with an “estimated” value for Impact, more precise determinations of Vulnerability and Threat are possible, dramatically improving the accuracy of your risk assessment.
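A minimal worked example with invented numbers: suppose scan data shows a 2 percent chance that an attempt against a given flaw succeeds, telemetry shows roughly five relevant attack attempts per year, and the estimated Impact of a compromise is $2 million:

```python
# Risk = Vulnerability x Threat x Impact, with illustrative values.
vulnerability = 0.02     # probability an attack attempt succeeds (measured from scan data)
threat = 5               # relevant attack attempts per year (measured from telemetry)
impact_usd = 2_000_000   # estimated business impact of a compromise

annualized_risk_usd = vulnerability * threat * impact_usd
print(f"annualized risk exposure: ${annualized_risk_usd:,.0f}")  # $200,000
```

The point of the equation is that Vulnerability and Threat can be measured from live data, leaving Impact as the only term that still rests on an estimate.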
Most businesses use some form of threat and vulnerability identification software that collects log data in an effort to identify vulnerable network devices that are being attacked. But none of these conventional cybersecurity solutions ties those vulnerabilities and threats to specific assets. As a result, it is impossible to connect the dots between the data coming from those solutions and the assets actually at risk. Knowing that thousands of vulnerabilities and threats are impacting thousands of servers does not identify the devices on which the highest-value assets are stored or processed. Nor does this knowledge translate into which actions should be taken to minimize the company’s exposure.
That list of vulnerabilities becomes meaningful only when its entries can be prioritized for action based on the loss impact of the information and systems to which those servers are connected and on the vulnerability being exploited.
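A minimal sketch of that prioritization, assuming each scanner finding can be joined to the asset reachable through the vulnerable server (the CVE identifiers, hosts, scores and dollar values below are all invented):

```python
# Hypothetical scanner findings joined to the assets they expose.
findings = [
    {"cve": "CVE-2024-0001", "host": "web-01",  "exploitability": 0.9, "asset_impact_usd": 12_000_000},
    {"cve": "CVE-2024-0002", "host": "wiki-03", "exploitability": 0.9, "asset_impact_usd": 150_000},
    {"cve": "CVE-2024-0003", "host": "db-07",   "exploitability": 0.2, "asset_impact_usd": 12_000_000},
]

# Rank by exposure (exploitability x asset impact) rather than by raw
# severity, so the remediation queue reflects business loss, not just
# scanner scores.
for f in sorted(findings, key=lambda f: f["exploitability"] * f["asset_impact_usd"], reverse=True):
    exposure = f["exploitability"] * f["asset_impact_usd"]
    print(f"{f['cve']}  {f['host']:<8} exposure ${exposure:,.0f}")
```

Note how the two findings with identical exploitability scores land far apart in the queue once asset value is factored in.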
There are several technologies available today that take either the FAIR approach or one of the major risk frameworks as a basis and factor in simulated threat scenarios to approximate what might happen if a certain set of events occurs.
The Other Approach
Today, we have technologies that do all of this automatically, using the actual threat data collected in the computing environment instead of simulated threat scenarios. They do it continuously and in real time, producing a moment-to-moment snapshot of the aggregated and detailed risk exposures, reported in dollars, and they run Monte Carlo simulations factoring in the fidelity levels of Impact to move a real-time risk assessment as close as possible to actual risk.
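A minimal sketch of the idea, under stated assumptions: live telemetry supplies the Vulnerability and Threat terms, and a Monte Carlo step widens or narrows the band around the Impact term according to a fidelity value expressing how well that impact figure is known. Everything below, including the fidelity parameter, is an illustrative invention rather than a description of any particular product:

```python
import random

random.seed(7)  # fixed seed so the illustration is reproducible

def realtime_risk_usd(vuln, threat_per_year, impact_usd, fidelity, trials=50_000):
    """Monte Carlo over Impact uncertainty.

    vuln and threat_per_year would come from live telemetry; impact_usd
    is the estimated loss; fidelity in (0, 1] narrows the uncertainty
    band around Impact as confidence in the estimate grows.
    """
    spread = (1.0 - fidelity) * impact_usd  # low fidelity -> wide band
    draws = sorted(
        vuln * threat_per_year * random.triangular(impact_usd - spread,
                                                   impact_usd + spread,
                                                   impact_usd)
        for _ in range(trials)
    )
    return draws[trials // 2], draws[int(0.95 * trials)]  # median, P95

median, p95 = realtime_risk_usd(vuln=0.02, threat_per_year=5,
                                impact_usd=2_000_000, fidelity=0.6)
print(f"median exposure ${median:,.0f}, P95 exposure ${p95:,.0f}")
```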
Whatever approach and technology one chooses to begin quantifying risk exposure, every business needs to engage in a risk assessment process that increases visibility into the dangers a cyberattack poses to its information resources, so that it can address day-to-day exposures on the basis of actual threat data correlated with the asset values at risk.
In risk assessment, the journey is often the reward. Just the activity of attempting to identify and understand all of the possible threats, vulnerabilities and impacts will result in a reduction of uncertainty and, collaterally, a reduction in risk.
In a New York State of Mind
And if you operate in New York State, you will have to do it anyway: the NYDFS regulation requires that vulnerabilities in public-facing applications containing customer PII take precedence over those in internal applications holding less sensitive information.
The bonus is that you might finally be able to align your cybersecurity budget with the actual risk to your enterprise, communicate effectively with your CFO and optimize how your cybersecurity resources are applied, so you can avoid becoming the next Equifax.
Who’d have thought that a government regulation could have such a positive and powerful outcome?