Measuring the Effectiveness of Your Cybersecurity Program Against Evolving Cybersecurity Risks

By Eric Cosman
Category: ARC Report Abstract

Overview

Previous ARC Advisory Group reports have discussed how to define and implement cybersecurity programs based on guidance and requirements available from standards, guidelines, frameworks, and other sources. However, implementation is just the beginning, since responding to constantly evolving cybersecurity risks is not a project but a process. As with any management process, it must include provisions for continuous improvement, including defining metrics and assessing performance.

While considerable information is available about requirements and performance levels, there is comparatively little practical guidance on how to define and implement effective metrics. Standards committees and others have attempted to address the subject, but with only limited success.

To be most effective, these metrics must reflect the needs and expectations of major stakeholders, including those who must approve the expense of operating the cybersecurity program. Metrics must also be defined in a manner that allows for objective assessment of progress.

Asset owners who have implemented successful programs no doubt have such metrics. Whenever possible, they should consider sharing how they have defined them and conducted performance assessments. This allows others to learn from their experience.

The Cybersecurity Response Is Improving

Due to increased reporting of cyber threats, vulnerabilities, and incidents, those responsible for industrial automation systems employed in critical infrastructure are well aware of the need to improve cybersecurity. Efforts over the past decade or more have produced a wealth of standards, guidance, and related information about how to mount an effective response to the risk.

Agreeing on Potential Consequences

While we now hear fewer debates about whether the risk is real, there is still some disagreement about the likelihood of certain consequences. For example, many asset owners with hazardous processes or materials still rely on dedicated safety protection systems to prevent serious process upsets. Recently reported events have shown that these systems can also be vulnerable to cyber-attack, particularly if they connect to a network. Even safety systems that are not connected can be compromised if the systems used to program and configure them are not adequately protected.

Defining a Response

With each new incident, our knowledge and understanding of the potential consequences have increased. It is now quite common for asset owners to create formal response plans as part of a broader cybersecurity management program. There is now a sizable body of knowledge about how best to structure such a program, including organizational models, role definitions, required skills, and associated processes and procedures. Standards provide very specific requirements with respect to the measures needed and the performance levels expected. For regulated industries, these may be augmented by even more specific regulations from a government or overseeing agency. Many of these sources also recommend or require a way to gauge the effectiveness of the response over time.

Finally, sources such as the NIST Cybersecurity Framework (CSF) provide very specific guidance on how to structure an adequate response and what elements must be included.

Moving Toward Continuous Improvement Against Evolving Cybersecurity Risks

Increased awareness and acceptance of the threats facing industrial automation systems is a welcome trend. More asset owners have implemented comprehensive cybersecurity management programs, often with the assistance of trusted consultants and advisors. Standards, practices, and frameworks are useful in constructing such programs and defining objectives. But a structured approach is also needed for planning and applying system changes, controls, and countermeasures, and for measuring the resulting changes in performance. These measurements are expressed relative to a baseline established at the beginning of the program.

Plan, Do, Check, Act

This is essentially the same as the “Plan, Do, Check, Act” (PDCA) cycle typically used in areas such as quality management and process or system safety. Although the subject matter and related goals may be different, similar or identical tools are applicable. This is particularly true when comparing cybersecurity and safety.
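As a simple illustration, the sketch below expresses the PDCA cycle as a repeating management loop for a cybersecurity program. It is a minimal sketch in Python; the function names, metrics, and sample data are hypothetical and are not drawn from any particular standard or framework.

```python
# A minimal, hypothetical sketch of a PDCA-style improvement loop for a
# cybersecurity program. Names, metrics, and data are illustrative only.

def plan(risk_register):
    """Plan: select the risks to treat in this cycle."""
    return [item for item in risk_register if item["risk"] >= 0.5]

def do(planned_items):
    """Do: apply the selected countermeasures (e.g., patching, segmentation)."""
    return {item["id"]: "implemented" for item in planned_items}

def check(baseline, current):
    """Check: compare measured performance against the program baseline."""
    return {name: current[name] - baseline.get(name, 0.0) for name in current}

def act(deltas):
    """Act: flag metrics that did not improve, for the next planning round."""
    return [name for name, delta in deltas.items() if delta <= 0]

# One pass through the cycle with made-up data.
risk_register = [{"id": "unpatched-hmi", "risk": 0.8},
                 {"id": "shared-passwords", "risk": 0.3}]
baseline = {"patch_coverage": 0.60, "response_drills": 2}
current = {"patch_coverage": 0.75, "response_drills": 2}

selected = plan(risk_register)
status = do(selected)
shortfalls = act(check(baseline, current))
print(status, shortfalls)  # -> {'unpatched-hmi': 'implemented'} ['response_drills']
```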

Cybersecurity experts have acknowledged the relationship with process or system safety. If a system is not secure, it is difficult or impossible to ensure its safe operation. The tools and methods used for analyzing risk are essentially the same, and some consultants have already defined procedures for cybersecurity process hazard analysis (CyberPHA). Some asset owners have considered linking their safety and cybersecurity programs. Many asset owners have used information from existing standards, guidelines, and frameworks to complete the first two steps of the PDCA cycle as part of their cybersecurity management program. While this is a good start, it is not sufficient to address evolving threats and vulnerabilities. New consequences can also emerge as a result of more detailed process analysis.

The Basis for Improvement

Sustaining such a program over the long term requires appropriate actions in the check and act steps of the change cycle. Specifically, those accountable for the program must provide methods for checking performance and the means for making improvements based on those measurements. Frequent assessments and measurements must be an integral part of the program, with the results presented to management to gain support for the required changes.

In a variety of areas, including cybersecurity, experience has shown that continuous improvement is achieved in evolutionary steps rather than larger revolutionary leaps. This was the basis for the Capability Maturity Model (CMM) framework first described in the journal IEEE Software. It organizes these evolutionary steps into five maturity levels that lay successive foundations for continuous process improvement. The same CMM framework has also been applied to cybersecurity. A notable example is the C2M2 model available from the US Department of Energy. The US Department of Homeland Security has also endorsed this methodology.

There are also other interpretations of the need for progressive levels of maturity in the cybersecurity response. ARC provides a cybersecurity maturity model to help organizations understand and use cybersecurity maturity in their planning. Each level in this model represents a recommended sequence of cybersecurity objectives and an associated set of defensive actions. Each successive level provides an additional layer of cybersecurity protection and prepares the organization to advance to the next level. Each level also has certain costs and support requirements. For the lower levels these are minimal, and the risk reduction benefits are significant. Advancing to higher levels should be tempered by specific cybersecurity concerns and the resources available.
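The sketch below shows one way such a progressive model could be recorded and used to gauge an organization's current standing. It is hypothetical: the level names, objectives, and defensive actions are invented for illustration and are not taken from the ARC model, CMM, or C2M2.

```python
# A hypothetical sketch of a progressive cybersecurity maturity model.
# Level names, objectives, and actions are invented for illustration.
from dataclasses import dataclass

@dataclass
class MaturityLevel:
    level: int
    name: str
    objectives: list   # cybersecurity objectives for this level
    actions: list      # defensive actions that build on prior levels

MODEL = [
    MaturityLevel(1, "Foundational", ["know what you have"], ["asset inventory"]),
    MaturityLevel(2, "Defended", ["limit exposure"], ["network segmentation", "firewalls"]),
    MaturityLevel(3, "Managed", ["detect and respond"], ["monitoring", "response plan"]),
]

def achieved_level(completed_actions):
    """Return the highest level whose defensive actions are all in place."""
    achieved = 0
    for lvl in MODEL:
        if all(action in completed_actions for action in lvl.actions):
            achieved = lvl.level
        else:
            break
    return achieved

print(achieved_level({"asset inventory", "firewalls", "network segmentation"}))  # -> 2
```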

Metrics Required

While maturity models provide a useful context, qualitative descriptions are not sufficient as the basis for determining specific opportunities for improvement. They must be supported by specific metrics that address the aspects of the program that can be adjusted to improve performance. Examples include patching, implementing new capabilities, and responding to newly identified threats and vulnerabilities. These metrics must be relevant and unambiguous in order to establish an appropriate scale for measurement.
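For instance, a patching metric might be expressed as the fraction of assets with current security updates, tracked against the baseline established when the program began. The sketch below shows how such a metric could be computed; the asset records and baseline value are hypothetical and purely illustrative.

```python
# A hypothetical patch-coverage metric expressed relative to a program
# baseline. Asset records and the baseline value are invented for illustration.

def patch_coverage(assets):
    """Fraction of assets whose security patches are up to date."""
    if not assets:
        return 0.0
    return sum(1 for asset in assets if asset["patched"]) / len(assets)

assets = [
    {"id": "plc-01", "patched": True},
    {"id": "hmi-02", "patched": False},
    {"id": "eng-ws-03", "patched": True},
    {"id": "historian-04", "patched": True},
]

baseline = 0.50                 # coverage when the program baseline was set
current = patch_coverage(assets)

print(f"Patch coverage: {current:.0%} "
      f"(baseline {baseline:.0%}, change {current - baseline:+.0%})")
```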

ARC Advisory Group clients can view the complete report at the ARC Client Portal.

If you would like to buy this report or obtain information about how to become a client, please Contact Us.

Keywords: Cybersecurity Programs, Assessment, Conformance, Continuous Improvement, Metrics, Performance, ARC Advisory Group.
