Reports & Papers
from Belfer Center for Science and International Affairs, Harvard Kennedy School

Learning from Cyber Incidents: Adapting Aviation Safety Models to Cybersecurity


Executive Summary

Over four months in the spring of 2021, more than 70 experts participated in a virtual workshop on the concept of creating a “Cyber NTSB”. The workshop was funded by the National Science Foundation with additional support from the Hewlett Foundation, and organized by Harvard’s Belfer Center with support from Northeastern University’s Global Resilience Institute.

The first call for the creation of a Cyber NTSB was in 1991. Since that time, many practitioners and policymakers have invoked the analogy, but little has been done to develop the concept. This workshop was carried out with the goal of moving the concept forward.

The NTSB acts as an inspiring metaphor because it helped transform the nascent technology of aviation. It’s easy to forget how much airplanes shrank the planet over the last century, making it possible to go anywhere in the world quickly, safely, and reliably, but that was not always the case. It’s also easy to forget how often planes crashed. The NTSB is the best known of the broad, deep, and intertwined set of aviation safety programs the workshop examined. While participants challenged and tested the model, the ultimate conclusion was that the information technology industry does not have strong processes for extracting lessons learned and publishing them when incidents occur. Today, cybersecurity has no authoritative, independent investigations whose focus is learning lessons, distributing them, and enabling systematic improvements.

The continuing success of attackers demands a new approach to science and policy. Lesson-learning systems — ranging from the in-depth investigations of the NTSB to studying statistics about attacks or near misses — almost certainly have a part to play. There is experimentation, the heart of science, to be done in understanding what that part may be. That experimentation includes many scientific questions, such as searching for ways to maximize the value of data to defenders and system designers while limiting how it might inform attackers. It includes policy questions such as the ability of investigators to compel participation.

Building a strong learning system for cyber incidents will require overcoming a host of challenges, and making decisions in the face of uncertainty. We are confident that many of these issues can be usefully resolved by research, and that such research can also contribute to the success of an already established organization.

This workshop has led to the discovery of over 50 discrete research questions that we believe are worth investigating; it has generated 24 concrete findings for consideration by the technical, policy, and research communities; and it has produced a series of recommendations for the Biden Administration and Congress as they work to translate the concept of a Cyber NTSB into reality. Highlights of our findings include:

Third-party and in-house investigations are no substitute for objective, independent investigations.
Market forces dictate that most cyber incidents will not be revealed. When information is released, it is limited to the minimal amount that must be disclosed by law. When companies choose to share more, they do so carefully and in such a manner that they control the narrative, excluding any information that would not put them in a positive light. Thus, a Cyber NTSB is necessary to understand what contributed to an incident occurring and how other organizations can prevent a similar incident from happening to them.

Companies are unlikely to fully cooperate under a voluntary regime.
Subpoena authority will likely be necessary for a board to succeed in gaining access to the data and people needed to reconstruct a timeline and narrative of any incident. While the nascent Cyber Safety Review Board (CSRB) may be able to gain some insights into SolarWinds given the high profile of that incident, reviews of other incidents will likely be nearly impossible unless companies are required to cooperate.

Product, tool, and control failure must be identified in an objective manner.
The cybersecurity community shies away from identifying when products, controls, and the tools used to implement them fail. Without knowledge of these failures, control designers, toolmakers, and the defenders who implement their work are missing critical information needed to make good use of their security budgets and the time of security personnel.

Findings may be sensitive but should be disseminated as widely as possible.
Adversaries will doubtless pore over any reports produced by the CSRB or any other bodies that are stood up. This reality should not dissuade investigatory boards from producing reports; however, it may be that some reports, or sections of them, should be disseminated only over classified channels or have their dissemination limited in other ways.

Fact finding should be kept separate from fault finding.
Lawmakers must find ways to create circumstances under which the targets of cyber adversary activity can share details on their incidents without fearing that they will be penalized. Finding the right balance between providing liability protection and letting courts and regulators hold companies accountable for poor security practices will be difficult, but a balance must be struck.

“Near Miss” reporting can complement incident investigations.
Many of the factors that make investigation of incidents difficult are either absent or reduced when at least one control succeeded. A strong system of near miss reporting, complemented by investigation of those near misses, may yield meaningful results.

Making Progress

There is policy work to be done to enable us to learn lessons. Most importantly, Congress must work with the Administration to create and empower a set of entities to learn lessons at different scales and speeds, including the Solarium Commission’s Bureau of Cyber Statistics and a system to learn from near misses.

There are many immediate opportunities to improve security that policymakers in Congress and the Administration are pursuing. Similarly, there has been an explosion of research in information security, assurance, resilience, and other disciplines. This activity has obscured and eclipsed the need for deep lesson-learning systems. Our hope is that the full report not only informs the science and policy communities, but also helps the new CSRB see and meet its important goals.

Note: The findings and recommendations contained in this report were drawn from the multi-session workshop and are based on panelist and participant comments both during discussions and in follow-on email and Slack conversations. This report, however, does not represent a consensus among workshop participants. Participation in the workshop in no way serves as an endorsement of the findings and recommendations contained in this report. Similarly, mentions of specific companies or incidents, security standards, and the like are intended to provide vibrant examples, not comment on them generally. The report’s authors are solely responsible for its content.

Recommended citation

Knake, Robert, Adam Shostack, and Tarah Wheeler. “Learning from Cyber Incidents: Adapting Aviation Safety Models to Cybersecurity.” Belfer Center for Science and International Affairs, Harvard Kennedy School, November 12, 2021.