When the government learns about a vulnerability in a piece of software or hardware, it can either stockpile that vulnerability for its own use or disclose it to the relevant software or hardware maker so that a fix can be implemented and distributed. The government thus faces a choice: Should it favor its offensive capabilities by keeping the vulnerability secret and potentially exploitable, or should it give the manufacturer the opportunity to fix its product?
The U.S. government's answer to that question, as reflected in rules released last week governing its disclosure of software vulnerabilities, is "it depends."
Initially developed by the Obama administration, the so-called "Vulnerabilities Equities Process" (VEP) is an internal, executive-branch framework for determining when and whether the U.S. government should publicly disclose software and hardware flaws it discovers that may leave computers vulnerable to attack. Once Obama's VEP was announced, experts debated the strategic value of the process, its effectiveness, the secrecy surrounding it (only a partially-classified version had been made publicly available), and how to reform it.
In particular, experts were concerned that the VEP did not adequately prioritize defense. When the government keeps software and hardware vulnerabilities secret from those who could fix them, the individuals, businesses, and critical infrastructure that use the vulnerable technologies, and not only the government's targets, are left open to attack.
The Trump administration clearly listened to these critiques, and the new "VEP Charter" issued last week is more comprehensive and transparent than its predecessor. In the debate over whether to favor offensive capabilities or defensive efforts, the document states that disclosure serves the national interest in the "vast majority" of cases.
To ensure that this conclusion isn't simply lip service, however, more reforms are needed. That's because the charter leaves some important questions unanswered and still fails to ensure that decision-making is strongly weighted in favor of disclosure.
The Dangers of Stockpiling Vulnerabilities
As we have highlighted in several amicus briefs filed in cases challenging law enforcement hacking, government stockpiling of vulnerabilities creates security risks that experts do not know how to mitigate.
These include the risk that an attacker will steal the government's exploit code and use it for their own nefarious purposes, endangering businesses and human lives. For example, in 2016, the public learned that an entity calling itself the "Shadow Brokers" had obtained National Security Agency malware. Following some initial attempts to sell the exploits, the Shadow Brokers dumped dozens of NSA hacking tools online for free in April 2017. One of those tools exploited a flaw in Microsoft software. Once it was released, others on the internet repurposed it into WannaCry, a virulent piece of ransomware that infected hundreds of thousands of computer systems worldwide in May 2017. The very next month, the NotPetya malware attack combined that tool with another NSA exploit released by the Shadow Brokers. After initially hitting critical infrastructure in Ukraine, that attack spread internationally and infected hospitals, power companies, shipping companies, and the banking industry, endangering human life as well as economic activity.
Given these and other risks, the VEP must be carefully designed to determine when, if ever, offense should take precedence over defense.
A New and Improved VEP?
The new VEP establishes a process for identifying the small minority of cases where law enforcement or intelligence interests override the benefits of disclosure, and it provides a list of considerations that officials should weigh in deciding whether to disclose. Laudably, these considerations reach beyond law enforcement and national security interests to include information security and other personal and commercial concerns. The VEP also establishes that any decision not to disclose a vulnerability is to be reviewed at least annually. There's also a process for a government agency to appeal a decision it doesn't agree with.
For the charter to be truly effective, however, it has to be designed so that intelligence and law enforcement interests won't be routinely favored over information security, commercial interests, innovation, and civil liberties.
That's where it gets complicated. One problem is that not every vulnerability the government uses will go through the VEP: only the vulnerabilities the government itself discovers are subject to the process. Attack tools obtained from private vendors, who usually insist on non-disclosure agreements, are not covered, and neither are tools that friendly governments share with the U.S. As a result, the VEP will apply to only a subset of the vulnerabilities in the government's hands.
The second problem is that the VEP's design is inclined to produce answers that favor offense. That's true for two reasons. First, the questions the VEP sets out for deliberation are mostly unanswerable. For example: What is the potential value to the government of using a particular vulnerability? What are the potential consequences of using it? Can exploitation of this vulnerability by threat actors (like private hackers and foreign governments) be detected by the U.S. government or other members of the defensive community? How likely is it that threat actors will discover or acquire knowledge of this vulnerability if the government doesn't disclose it? Answering any one of these questions requires substantial guesswork.
Second, the people appointed to consider these unanswerable questions are disproportionately charged with an intelligence and law enforcement mission, and are likely to favor those considerations over others.
The members of the "Equities Review Board" that have a civilian mission, the Department of State and the Department of Commerce, are vastly outnumbered by agencies with a military, intelligence, or law enforcement mission, such as the Office of the Director of National Intelligence, the Department of Homeland Security, the Department of Justice, and more. If no consensus is reached, the board votes, and in any vote, law enforcement and national security interests command far more votes. These interests may very well regularly win out.
That's because the officials in the room guessing the answers to the VEP's questions will be strongly influenced by what they're graded on. If your job is to protect the public from terrorists or to spy on other nations, you're more likely to conclude that a vulnerability should be kept secret and exploited to help you accomplish that job. But if your job is to protect the public from ongoing data breaches, you're more likely to believe that the vulnerability should be disclosed to the vendor and patched before another hacker steals everyone's sensitive personal data.
Who Should Be in the Room?
For one, the Federal Communications Commission should participate in these decisions, given how many of these exploits involve mobile communications technology. The same goes for the Federal Trade Commission, the agency most clearly charged with ensuring the public's privacy and data security.
The security risks from decisions about vulnerability disclosure are not theoretical. What's at stake is the security of the public and the internet at large. That's why we need a transparent, auditable VEP that values civilian interests and strongly favors disclosure. The Trump administration's updated process is one step in the right direction, but more reforms are needed.