GCHQ has revealed for the first time how it makes decisions over which software flaws to keep secret to use for intelligence.
The service has teams of researchers that find vulnerabilities in a variety of computer software and systems, from the most popular used by millions of people to niche technical kit.
If weaknesses are found, a decision has to be made whether to tell the company responsible so they can take action, or keep the flaw secret and use it for intelligence purposes.
A statement published on the GCHQ and National Cyber Security Centre (NCSC) websites on Thursday said: “We’ve discovered vulnerabilities and informed the vendors of every major mobile and desktop platform for over 20 years.
“This work plays an important role in helping to secure the technology which underpins our economy and the everyday lives of millions of people in the UK and abroad.
“However, we do not disclose every vulnerability we find. In some cases, we judge that the UK’s national security interests are better served by ‘retaining’ knowledge of a vulnerability.”
The statement says the information can be used “to gather intelligence and disrupt the activities of those who seek to do the UK harm, including terror groups, serious and organised crime gangs, and malign states”.
Factors that might lead to a weakness being kept secret are:
– there is no way to fix it
– the product is no longer supported
– the product is so poorly designed it can never be secure
– there is an overriding intelligence requirement that cannot be fulfilled in any other way
The intelligence requirement must relate to a current case or one expected in the near future, and the decision is kept under review. A vulnerability can be retained for no longer than a year before it must be reassessed.
The practice of retaining vulnerabilities sparked controversy in the US after information stolen from the National Security Agency was used to stage the massive WannaCry attack in 2017, which affected a number of organisations internationally including the NHS.
After the attack, Microsoft president Brad Smith condemned US authorities for the process of “stockpiling vulnerabilities” – something GCHQ is adamant it does not do.
He used a blog entry in May 2017 to call for governments to be forced to report issues to vendors, and said: “Repeatedly, exploits in the hands of governments have leaked into the public domain and caused widespread damage. An equivalent scenario with conventional weapons would be the US military having some of its Tomahawk missiles stolen.
“The governments of the world should treat this attack as a wake-up call. They need to take a different approach and adhere in cyberspace to the same rules applied to weapons in the physical world.
“We need governments to consider the damage to civilians that comes from hoarding these vulnerabilities and the use of these exploits.”
However, earlier this year the tech giant named NCSC as one of its top five bounty hunters – researchers who find bugs and flag them up to the vendor.
Dr Ian Levy, technical director of the NCSC, said that if a vulnerability similar to the one exploited in the WannaCry attack was discovered in the future, it would “almost certainly” be flagged under the UK system.
He said: “Because it is quite highly wormable (capable of being turned into a malicious programme that spreads itself) we would have pushed for a disclosure. If a vulnerability similar to the one exploited in the WannaCry attack was discovered it would almost certainly have been disclosed in our process.”
The NCSC’s decision-making process starts with the expert Equities Technical Panel. If the panel cannot agree on whether to keep a flaw secret, the case goes to the GCHQ Equity Board, which includes representation from other government agencies and departments.
If agreement still cannot be reached, the decision goes to the chief executive of the NCSC with advice from an oversight committee, and in rare cases, if there is still no consensus, it can be referred up to the director of GCHQ and ultimately the foreign secretary.