The UK's National Cyber Security Centre has received mountains of feedback on the security of the government's Covid-19 contact-tracing app, and has now taken the step of making multiple disclosures

The UK’s National Cyber Security Centre (NCSC) is working to triage a number of cyber security vulnerabilities in the beta Covid-19 coronavirus contact-tracing app, after receiving extensive feedback from researchers and the infosec community.

The NCSC and NHSX published technical details of the proposed app, including its code, earlier in May, with the intent both of demonstrating what the app will do and of inviting peer review.

Ian Levy, the NCSC’s technical director, said that because of the urgency arising from the pandemic, the app was being developed on a very compressed timeline, and compromises had been made for the sake of timeliness.

“We asked for feedback and we've had feedback – lots of it,” said Levy. “Thank you to everyone who’s taken the time to look at the design and the beta code and provide us with useful feedback, whether that’s directly, on GitHub or through the NCSC’s vulnerability disclosure programme. The whole team is genuinely grateful for the effort people have put in, to help us make the app the best it can be. Everything reported to the team will be properly triaged, although this is taking longer than normal.”

The three main vulnerabilities reported so far relate to the registration process for app users, the Bluetooth communication standard, and data encryption.

Levy said a number of people had contacted the NCSC to say there were weaknesses in the registration process, especially around the distribution of public keys and installation IDs. In response, he said the developers had taken a conscious decision to delay a more secure registration scheme until the Isle of Wight trial is concluded, and that these flaws will be fixed in a newly designed scheme.

The issues with Bluetooth centre on the possibility of identifying people from the data generated by the contact-tracing app. Levy conceded that the original documentation release may have given the impression that the NCSC was not taking this issue particularly seriously.

He highlighted one vulnerability that had been overlooked, identified by academics Vanessa Teague and Chris Culnane of the University of Melbourne. Because the encrypted IDs, or BroadcastValues, generated for each user device are long-lived, a malicious actor could link sightings of them together over time, which runs counter to the privacy protections specified in the Bluetooth Low Energy standard.
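The linkage problem arises because a static identifier makes every sighting of a device correlate trivially. A minimal sketch of the standard mitigation, rotating the broadcast identifier on a schedule as Bluetooth Low Energy's own privacy features do, might look like the following. The key-derivation scheme here is purely illustrative and is not the app's actual design:

```python
import hmac
import hashlib
import secrets

def broadcast_value(device_key: bytes, epoch: int) -> bytes:
    """Derive a short-lived broadcast ID for a given time epoch.

    Rotating the ID each epoch means two sightings in different
    epochs cannot be linked by a passive observer who lacks the
    device key.
    """
    msg = epoch.to_bytes(8, "big")
    return hmac.new(device_key, msg, hashlib.sha256).digest()[:16]

key = secrets.token_bytes(32)

# A long-lived ID is trivially linkable: every sighting is identical.
static_id = broadcast_value(key, 0)
sightings_static = [static_id for _ in range(3)]
assert len(set(sightings_static)) == 1

# Rotated IDs look unrelated to an eavesdropper, but the key holder
# (here, the health service back-end) can still re-derive them.
sightings_rotated = [broadcast_value(key, epoch) for epoch in range(3)]
assert len(set(sightings_rotated)) == 3
```

Only a party holding `device_key` can regenerate a given epoch's value and match it to a reported contact, which is what breaks the cross-epoch linkage Teague and Culnane described.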

Levy also acknowledged Teague and Culnane’s work detailing how contact events could be used to infer information about people, even if the encrypted ID information could not be recovered, and proposed to fix this by encrypting proximity logs on the device.

The encryption vulnerability in the beta app arises because proximity contact event data is not independently encrypted on the device before it is sent to the central servers. This, said Levy, means that data in transit to the back-end is protected only by the transport layer security (TLS) protocol, so that if Cloudflare were compromised in some way, cyber criminals could access that data.
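The fix Levy alludes to is layered protection: encrypt the proximity log on the device under a key that only the NHS back-end holds, so that wherever TLS is terminated en route, the payload remains opaque. The sketch below illustrates the idea with standard-library primitives only; the HMAC-based stream cipher is a toy stand-in for a proper authenticated cipher such as AES-GCM, and none of this reflects the app's real scheme:

```python
import hmac
import hashlib
import secrets

def keystream(key: bytes, nonce: bytes, length: int) -> bytes:
    """Expand key + nonce into a keystream, one HMAC block at a time."""
    out = b""
    counter = 0
    while len(out) < length:
        block = nonce + counter.to_bytes(4, "big")
        out += hmac.new(key, block, hashlib.sha256).digest()
        counter += 1
    return out[:length]

def encrypt(key: bytes, plaintext: bytes) -> bytes:
    """Toy stream encryption: nonce || (plaintext XOR keystream).

    A real design would use an AEAD cipher, which also authenticates
    the ciphertext; this sketch omits that for brevity.
    """
    nonce = secrets.token_bytes(16)
    ks = keystream(key, nonce, len(plaintext))
    return nonce + bytes(p ^ k for p, k in zip(plaintext, ks))

def decrypt(key: bytes, blob: bytes) -> bytes:
    nonce, ct = blob[:16], blob[16:]
    ks = keystream(key, nonce, len(ct))
    return bytes(c ^ k for c, k in zip(ct, ks))

# Key shared only between the device and the NHS back-end.
device_to_backend_key = secrets.token_bytes(32)

proximity_log = b'{"contacts": [{"id": "abc", "rssi": -60}]}'
upload = encrypt(device_to_backend_key, proximity_log)

# A compromised TLS terminator sees only ciphertext, not the log.
assert proximity_log not in upload
# The back-end, holding the key, recovers the log.
assert decrypt(device_to_backend_key, upload) == proximity_log
</imports>
</imports>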

He pointed out that this was something else that was sacrificed at first because of the need for speed.

Finally, Levy noted some ambiguities and errors in statements made about the beta app. Among these was a statement that “the infrastructure provider and the healthcare service can be assumed to be the same entity”. This suggests that the NCSC trusts the network bridging the gap between user devices and the central NHS servers in the same way as it trusts the whole of the NHS, which is clearly not the case.

He added that some errors were made in statements about how the Cloudflare system used by the app forwards client IP addresses; the back-end architecture of the app; and the computation of log submissions.


There was also some misleading wording in the app’s vulnerability disclosure policy, which said people could not publicly disclose any details of any vulnerabilities without NHSX’s consent. Levy said this had left egg on a few faces because it runs contrary to the NCSC’s approach to vulnerability disclosure.

“Responsible security researchers shouldn’t be arbitrarily gagged if the developer doesn’t respond,” he said. “This line got missed in the final review. My bad. It’s fixed.”

Levy added that in future versions of the app, the NCSC would try to publish more regular summaries of the backlog of issues, so that researchers can see what bugs have been spotted but not yet squished.

“This is what good security research should lead to – better products and services for all,” he said. “We get better by peer review, learning from each other, accepting when things aren’t quite right and fixing them.”

FireEye EMEA CTO David Grout said that although the NCSC was doing the right thing in addressing the security issues raised, the disclosures would make more people uneasy about the app ahead of its roll-out.

“Experts suggest that for the UK as a whole, about 60% of the population needs to install and use the software for it to live up to its full potential,” he said. “The government is relying on a public buy-in for the project to work.

“To get the public on side, the government will need to not only ensure data is stored securely, but also build trust by being open and transparent about the measures taken to defend citizen data, and also make the public aware of their rights to privacy.”
