COVID-19 Contact Tracing Apps: Effective Virus Risk Management Tools or Privacy Nightmare?
Contact tracing apps for COVID-19 (coronavirus) leverage technology as a route to release citizens from lockdown. People would be set free and the economy could restart. It’s an attractive promise – but experts warn the technology hasn’t been sufficiently examined.
Different nations are adopting different approaches to coronavirus contact tracing (Google and Apple have switched their terminology from the emotive ‘tracing’ to ‘exposure notification’). The common factors are the use of mobile phones and Bluetooth.
An app uses Bluetooth to recognize and connect to a nearby phone running the same app. This can register the time, distance and duration of the proximity contact. If the user of one of the apps subsequently develops COVID-19 symptoms, the app’s logs can trace other users who may have been within the infection danger zone (time, distance, duration) and warn them to self-isolate.
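The logging-and-matching logic described above can be sketched in a few lines of Python. This is an illustrative assumption about how such an app might work, not the code of any real app; the field names, thresholds (2 meters, 15 minutes) and IDs are all hypothetical.

```python
from dataclasses import dataclass

# Hypothetical sketch of contact logging and exposure matching.
# All names and thresholds are illustrative assumptions.

@dataclass
class Contact:
    peer_id: str        # rotating anonymous ID broadcast by the other phone
    timestamp: float    # when the contact was first seen (epoch seconds)
    duration_s: float   # how long the two phones stayed in range
    distance_m: float   # estimated from Bluetooth signal strength (RSSI)

def at_risk(contacts, infected_ids, max_distance_m=2.0, min_duration_s=900):
    """Return contacts inside the 'infection danger zone': close enough,
    for long enough, to a peer later reported as infected."""
    return [
        c for c in contacts
        if c.peer_id in infected_ids
        and c.distance_m <= max_distance_m
        and c.duration_s >= min_duration_s
    ]

log = [
    Contact("aa11", 1588000000, 1200, 1.5),   # close and prolonged
    Contact("bb22", 1588000500, 60, 1.0),     # close, but too brief
    Contact("cc33", 1588001000, 1800, 10.0),  # prolonged, but too far away
]
risky = at_risk(log, infected_ids={"aa11", "bb22", "cc33"})
print([c.peer_id for c in risky])  # only the close, prolonged contact
```

The interesting design question, discussed below, is where this matching runs: on the phone itself or on a central server.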
The primary difference between contact tracing technologies is where the necessary log processing takes place. It can be done locally on the phones themselves (the approach taken by Google and Apple), or by a central server (the approach taken by apps used by the UK and Australia). The former is claimed to be more privacy friendly, while adopters of the latter claim it is more efficient.
The degree of user anonymization and encryption can also differ between different apps.
Will it work?
The purpose of contact tracing is to aid the release from population lockdown by more efficiently recognizing and quarantining only those people who are infected or at risk of being infected. The effect will simultaneously reduce or eliminate the progress of the disease in the wider population.
If the apps do not achieve this, they have no legitimate purpose. The question then is whether contact tracing apps are likely to succeed in their specified purpose.
Figures vary over the app uptake required for success. Some estimates say 60% is required to halt the disease, although a lower figure might slow it. In the UK, the University of Oxford’s Big Data Institute has estimated that 56% of the general population must use the app to halt the outbreak – which equates to 80% of all mobile phone users.
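The relationship between the two Oxford figures can be checked with a line of arithmetic: if 56% of the whole population corresponds to 80% of mobile phone users, the study is implicitly assuming phone ownership of 56/80 = 70% of the population.

```python
# Sanity-checking the Oxford Big Data Institute figures quoted above:
# 56% of the whole population = 80% of mobile phone users
# implies an assumed rate of phone ownership.
population_share = 0.56      # fraction of everyone who must run the app
phone_user_share = 0.80      # the same group, as a fraction of phone owners
phone_ownership = population_share / phone_user_share
print(f"Implied phone ownership: {phone_ownership:.0%}")  # 70%
```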
These figures probably assume that the apps in question work as expected – but there are huge problems. Firstly, Bluetooth signals pass straight through barriers that would otherwise block the virus. A former national coordinator for health information technology at the Department of Health and Human Services told The Verge, “If I am in the wide open, my Bluetooth and your Bluetooth might ping each other even if you’re much more than six feet away. You could be through the wall from me in an apartment, and it could ping that we’re having a proximity event. You could be on a different floor of the building, and it could ping. You could be biking by me in the open air and it could ping.”
These apps could easily become subject to a high number of false positives – and a high rate of false positives tends to drive users to reject an app altogether.
But at the same time, there is a likelihood of a high number of false negatives. Many people will refuse to adopt the app; others will lose or forget their phones, or have them stolen. In some cases the phone will be turned off, while in others the battery will be dead. In such cases, infected people could be infecting dozens of other people during the incubation period with no app warnings.
The success of contact tracing apps will then depend on the overall active uptake by users, and whether the big data analysts have got their figures right. One danger is that governments may be tempted to make adoption a legal requirement rather than a personal choice – and that will lead to a new set of problems and issues.
The security of the apps is paramount to their adoption and use; but it is worth noting that they are being developed under conditions that normally lead to buggy applications – extreme urgency combined with reduced regard for security by design and privacy impact assessments.
This is already evident in the early government-sponsored apps designed to help fight the pandemic. Earlier this month, ZeroFOX reported that government-sanctioned COVID-19 mobile applications are affected by vulnerabilities and privacy issues that put citizens at risk. An analysis of dozens of nation and government-sponsored mobile applications for Android revealed the existence of privacy risks, vulnerabilities and backdoors.
Grant Goodes, chief scientist at mobile application security firm Guardsquare, warns that without adequate application hardening during development, the result could be insecure apps causing unexpected problems. “Application security is often a distant second consideration in the development of new mobile apps, especially those required for rapidly evolving market demands (which definitely applies to the Covid-19 response),” he told SecurityWeek. “Mobile apps are typically focused on fast-to-market development, and while testing-for-quality is a mature field and usually well-integrated into the mobile app development process, security and security assurance are infrequently incorporated, or only brought in after significant security breaches have already occurred.”
The problem here is that the apps are being rushed out, almost certainly without adequate security and privacy testing. That testing will undoubtedly come from the hackers that will try to break them – and we won’t know how successful they will be until it is possibly too late.
But it is not just the general security of the app that is being questioned. Bluetooth has its own problems. “Just last year a major vulnerability [key negotiation] was announced that facilitated interception of Bluetooth data by attackers,” warns Chad McDonald, VP of customer experience at Arxan. ”Given that the data in question is personal health data, there exists a substantial risk to the individual.”
“Bluetooth is not bulletproof,” adds Tom Kellermann, head cybersecurity strategist at VMware Carbon Black. “Numerous vulnerabilities have been discovered like BlueFrag, which affected iOS and Android.” He also warns, “Contact tracing apps need to be regularly tested for vulnerabilities and critical updates must be deployed immediately. These apps must also be prohibited from activating smart assistants. People must limit the location settings to run only when approved and when in use.”
Finally, the security of centralized datasets is also questioned. Darren Wray, CTO at data privacy firm Guardum, comments, “In the era of big data, there is a bad habit of collecting all the data you can, even if it is not required or needed right now. More than once I have heard teams say ‘storage is cheap and who knows what we’ll be able to do with the data once we have it’… Governments,” he adds, “have a hoarder mentality, keeping as much personal data as possible and keeping it far beyond its useful life.”
Such central datasets become an automatic attraction to hackers. Tony Cole, CTO at Attivo Networks warns, “First, although it will certainly give [governments] more data on how the virus spreads, it could also lead to focused efforts from adversaries to compromise the centralized database and either modify, destroy, or steal the data set. The data could also be used to violate people’s privacy since it could likely be correlated against other centralized datasets such as the CCTV systems to determine who spread an infection if both systems were compromised. It is likely over time that such contact tracing datasets would be compromised.”
Privacy is the biggest single concern raised over the contact tracing apps. It is most vocal where the data processing is undertaken by a central server, which will almost certainly have at least some government connection. Since Snowden’s revelations on NSA and GCHQ practices, there has been widespread distrust of government agencies’ use of personal data. This distrust was a major causative factor behind the evolution of Europe’s General Data Protection Regulation (GDPR), and it is worth considering contact tracing apps in the light of GDPR.
First, it should be noted that the concept of COVID-19 tracing has been accepted by most data protection regulators. The UK’s Information Commissioner’s Office, for example, stated, “Generalized location data trend analysis is helping to tackle the coronavirus crisis. Where this data is properly anonymized and aggregated, it does not fall under data protection law because no individual is identified. In these circumstances, privacy laws are not breached as long as the appropriate safeguards are in place.”
While this may be the legal view, it is not universally accepted by academics. The problem is that anonymization is rarely watertight, and ‘anonymized’ users can usually be re-identified. A 2019 study led by Dr Yves-Alexandre de Montjoye claimed that 99.98% of Americans could be correctly re-identified in any ‘anonymized’ dataset using just 15 characteristics, including age, gender, and marital status.
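A toy illustration of the re-identification risk (an assumption-laden sketch, not Dr de Montjoye’s actual statistical model): even a handful of quasi-identifiers can single out one record in a nominally anonymous dataset. All the names and records below are invented.

```python
# Toy re-identification by quasi-identifiers. The 'anonymized' dataset
# carries no names, yet combining a few attributes with an outside source
# can isolate a single record. Entirely hypothetical data.

anonymized = [
    {"age": 34, "gender": "F", "postcode": "SW1A", "diagnosis": "positive"},
    {"age": 34, "gender": "M", "postcode": "SW1A", "diagnosis": "negative"},
    {"age": 51, "gender": "F", "postcode": "SW1A", "diagnosis": "negative"},
]

# An attacker's side channel: a public record linking a name to the same attributes.
public = {"Alice": {"age": 34, "gender": "F", "postcode": "SW1A"}}

def reidentify(name):
    attrs = public[name]
    matches = [r for r in anonymized
               if all(r[k] == v for k, v in attrs.items())]
    # A unique match means the 'anonymous' record has been re-identified.
    return matches[0] if len(matches) == 1 else None

record = reidentify("Alice")
print(record["diagnosis"] if record else "not unique")
```

With three attributes and three records, one match suffices here; de Montjoye’s point is that with 15 attributes, uniqueness becomes near-certain at national scale.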
What is not easily understood is just how much data eventually gets collated in central contact tracing databases. However, UK-based privacy and GDPR expert Tara Taubman-Bassirian pointed to the Zoe Global privacy notice. Zoe Global has been collecting personal data voluntarily given by users to track the spread of COVID-19 within the UK. The personal data it collects includes information about health (including body temperature, height and weight), information about pre-existing conditions, information about symptoms, COVID-19 test status, details of any treatment, sex at birth, year of birth, location (including postcode), and whether the user is a health worker coming into contact with patients. This data is shared with (apart from the NHS and hospitals) universities, health charities and other research institutions. The legal basis, considering GDPR, is user consent.
It is unlikely that government-produced contact tracing apps will collect quite so much personal data, but where central servers are concerned there is very little transparency over what is collected, nor over what happens to the data afterwards. The UK’s central server-based contact tracing app is expected to roll out on a trial basis next week.
Meanwhile, the Australian app (COVIDsafe, which also uses a central server) has been analyzed by researchers. The code is similar to the Singaporean TraceTogether app, but with some differences. Encrypted IDs are downloaded to the app from a central server and rotated regularly, so that contact tracing works without the phone being trackable via a static identifier. TraceTogether appears to download a batch of IDs every day and rotate through them every 15 minutes; COVIDsafe downloads a single new ID every two hours. If a download is missed, the existing ID remains in use for at least a further two hours – making tracking easier. It also means that users effectively inform the central repository whenever they turn the app off, because the scheduled download never arrives.
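The two rotation schemes described above can be contrasted in a short sketch. This is an illustrative model of the researchers’ description, not code from either app; the helper names and IDs are assumptions.

```python
# Sketch of the two ID-rotation schemes (illustrative, not real app code).
# TraceTogether-style: a daily batch of IDs, rotated every 15 minutes.
# COVIDsafe-style: one new ID every 2 hours; a missed download means the
# old ID is reused, widening the window in which a phone can be tracked.

def tracetogether_id(batch, minutes_since_midnight):
    """Pick from a pre-downloaded daily batch, rotating every 15 minutes."""
    return batch[(minutes_since_midnight // 15) % len(batch)]

def covidsafe_id(current_id, fresh_id_or_none):
    """Use the newly downloaded ID; on a missed download, keep the old one."""
    return fresh_id_or_none if fresh_id_or_none is not None else current_id

batch = [f"tt-{i:02d}" for i in range(96)]  # 96 fifteen-minute slots per day
assert tracetogether_id(batch, 0) != tracetogether_id(batch, 15)

# Two consecutive missed downloads leave one ID broadcasting for 6+ hours:
cid = "cs-01"
cid = covidsafe_id(cid, None)   # 2-hour mark: download missed
cid = covidsafe_id(cid, None)   # 4-hour mark: download missed again
print(cid)  # still "cs-01" -> linkable across the whole period
```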
The researchers conclude, “Like TraceTogether, there are still serious privacy problems if we consider the central authority to be an adversary. That authority, whether Amazon, the Australian government or whoever accesses the server, can recognize all your encrypted IDs if they are heard on Bluetooth devices as you go, recognize them on your phone if it acquires your phone, and learn your contacts if you test positive.” None of this is strictly necessary if the purpose of the app is simply to inform other users that they may need to self-isolate.
The Google/Apple approach is different. They are building the ability for third-party apps to cooperate with their operating systems, but the operating systems rather than the apps will control the data sent to a server for processing. In this way, they can limit the amount of data that is shared with the server before the server warns those who may have come into contact with someone showing COVID-19 symptoms. According to both companies, the data will not be linked to people’s names but will use anonymous IDs linked to a device. Phone owners will not know who might have passed on the virus. This is clearly more privacy-conscious than the app sending data of its own choice to a government-controlled central server, and is likely to be adopted by most European Union nations.
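The decentralized model’s key property can be shown in a minimal sketch: the server publishes only the IDs of users who reported a positive test, and the matching against the contact log happens on the device, so the log itself never leaves the phone. This is a loose illustration of the design described above, with invented identifiers, not the actual Exposure Notification API.

```python
# Minimal sketch of decentralized (on-device) exposure matching, loosely
# modelled on the Google/Apple design. Names and flow are assumptions.

# The phone keeps its own log of rotating IDs heard over Bluetooth.
heard_ids = {"id-7f", "id-3c", "id-9a"}

# The server publishes only the IDs of users who reported a positive test;
# it never sees any phone's contact log.
published_positive_ids = {"id-3c", "id-ee"}

# Matching happens locally: the result never leaves the device.
exposed = bool(heard_ids & published_positive_ids)
print("Possible exposure" if exposed else "No known exposure")
```

Contrast with the centralized model, where the contact log would be uploaded and this intersection computed server-side, exposing the whole social graph to whoever controls the server.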
The future of contact surveillance
Despite current doubts over contact tracing and user privacy, the biggest concern among most security professionals is for the future – that the current pandemic and the response to it will normalize the concept of contact tracing by phone. This is not a universally held worry.
In the UK, former prime minister Tony Blair’s Institute for Global Change has reported, “Compared to the alternatives, leaning into the aggressive use of the technology to help stop the spread of Covid-19… is a reasonable proposition.”
In the U.S., Parham Eftekhari, board chair at the Institute for Critical Infrastructure Technology, told SecurityWeek, “It’s worth pointing out that anyone who has a smartphone, smart-watch, or an Alexa already has a surveillance device in their life. The notion that corporations are somehow going to increase surveillance through technology designed to address a pandemic seems irrational. We willingly buy technology with surveillance technology built in it all the time. If society is truly concerned about corporate surveillance, they will demand congress enact meaningful legislation to protect consumer data rights and will begin to make purchasing decisions that prioritize security and privacy over functionality and cost.”
Other views take a ‘Yes, but…’ stance. Felix Marx, CEO at Trūata, told SecurityWeek, “There is clearly a societal need and purpose for utilizing this data for the greater good.” But he added, “issues such as transparency cannot be overlooked even in these most challenging circumstances. Questions that need to be considered include what type of personal data is being shared, for what purposes and for how long?”
Other pundits are more concerned. “This is turning on Pandora’s Bluetooth,” Arxan’s Chad McDonald told SecurityWeek. “Once we open the door to this, it’s never going away. I’m not arguing the efficacy or whether the greater good is being served, only that this will become persistent. This will be a COVID-19-enabled Patriot Act of sorts.”
He added, “What we won’t see for some time is how a district attorney or federal agency used this data as part of a prosecution. In the same mentality as ‘follow the money’, information is the new currency and this type of location data is a veritable gold mine. This application or applications like it won’t go away and we’ll happily sign up to hand over just one more bit of our privacy for a promise to return to what we once knew as a normal day outside.”
Joseph Carson, chief security scientist and advisory CISO at Thycotic, said, “The problem I have is that once you start in this direction it’s hard to reverse it. We may not care that much at the minute, but some countries’ use of contact tracing apps could end up sacrificing our personal privacy rights long-term for the sake of surveillance use as a convenient solution to a short-term pandemic.”
Whether contact tracing apps succeed in curtailing the spread of COVID-19 will not be known for weeks or even months after their use. Justification for their privacy encroachments will depend upon that presumed success. A common fear, however, is that the privacy battle will only really begin after the COVID-19 all-clear (assuming that is ever even possible). Users will need to push back hard to ensure that data collected by the apps is deleted, and that the technology isn’t simply repurposed for other ends: criminal investigation, national security, or marketing.
“I find the whole concept to be of questionable value in exchange for another step down the loss of personal privacy,” Chris Morales, head of security analytics at Vectra, told SecurityWeek. “What is being proposed here is an API that would allow an installed app to extract specific information that maps user proximity with others who are using the same installed app via the use of Bluetooth signal.
“There have been similar efforts by marketing departments in the past to understand consumer habits. It is referred to as proximity marketing. I already found that concept to be a bit spooky. Unfortunately, most people don’t even know that exists. This just takes it to the next level.” Morales will not be voluntarily installing a contact tracing app.
Chris Hazelton, director of security solutions at mobile phishing firm Lookout, has similar concerns. “Tracking through mobile phones will continue and increase beyond just mobile analytics within advertising networks. The value of tracking will be realized by governments and many will be reluctant to turn these capabilities off.”
The long-term danger of contact tracing apps is that they might normalize the concept of total, mobile phone-based government surveillance ‘for the greater good’.