14 May 2020

Tortoise AI Summit

Tracking the tracers

We read the privacy policies of 48 contact-tracing apps for Covid-19. They’re often opaque, incomplete and impenetrable to the average reader

By Alexandra Mousavizadeh, Kim Darrah and Alex Clark

This article is part of Tortoise’s pre-read content for the Tortoise AI Summit taking place tomorrow. It’s open to all – register your place here.

Smartphone apps deployed by governments to stop the spread of coronavirus often fail to be upfront about how they use their users’ data, Tortoise Intelligence has found.

Basic assurances, such as a set date when personal data will be deleted from central servers or an explanation of how to request to view one’s own data, are notably absent from many of the privacy policies of the contact-tracing apps we examined, which together have been downloaded at least 55 million times globally.

Over a third of apps investigated had no dedicated privacy policy whatsoever, at least four will share data with law enforcement, and the overwhelming majority of policies are as hard to read as a university-level textbook.

The apps – designed to notify users when they’ve been close to someone with coronavirus – are being deployed at speed by governments to stop the spread of Covid-19 and offer a way out of lockdown. There are now at least 48 available across Google’s Play Store and Apple’s App Store, all released in the span of the past 10 weeks, including a test version of the UK’s app now being trialled on the Isle of Wight.

Potential collection of location and health data on a mass scale comes as governments and health authorities look to harness modern data science techniques to analyse, monitor and forecast the progression of the pandemic. When approached for comment by Tortoise Intelligence, the UK government did not rule out the use of machine learning techniques on the centralised data collected anonymously from the NHS contact-tracing app.

But data protection experts are warning that the coronavirus crisis could usher in a new contract between governments and their citizens, where privacy concerns are deprioritised in favour of finding ways to suppress the virus using technology and large datasets. And yet the success of South Korea in tackling Covid-19 has been credited, in part, to the country’s effective deployment of both digital and manual contact tracing. Widespread uptake of such apps is critical to their success, according to studies such as one published by Oxford academics last month, which found that at least 60 per cent of the UK population would need to download the NHS app to suppress the epidemic.

To make sense of what agreements people across the world are entering into with governments for health alerts, Tortoise Intelligence studied the privacy policies of 48 contact-tracing apps, spanning 26 different countries, based on a list compiled by research site Top10VPN. Privacy policies are subject to varying legal requirements around the world, but, in general, they provide an opportunity for app developers to give a sense of what personal information is gathered, how it is used, and how it is protected.

We found a patchwork of approaches, varying wildly from carefully worded notices in some European countries subject to the EU’s data protection regulations to others that explicitly leave the door open for law enforcement or advertisers to access people’s personal data.

Our other findings are set out below.

The apps covered by our analysis encompass a wide range of data-collection approaches and technological designs, each with its own implications for privacy. One important distinction is that some, like the UK’s new app, collect Bluetooth proximity data, while others rely on GPS location data. While Bluetooth is generally seen as less intrusive from a privacy perspective, it still produces fine-grained records of close proximity between people, which have the potential to be very sensitive.

The data behind both becomes much more sensitive if linked with people’s identities. Many apps, such as the one built by NHSX, deliberately ensure that app data isn’t connected to people’s names, but others, such as Iceland’s Rakning C-19 app, downloaded at least 50,000 times, can explicitly link people’s phones to their legal identity (though in Iceland’s case only if they are diagnosed with the virus and give their consent).

One design feature that privacy advocates have emphasised is whether the app’s framework is centralised or decentralised. The former means data generated by the app is uploaded to a central server, though often encrypted and anonymised in the process, whereas the latter means data stays on individual phones. Both Bluetooth- and GPS-based apps can have a centralised or decentralised framework.
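To make the distinction concrete, the sketch below, written in Python purely for illustration, shows how a decentralised Bluetooth design works in principle – loosely in the spirit of protocols such as DP-3T, and not a reproduction of the NHSX app or any other specific app. Phones swap random, rotating identifiers; each phone keeps its contact log locally; and matching against the identifiers of people who test positive happens on the handset. In a centralised design, the contact log itself would be uploaded so that a server could do the matching. The Phone class and its methods are hypothetical.

    import secrets

    class Phone:
        """Toy model of a handset in a decentralised, Bluetooth-based design."""

        def __init__(self):
            self.own_ids = []       # identifiers this phone has broadcast
            self.heard_ids = set()  # identifiers overheard nearby; never leave the device

        def broadcast_id(self):
            # Generate a fresh random identifier (rotated regularly in real protocols)
            eid = secrets.token_hex(16)
            self.own_ids.append(eid)
            return eid

        def record_contact(self, eid):
            # Store an overheard identifier in the local contact log
            self.heard_ids.add(eid)

        def check_exposure(self, published_infected_ids):
            # Matching happens on the phone: compare the local log against
            # identifiers voluntarily published by users who test positive
            return bool(self.heard_ids & set(published_infected_ids))

    # Two phones come within Bluetooth range
    alice, bob = Phone(), Phone()
    bob.record_contact(alice.broadcast_id())

    # Alice tests positive and consents to publishing her identifiers.
    # In a centralised design, Bob's contact log would instead be uploaded
    # and the matching would happen on a government server.
    print(bob.check_exposure(alice.own_ids))  # True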

One of the most striking things about our investigation of the 48 apps is that 16 had no privacy policy whatsoever, with the link to the policy on the Google Play Store either completely broken or leading to a privacy policy that failed to address the contact-tracing app itself.

Bahrain’s contact-tracing app, for instance, simply links to the privacy policy of its public health website, which instead deals with the likes of in-browser cookies and blog posts by the government. One app deployed in the US and developed by MIT links straight to the university’s five-year-old generic privacy policy.

Some apps included an automatically generated privacy policy that didn’t once mention anything to do with the app itself – these were also counted as having no privacy policy. One of these apps, which was built by the Indian state government of Maharashtra, concludes its privacy policy by saying: “This privacy policy page was created at privacypolicytemplate.net” and doesn’t once mention contact tracing or the data needed to do it.

We found that, together, apps with no dedicated privacy policy have been downloaded by at least 550,000 people. There are several more apps, including Australia’s COVIDSafe app, which do have a privacy policy but fail to link to it directly from the Play Store, adding an extra obstacle for users looking to read up on their rights.

Out of those that did have a policy, we found that most were dense and tricky to read. A well-written privacy policy is never going to make for gripping reading, but data protection experts argue they should at least be digestible, clear and concise.

“Nearly all apps comply with the obligation to produce a privacy policy on paper. But is it actually comprehensible to the average reader? And do they put much effort into putting it in a form that people might actually digest the key information? A lot of the time, the answer’s no,” says James Clark, Senior Associate at law firm DLA Piper. “That problem becomes particularly acute if you’re talking about something like contact tracing because, in a free society in the West, where you can’t compel people to use these apps, it’s only going to work if you can engender public trust.”

Our analysis found that the vast majority score extremely low on the Flesch-Kincaid test, a scoring system sometimes used in schools to work out a reading age for books. It judges texts with shorter sentences and words with fewer syllables to be more readable. Far from being easily accessible, most privacy policies have the reading age of a university textbook, with the Israeli contact-tracing app The Shield scoring worse than the introduction to German philosopher Immanuel Kant’s Critique of Pure Reason.
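The readability scores behind this kind of analysis come from simple formulas over sentence length and syllable count. One widely used variant, the Flesch reading-ease score (on which lower means harder to read), can be approximated roughly in a few lines of Python, as below. The naive vowel-group syllable counter means the numbers are indicative only, and the example sentences are ours rather than drawn from any policy.

    import re

    def flesch_reading_ease(text):
        # Lower scores mean harder text; the naive syllable count makes
        # this a rough approximation of the real formula.
        sentences = max(1, len(re.findall(r"[.!?]+", text)))
        words = re.findall(r"[A-Za-z']+", text)
        syllables = sum(max(1, len(re.findall(r"[aeiouy]+", w.lower()))) for w in words)
        n = max(1, len(words))
        return 206.835 - 1.015 * (n / sentences) - 84.6 * (syllables / n)

    print(flesch_reading_ease("We delete your data after 21 days."))  # short, plain sentence
    print(flesch_reading_ease(
        "Personal data shall be retained insofar as is necessitated by the "
        "fulfilment of the aforementioned epidemiological objectives."))  # dense legalese scores far lower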

One of the big worries about these apps from the perspective of privacy advocates is “function creep” – the idea that, while data may not immediately be used for anything other than coronavirus, it could over time be repurposed. “If you have a contact tracing app that is collecting data about where you have been 24/7, it will be designed for a very clear purpose, which is to alert you and others of the risk of infection. But there is a risk of function creep where the data ends up being used for surveillance purposes. This is the issue with anything to do with location data collection – any type of data gathering that involves location can potentially be used for surveillance purposes, so it is important to consider how to prevent this at the outset,” says Eduardo Ustaran, co-head of the Hogan Lovells Privacy and Cybersecurity practice.

One well-established way of protecting against this is for the creators of an app to lay out a set date when all personal data will be deleted from central servers. Some apps, such as Canada’s ABTraceTogether, delete any centrally held data on users’ movements after 21 days. But not all were quite so strict – some specified a date long into the future. Poland’s Home Quarantine app, for example, said the government would delete data after six years, while an app for the Mexican state of Jalisco gave itself five years. Others would specify a date when data would be deleted from a user’s phone, but not when data would be deleted from a central server if uploaded. The privacy policy of North Macedonian app StopKorona! details how users could request to delete data from the government’s server – but does not specify if such data would ever be deleted without a request.

Out of all the privacy policies we read, 33 failed to set out a date or specific time frame by which point all personal data would be deleted from central servers.

Those without a concrete date were vague to varying degrees – only 7 said data would be deleted when it was no longer needed for the purposes of containing Covid-19, while 26 failed to nail down any sort of assurances at all. The biggest app in India, Aarogya Setu, which has been downloaded over 50 million times and has faced intense scrutiny over privacy concerns, initially failed to give any information on when personal data would be purged. It was then updated in mid-April to state that data on those who contracted Covid-19 would be automatically deleted 60 days after they were “declared cured”. However, in the days leading up to publication of this article, Tortoise Intelligence received “403 Forbidden” errors when trying to access the policy from the Google Play Store, and we had to rely on an archived version of the policy. Decentralised apps, such as Israel’s The Shield, are designed to store data locally on users’ phones rather than on a central server, so they weren’t included in our count of those without a date for deletion.

For those worried about “function creep”, one of the biggest concerns is that the data could end up in the hands of law enforcement. To check what assurances are being given on this front, we looked at whether privacy policies explicitly state that data is protected from being shared with police and security services.

Most apps were fuzzy on the details, such as Cyprus’ app, which said that data might be shared with “relevant authorities” for combating the outbreak, while Poland’s ProteGO Safe app transfers data to “authorized entities” but “only on the basis of applicable law”. One Indian app, which also uses facial recognition, says data will be available only to “authorized government officials”.

We found that four apps, together covering 160,000 downloads and developed for use in Poland, Ukraine, India and Spain, actively confirm that app data can be shared with law enforcement. Ukraine’s Action at Home app, for instance, said data would be shared with national police, while Poland’s mandatory app shares information with both provincial and national police.

But other apps specifically ruled out such sharing. “Health information or location data cannot be made available to police or prosecutors or used for insurance purposes or by employers even if you consent. The personal data cannot be exploited commercially,” reads the privacy policy of Smittestopp, the Norwegian contact-tracing app.

For some, function creep is almost inevitable once a dataset of people’s movements has been created, even in western countries where there are more data protections. “There’s no doubt in my mind, because it’s logical, right?” says Alessandro De Carli, a software engineer based in Switzerland who has built a privacy-focussed contact-tracing app called WeTrace. “If you have such a tool, and you can use it for good, or if someone can phrase it in a way to say you can use it for good things, then it will be used. And it’s just a matter of time before what is good and what is bad changes in definition. That’s the big danger behind these kinds of things. It starts with ‘yeah, we just want to track the burglars’, but then it starts to move to other use cases.”

The other side to function creep is the worry that advertisers could also get their hands on some of the data collected on contact-tracing app users. “This is where the largest incentive is,” De Carli notes.

Indeed, several apps, such as the US Contact Tracer app, explicitly said in their privacy policies that data might be used for marketing purposes, albeit in de-personalised form. Its policy states: “For example, when you link Contact Tracer to certain third party apps, services, or devices, we may provide a recommendation related to or based off this linking.”

A separate question is whether, once data has been collected, users can easily ask for it to be deleted. When it comes to giving away personal data, one of the key principles in data protection – at least from the perspective of the EU’s framework, GDPR – is the ability to request to see any data held on oneself and to ask for it to be deleted. Based on this principle, it has become common for the privacy policies of apps collecting personal data to explain how people should go about accessing or deleting their data. Our findings show that fewer than half of the privacy policies describe how to lodge such a request.

STOPPCorona, an app for Austria developed by the Red Cross, says it will automatically delete all centrally held data 14 days after a user deletes the app. In the UK, people can email the privacy officer to request to see or delete all data, except for the data that has been aggregated and anonymised. On the flip side, we found that a total of 23 apps, together downloaded 52 million times, have no explanation in their privacy policy as to how to access or delete data.

To use a contact-tracing app, users must grant certain permissions so that it can access smartphone functions such as the camera or GPS. Our analysis of the permissions requested by each app found that some contact-tracing apps are asking to draw on a wide range of a phone’s hardware.

Some, such as Poland’s Home Quarantine app, which is mandatory for those suspected of having Covid-19, require access to the phone’s camera so that those with the virus can take selfies at home and send them to police to prove they are self-isolating. Others, including Ukraine’s Action at Home app, ask for the ability to use fingerprint sensors. Many apps require permission for “activity recognition”, which is how phones can identify when someone starts walking or running, while some have permission to make a phone call without the user dialling. By contrast, the NHS’s contact-tracing app is much more minimal, and doesn’t require access to the camera or biometric sensors.
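For readers curious how this kind of audit is done in practice, the sketch below shows one way to list the permissions declared in an Android app package, by calling the Android SDK’s aapt tool from Python. The APK file name and the set of permissions flagged as sensitive are our own choices, made to mirror the examples above, and the exact output format varies between aapt versions.

    import subprocess

    SENSITIVE = {
        "android.permission.CAMERA",
        "android.permission.USE_FINGERPRINT",
        "android.permission.ACTIVITY_RECOGNITION",
        "android.permission.CALL_PHONE",
    }

    def requested_permissions(apk_path):
        # Ask the Android SDK's aapt tool which permissions the APK declares
        # (assumes aapt is installed and on the PATH)
        out = subprocess.run(
            ["aapt", "dump", "permissions", apk_path],
            capture_output=True, text=True, check=True,
        ).stdout
        perms = []
        for line in out.splitlines():
            line = line.strip()
            if line.startswith("uses-permission:"):
                value = line.split(":", 1)[1].strip()
                if value.startswith("name="):  # newer aapt: name='android.permission.X'
                    value = value[len("name="):].strip("'")
                perms.append(value)
        return perms

    # Hypothetical APK file name; flag anything beyond Bluetooth and location
    perms = set(requested_permissions("contact_tracing_app.apk"))
    print(sorted(perms & SENSITIVE))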

Some app developers are at least up front about the control they have over phone hardware. “Location data is capable of being collected in certain circumstances even when the location services or mobile data is switched off,” reads the privacy policy of Indian contact tracing app SAIYAM, downloaded at least 1,000 times.

“In addition, the location data collected from a User will be linked to the respective User Account, even if you have not enabled location data to be collected from their device.”
