[Image: Adam and Eve looking at an apple]

Paradise lost: Will beating Covid-19 cost us our data privacy?

May 2020

A week is a long time during a global pandemic. What has become the day-to-day reality of life under the coronavirus would have seemed like science fiction a mere month ago. And part of this new reality, along with the face masks and hand sanitiser, closed borders and social distancing, is a sudden urgency to talk about the role of data mining and machine learning.

Data as a weapon in the fight against the coronavirus

The usefulness of data in tackling the spread of disease is well documented. In 2007, the World Health Organisation launched an initiative to stop the spread of malaria in Africa, with Vodafone joining forces with scientists to collect data from people’s mobile phones in affected areas and track how the disease moved through the population. This approach has since been replicated in other epidemics, such as the Ebola outbreak in West Africa.

Most recently, in places such as Singapore, Taiwan and South Korea, algorithms have been credited with helping to slow the deadly spread of Covid-19. Location data has also been used across Europe, in Italy, Norway, Belgium and Spain, where heat maps have been helping officials understand how restrictions on movement are working. In Spain, for example, telecoms companies collected data showing that movement in one city dropped by 90% in the week after the lockdown. In Italy, the same kind of data revealed that between 800,000 and 1 million people defied the lockdown by travelling in and out of Milan in its first week.

In the US meanwhile, Google, Facebook, health experts and other tech companies are in talks with the White House about collecting and sharing location data from mobile phones. This information could be used to help predict the next infection ‘hotspot’, decide where to allocate medical resources, and measure the effectiveness of social distancing.

What about privacy law?

Telecoms companies insist that the data being used to help control the coronavirus pandemic is aggregated and anonymised. That means it cannot be traced back to individuals. But there are still, of course, longstanding concerns over privacy.
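What "aggregated and anonymised" means in practice can be made concrete. A minimal Python sketch of the idea, under assumptions of our own (the ping format, region names and suppression threshold are illustrative, not how any telecoms company actually does it): drop the identifiers, count people per region and time window, and suppress any count small enough to single someone out.

```python
from collections import Counter

# Hypothetical raw location pings: (user_id, region, hour).
# In reality these would come from network infrastructure, not a list.
pings = [
    ("u1", "Madrid-Centro", 9), ("u2", "Madrid-Centro", 9),
    ("u3", "Madrid-Centro", 9), ("u4", "Madrid-Norte", 9),
    ("u5", "Madrid-Centro", 10),
]

K = 3  # suppression threshold: publish a cell only if at least K people are in it

def aggregate(pings, k=K):
    """Discard user IDs, count people per (region, hour) cell,
    and suppress cells with fewer than k people so that no
    individual's movement can be read off the output."""
    counts = Counter((region, hour) for _, region, hour in pings)
    return {cell: n for cell, n in counts.items() if n >= k}

print(aggregate(pings))
# only the ("Madrid-Centro", 9) cell survives; the single-person cells are dropped
```

The threshold step matters: without it, a "count" of 1 in a sparsely populated cell is effectively an individual record, which is one reason researchers caution that anonymisation is harder than simply deleting names.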

People are becoming more aware of – and uncomfortable with – the way big tech companies track their movements. There are questions to be asked about who controls the data, who owns it and what happens to it after it’s been used for its original purpose. As technology advances, it gives businesses more opportunity to collect our data and use it in more ways. And the law is always playing catch-up.

It was only earlier this year that the EU unveiled its new AI and data governance strategy, which was widely acclaimed for its emphasis on data privacy and trustworthy AI. But now, in the face of the pandemic, the worry is that principles of privacy get brushed aside, or at least relaxed, in the name of the public good. And, when the pandemic is over, will governments want to put their surveillance toys back in the box, now that they’ve had the chance to play with them? History would tell us not.

The three privacy principles

So there is a delicate balance to be struck between individual liberty and the need to protect the public good. And never before has a discussion about this been so urgent. The Chinese internet pioneer and critic Hu Yong has examined this dilemma and come up with three principles for policy-makers to abide by when trying to strike that balance:

· People accept that privacy cannot always come first, at the expense of all other values or interests. There are times when it must come second to, for example, maintaining public safety, law and order, national security or stopping the spread of epidemics. But that’s only true if restricting privacy is a last resort, and anything that interferes with the basic right to privacy must be within the law.

· Any measures put in place that restrict privacy cannot violate basic civil rights. Although we can accept a reduction in privacy if it’s absolutely necessary, we can’t accept it in an unlimited fashion. It must be reasonable and, importantly, it must be non-discriminatory.

· The way data is stored and used needs to adhere to fundamental guidelines, such as those issued by the OECD in 1980. It should be stored securely to prevent data theft and leaks, any personal information should be desensitised, and data collected for disease control and prevention should be used for that purpose only.
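The third principle's idea of "desensitising" stored records can be sketched in a few lines of Python. Everything here is an illustrative assumption, not a description of any real system: the field names are invented, and the salted-hash pseudonym is just one common technique for keeping records linkable for epidemiology without storing direct identifiers.

```python
import hashlib
import secrets

# Secret salt held by the data controller; destroying it when the
# outbreak ends makes the pseudonyms permanently unlinkable.
SALT = secrets.token_bytes(16)

def desensitise(record):
    """Replace the direct identifier with a salted hash and copy only
    the fields needed for disease control (purpose limitation)."""
    token = hashlib.sha256(SALT + record["phone"].encode()).hexdigest()[:12]
    return {
        "token": token,                       # stable pseudonym, not reversible without SALT
        "region": record["region"],           # kept: needed for epidemiology
        "test_result": record["test_result"], # kept: needed for epidemiology
        # name, address and phone number are deliberately not copied
    }

raw = {"phone": "+34123456789", "name": "A. Person",
       "region": "Lombardy", "test_result": "positive"}
print(desensitise(raw))
```

Note how the code enforces the principle structurally: fields that are not copied into the output simply never reach the stored dataset, which is a stronger guarantee than a policy promising not to look at them.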

The coronavirus pandemic has been a global tragedy. It could also be a catalyst for change in the way we think and talk about data. Perhaps it’s an opportunity for us to shape policy around how data is used, in a way that makes sure there’s the right balance between individual and public interest. We can only speculate what the world will look like when we emerge from the crisis, but there is one thing we can be certain about: nothing will be the same.