Privacy and the Pandemic: Time for a Digital Bill of Rights
Democratic governments need digital tools and personal data to combat the crisis, but too much sharing can be dangerous to individuals. How can they strike the right balance?
In the United States, the coronavirus’s spread is rapidly undermining fundamental norms. Barely a month ago, China’s lockdown seemed a draconian imposition by an authoritarian state. Now even the most libertarian-leaning U.S. states are implementing similar measures out of necessity. While severe outbreaks have likely peaked in parts of the country, the nation as a whole is set to endure a lasting pandemic. Jerome Adams, the U.S. surgeon general, has warned that the surge in deaths on the East Coast will be akin to Pearl Harbor or 9/11.
As was the case following both of those catastrophes, a rapid restriction of civil liberties may seem necessary as governments fight to control the situation. The coronavirus pandemic has already caused public-health imperatives to collide with democratic principles as fundamental as the freedom of movement. More limitations on civil liberties are likely to come—most crucially in the digital sphere. And history gives little reason to believe that the end of the immediate threat will bring a loosening of those strictures.
Citizens, corporations, and leaders must therefore start working to create a bill of digital rights—now, before things get any worse and individuals permanently lose control of their online data. The pandemic will likely endure longer than the public can tolerate a lockdown. The shutdown has already generated 22 million unemployment claims in the United States and necessitated a $2.2 trillion federal stimulus package, possibly the first of several. Yet vaccines or other treatments are unlikely to materialize before the middle of 2021—so governments need to start adapting.
To date, the only model that appears effective in containing the spread of the virus in a partially reopened economy is a combination of extensive testing, the widespread use of personal protection equipment, and the deployment of surveillance technology. The governments of Hong Kong, Singapore, South Korea, and Taiwan have all managed to avoid prolonged lockdowns—some even keeping open businesses, restaurants, and schools—by applying this combined approach.
The United States and a number of European nations are now considering a similar effort on the grand scale, in partnership with the big technology firms. Google has begun sharing some of its vast collection of location data with public health researchers and epidemiologists to help model the movement of its users. A team at the Massachusetts Institute of Technology has developed a proximity-tracking application to trace those who have had contact with COVID-19 patients. Google and Apple have announced the introduction of Android and iOS application programming interfaces to facilitate voluntary contact tracing via Bluetooth Low Energy transmissions. For its part, Washington is now exploring the widespread use of such data-driven methodologies.
Public health, privacy rights, and economic prosperity are three necessities vital to all democratic societies. But they require trade-offs. In thinking them through, keeping a few principles in mind can help countries strike the right balance.
First, any surveillance measures taken must be reversible, strictly proportionate, and fully transparent. To ensure that they don’t outlive the emergency, the process for their removal must be defined at the moment they are implemented. The recent past has shown that such arrangements can be difficult to undo. For instance, many of the Patriot Act’s sweeping, supposedly temporary surveillance provisions have been routinely reauthorized by Congress since 2005 and were most recently extended just last month.
To track potentially infected individuals, South Korea collects not only location data from cell phones and GPS, but also public transport data, credit card data, immigration records, and so on. If the United States is to copy that model, the government should require the app’s developers to demonstrate how each piece of information collected can help counter the spread of the coronavirus. For instance, there is no legitimate reason to collect or retain location or proximity-contact data for months to fight a virus with a two-week incubation period.
The holders of these records will also be tempted to use this invaluable data for purposes other than fighting the virus—purposes that range from the public-minded to the profit-seeking. Make no mistake: This data will be of immense value, including for the common good—for example, to improve public transport or health care infrastructure. To preserve public trust, however, government must restrict the repurposing of this data for unrelated ends, even if its use appears to serve the greater good.
Efficiency alone cannot justify every policy. In China, a smartphone app that aggregates an individual’s health data and assigns a color code (green, yellow, or red) reflecting that person’s health status is required to enter a shopping mall or to board a train. Americans must be wary of this level of social control. It is too liable to be applied unevenly, in a discriminatory manner, and for purposes completely unrelated to containing the virus.
Finally, political leaders must address a structural problem: the age-old texts, norms, and institutions that underpin democracy function clumsily in the digital world. This is not a new problem, and digital policy experts have long foreseen a crisis that fundamentally disrupts the balance between digital technologies and individual freedoms. Western political leaders have simply lacked the political will to plan proactively. Their general unpreparedness for the pandemic itself should be warning enough that a failure to anticipate policy challenges can be disastrous for society.
In democratic countries, such a protective legal framework already exists in the realm of some civil liberties: Few citizens of democracies fear that the freedoms of movement or assembly will remain permanently restricted once the coronavirus pandemic subsides. There is no such level of certainty when it comes to digital rights, where matters are much murkier and fears arise that accepting emergency provisions might lead to mass surveillance becoming the new normal. Many of these rights are implied and assumed, but have not yet been enshrined in an enforceable capacity.
In recent years, as more information emerges about the extent to which public data is collected and manipulated by the tech giants, activists have pushed to ensure that the use of data is in line with democratic values. The digital response to COVID-19 magnifies this necessity.
In October 2019, at the Global Forum on Artificial Intelligence for Humanity, French President Emmanuel Macron called for experts and governments to define collectively a new bill of rights guaranteeing fundamental protections in the digital world, stressing that “what is at stake is absolutely critical and core for our democracies.” The United States should heed this call and help organize an international coalition of like-minded policymakers, experts, public advocacy groups, and concerned corporations.
Such a bill could clearly outline how digital rights and privacy should be guaranteed to prevent any lasting infringement post-crisis, and, like any fundamental rights, how they should be balanced against other rights and social goods—such as public safety and health. The questions we face today reveal why the current approach of placing the onus of privacy solely on users is incomplete. Arguments for deploying surveillance tools hinge on the need to balance the respect for privacy with the mutual obligations we all have to one another—in this case, to restrict the spread of the virus. To strike a meaningful balance, letting each individual opt in or opt out is insufficient. Personal data generates externalities. These are beneficial when it comes to containing a pandemic. They are detrimental when they lead to massive troves of personal data that lend power to the private or public actors that wield them to monitor our daily activities and infringe on our privacy. Deciding where to strike this balance should not be improvised in the middle of a crisis: It requires a full democratic discussion.
While most policymakers and governments are currently focused on the immediate needs of battling the pandemic, no one should lose sight of the trade-offs involved. To prevent taking steps that are hard to reverse, democratic societies should act now to codify digital rights and pass laws to protect them.
Dipayan Ghosh is co-director of the Digital Platforms & Democracy Project and Shorenstein Fellow at Harvard Kennedy School. He was a technology and economic policy advisor for former President Barack Obama, and until recently worked on U.S. privacy and public policy issues at Facebook.
Adrien Abecassis is a fellow at Harvard Kennedy School’s Belfer Center for Science and International Affairs. He was a diplomatic advisor and a senior political advisor to the president of France from 2012 to 2017.
Jack Loveridge is a historian of science and technology and an associate of Harvard’s Weatherhead Center for International Affairs. He was a Fulbright scholar to India and writes on the links between science policy and economic development.