COVID-tracking apps help identify parties with whom a COVID-infected person had contact. The apps do so by drawing on information about the location of a person’s mobile phone and its proximity to other devices. Experts, including those at the Johns Hopkins Bloomberg School of Public Health, view this technology as a necessary boost to manual contact tracing by public health officials.
Countries are currently split between those where the government requires the use of these apps and those where it does not. Mandatory contact-tracing apps are in use in China, India, and Turkey. The rest of the world is following the voluntary route. Nations in this camp include Australia, Austria, Finland, Germany, Ireland, Israel, the Netherlands, and the United Kingdom.
In the United States, however, the question is not whether the government is going to require the population to download an app to monitor their movement and contacts. No one is proposing that approach in this country. Rather, the critical issue is how government and the private sector will restrict access to spaces and opportunities based on whether or not one “consents” to the use of an app or other monitoring device.
For example, an employer may block entry to a workplace unless an individual has an app on their phone that uses Bluetooth to track proximity to other devices, or scans a QR code at a building’s entrance into an app. The future may be one of “no app, no entry” or even “no app, no job.”
In these situations, a reliance on consent is illusory. Even though the use of the app is voluntary, in the sense of not being government-imposed, its use is part of a take-it-or-leave-it situation.
In many areas of information privacy law, we’ve already been down the path of justifying monitoring through the fiction of consent. For example, “notice-and-choice” is frequently used to justify email monitoring at work; employers inform employees in advance of their policy, and the use of a workplace email system is then considered to represent consent to the policy. A similar approach is taken by workplaces that require keycards to enter office spaces. In the employee handbook, a company tells employees how the keycard collects data; it then distributes the keycards and mandates their use; and, presto, consent is granted each time an employee swipes the device at an entryway.
In responding to the privacy challenges of COVID-tracking apps, however, the need is to look beyond concepts of “voluntariness.” Instead of falling back on illusions of consent, we need a federal law that regulates use of COVID-19 tracking apps. Fortunately, there are now two proposals for such a law before the Senate. Before examining the two bills, however, it makes sense to think through first principles.
How should such a law proceed?
Any regulation of a COVID-tracking app should be pragmatic and proportionate. It should also reflect that public health during a pandemic is a priority. As the Supreme Court held in Jacobson v. Massachusetts (1905), “the social compact” requires that “all shall be governed by certain laws for ‘the common good,’” including by laws for the protection and safety of the population. Finally, regulation should be attentive to the use of these devices in workplaces because this context will be particularly prone to illusions of consent for COVID data collection.
As for the two competing federal bills, both have plusses and share large areas of agreement. The first bill is the “COVID-19 Consumer Data Protection Act,” a proposal introduced by four senators led by Roger Wicker (Mississippi). Currently, there is only a press release in place and not a bill, but the contours of this Act are already in view. The second bill, released more recently, is the “Public Health Emergency Privacy Act,” from Senators Richard Blumenthal (Connecticut) and Mark Warner (Virginia). This proposed statute closes several gaps in the Wicker bill and is generally preferable to it.
The good news first about both bills. Both agree on the need for data minimization, which means collecting the least amount of information necessary. Further, the proposed statutes mandate data security, which is important as any information collected by these apps will be a target of interest for hackers, domestic and international.
The bills also both heighten transparency. They do so by mandating notice to the affected party at the point of collection and by requiring public reporting. For example, the Wicker Bill requires “transparency reports to the public under which companies will describe their data collection activities relating to COVID-19.” In addition to requiring regulated entities to issue public reports, the Blumenthal-Warner Bill calls for the Secretary of Health and Human Services to consult with the Federal Trade Commission and the U.S. Commission on Civil Rights in reporting on the “civil rights impact of the collection, use, and disclosure of health information in response to the COVID-19 public health emergency.” These approaches have merit and should be incorporated in a consolidated bill.
Finally, both bills include an exit strategy and enforcement mechanisms. Once the public health emergency is over, the Wicker Bill obliges companies “to delete or de-identify all personally identifiable information.” This provision guards against the phenomenon, identified by Northeastern University’s Woodrow Hartzog, of “surveillance inertia.” The Blumenthal-Warner Bill goes further and specifies mechanisms for deletion within sixty days of any collection of data, or once the public health emergency ends.
Regarding enforcement, the Wicker Bill would grant state attorneys general enforcement power. Judging from the experience under laws such as the Children’s Online Privacy Protection Act, these officials are likely to act vigorously. The Blumenthal-Warner Bill goes further: it grants enforcement authority to state attorneys general and the Federal Trade Commission, and it creates a private right of action. Even without a crystal ball, one can predict controversy around the question of enforcement mechanisms. The need will be to find a sensible compromise that allows enactment of a COVID privacy law.
Now for the gaps. At least based on available information, the Wicker Bill does not appear to set legal restrictions on the scope of data collection and the substantive uses to be made of personal information. Instead of crafting such limits, the Act relies on notice-and-choice and an individual opt-out provision. It requires “affirmative express consent from individuals to collect, process, or transfer their personal health, geolocation, or proximity information for the purposes of tracking the spread of COVID-19.” The Bill also requires companies “to allow individuals to opt out of the collection, processing, or transfer of their personal health, geolocation or proximity information.”
There are two possibilities here, neither of which is appealing. One is that the notice and “opt-out” provisions will fall prey to the myth of consent. Because many settings in which COVID-tracking will occur lack voluntary choice, this language may do little to promote free decision-making; the apps will simply be imposed under take-it-or-leave-it conditions. In situations of unequal power, many people will be obliged to grant “affirmative express consent” and waive their ability to opt out.
The second possibility is that the Wicker Bill will bolster these provisions with a non-discrimination clause. Privacy laws that take this step prohibit the regulated entity from denying goods or services to individuals because of an exercise of their statutory rights. In the context of COVID-tracking apps, which in at least some targeted circumstances will be pragmatic and proportionate, this approach risks nullifying the entire effort. As more parties opt out, the value of the information collected will drop and the protection offered to co-workers will decline.
The critical need for a COVID-tracking privacy statute is to set limits on personal data processing. At its foundation, this law must balance the value of privacy with the benefits of information from the COVID-app. And here is where the Blumenthal-Warner Bill shines. It avoids “illusions of consent” by calling for the collection, use, or disclosure of only such data that is “necessary, proportionate, and limited for a good faith public health purpose.” It also details a long list of prohibited uses of emergency health data, including for commercial advertising, or soliciting or selling services in a discriminatory fashion.
As a final note, the use of COVID-tracking apps can only contribute to ending the current emergency as part of a larger governmental response to the pandemic. Here, sadly, the United States is falling short. The paramount needs begin with a robust system for testing and tracing. There is also an urgent requirement for quarantine spaces for infected individuals who lack such safe environments. Finally, there must be strong legal protections for people with the virus, including greater unemployment protections; without these safeguards, individuals will face only disincentives to seek out testing.
We are running a marathon and not a sprint, and the current crisis requires a pragmatic and proportionate response that sets legal limits on data collection and the subsequent use of collected data. COVID-tracking apps will be here soon; they won’t be truly voluntary; and the law should carefully regulate their use as part of a larger public health response.