There have been disease outbreaks throughout human history, but never one that has taken place in the era of high-tech tracking tools and “big data.”
Policymakers and technologists have proposed a number of ideas for leveraging such technologies to help suppress the spread of COVID-19. At the ACLU, we recognize the urgency of stemming the pandemic and re-opening America, and don’t think we should immediately write off any tools that may offer public health benefits. But we shouldn’t give up critical rights and freedoms unless a proposal is necessary, effective, and proportionate. We are particularly wary of technological solutions that would interfere with or divert resources away from public health solutions with proven effectiveness, or that risk exacerbating existing disparities that have already led to inequitable health outcomes.
A review of the most prominent proposals that have been put forward suggests that we need to remain vigilant lest we give away our liberty and get little in return. Here are just a few examples:
Tech-Assisted Contact Tracing
Perhaps the most prominent discussion about using technology to help fight the coronavirus has revolved around how high-tech tools can augment contact tracing. One of the oldest and most basic techniques for slowing the spread of disease, contact tracing involves tracking an infected person’s past interactions so that those who may have been exposed can be identified.
Early proposals to use cell phone location tracking data have rightly been dismissed. Mobile phone location data is not nearly accurate enough to identify who has been exposed to an infected person. And even if it were, this data (much of which is gathered by shady companies that sneak tracking software into phone apps) is collected through disparate technologies and scattered among many different companies. Attempts to deploy this concept have not gone well.
A more viable proposal has centered around using our smartphones’ Bluetooth technology to allow phones to detect other phones that come nearby, and use that to automatically keep track of who may have been exposed to an infected person. If done right, this approach can be more privacy-protective because it does not require the collection of sensitive location data and stores data locally and in ways that don’t identify people. The Bluetooth-enabled approach gained traction when Apple and Google announced they were teaming up to create such a system.
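To see why this design is more privacy-protective, it helps to look at the core mechanic. The toy Python sketch below is purely illustrative — the function names and the token-derivation scheme are our own simplifications, not the actual Apple/Google protocol (which uses AES-based key schedules). What it shows is the key property: phones broadcast and log rotating random tokens, and when an infected person publishes a daily key, the exposure check runs entirely on each recipient's own device, with no location data and no central database of contacts.

```python
import hmac
import hashlib
import os

def rolling_tokens(daily_key: bytes, intervals: int = 144):
    """Derive short-lived broadcast tokens from a daily key.

    Illustrative only: each token is an HMAC of the interval number,
    so tokens look random to observers but can be re-derived by
    anyone who later learns the daily key.
    """
    return [
        hmac.new(daily_key, i.to_bytes(4, "big"), hashlib.sha256).digest()[:16]
        for i in range(intervals)
    ]

# Alice's phone generates a secret daily key and broadcasts its tokens.
alice_key = os.urandom(16)
alice_tokens = rolling_tokens(alice_key)

# Bob's phone keeps only a local log of tokens it overheard nearby --
# no names, no locations, just opaque 16-byte values.
bob_heard_log = set(alice_tokens[40:43])

# Alice tests positive and publishes her daily key (not her identity).
# Bob's phone re-derives her tokens and checks for overlap locally;
# the match computation never leaves his device.
exposed = any(t in bob_heard_log for t in rolling_tokens(alice_key))
print(exposed)  # True -> Bob may have been exposed
```

The design choice worth noticing is that the published daily key reveals nothing about where Alice went or whom she met; only phones that were actually near hers can learn anything, and what they learn is limited to "you may have been exposed."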
While we have a lot of skepticism about whether this concept is likely to prove practical, we have outlined a set of technology principles that the public and developers should use to assess any such proposal. We have also outlined principles that should inform policies and procedures governing the use of these untested technologies.
Another technology that is receiving a lot of attention is fever detection, which we have also analyzed in a white paper. Temperature screening is seen by public health experts as having very limited usefulness — many infectious people do not have a fever, and the technology is not very accurate — but companies and others are rushing to deploy it.
Temperature checks should not be deployed unless public health experts say conclusively that they will help. We don’t want to see a world where inaccurate tests disrupt people’s lives and invade their privacy, or waste time and other resources that could be better used in fighting the pandemic. If public health experts conclude temperature screenings will help, measures must be taken to minimize disruption and ensure fairness, and the use of remote temperature screenings must end with the outbreak. At no point should we deploy remote mass temperature screening using thermal cameras. That technology has both the least accuracy and the most potential to be diverted to other, privacy-invasive uses once COVID-19 is gone.
One proposed deployment of thermal scanners is on drones — another tool being pushed to combat the pandemic. Our research makes clear that mounting thermal scanners on drones makes them laughably inaccurate for fever detection. Drones are also being deployed in ways that seem more silly than useful. They have been sighted flying over parks to look for social distancing violations (how can a drone tell which people belong to the same household?) and to shout admonishments at those who appear to be in violation, for example. The use by police of these loud, short-battery-life devices — which the FAA does not permit to be flown over people or beyond the operator's line of sight — is more likely to acclimate people to drone surveillance than to slow COVID. The use of drones in this manner over homeless encampments in California has also drawn criticism from homeless advocates, who say it is a terrible way to approach the unhoused.
Another proposal for using technology against COVID is to create an infrastructure for so-called “immunity passports.” The idea, appealing at first blush, is to speed reopening by giving people who are immune to COVID a way to certify that fact so they don’t have to be subject to all the restrictions now in place. The problem, as we have outlined, is that the science does not yet support that approach, and that it would divide workers into two classes: those with certified COVID immunity, who would be given preferential access to employment, housing, and public accommodations, and everyone else, who would not.
This new class system would incentivize people, particularly the economically vulnerable, to risk their health by getting COVID-19 so they could get a passport and return to work. It would also likely worsen existing racial, disability, and economic disparities in America. Any immunity passport system would also likely endanger our privacy rights by creating a new surveillance infrastructure to collect health data.
Finally, technology has come to play a central role in education during the COVID crisis — and that has intensified some of the privacy and discrimination problems that technology often brings. A lot of school districts are distributing software that spies on the students it’s supposed to help. We have called on school districts to require these educational technology companies to disable any surveillance functions and limit their personal information gathering to only what is directly necessary for their products to work. We’re also calling on Congress and state and local governments nationwide to give all students equal access to the technologies that make effective remote learning possible. That includes funding to meet the broadband access and technology needs of students and people with low income.
The great 20th-century scientist Linus Pauling once said, “If you want to have good ideas you must have many ideas. Most of them will be wrong, and what you have to learn is which ones to throw away.” We need to continue to think creatively about how new technology can help us fight this disease and get back to normal, but we also need to remain skeptical before we compromise our civil liberties, realizing that there will be a lot of ideas that do need to be thrown away. Others may have promise but need to be implemented only with great care.
In an age when everyone looks to data and technology as the solution to all problems, the most effective responses to the current crisis, according to public health experts, remain relatively low-tech: soap and water, widespread testing, access to health care, social distancing, old-fashioned human contact tracing, and, ultimately, a treatment or vaccine.
Jay Stanley, Senior Policy Analyst, ACLU Speech, Privacy, and Technology Project