South Korea confirmed dozens of new Covid-19 cases last month, most of them linked to an individual who had visited several nightclubs in Seoul’s Itaewon district before testing positive for the novel coronavirus.
Within days, public health officials had to trace more than 7,000 people who had recently visited nightclubs in the same district. It was a task that would have been impossible with conventional, manual contact-tracing approaches, and a perfect opportunity to use the technology-enabled track-and-trace model South Korea had adopted with enthusiasm.
There was one complication, though: Itaewon is home to many LGBTQ-friendly nightclubs. News reports focused on that detail and included the age, gender, location, and movements of the Covid-19 patient who had been there, all too predictably resulting in an increase in anti-gay rhetoric on social media. Members of South Korea's LGBTQ community now fear being forced to out themselves, and some 3,000 people who may have been exposed in Itaewon have not come forward for testing.
The Itaewon case exemplifies the double-edged sword of contact tracing technologies: information that can save lives can also foster discrimination and even persecution. That’s why these technologies need to come with proper safeguards.
Over the last several years, our lab has been developing tools, including mobile apps, analytics platforms, and novel diagnostics, that allow individuals to report symptoms, confirm diagnoses, and share epidemiological data in real time. Our experiences have shown us the power of these technologies as well as their dangers, even in low-stakes contexts.
One of our tools, designed to help people prepare for possible epidemics, is an educational simulation called Operation Outbreak for students in kindergarten through grade 12. It uses Bluetooth to spread and track a virtual pathogen among participants' devices. In one outbreak simulation at Sarasota Military Academy, a middle school that co-developed and piloted the app, students playing the role of virtual government officials lost the trust of their constituents by hiding information about viral transmission and hoarding vaccines. The mock military then cracked down on population movement, in effect using each student's simulated health status (as shown in the mobile app) to determine who got an immunity passport. In response, some students tried to circumvent the system by taking screenshots of the app while it displayed a healthy status in order to falsify their immunity passports, provoking further clashes between civilians and the military.
This was just a simulation, of course, but it shows how technology designed to help us work together can instead drive us apart.
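The simulation's core mechanic, virtual transmission triggered by Bluetooth proximity, can be sketched roughly as follows. This is an illustrative model only: the names, signal-strength threshold, and transmission probability below are our assumptions for the sketch, not Operation Outbreak's actual implementation or parameters.

```python
import random

# Hypothetical parameters for illustration; not the app's real values.
RSSI_THRESHOLD = -70      # assumed signal-strength cutoff for "close contact"
TRANSMISSION_PROB = 0.3   # assumed per-contact chance of infection

class Device:
    """A participant's phone in the simulated outbreak."""
    def __init__(self, owner, infected=False):
        self.owner = owner
        self.infected = infected

def handle_proximity(a, b, rssi, rng=random):
    """Called when two devices detect each other over Bluetooth.

    If the signal is strong enough (devices are close) and exactly one
    party carries the virtual pathogen, transmission occurs with some
    probability. Returns True if a new infection happened."""
    if rssi < RSSI_THRESHOLD:
        return False  # too far apart; no exposure recorded
    if a.infected == b.infected:
        return False  # both healthy or both infected; nothing changes
    if rng.random() < TRANSMISSION_PROB:
        a.infected = b.infected = True  # the healthy party is now infected
        return True
    return False

# Usage: a close encounter between an infected and a healthy device
# may or may not transmit, depending on the random draw.
rng = random.Random(42)
alice = Device("alice", infected=True)
bob = Device("bob")
handle_proximity(alice, bob, rssi=-55, rng=rng)
```

Even a toy model like this makes the screenshot exploit easy to see: the app's displayed status is just a client-side flag, so any check that trusts a static image of that flag can be fooled.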
Outbreaks expose and amplify the cracks in trust among us. They are a crucible — a chaotic, unpredictable environment in which an insidious threat can cause leaders and citizens alike to revert to basic instincts for self-protection. Mistrust, paranoia, and suspicion can heighten, perverse incentives can take hold, and blame and stigma often thrive, creating a toxic and dangerous culture.
The death of George Floyd at the hands of police, and the consequent protests across the country, have laid bare the discriminatory profiling and brutality that many underrepresented groups faced long before Covid-19. It is reasonable to fear that information gathered from disease-tracking technologies could be misused to target and harm the same vulnerable groups. Afraid of data being weaponized against them, vulnerable communities might opt out of using outbreak surveillance technologies altogether, further increasing their risk of infection.
Sharing outbreak data saves lives. But that potential can be realized only if the global community ensures that the development of contact-tracing technologies is guided by principles that protect and serve the most vulnerable among us. Ideally, these principles would build on those already laid out to ensure privacy protection in Bluetooth contact tracing, such as Data Rights for Exposure Notification and the Contact Tracing Joint Statement, while expanding to cover broader data uses such as self-reported symptoms and risk prediction from aggregated health data.
Among the key principles:
- Participation should be completely voluntary, with the option to stop at any time.
- Trust is essential. No one should fear that participation will lead to being tracked, deported, or worse.
- No one should be punished in any way or stigmatized for behavioral information reported through an app.
- No one’s data should be bought or sold by others.
- And no one who wants to participate should be left out, either for lack of access to technology or fear of consequences.
We look forward to these principles being widely adopted, guarding against the misappropriation of outbreak surveillance technology and data while upholding the social good. Like every other aspect of the Covid-19 crisis, this is a problem we can solve only by working together. And we must solve it, both to emerge safely from this pandemic as a global society and to better prepare for the next one.
Pardis Sabeti is a professor of organismic and evolutionary biology and immunology and infectious diseases at Harvard University and a researcher at the Broad Institute of MIT and Harvard. Andres Colubri is a computational scientist in the Sabeti lab and an incoming assistant professor of bioinformatics and integrative biology at the University of Massachusetts Medical School. Their group was recognized as part of the “Ebola Fighters” as TIME’s 2014 Person of the Year. The authors are inventors on patent applications related to this technology filed by the Broad Institute of MIT and Harvard, with the specific aim of ensuring this technology can be made freely, widely, and rapidly available for research and deployment. Sabeti is a co-founder and scientific adviser of Sherlock Biosciences and on the board of directors for the Danaher Corporation and holds equity in both companies.