Google and Apple announced last week that they are collaborating on a software project that lets smartphones use Bluetooth to track proximity to other phones, with a view toward alerting individuals if they have been exposed to the virus that causes COVID-19. The software is slated to roll out as early as next month. A similar Massachusetts Institute of Technology pilot project, called “Private Kit: Safe Paths,” is already in beta. Both programs aim to safeguard the personal information of participants while also creating a public benefit: telling people they have been in close proximity to a known virus carrier.
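The core idea is that phones broadcast short-lived anonymous identifiers rather than names or locations. The following is only a minimal sketch of that rotating-identifier concept; the actual Google/Apple protocol uses a more elaborate key schedule, and the function name, interval, and ID length here are illustrative assumptions.

```python
import hashlib
import os
import time

def rotating_id(secret: bytes, interval: int = 900) -> str:
    """Derive a short-lived anonymous identifier from a device secret.

    Illustrative only: the real protocol's key derivation differs.
    A new ID is produced every `interval` seconds (here, 15 minutes),
    so observers cannot link broadcasts back to one person over time.
    """
    epoch = int(time.time()) // interval
    return hashlib.sha256(secret + epoch.to_bytes(8, "big")).hexdigest()[:16]

# Each phone broadcasts its current ID over Bluetooth and records the
# IDs it hears nearby; neither side learns the other's real identity.
my_secret = os.urandom(32)
heard_nearby = [rotating_id(os.urandom(32))]  # simulated neighboring phone
```

Because only these opaque identifiers are exchanged, a match against a later list of "positive" identifiers can be computed on the phone itself, without a central database.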

Unlike the centralized database programs used in China and South Korea, these tracking programs would not “phone home” to a government or corporate entity that a given set of individuals has self-reported their virus status. The fear of a disease-reporting system contributing to a larger surveillance state is well-founded, given the track record of authoritarian regimes that don’t fret about protecting individual privacy rights. The challenge for liberal democracies, however, is whether a self-reporting system can be comprehensive, effective and safe from misuse.

One potential flaw in the design of an opt-in system is that it might miss significant numbers of individuals, because people who have the virus either don’t have the software or don’t wish to report their status. We do not yet understand the social dynamics of adoption: whether people who download a tracking application with the good intention of reporting a change in their health status will actually do so once that status changes. Even if participants fully comply with reporting their positive COVID-19 tests, an opt-in program still misses everyone who came near enough to a participant’s phone but never downloaded the software. This can give individuals with the application a false sense of security that they have not been exposed, when in fact they are seeing only a small sample of the potential virus carriers in their paths.
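The coverage gap compounds quickly, because a contact is logged only if both people involved run the app. A back-of-the-envelope calculation, using hypothetical adoption rates rather than any real statistic, makes the point:

```python
# If a fraction `adoption` of the population runs the app, a given
# contact between two random people is detected only when BOTH have
# it installed -- roughly adoption squared. The rates below are
# hypothetical, chosen only to illustrate how fast coverage decays.
def detection_rate(adoption: float) -> float:
    return adoption ** 2

for p in (0.2, 0.4, 0.6):
    print(f"adoption {p:.0%} -> contacts detected {detection_rate(p):.0%}")
```

Even at a generous 60 percent adoption, roughly two-thirds of contacts would go unrecorded, which is the "small sample" problem described above.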

The same anonymity that protects personal information, however, makes it harder for individuals to understand the nature of their exposure, or even whether they need to alert others in a family group or other social setting. Imagine learning that you crossed paths with a virus carrier’s phone five days ago, but knowing nothing else about the interaction because the carrier’s identity is not revealed to you.

The MIT Safe Path program protects privacy by storing data for a limited time on the user’s own phone and by requiring user consent for any data sharing. While these principles safeguard individual privacy, they might prevent health-care officials from obtaining data sets that give a clearer picture of virus hot spots or the travel patterns of the individuals involved. It seems that some form of anonymized data sharing should actually be required when individuals report a positive COVID-19 status, with extra levels of protection to assure that their data cannot be “reidentified” at a later date.
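Those two principles, time-limited local storage and consent-gated sharing, are simple to express in code. This is a sketch under stated assumptions: the 28-day retention window, the record shape, and both function names are invented for illustration, not taken from the Safe Paths design.

```python
import time

# Hypothetical retention window; the real program's window is a design
# choice of its authors, not something specified here.
RETENTION_SECONDS = 28 * 24 * 3600

def prune(log, now=None):
    """Drop contact records older than the retention window.

    `log` holds (timestamp, anonymous_id) pairs kept only on the phone,
    so old location history simply ages out of existence.
    """
    now = time.time() if now is None else now
    return [(t, cid) for t, cid in log if now - t <= RETENTION_SECONDS]

def export_with_consent(log, user_consented):
    """Data leaves the device only when the user explicitly agrees."""
    if not user_consented:
        return None
    return list(log)
```

The design choice worth noting is that deletion is the default: data disappears unless something affirmative happens, rather than persisting unless someone remembers to delete it.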

We should applaud the fact that Google, Apple, MIT and others are devising programs with “privacy by design” built in. This represents true evolutionary progress for the treatment of data sets that otherwise might be used in unintended ways. In designing any program, Europe’s privacy principles remain a good touchstone: software should promote transparency, minimize data collection and use the data only for the original identified purpose. People may nobly reveal their COVID-19 status to help prevent the spread of the virus or promote cures. Years later, however, we don’t want to see that information used to establish a person’s “preexisting condition” or affect them in ways they did not contemplate during the outbreak.

California’s new privacy law may also come into play here, as it regulates the collection of personal information and gives users the right to delete personal information that a company has stored about them. Such laws are designed to address the “long tail” of data that is stored about individuals and often shared with third parties without user consent.

Contact tracing presents a “Brave New World” scenario, in which we deploy software to tackle societal problems such as an epidemic. Through the deployment of these tracking applications, we will learn whether we can really achieve a balance between protecting personal data and curbing a terrible virus.