The Google-Apple infection tracker has a privacy problem. Just not the one you think.

It puts privacy ideology above public health effectiveness


Google and Apple have released specifications for how to use a mobile phone to track coronavirus infections. That's good news. As the country moves toward at least partial resumption of normal life, we're likely to need good tracking capabilities to avoid a second peak in infections, and that can't be done without the cooperation of Google and Apple.

But the more I study the design that these companies are promoting, the less attractive it looks. To be blunt, I think the companies were so eager to avoid criticism from privacy groups and Silicon Valley libertarians that they produced a design that raises far too many barriers to effectively tracing infections. The good news, though, is that Google and Apple won't have the last word. The two companies are creating an absolutely essential set of tools, or APIs, that will allow other tracking apps to interact with phone operating systems.

They've also sketched what might be described as the default tracking system that they intend to implement "while maintaining strong protections around user privacy." This default system is less essential, and a good thing too. The Google/Apple default tracking system is seriously flawed, mainly because it elevates privacy over effectiveness. Luckily, national health systems will be free to write better, more workable tracking apps that can still plug into Google and Apple operating systems without buying into the questionable choices those companies seem to favor.

Public health agencies have tracked infections as a way of stopping the spread of disease for more than a century, and not just for pandemics. It's a routine part of the public health response to syphilis and other sexually transmitted diseases. The process is straightforward. If someone tests positive for an infectious disease, health authorities ask for a list of all his contacts while he had the disease. They track those people down, treat or quarantine them, and then get a list of their contacts. Eventually, everyone in the chain of infection is accounted for, and the chain is broken.

COVID-19 is a challenge to this model because asymptomatic infection is so common, which makes it hard to reconstruct all of a COVID-positive person's contacts. That's what makes mobile phone tracking so attractive. An app that knows where you've been for the last two weeks allows a reconstruction of your contacts that your memory can't match. That's why there's been such an emphasis in many countries on using location data for infection tracking. But as the initial enthusiasm encountered reality, public health officials realized that location data on phones was not well suited to the task. It wasn't detailed enough, and gathering everyone's locations for the length of the emergency felt unnecessarily intrusive.

Singapore came up with a better approach: using not location data but an exchange of Bluetooth signals between the phones of people who were actually close to each other for a period of time. Then if one of them tested positive, the health authorities could collect the contact data and send alerts to everyone who had exchanged signals with the infected party.
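The Bluetooth approach can be sketched in a few lines. Phones broadcast short-lived pseudonymous identifiers derived from a private key; if a user tests positive, the keys are published and every other phone checks locally whether it ever heard a matching identifier. This is a minimal, dependency-free sketch of the idea; the SHA-256 derivation and all names here are illustrative stand-ins, not the actual Singapore or Google/Apple scheme (which uses AES and HKDF):

```python
import hashlib
import os

def daily_key() -> bytes:
    """Generate a random daily key that never leaves the phone (hypothetical scheme)."""
    return os.urandom(16)

def rolling_id(key: bytes, interval: int) -> bytes:
    """Derive a short-lived broadcast identifier from the daily key.
    SHA-256 is a stand-in here; the real specs use AES/HKDF."""
    return hashlib.sha256(key + interval.to_bytes(4, "big")).digest()[:16]

# Phones broadcast rolling_id(...) over Bluetooth and record the IDs they hear.
# If a user tests positive, their daily keys are published; other phones
# re-derive the rolling IDs locally and compare against what they heard.
def exposed(heard_ids: set, published_keys: list, intervals: range) -> bool:
    """True if any published key produces an identifier this phone recorded."""
    return any(rolling_id(k, i) in heard_ids
               for k in published_keys for i in intervals)
```

Because matching happens on the handset, no central party ever learns who met whom, which is precisely the property the rest of this piece argues is a mixed blessing.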

But Singapore's system did not work well with the Android and iOS operating systems.  It only worked if Bluetooth was actively seeking connections all the time, which often required that the phone be continually unlocked. The app still hasn't achieved more than about twenty percent market share.

Google and Apple soon realized that for such a system to work, they would have to adapt the operating system to the needs of a disease tracking app. That's why they are developing new APIs. At the same time, their engineers seem to have decided that they could make an app that worked like Singapore's but had more privacy protections. That idea is the source of the default tracking app they are promoting.

It's also the source of most of the problems with the default app. Probably the biggest mistake the default app makes is trying to build a tracing system that is completely independent of any central health authority. That's not realistic or wise. Real-world disease tracing systems like Singapore's (and like those of the United States for a century) all rely on the public health authorities to identify infected people, collect their contacts, and notify those at risk. But Silicon Valley is in love with a "trust no one" approach to security that is grounded in an assumption that centralized systems can be abused if authoritarian governments get access to the data. That's always possible, but the risk is pretty modest in this case since the only data at issue is two weeks' worth of contacts. And any authoritarian government worth its salt could get far more location and contact data simply by subpoenaing Google's adtech files. Nonetheless, Google and Apple are pushing a default app that does not depend on centralized administration and therefore is guarded against that particular abuse.

But in preventing abuses from the center, such an app would invite many abuses from the edge. The design seems to envision a regime in which testing results are known to health authorities, but the place and identities of those who've come into contact with each other are never disclosed. That means the health authorities can tell someone who tests positive to notify his contacts, but they will have no way of knowing whether he follows their advice. Predictably, some people won't. Maybe they'll forget. Maybe they'll be worried about retaliation from those they've exposed. And maybe they'll just be freeloaders who wanted to be notified if they were at risk but have no interest in notifying others. By leaving the decision to the user, and even creating additional barriers to notification with a separate "consent to notify" hurdle, the default design from Google and Apple compromises public health in the name of privacy.

Other abuses are made possible by the designers' preference for keeping information from the authorities. In the default design, those getting a notification are told only that they've been near an infected person, but not who and not where. Anonymity breeds irresponsibility, as many Zoom users learned in recent weeks. Surely some irresponsible app users will be delighted to cause random grief by sending out false infection alerts to everyone with whom they've been in contact. Google and Apple say that they can prevent that by having public health authorities verify test results. But a verification process, such as cryptographic signing of test results, is likely to add substantially to notification friction.
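Verification of the kind Google and Apple describe need not be elaborate to add friction. A minimal sketch of the idea, with hypothetical names and an HMAC standing in for the public-key signatures a real deployment would use: the health authority issues a token only for confirmed tests, and the app (or key server) refuses to publish keys without a valid one.

```python
import hmac
import hashlib

# Hypothetical shared secret; a real system would use public-key signatures
# so that apps can verify tokens without holding the signing key.
AUTHORITY_KEY = b"demo-secret"

def issue_verification_token(test_id: str) -> str:
    """Health authority signs a confirmed-positive test result."""
    return hmac.new(AUTHORITY_KEY, test_id.encode(), hashlib.sha256).hexdigest()

def verify_token(test_id: str, token: str) -> bool:
    """App or key server checks the token before accepting a key upload."""
    expected = issue_verification_token(test_id)
    # compare_digest avoids timing side channels on the comparison
    return hmac.compare_digest(expected, token)
```

Each extra round trip here—obtain a token from the lab, paste or scan it into the app, wait for server-side validation—is exactly the kind of notification friction the article worries about.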

The same effort to keep data out of the hands of a central administrator leads the Google and Apple engineers to store and process all contact data locally on the phone. This is bad news for anyone who loses or has to reset their phone. It looks as though, under the default Google/Apple design, those already unfortunate people also lose their contact records and thus run the risk of missing a notice that they may have been exposed. Indeed, as far as I can see, the design doesn't even allow people to store a backup of their contacts in the cloud. Similarly, suppose a person using the app comes to the health system's attention by collapsing in public, or by dying before they make it to the hospital, as all too many victims have. In that case, it would be impossible to trace the person's contacts or notify those affected. Apple's famously law-enforcement-hostile phone design will ensure that the authorities cannot open either the phone or the app.

That's a lot of dysfunction to suffer just to avoid the theoretical risks of a centralized infection tracing system like the ones we've been using for the last hundred years.

A related problem is that Google and Apple seem to assume that infection tracing needs the same kinds of user consent as a weather or traffic app. (And, to be fair, the designers may believe that more privacy features will induce more users to download the app.) But it's increasingly clear that to be effective, a tracing app is going to require a market share well over 50%. That can't be achieved by throwing something into the app store and waiting for users to find it. Even though Singapore's app has substantial privacy protections and its populace is generally compliance-minded, the app is being used by less than a fifth of the population. In fact, as I've said before, it's likely that mobile phone infection tracking will only work if Google and Apple are required to install such an app automatically on Androids and iPhones, the way Apple Maps or iTunes updates are auto-downloaded. (Apple is, after all, famous for having automatically sent all its users a U2 album that none of them ordered.) Google and Apple have said that they intend to add a contact tracing platform to their phone operating systems, though it's not clear that the feature will be on by default.

At the end of the day, the purpose of infection tracing is to notify people who may have been infected. Unfortunately, without a lot of changes, the Google/Apple system will make notification a lot less likely. First, in an effort to reduce the role of central authorities, the app design separates testing from contact tracing, so the health authorities who do the testing can't confirm that the contacts got notice of the result. Instead, the design imagines that someone who tests positive will be given the option of sending notice. That means friction every step of the way—a user who tests positive must remember to send the notice, open the app, navigate the "do you really consent?" screen, and wait for the notification to upload.  And all of those steps depend entirely on the user's sense of social responsibility.  That's just a bad idea. At a minimum, public health authorities need to be able to tell who tested positive but didn't send notifications.  That would allow the authorities to ping those who didn't notify others, reminding them to do so. It would also allow the health authority to impose fines or other sanctions on those who use the system to protect themselves but not others.

Finally, Google and Apple's assumption that tracking apps should keep location data away from health authorities leads to absurd results.  It's quite true that in most cases, there's no need to record the exact location where each contact occurred. What matters most is just whether one person was in close proximity to another. But there are times when location matters too. Someone who gets a notice and can't figure out how the contact occurred might want more data before quarantining himself; maybe there was a wall that Bluetooth didn't recognize between him and the infected party, or perhaps they were standing outside with a breeze blowing between them.  I suspect that few people will thank Google and Apple if we end up with an American tracking app that assumes that it's better for them to endure an unnecessary two-week quarantine than for them to tell health authorities the location of their potential infection.

Being able to get precise location data in at least some cases will be especially important if commerce resumes and the app is not universally adopted. Suppose that an infected person spends time at a Starbucks, where a quarter of the people in the shop have installed the Google/Apple app. When the infected person sends out a notice, only a quarter of the customers will get it. But some of them will suspect that the contact was at Starbucks. Without knowing the infected person's location data, authorities would have no idea where the exposures occurred. What if health authorities want to find the other customers, plus the barista? It wouldn't be that hard if they knew the place and time of the infected person's visit. The barista's hours are already recorded by Starbucks, and customers who were there at the time could be found through payment records. But it appears that the Google/Apple app makes no provision for public health authorities being able to identify hotspots by time and location. In fact, in the world the two companies envision, potentially infected parties themselves apparently won't know for sure where their exposure occurred.
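The centralized lookup imagined here is computationally trivial. A sketch with hypothetical names and toy data: given the infected person's visit and a set of visit records reconstructed from staff schedules and payment timestamps, simple interval overlap finds everyone who shared the space long enough to matter.

```python
from dataclasses import dataclass

@dataclass
class Visit:
    person: str
    place: str
    start: int  # minutes since midnight, for simplicity
    end: int

def co_located(infected: Visit, records: list, min_overlap: int = 15) -> list:
    """Return people whose recorded visit to the same place overlapped the
    infected person's visit by at least `min_overlap` minutes."""
    hits = []
    for v in records:
        if v.place != infected.place:
            continue
        # Overlap of two closed intervals; negative means no overlap at all.
        overlap = min(infected.end, v.end) - max(infected.start, v.start)
        if overlap >= min_overlap:
            hits.append(v.person)
    return hits
```

The hard part of the Starbucks scenario is not this computation but the policy choice the article is pressing: whether health authorities are allowed to learn the place and time of the visit in the first place.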

These are a lot of problems, most of them stemming from design assumptions that start in the wrong place, privileging decentralization and anonymity over the goal of defeating the virus. Don't get me wrong. Google and Apple deserve a lot of credit for having stepped up to the challenge of providing essential APIs. We're going to need their help to get tracing apps onto phones all over the country. But the companies also have blind spots, and a fear of getting crosswise with privacy advocates is one of them. That means we can't simply trust them to set the parameters for a tracing app that does what we want in the real world.

And we don't have to. Pandemics force us to trust elected leaders with great power over individuals and businesses. State governments have ordered people to stay home even at the cost of their jobs. Governors have been given authority to seize private supplies of ventilators and to close otherwise lawful businesses and gatherings—even church services on Easter. And as I've written before at greater length, 40 states have adopted a post-9/11 public health emergency law that gives governors broad authority to intervene in private industry to respond to crises. This authority extends even to companies that deal in "communication devices." If they can mandate that people sacrifice jobs and access to in-person religious services, surely state governors can also order companies like Google and Apple to build an interface that works with tracking apps that actually fit society's needs and not Silicon Valley's conventional wisdom.



  1. One problem: AIDS.

    In my opinion, how we dealt with the AIDS pandemic set an irrevocable precedent for how we must deal with all communicable diseases in our more enlightened age — i.e. Post Warren Court and related transformation of our concepts of individual liberty.

    AIDS was *not* contained via the track/isolate/quarantining route. Instead we went with “universal precautions” and the presumption that everyone had AIDS. That’s the same approach that we’ve seen here.

    Assuming for the sake of argument that we aren’t ALREADY at the point where 80% of the population either has or has had the Wuhan Virus (at which point herd immunity hopefully would apply), all this would do is get people to leave their phones at home. And while THAT might be a good thing, it’ll defeat the purpose here.

    Maybe teaching people not to trust the government is a good thing, but we’ve already done that….

    1. “In my opinion, how we dealt with the AIDS pandemic set an irrevocable precedent for how we must deal with all communicable diseases in our more enlightened age”

      It really didn’t. It set a precedent as to how to deal with a communicable disease where discrimination against a historically oppressed group was a danger.

      1. Well happy Easter to you. Christians have been persecuted in the past. Everyone has been persecuted at some time.

      2. That may be true Dilan, but it’s also remarkably less effective.

  2. This is the craziest thing I’ve read on Reason in a very long time. Give the government a centralized database of individual movement, association, and geographic happenstance? Force citizens to carry a tracking device in their pockets in the name of national health?

    Are you mad? Because you’re certainly not anywhere near having libertarian values.

    1. Exactly

      Would take about 2 seconds from the time this dataset exists before warrants start being filed to access the contacts in proximity.

    2. Baker absolutely hates privacy. He would use any excuse whatsoever, from a pandemic to a Patriots Super Bowl victory to the release of a new Game of Thrones novel as an excuse to abolish it.

      1. Two of those things have happened. The third never will.

        1. I’ve heard GRR Martin is using the whole pandemic lock-down to work on writing. Of course, I still won’t believe 100% until I’m actually holding the book in my hands…

  3. Something about trading temporary security for permanent loss of liberty comes to mind.

  4. ” . . . even church services on Easter ”

    Among all of the problems and hardships being endured in the reality-based world, this is the one to identify as an example of extremity?

    Choose reason. Every time.

    1. This sudden big tech concern for privacy? I don’t believe it. Maybe we should put Facebook on the development…in two weeks you’ll know if you have the virus because your mentions and advertising will change.

    2. That wasn’t meant to be a reply to you. Whoops.

      That said, keep working on your routine. It’s a good thing you don’t earn your living from it.

  5. ” But Silicon Valley is in love with a “trust no one” approach to security that is grounded in an assumption that centralized systems can be abused if authoritarian governments get access to the data. ”

    Which is a perfectly reasonably assumption if one reads the work of Stewart Baker.

    Look, if you want the public and Silicon Valley to trust tracking apps, the government needs to develop a reputation for scrupulous privacy protection. It needs to basically outlaw using any of this for law enforcement unrelated to the pandemic.

    Because that’s what always happens. Once something like this gets installed, the government is like “why shouldn’t we get to use this to fight the war on drugs? We are leaving a weapon on the table”.

    1. I’ll trust the government with information like this when “parallel construction” results in law enforcement getting the death penalty. Not before. Have they EVER gotten their hands on information and not abused it?

    2. Government proclamations that they won’t misuse the data aren’t enough, because they will. Want to take a guess how often FBI agents do background checks on ex-girlfriends or similar types of targets for no valid work reason? That’s already prohibited by law, and the frequency is not zero.

      It’s not an assumption governments will misuse data. It’s a proveable empirical fact.

      1. Worse, the frequency at which those abusing the system face any meaningful discipline is near zero.

  6. Makes me glad that I don’t have a smart phone.

    If one wants to be really effective adding many CCTVs and strong facial recognition software should do the trick as it did in Wuhan.

    1. “like the ones we’ve been using for the last hundred years.”

      I didn’t know smart phones even existed a hundred years ago!

      “A related problem is that Google and Apple seem to assume that infection tracing needs the same kinds of user consent…”

      Well, it actually does. For example, the way I don’t consent to being constantly tracked is to leave my phone on the shelf about 99.44% of the time.

      1. (sorry, Don Nico, not a reply to you…)

  7. If the government would like to know where I’ve been and who I’ve been in contact with, they can get a warrant. Otherwise, I’m not voluntarily sharing it with them. I’m pretty unhappy about the Apple/Google system as well. For a start, it isn’t clear that not opting in to tracking prevents the tracking data from being collected and stored locally. If the data is there, sitting on your phone, it is available for someone to get it. I’d prefer they simply allowed app-level access to bluetooth while an app is running, as they currently do, and left it at that.

  8. Are you sure you didn’t mean to post this on some leftist authoritarian rag like Mother Jones? Or maybe some alt right national socialist publication?

    Yes, we totally have nothing to worry about from our government having access to a complete list of our contacts and when we were near them, nothing at all. Don’t mind big brother watching you, it’s for your own good.

    I for one will not be opting in to any government tracking system for something with as little danger as covid-19. There hasn’t been a government surveillance system yet that wasn’t abused, and not just once, but as often as they can get away with it.

  9. If they can mandate that people sacrifice jobs and access to in-person religious services, surely state governors can also order companies like Google and Apple to build an interface that works with tracking apps that actually fit society’s needs and not Silicon Valley’s conventional wisdom.

    Just like an elevator, once your velocity down the slippery slope stabilizes you don’t feel the motion at all.

  10. Silicon Valley is in love with a "trust no one" approach to security that is grounded in an assumption that centralized systems can be abused if sold to authoritarian governments to monetize the data.

    1. And government is inherently authoritarian. The only difference is one of degree. Just look at the massive, usually pointless and at least occasionally abusive surveillance that's been going on right here in the "Land of the Free".

  11. I could not disagree more with this. We’ve watched authoritarian governments use phone exploits successfully already.

    This isn’t a theoretical incident.

    We’ll come out of COVID. It’s critical we aren’t also letting the genie out of the bottle for the rest of time.

  12. While public health authorities in the past may have managed a supposedly centralized database of the infected and their contacts, the contacts list was generated by asking the infected. If someone wanted to hide their illicit affairs, or had extended contact with certain people they wished to conceal, they could just not divulge them among the list of contacts. What is being proposed here would allow the government to independently become aware of those intimate, private contacts.

