An Unprecedented Collaboration

On April 10, 2020, Apple and Google announced something remarkable: the two fiercest rivals in technology would collaborate to build a system for digital contact tracing. Within weeks, they would deploy software to billions of devices worldwide.

The idea was elegant. Your phone would use Bluetooth to exchange anonymous codes with nearby phones. If someone later tested positive for COVID-19, their codes would be uploaded to a server. Everyone's phone would periodically download positive codes and check for matches. If your phone found a match, you'd get a notification: you may have been exposed.

No GPS tracking. No government database of contacts. No way to identify who infected whom. Just a private alert that you should get tested.

It was, in theory, exactly what the pandemic needed: automated contact tracing that could work faster than any human system while preserving privacy. In practice, it was a case study in why good technology isn't enough.

How Exposure Notification Works

The Apple-Google Exposure Notification System (ENS) relied on clever cryptography to protect privacy (a short code sketch follows the list below):

The Technical Flow
  • Key generation: Your phone generates a random daily key and derives rotating identifiers from it that change every 10-20 minutes
  • Broadcasting: Your phone broadcasts the current identifier via Bluetooth Low Energy
  • Collection: Nearby phones store the identifiers they receive, along with duration and signal strength
  • Positive diagnosis: If you test positive, you can upload your recent daily keys to a server
  • Matching: Everyone's phone downloads the posted keys, re-derives the identifiers they produce, and checks for local matches
  • Notification: If a match is found (close contact for sufficient duration), you're alerted
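
The sketch below shows roughly how the first two steps work. It loosely follows the cryptography specification Apple and Google published (the "EN-RPIK" and "EN-RPI" labels come from that spec), but it is a simplified illustration, not the production implementation, and it assumes the third-party Python cryptography package:

    # Sketch of ENS identifier derivation (simplified; assumes the
    # third-party `cryptography` package). The daily key never leaves the
    # phone unless its owner reports a positive test; only the derived,
    # rotating identifiers are ever broadcast.
    import os
    import time
    from cryptography.hazmat.primitives import hashes
    from cryptography.hazmat.primitives.kdf.hkdf import HKDF
    from cryptography.hazmat.primitives.ciphers import Cipher, algorithms, modes

    INTERVAL_SECONDS = 600      # identifiers rotate roughly every 10 minutes
    INTERVALS_PER_DAY = 144

    def new_daily_key() -> bytes:
        """Temporary Exposure Key: 16 fresh random bytes each day."""
        return os.urandom(16)

    def interval_number(unix_time: float) -> int:
        return int(unix_time) // INTERVAL_SECONDS

    def rolling_identifier(tek: bytes, interval: int) -> bytes:
        """Rolling Proximity Identifier broadcast during one interval."""
        rpik = HKDF(algorithm=hashes.SHA256(), length=16,
                    salt=None, info=b"EN-RPIK").derive(tek)
        padded = b"EN-RPI" + bytes(6) + interval.to_bytes(4, "little")
        encryptor = Cipher(algorithms.AES(rpik), modes.ECB()).encryptor()
        return encryptor.update(padded) + encryptor.finalize()

    tek = new_daily_key()
    rpi = rolling_identifier(tek, interval_number(time.time()))
    print(rpi.hex())   # 16 anonymous bytes, unlinkable without the daily key

Because each identifier is derived from key material that only the phone holds, an eavesdropper who records broadcasts cannot link two identifiers to the same person.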

The brilliance of this design is what it doesn't do. The central server never learns who was exposed, only who tested positive (and even that is just a set of anonymous keys). Your phone does all the matching locally. No one can use the system to track your location or identify your contacts.
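
To make the local-matching claim concrete, here is a continuation of the sketch above. The data structures and the 15-minute threshold are illustrative assumptions (the real API also weighs Bluetooth signal strength): the phone re-derives every identifier a diagnosed user could have broadcast and intersects them with what it actually heard.

    # Continuing the sketch: on-device matching. `diagnosis_keys` is the
    # batch the phone downloads; `heard` maps each identifier observed
    # nearby to the seconds it was in range. All matching stays local.
    def identifiers_for_day(tek: bytes, first_interval: int) -> set:
        """All 144 identifiers derivable from one diagnosed user's daily key."""
        return {rolling_identifier(tek, first_interval + i)
                for i in range(INTERVALS_PER_DAY)}

    def exposure_seconds(diagnosis_keys, heard: dict) -> int:
        total = 0
        for tek, first_interval in diagnosis_keys:
            for rpi in identifiers_for_day(tek, first_interval) & heard.keys():
                total += heard[rpi]
        return total

    # Illustrative check: alert after 15 minutes of cumulative contact.
    day_start = (interval_number(time.time())
                 // INTERVALS_PER_DAY * INTERVALS_PER_DAY)
    heard = {rolling_identifier(tek, day_start + 3): 900}
    if exposure_seconds([(tek, day_start)], heard) >= 900:
        print("Possible exposure: consider getting tested.")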

This was a deliberate choice. Apple and Google knew that any system perceived as surveillance would face massive resistance. By building privacy into the foundation, they hoped to maximize adoption.

The Adoption Problem

For contact tracing to work, enough people need to participate. Epidemiologists estimated that at least 15% of the population would need to use the system for it to meaningfully slow transmission. Higher adoption rates would be better, with models suggesting 60%+ adoption could substitute for lockdowns.

Actual adoption fell far short. In the United States, only about 25 states deployed apps using the Apple-Google framework. Even in states that did, download rates were modest. National adoption never exceeded roughly 20% of smartphone users, and active usage was lower still.

There were several reasons for low adoption:

Trust: Despite the privacy-preserving design, many people didn't believe the technology was truly private. Years of data scandals had eroded trust in tech companies. The complexity of the cryptographic protections was hard to communicate.

Fragmentation: Instead of a single national app, the US had a patchwork of state apps that didn't always interoperate well initially. Traveling between states meant dealing with different systems.

Public health messaging: The apps were often poorly promoted. Many people never heard of them. Those who did weren't sure how they worked or whether they were worth installing.

Political polarization: In a deeply divided country, even pandemic response became partisan. Some communities were skeptical of any public health intervention, digital or otherwise.

The Effectiveness Debate

Did the apps actually work? This question proved surprisingly hard to answer.

The privacy design that protected users also made evaluation difficult. Because the system didn't collect centralized data, researchers couldn't easily measure how many infections it prevented. Some studies attempted to estimate impact through surveys and modeling, with mixed results.

A 2021 study of the UK's NHS COVID-19 app estimated it averted approximately 600,000 cases during the autumn 2020 wave. The app had unusually high adoption (roughly 28% of the population) and was credited with sending millions of exposure alerts.

Studies of US apps were less conclusive. Low adoption rates meant the mathematical conditions for effectiveness were rarely met: a contact event is covered only if both the infected person and the contact run the app, so coverage falls off with roughly the square of the adoption rate. Even when an infected app user had 10 close contacts, at 10% adoption on average only 1 of them would also have the app and receive a notification, and only about 10% of infected people were uploading keys in the first place. The chain-breaking potential was limited.
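
The arithmetic is easy to sketch. In the toy model below (illustrative numbers, not from any study), expected notifications per infected person scale with the square of adoption:

    # Toy model: expected notified contacts per infected person. A contact
    # event is covered only if BOTH parties run the app, so coverage scales
    # with adoption squared. Numbers are illustrative, not from any study.
    def expected_notifications(adoption: float, contacts: int = 10) -> float:
        return contacts * adoption * adoption

    for adoption in (0.10, 0.20, 0.28, 0.60):
        print(f"{adoption:.0%} adoption -> "
              f"{expected_notifications(adoption):.1f} of 10 contacts notified")
    # 10% -> 0.1, 20% -> 0.4, 28% -> 0.8, 60% -> 3.6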

"In the case of contact tracing and exposure notification apps, there is a trade-off between increased privacy measures and the effectiveness of the app."

What Other Countries Did

The Apple-Google approach wasn't the only model. Countries made different choices about the privacy-effectiveness tradeoff:

South Korea: Aggressive contact tracing using credit card records, phone location data, and CCTV footage. Privacy was secondary to disease control. Cases were identified and isolated quickly. The approach was effective but involved surveillance that would be legally and culturally unacceptable in many Western countries.

Singapore: Initially deployed TraceTogether, which stored contact data centrally. After controversy, the government promised to limit data use to contact tracing. Later, police used the data for criminal investigations, confirming privacy advocates' fears.

Germany: Initially planned a centralized system but reversed course after public backlash, adopting the decentralized Apple-Google model instead. The Corona-Warn-App achieved relatively high adoption (~40% of smartphone users).

China: Mandatory health codes integrated into payment apps tracked location, contacts, and health status. Codes determined access to public spaces, transit, and businesses. Extraordinarily effective at disease control, but a level of surveillance impossible elsewhere.

The Technology Wasn't the Problem

In retrospect, the Apple-Google system was a technological success deployed into a social failure. The technology worked as designed. Phones exchanged codes, matched exposures, and sent notifications. The cryptography held up. Privacy was protected.

What failed was everything around the technology:

Coordination: The US never mounted a coordinated national response. States went their own ways. Some built apps, others didn't. Some promoted them, others ignored them.

Testing: Exposure notifications are only useful if people can get tested quickly after receiving them. Throughout much of the pandemic, testing was slow, scarce, or inaccessible.

Support for isolation: Knowing you were exposed doesn't help if you can't afford to miss work or don't have somewhere to isolate. The apps assumed a functioning support system that often didn't exist.

Trust: Decades of privacy violations, government overreach, and corporate data harvesting had poisoned the well. Even a genuinely privacy-preserving system couldn't overcome accumulated distrust.

Lessons for the Next Pandemic

COVID-19 won't be the last pandemic. The infrastructure built in 2020 offers lessons for future responses:

Privacy by design works technically but faces trust deficits. The Apple-Google system proved that you can build effective privacy protection into public health technology. But building trust requires more than good engineering. It requires transparency, accountability, and time.

Technology must be embedded in broader systems. Digital contact tracing is useless without testing, support for isolation, and public health follow-up. Technology alone can't substitute for functional public health infrastructure.

Adoption thresholds matter. There's a minimum participation level below which network effects don't kick in. Future systems might need to think more carefully about incentives, mandates, or alternative approaches that work at lower adoption rates.

Fragmentation kills effectiveness. The US approach of leaving everything to states produced a patchwork that was less than the sum of its parts. National coordination would have been more effective.

Build before you need it. Apple and Google accomplished something remarkable in weeks, but weeks mattered during exponential growth. Having privacy-preserving contact tracing infrastructure ready before a pandemic would be valuable.

The Infrastructure Remains

The Exposure Notification System still exists. Apple and Google built it into their operating systems. With minor updates, it could be reactivated for future outbreaks of COVID-19 or other respiratory diseases.

Whether anyone would use it is another question. The pandemic revealed deep fissures in how societies balance individual liberty against collective welfare, privacy against public health, technology against trust. Those tensions didn't resolve when case counts fell.

The contact tracing apps were a mirror reflecting back what we already were: a society capable of building remarkable technology but struggling to deploy it in service of the common good. The code was elegant. The implementation was messy. Welcome to public health in the digital age.

Sources

  1. Apple & Google. (2020). Privacy-Preserving Contact Tracing. covid19.apple.com
  2. Wymant, C., et al. (2021). The epidemiological impact of the NHS COVID-19 App. Nature, 594(7863), 408-412.
  3. MIT Technology Review. (2021). We investigated whether digital contact tracing actually worked in the US. technologyreview.com
  4. Ahmed, N., et al. (2020). A Survey of COVID-19 Contact Tracing Apps. IEEE Access, 8, 134577-134601.
  5. Kahn, J., & Johns Hopkins Project on Ethics and Governance of Digital Contact Tracing. (2020). Digital Contact Tracing for Pandemic Response.