This weekend, a million Australians downloaded a smartphone app called COVIDSafe that is designed to warn them if they’ve spent time near someone who has tested positive for Covid-19. Voluntarily.
That would have been an unthinkable breach of privacy just a couple of months ago.
Nations around the world are readying similar apps, as tech giants Google and Apple prepare their version for Europe and the U.S. And contact-tracing smartphone apps are only the beginning. If the terrorist attacks of 9-11 sprang a leak in the levees that protect civil liberties, Covid-19 seems on the verge of overtopping them.
Today we’ve released the final episode of No Place To Hide, an audio documentary podcast I’ve been working on for more than a year. The issue of privacy takes on new importance right now, as the Covid-19 outbreak is challenging nearly every social, legal, and ethical framework in the Western world. More than ever, it’s important to understand what privacy really is, why it is a crucial element of our very humanity, and why its future is so endangered. I hope you’ll listen. This story continues below.
There should be little argument that tech can play an important role in beating back the scourge of Covid-19 and helping the world return to normalcy, even if that means opening the floodgates for a while. Our time calls for sensible concessions and nuanced judgments during an obvious emergency, and for placing time limits on expanded government powers.
But if the presidential politics of 2016 have taught us anything, it’s that we don’t do nuance well at the moment.
Here’s the truth: Everyone believes in privacy until their kid has been kidnapped and cell phone tracking technology could find the getaway car. And when it comes to civil liberties and privacy, hypocrisy is built into the equation. We tend to care about our own privacy, but not others’. A poll after 9-11 found that Americans were about twice as likely to approve the use of technology to surveil *other* people as to surveil themselves. If someone asked the question right, I imagine people would say they’d want to be alerted if someone with coronavirus had passed them by, thanks to a tracker in that patient’s pocket, but they wouldn’t care to carry around the same tracking device themselves. Gun to my head, I’d probably admit to feeling that way.
Hey, these decisions are difficult.
I am heartened by public statements from Google and Apple as they dip into dangerous territory by making smartphone-aided virus tracking easier for governments. The software they are working on is designed with privacy in mind, which might be a first in this situation. Rather than make it easy for governments to track every phone through some kind of centralized architecture, their tech will let phones talk to each other using short-range Bluetooth. Anonymized daily tracing keys will be used to alert citizens if their phone “talked” to the phone of someone who later tests positive for Covid-19. This will theoretically aid in the tricky and laborious business of contact tracing. Critically, Apple and Google are speaking publicly about their protocols, taking feedback, and promising to “sunset” the software after the epidemic. They even changed the name from “contact tracing” to the less-spooky “exposure notification.”
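The decentralized design described above can be sketched in a few lines. This is a simplified illustration of the general idea only, not the actual Apple-Google specification (which uses a fixed key schedule and AES-based derivation); the function names and parameters below are invented for clarity.

```python
import hashlib
import hmac
import secrets

def new_daily_key():
    # Each phone generates a fresh random key every day (simplified;
    # the real protocol derives keys on a fixed schedule).
    return secrets.token_bytes(16)

def rolling_ids(daily_key, intervals=144):
    # Derive short-lived identifiers from the daily key. Phones broadcast
    # these over Bluetooth rather than anything tied to the owner's identity.
    return [hmac.new(daily_key, i.to_bytes(4, "big"), hashlib.sha256).digest()[:16]
            for i in range(intervals)]

# Phone A broadcasts its rolling IDs; Phone B records the ones it hears.
key_a = new_daily_key()
heard_by_b = set(rolling_ids(key_a)[:3])  # B was near A for a few intervals

# If A later tests positive, A uploads only its daily keys. B re-derives
# the rolling IDs locally and checks for a match -- no central location log.
positive_keys = [key_a]
exposed = any(rid in heard_by_b
              for key in positive_keys
              for rid in rolling_ids(key))
```

The privacy payoff is that an eavesdropper on the street sees only rotating, random-looking identifiers, and the server sees only the keys of people who chose to report a positive test.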
This tool sounds far less invasive than, say, the approach that Russia is reportedly taking. In Moscow, a patient-tracking app will make sure citizens are obeying quarantine orders, according to the BBC.
“(The app) will request access to the user’s calls, location, camera, storage, network information and other data. The intention is to check they do not leave their home while contagious,” the BBC says.
Russia shouldn’t be singled out. The Economist reports that Polish citizens with Covid-19 “must submit regular selfies to prove they are staying at home.” New arrivals in Hong Kong must wear a tracking bracelet for 14 days (though they don’t seem effective; more on that in a moment).
It’s easy to imagine even more invasive uses of cell phone data. In fact, you don’t have to. Israel’s controversial NSO Group, which claims to have terrorism-fighting software that allows governments to track the movements of almost any smartphone holder, has suggested its tools form the perfect Covid-19 fighting machine.
“Each person known to be infected with Covid-19 could then be tracked, with the people they had met and the places they had visited, even before showing symptoms, plotted on a map,” NSO told the BBC.
Invasive tech isn’t just for those who are infected, either. The city of Westport, Conn., gave “pandemic drones” a try, using flights in an attempt to enforce social distancing rules. Westport’s plans were quickly scrapped after predictable outrage, but drones aren’t going away. One company that makes the gadgets claims they can even detect fever from up above.
The push to use privacy-invasive technologies to help fight the pandemic will not relent. Our vigilance, good judgment, and due process cannot relent, either. It is heartening to see that an Israeli court has temporarily put a stop to use of a terrorism-fighting unit’s cellphone surveillance for virus tracking, telling legislators they must pass a law to temporarily expand the unit’s powers.
While I’ve been writing about privacy longer than most, and feel a great need to draw attention to privacy issues as they are often overlooked, I consider my views to be moderate. I’ll probably use the Apple-Google technology. I’m willing to surrender some privacy in exchange for my health and safety.
But only if the exchange is real. I feel like the question that’s most rarely asked in these situations is: As we surrender our privacy to this technology, does it really work? Is it really making us safer? So often, the trade-off as presented is false.
There’s a lot of snake oil in spyware. Just think about all the terrible ads you see when web surfing — ads for products you’ve recently purchased, for example — and you’ll understand how often our privacy is exchanged for very little benefit.
Bluetooth contact tracing apps, like Australia’s COVIDSafe or the Apple-Google project, pose an interesting problem in this regard. Obviously, the technology is unproven. Bluetooth can be unreliable, as any user knows. It might miss connections. It might be unable to tell the difference between a brief outdoor encounter 10 feet apart and a longer one that takes place indoors, under conditions far more likely to encourage virus spread. And its wireless signal will detect phones through walls, where the risk of transmission is low.
“The result could be a flood of false positives,” warns Ross Anderson, a cryptography legend at Cambridge, in a story at NewScientist.com. “Even the Oxford team, which is advising (the British health system) on its app, say the accuracy with which Bluetooth can be a useful proxy for virus transmission risk is ‘currently uncertain.’”
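To see why Bluetooth is such a shaky proxy for distance, consider the log-distance path-loss formula often used to turn received signal strength (RSSI) into a distance estimate. This is textbook radio engineering, not any particular app’s method, and the transmit-power and environment values below are illustrative assumptions:

```python
def estimated_distance_m(rssi_dbm, tx_power_dbm=-59, n=2.0):
    # Log-distance path-loss model: d = 10 ** ((tx_power - rssi) / (10 * n)).
    # tx_power_dbm is the expected signal strength at 1 meter; n is a
    # fudge factor for the environment (walls, bodies, phone in a pocket).
    return 10 ** ((tx_power_dbm - rssi_dbm) / (10 * n))

# The same -75 dBm reading maps to very different distances depending on
# the assumed environment -- more than a factor-of-two spread from one reading:
for n in (1.8, 2.0, 3.0):
    print(f"n={n}: ~{estimated_distance_m(-75, n=n):.1f} m")
```

An app that can’t pin down the environment factor can’t reliably distinguish a risky close contact from a harmless phone on the other side of a wall, which is exactly how the flood of false positives arises.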
This is the main lesson I recall from covering the post-9-11 era. Millions were spent on airport facial recognition systems that didn’t work. Relaxed rules on phone call record surveillance, a source of public pain even in the Russian election interference case, are of dubious terror-fighting value.
Trading privacy for ineffective technology is a lose-lose proposition.
I find kinship on this issue at LawFareBlog.com, which framed the question this way when discussing the now-suspended Israeli tracking method:
“Is the Israeli solution efficacious? This is the first, and crucial, question to ask about any action that infringes on privacy and civil liberties. If a privacy- and civil liberties-infringing program isn’t efficacious, then there is no reason to consider it further. … When one has a hammer—in this case, cellphone location tracking—it is tempting to see nails everywhere. But it is crucial to be honest about what problems the technology can solve.”
It is incredibly tempting to look for a savior in a time of crisis. I love the way Stewart Baker frames this on his podcast: “Is Privacy in Pandemics Like Atheism in Foxholes?” Or is it like a parent whose kid is in a kidnapper’s car?
We’ve already seen many saviors come and go. Hydroxychloroquine. Remdesivir. Ventilators. Sunlight. Now, Pepcid. Throw magic software into that bucket, too. All of these things might be part of a Covid-19 solution. And all of them need to be carefully considered. Dosages must be tested. Side effects must be weighed. With all of them, the cure can’t be worse than the disease.
Sure, take my privacy — but take as little as you have to, after exhausting all other options — to help get us through Covid-19. But don’t take it for nothing. And then give it back as soon as possible. I’ve been tweeting about this issue with the hashtag #MakeItTemporary and I’ll continue to do so.
First and foremost, don’t get distracted by tech companies promising miracle cures. We can spend our time and focus in better places.
“What do you think doctors and nurses need today? Do they need facial recognition, or do they need masks?” former FBI agent Ali Soufan said to The Atlantic in its privacy-vs-Covid essay. “We need more hospital beds, not more smart cameras surveilling people. We need more scientists. We need an international system that can deal with this kind of problem.”