Let’s hear it for the whistleblower!
No, not that whistleblower. I mean the Google whistleblower, the brave programmer who stepped forward and alerted the world to Project Nightingale. Google has secretly acquired medical data on 50 million Americans through a partnership with hospital chain Ascension. We wouldn’t know anything about it but for the employee who, at great personal risk, stepped forward and told the world what was going on (via the Wall Street Journal; hey, that means it’s my turn next!).
Tech companies had better get used to this. Increasingly, young programmers and technology professionals are taking a hard look at the impact of the work they do, and asking good questions. They are taking ethics classes right alongside courses in data logic and computer architecture. They are staging walkouts when they disagree with company policies.
The Google whistleblower’s first-person account of the incident, published by The Guardian, is quite telling. Project Nightingale was exciting, and held the potential to tackle serious medical issues, perhaps even find new cures. But on the front lines, programmers couldn’t see the big picture. This is not unusual when working on a big software project, but it is disturbing.
“With a deal as sensitive as the transfer of the personal data of more than 50 million Americans to Google, the oversight should be extensive. … Working with a team of 150 Google employees and 100 or so Ascension staff was eye-opening. But I kept being struck by how little context and information we were operating within,” the whistleblower wrote. “Did patients know about the transfer of their data to the tech giant? Should they be informed and given a chance to opt in or out?”
Individuals or small teams of programmers working on specific tasks often don’t have enough context to ask these larger questions. Tech companies should make doing so easier, not harder. We want the smartest people in the room asking questions.
For some time, I’ve advocated a more formal procedure that allows programmers who feel queasy about their work to air their concerns. I’m a journalist, so I’d like them to do it in public. But there are other ways. Programmers can also reach out to former professors, professional organizations, or trade show groups. Companies should form ethics boards. (I am also an advisor at Duke University’s Ethical Tech organization.)
Of course we should be throwing every bit of computing power we have at the big problems of our time. But there have to be guardrails. Ask any academic about the rules required to run an experiment involving human subjects, and you’ll get an earful about the IRB (institutional review board). These boards are a pain in the ass, but absolutely necessary to ensure that people will not be harmed in an experiment.
Google cannot experiment on 50 million Americans without anyone knowing what’s going on. That’s a recipe for disaster.
To be clear: I think there’s something even worse than Google violating millions of patients’ privacy rights — that would be missing out on cures, for years, because of technological bickering or privacy missteps. These things don’t have to be diametrically opposed. The key to making the best use of new technologies is designing them, up front, to respect privacy and self-determination rights. Sneaking around behind the scenes, experimenting in secret, will not lead to advances in medicine or technology. Instead, the ensuing controversies will set us back.
Google has a bad habit of trying to play God; Silicon Valley has a bad habit of not playing by the rules. That time must end. Whistleblowers can play a very powerful role in this. Bravo to this week’s tech hero. I hope there are many more where that came from.