I believe there are good people inside Facebook — plenty of them — who know the company’s tech tools are causing great harm, and wish to do something about it. But right now, the firm has mounting evidence that its software does damage to a great many people — teen girls, victims of human trafficking, anyone who cares about politics, or Covid-19 — yet fails to take dramatic action.
Business interests — greed — continue to outweigh any internal evidence the firm accumulates about the harm it’s causing. Meanwhile, the company dares to muster a “well, cars hurt people too” defense, or it simply yells fake news at journalists who expose these hard truths.
For normal, non-sociopaths, Facebook must be a terrible place to work. I make this presumption (well, it’s more than a presumption) from the incredible set of stories published last week in the Wall Street Journal, and another published in MIT Technology Review, mainly based on documents leaked by employees who really seem desperate to make things better.
If you haven’t read and digested the stories, along with chief reporter Jeff Horwitz’s riveting Twitter feed about them, you should do that now. I’ll try to help.
The Wall Street Journal’s Facebook Files contains five stories:
1. Instagram really hurts teenagers, particularly girls. The company has studied this and knows it’s true.
“Here’s reality, as internally accepted by IG: The company ‘makes body image issues worse for 1 in 3 teen girls,’” Horwitz writes. Also: “In one Instagram study (sample size not giant) a meaningful percent of US/UK users who reported feeling urges to self-harm within the last month traced that feeling DIRECTLY to the app. ‘Addictive’ features prevent those most at risk from logging off.”
This is hardly a surprise to anyone who has watched people sit for endless selfie sessions and download “face-fixing” image filters. Still, the studies show Instagram has a five-alarm fire on its hands, knows this, and has … so far set up committees about it. Well, in fairness, Instagram has a stated intention of removing “likes” from its platform through something called Project Daisy. It’s easy to see how that might help. But changes like that take time, and … how big is that fire again?
2. Facebook can’t seem to get a handle on the anti-vaxx misinformation that spreads on its networks — even with the platform’s stated intention of nudging 50 million people towards getting the Covid-19 vaccine. Employees issued dire warnings that Facebook was hurting the effort — President Biden has even said Facebook is killing people, though he later backed off those remarks. Still, the most popular post on all of Facebook in the first quarter of 2021 cast doubt on the vaccine.
3. Facebook rewards outrage and helps keep Americans at each other’s throats. On purpose. Its algorithms promote outrageous posts and comments. It got even better at this in the 2018 election, and caused major political campaigns to turn even more negative. In other words, Facebook profited when it “made political discourse more angry.”
Meanwhile, MIT Technology Review got its hands on a 2018 internal report which showed that troll farms were operating important politically charged groups on Facebook. For example: troll agents in Eastern Europe controlled the most popular pages devoted to Black American and Christian content.
Facebook did not prioritize removal of such inauthentic behavior, which also contributes to vitriolic political discourse.
4. Facebook is slow to respond to serious harms outside the U.S. — think drug cartels and human traffickers.
One example: When an employee flagged a Mexican drug cartel that was recruiting hit men, the firm was slow to stop it from posting on Facebook.
5. Some Facebook users are more equal than others.
When it comes to content removal and what some call “censorship,” Facebook says it applies its rules even-handedly. It does not. A special VIP list of celebrities and politicians is often exempt from the rules. “We are not actually doing what we say publicly,” one internal document read.
In a blog post titled “What the Wall Street Journal Got Wrong,” Facebook issued a kind of blanket denial of the Journal stories without actually claiming anything in the pieces was inaccurate or wrong, a typical media relations technique.
“These stories have contained deliberate mischaracterizations of what we are trying to do, and conferred egregiously false motives to Facebook’s leadership and employees,” wrote Nick Clegg, Vice President of Global Affairs. The research on harm is … incomplete, he suggested. “We fundamentally reject this mischaracterization of our work and impugning of the company’s motives.”
In a comparison many are starting to make (I heard David Hoffman at Duke make it first!), try this mental exercise: Read Clegg’s post, but substitute “cigarettes” or “nicotine” in a few choice places, and put yourself into a 1970s frame of mind. After all, research into the link between cancer and smoking was “inconclusive” at the time — and thank you for smoking.
Meanwhile, Instagram’s head Adam Mosseri — who often does say a lot of the right things — said a very wrong thing on the Recode podcast last week while defending his tool from the recent news coverage.
“We know that more people die than would otherwise because of car accidents, but by and large, cars create way more value in the world than they destroy,” Mosseri said Wednesday on the Recode Media podcast. “And I think social media is similar.”
Which is to say that social media could sure use a Ralph Nader and some seatbelts and airbags. We should all welcome that.
Ironically, Mosseri later expressed regret that his words were being twisted by “headline culture.” Caught up in social media, in other words.
Recall: Cars are recalled when just a handful of injuries can be directly attributed to a manufacturing flaw. I’ll wait for Instagram or Facebook or any other social media tool to get pulled off the road for the predictable injuries it is causing.
Horwitz says there’s more to come; I should hope so. Meanwhile, any non-sociopath at Facebook — I know there are many — who wishes to communicate is welcome to find my contact information at http://BobSullivan.net/about