Debugger: The Frances Haugen interview. Two years after Facebook, now what?

Nearly two years after focusing the world’s attention on Big Tech’s big problems, Frances Haugen remains a powerful force in the technology industry.  In today’s episode of Debugger, she tells me how Covid lockdowns played a key role in her difficult decision to come forward and criticize one of the world’s most powerful companies, what she’s doing now to keep the pressure on tech firms, and how she handles the slow pace of change.

For a new book she’s just published, Haugen researched Ralph Nader’s battle against the automotive industry in the 1960s — her fight is like his in some ways, very different in others. She’s created a non-profit to pursue research into harms that tech companies cause — some of that will be conducted this summer by Duke University students — and she offers up some simple things companies like Facebook could do immediately to mitigate those harms.

I hope you’ll listen to the episode. Haugen is an engaging speaker.  But if podcasts aren’t your thing, a full transcript is below.

Click the play button below to listen. You can also subscribe to Debugger on Spotify or wherever you find podcasts.

“One of the things I talk about in my book is… why was it when Ralph Nader wrote a book called Unsafe at Any Speed, that within a year … there were Congressional inquiries, laws were passed, a Department of Transportation was founded. Suddenly seat belts were required in every car in the United States. Why was that able to move so fast? And we’re still having very, very basic conversations about things like even transparency in the United States.”

Bob: So we’ve talked a lot about platform accountability on this podcast, the worry that Big Tech doesn’t have to answer to anyone, not even governments. And this recent report by the Irish Council for Civil Liberties says that two thirds of cases brought before Ireland’s Data Protection Commissioner, which basically serves as the enforcement agency for the whole EU, resulted in just a reprimand. Frances, as someone who’s done a lot to try to make at least one big tech company accountable, how do you react to that?

Frances Haugen: One of the largest challenges regarding tech accountability is … legislation and democracy take a lot more time than technical innovation. Pointing at things like adoption curves … you know, how long did it take us to all get washing machines? How long did it take for us to get telephones? What about cell phones? How many years do these processes take? And they’re accelerating. The process of adoption gets condensed. And when it comes to things like the data protection authority, it’s one of these interesting … quirks, I would say, of how we learn to pass laws. Because when GDPR was passed, it was a revolutionary law. It was a generational law in terms of how it impacted how tech companies around the world operated. But we have seen over the years that the Irish Data Protection Authority is either unable or unwilling to act, and that pattern is consistent. One of the stats I was trying to find before I came on today was the fraction of complaints that they’ve even addressed, and it is very, very small. So yes, they’ve only acted on a handful of cases in the last few years. It’s something like 95% of all the complaints that have been brought, they’ve never responded to. So I’m completely unsurprised by the recent report.

Bob: Is it frustrating that we’re still in this place?

Frances Haugen: Oh, no. This is one of these things where perspective is so important, trying to change the public’s relationship with these tech companies. And that’s fundamentally what the core of my work is — the idea that we should have an expectation that we have the right to ask questions and get real answers. That’s a fundamental culture shift … coming at a project like that from a place like Silicon Valley, where if you can’t accomplish something in two years, it’s not really considered valuable, right? Things get funded in Silicon Valley based on expectations two years out. If it takes five years or 10 years, that’s considered way too slow. And so I come at it assuming that it’ll take me years, like years and years, to get where I want the world to get. And that means that when there are hiccups like this, they’re not nearly as upsetting. And so I think it’s unfortunate. I think it’s unacceptable. But I think it’s also one of these things where I’m not surprised by it.

Bob: I wish I could say that I was surprised. But I am surprised to this extent: when you testified before Congress, I can’t recall the last time I saw such unanimity of opinion, the way that people on all sides of the aisle reacted, the way that journalists reacted. Everyone seemed to agree there’s a problem. So why is it so hard to get to a solution?

Frances Haugen: You know, that is such a great question. I have a book coming out called The Power of One. And one of the things I talk about in my book is, why was it, when Ralph Nader wrote a book called Unsafe at Any Speed, that within a year of him publishing that book there were Congressional inquiries, laws were passed, a Department of Transportation was founded. Suddenly seat belts were required in every car in the United States. Why was that able to move so fast? And we’re still having very, very basic conversations about things like transparency in the United States. Europe has really set the standard in terms of the Digital Services Act, in terms of saying, “Hey, the public has the right to know what they’re using.”

The core difference, in my opinion, between the environment that Ralph Nader emerged into and the environment I emerged into … is there were easily 100,000 automotive engineers in the world when Ralph Nader came forward, right? It wasn’t just one person saying, “There is this problem and I’m gonna try to explain it to you as cleanly and clearly as possible,” so that then we can act. It was 100,000 engineers saying, “Oh yeah. We’ve known for years that seatbelts would save lives, but we’ve been afraid to act because safety hasn’t been part of the conversation.” Right?

We’re living in a world when it comes to these algorithms where … you can’t take a single college class anywhere in the world on what the tradeoffs are in designing a social network and what are the choices you’re gonna have to make. What are the consequences of those choices? You know, you can’t do that. And that’s extremely intentional. The tech companies understood that they were the only ones who had access to the data. And that meant they … could define the conversation any way they wanted to because we didn’t have the tools to have even basic conversations around things like: Does social media use cause depression? Does social media use lead to more teen suicides? We can do correlative studies from the outside, but the only people in the world who can really do causal studies are the platforms.

And so, I think we have to change what we call the ecosystem of accountability when it comes to any kind of opaque large tech platform. We’re focusing on social media right now, but there are others like … large language models where we’re gonna have to have very similar conversations.

But yeah, we have to educate a million people around the world at a level of fluency that lets us really have a robust democratic debate about how our information environment should be run.

Bob: Those million people, are you talking about engineers inside tech companies?

Frances Haugen: I’m talking about professors. I’m talking about concerned parents. I’m talking about … one of the things that we’re working on is a simulated social network. So that … I have a question for you. Did you ever do high school newspaper or junior high newspaper when you were younger?

Bob: I sure did. And they’re almost gone now.

Frances Haugen: Yeah. So junior high newspapers … let’s take a step back for a moment. No offense to the enthusiastic junior high journalists of the world, or former journalists … I would like to present a hot take that there are no good junior high newspapers in the world, right? That’s not why we do it. We do it because we believe that journalism is so integral to democracy, right? That without a healthy, robust information environment, we can’t have the public hold things accountable. So we do things like build out junior high newspaper programs. Right now, if we wanted to teach at an age-appropriate level about social media systems, we lack a lab bench to teach those classes. If we were talking about chemistry … I can teach a high school chemistry class and you’re gonna blow stuff up and make things change color and make horrible smells. Or I can teach a college level class, or I can teach a graduate level class, or a professor can innovate on their own. When we have things like simulated social networks, suddenly you can have high school data science clubs where students argue over: Should we run the social network this way or should we run it that way? That’s what I’m talking about, broad-scale empowerment of information. Because without having people who understand what’s going on, you can’t have democratic discussions around how the systems should operate.
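
A quick aside for technically minded readers: Haugen’s simulated social network isn’t public yet, but it’s worth seeing how small a classroom version of that “lab bench” could be. The toy sketch below is entirely hypothetical, my illustration rather than anything Beyond the Screen has built. It ranks the same simulated posts two ways, chronologically and by engagement, so students could debate the tradeoff she describes.

```python
# A toy "lab bench": the same simulated feed ranked two different ways.
# Entirely hypothetical classroom sketch, not Beyond the Screen's tool.
import random

random.seed(42)

# Simulate a tiny pool of posts. "outrage" is a stand-in for how
# provocative a post is; provocative posts attract more clicks here.
posts = []
for i in range(10):
    outrage = random.random()
    posts.append({
        "id": i,
        "age_hours": random.uniform(0, 48),
        "outrage": outrage,
        # Engagement loosely tracks outrage, plus noise.
        "clicks": int(100 * (outrage + random.uniform(0, 0.5))),
    })

def rank_chronological(feed):
    """Newest first: simple and predictable, no feedback loop."""
    return sorted(feed, key=lambda p: p["age_hours"])

def rank_engagement(feed):
    """Most-clicked first: maximizes attention, may reward outrage."""
    return sorted(feed, key=lambda p: p["clicks"], reverse=True)

def avg_outrage(feed, top_n=5):
    """How provocative is the top of the feed, on average?"""
    return sum(p["outrage"] for p in feed[:top_n]) / top_n

for name, ranker in [("chronological", rank_chronological),
                     ("engagement", rank_engagement)]:
    print(f"{name:>13}: top-5 avg outrage = {avg_outrage(ranker(posts)):.2f}")
```

Even at this scale, the engagement-ranked feed will typically surface more provocative posts than the chronological one, which is exactly the kind of design choice Haugen wants students to be able to argue about.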

Bob: You know, I like the comparison. I often lament the end of school newspapers … at many schools … high school, junior high. And not just because they’re no longer a training ground for journalists. The vast majority of people who contributed to those when they were kids didn’t become journalists, but they understand a lot about journalism, and they can have a far more high-minded, even-handed conversation about what it’s like to write a story about the new teacher at school, and why the new teacher didn’t appreciate the way that you described her, or something. Right? And so not everybody who attends these clubs you’re talking about for tech will become engineers, but they will be able to talk about it, right?

Frances Haugen: And they’ll be able to have conversations about it because the reality is … I don’t wanna live in a world where technocrats are the only people who get to have opinions. I don’t think anyone does. But to have a world that does have a larger diversity of opinions, of voices, we have to think about how we bring this information into the world at scale.

And the second thing is, part of why I think this is such an important area to have to work on is right now tech companies are entirely dependent on safety employees that were trained in-house. I always like to say, SpaceX is going to go to Mars because Elon Musk can hire aeronautical PhDs who got like 10 years of education in aeronautics on the public dime, right? You know, you did your undergrad. You did your PhD. You did your postdoc, your fellowship. We’re going to Mars because he can rely on the fact that we have a pipeline. You know, I learned everything I knew in this field, starting at Google. And we don’t have enough safety employees in the world today, partially because of that lack of pipeline.

Bob: It’s also a really terrible job in a lot of cases, right?

Frances Haugen: It’s interesting. So part of why I’m so excited about the Digital Services Act is, when you begin to make the societal costs of these opaque systems available to the public, suddenly a safety employee isn’t just a cost center. That’s a big downer, right? You know, “Oh my God, they’re complaining again about societal violence or something. Oh no, they’re talking about genocide again.” It becomes, “Oh, we’re gonna get fined really badly if we don’t make progress on this problem.” And it’s interesting because I’ve talked to reporters who’ve gone and had these conversations with safety employees who say, “Changing incentives externally … meaningfully changes the circumstances of my job.” And so that’s the level that we need to be engaging with these problems. We need to say, “What are the incentives, and how can changing the incentives help these companies operate in the interest of the public good?”

Bob: So you’ve already mentioned the word ecosystem, so I’m guessing that’s part of the new ecosystem you would like. Could you explain that just a little bit more?

Frances Haugen: Sure. So let’s go back to Ralph Nader. I dug into the land of Ralph Nader a bunch when I was in the research process for my book. When Ralph Nader came forward … let’s walk through some of the players in the automotive ecosystem of accountability. So you had insurers, people who gave insurance to drivers, so if they got in an accident and hurt someone else, they wouldn’t be bankrupted. Those insurers really wanted to pin why people got hurt or why accidents happened on the automakers, right? Because any accident that’s not the fault of the driver is one where the insurance company doesn’t have to necessarily pay out. They can hold that automotive company accountable. And so, because there was an economic interest there in understanding how these cars worked, they funded public research. There were lots and lots of research labs at places like the University of Michigan. There were a number throughout Europe. Pipelines like that make spaces for graduate students to exist, which means that automotive companies themselves can hire quality workers. You had things like lawyers who were looking for personal injury settlements. They wanted to understand … when there were defects in cars, they went and found drivers or passengers that were hurt by those defects, and then held the companies accountable. There were other parties that had economic interests in helping automotive companies move towards the public good.

In the case of Facebook or TikTok, Twitter … all these companies know that as long as they’re the only ones who can see the data, they’re the only ones who can grade the homework.

You know, some of the tools that are in our tool chest in terms of the larger ecosystem of accountability are things like advertiser boycotts or divestment campaigns, like stock divestment campaigns … those are only possible if you can say, “Hey, this is the level of hate speech we see on the platform today. This is the level of violence. These are the biases in your censorship systems, and these biases are unacceptable.” Right now, we don’t get to see the censorship systems. We don’t get to see the content distribution systems. We don’t get to see anything, really, about how these systems actually perform. And so it’s not possible for us to come in there and let those gears begin turning in terms of pushing the companies back towards the common good.

Bob: I keep hearing you suggest, I think, that the main thing that is necessary is more transparency by these companies, which is going to look something like … what? Academics are allowed to go in and just see how the algorithms work, see what’s in the trust and safety queue? I think a lot of companies will push back on that and say, “Oh, that’s our whole product. We can’t open up our whole product to the outside world.”

Frances Haugen: But the reality is, for most industries, your product is open for inspection. Right? Like, if you look at cars, people can buy cars and crash them. People can buy cars and take them apart. When it comes to technology that’s physical, like phones … you know, I think the reason why we see so few whistleblowers from Apple is because Apple has very little incentive to lie. Anyone can buy an iPhone the day it’s released and take it apart. And people do, and they put up YouTube videos saying, “They claim this chip was in there; this chip is actually in there. … We ran benchmarks on this phone. It can actually do what they said it can do.” And so a big, big part of my work is just educating the public that they could want more. They deserve more.

And one of the projects we’re starting this year is called Minimum Viable Queries. So I started a nonprofit last year. It’s called Beyond the Screen, Beyondthescreen.org if you wanna come check us out. In Startupland, the thing you’re always aiming for is the minimum viable product … what’s the smallest set of features we can build that will still be compelling to a consumer? When it comes to information, we don’t have to let academics just go root around in the raw data. There are a lot of different ways that even small amounts of data would change everything.

I’ll give you an example. The number one risk factor for a ton of problems for kids is sleep deprivation. Imagine a world where Facebook or TikTok had to publish: this is how many kids are online after 11 p.m., after midnight, after 1 a.m., after 2 a.m. I guarantee you, if they had to publish that every week, very rapidly they would figure out tools that helped kids actually manage their own usage. But they’re afraid of having to release data like that, because every extra minute they have you on the app, that’s a minute that they make more money from you.
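
It’s worth pausing on how little such a query actually requires: session timestamps and a flag for minor accounts. Here is a rough sketch of the weekly report Haugen describes, written against a hypothetical events table; the column names and the 5 a.m. “end of night” cutoff are my assumptions, not any platform’s real schema.

```python
# Sketch of one "minimum viable query": how many minor accounts are
# active late at night, per week. Table and column names hypothetical.
import pandas as pd

def late_night_report(events: pd.DataFrame) -> pd.DataFrame:
    """Count distinct minor accounts active after each cutoff, weekly.

    `events`: one row per session event, with columns 'user_id',
    'is_minor' (bool) and 'timestamp' (datetime64).
    """
    minors = events.loc[events["is_minor"], ["user_id", "timestamp"]].copy()
    minors["week"] = minors["timestamp"].dt.to_period("W").astype(str)
    hour = minors["timestamp"].dt.hour

    # Treat 5 a.m. as the end of the "night"; this is a simplifying
    # assumption for illustration, not any platform's real definition.
    cutoffs = {
        "after_11pm":     (hour >= 23) | (hour < 5),
        "after_midnight": hour < 5,
        "after_1am":      (hour >= 1) & (hour < 5),
        "after_2am":      (hour >= 2) & (hour < 5),
    }
    out = {}
    for label, mask in cutoffs.items():
        out[label] = (minors.loc[mask]
                            .groupby("week")["user_id"]
                            .nunique())
    return pd.DataFrame(out).fillna(0).astype(int)
```

Notice that everything sensitive stays aggregated: the output is four counts per week, not anyone’s browsing history. That is the point of a minimum viable query.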

Bob: Gasp! Is what I have to say to that. I can’t imagine how parents would react to seeing such a fever chart. That’s an interesting example.

Frances Haugen: And so our goal with Minimum Viable Queries is to say, “Hey, we wanna paint a picture for you that there are happy mediums.” A big part of the work we wanna do for our nonprofit is showing people there are happy mediums. When we actually have structured conversations about how to move forward, we don’t have to go to extremes. We don’t have to say, “No one can ever use social media again,” or “Nothing can ever be taken down,” or “Everything must be taken down.” We talk about a lot of these issues in absolutes. And the reality is there are lots and lots of shades of gray, and if people were aware of all the colors they could paint with, they might be a lot more excited to imagine a better and safer world.

Bob: So is there an institution that is ready to take on the kind of role you are describing? A government or academic institution? What do you think?

Frances Haugen: So my hope is that as part of the Digital Services Act … one of the things that’s guaranteed under that law is researcher access for data from social platforms. And my hope is that we will be able to get data, at least, probably, regarding the European Union, and we won’t necessarily know how things operate in the United States, but that we’ll be able to work through that system to establish norms around… this much data needs to be freely available.

You know, one of the things I always like to say to the people who are saying, “We can’t act, we have to protect these companies” … if you believe in innovation, if you believe in the free market, you can only have consumers making real choices, and being forces for driving companies towards change, if customers are actually informed of the choices that are being made. In the United States, when you buy an appliance, when you buy a car, there’s usually a sticker on it that informs you of how much money you’re gonna have to spend putting gas in that car, or paying for electricity to run that refrigerator or that air conditioner. Imagine a world where we had a baseline set of information that allowed people to say, “Oh, interesting … this app has a very low rate of people who say they regret using it.” That sounds really basic, right? But right now there’s no financial interest for advertising-supported apps to help you stop using them at the point where you no longer enjoy them, right?
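
To show how modest that “sticker” could be, here is a minimal sketch of the disclosure record it implies: a regret rate computed from survey responses. The field names and numbers are invented for illustration; no platform publishes anything like this today.

```python
# Hypothetical "appliance sticker" for an app: a tiny disclosure record
# built from user survey responses. Field names are illustrative only.
from dataclasses import dataclass

@dataclass
class AppLabel:
    app_name: str
    surveyed: int    # number of users surveyed
    regretted: int   # answered "I regret how much I used this app"

    @property
    def regret_rate(self) -> float:
        """Share of surveyed users reporting regret, 0.0 to 1.0."""
        return self.regretted / self.surveyed if self.surveyed else 0.0

label = AppLabel(app_name="ExampleApp", surveyed=2_000, regretted=340)
print(f"{label.app_name}: {label.regret_rate:.0%} of surveyed users report regret")
```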

In the case of something like Netflix, Netflix has an active financial interest in you not using it beyond the point where you’re satisfied, because they pay for the bandwidth. You know, they pay for the licensing. But when it comes to things like TikTok, as long as you come back, they don’t care if you used it so much you regretted it the next day, or you used it so much that you’re gonna do badly in math class when you come in.

Bob: I’m pretty sure this is true. I once heard Reed Hastings say that his biggest competitor was sleep.

Frances Haugen: Yes, yes. Great. Great quote.

Bob: So we’ve been discussing the need for more research about social media and its impact on people. And I know you gave a talk at Duke University recently and a project has come out of that. So can you describe to me what it is you’re going to be doing with Duke University?

Frances Haugen: So Duke is one of, right now, two academic partners that we are actively collaborating with as part of my new nonprofit, Beyond the Screen.

So right now we can each only see what happens on our own phones, our own screens. We need to work together in order to see beyond our screens. One of our first projects is something called Standard of Care … the idea that right now we don’t have norms around what we deserve to get from social platforms. What does it mean to be doing a good enough job across a range of social problems? And this summer we’re gonna be working with a handful of fellows at Duke and at Georgetown University on documenting harms, specifically harms to children and families, because we’re at a very early phase. But we’re gonna branch out to more harms this fall and identify what we call levers.

So these are generalized techniques for preventing or mitigating harm. The motivation for this project is … we’ve watched a lot of conversations between people who understand the social problems of these platforms and the people who understand the technology, what’s driving those problems or what might solve them. And it means that — often — conversations about how to move forward are artificially fast.

I’ll give you an example. A common lever across many harms to children is: we should keep under-13 year olds off these platforms. That’s the law today, and it’s explicitly in the policies of these platforms. But they don’t enforce it. And it’s because … if you are the first to move on actually getting under-13 year olds off your platform, and no one else moves, you’re giving up the next generation. So people who understand challenges for kids, like problems children are facing, often say things like, “We should check IDs. Every social media account should have to have a driver’s license associated with it.” And the reality is, that doesn’t actually work. I could give you a 20-minute tech talk on why that doesn’t work. But it also has a bunch of civil liberties problems as well. But if you asked a technologist, “Hey, I have this lever. I need to figure out how to keep under-13 year olds off my platform,” they would say, “Oh, cool, here’s 10 or 15 different strategies for finding an under-13 year old.” Some of them are really, really basic. It’s things like … kids aren’t very clever, and they’ll put things like, “I’m a fourth grader at Jones Elementary” in their bio. You can find this on Instagram today, which shows you how little effort is going into this. Or kids do things like report each other, to punish each other. So you’re mean to me on the playground, I report your Instagram account. Honestly, as punishment. And if you can find even, you know, 1,000 kids, 10,000 kids, who are under 13, you can use machine learning techniques to find all the rest.
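
To give a sense of what “find all the rest” could look like in practice, here is a deliberately simplified sketch. It assumes a hypothetical set of account bios with a few seed labels, from giveaway bio text or peer reports as Haugen describes, and trains a basic text classifier to flag similar accounts for review. The data and names are invented; a real system would use many more signals than bio text.

```python
# Simplified sketch: expand a small set of "seed" under-13 labels to
# the rest of the accounts with a text classifier. Data is invented.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression

# Seed labels, e.g., from giveaway bios or peer reports. 1 = under 13.
seed_bios = [
    ("i'm a fourth grader at jones elementary", 1),
    ("4th grade!! i love minecraft and recess", 1),
    ("middle school is hard lol", 1),
    ("vp of marketing, coffee enthusiast, mom of two", 0),
    ("phd student studying economics", 0),
    ("retired teacher, gardening and grandkids", 0),
]
texts, labels = zip(*seed_bios)

vectorizer = TfidfVectorizer(ngram_range=(1, 2))
X = vectorizer.fit_transform(texts)
clf = LogisticRegression().fit(X, labels)

# Score unlabeled accounts; high scores get routed to human review,
# not automatically banned.
unlabeled = ["im in 5th grade and my teacher is mean",
             "software engineer in seattle"]
scores = clf.predict_proba(vectorizer.transform(unlabeled))[:, 1]
for bio, score in zip(unlabeled, scores):
    print(f"{score:.2f}  {bio}")
```

The point of the sketch is the shape of the technique, seed labels plus a classifier, rather than the specific model: with thousands of confirmed examples instead of six, the same approach scales.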

So the role that the Duke Fellows will be playing is they’re helping us to do the research around documenting what harms exist for children and families, so that we can have a shared set of facts … a parent can go to one place and learn about what the issues are with social media. And they’re helping us to identify levers so that this fall we can begin collaborating with technologists around enumerating, “What are the many strategies for pulling each of those levers?”

Bob: So this is fascinating to me. I know for credit card companies, for example, they do an awful lot of things to see whether or not a transaction is fraudulent. They don’t catch all of them. But they catch thousands and thousands of them. And I think what you’re telling me is Instagram and Facebook do almost none of those things right now. And with those kinds of tools they could at least eliminate 99% of underage users.

Frances Haugen: Totally, totally. And that’s just one example, right? If we were gonna talk about bullying, if we were gonna talk about eating disorders, there’s a lot we could do … To be clear, in the case of a lot of those problems, it doesn’t have to just be conversations about prohibiting content. It can be things like … if we know one of the drivers, one of the things that causes issues with eating disorders on social media for kids, is that the algorithms progressively push people towards more and more extreme content, one strategy could be … a lever could be, “Hey, what if we gave people more control over what they saw?” What if we told them, “Hey, we noticed you’re looking at more content that people associate with eating disorders. Do you want to continue looking at that content?”
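
As a thought experiment, that lever is simple enough to sketch in a few lines: track what share of a user’s recent views carries a sensitive-topic flag, and ask before continuing past a threshold. Everything here, the threshold, the window, the flag itself, is a made-up illustration, not a description of any platform’s system.

```python
# Toy sketch of a "friction" lever: if too much of a user's recent
# feed carries a sensitive-topic flag, pause and ask before showing
# more. Threshold and window size are invented for illustration.
from collections import deque

class TopicDriftGuard:
    def __init__(self, window: int = 50, threshold: float = 0.4):
        self.recent = deque(maxlen=window)  # rolling window of views
        self.threshold = threshold

    def record_view(self, flagged: bool) -> None:
        self.recent.append(flagged)

    def should_prompt(self) -> bool:
        """True when flagged content dominates the recent window."""
        if len(self.recent) < self.recent.maxlen:
            return False  # not enough history yet
        share = sum(self.recent) / len(self.recent)
        return share >= self.threshold

guard = TopicDriftGuard()
for i in range(60):
    guard.record_view(flagged=(i % 2 == 0))  # half the views flagged
if guard.should_prompt():
    print("We noticed you're seeing a lot of this content. Keep going?")
```

Note that this lever never removes a post; it just gives the user a moment of control, which is exactly the distinction Haugen draws between prohibiting content and changing the dynamics around it.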

There’s a lot of different ways to solve these problems, and we believe the first step to having conversations about what the floor should be, about what it means to be responsible, is just making it even possible for us to sit around a table and look at a shared menu of possibility. And so the fellows at Duke will be helping us begin to flesh out the framework where we can then put in that menu of possibility.

Bob: I know it’s early, but is there any way you can drill down just a touch on precisely the kind of research they’re going to do or how they might quantify things?

Frances Haugen: Sure. So this summer, they’re working on … it’s interesting … being able to go to a place and get a centralized summary … right now I’d say the best analogous kind of resource is what Jonathan Haidt at NYU has done: what he calls a collaborative literature review around, what research do we know about children’s mental health and social media? So he’s invited academics, he’s invited experts in the space to submit whatever research they find, documenting what we know about the causality of social media hurting kids’ mental health. Imagine if we could have a similar shared set of facts … a consensus reality around a broader set of harms on social media. So instead of just having that be about kids and depression, kids and anxiety, we could have it on eating disorders, we could have it on bullying, we could have it on child predation … Or, a growing problem is what’s known as self-exploitation of children online. That’s where kids … the Wall Street Journal did a very, very good exposé on this recently … it’s where kids post revealing images of themselves, sometimes for sale online. You know, what if we could sit around and have an ability to form a shared perspective on what the problems are?

So they’re gonna be documenting what is known in a bunch of these different problem spaces, and they’ll be helping to identify and brainstorm what levers would address these harms. Because that way, in the fall, when we enter the next phase of this specific slice of the project, we can begin recruiting help from technologists and say, “Hey, can you help us identify what technology exists today to pull these levers?” Because it makes it possible for things like … there are some number of lawsuits right now around kids’ safety and social media. This research could be used for helping articulate remedies, to say, “Hey, we know these levers exist for making kids safer. Part of our lawsuit is saying you have to actually pull these levers, and you have to give us enough data to show us that you’re pulling these levers.” And our hope is that by beginning to crystallize this conversation, we’ll be empowering that ecosystem of accountability to be able to actually do its job in making sure these platforms are aligned with the public good.

Bob: We’re running out of time and I don’t wanna ask you to talk about the entire world of whistleblowing. However, I just wanna ask you: if you could tell 2021 Frances anything, what would you tell her right now?

Frances Haugen: Great question. You know, it’s interesting. I’m always really careful when I talk about whistleblowing, because I feel like I had such a charmed whistleblowing experience, right? I think there are very few people in the world who are fanboys of Mark Zuckerberg. There are many, many fanboys of, say, Elon Musk in the world. So it meant that when I crossed this big company, I didn’t really get any blowback. That’s not gonna be universal for everyone. I think the thing that I would tell my 2021 self is … you know, someone asked me, “What’s the thing you’re most grateful for about everything that’s happened since you came out?” and … I spent most of my life really trying to avoid attention, right? I’m a very tall person, which makes that harder, because you literally stand out in a crowd. I really didn’t … I didn’t throw birthday parties. I was very happy being a data scientist with a beautiful data set, sitting in the corner, and … Being a whistleblower has forced me to show up in my own life in a very, very different way. And you have to think about: What do I believe in? Why do I keep going? Questions like that. And I’m really grateful for … I think the thing I would tell myself is: it’s real. I know what you’re going through is really scary right now because of all the unknowns. But on the other side … I’m literally talking to you and looking at a blue ocean right now, out my window … there’s a happier world on the other side. And you can know that the world will change, because … I never thought the Surgeon General would put out an advisory on social media. You know, there’s a lot to look forward to.

Bob: Is there a single tool, a tip, a trick, something practical that you might suggest to somebody?

Frances Haugen: Sure. So I have a new book coming out, and I go into a lot of detail around how someone who is thinking about whistleblowing could hypothetically think through the constraints that they’re placed under, right? Because it is a cat and mouse game. The company has different interests than you do, if you’re whistleblowing. And so I would recommend reading through that just to get a framework for things to think about, because it would take too long for me to go through right now.

Most practical … I have a very clean, practical tip. And I always say this. Find one person you really trust and be honest with them because … I have met a lot of whistleblowers since I came out who really were in bad shape when they came out. You know, they suffered in silence for a long period of time before they became whistleblowers.

And so it meant that by the time they were a whistleblower, they had already climbed a mountain. It just was a very private mountain. And for me, I lived with my parents during Covid. And so I didn’t really go through that same traumatic experience of questioning, “You know, am I crazy? Is Facebook unreasonable?” I never had to live through that because, I could sit down with my parents and be like, “Hey, I’m seeing this, this doesn’t make sense, this doesn’t make sense, this doesn’t make sense.” And my parents would be like, “This is unreasonable. You’re not being irrational. You’re actually seeing real things.” And so find one person you can be honest with and be honest with them, because it will be a thing that sustains you and makes you stronger.

Bob: I read an interview with you a couple of months ago where you discussed the role that Covid played in your future as a whistleblower, because you had that time with your parents and they were understanding and listened. That’s a pretty amazing part of the story.

Frances Haugen: It’s interesting. My husband also went and lived with his parents during Covid, and it’s one of these things that’s a shared part of our relationship origin story. Because both of us found … and I think this is a thing that a lot of adults will never get to consider, because, you know, life circumstances don’t align … but Covid was in some ways a really lovely gift, because we got to have time with our parents, and you never know if you’re ever gonna get that again. So I’m very grateful.

 


