This is the third part of the transcript from part 2 of our podcast, “What is the Original Sin of the Internet.”
Tim Sparapani is a really engaging speaker, and I’m sure you’ll enjoy listening to him more than reading his comments. But if you’d like to skim what he told me, read below. He stressed to me that Facebook had almost two decades to prepare for the messes it would, predictably, make in the world of discourse. Instead of investing in solutions, it just kept cashing checks. And that’s how we got to today’s dark place.
Among his greatest hits in this chat:
- Since humanity learned how to use fire, or that wheels make things easier to move, there have been good and bad things about every technology. Every new invention is a double-edged sword. And because we have seen so many technological revolutions come, we should understand and take precautionary action at the outset with every new technology.
- And you have to, as a society, in my opinion, understand and predict the bad things that are going to happen. These things are not impossible to understand. They’re cyclical. We see this with every new technology. We can anticipate the problems because we have seen them with the dawn of every new technology. That doesn’t mean we should stop the march of technology, quite the opposite. It means that we need to be getting smarter as a species and as a society so that we can embrace technology and do the things that we need to do to make it better, so that we get all the good and a lot less of the ill from any new technology. I don’t think it’s that hard. We just failed.
- Facebook had the last 17 years to spend money on building capacity to deal with the fact that people have less common sense than we expect, right? Or that virality will take hold, or that deepfakes will be easily produced, or that foreign governments with malicious intent will use the American public’s own scalability against it to create division. And so Facebook failed. Not because they’re a publicly traded company, but because they engaged only in short-term thinking instead of long-term investment; they built community standards, processes to review content at scale … only grudgingly.
[00:21:32] Bob Sullivan: If speech is being pushed by robots, it’s very hard for people trying to create real speech to win the day. That concept … it turns a lot of free speech thinking on its head. And this is another theme I keep hearing from my guests: we’ve tried so hard to carry over laws and concepts from the past into the digital age, to extend these imperfect metaphors.
And sometimes they just don’t fit. You won’t find a bigger free speech advocate than Tim Sparapani. He spent the first part of his career at the ACLU. Then he went to go work for an outfit he calls the biggest free speech experiment in history: Facebook. Like Ari, he now finds himself saying things he never expected to say:
Is there an original sin of the way that the internet works? And if you believe that, what might it be?
[00:22:28] Tim Sparapani: It’s a fabulous question. I think we trusted too much in the notion that good speech would drown out really terrible speech.
I was raised as a committed civil libertarian, with the notion of the marches in Skokie, Illinois emblazoned in my brain. Which was to say that, as a country, we decided to make the choice to let awful, hateful, shabby speech go forward and not in any way crimp people’s First Amendment rights.
With the understanding that enough people’s common decency, uh, would be brought to bear, and the weight of public opinion would be such that awful, hateful, um, violent speech would be so publicly derided that we would make sure that everyone understood how sick, wrong, confused, um, erroneous, terrible it is.
When we set up Section 230, that was the understanding … that that line would hold, that the common decency and, frankly, the common sense of the American public would be enough. Turns out that’s maybe not necessarily the case. The ability for people to do deceptive things, for people to manipulate, for people to end up in echo chambers is so strong that we may need another mechanism over and above transparency. And it hurts me to say that. I want to believe that there’s common sense out there, but it turns out that common sense isn’t so common.
[00:24:52] Bob Sullivan: There’s that idea again, the notion that things would somehow fix themselves, that the market would correct itself. That’s a driving force behind so much tech development. But if you believe in the concept of an original sin, that there are flaws baked in, fundamentally part of the digital age, then you believe the internet isn’t going to fix itself. I’m not saying Tim believes that, by the way. He bristles at the notion, for example, that social media is fatally flawed.
I was just reading some of the things I’ve seen you say recently, and I heard you describe Facebook as perhaps the greatest experiment in free speech in human history. Do we now need to announce that maybe this is a failed experiment?
[00:25:36] Tim Sparapani: No. Not to sing Facebook’s praises, but social media in general … I mean, anytime you can connect one in three people on the planet effectively for free … we have driven down the cost of communication to zero, when it used to cost many dollars just to make a one-minute phone call internationally. Right now we have the ability to communicate for free. This isn’t a failure of communication, and social media itself shouldn’t be castigated or destroyed. It’s not a societal ill. But what didn’t get put in place were guardrails. And there have not been enough people willing to say flatly that the dominant companies have failed.
Facebook speaks freely about having community standards, right? And being a safer place. That was the original understanding, that Facebook was a much safer place than Myspace and Friendster, which, you know, sort of predated it in terms of popularity. And in fact, Facebook was a safer culture relative to those two, um, because it was a real-name culture, and the others were pseudonymous or anonymous.
And so they truly were the Wild West, and anything that could be said was said. But what I think has failed is Facebook’s leadership, right? The desire to extract ever more profit out of an institution, looking at it in a sort of short-term way, um, is a failure, rather than having a long-term view of, of the health of the company they’ve created.
[00:27:20] Bob Sullivan: Right then, as we’re talking, Tim becomes a bit more animated. His time at Facebook was short; he only lasted from 2009 to 2011. Facebook’s initial public offering was in 2012, and it didn’t go well at first. Tim and I have spoken before, and I know he has very strong views about Facebook’s decision to let itself be led by Wall Street’s insatiable hunger for growth. And that led to perfecting the algorithms which seem to drive all our lives today, to turning social media into an addictive tool, and to making the entire internet, maybe all knowledge today, into, as I like to say, a high school popularity contest.
[00:27:59] Tim Sparapani: Virality has been elevated over all other goods. Not only at Facebook, but at every other social media site out there. And that’s led to really short-term thinking, like nanoseconds. I think the other failure is that publicly traded companies, Facebook and other companies like it, are thinking: how do I please investors? And investors, unfortunately, have a time horizon … it used to be years; now it’s quarters, every quarterly earnings update. In some cases, the traders are trading at high frequency and thinking in fractions of seconds as well.
And so being a publicly traded company has not been a good thing for Facebook. It’s probably not a good thing for most social media entities, because instead of thinking about the long term, about the community they created and the community standards that they’ve established, they’re thinking about: how do we drive up shareholder value? And that is, I think, inconsistent with having the community standards that they should have and could have.
[00:29:21] Bob Sullivan: This issue of virality I take personally, because I feel like a failure … to make this about traditional media companies for a moment. Journalism is a popularity contest now. I mean, everything is a popularity contest now. And I thought we all learned in high school that popularity contests are the road to perdition, but here we are.
I feel like we’re all back in high school fighting for popularity.
[00:29:45] Tim Sparapani: I agree with you. Ad-driven media, ad-driven social media … the fact that this is the monetization scheme we’ve landed on has made this, in some ways, a zero-sum game for many people. Right? Where every entity that is seeking ad dollars is at war with everyone else who might be seeking the same ad dollars.
And it leads, not only in social media but across all media, in my perspective, to really poor thinking. Right? And it leads to promoting the things that will incite outrage or be popular, at the cost of anything else.
[00:30:32] Bob Sullivan: Was Facebook’s leadership naive or was there another failure of imagination? Was that social media’s original sin?
[00:30:41] Tim Sparapani: So I actually think the sin is much more grievous from Facebook’s leadership in the last decade. Conscious choices were clearly made to elevate virality above all else. Right? That was not the case when I was there. It certainly became so in the years after my departure. And I think that’s clear based on the algorithmic operation we see, right?
And Facebook has admitted as much, many times. YouTube has admitted as much, many times. All these other social media companies have followed suit, and maybe taken it to the nth degree, right? The ones that are receiving even less public scrutiny.
[00:31:19] Bob Sullivan: And so, with years of warnings from people like Tim and Ari and Jolynn and Richard, Facebook and other tech companies failed to prepare for the moment we now find ourselves in.
[00:31:31] Tim Sparapani: In terms of, you know, playing to the public market, I won’t condemn capitalism outright. I will condemn the leadership again for thinking of quarter-on-quarter profit increases rather than making the sort of R&D expenditure, the capital expenditure, that businesses that expect to be around for a long time make. They build for the future. They spend money now, even when it’s expensive, to have more capacity later to build their widgets. To manufacture, to have the next scientific breakthrough. Facebook had the last 17 years to spend money on building capacity to deal with the fact that people have less common sense than we expect, right? Or that virality will take hold, or that deepfakes will be easily produced, or that foreign governments with malicious intent will use the American public’s own scalability against it to create division. And so Facebook failed. Not because they’re a publicly traded company, but because they engaged only in short-term thinking instead of long-term investment; they built community standards, processes to review content at scale … only grudgingly.
[00:32:59] Bob Sullivan: Tim is as disappointed as anyone, but like Jolynn, he’s not really interested in saying all this was inevitable or couldn’t have been stopped.
[00:33:08] Tim Sparapani: We should get all the benefits of technology. Right. We should have customized things. We should have personalized things. We should have individualized things. That’s part of the promise of the data age. It is one of the great advantages of the data age. The problem, again, is that we failed as a society to put in guardrails around us. We have not said what should be out of bounds. We’ve not put any behaviors in the penalty box. We’ve not, beyond that, built systems that require that when mistakes are made, and they will be made and are made daily, these companies have human beings to respond to inquiries from the public and make things right.
And more importantly, finally, we haven’t put in place redress systems for when companies have harmed individuals. And there, again, there will be harms. And it seems to me that you can’t have an unshackled system, uh, and expect it to turn out well. It turns out that isn’t a good idea.
It’s the same conversation we were having about, um, speech on the internet. And you have to, as a society, in my opinion, understand and predict the bad things that are going to happen. These things are not impossible to understand. They’re cyclical. We see this with every new technology. We can anticipate the problems because we have seen them with the dawn of every new technology.
That doesn’t mean we should stop the march of technology, quite the opposite. It means that we need to be getting smarter as a species and as a society so that we can embrace technology and do the things that we need to do to make it better, so that we get all the good and a lot less of the ill from any new technology.
[00:34:53] I don’t think it’s that hard. We just failed.
[00:34:55] Bob Sullivan: It’s really not that hard. I’m not sure whether I should be inspired or depressed by that thought. I suppose I shouldn’t have been surprised when Tim, like John Perry Barlow, the co-founder of the EFF who I mentioned at the beginning of this podcast, compared the internet to fire. But this time, the comparison wasn’t quite so favorable.
[00:35:18] Tim Sparapani: I am not in the camp of, you know, throwing off the shackles. I’m a civil libertarian, not a libertarian. And so I want there to be more speech. I want there to be more action. I want there to be more freedom in society for people to make the choices they want to make. But with that freedom comes responsibilities. And we can’t pretend it shouldn’t, or it doesn’t.
Since humanity learned how to use fire, or that wheels make things easier to move, there have been good and bad things about every technology. Every new invention is a double-edged sword. And because we have seen so many technological revolutions come, we should understand and take precautionary action at the outset with every new technology.
It seems to me that we are well into the artificial intelligence moment, but it hasn’t, you know, sprung fully into our society. There’s a chance now to do this in advance of all the things that could go wrong with AI; I just use it as an example. And I think we can retrofit most of the technologies. I think we could put in place guardrails around social media. I think we could take things off the table. I think we could require companies who have done ill or have made mistakes to make it better. It’s really not that hard.