This is the second transcript from part 2 of the “What is the Original Sin of the Internet” podcast launched this week. Here, Ari Schwartz — whose career has spanned the Center for Democracy and Technology and the White House and private industry — explains why it’s so hard to get all these competing interests moving in the same direction. He also talks about why it’s easier to get action on cybersecurity issues than privacy issues. (“It’s easier to agree who the bad guy is.”) I think it’s a great discussion. Listen by clicking play below, or visiting the podcast webpage. Or browse the transcript below.
[00:07:51] Bob Sullivan: Okay. So maybe I-told-you-so's aren't often useful, but it-could-have-beens can be. I think in this case they show how inaction has been bad for consumers, bad for technology and bad for democracy.
Ari Schwartz worked in the Obama White House and was one of the first people whose job it was to get America interested in privacy. Before that he was an executive at the Center for Democracy and Technology, CDT. He has plenty of I-told-you-so's to offer. It was hard for him to hide his sarcasm when I asked whether any of these issues we face today were predictable or preventable.
[00:08:28] Ari Schwartz: There was something in 2016 where … I think it was one of the FTC commissioners who is still there … said, “Who could have imagined 10 years ago that we would be in this situation now, where privacy would be where it’s at?” And I had given testimony literally 10 years before that in front of the same committee saying we need a privacy law. It’s only gonna get worse, you know? Uh, so I felt like maybe I wasn’t heard so well in the room.
[00:08:58] Bob Sullivan: Ari’s version of the original sin revolves around the mistaken notion that the internet was a small neighborhood and everyone would just sort of do the right thing.
[00:09:08] Ari Schwartz: I mean, to me, you could see problems even from the beginning of the internet that people thought could be handled through codes of conduct and through technology. Right? I mean, the first spam message was sent in 1978, before there even was an internet. Someone was trying to sell computers, and people thought it was totally crazy. Who would even do that, you know? So they just kind of handled it by codes of conduct, telling people you can’t really do that here. And that was enough. I think that was sort of the attitude. I mean, if you talk to some of the people that created the internet originally, on the security side (and I think this goes for privacy too), the feeling was, well, you know, they’re doing this for the research. This is just a research network, right? We trust each other, and we’ll know who does something wrong. And when it got so much bigger and kind of started to spiral and get bigger and bigger, there was some realization that, um, codes of conduct and technology weren’t going to be able to handle it.
[00:10:18] Bob Sullivan: They were naive about spam, but that was only the beginning. There was also this idea that, hey, everyone who has something to say now has a platform to say it. No more gatekeepers in the publishing world. Who cares about who owns the printing press? Everyone owns a printing press. Everyone can publish a blog. Isn’t that great?
[00:10:37] Ari Schwartz: Right before I got there, they had, um, met with their board and put together a set of first principles that CDT would believe in. And one of those is that everyone should be a publisher. I think that that’s true, and that’s one of the things that made the internet great, that everyone could be a publisher. But I also think there should have been some recognition, and now we see it with media, that maybe there needs to be clearer rules about what it means to be a publisher, even if everyone could do it. And I don’t think we did a good enough job of working out what those rules should be and what you do with people.
[00:11:16] Bob Sullivan: Would it be fair to say maybe people were naive back then?
[00:11:19] Ari Schwartz: I would go all the way back to the seventies and say that we were naive all the way through. I mean, I would admit that I was naive in 1996 about that piece, about everyone being a publisher, and really believing in that. And I still believe in free expression online, but I do think that you can still hold the platforms accountable and have free expression. The question is what that means in terms of holding the platforms accountable, what it means for individuals, what the rules should be for individuals, and how you hold individuals accountable on those platforms.
But I think we didn’t go into quite enough detail about what that should be, or have quite enough imagination about where we could get to.
[00:12:06] Bob Sullivan: This idea of a failure of imagination is an expression I like quite a bit. First popularized by the 9/11 Commission, it applies to a lot of problems, like free speech online. Free speech is already limited by various existing laws. It’s a crime to threaten people, and if you knowingly lie about someone, you can get sued. And those libertarian tendencies, so strong in Silicon Valley, made a lot of people feel pretty sure existing limits were enough. There was no need to make more laws to get in the way of this speech revolution. This was indeed a failure of imagination around misinformation.
Misleading people about the coronavirus isn’t really a crime, but it is dangerous, and so undesirable that as a society, we have to do better than just passively see what happens. The U.S. hasn’t done very well at dealing with these internet-age issues. Jolynn Dillinger in part one talked about Congress being distracted by other emergencies, but Ari offers a slightly different insight into why we’ve avoided dealing head-on with many of these issues through the years: it’s just so much easier to blame somebody else for the problem. As you listen to Ari explain this, keep in mind the built-in cultural differences between the East Coast and the West Coast in America, between old media and new software, between New York and Seattle.
When I was preparing for this interview, I read back on some stuff about the Patriot Act and the encryption wars. The fear on these issues 20 years ago was Big Brother, right? That technology would be used to attack or interfere with democracy through government surveillance. The thing we’re worried about today seems so far afield from that; it’s quite a leap.
[00:13:52] I mean, not that we’re not worried about government surveillance anymore, but today we’re really worried about publishers and misinformation. Right?
[00:13:59] Ari Schwartz: I would say that people were a lot less concerned about, um, how bad misinformation would get, and certainly about other countries trying to push misinformation. But there was concern about the accountability of the platforms for personal privacy at that time, you know, 20 years ago. I remember it was right when the Patriot Act stuff was happening. I gave a talk a few times on the East Coast, and everyone on the East Coast was talking about Silicon Valley and the responsibility of Silicon Valley. And this is while the Patriot Act is happening. I was giving a talk to the Army, at one of the Army’s conferences, about government surveillance and government issues and the concerns around the Patriot Act, et cetera, et cetera. And one of the people stands up and says, yeah, but what about the browsers? How much information can the browser collect about what we do?
[00:14:52] And it seems like they’re pulling all this information together. There’s no accountability for them. We know we have strict laws on government, but where are the laws on the companies, right? And then I flew across the country to give a talk at HP, and the first question I got from the folks there was: you’re saying all this stuff about what Silicon Valley needs to do, what we need to do to improve privacy, and how we need to start to build better practices into what we do. We know what we collect, but what about the census, right? What about all this information that they want to collect from us? You know, it’s almost as if at that time, and this is 20 years ago, they were very focused internally on what they were doing and not seeing the whole picture.
I felt like I had a little bit better view of that, because I was doing both government and the private sector and could be equally yelled at by both sides, at least. But I think it was still hard to get the message across to people: if you continue going down this road and only care about what the other side is doing, only caring about what government is doing, or only caring about what industry is doing, and not caring about the other, we’re going to end up in a pretty bad place.
[00:16:00] Bob Sullivan: That bad place… Well, Ari thinks it traces back to the information bubbles we created a decade ago, a problem we talked a lot about but didn’t really do anything about. (Are you hearing a theme from me?)
These information bubbles led directly to the misinformation problem we have today.
[00:16:18] Ari Schwartz: There had been a lot written, maybe 10 years ago or more, on kind of the bubbles that we were all creating for ourselves through the filters that had been created on social media and, uh, search engines, so that we were only seeing content that we cared about.
And I thought that was really relevant and important at the time. Um, a lot of people seemed to sort of shrug it off, saying, well, but that’s what people want. And I think that has led us down that misinformation road even more than I thought it would at the time.
[00:16:49] Bob Sullivan: When Donald Trump was de-platformed by Facebook and Twitter after the 2020 election, there was an outcry to end Section 230 protections for web companies. Without getting into that grand debate, at least for now, the argument shows just how ill-equipped our system seems to be to deal with these emerging big tech issues. Section 230 is 25 years old. The Privacy Act is almost 50 years old. The rules aren’t keeping up.
Have you ever played a board game with your family where you don’t take the time to read all the rules at the beginning of the game? What happens? Chaos. Lots of chaos, usually hurt feelings. Also, cheaters often win. That’s not a bad metaphor for the internet today.
[00:17:32] Ari Schwartz: I think there was an attitude for a long time that if it’s legal, it’s okay, instead of, uh, looking at what’s moral and ethical. Um, some folks obviously don’t fall in that camp, but I think a lot of folks did. And so there was a lot of, kind of, “Are you saying I can’t do that, or that I shouldn’t do that?” And if it was “shouldn’t,” then sometimes they would go ahead and do it anyway, if it was what they needed to do to make money. That attitude led a lot of people down the road.
[00:18:00] Bob Sullivan: So maybe the real original sin of the internet is that we’ve never really figured out a process to fix it, or at least to make course corrections. It didn’t come with a repair manual, and we’ve not really taken the time to write one. Many people have tried, Ari tried, but on the list of high-priority problems, hackers get the attention. Hackers are sexy.
Hackers create emergencies. Privacy and civil rights…well, these issues just have a hard time competing.
You were having some of these conversations, even when you were working at the Obama White House, what kind of reception did you get when you raised issues like this?
[00:18:41] Ari Schwartz: I guess it depends on what issue and where I was raising it. I think most of the time there was a desire to try to come up with the right solutions, but also to figure out how to go about doing it. I mean, I ended up working a lot more on security than on privacy in the government, and one of the reasons for that is because it was just so much easier to get things done on security, right? When it comes to privacy, there’s a lot of gray. No one’s going to agree on who the bad guy is. And that goes back to that government-versus-private-sector divide, too.
Like, I mean, yeah, I’m collecting this information, but I’m also giving you this service, right? When it comes to security, everyone agrees who the bad guy is; it’s just how to go about getting them and stopping them from doing it, that’s the problem. Right. So, um, it was a lot easier to make progress on policy in security than it is on privacy. That’s not to say that we’ve gotten a lot further in the security area, but it is easier to make policy there.
[00:19:36] Bob Sullivan: And so we’re left with some really big problems, a situation I feel comfortable saying no one is happy with. Where do we begin? What’s the first thing? I mean, if you were in the Biden administration, what would you try to tackle first?
[00:19:52] Ari Schwartz: I do think that the disinformation problem is really a tough one, and one that I think we need to get a handle on, because I do think it’s eating away at our society and the ability to govern. And trying to develop means to identify factual information without censoring information is really the challenge at hand. I think we need to have broader discussions about doing that, and about having civil discussions as well, across the board on all different issues. Um, so I think those are probably the two major issues right now. I still think getting a privacy law, a commercial privacy law, in place would be a good idea.
There’s a lot more support for it now, especially because Europe has done it and California is doing it. Um, so a federal privacy law, I still would put very high on the list.
[00:20:41] Bob Sullivan: Is there something that you would tell Ari Schwartz working at the Obama White House, a piece of advice you know now that you could have given to yourself back then?
[00:20:51] Ari Schwartz: Yeah, I mean, I think, you know, seeing what the Russians have done in terms of disinformation, um, where that has led to just complete falsehoods, and the ability to continue to push falsehoods that has come from fake speech, from robots, et cetera, and what that has done.
You know, because we used to say the way to combat bad speech is with more speech, right? Figuring that the best ideas win. But if more speech is being pushed upon people with robots … it’s very hard for the people that are actually trying to create real speech to win the day. I don’t think I understood that at the time.