‘We haven’t looked at privacy as a crisis’ — The Original Sin of the Internet

This is part 3 of the transcript for the Debugger podcast, “What is the Original Sin of the Internet?”

Somehow we went from bankrupt sock puppets to a fundamental attack on free will to billions of dollars of research designed to find psychological vulnerabilities and hack people’s minds. Bill’s version of the original sin: the collapse of the telecom investment bubble, the dot-com bust, and the rise of data and of AI.

Maybe these things couldn’t have been predicted or stopped, but people have been worried about them for a long time. Jolynn Dellinger was among the first; in fact, she’s one of the co-creators of Data Privacy Day, which I celebrated with Richard Purcell. She was at Intel at the time. Now a professor at Duke, she just celebrated the event’s 11th anniversary. A decade is a long time to make progress on an issue. So I asked her: what’s taking so long?

[00:36:41] Jolynn Dellinger: We’ve had some issues getting things done in Congress. A lot of things have come up demanding Congressional attention. Um, I think honestly, people have not looked at privacy as a crisis. They haven’t seen it as an emergency, as something that needs to be dealt with tomorrow. And when a government is faced with responding to 9/11 and responding to COVID and responding to some of these things that are … that seem more emergent in nature, maybe it’s easier to get things done in that context.

I’m thinking back to the Snowden disclosures in 2013; we did end up with the Freedom Act in 2015. And I think that in large part, that was a response to some of the information that citizens learned during those revelations from Edward Snowden. And so when we feel like something is an emergency, we’re able to address it more quickly. And I think that privacy is more like the environment. I mean, it just requires collective action.

And it’s a slow process.

[00:37:51] Bob Sullivan: Bill talked about the shift away from commercial products to free products. At about the same time as the shift, the first privacy policies were being written. That was now part of the bargain. Okay, you’re not paying, but here’s this list of things we’re going to do with the cookie crumbs you leave behind on our site. I’ve talked a lot about privacy policies in the past. What could be any less productive than devoting more time and more words to privacy policies? We all know they don’t work. Or maybe we don’t?

[00:38:25] Jolynn Dellinger: I think there was a study, a couple of different studies actually, that showed a majority of people think that if a company has a privacy policy, that means the company is protecting their data and not sharing it without permission.

Which of course, as we know, couldn’t be farther from the truth. So when people lack the information to make informed, autonomous decisions about their participation in online services and products, it breaks the market in a way; it makes it impossible for us to act on our values.

[00:39:00] Bob Sullivan: So the words “privacy policy” felt to consumers like, “Oh, I’m okay. They have a privacy policy.”

[00:39:08] Jolynn Dellinger: Right. I mean, maybe we should call it a data use and disclosure policy instead of a privacy policy, because, you know, of course it does affect your privacy, but the representation that privacy is being respected is of course not the case.

[00:39:23] Bob Sullivan: Jolynn went on to cite research showing it would take the average person 76 days to actually read privacy policies for every company they interact with. Well, that was 10 years ago. No doubt the number of days approaches infinity now. But the real problem is these shrink-wrap contracts aren’t really contracts. They’re more like lists of bad things that can happen to consumers as part of the bargain when they use a service for free. Maybe the policy isn’t really the problem at all. Maybe the problem … the original sin … is the use of the word free.

I’m also just really struck by, you know, what is designed as, or masquerades as, a fair bargain, right? Like you’re getting something for free, you know, something you can swipe pictures on or whatnot, and in exchange for that “free,” the cost is … what? Who knows what the cost is? Some infinite license to use my information however you want, forever? I don’t even know how to weigh that as a consumer.

[00:40:27] Jolynn Dellinger: Right. I think that’s right. And you know, when we talk about unfair and deceptive practices, I really think that calling something free under the current circumstances is slightly deceptive. I think the word free is a weighty word in American culture. Um, and it’s certainly persuasive, um, and alluring when it comes to things that we get for free. There’s no free ride. There’s no free lunch. We see all those kinds of sayings because we know to ask: is it really free for you?

We’re a little bit suspicious, and we should be in this case. So if what we’re really doing is a barter transaction, if we’re trading our data for that free service, then I don’t think calling it free is really true.

[00:41:17] Bob Sullivan: I am so glad you brought this up because that’s a much better way to say something that’s been rolling around in my head for a long time.

It may very well be that the original sin of the internet is that it’s, quote unquote, free.

[00:41:29] Jolynn Dellinger: Yes. I mean, again, back to that: I understand the impetus. I understand, when folks are talking about the importance of innovation and the importance of not having unnecessary barriers to entry, um, allowing a company to offer something for free.

Maybe that’s what gets a company enough customers in the first place to really make a go of something. And I get that, and I get that innovation is important. Uh, but that then evolved, right? I mean, some of these early companies that were free didn’t start out with this behavioral advertising model. They took that on in later years as a way to monetize and start making money. Others didn’t, right? For example, WhatsApp in its early incarnation had a subscription model. And so there are different models that people could use, I think, in the case of offering things for free.

I just think we should be more clear here about what it is. But that won’t solve the problem, even if we’re clear. And even if we say it’s a barter arrangement and you’re giving your data, I still think that won’t solve our problem if people don’t understand the value of their data to the corporation. And I think that’s the system that we have right now … a notice and consent system. The idea is that corporations say how they’re using our information, we read the policy, we say if we agree, and then we go from there. That might’ve worked. That might’ve been an okay idea in 1994, when there weren’t quite so many websites, but it doesn’t scale. What our system is doing right now, in my view, is putting the onus on the consumer to figure all of this stuff out. Data brokers? If you don’t want them to use your information, call them up and ask to be removed. I mean, that’s just not a realistic response.

[00:43:39] Bob Sullivan: This has me really thinking the original sin of the internet may very well be that it was free, like television, but Jolynn doesn’t like the way I frame my question here.

[00:43:49] Jolynn Dellinger: I think, first of all, I would look at the term original sin as an interesting one. I think in the religious tradition that believes in original sin, it’s something that you’re born into; you’re automatically just born with original sin. You didn’t do anything to get yourself there. Your job is to get out of it.

I think in this case, that’s not what happened. What I would focus on is that we all have choices. Our government has choices, corporations have choices, developers, technologists, innovators have choices, and we as citizens and consumers have choices.

And when we look at the choices we’ve made, I think that’s what’s gotten us to where we are. So if you’re asking about this technology doing good for the world, I would say absolutely. I think technology does amazing things for the world, and I love it and rely on it every day. But I think we haven’t done the things we need to do to put guardrails around what’s happening there. And I think each of those groups that I mentioned has responsibilities in that way, um, that we just haven’t really lived up to.

[00:45:02] Bob Sullivan: And because we haven’t lived up to those responsibilities, you know, we just kinda moved fast and broke things. Well, we’re asking some people to suffer much more than others.

[00:45:12] Jolynn Dellinger: Going back to safety, I was reflecting that … Facemash … I think that’s what it was called, whatever was the first iteration of Facebook that Mark Zuckerberg made in his dorm room, which he described as a prank, was actually a service to evaluate and compare the women students on campus.

So you have these abuses of information. When women lack control over their information, or others have unnecessary access to their information, there can be significant and substantial harms. Non-consensual pornography, also termed revenge porn, has been an issue that many women have to deal with. And I will point out that, you know, these are women’s issues, but by no means are women the only victims of these issues. Domestic violence, sexual violence, revenge porn absolutely affect men as well. And so I don’t mean to say it’s only a women’s issue, right? But it is an area where, for me, the studies overlap.

[00:46:16] Bob Sullivan: Remember what Richard Purcell said: when the internet was built, it was for scientists to share ideas. Nobody really thought about the harms that could come from it. Nobody wondered if it was a bad idea to make it easy to take pictures of women without their consent and share them with millions of people. Nobody thought about Amy Boyer, who I’ve often called the first person murdered by the internet, who was tracked down by a stalker who bought her information from a data broker. Or about the web being used to attack democracy. Why didn’t anyone think about these things?

One of the things we have to constantly ask ourselves is: how would this conversation be different if there were a more diverse group of people in the room? So as you think through this, you know, dating back to the design of the internet and the way these decisions were made in 1998, 2002, how might things have been different if there were a more diverse set of people in the room?

[00:47:07] Jolynn Dellinger: Yeah, so I think that’s one of the main issues. When we’re looking at what we need to do today, we need to be aware of the ways that lack of perspective, lack of diversity, influenced the design and implementation of the technologies that we’re using. And there are so many amazing people doing work in this area. But getting back to that idea of the responsibilities of technologists, engineers, computer science people, designers, um, the idea is that we can really put some thought and ethical consideration into what we choose to build and how we choose to design technologies.

Um, when we’re looking at facial recognition, for just one example, we know that facial recognition technologies do not work as well on people of color or on women, particularly on women of color. And this matters in life. This matters in the implementation of this technology. And so, just to name a few of the people who are doing work in this area:

Khiara Bridges, Dr. Nicky Washington, here now at Duke, Virginia Eubanks, Cathy O’Neil … There are many, many people in this space talking about how important these issues are, and also talking about the harms that can result from surveillance, and that do result from surveillance: the disparate impact on different groups of people.

And so going back to privacy as an individual right or interest versus a social good, a collective good. I mean, I can’t tell you how many times I’ve had a conversation where people say, “Oh, well, you know, I mean, I care about my privacy, but I don’t care about getting that Nike ad.” And I’m like, well, okay, but that’s not even scraping the surface.

That’s not really what we’re talking about here, because that same technology that serves you that Nike ad is serving you your news. And when we get into surveillance, some people are affected by surveillance more than others, in ways that chill their speech, in ways that prevent them from associating, or make them more reluctant to associate, in certain ways.

Like we’ve seen historically with the surveillance of people involved in the civil rights movement and Black Lives Matter. The participation of those people in the civil rights movement and Black Lives Matter, hearing those voices, is a good for society as a whole. It’s not just about our individual privacy rights. It’s about creating a world where privacy is valued as a social good, to promote expression and to promote participation in democracy. So it’s just a lot more than whether or not you get that Nike ad.

[00:50:26] Bob Sullivan: It is about a lot more than whether or not you get that Nike ad. As we now know, it’s about whether or not technology and democracy can coexist.

It’s about manipulation. It’s about free will. It’s about perhaps the most powerful tool humankind has ever invented, and whether it will be used for good or evil, or whether we’ll try to shove it back into some genie’s bottle. But for me, at this moment, I just keep thinking: did it really have to be this way? In our next episode, I’ll continue asking about the original sin of the internet, and you’ll hear that it wouldn’t have actually been that hard to fix. We just failed.

 

 

