What is the original sin of the Internet?

INTRODUCTION

Perhaps you were a holdout against the notion that something is terribly wrong with the Internet. The events surrounding Jan. 6 — both the plot to attack the Capitol and tech giants’ reaction to it — surely make that position harder to defend. The argument that democracy is under assault via technology seems easy to make.

If the Internet is really broken, what went wrong? Was there a fundamental mistake, some kind of original sin, that led to the mess of disinformation and Facebook angst and techno-exhaustion we all seem to feel? For this big, complex discussion I’ve enlisted the help of Richard Purcell, the first privacy executive at Microsoft; Khiara Bridges from UC Berkeley; Anne Klinefelter from the University of North Carolina; and Jolynn Dellinger and David Hoffman from Duke.

Today’s question about the original sin of the Internet is also the subject of a podcast I’ll be hosting next month.

(If you are new to In Conversation, I am a visiting scholar at Duke University this year studying technology and ethics issues. These email dialogs are part of my research and are sponsored by the Duke University Sanford School of Public Policy and the Kenan Institute for Ethics at Duke University. See all the In Conversation dialogs at this link.)


FROM: Bob Sullivan
TO: Khiara, Anne, Jolynn, Richard, and David

Looking backward isn’t always worthwhile, but sometimes it is. When you make a mistake partway through a long mathematics calculation, you can’t simply erase the final answer and correct it. You have to trace your steps back to the original error and calculate forward anew. I think it’s time we did that with the Web.

So what’s the Original Sin of the Internet? Nearly every business model it supports requires spying on consumers and monetizing them. I recently heard a tech executive describe this version of the original sin as “this notion that widespread invisible collection of your data is just the way the Internet works.” While most people were busy searching out cute cat videos, the Internet was built on the idea that companies would just load up on data, all the time, and keep it forever. As that executive put it, “Nobody really understood at the beginning what that would turn into.”

And I fear that so many other ills — whether it be Covid-19 misinformation, or data-driven discrimination, or our newly fragile ability to hold free and fair elections, or just the inability to be left alone — flow from that error.

What do you think?


FROM: David Hoffman
TO: Bob, Khiara, Anne, Jolynn, and Richard

Bob, thanks for kicking this conversation off. For me, the original mistake in the mathematical equation was the idea that we could allow the collection of large amounts of personal information about people without a commensurate increase in protections to make certain that people at risk are not taken advantage of.

By way of a metaphor, it is as if we decided to throw a party at a club and originally invited fifty people and hired one person to work security. The party was then so successful that we decided to hold it in a sports stadium and to invite 50,000 people but still only hired one person to work security. That is a recipe for making the most vulnerable people at the party into prey. Now let’s say that we allow some people at the party to know exactly where each person parked and to be able to track when they will be walking to their cars and are most vulnerable.

That is the situation we have now with the internet. Individuals have little to no idea what malicious actors can learn about them or how they can use it to harm them and society. Whether the threat comes from buying data from data brokers, from third-party code downloading tracking software to our phones and computers, or from cybersecurity attacks, we have not created law enforcement and regulatory organizations with sufficient authority and resources to protect individuals (and especially poor and non-white communities) from harm. In the U.S. we convinced ourselves that the Federal Trade Commission could enforce its unfair or deceptive trade practices authority with roughly the same resources it had thirty years ago, and that the market would largely self-regulate to punish bad actors. That clearly has not worked. If we had passed a strong U.S. federal privacy law in 1990 and created a metric for scaling the Federal Trade Commission’s resources at a rate similar to the scaling of data collection, then our mathematical solution might be much closer.


FROM: Anne Klinefelter
TO: Bob, David, Khiara, Jolynn, and Richard

You ask what went wrong with the Internet. I agree the business model is deeply flawed, and it is increasingly used to play on our very human insecurities. Even when we pay money for internet services, those interactions allow us to be tracked, profiled, monetized, and manipulated by layers of actors. And it is not just the internet. We are spied upon through all sorts of technologies even in what we still call real life.

No doubt the harms I experience and try to avoid are not as bad as those faced by people with fewer resources than I have. I can afford to limit some of my engagement and create workarounds to distract the trackers. But none of us is an algorithm. And, frankly, the collective harm to our society is as big a problem as our individual harms. So we do need to increase protections for each and all of us.

David is right. We need more regulation to respond, as law does, to new harms.  We are overdue.  Stronger federal privacy law is a great goal, and I support more funding for and encouragement of the FTC, perhaps with broader authority.  I think it is realistic to also support state efforts to identify problems and develop innovative solutions.


FROM: Khiara Bridges
TO: Bob, David, Anne, Jolynn, and Richard

I think that at the dawn of the internet, it was impossible to predict just how huge and vicious a beast it ultimately would become. Because of that, I think that the most dangerous sin of the internet is not an original one; rather, it is our refusal to competently regulate it today. As I type this email, we know that the lack of regulation has led to the destabilization of democracy. The failure to regulate has exacerbated the marginalization of already marginalized groups—like the poor and racial and religious minorities. It has allowed individuals to construct realities—around viruses and vaccines, voter fraud, the issue of child sex trafficking, etc.—that are entirely fact-free. All of these things are crystal clear to us. Yet we have not taken decisive steps to fix them—to atone for the sins of the internet.

I fear that we won’t regulate until we believe that regulation can generate a profit for a few lucky actors. By then, it really may be too late.


FROM: Jolynn Dellinger
TO: Bob, David, Anne, Khiara, and Richard

I think the comments you mention, that this “is just the way the internet works” and that “nobody really understood at the beginning,” both reflect an absence of intentionality and moral agency. These statements echo the old idea that technology drives us and that we are just along for the ride; increasingly, we are hanging on for dear life and dealing with the inevitable wreckage. The Internet is a virtual reality, but it is also a tool, and it is up to us how we choose to use it and how we choose to let it loose in the world. I think part of the original sin of the Internet is our collective failure to realize our responsibilities in this area – as lawmakers and regulators, as corporations, as innovators, as technologists, and as citizens.

There are many different ideas to take up in this line of thought, but for now I will focus on power and the commercial use of data. In unequal relationships, power differentials are easily exploited. While relationships of unequal power are just part of life – parents and kids, employers and their employees, a government and its citizens – implicit duties of care, laws, and regulations often protect the more vulnerable person in the relationship. In the case of the Internet and commercial data, we decided to let the more powerful entities, the companies, decide for themselves how they would treat those affected by their products and services. We didn’t insist on guardrails.

While we can fall back on the idea that “no one really understood at the beginning” how this would turn out, I think it never actually required a feat of imagination to conclude that for-profit businesses that exist to make money would optimize for just that – profitability – and that other values like privacy, justice, fairness, and equality would fall by the wayside, especially where providing any of those things costs money.

Add to this the information asymmetry that necessarily characterizes people’s relationship with online platforms and apps when it comes to the collection and use of data, and you get a situation in which the more vulnerable parties – the individuals – can’t even exercise the one option available to them in a so-called “free” market: an informed choice about participation. If we don’t know what is happening with our data, we cannot vote with our feet, leave or choose another service, or make any other kind of informed decision. Add to this the proliferation of data brokers, businesses that exist to trade in our personal data but that don’t even have a direct relationship with us. How could we possibly be expected to assert our preferences effectively (assuming we could learn enough to have a preference) when there is no relationship in the first place?

Americans are culturally dedicated to the notion of freedom – we love our free market, our freedom of speech, our freedom of choice, and getting things for free (undoubtedly a key to the success of the Internet’s dominant business model). But at the same time we admit there is no such thing as a free ride or a free lunch; we criticize free-loaders; we caution that “you get what you pay for”; and we understand at some level the oft-cited concept that if a service is free, we are not the customer, we are the product.

The power and information asymmetries at play in the context of many Internet services and the prevailing business model built upon the collection and unrestrained use of data make us fundamentally less free in the ways that matter.  We can make a different choice.


FROM: Richard Purcell
TO: Jolynn, Khiara, Anne, David, and Bob

Perhaps the original sin of the Internet is the universal and constant original sin of humanity – we put our own wants ahead of our collective needs.  In truth, why are we surprised that bad things happened as a result?  Engineering the internet took hard work, brilliant minds and naïve souls.  Legal, cultural and social leaders were not only absent, they largely abdicated.  Commercial leaders complied with their self-interest, naturally, and it all began.  I do not take this view lightly, as I was among the complicit.  I accepted the self-interested argument that individuals could control their own personal information.  That, of course, is and always has been a myth.

But, wow, we thought we were gods.  The power of the internet to communicate a tightly controlled narrative is awesome.  We had control and we took it.  Coincidentally, money poured into the technology companies, allowing them broad influence over public policy, a trend that only accelerated with Search, Social Media, Cloud and AI.  Many good people, including some on this thread, worked hard to stem the tide.  Perhaps we accomplished some good, generated some trust, influenced some changes.

Now, we reflect on more than two decades of this asymmetrical power relationship, in which power accrues to those who control the narrative. Can you see the parallels with the rise of religion as a violent force? Or with slavery as an accepted evil? The original sin does not lie with any singular action; it is in us every day. And no singular action remedies our condition. It’s back to basics now. We must follow centuries of teaching. We must exert public control through strong, meaningful legal standards. We have to exert social control through education, with standards of awareness, values and skills. We have to exert individual control through normative expectations and standards. Western democracy is replete with examples of how we’ve fallen down; it’s our turn to show history we can recall our better angels and recover through law, justice and community. We must be impatient, demanding laws that work, fair treatment that matters and self-control that recognizes we are, actually, all equal.



