Tech’s dark side takes the spotlight with tales of the Infocalypse; is there a glimmer of hope?

You should stop what you’re doing and read this Buzzfeed story…along with several other recent dark tech stories.

If you think fake news is bad, wait until fake news video becomes a thing. Or until an entire generation of kids hacked by Silicon Valley’s attention-mining tools comes of age.  Or until Facebook throws up its hands and realizes it simply can’t stop the Frankenstein it has created.

We’re talking about the Infocalypse. That’s a word you’ll probably have to get used to — but only if my story here somehow fought its way through your news feed or spam folder.

A set of big stories recently has laid out the dark future that tech is driving us toward. All intelligent people should digest them carefully. Regular readers know I’ve devoted a lot of ink…er, bits and bytes…to my Restless Project in the past several years. I see a connection between the squeezing of the middle class and always-on technology; I feel like people are living crazy lives, unaware of the larger forces bearing down on them, both from tech and from economics. It scares me. I think it threatens the fabric of our society and the American way of life.

I think people are starting to agree with me.

Here are the pieces you must read:

First, Wired’s piece on Facebook’s existential crisis is worth every word. It’s stunning what’s happened to the company, and the world, in just 24 months. I covered Facebook from the beginning. The firm spent years with Peter Pan syndrome, trying to avoid growing up, and avoid taking responsibility for its outsized role in society. It was cult-like in its insistence on being merely a platform, just a tech company, one that promoted only openness and sharing. Facebook’s utopia, like all utopias, became a nightmare. It seems Mark Zuckerberg might finally “get it,” but it could be too late. From the piece:

Even current Facebookers acknowledge now that they missed what should have been obvious signs of people misusing the platform. And looking back, it’s easy to put together a long list of possible explanations for the myopia in Menlo Park about fake news. Management was gun-shy because of the Trending Topics fiasco; taking action against partisan disinformation—or even identifying it as such—might have been seen as another act of political favoritism. Facebook also sold ads against the stories, and sensational garbage was good at pulling people into the platform. Employees’ bonuses can be based largely on whether Facebook hits certain growth and revenue targets, which gives people an extra incentive not to worry too much about things that are otherwise good for engagement. And then there was the ever-present issue of Section 230 of the 1996 Communications Decency Act. If the company started taking responsibility for fake news, it might have to take responsibility for a lot more. Facebook had plenty of reasons to keep its head in the sand.

The only thing more unnerving than this detailed look at tech missteps of the past is this Buzzfeed story about an even darker future — one where the reality distortion field around us becomes complete. Knight Fellow Aviv Ovadya uses the term Infocalypse to describe what will happen to our perceptions when it’s easy to create fake video to support any point of view. It’s just really hard to un-see things; our brains won’t adapt to this nearly fast enough. If you think “Memo-gate” is hard to sort through, just wait.

“Alarmism can be good — you should be alarmist about this stuff,” Ovadya said one January afternoon before calmly outlining a deeply unsettling projection about the next two decades of fake news, artificial intelligence–assisted misinformation campaigns, and propaganda. “We are so screwed it’s beyond what most of us can imagine,” he said. “We were utterly screwed a year and a half ago and we’re even more screwed now. And depending how far you look into the future it just gets worse.”

That future, according to Ovadya, will arrive with a slew of slick, easy-to-use, and eventually seamless technological tools for manipulating perception and falsifying reality, for which terms have already been coined — “reality apathy,” “automated laser phishing,” and “human puppets.”

For just a *bit* of positive-leaning news, I’ll close with word of Silicon Valley cult escapees who are hard at work trying to build tools to save us from this dark future. Roger McNamee, an early advisor to Mark Zuckerberg and Facebook investor (and sometimes critic), has helped start something called the Center for Humane Technology.

The group is already behind a new campaign called “The Truth About Tech,” aimed at educating kids, parents and teachers about tech risks. Why? They explain in The New York Times:

“The largest supercomputers in the world are inside of two companies — Google and Facebook — and where are we pointing them?” Mr. Harris said. “We’re pointing them at people’s brains, at children.”

Silicon Valley executives for years positioned their companies as tight-knit families and rarely spoke publicly against one another. That has changed. Chamath Palihapitiya, a venture capitalist who was an early employee at Facebook, said in November that the social network was “ripping apart the social fabric of how society works.”

The group also favors smart regulation and, of most interest to me, an effort to appeal to tech workers called a Ledger of Harms — designed to help software developers think more about what they are making, and tools for when they are concerned about products they are being asked to build.

The group appeared last week at a panel discussion in New York called “The Dark Side of Design: A Conversation About Addictive Technology” that was covered by Mashable:

We’ve reached a moment of realization. The conversation around tech’s negative influence has continued to build. Last year, venture capitalists Sean Parker and Chamath Palihapitiya both criticized the negative effects of social media on our lives. Just over the weekend, The Guardian published an investigation into YouTube’s algorithms promoting conspiracy theories.

The tech companies themselves are beginning to react. Last week, in a list of five priorities for creators, YouTube CEO Susan Wojcicki said the company has “a serious social responsibility” to get issues like hate speech and self-harm right. Zuckerberg has been touting his new focus on “meaningful connections” and the idea of “time well spent,” most recently in a call with investors and analysts after another stellar earnings report.

Get reading.  Then, back away from your gadgets for a while.





About Bob Sullivan
BOB SULLIVAN is a veteran journalist and the author of four books, including the 2008 New York Times Best-Seller, Gotcha Capitalism, and the 2010 New York Times Best Seller, Stop Getting Ripped Off! His latest, The Plateau Effect, was published in 2013, and as a paperback, called Getting Unstuck in 2014. He has won the Society of Professional Journalists prestigious Public Service award, a Peabody award, and The Consumer Federation of America Betty Furness award, and been given Consumer Action’s Consumer Excellence Award.
