We’re spying on students. Is that safe or dangerous?

This Seattle Times / AP investigation turned into a really important story.

I spent last week interviewing the parents of a teenager who died by suicide after a terrible encounter with sextortion. The entire episode began and ended in a single, horrible night on Instagram. I’ll tell their full story another day, but the interviews are haunting me. Can’t something be done? Companies like Meta and Alphabet know where I’m having lunch next week. Can’t they stop sextortions in progress?

School districts around the country are already trying — and the results are also haunting me. A fantastic story published this week by The Seattle Times and The Associated Press takes an honest, even-handed look at surveillance software designed to protect kids, and all that can go wrong with it. Their piece begins:

One student asked a search engine, “Why does my boyfriend hit me?” Another threatened suicide in an email to an unrequited love. A gay teen opened up in an online diary about struggles with homophobic parents, writing they just wanted to be themselves. In each case and thousands of others, surveillance software powered by artificial intelligence immediately alerted Vancouver Public Schools staff in Washington state.

Software named “Gaggle” had flagged these posts during a typical school year in that Pacific Northwest city. According to the Electronic Frontier Foundation, thousands of schools around the country use such software. Getting flagged by it is so common that the kids call it getting “Gaggled.”

I’d like to begin my thoughts in a place I rarely admit to: I’m deeply conflicted. For years, people have wasted money on technology solutions for major social problems. Tech companies often overpromise, while buyers are often hoping for a cheap, quick answer that doesn’t exist.

But with those parents in my head, I can’t dismiss the idea out of hand.

Here are just some of the problems.

Many parents and kids are unaware that they are under such intense scrutiny. And, as is often the case with privacy issues, others might be vaguely aware thanks to a fine-print disclosure, but not aware of the extent of the surveillance — they’ve not given informed consent.

Picking through thousands of messages each minute to decide which are dangerous or illegal is a sensitive and challenging task. It’s easy to imagine, for example, that a literature student searching for phrases from classic novels could trigger nudity or violence alerts.

And — this should surprise no one — when The Seattle Times and AP asked questions about the software, they discovered some pretty egregious vulnerabilities that exposed students’ very private data.

Students around the country have fought back against Gaggle and similar software. Student journalists in Kansas have complained that the software violates their free press rights (they are correct). When clever students ran “tests” against the software to see which phrases would trigger it, administrators told them to cut it out.

The EFF points out that companies that sell the software keep their algorithms secret, so parents and students don’t really understand how they are being watched. The software could also be used to out LGBTQ+ students against their will, a potentially dangerous misstep.

And the software isn’t cheap.  The story points out that the six-figure price tag could instead fund a full-time school counselor, maybe more.

It’s important to note that student surveillance software is used — for now, anyway — only on school-issued devices.  That changes the calculus around students’ expectations of privacy.

Now for the obvious and more compelling counter-argument — if a student says they are planning a mass shooting on a school-issued device, or sends a note indicating they are contemplating self-harm, don’t we want to do everything we can to stop that? In fact, isn’t there a burden on schools to do so? Yes, and yes.

These all-or-nothing privacy debates feel coldly academic in a blog post, but they have serious, life-or-death consequences. If you stick around the privacy world long enough, you’ll hear variations of this debate over and over. Imagine, for example, the health secrets we could unlock if all patient medical data were fed into brilliant AI machines.

I’m not arrogant enough to offer an answer about school surveillance software here, though I have observations to share. Firms that sell it should welcome clever experiments by students; they will only serve to improve the product. Parents and students should never feel surprised by it. Disclosures should be clear and frequent. Secrecy around this kind of software sets off my spidey sense, and it should set off yours, too.

Companies that sell the software should open themselves up to rigorous, independent testing for efficacy. I have a sneaking suspicion that it does not work as well as advertised: first, because kids have a way of evading filtering tools, and second, because the chilling effect of always-on surveillance could easily do more harm than good. (Here’s more reading from RAND on that.)

The hope lies in the future. As I often say, tech created this problem, so tech is going to have to solve it. Plenty of tech firms are working on “privacy enhancing technologies” like differential privacy, which offer the promise of data sharing without privacy compromises. Whatever realm you can imagine — fighting terrorism, evaluating mortgage applicants, finding at-risk children — privacy enhancing technologies will be absolutely essential to unlocking tech’s true potential. They will also get us out of these endless debate loops that cast privacy and safety as enemies.

In the meantime, I have to say, I sure wish school districts would invest in great teachers and counselors rather than questionable software.  But I take it on good faith that many people are trying their best.  I don’t want to ever interview parents whose child was killed by software again — though sadly, I’m sure I will.


About Bob Sullivan
BOB SULLIVAN is a veteran journalist and the author of four books, including the 2008 New York Times bestseller Gotcha Capitalism and the 2010 New York Times bestseller Stop Getting Ripped Off! His latest, The Plateau Effect, was published in 2013 and reissued in paperback as Getting Unstuck in 2014. He has won the Society of Professional Journalists’ prestigious Public Service award, a Peabody award, and the Consumer Federation of America’s Betty Furness award, and has been given Consumer Action’s Consumer Excellence Award.
