Erin and Noah, on the run from Erin’s abusive ex-husband, make their way from Ohio to Missouri as they search for a safe place to spend the night after a near-disaster that morning at Noah’s school. Even simple tasks like shutting off Internet service are dangerous and mind-numbingly frustrating. That’s the fiction component of this episode of No Place to Hide.
Then, Alia and I talk about the billion networked cameras that watch our moves every single day of our lives, making all of us constantly play a game of I-Spy. We hear from Brian Hofer, whose little brother was shoved to the ground with a police gun to his head because of a database error around a potentially stolen car. We hear from Catherine Crump about license plate reader technology — and the use of cameras to stop illegal dumping. Finally, we hear about the image above, and explain why the Baltimore Police Department is launching a small fleet of aircraft above the city, with a set of cameras trained on residents at all times. It might be America’s most controversial crime-fighting ploy yet.
A partial transcript is below. Listen to the podcast at iHeartRadio.com or click the play button.
BOB: It’s a basic tenet of U.S. law that police can’t barge into your home or rummage through your belongings unless they have a good reason to do that, like probable cause, or an immediate danger. But right now, there’s no limit to the number of times people can take your picture in public, and store it, and use it for some purpose later.
ALIA: And sometimes, big mistakes are made.
HOFER: Well, last Thanksgiving, uh, two Thanksgivings ago, cause we just passed one a couple months ago. But, uh, my brother and I were driving home. Uh, he lives with me in Oakland. We were driving home for Thanksgiving to go see the family. No problems driving the exact same car back to Oakland. And, uh, in Contra Costa County, the Sheriff’s deputies, uh, get behind me on the freeway, flash their lights, tell me to pull over, uh, don’t say a word to us for quite a while. Um, I’m in a very brightly lit area. The car’s turned off, you know, the dome lights are on.
ALIA: We met Brian Hofer in the last episode.
HOFER: I tried to make it as, you know, easy going as possible, and they just won’t say a word. Four cars are pinning us in. They force us out of the car at gunpoint, uh, handcuff us, put us in different cars. Um, my little brother, for some reason, they really freaked out on. He was totally compliant. Uh, they forced him out of the car as well. And when they went to put handcuffs on him, they ended up throwing him on the ground, put a gun against his head, you know, executioner style. Uh, threw him face first on the ground, handcuffed him, put him in a separate car. They finally came up to me and said, Hey, you know, we got an alert this car is stolen. I was like, that’s ridiculous and I can prove it. And um, I, I had rented the car, there was an app on my phone, and I was like, you know, take the handcuffs off, let me, let me open my phone and I’ll show you. And they said no. They forced me to give over my passcode. I never consented for them to search my phone. Um, and in the meantime, these other officers are opening all the car doors, the trunk, our suitcases, searching all this stuff, again, no consent. And um, they eventually confirmed that yes indeed, I had rented the car, and um, there was no fact pattern or anything, uh, present to suggest we’d ever been violent. They just immediately drew guns.
HOFER: And as I’m sitting in the back of this police car, I see the Vigilant Solutions, uh, monitor, and I know Vigilant are a Bay Area company out in Livermore. They are, you know, a big player in the license plate reader market. And I see this, like, kind of line, uh, of code. And then it says, like, San Jose. And what we sort of found out later is that the car, maybe, we’re actually not even sure it has ever been stolen now, but maybe at one point in time it had been stolen in the past, it was recovered, and nobody ever took it off the hotlist. So, it was just a matter of time, I guess, until this particular vehicle was used by someone, um, that it was gonna, you know, get dinged by a license plate reader, and whoever just happened to be the unlucky person might’ve received the same treatment as me.
BOB: OK, this story is terrifying — it could literally happen to any of us. You could write it off as just a database error. But I think we should take a closer look at the license plate reader technology itself. Next time a police car goes by, notice those Wall-E-looking cameras poking out from the hood, angled at the parked cars. They are automatically collecting plate numbers and running them against a database.
ALIA: And it could be just to collect parking ticket money. But those license plate cameras are everywhere, not just on police cars. They capture plates as people drive in and out of places like New York City. String the data from all those cameras together and they can track our movements, without a smartphone.
BOB: It’s a human rights issue, so much so that the ACLU has been paying attention to license plate readers for years.
CRUMP: You know, the topic always interested me because I grew up in part in what was then the Soviet Union. Um, I wasn’t Russian but my parents were diplomats stationed there. We knew a lot of dissidents, and it’s hard to grow up in that environment without developing maybe an overzealous interest in surveillance and its impact on people.
ALIA: Catherine Crump was a Staff Attorney at the ACLU before she took her current job at Stanford.
CRUMP: In my, uh, first full week of law school, the events of September 11th tragically occurred. The Patriot Act passed in my second month, which vastly expanded the government’s surveillance powers. And I’ve just been fascinated by the subject ever since. Um, and particularly how, as technology makes it easier and easier to engage in surveillance, we’re going to preserve a sphere of civil liberties.
ALIA: So an automatic license plate reader —
CRUMP: — is a special kind of camera. It takes a photograph of every passing car, the license plate, but also sometimes the surrounding area. And it can translate the license plate number into machine-readable text. And that’s useful because then you could compare that plate number against watch lists. So for example, you could have a list of stolen cars, or you could have a list of cars that belong to people for whom arrest warrants have been issued. Um, or you can have sort of more nebulous lists that might raise more concerns, like lists of suspected gang members or lists of, um, maybe people the government suspects may be involved in terrorist activities. And so, um, that way, um, when they drive past the plate reader it can signal an alert. And by the way, I don’t think the civil liberties concern with license plate readers is necessarily that they exist. I think we all agree that we want to locate people who, um, who have outstanding arrest warrants. Um, it’s that the data then gets stored indefinitely, and if you take all of the license plate hits, um, that are collected over long periods of time, you end up with a pretty detailed and granular picture of where people have been. And where people go can actually be quite revealing, right? Where do they go to the doctor? Where do they go to the pharmacy? Who are their friends? When do they stay there, where do they spend the night? So on and so forth. And giving law enforcement agents, or really anyone, access to that type of information without any clear legal standards, um, it’s something to be really concerned about because it has so much potential for abuse.
ALIA: Tracking people over time…sounds a lot like what we talked about with cell phone location data.
BOB: Yeah. Data over time is a big problem. But now that we’ve started with all these cameras, they’re like a virus. They just keep spreading. There’s no vaccine. License plate readers that took photographs were just the beginning.
Marc Groman, Former Senior Advisor for Privacy in the Obama White House, paints the picture of how real time video cameras are filling out this surveillance picture, in a dramatic way:
MARC GROMAN: So podcast hosts and journalists who are looking for the rosy optimistic view, never call me. I’m not the guest for that. Um, you know, I, I sort of see certain things as inevitable, and this proliferation of cameras and other sensors means that we will all be living in this 24/7 observation surveillance world. Um, and I do think that is likely to be altering behaviors and, and chilling conduct. And I think that for many people is really concerning. I talk about this issue with my son in this context, which is that, you know, for teenagers there is nowhere they go, ever, where there isn’t a camera. Everyone has it on…
BOB: OK, I have an aside.
BOB: That I have to say. Which is this diabolical combination of services like Nextdoor and Ring cameras. Now have you seen any of these stories? So Nextdoor is one of these neighborhood email lists, essentially.
ALIA: I use it.
BOB: Sure. And what you see now are people posting videos all the time, saying here’s a suspicious car, here’s a suspicious person. And now you’ve created a public storm about something, and nobody knows who any of these people are or whether they’re criminals or whether it’s just an innocent person walking by. But the, sort of, heightened level of paranoia that all of this creates, combined with the level of false accusations, I think, again, we’re heading towards a police state where we don’t even have to involve the police. We’re doing it ourselves.
ALIA: It’s such a good point. I, I saw a Nextdoor posting in my neighborhood the other day that was from somebody, and he was like– it was a picture that his Ring doorbell had captured of like some, like a blur of a person walking by, right?
And he essentially was like, the gist of the post was, this is a suspicious person. It is our duty as neighbors to post these things and let each other know when we see something suspicious. I just hope that you all do this more.
And I remember being really struck by that and feeling sort of like, that feels, this all just sort of feels wrong. Y’know, because what if we’re wrong? We don’t know. This wasn’t, you didn’t capture this person breaking into your home. You just think they look suspicious.
BOB: And like, let your mind wander to how far you can go and how bad this could get, if everyone starts posting pictures because people look like they don’t belong in their neighborhood.
ALIA: Yeah. It’s a really good point.
ALIA: You know, Bob, probably the craziest story we’ve heard while putting this podcast together was the Baltimore surveillance plane project.
ALIA: The city of Baltimore is so concerned about crime that it is going to put a small fleet of low-flying aircraft in the sky, outfitted with cameras, so they can gather evidence on residents. 24/7. Local authorities say the cameras will cover 90% of the city. Right now, it’s just an experiment, but if it really does cut crime, it could become permanent.
BOB: I know. This permanent eye in the sky thing, on the one hand, sounds like the creepiest thing ever. But, I think I’m going to tell you something that’ll surprise you, Alia.
BOB: For a while now, my neighborhood in DC has had a terrible problem with car break-ins. My car windows have been smashed five times over the years. It’s an awful feeling, leaving your car at night, not knowing if you can drive it the next morning. You probably remember it happened once in the afternoon when I was going to drive to cover a hurricane.
ALIA: Yes, that was a bad day.
BOB: Well, despite all my talk about privacy…I think I’d be happy if cops put up cameras on my street. I want to feel safe parking there.
BOB: And I know that’s a contradiction. Or at least seems like one.
ALIA: But I get it. And I get how people in Baltimore are so frustrated by crime that they might be OK with an eye in the sky.
BOB: I’m not the only one who’s seemingly conflicted. Listen to Catherine Crump talk about cameras around dumpsters in the Bay Area.
CRUMP: A city department is like, well, we want to use surveillance cameras to try to catch these dumpers, and they had to come to this body and sort of debate. And it was really interesting because, cause Oakland’s a liberal city, there was like one liberal interest pitted against another liberal interest. Everyone wanted the dumping to be stopped, right? Um, but at the same time people were concerned about the privacy of the homeless people under the bridge. And then there was a question about whether it worked, right? Cause oftentimes they might see the truck but they wouldn’t get the license plate. And so it raised all these interesting issues about surveillance, right? Is it effective? Is this a strong enough interest to engage in surveillance? How do we mitigate the privacy interest? And how this group of people deals with different issues has just been really, really interesting to watch, right? What is the place of this powerful technology? No one wants to ban it, or not very many people want to ban it, right? I was an ACLU lawyer for 10 years. I don’t want to ban it.
BOB: So, it’s complicated. It’s all about the guardrails around the data. I’m very concerned because I don’t think there are great examples where we’ve succeeded with good guardrails around data. Can we do that? I don’t know. But failure to do that means we won’t be able to use tech to keep us safer, or find cures, or make us smarter.
BOB: But one thing we cannot forget in this discussion of cameras is the chilling effect. What does it feel like to be watched, Alia?
ALIA: I feel…awful. I feel like screaming, or I just want to leave.
BOB: That’s just it. A thing observed, changes. People act differently when they know they are being watched. What Marc Groman says to his teenager, about cameras being everywhere, that’s just awful. Because teenagers…well, all of us…should be free to do the occasional dumb thing without fear of becoming a viral video, or ending up in a cop database, or a Chinese government database.
ALIA: You know, I was really touched by something Brian said. And maybe you should think about this, Bob. A single story of a crime tends to make people overreact. The truth is, in a lot of America, people are pretty safe.
HOFER: You know, we have to teach people to stop being so afraid of each other. Um, it’s, it’s really bewildering to me that nationally, and I know here locally for certain, uh, we are at a 40-year crime low, um, in the Bay Area.
BOB: That’s really, really important. We can’t let fear lead us on this issue. Those who would trade liberty for safety deserve neither, Ben Franklin said. We face that choice now. But it’s even more serious. This episode is about the massive increase of surveillance cameras we can expect in the next couple of years. But facial recognition companies like Clearview are at the tip of a much more important topic – artificial intelligence. We spent the past few decades violating people’s privacy by collecting information – their location, their image, their shopping habits. But the next 10 years will be much more about feeding the information that’s been collected into powerful computers that can predict the future, even change the future. For good, or for bad. Our final episode will examine this turning point. By 2030, will we be enjoying the fruits of all this amazing technology, or will we be living in fear of it?