“I can tell you from a purely anecdotal standpoint, as a person who works with organizations that work with survivors of domestic abuse every day, that nearly everyone that they see who has come in … in addition to suffering what we think of as physical or emotional or mental abuse, it almost always has some technical aspect to it.”
I have long believed that tech companies should be required to test all new products for their potential to be abused: by scam criminals, by domestic abusers, by governments. This should be as standard as cybersecurity penetration testing, or usability testing. We are far from there right now, which is why it’s important that universities and private organizations conduct such tests. I’m excited to share that a group of Duke University students I’ve worked with are doing some of that crucial testing. Eva Galperin was at Duke last week to help organize tests around the use of tracking devices like AirTags and Tile gadgets by abusers, and how victims can protect themselves. I am fond of her term “abusability” and I hope it enters the lexicon in short order.
Many seemingly helpful new gadgets can have a dark side. That small tracking device that might help you find a lost set of keys can also be used by a stalker to track you. No one has spent more time and energy calling attention to these kinds of threats than Eva, director of cybersecurity at the Electronic Frontier Foundation, where she has spent the past 15 years or so working on privacy and security issues, particularly as they pertain to vulnerable populations. She’s co-founder of the Coalition Against Stalkerware and has worked to convince big tech companies to do a better job warning people about potential abuses of their products.
Eva joined me on Debugger in 10 recently to talk about the work she’s doing with Duke — specifically, efforts to warn victims they are being tracked by tracker gadgets. But first I asked her to talk with me about her work on stalkerware.
Click here to listen to Eva Galperin on Debugger, or click the play button below. Below that, you’ll find a transcript if you don’t want to listen to the podcast.
Eva Galperin: So stalkerware is the entire class of commercially available applications that are designed to be covertly installed on a device and that exfiltrate data from that device.
Bob: And what kind of people use this kind of software?
Eva Galperin: Stalkers, as it turns out. So the entire purpose of this software is to non-consensually track someone. So by definition, the people who use this are stalkers, and frequently they are domestic abusers who are abusing someone in their family, or with whom they are in an existing relationship and in a power dynamic in which they are the person with more power.
Bob: And this sounds very dangerous.
Eva Galperin: Indeed it is. It is extremely common for women all over the world to experience some form of abuse in their relationships over the course of their lives. It is super common to see that abuse come with an element of tech-enabled abuse such as stalkerware. The greatest danger to women … women are most commonly murdered by a person that they know and with whom they are in a relationship. And frequently, leading up to this, you see stalking and other forms of tech-enabled abuse.
Bob: I wonder if there’s any hard research on this, but it certainly seems to the naked eye that increasingly tech at least plays a role in that kind of violence.
Eva Galperin: I can tell you from a purely anecdotal standpoint, as a person who works with organizations that work with survivors of domestic abuse every day, that nearly everyone that they see who has come in … in addition to suffering what we think of as physical or emotional or mental abuse, it almost always has some technical aspect to it.
Bob: Why is this software legal?
Eva Galperin: Well, the software is legal for a variety of reasons. The first of which is that writing software is protected very strongly by the First Amendment as free speech. However, using this software for stalking is illegal. But very frequently, the people who use this software simply do not face the consequences because either they’re not caught or if they are caught, the justice system is not particularly hard on them.
The companies that make stalkerware frequently also do a very poor job … and they make stalkerware that is not very secure. Security researchers have repeatedly found entire classes of stalkerware with security vulnerabilities that enable anyone to see the exfiltrated data. And the FTC has actually taken action against those companies, including SpyFone and Retina-X.
A couple of years ago, the FTC even issued a full-on ban against one of these companies, as well as its parent company and its CEO. Unfortunately, a few months ago it became clear that the company had not in fact gone out of business; they had merely turned around, re-skinned their product, and gone back on the market.
Bob: This sounds like the usual cat and mouse game that companies play with the legal authorities. Given that it’s legal to write the software and the software is obviously in use, what kinds of things can victims do to protect themselves from this software?
Eva Galperin: Well, one of the things that I did was I went to the security industry, the industry of people who make antivirus software, which is supposed to scan your devices and tell you whether or not there’s something on there that may be spying on you or that is malicious or that you do not want. I found that the antivirus industry did not take stalkerware very seriously and was not particularly good at detecting it. And I have managed to convince several of the larger antivirus companies to do a better job of detection. And research has shown that over the last several years, the companies that are involved in my coalition — I am the co-founder of the Coalition Against Stalkerware — have done an increasingly good job of detecting stalkerware on devices.
Bob: The essential point here is that antivirus software keeps software you don’t want off your computer, whether it’s malicious or otherwise unwanted. So increasingly, antivirus software is going to warn you that you might be a victim of stalkerware.
Eva Galperin: Yes. Antivirus software is viewed with some suspicion in the cybersecurity industry at large, mostly because of the sort of eternal cat and mouse game between threat actors and the people playing defense. But I have definitely, through my activism, done a pretty good job of improving the track record that AV has when it comes to detecting this class of malware.
Bob: So I know you’re looking into this with students from Duke University. What are you discovering?
Eva Galperin: So, the work that I’m doing with Duke University students is not actually related to stalkerware, though it is related to stalking. What we are doing is testing out the various detection apps for physical trackers such as the AirTag, the Chipolo, and Tile… There are a variety of apps out there that claim they can detect these physical trackers when they are following you. And no one has really done a particularly good job of testing those claims.
Bob: So somebody… a stalker might put one of these new Apple devices on a car or on a person to try to stalk them. And there are new attempts to warn victims that this is happening. How well is that effort going?
Eva Galperin: Well, Apple actually does a pretty good job of warning people when an AirTag is following them, but only if you exist within the Apple ecosystem. So if you have an iPhone, you will get a warning if you are being followed around by an AirTag after somewhere between eight and 24 hours. And it will also start to beep. But if you have an Android phone, if you have the gall to exist outside of the Apple ecosystem, then you actually have to download an app that Apple wrote called Tracker Detect, and you have to manually run a scan in order to see whether or not there is an AirTag in your immediate vicinity. This is also true of Tile. In fact, you have to download a different app and run a different scan. There is also an entire ecosystem of apps that say they will scan for all Bluetooth physical trackers. And those claims have not been thoroughly tested. So, that’s what we’re doing this week.
(Note: Eva discussed the results at her RSA talk on April 24, 2023)
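(For readers curious how apps like Tracker Detect can spot a nearby AirTag at all: these trackers periodically broadcast Bluetooth Low Energy advertisements, and a scanner classifies them by the manufacturer data they carry. The sketch below shows the idea; the constant values are assumptions drawn from public reverse-engineering of Apple’s Find My broadcasts, not from official documentation.)

```python
# Sketch: classify a BLE advertisement as a possible Apple "Find My"
# (offline finding) broadcast, the kind an AirTag emits.
# ASSUMPTIONS: 0x004C is Apple's Bluetooth SIG company ID; 0x12 is the
# payload-type byte publicly reported for offline-finding advertisements.

APPLE_COMPANY_ID = 0x004C   # Bluetooth SIG company identifier for Apple
FINDMY_PAYLOAD_TYPE = 0x12  # offline-finding payload type (assumed)

def looks_like_findmy(company_id: int, mfg_data: bytes) -> bool:
    """True if manufacturer-specific data resembles a Find My broadcast:
    Apple's company ID followed by the offline-finding type byte."""
    return (
        company_id == APPLE_COMPANY_ID
        and len(mfg_data) >= 1
        and mfg_data[0] == FINDMY_PAYLOAD_TYPE
    )
```

A real detection app would feed every advertisement seen during a scan through a check like this, then watch whether the same device keeps reappearing over time, since a tracker that is merely nearby once is not the same as one that is following you.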
Bob: You’re testing whether or not those apps can really find a Bluetooth tracker that might be following you? What is your hunch about how well they work?
Eva Galperin: My hunch is that they don’t work very well.
Bob: It also seems like it’s really demanding on a person who may not have any idea that they’re a target to install all this software and make sure that it’s working correctly and it’s updated, all those things. Right?
Eva Galperin: Absolutely. One of the big problems with the entire notion that you need to download an app in order to detect a physical tracker is that it creates a sort of hellscape in which, if you are worried about physical trackers, you must download every physical tracker’s detection app and then run a scan every time you are concerned that you are being followed, which is just not a reasonable thing to ask of people who are concerned about being stalked. Survivors of domestic abuse are already under a tremendous amount of stress. They are already having a very bad day. And asking them to do this much proactive work, especially when results can be inconsistent and confusing, is simply not a good, effective, or long-term solution to this problem.
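(The per-vendor-app problem Eva describes is, at bottom, a missing shared registry: if vendors published their advertisement signatures, one scan pass could cover every brand. Here is a minimal sketch of that idea; the signature values are illustrative assumptions, not verified specifications.)

```python
# Sketch: one registry of advertisement signatures lets a single scan
# cover many tracker brands, instead of one detection app per vendor.
# ASSUMPTIONS: the company ID and service UUID below are illustrative.

from dataclasses import dataclass
from typing import List, Optional

@dataclass
class Signature:
    vendor: str
    company_id: Optional[int] = None    # manufacturer-data company ID
    service_uuid: Optional[int] = None  # 16-bit advertised service UUID

REGISTRY = [
    Signature("Apple Find My", company_id=0x004C),  # assumed Apple ID
    Signature("Tile", service_uuid=0xFEED),         # assumed Tile UUID
]

def match(adv: dict) -> List[str]:
    """Return vendors whose signature matches one advertisement's fields."""
    hits = []
    for sig in REGISTRY:
        if sig.company_id is not None and adv.get("company_id") == sig.company_id:
            hits.append(sig.vendor)
        elif sig.service_uuid is not None and sig.service_uuid in adv.get("service_uuids", []):
            hits.append(sig.vendor)
    return hits
```

Today no such shared registry exists, which is exactly why a worried person ends up juggling one app, and one manual scan, per tracker brand.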
Bob: What is a better solution?
Eva Galperin: I have an idea. What I’m trying to get the makers of physical trackers to do is to agree on and publish a standard which will allow the makers of operating systems, such as Android and iOS, to build background scanning capabilities for trackers directly into the OS, and which would allow app makers to build capabilities into apps that would also work in the background, and that would consistently work with every kind of physical tracker. What we are currently up against are companies like Life360, which makes the Tile. Life360 did a very good thing, which was that they added a tracker detection capability to the Tile app, and that was great. And a few months later, they turned right around and added a capability to turn that detection off, which they refer to as an anti-theft mode.
Bob: So this is an uphill battle, but this whole area is an uphill battle. These new gadgets get invented, the new software gets invented, somebody says, ‘Oh gee, this would be a great tool.’ And then you come along and say, ‘Have you thought about what this might mean to survivors of domestic abuse?’ Why do we keep going through this same cycle?
Eva Galperin: Well, one of the reasons is that frequently the people who build software and gadgets… build these tools and these platforms for people like themselves. The stuff is largely being built by 30-year-old white guys living in Santa Clara, and they build tools for people whose lives are like theirs… for other 30-year-old white guys living in Santa Clara.
And that is the group of people who are extremely unlikely to be victims of domestic abuse. And so that just doesn’t enter into their threat modeling when they’re putting out their products.
Bob: If I made you king of Duke University student research for a day, what other research projects like this would you want to do?
Eva Galperin: I would actually start by looking into what vulnerable populations are not currently being served by our cybersecurity industry. And we can start there. One of the great things about working with students is that they come from all walks of life, and because they do, they’re aware of all kinds of populations that I don’t even know about. As an example from my own work and from my own office: very recently, one of my coworkers on the public interest team at EFF did a whole bunch of research into the privacy and security problems with childcare apps in the East Bay, in the San Francisco Bay Area. And this was, because I have no children, not a thing that I would ever have come across. I would never have known that this entire category of apps exists, and it would not have occurred to me that they are extremely insecure and have privacy problems.
Bob: Eva, thanks for your time. I really appreciate it.
Eva Galperin: No problem. I am really looking forward to the results of this research. I’m actually really excited about working with Duke students to look into a question that I feel has not been adequately answered. And I’m really hoping that I can inspire students to direct their research in areas where other people aren’t currently looking, because I think that that is a really impactful way to contribute to safety and knowledge and to the information security industry.