$95 million is a headline-making number, especially when it comes as a result of a proposed class action lawsuit settlement in which claimants accused Apple of unlawfully surveilling them through Siri on iPhones and other Apple devices. If the settlement is approved by the district court in Oakland, California, overseeing the case, people who owned a Siri-enabled device between September 17, 2014, and December 31, 2024, and who believe they experienced an “unintended Siri activation,” will be able to file a claim for $20 as compensation.
The lawsuit began in 2019 after a Guardian investigation in which a whistleblower came forward to allege “countless instances” in which Siri-enabled Apple devices, including the Apple Watch, had inadvertently listened in on users. At the time, Apple had staffed numerous third-party contractors to review recordings from those devices, including inadvertently captured audio — though it claimed this was only for purposes of improving Siri, not, as many litigants alleged, selling the data to advertisers.
The company quickly stopped the practice, though not before public debate about whether Siri was really spying on its users became widespread. After all, this wouldn’t be the first time a tech company had been accused of conducting audio surveillance of its users without their knowledge or consent. Not only that, but a nearly identical lawsuit concerning Google Assistant, filed in the same court and likely to result in a similar settlement, is waiting in the wings.
What’s happening with all of these audio devices? Are Amazon and Microsoft listening in as well? And does this all actually constitute a serious breach of privacy?
The answers to these questions are simultaneously simple and complicated. The experts Vox spoke with to find out more told us that the public outcry over Siri’s data collection may all be much ado about, relatively, nothing.
Ah, but it’s the relativity that’s the concerning part. The reason Siri’s data collection may not matter in the bigger picture isn’t because it’s not potentially harmful or unethical.
It’s because it’s just a drop in the bucket.
We probably won’t know the extent of Apple’s audio surveillance
The ongoing anxiety over the potentially invasive practices of large tech companies like Apple and Google may have distorted our understanding of what the Apple lawsuit is about. Alex Hamerstone, a cybersecurity consultant for the security consultancy TrustedSec, told Vox that the lawsuit may be “projecting a lot of people’s concerns about the overall surveillance state.”
In other words, what feels like a major referendum — a chance to hold major tech companies accountable for a serious privacy violation — is really what Hamerstone described as “a niche case.”
The way Siri and similar smart assistant devices are supposed to work is that you use a “wake phrase” to activate them — like, “Hey, Siri.” The problem is that Apple devices (and, allegedly, those of other tech companies including Google and Amazon) can be activated inadvertently in any number of ways, and users haven’t always known when those activations have occurred. The lawsuit against Apple alleges not only that its unauthorized listening was a deceptive business practice, but also that in multiple instances Apple violated confidentiality laws as well as the privacy of minors. The Guardian’s 2019 investigation reported that Siri had allegedly unlawfully recorded everything from confidential doctor’s visits to illegal drug deals to couples having sex.
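To see how an “unintended activation” can happen at all, consider a deliberately simplified sketch. Real wake-word detectors (Apple’s included) run on-device acoustic models over audio, not text, and their thresholds and scoring are nothing like the hypothetical values below — this toy uses fuzzy string matching purely to illustrate the underlying tradeoff: any detector that tolerates imperfect matches (so it can hear you across a noisy room) will sometimes fire on phrases that merely sound similar.

```python
from difflib import SequenceMatcher

WAKE_PHRASE = "hey siri"
THRESHOLD = 0.75  # hypothetical sensitivity; real systems tune this on audio features


def similarity(heard: str, target: str = WAKE_PHRASE) -> float:
    """Crude text-level stand-in for an acoustic match score (0.0 to 1.0)."""
    return SequenceMatcher(None, heard.lower(), target).ratio()


def is_activated(heard: str) -> bool:
    """Fire the assistant whenever the match score clears the threshold."""
    return similarity(heard) >= THRESHOLD


print(is_activated("hey siri"))      # True: the intended activation
print(is_activated("hey syria"))     # True: a near-miss phrase crosses the
                                     # threshold — an "unintended activation"
print(is_activated("good morning"))  # False: clearly dissimilar speech
```

Lowering the threshold makes the assistant more responsive but multiplies false triggers; raising it does the reverse. That tension, not any single bug, is why accidental recordings are endemic to always-listening devices.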
In the Apple lawsuit, claimants point out that Apple, in a 2018 letter to Congress, had stated in no uncertain terms that Siri would never be activated without users’ express consent — arguably a bald-faced lie. But Apple continued to protest that it hadn’t violated the letter of its privacy contract with consumers. While Apple did acknowledge to the Guardian that it made inadvertent recordings and passed “a small portion” of these unauthorized recordings on to its third-party contractors, it never clarified whether it sold any of that data to advertisers, which was one of the biggest allegations made by consumers in the class action lawsuit.
“There is a widespread belief that these devices are listening to you,” Cindy Cohn, executive director of the Electronic Frontier Foundation, told Vox. “People think Facebook’s listening to you, all sorts of things are listening to you, and they’re placing ads based upon what they hear.”
As it stands, we’re not likely to find out if that’s true. As the lawsuit against Apple moved forward, it seemed possible that we would; the court had found as recently as 2021 that “the targeted advertising claims … are rendered plausible by the unique nature of oral communications.” In other words, the court was sympathetic to the view that the private conversations you have are unique; it shouldn’t be possible for you to receive advertising based on those conversations unless your privacy has been violated in a major way.
Under the terms of the proposed settlement, however, Apple doesn’t admit liability — not for recording users without their explicit consent to begin with, nor for any potential misuse of those recordings. And without a trial to force its hand, the company most likely won’t have to disclose what it did with all that data.
“The upshot for all of us is that Apple claims that they did get these accidental recordings, and they used them to make the system better, but that they didn’t use it for other things,” Cohn told Vox. “And we aren’t going to know the answer to that … I would say at the end of this case, we’re no closer to learning whether that’s true than we were at the beginning.”
It’s worth noting that existing US surveillance and technology laws haven’t always kept up with the emergence of “smart” devices. When a similar lawsuit was brought against Samsung and others in 2017 for unlawfully recording users through their smart TVs, a New Jersey federal court ultimately dismissed it on a technicality — though the court also seemed skeptical of the ephemeral nature of the suit’s allegations. After the dismissal in 2018, a new civil suit was brought against Samsung; that case is still making its convoluted way through the courts. As for the forthcoming Google lawsuit, it’s so similar to the Apple one that it will likely have a similar outcome.
One reason it’s hard to pin down tech companies for these types of violations in the legal system is that laws against, for example, wiretapping rely on an outdated understanding of the tech in question. “So much information security these days is contractual and regulatory,” Hamerstone told Vox. “The legislation has not kept up with the speed of technology.”
Another reason is that it’s hard to know what’s been taken from you in this situation; since (as the court noted) Apple was the one in possession of the intercepted recordings, the claimants struggled to articulate what exactly had been intercepted. Cohn noted that Apple had deleted many of the recordings in question, which made it even harder for the lawsuit to move forward effectively.
A third reason the lawsuit may have faced difficulty is that it’s hard to claim you haven’t consented to having your data collected in one way when you’ve likely unwittingly consented to having it collected in so many others.
Your phone doesn’t need to listen to you
Cohn emphasized to Vox that Apple’s insistence that it wasn’t intentionally illegally surveilling users was probably true — because, after all, why would it need to?
“That’s a labor-intensive, compute-expensive thing to do to track us when they’ve got this cornucopia of completely legal, easy to do, low-computational ways of surveilling all of us,” she said.
“I think people are just unaware of how much of a profile marketers and companies have on each of us and how much data we’re sharing,” Hamerstone told Vox. “Generally we agree to do it through these end user license agreements … even if you could get through the legalese, it’s just impossible to read that many words.” Through tracking data like website cookies, combined with your daily purchases, media consumption, and behaviors, companies can find out more about you than you ever realized. They can engage in highly complex predictive algorithmic marketing that makes you feel spied upon. Even when you might think you’re only sharing information in a private conversation that your Alexa just happened to overhear, you could be sharing it in numerous other ways without realizing it.
“Everything is just constantly being collected about us and put in databases,” Hamerstone said. “Your phone doesn’t need to listen to you. It knows what you’re likely to buy and all these other things because of all the other data that we have about everybody.”
“The good news is your phone probably isn’t listening to you in the way that you’re afraid it is,” Cohn said. “The bad news is that doesn’t really matter in terms of [marketers’] ability to place these uncanny ads and make you feel like you’re being watched all the time.”
In case this all makes you feel a bit hunted, you’re not alone; browsers that emphasize privacy control have grown in popularity in recent years. Cohn was adamant that “we need legal protection and that the onus shouldn’t be on individuals to try to protect themselves” from the ever-encroaching reach of data collection into our lives.
“All your devices should come with privacy protections that just work and you don’t have to think about it,” Cohn said.
Cohn also stressed the need for increased privacy laws, more protections for consumers, and an updated legal system that can more readily handle these types of lawsuits. She pointed out that the Apple lawsuit “had to go through a lot of hoops that it shouldn’t have had to go through” and shouldn’t have taken four years just to result in a settlement.
“We need a comprehensive privacy law that’s got real teeth that empowers people to sue,” she said.
The important next step, she said, is to create laws that address not only what the tech companies could be doing with our data, but the invasive things they’re doing already.
“We need to simplify things for these privacy cases and we need to expand the reach of [the law] to include metadata … rather than just having these cases where we’re trying to push the edge.”