A class action lawsuit filed more than five years ago appears to be coming to an end, with Apple agreeing to pay $95 million to settle claims that it eavesdropped on its users.
At the heart of this case, we find the privacy problems linked to modern voice assistants like Siri. While Apple admits to no wrongdoing, the company wants to “move on” from this case.
The class action settlement would compensate US Apple users of Siri-enabled devices with up to $20 per device if they meet certain conditions.
We looked into the case to understand how the issue impacted affected users and what this means for Apple users today.
A case that started in 2019 with a whistleblower
The case, which now affects a large number of Apple users, stems from a 2019 Guardian report in which Apple contractors were accused of routinely listening to “confidential user medical information” and other sensitive information via “accidental” recordings made by Apple’s Siri voice assistant.
In the Guardian’s report, a whistleblower went on the record to express concerns about Apple’s lack of disclosure and the frequency of “accidental Siri activations” that picked up extremely sensitive personal information.
The whistleblower said in 2019, “There have been countless instances of recordings featuring private discussions between doctors and patients, business deals, seemingly criminal dealings, sexual encounters, and so on. These recordings are accompanied by user data showing location, contact details, and app data.”
Apple went into full damage-control mode. That same year, the company issued apologies and announced stricter privacy changes to Siri.
Today, Siri’s privacy runs on Apple Intelligence standards. We will have more on this later.
The action Apple took to improve privacy in 2019, however, was not sufficient for the lawyers leading this class action suit. In January 2025, after years of litigation, a preliminary settlement agreement was filed in federal court in Oakland, California, a state that leads the US in digital privacy laws.
Does this mean Siri listens to my conversations?
Those who have not heard of this legal case may ask themselves how it affects them today. A valid question is, “Does Siri still listen to my personal conversations without my consent?”
The short answer today is no.
Siri recordings have always been activated by wake phrases such as “Hey Siri.” These phrases are the means by which the user consents to Siri operating and recording. The kind of “accidental” Siri recordings this legal case covered has since been patched by Apple.
Apple made privacy modifications to Siri in 2019, and today, Siri’s privacy is managed by Apple Intelligence, the company’s new AI platform. Siri still records what you say when you engage with it, but it stores those recordings on your device under Apple Intelligence’s on-device privacy protections.
Apple spokesperson Nadine Haija recently told The Verge that Siri has been engineered to protect user privacy from the beginning.
“Siri data has never been used to build marketing profiles, and it has never been sold to anyone for any purpose,” Haija said.
Haija explained that Apple “settled this case to avoid additional litigation, so we can move forward from concerns about third-party grading that we already addressed in 2019.”
Legal cases over voice assistant privacy are not exclusive to Apple: Google and Amazon have faced similar lawsuits over their own voice assistants.
The new era of Siri: Apple Intelligence
Apple says that with the rollout of Apple Intelligence, Siri entered a new era.
“Siri learns what you need. Not who you are. What you ask Siri isn’t associated with your Apple Account,” the company claims.
As mentioned, any Siri recording done on iPhone, iPad, Apple Watch, or Apple Vision Pro today never leaves your device — unless you choose to share it.
While Siri’s privacy has been significantly enhanced, users should understand that no system is immune to data leaks, vulnerabilities, or cyberattacks.
Who will get paid by Apple in the Siri case agreement?
While US District Judge Jeffrey White still needs to approve the settlement agreement, conditions have been laid out to define which users will receive compensation.
If the settlement is approved, Apple users need to meet the following conditions to receive a payment:
- Users must be US-based.
- Users must have owned a Siri-enabled iPhone, iPad, Apple Watch, MacBook, iMac, HomePod, iPod touch, or Apple TV between September 17th, 2014 and December 31st, 2024.
- Users would also need to swear under oath that they accidentally activated Siri during a conversation intended to be confidential or private.
- Users and their family members must not work for Apple or any related entity.
- Household members cannot file multiple compensation claims.
Compensation is capped at $20 per Apple device. As with all class action cases, individual payouts will depend on how many users file claims.
The settlement says the alleged claims include violations of the federal Wiretap Act and the California Invasion of Privacy Act (CIPA).
Apple will create a $95 million non-reversionary common fund that will be used to pay all approved settlement claims, notices, and settlement administration costs.
The settlement also gives Apple six months to confirm that it has permanently deleted individual Siri audio recordings collected before October 2019.
Apple would also be required to publish a webpage to “further explain” how users can opt out of certain Siri options.
Final thoughts on the future of AI voice assistants and privacy
The payout amounts for affected users in class action settlements like this one may seem low. Nevertheless, these legal actions are necessary to pressure companies to deploy stronger privacy protections.
While a lot has happened since this case began, and Siri’s privacy has been considerably improved through on-device security and AI, no system is foolproof. Users are also beginning to reject the old marketing-tech model, in which their data is collected and sold to third parties for targeted advertising.

Voice assistants are poised to play a major role over the next several years as AI-driven products are embraced. To secure users’ privacy, innovation in this area must evolve and be strengthened alongside them.
This is an independent publication, and it has not been authorized, sponsored, or otherwise approved by Apple Inc. Siri and iPhone are trademarks of Apple Inc.