Ever since 2019, Apple’s reputation for guarding user privacy has been under scrutiny. A whistleblower revealed back then that snippets of Siri voice commands were being listened to by human reviewers without users’ explicit knowledge. Now, six years later, French consumers are banding together in a sweeping class action lawsuit that could cost Apple dearly if the court finds the iPhone maker in violation of Europe’s strict data protection rules.
The Allegations and GDPR Concerns
The new lawsuit, reported by Le Parisien, is spearheaded by former Green Party deputy Julien Bayou along with attorneys Eva Naudon and Olivia Roche from the law firm Phaos. They argue that Siri recordings were collected and processed without properly informed consent, which runs afoul of the EU’s General Data Protection Regulation (GDPR). Under the GDPR, any processing of personal data must be transparent and rest on a valid legal basis, a standard the plaintiffs say Apple failed to meet.
At the heart of the complaint is the claim that Apple never clearly informed users that Siri voice interactions could be stored, transcribed, and reviewed by humans. While Apple has maintained that the program existed purely to improve Siri’s accuracy and that participants could opt out, the plaintiffs insist that many users had no idea their private conversations were being shared externally. If the court finds that Apple processed personal data unlawfully, GDPR penalties can be steep, reaching up to 4% of global annual turnover.
The Advertising Angle
Beyond privacy breaches, the lawsuit raises the specter of targeted advertising. Plaintiffs suggest that these recorded voice snippets may have been used to refine ad profiles or for other marketing purposes. While Apple has repeatedly denied any marketing use—stating that Siri data was never tied to user profiles or sold to advertisers—the doubt persists.
Julien Bayou, in his interview with Le Parisien, emphasized that European law demands informed consent for any personal data processing, especially if it might fuel ad targeting. “If Apple wanted to train Siri by listening to our conversations, it had to ask permission in clear terms,” Bayou said. “They didn’t, and that’s the problem under GDPR.” Apple counters that only 0.2% of all Siri requests were ever sampled, anonymized, and kept purely for system improvements, assuring users that individual identities could not be extracted from the data.
Potential Payouts and Global Impact
The proposed class action in France is asking for a full refund of the purchase price of every Apple device bought in the last ten years, capped at five devices per claimant. If the court grants that relief, participants could be in line for substantial compensation.
Moreover, the lawsuit argues that aggravating circumstances, such as belonging to a profession bound by confidentiality (lawyers, doctors, journalists), should trigger additional damages. “If someone’s private conversation about a medical condition or sensitive legal matter was intercepted, that’s a serious breach,” notes the plaintiffs’ legal team. They warn that affected professionals could seek extra compensation on top of the base refund.
In the United States, a similar class action reached a $95 million settlement, with Apple opting to pay rather than fight a protracted trial. That landmark deal may have inspired French consumers to go the collective route, hoping to replicate or even surpass the American outcome if European courts prove even tougher on data privacy violations.
What’s Next for Apple?
Apple has already adjusted its Siri review process: it now defaults review participation to off, clearly informs users when they opt in, and allows complete opt-out at any time. Despite these changes, the French lawsuit is focused on past practices, seeking recompense for years in which users arguably were not given adequate notice.
If the court sides with the plaintiffs, Apple could face a hefty judgment and an urgent need to revise global policies to comply fully with GDPR’s transparency and consent requirements. It might also spur similar actions in other European countries, as privacy watchdogs and consumer rights groups continue to scrutinize big tech’s handling of personal data.
For now, French consumers can join the class action via the legal team’s website, potentially securing refunds and additional damages. Whether this case will be Apple’s most expensive privacy battle in Europe remains to be seen, but one fact is clear: the era of silent voice reviews is under legal fire, and the outcome might reshape how virtual assistants manage user data forever.