Siri Is Listening: The Uncomfortable Truth About Voice Assistant Privacy
Apple admitted that human contractors regularly listened to Siri recordings, many of which captured intimate personal moments.
In July 2019, a report by The Guardian revealed that Apple employed contractors who regularly listened to Siri recordings as part of a "grading" program designed to evaluate the accuracy of Apple's voice assistant. The recordings, which included accidental activations, captured deeply personal content — medical consultations, business conversations, sexual encounters, and discussions of illegal activity. The revelation shattered Apple's carefully cultivated image as the tech industry's privacy leader and exposed a gap between the company's public commitments and its internal practices.
Apple acknowledged that the review program involved human analysts listening to a small percentage of Siri interactions to assess whether Siri had correctly understood and responded to each request.
Key Takeaways
- Apple contractors listened to Siri recordings containing medical consultations, business deals, and intimate moments
- Recordings often contained enough contextual detail to identify speakers, despite Apple's claims that the clips were anonymized
- Accidental activations, triggered by sounds resembling the wake phrase, captured and processed private conversations users never intended to record