Microsoft Recall: The Screenshot Feature That Sparked a Privacy Firestorm
Windows' AI-powered screenshot recording tool raised alarm bells among security researchers and privacy advocates before its troubled launch
Microsoft's Recall feature, announced as part of its Copilot+ PC initiative, represents one of the most controversial product decisions in the company's recent history. The feature continuously captures screenshots of user activity at regular intervals, processes them with on-device AI to make the content searchable, and stores the results in a local database. The concept—allowing users to search their visual history of computer use—immediately drew fierce criticism from security researchers and privacy advocates who warned it could create an unprecedented surveillance tool.
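The capture-then-index pattern the article describes can be sketched in a few lines. This is a minimal illustration, not Microsoft's implementation: the screenshot source, the OCR step, and the table schema below are all hypothetical stand-ins.

```python
import sqlite3
import time

def capture_screen() -> bytes:
    """Stand-in for a screenshot API; returns fake 'pixel' data here."""
    return b"Quarterly report draft - confidential"

def ocr(image_bytes: bytes) -> str:
    """Stand-in for on-device OCR; a real system would run a local model."""
    return image_bytes.decode("utf-8", errors="ignore")

# A local database of recognized text, one row per captured frame.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE snapshots (ts REAL, text TEXT)")

# One iteration of the continuous capture -> OCR -> index loop.
frame = capture_screen()
conn.execute("INSERT INTO snapshots VALUES (?, ?)", (time.time(), ocr(frame)))

# Searching your "visual history" then reduces to a text query over the index.
hits = conn.execute(
    "SELECT text FROM snapshots WHERE text LIKE '%report%'"
).fetchall()
print(hits)
```

The design choice that makes the feature useful is the same one that makes it sensitive: everything visible on screen, regardless of its confidentiality, ends up in one queryable store.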
Security researchers moved quickly to test those warnings. Within days of Recall's initial preview, researchers demonstrated that the local database storing screenshot data was inadequately protected, with sensitive information including passwords, financial data, and private messages stored in plaintext.
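The class of problem researchers described can be illustrated with a short sketch: an unencrypted SQLite database is readable by any process running as the same user, with no key or credential required. The table and column names below are hypothetical, not Recall's actual schema.

```python
import sqlite3

# Stand-in for a local activity database written without encryption.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE captured_text (id INTEGER PRIMARY KEY, "
    "window_title TEXT, ocr_text TEXT)"
)
conn.execute(
    "INSERT INTO captured_text (window_title, ocr_text) VALUES (?, ?)",
    ("Online Banking - Login", "username: alice password: hunter2"),
)

# No decryption step is needed: a simple LIKE query surfaces secrets,
# which is why plaintext storage of screen contents alarmed researchers.
rows = conn.execute(
    "SELECT window_title, ocr_text FROM captured_text "
    "WHERE ocr_text LIKE '%password%'"
).fetchall()
for title, text in rows:
    print(title, "->", text)
```

Any malware (or curious co-user) with ordinary user-level file access could run the equivalent of this query against such a database.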
Key Takeaways
- Recall continuously captures screenshots of all user activity and stores them in a searchable local database
- Security researchers found the initial implementation stored sensitive data including passwords in plaintext
- Microsoft was forced to delay the launch and make Recall opt-in after intense public backlash