Microsoft’s ‘Recall’ feature draws criticism from privacy advocates
Microsoft’s plan to introduce an AI-powered “Recall” feature in its Copilot+ PC lineup has raised considerable privacy concerns. But how far those concerns are justified remains an open question for now.
Recall is technology that Microsoft describes as letting users easily find and remember whatever they have seen on their PC. It works by taking periodic snapshots of the user’s screen, analyzing those images, and storing them so that the user can search, in natural language, for things they have seen in apps, websites, documents, and images.
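The article does not describe Recall’s internals, but as a rough illustration of the general pattern it outlines, the sketch below shows periodic screen capture, text extraction, and a local searchable index. Everything here is an assumption for demonstration purposes: the library choices (Pillow, pytesseract), the SQLite schema, and the file name are illustrative stand-ins, not Microsoft’s implementation.

    # Illustrative sketch only: capture the screen, extract visible text,
    # and index it in a local full-text store for later natural-language-style search.
    import sqlite3
    import time

    from PIL import ImageGrab          # third-party: pip install pillow
    import pytesseract                 # third-party: pip install pytesseract

    DB_PATH = "snapshots.db"           # hypothetical local store

    def init_index(path: str = DB_PATH) -> sqlite3.Connection:
        """Create a local full-text index for snapshot text."""
        conn = sqlite3.connect(path)
        conn.execute(
            "CREATE VIRTUAL TABLE IF NOT EXISTS snapshots "
            "USING fts5(captured_at, screen_text)"
        )
        return conn

    def capture_snapshot(conn: sqlite3.Connection) -> None:
        """Grab the screen, extract its text, and store it locally."""
        image = ImageGrab.grab()                     # screenshot of the desktop
        text = pytesseract.image_to_string(image)    # OCR the visible text
        conn.execute(
            "INSERT INTO snapshots VALUES (?, ?)",
            (time.strftime("%Y-%m-%d %H:%M:%S"), text),
        )
        conn.commit()

    def search(conn: sqlite3.Connection, query: str):
        """Look up anything previously seen on screen that matches the query."""
        return conn.execute(
            "SELECT captured_at, snippet(snapshots, 1, '[', ']', '...', 10) "
            "FROM snapshots WHERE snapshots MATCH ?",
            (query,),
        ).fetchall()

    if __name__ == "__main__":
        conn = init_index()
        capture_snapshot(conn)           # in practice this would run on a timer
        print(search(conn, "invoice"))

The privacy questions raised later in the article follow directly from this shape of design: anything rendered on screen, sensitive or not, ends up in the searchable store unless it is explicitly filtered out.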
Photographic Memory?
As Microsoft explains it, “With Recall, you can access virtually what you have seen or done on your PC in a way that feels like having photographic memory.”
Copilot+ PCs will organize information based on relationships and associations unique to each user, according to the company. “This helps you remember things you may have forgotten so you can find what you’re looking for quickly and intuitively by simply using the cues you remember.”
By default, Copilot+ PCs will allocate enough storage to hold up to three months’ worth of snapshots, with the option to increase that allocation.
In introducing the technology, Microsoft pointed to several measures it says it has implemented to protect user privacy and security. Recall will store all the data it captures only locally on the user’s Copilot+ PC, fully encrypted. It won’t save audio or continuous video, and users will be able to disable the feature entirely, pause it temporarily, filter out apps and websites they don’t want captured in snapshots, and delete Recall data at any time.
Microsoft will give enterprise admins the ability to automatically disable Recall via group policy or mobile device management policy. Doing so will ensure that individual users in an enterprise setting cannot save screenshots and that all saved screenshots on a user’s device are deleted, according to Microsoft.
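As a concrete illustration of the registry-backed group policy route, the sketch below sets the policy value that has been publicly reported as disabling Recall snapshots machine-wide. The key path and value name (DisableAIDataAnalysis) are assumptions based on published policy documentation rather than details from this article, and the script requires Windows and administrator privileges.

    # Hedged sketch: write the group-policy registry value that reportedly
    # disables Recall snapshots for all users on the device.
    # The key path and value name below are assumptions, not from the article.
    import winreg

    POLICY_KEY = r"SOFTWARE\Policies\Microsoft\Windows\WindowsAI"   # assumed path
    POLICY_VALUE = "DisableAIDataAnalysis"                          # assumed value name

    def disable_recall_snapshots() -> None:
        """Set the policy value (1 = snapshot saving disabled) under HKLM."""
        key = winreg.CreateKeyEx(
            winreg.HKEY_LOCAL_MACHINE, POLICY_KEY, 0, winreg.KEY_SET_VALUE
        )
        try:
            winreg.SetValueEx(key, POLICY_VALUE, 0, winreg.REG_DWORD, 1)
        finally:
            winreg.CloseKey(key)

    if __name__ == "__main__":
        disable_recall_snapshots()

In managed environments the same setting would typically be pushed centrally through group policy or an MDM configuration profile rather than scripted per machine.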
“You are always in control with privacy you can trust,” Microsoft said.
No Recall data will ever go back to Microsoft, and none of the accumulated data will be used for AI training purposes, according to the company.
Little Reassurance
Such reassurances, however, have done little to assuage the outpouring of concern from several quarters, including the UK’s Information Commissioner’s Office (ICO), about the potential privacy and security risks associated with Recall. The company’s own admission that Recall will capture and save screenshots of sensitive information, such as passwords and financial account numbers, without performing any content moderation has fueled those concerns.
To read the complete article, visit Dark Reading.