Microsoft is adding new security measures to assuage widely publicized concerns over its new “Recall” AI feature. Some, though, still aren’t convinced the company went far enough.
It’s now just eight days until Microsoft releases Recall, a new artificial intelligence (AI)-driven program that will periodically take, store, and analyze screenshots of Copilot+ PCs as they’re being used day-to-day. Recall is supposed to act like a kind of memory bank, allowing users to instantly find and reference things they’ve come across recently: apps, websites, images, and documents.
From the outset, Recall has been criticized as a potential goldmine for personal data theft. The noise got loud enough that, on Friday, Microsoft announced three new security-oriented updates for it:
- In a reversal of its initial stance, Microsoft will now ship Recall turned off by default.
- Users will need to enroll in Windows Hello to enable it, and so-called “proof of presence” will be required to use its primary features.
- Recall data will be encrypted, and only decrypted and accessible once a user authenticates via Windows Hello; a conceptual sketch of that flow appears below.
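In rough terms, that last change amounts to a just-in-time decryption model: snapshots stay encrypted on disk and are only unsealed after the user proves presence. The sketch below is purely conceptual and assumes nothing about Microsoft’s actual implementation; the Windows Hello check is a placeholder, and the encryption library is an arbitrary stand-in.

```python
# Conceptual sketch only, not Microsoft's implementation. It illustrates the
# announced model: snapshot data stays encrypted at rest and is decrypted
# just in time, only after the user proves presence via Windows Hello.
from cryptography.fernet import Fernet


def windows_hello_proof_of_presence() -> bool:
    """Placeholder for a Windows Hello 'proof of presence' prompt."""
    return False  # always denies in this sketch; the real check is enforced by the OS


def read_snapshot(path: str, key: bytes) -> bytes:
    """Decrypt a stored snapshot only after authentication succeeds."""
    if not windows_hello_proof_of_presence():
        raise PermissionError("Windows Hello authentication required")
    with open(path, "rb") as f:
        ciphertext = f.read()
    # The snapshot is decrypted in memory only at the moment it is needed.
    return Fernet(key).decrypt(ciphertext)
```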
Though they may represent a step in the right direction, experts remain skeptical that these changes will be enough to protect users’ most sensitive passwords, photos, personally identifiable information (PII), and financial data from hackers.
Risks in Recall: A Case Study
Many security experts cringed when Recall was announced, but few more than Marc-André Moreau, CTO of Devolutions. He worried that Windows’ newest toy would inevitably capture and store visible passwords from his company’s software for managing remote connections. With those passwords in hand, hackers could easily connect to and manipulate a victim’s PC.
“Looking at documentation for how Recall works,” he recalls, “it literally said that it wouldn’t make an effort of removing sensitive information, credentials, or PII — anything which you would want scraped out, it would just keep in local files.”
Microsoft’s logic, it seemed, was that because Recall screenshots were stored only on the user’s machine, they would remain safe from remote access. “Microsoft has this new chip which makes it possible to do the processing locally, and they thought that everybody would be fine since the data isn’t uploaded to the cloud,” Moreau explains. “But you wouldn’t install a keylogger on your machine just because the files are stored locally. Files can be grabbed by malware. So why would you enable Recall?”
To demonstrate the point, he performed a simple red team exercise. In his telling, “I didn’t have to do much. I just set up an environment, used some tool that somebody made online to force-install it, and then I installed [Devolutions’] Remote Desktop Manager. I clicked ‘view password,’ then ‘record,’ and then I found the database. I opened it, and I could see the extracted password alongside the screenshot that includes the password.”
Here’s Recall capturing temporarily visible passwords from Remote Desktop Manager in a test Azure VM. It’s less effective than I would have thought: the search results are screenshots, and it’s unclear how one can obtain the full OCR text it used for the match. pic.twitter.com/RUiLs57bKz
— Marc-André Moreau (@awakecoding) June 3, 2024
Other researchers have also found simple ways of accessing sensitive data in Recall screenshots. One has already developed and released an open source tool to help speed up the job.
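To give a sense of what such extraction involves, here is a minimal, hypothetical sketch. It assumes, as researchers have described, that Recall keeps the OCR’d text of each snapshot in a local SQLite database readable by the logged-in user; the path, table, and column names below are placeholders rather than the real schema, and this is not the released tool.

```python
import re
import sqlite3
from contextlib import closing

# Hypothetical path; the real store lives in a per-user application data folder.
DB_PATH = r"C:\placeholder\recall_store.db"

CREDENTIAL_PATTERN = re.compile(r"password|passwd|secret|api[-_ ]?key", re.IGNORECASE)


def find_credential_hits(db_path: str) -> list[tuple[int, str]]:
    """Return (snapshot_id, ocr_text) rows whose OCR'd text looks credential-like."""
    hits = []
    with closing(sqlite3.connect(db_path)) as conn:
        # 'snapshots' and 'ocr_text' are stand-in names, not the real schema.
        for snapshot_id, ocr_text in conn.execute("SELECT id, ocr_text FROM snapshots"):
            if ocr_text and CREDENTIAL_PATTERN.search(ocr_text):
                hits.append((snapshot_id, ocr_text))
    return hits


if __name__ == "__main__":
    for snapshot_id, text in find_credential_hits(DB_PATH):
        print(f"snapshot {snapshot_id}: {text[:80]}")
```

The point of the exercise is that nothing more exotic than a file read and a text search is required once malware is running under the user’s account.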
To try to protect his customers, Moreau next looked for a way to exclude his company’s software from Recall by default. He came up short.
Are Microsoft’s New Updates Enough?
Users will now have more control over their data privacy, thanks to Microsoft’s decision to ship Recall turned off by default.
Moreau is skeptical, though, that Windows Hello can be fully and properly integrated into Recall without delaying its preview release, which is mere days away. “I’m in software, things don’t happen that fast,” he says.
Dark Reading reached out to Microsoft for comment on how it will be able to marry Windows Hello and Recall in time for June 18. In response, Microsoft said in a statement: “As we shared in our May 3 blog, security is our top priority at Microsoft, in line with our Secure Future Initiative (SFI), and we are evaluating Recall through that lens. As we implement SFI across Microsoft, we may shift some feature release dates and will update our public roadmaps as this happens.”
Barely a month after that blog post, and after Satya Nadella’s letter “prioritizing security above all else,” some critics see Recall as yet another AI product being rushed to market.
Ironically, AI could well solve these programs’ most pressing security flaws. “I could upload a Recall screenshot to ChatGPT today and tell it to identify the data which looks sensitive, and it will be able to,” Moreau notes. “They could have used their AI chip to help solve this [data leakage] but they didn’t even try. They were too eager to ship.”
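As a rough illustration of the kind of check Moreau describes, the sketch below hands a screenshot to a general-purpose multimodal model and asks it to flag anything sensitive. The OpenAI SDK, the gpt-4o model, and the prompt are illustrative choices only; nothing here reflects how Recall or Devolutions actually processes images.

```python
import base64

from openai import OpenAI

client = OpenAI()  # expects OPENAI_API_KEY in the environment


def flag_sensitive_content(screenshot_path: str) -> str:
    """Ask a multimodal model to list sensitive data visible in a screenshot."""
    with open(screenshot_path, "rb") as f:
        image_b64 = base64.b64encode(f.read()).decode()
    response = client.chat.completions.create(
        model="gpt-4o",
        messages=[{
            "role": "user",
            "content": [
                {"type": "text",
                 "text": "List any passwords, credentials, PII, or financial "
                         "data visible in this screenshot."},
                {"type": "image_url",
                 "image_url": {"url": f"data:image/png;base64,{image_b64}"}},
            ],
        }],
    )
    return response.choices[0].message.content


print(flag_sensitive_content("recall_snapshot.png"))
```

Moreau’s argument is that the same kind of classification could, in principle, run on the device’s own AI hardware before a snapshot is ever stored.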
Source: www.darkreading.com