Monday, 10 June 2024

Microsoft Makes Major Changes to Recall Following Backlash

Microsoft announced major changes to its AI-powered Recall feature following backlash over significant privacy and security concerns.

Recall is designed to take screenshots of users’ activity, making it easier to search for information later using natural language queries. Microsoft said that privacy and security were two of its primary goals, and that Recall stored its information locally—never in the cloud—and that the data could not be exfiltrated by hackers.

Despite the company’s assurances, security researchers quickly found major flaws in the system. For example, the on-device data was only encrypted while a user was logged out. Once a user logged in, the data was decrypted, making it trivial to access the plain-text database where Recall stores the information it extracts from its screenshots. While the feature was designed to require admin rights to access, researchers quickly demonstrated how easy it was to circumvent that basic safeguard.

Read More: Windows Recall Will ‘Set Cybersecurity Back a Decade’

In addition, in its first incarnation, Recall could not be disabled during setup; a person had to complete the setup process before they could turn the feature off.

Microsoft’s Response

It seems Microsoft has heard the concerns and is making changes to Recall before it rolls out later this month. Pavan Davuluri—Corporate Vice President, Windows + Devices—says the company has “heard a clear signal…and improve privacy and security safeguards.” The exec outlines a number of major changes aimed at addressing most of the criticism.

  • First, we are updating the set-up experience of Copilot+ PCs to give people a clearer choice to opt-in to saving snapshots using Recall. If you don’t proactively choose to turn it on, it will be off by default.
  • Second, Windows Hello enrollment is required to enable Recall. In addition, proof of presence is also required to view your timeline and search in Recall.
  • Third, we are adding additional layers of data protection including “just in time” decryption protected by Windows Hello Enhanced Sign-in Security (ESS) so Recall snapshots will only be decrypted and accessible when the user authenticates. In addition, we encrypted the search index database.

Davuluri reiterated some of the privacy and security options Microsoft is building into Recall.

  • Snapshots are stored locally. Copilot+ PCs have powerful AI that works on your device itself. No internet or cloud connections are used to store and process snapshots. Recall’s AI processing happens exclusively on your device, and your snapshots are kept safely on your local device only. Your snapshots are yours and they are not used to train the AI on Copilot+ PCs.
  • Snapshots are not shared. Recall does not send your snapshots to Microsoft. Snapshots are not shared with any other companies or applications. Recall doesn’t share snapshots with other users who are signed into the same device, and per-user encryption ensures even administrators cannot view other users’ snapshots.
  • You will know when Recall is saving snapshots. You’ll see Recall pinned to the taskbar when you reach your desktop. You’ll have a Recall snapshot icon on the system tray letting you know when Windows is saving snapshots.
  • Digital rights managed or InPrivate browsing snapshots are not saved. Recall does not save snapshots of digital rights managed content or InPrivate browsing in supported web browsers.
  • You can pause, filter and delete what’s saved at any time. You’re always in control of what’s saved as a snapshot. You can disable saving snapshots, pause them temporarily, filter applications and websites from being in snapshots, and delete your snapshots at any time.
  • Enterprise and customer choice. For customers using managed work devices, your IT administrator is provided the control to disable the ability to save snapshots. However, your IT administrator cannot enable saving snapshots on your behalf. The choice to enable saving snapshots is solely yours.

See Also: Why Windows Recall Is a Nightmare

It’s Not Enough—Microsoft Must Do More

Unfortunately, while Microsoft’s willingness to adjust course and listen to user feedback is admirable, it should never have taken a backlash for Microsoft to incorporate the improved security Davuluri outlines. Microsoft recently vowed to prioritize security above all else in the wake of devastating security breaches.

If Microsoft is pivoting to security above all else, however, how did the company miss such basic and obvious security measures in its initial Recall implementation? Why did it take outside researchers to discover how easily the system could be compromised and data exfiltrated? Why didn’t Microsoft, a $3 trillion company with vast resources, realize how flawed Recall was by default?

The fact that it took outside researchers holding Microsoft’s feet to the fire for the company to implement acceptable security measures suggests it’s business as usual at Microsoft, and that the company has yet to take security seriously, despite its claims to the contrary.

Windows Central’s Zac Bowden calls Windows Recall “a PR disaster” and says, “Microsoft has lost trust with its users, and Windows Recall is the straw that broke the camel’s back.” He goes on to highlight how Microsoft has squandered its users’ trust and goodwill by constantly trying to monetize its paying users with pushy ads, as well as selling their data to advertising companies. As a result, people simply don’t believe that Microsoft will live up to its security and privacy promises, even if it appears to be doing so now.

Microsoft’s handling of Recall’s security (or lack thereof) is proof the company needs to do far more to win back user trust.



from WebProNews https://ift.tt/AeQZ8kF
