Sunday 2 June 2024

Why Windows Recall Is a Nightmare

Microsoft recently unveiled a new AI-powered feature, Windows Recall, that takes constant snapshots of your system and activity—but it’s being labeled a privacy nightmare.

On the surface, Windows Recall sounds like a cool new way to use AI to make life easier. Forget a website and can’t remember if you visited it in Firefox, Chrome, or Brave? No problem, just look through your snapshots. Trying to remember that statistic that was in a document you can’t seem to find? Just use Recall to find it.

Microsoft describes the feature in one of its Learn articles:

Recall utilizes Windows Copilot Runtime to help you find anything you’ve seen on your PC. Search using any clues you remember, or use the timeline to scroll through your past activity, including apps, documents, and websites. Once you’ve found what you’re looking for, you can quickly jump back to the content seen in the snapshot by selecting the relaunch button below the screenshot.

Microsoft is marketing Recall as a “photographic memory” for your PC. In an interview with the Wall Street Journal’s Joanna Stern, CEO Satya Nadella made clear how excited he is about the technology and the possibilities it offers.

Why Recall Is a Horrible Idea

Despite Recall’s wow factor, there are a number of reasons why users should be wary.

Disk Space

The first, though certainly not the biggest, issue is the disk space Recall consumes. By default, Recall reserves roughly 10% of a computer’s storage. On larger drives, the default allocation grows to about 15%, or 150 GB of a 1 TB drive.

Much of this is because the snapshots are images of your screen, which take up far more space than the text-based records used by other snapshot/rollback methods.

As a result, users need to plan for Recall’s storage requirements if they want to use the feature.
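
As a rough illustration, here is a minimal Python sketch of that storage budgeting, treating the figures cited above (about 10% on smaller drives, roughly 15%, or 150 GB of a 1 TB drive, on larger ones) as assumptions rather than Microsoft’s exact allocation table:

    # Rough sketch of Recall's default storage budget, based on the
    # percentages cited in this article (assumptions, not Microsoft's
    # published allocation table).

    def recall_default_allocation_gb(drive_gb: float) -> float:
        """Estimate the disk space Recall reserves by default."""
        # Assumption: ~10% on smaller drives, ~15% on larger ones
        # (e.g. 150 GB of a 1 TB drive, as cited above).
        share = 0.10 if drive_gb < 512 else 0.15
        return drive_gb * share

    for drive_gb in (256, 512, 1000):
        reserved = recall_default_allocation_gb(drive_gb)
        print(f"{drive_gb} GB drive -> ~{reserved:.0f} GB reserved for snapshots")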

Privacy

Microsoft has vowed to make Recall as private as possible, saying users will be able to disable the feature and restrict what it snapshots. The company outlined a number of default features and restrictions:

  • Filtering out specific websites will only work in supported browsers such as Microsoft Edge, Firefox, Opera, and Google Chrome. You always have the option to filter out all browsing activity by adding an app filter for a browser. To add support for website filtering, developers need to implement Recall activity APIs. (A conceptual sketch of how such a filter could work appears after this list.)
  • Recall won’t save any content from your private browsing activity when you’re using Microsoft Edge, Firefox, Opera, Google Chrome, or other Chromium-based browsers.
  • Recall treats material protected with digital rights management (DRM) similarly; like other Windows apps such as the Snipping Tool, Recall will not store DRM content.
  • To help you access text and images currently on your screen, when you launch Recall or when you select the Now button, your current screen will be displayed in Recall without saving a new snapshot.
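
To make the first point above concrete, here is a minimal, purely conceptual Python sketch of why filtering a specific website requires browser cooperation while filtering an entire app does not. The function and the idea of the browser reporting its active URL are illustrative assumptions, not Microsoft’s actual Recall activity APIs:

    # Conceptual sketch only: none of these names correspond to real
    # Microsoft APIs. It shows why per-site filtering needs the browser
    # to report the active URL, while an app-level filter does not.

    from urllib.parse import urlparse

    FILTERED_APPS = {"banking_app.exe"}   # hypothetical app-level filter
    FILTERED_SITES = {"mybank.example"}   # hypothetical site-level filter

    def should_save_snapshot(app_name: str, reported_url: str | None) -> bool:
        """Decide whether a snapshot of the current window should be kept."""
        if app_name in FILTERED_APPS:
            return False      # app-level filtering always works
        if reported_url is None:
            return True       # browser reported no URL, so per-site
                              # filtering is impossible for it
        host = urlparse(reported_url).hostname or ""
        return host not in FILTERED_SITES

    # A browser that reports its activity can be filtered per site; one that
    # does not can only be excluded wholesale as an app.
    print(should_save_snapshot("chrome.exe", "https://mybank.example/login"))  # False
    print(should_save_snapshot("legacybrowser.exe", None))                     # True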

Right off the bat, there’s a major concern with what Microsoft outlines:

Recall does not automatically stop taking snapshots when you enter a password. While many websites and apps mask the password as you type it, not all do. What’s more, most password fields let you reveal the password so you can confirm you entered it correctly. Throughout it all, Recall keeps snapshotting.

As a result, Recall can store copies of your most sensitive data, everything from documents and pictures to passwords.

While Microsoft says snapshots will only be stored locally, that decision is a policy choice, not a technical limitation.

The situation is reminiscent of Apple’s plan to scan for CSAM locally on iPhones, iPads, and Macs. Princeton researchers had developed such a system before Apple, only to scrap it and warn against anyone deploying similar technology. Their concern was that, once such a system existed, there was no way to prevent governments and organizations from forcing a company to change its policy and expand the scanning, since no technical limitation stood in the way.

To make matters worse, even though Apple realized it could never safely guarantee how its on-device CSAM scanner would be used and abandoned the plan, the damage was done. Because Apple showed such a system could be built, governments around the world are now asking tech companies to implement the very type of technology that Apple and the Princeton researchers abandoned.

The same is true of Microsoft. The current decision to store snapshots locally can be easily changed at some future date, either at Microsoft’s instigation or at a government’s insistence, completely negating any promised privacy.

Security

Microsoft has a long history of security issues, prompting criticism from industry leaders, government officials, a government review board, and more.

Read More: Microsoft’s Security Issues—Why the Company Is Failing

While Microsoft has vowed to put security above all else, its abysmal track record puts the burden squarely on the company to prove it can live up to that promise.

It’s a safe bet that snapshots will quickly become a high-priority target for bad actors, giving them easy access to a goldmine of valuable data. Until Microsoft proves it can keep its users secure, trusting the company to protect data as sensitive as screenshots of all your activity may well be a fool’s errand.

We’re not the only ones who see Microsoft’s security record as a problem in the context of Recall; The Register has made some of the same arguments.

Just Because You Can…

Recall is shaping up to be the poster child for the kind of irresponsible behavior that increasingly characterizes the AI industry. Nadella was clearly enamored with the technology his company is preparing to unleash on Windows users.

Unfortunately, Nadella and company seem to have forgotten the old adage: Just because you can do something doesn’t mean you should.



from WebProNews https://ift.tt/9jQmW1Y
