
Google Chrome quietly altered text in its settings this week. The change removes a direct assurance that its on-device AI model keeps user data away from company servers. Privacy researcher Alexander Hanff spotted the edit days after he exposed how the browser downloads a 4GB Gemini Nano model without asking first.
The discovery has stirred fresh doubts about how Google handles local AI. Users expect on-device processing to mean exactly that. No cloud. No telemetry. Yet the company scrubbed the phrase that made that promise explicit.
Hanff runs the site That Privacy Guy. In his May 8 post he laid out the before and after. Earlier Chrome versions told users under the System section that the browser “can use AI models that run directly on your device without sending your data to Google servers.” The sentence vanished. New wording simply notes the models run on device. Turn the toggle off and features might stop working. Nothing more.
The edit lands at a bad moment. Hanff’s earlier report detailed how Chrome drops the Gemini Nano weights.bin file into a folder called OptGuideOnDeviceModel. It happens automatically on capable hardware. No consent dialog. No settings checkbox labeled “download this 4GB AI model.” Delete the files and Chrome fetches them again.
His analysis pulled from macOS filesystem logs. On a fresh profile the directory appeared at 16:38. The weights unpacked minutes later. Smaller models for text safety and prompt routing arrived too. Chrome checks device specs in its Local State file. Performance class 6. Plenty of VRAM. Then it pulls from Google’s edge servers.
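The gating logic Hanff describes can be sketched in Python. Chrome's Local State file is JSON, but its schema is undocumented; the key names and the performance-class threshold below are illustrative assumptions based on his report, not confirmed fields.

```python
import json
from pathlib import Path

# Hypothetical key path for illustration; Chrome's actual Local State
# schema is undocumented and changes between versions.
PERF_CLASS_KEY = "optimization_guide.on_device.performance_class"
MIN_PERF_CLASS = 6  # threshold reported in Hanff's analysis

def meets_download_criteria(local_state: dict) -> bool:
    """Return True if the recorded device performance class clears the bar."""
    node = local_state
    for part in PERF_CLASS_KEY.split("."):
        if not isinstance(node, dict) or part not in node:
            return False
        node = node[part]
    return isinstance(node, int) and node >= MIN_PERF_CLASS

def load_local_state(user_data_dir: Path) -> dict:
    """Parse the Local State JSON that sits next to Chrome's profile folders."""
    return json.loads((user_data_dir / "Local State").read_text())
```

If the check passes, the browser proceeds to fetch the model; nothing in the flow asks the user first.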
At billion-user scale the numbers add up. Each 4GB download burns roughly 0.24 kilowatt-hours. Multiply that across hundreds of millions of machines and the electricity and carbon footprint grow large. Hanff calculated potential emissions in the tens of thousands of tonnes of CO2 equivalent, called the pattern troubling, and noted similar auto-download behavior in software from Anthropic.
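The back-of-the-envelope math is easy to reproduce. The per-download energy figure comes from the report; the device count and grid carbon intensity below are illustrative assumptions, not sourced numbers.

```python
KWH_PER_DOWNLOAD = 0.24   # rough energy cost of one 4GB transfer (from the report)
DEVICES = 300_000_000     # assumption: hundreds of millions of capable machines
KG_CO2_PER_KWH = 0.4      # assumption: rough global-average grid intensity

total_kwh = KWH_PER_DOWNLOAD * DEVICES                # 72,000,000 kWh = 72 GWh
total_tonnes_co2 = total_kwh * KG_CO2_PER_KWH / 1000  # convert kg to tonnes

print(f"{total_kwh / 1e6:.0f} GWh, ~{total_tonnes_co2:,.0f} tonnes CO2e")
```

Under these assumptions the one-time distribution alone lands around 28,800 tonnes of CO2 equivalent, consistent with the "tens of thousands of tonnes" Hanff cites.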
Google pushed back. A spokesperson told multiple outlets the model has been available since 2024. It powers scam detection and developer APIs. Data stays off the cloud. The company added a toggle in February. Turn off “On-device GenAI Enabled” or the later “On-device AI” setting and the model deletes. No more downloads or updates.
That statement appeared in coverage from Gizmodo and Engadget. Both ran the company’s words in full. The model uninstalls automatically on low-storage devices. Features like real-time warnings against fake sites run locally.
But the removed text raises questions. Hanff listed three possibilities: the original claim was never accurate, an architecture shift now sends some data back, or the company simply wants legal breathing room. None sits well. He argued the assurance counted as a binding representation. Users kept the toggle on because they believed it.
Legal experts may take interest. Hanff pointed to the EU’s ePrivacy Directive. Article 5(3) requires consent before storing information on a user’s device. The Gemini Nano download looks like it fails that test. GDPR articles on transparency and data protection by design enter the picture too. Similar rules apply under UK and California law.
Chrome’s market share makes the stakes high. The browser sits on more than 60 percent of desktop and mobile devices worldwide. Default settings reach hundreds of millions. Many users never open the System page. They never see the toggle at all.
Earlier coverage from Forbes in January noted the incoming control. The pre-release toggle deleted models tied to scam detection. Google described it as processing locally. No personal data sent to the cloud. The piece captured the tension. Users gained an off switch yet the software had already placed the files without clear notice.
Developers gained new tools. Chrome 148 integrates Gemini Nano through a JavaScript API. Websites can call summarization or rewriting functions that run locally. The promise was speed and privacy. Yet the silent install undercut the message.
Security researchers flagged another issue. The added model expands the browser’s attack surface. Local inference means new code paths. Potential for exploits that never leave the machine. Mozilla has pushed back on similar web AI standards. The open web risks fragmentation when one vendor ships large binaries by default.
Google insists the feature helps users. On-device scam detection spots phishing in real time. Text suggestions stay private. The company points to automatic cleanup on resource-constrained devices. Still the pattern feels familiar. Roll out first. Answer questions later.
Hanff demanded clarity. Confirm whether any data ever left the device. Restore the exact wording if the claim holds. Move to explicit opt-in. He posted the questions publicly and tagged Chrome security leader Parisa Tabriz. No detailed reply has surfaced yet.
The episode highlights a larger shift. Browsers once shipped lean. Now they bundle large language models measured in gigabytes. Storage is cheap for some. Bandwidth costs add up for others. Metered connections in developing markets feel the hit first.
Users can act. Open Chrome settings. Head to System. Flip the on-device AI control off. The model should remove itself. Flags at chrome://flags let power users dig deeper. Enterprise policies offer stronger blocks. But most people won’t bother.
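For those who do bother, a short script can confirm the model actually left the disk after flipping the toggle. The OptGuideOnDeviceModel directory name comes from Hanff's report; the user-data path in the comment is the macOS default and varies by OS.

```python
from pathlib import Path

def model_footprint_bytes(user_data_dir: Path) -> int:
    """Sum the size of any on-device model files Chrome has stored."""
    model_dir = user_data_dir / "OptGuideOnDeviceModel"
    if not model_dir.exists():
        return 0
    return sum(p.stat().st_size for p in model_dir.rglob("*") if p.is_file())

# Example (macOS default user-data path; adjust per OS):
# print(model_footprint_bytes(
#     Path.home() / "Library" / "Application Support" / "Google" / "Chrome"))
```

A result of zero means the weights are gone; a multi-gigabyte total means they are back, or never left.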
And that is the point. Default behavior shapes what millions experience. When the default includes a multi-gigabyte download and then drops the privacy language that justified it, trust erodes. Google built its brand on search and speed. Privacy rhetoric helped too. The latest moves test how far that rhetoric stretches.
Recent reporting from Tom’s Hardware tied the download to possible EU law violations and large energy waste. The piece echoed Hanff’s calculations. At scale the electricity burned for initial distribution alone rivals a small country’s consumption.
Tech observers watch closely. Browser engines compete. Chrome’s decisions set expectations for everyone. If local AI becomes table stakes then consent, transparency and resource costs must improve. Otherwise users grow numb. They accept the bloat. They ignore the toggles. And the quiet accumulation of models on their drives continues.
The text change itself is small. One sentence gone. Yet it signals discomfort. Google no longer wants to make that particular promise in plain view. The reason matters. Users deserve to know it. So far the company has offered general statements but not the specifics Hanff requested.
Chrome will keep evolving. New versions arrive every few weeks. AI features will expand. The question is whether the company learns from this episode. Clear communication. Honest defaults. Or the cycle of silent rollout, public surprise and partial walk-back repeats.
For now the 4GB model sits on countless machines. The privacy sentence is gone. And the conversation about what on-device really means has only grown louder.
from WebProNews https://ift.tt/AlxO09P