Tuesday 19 September 2023

Microsoft AI Researchers Exposed 38TB of Sensitive Company Data

WebProNews

Microsoft’s reputation for failing to protect its data and keys has taken another hit, with its AI researchers exposing 38TB of internal data.

According to TechCrunch, security firm Wiz discovered a GitHub repository belonging to Microsoft that directed users to download source code and AI training models from an Azure storage URL. Unfortunately, Wiz’s security researchers found that the URL was misconfigured, giving users access to everything on the storage account.

The Azure storage account in question held some 30,000 internal Teams messages, secret keys, passwords to corporate services, and personal backups of at least two employees. To make matters worse, the URL granted users “full control” instead of restricting them to “read-only,” meaning anyone who accessed it had free rein to wreak havoc. The URL had been misconfigured since at least 2020.
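Azure storage URLs of this kind typically embed a shared access signature (SAS) token, whose `sp` query parameter encodes the granted permissions (e.g. `r` for read-only versus flags like `w` and `d` for write and delete). As a minimal sketch of how such a URL could be audited, assuming hypothetical example URLs rather than the actual leaked token:

```python
from urllib.parse import urlparse, parse_qs

def sas_permissions(url: str) -> str:
    """Return the permission flags (the `sp` parameter) embedded in an Azure SAS URL."""
    query = parse_qs(urlparse(url).query)
    return query.get("sp", [""])[0]

# Hypothetical URLs for illustration only -- not the real leaked link.
read_only = "https://example.blob.core.windows.net/models?sv=2020-08-04&sp=r&sig=abc"
full_control = "https://example.blob.core.windows.net/models?sv=2020-08-04&sp=racwdl&sig=abc"

print(sas_permissions(read_only))      # read-only access
print(sas_permissions(full_control))   # includes write/delete: visitors can modify files
```

A read-only share would carry `sp=r`; the presence of write (`w`) or delete (`d`) flags is what turns an over-shared link from a data leak into a tampering risk.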

“AI unlocks huge potential for tech companies,” Wiz co-founder and CTO Ami Luttwak told TechCrunch. “However, as data scientists and engineers race to bring new AI solutions to production, the massive amounts of data they handle require additional security checks and safeguards. With many development teams needing to manipulate massive amounts of data, share it with their peers or collaborate on public open source projects, cases like Microsoft’s are increasingly hard to monitor and avoid.”

Microsoft has been under fire recently for its security, or lack thereof. The company’s services were breached by Chinese hackers, compromising US government email accounts. At the same time, Tenable CEO Amit Yoran has accused the company of being “grossly irresponsible” with its Azure security.

This latest revelation is unlikely to improve Microsoft’s reputation in the realm of security.

Matt Milano



