Microsoft AI Team Accidentally Leaked 38 Terabytes of Sensitive Data


Microsoft's AI research team accidentally exposed tens of terabytes of internal data, including passwords and private keys. The data was exposed through a storage URL published in one of the team's GitHub repositories.

Visitors to the GitHub repository, where Microsoft provides open-source code and AI models for image recognition, were instructed to download the models from an Azure Storage URL.

According to cloud security startup Wiz, the URL granted access to the entire storage account rather than only the intended model files. The account held 38 terabytes of sensitive information, including private keys and passwords.
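For context, access to Azure Blob Storage is commonly shared through a Shared Access Signature (SAS) token appended to the URL, and a token scoped to the whole account exposes every container it holds. The following is a minimal sketch of how such an over-broad, account-level token can be generated with the azure-storage-blob Python SDK; the account name and key are placeholders, not details from the incident:

    from datetime import datetime, timedelta

    from azure.storage.blob import (
        AccountSasPermissions,
        ResourceTypes,
        generate_account_sas,
    )

    # Placeholder credentials for illustration only.
    ACCOUNT_NAME = "examplestorage"
    ACCOUNT_KEY = "<storage-account-key>"

    # An account-level SAS like this covers every container in the
    # account; write/delete permissions let anyone holding the URL
    # modify or remove the stored data.
    overly_broad_sas = generate_account_sas(
        account_name=ACCOUNT_NAME,
        account_key=ACCOUNT_KEY,
        resource_types=ResourceTypes(service=True, container=True, object=True),
        permission=AccountSasPermissions(read=True, write=True, delete=True, list=True),
        expiry=datetime.utcnow() + timedelta(days=365 * 10),  # effectively never expires
    )
    shared_url = f"https://{ACCOUNT_NAME}.blob.core.windows.net/?{overly_broad_sas}"

Because a SAS token's permissions are baked into its signature, a leaked token of this kind cannot simply be edited after the fact; it has to be invalidated by rotating the account key.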

Microsoft AI researchers also mistakenly shared disk backups of two Microsoft employees' workstations. The exposed storage additionally contained more than 30,000 internal Microsoft Teams messages from Microsoft employees.

The URL was supposed to grant read-only access but was misconfigured to allow full control. Because of this mistake, anyone with the link could have deleted, added, or replaced files, or injected malicious content into the data users were downloading.
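By contrast, a token limited to a single container, with read-only permissions and a short expiry, would have confined the exposure to the files that were meant to be public. A hedged sketch of that safer configuration, again with placeholder account and container names, using the same Python SDK:

    from datetime import datetime, timedelta

    from azure.storage.blob import ContainerSasPermissions, generate_container_sas

    # Placeholder values for illustration only.
    ACCOUNT_NAME = "examplestorage"
    ACCOUNT_KEY = "<storage-account-key>"
    CONTAINER = "public-models"

    # Read/list only, scoped to one container, valid for 24 hours.
    scoped_sas = generate_container_sas(
        account_name=ACCOUNT_NAME,
        container_name=CONTAINER,
        account_key=ACCOUNT_KEY,
        permission=ContainerSasPermissions(read=True, list=True),
        expiry=datetime.utcnow() + timedelta(hours=24),
    )
    download_url = f"https://{ACCOUNT_NAME}.blob.core.windows.net/{CONTAINER}?{scoped_sas}"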

In response to questions about the leak, Microsoft stated that no customer data was exposed and that no other internal services were put at risk by the issue.

Masri
Masri serves as the Chief Content Editor at BestKodiTips. With three years of experience, she excels at creating technical content, focusing on how-to guides, Android and Kodi tutorials, app reviews, and common technological challenges. She stays abreast of the latest tech updates. Outside of work, Masri enjoys reading books, watching documentaries, and playing table tennis.
