AI researchers at Microsoft have made a huge mistake.
According to a new report from cloud security company Wiz, the Microsoft AI research team accidentally leaked 38TB of the company's private data.
38 terabytes. That's a lot of data.
The exposed data included full backups of two employees' computers. These backups contained sensitive personal data, including passwords to Microsoft services, secret keys, and more than 30,000 internal Microsoft Teams messages from more than 350 Microsoft employees.
So, how did this happen? The report explains that Microsoft's AI team uploaded a bucket of training data containing open-source code and AI models for image recognition. Users who came across the GitHub repository were given a link to Azure Storage, Microsoft's cloud storage service, in order to download the models.
One problem: The link that was provided by Microsoft's AI team gave visitors complete access to the entire Azure storage account. And not only could visitors view everything in the account, they could upload, overwrite, or delete files as well.
Wiz says that this occurred as a result of an Azure feature called Shared Access Signature (SAS) tokens, which is "a signed URL that grants access to Azure Storage data." A SAS token can be scoped to limit which files are accessible and which operations are allowed. This particular link, however, was configured to grant full access to the account.
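To make the scoping point concrete, here is a minimal, illustrative sketch of how a SAS-style signed URL works: a set of query parameters (permissions, expiry) plus an HMAC signature computed with the account key. This is a simplified stand-in, not Azure's exact string-to-sign format; the account name, container, and key below are hypothetical, and real tokens should be built with the `azure-storage-blob` SDK (e.g. `generate_container_sas`), where the `permission` argument controls exactly the read-only-versus-full-control distinction at issue here.

```python
import base64
import hashlib
import hmac
from urllib.parse import urlencode

def make_sas_url(account: str, container: str, key_b64: str,
                 permissions: str, expiry: str) -> str:
    """Build a simplified SAS-style signed URL (illustrative only).

    Mirrors the shape of real SAS fields: sp = permissions,
    se = expiry, sig = HMAC-SHA256 signature over the token's
    parameters, keyed with the storage account key.
    """
    params = {"sp": permissions, "se": expiry}
    string_to_sign = "\n".join([account, container, permissions, expiry])
    sig = base64.b64encode(
        hmac.new(base64.b64decode(key_b64),
                 string_to_sign.encode(), hashlib.sha256).digest()
    ).decode()
    params["sig"] = sig
    return (f"https://{account}.blob.core.windows.net/"
            f"{container}?{urlencode(params)}")

# Hypothetical account key for the sketch.
key = base64.b64encode(b"hypothetical-account-key").decode()

# A narrowly scoped, read-only token ("r") ...
safe = make_sas_url("exampleaccount", "training-data", key, "r", "2021-01-01")

# ... versus a full-control token ("racwdl": read, add, create,
# write, delete, list) of the kind that exposed the whole account.
risky = make_sas_url("exampleaccount", "training-data", key, "racwdl", "2051-01-01")

print(safe)
print(risky)
```

The difference between the two URLs is a single parameter, which is what makes this class of misconfiguration so easy to miss in a README link.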
Compounding the problem, according to Wiz, the data appears to have been exposed since 2020.
Wiz contacted Microsoft earlier this year, on June 22, to report its discovery. Two days later, Microsoft invalidated the SAS token, closing off access. Microsoft completed an investigation into the potential impact in August.
Microsoft provided TechCrunch with a statement, claiming “no customer data was exposed, and no other internal services were put at risk because of this issue.”