Microsoft’s AI research division recently experienced a data leak caused by an overly permissive Azure Blob Storage link shared on GitHub. The incident highlights the importance of data security in the age of artificial intelligence [1].

Description

The incident involved an Azure Blob Storage URL published in a public GitHub repository by Microsoft’s AI research team [1]. The URL embedded an excessively permissive Shared Access Signature (SAS) token [1] that was scoped to the entire storage account rather than to the specific files intended for sharing. As a result, it granted unauthorized access to 38TB of additional internal data, including personal information of Microsoft employees [1] [4] [6], passwords to Microsoft services [3] [6], secret keys [3] [4] [5] [6], and internal Microsoft Teams messages [1] [2] [3] [4] [5] [6].
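For illustration only, the sketch below uses the Azure SDK for Python (azure-storage-blob) to show how an Account SAS with broad permissions and a far-future expiry can be generated; the reported token followed this general pattern. The account name and key are placeholders, not values from the incident.

```python
from datetime import datetime, timedelta, timezone

from azure.storage.blob import (
    AccountSasPermissions,
    ResourceTypes,
    generate_account_sas,
)

# Placeholder credentials for illustration only.
ACCOUNT_NAME = "exampleaccount"
ACCOUNT_KEY = "<storage-account-key>"

# Risky pattern: an Account SAS granting read, write, delete, and list
# access to every container and blob in the storage account, valid for
# decades. Anyone who obtains the URL inherits all of these rights.
overly_permissive_sas = generate_account_sas(
    account_name=ACCOUNT_NAME,
    account_key=ACCOUNT_KEY,
    resource_types=ResourceTypes(service=True, container=True, object=True),
    permission=AccountSasPermissions(
        read=True, write=True, delete=True, list=True
    ),
    expiry=datetime.now(timezone.utc) + timedelta(days=365 * 30),  # ~30 years
)

# The resulting URL exposes the entire account, not just the intended files.
url = f"https://{ACCOUNT_NAME}.blob.core.windows.net/?{overly_permissive_sas}"
print(url)
```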

Cloud security company Wiz discovered the issue during an internet scan for exposed storage and notified Microsoft in June 2023; Microsoft revoked the SAS token within days [3]. Microsoft states that no customer data was exposed and no other internal services were compromised [5] [6]. However, because the token also granted write access, the exposure could potentially have allowed malicious actors to tamper with stored files and, through them, Microsoft’s systems and services [4].

In response to the incident, Microsoft has taken steps to address the issue. It has expanded GitHub’s secret scanning service, which monitors public open-source code changes for exposed credentials and secrets, to flag SAS tokens with overly permissive expirations or privileges [4]. Microsoft also advises against using Account SAS tokens for external sharing and recommends using them sparingly, since their creation is not logged and revoking one requires rotating the storage account key, leaving little room for monitoring and governance.
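As a contrast to the pattern above, here is a minimal sketch of the narrower approach this guidance points toward: a service SAS scoped to a single blob, read-only, with a short expiry. The container and blob names are hypothetical; where possible, a user delegation SAS backed by Microsoft Entra ID is preferable because it can be revoked without rotating the account key.

```python
from datetime import datetime, timedelta, timezone

from azure.storage.blob import BlobSasPermissions, generate_blob_sas

# Placeholder values for illustration only.
ACCOUNT_NAME = "exampleaccount"
ACCOUNT_KEY = "<storage-account-key>"

# Safer pattern: a service SAS scoped to one blob, read-only, valid for a
# short window, so a leaked URL exposes far less for far less time.
scoped_sas = generate_blob_sas(
    account_name=ACCOUNT_NAME,
    container_name="public-datasets",      # hypothetical container
    blob_name="model-weights.tar.gz",      # hypothetical blob
    account_key=ACCOUNT_KEY,
    permission=BlobSasPermissions(read=True),
    expiry=datetime.now(timezone.utc) + timedelta(hours=1),
)

url = (
    f"https://{ACCOUNT_NAME}.blob.core.windows.net/"
    f"public-datasets/model-weights.tar.gz?{scoped_sas}"
)
print(url)
```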

Conclusion

The data leak incident at Microsoft’s AI research division serves as a reminder of the importance of data security. While Microsoft states that no customer data or internal services were compromised [5] [6], the potential for tampering by malicious actors underscores the need for robust security controls. Microsoft’s moves to expand GitHub’s secret scanning service and to caution against Account SAS tokens for external sharing demonstrate its commitment to addressing the issue. Moving forward, organizations should prioritize data security in the age of artificial intelligence and exercise caution when issuing SAS tokens: scope them narrowly, keep expirations short, and prefer token types that can be monitored and revoked.

References

[1] https://dataconomy.com/2023/09/18/microsoft-data-leak-wiz-azure/
[2] https://itwire.com/business-it-news/security/microsoft-exposed-38tb-of-private-data-on-github-wiz-researchers.html
[3] https://techcrunch.com/2023/09/18/microsoft-ai-researchers-accidentally-exposed-terabytes-of-internal-sensitive-data/
[4] https://www.neowin.net/news/microsoft-researchers-leak-38tb-of-sensitive-data-due-to-misconfigured-storage-on-github/
[5] https://mashable.com/article/microsoft-ai-researchers-leaked-private-data-azure-link-github
[6] https://www.infosecurity-magazine.com/news/microsoft-ai-researcher-leaked/