Unsecured Microsoft Azure Server exposes passwords etc. of Microsoft systems (Feb. 2024)

[German] Security researchers from SOCRadar have discovered an unsecured storage server on Microsoft Azure on which internal information about Microsoft's Bing search engine was stored. Well, the storage server, which was freely accessible to anyone, only contained such "unimportant" stuff from Microsoft employees as code, scripts and configuration files with passwords, keys and login data for accessing other Microsoft databases and systems.



SOCRadar is a security provider that helps companies detect security vulnerabilities and misconfigurations and that regularly scans the Internet for unprotected servers. During one of these scans, the SOCRadar security researchers Can Yoleri, Murat Özfidan and Egemen Koçhisarlı discovered a Microsoft-hosted storage server on Azure that was publicly accessible to third parties. When the security researchers took a closer look, they were astonished, as they revealed to TechCrunch.

Unprotected Microsoft Azure storage server

At first glance, the server only contained internal information relating to Microsoft's Bing search engine. However, as this server was not protected by a password and could be accessed by anyone interested via the Internet, the security researchers took a closer look. The Azure storage server contained code, scripts and configuration files with passwords, keys and credentials that were used by Microsoft employees to access other internal databases and systems.

TechCrunch writes that the security researchers informed Microsoft about the vulnerability on February 6, 2024. It must have been a honeypot, because Microsoft only secured this server on March 5, 2024 – or the motto was "good things take time". It is currently unknown how long the cloud server was freely accessible on the Internet and whether third parties other than the SOCRadar security researchers had access to this data.

Case after case at Microsoft …

During my research, I saw that SOCRadar had already uncovered a case in September 2022 in which customer data may have been exposed due to a server misconfiguration. Microsoft commented on it in this article and complained that SOCRadar had disclosed the information. And then there was the case uncovered by SOCRadar in 2023, in which 38 terabytes of data were exposed – I reported on this in September 2023 in the article Data leak: Microsoft AI researchers lose 38 TBytes of internal data via GitHub/Azure cloud storage. Malicious tongues are already saying that Microsoft must have misunderstood the term "open data".

Well, nobody is immune to a vulnerability or a configuration error. But at Microsoft, such "mistakes" have become commonplace. In July 2023, it became known that suspected Chinese hackers from the Storm-0558 group had captured a Microsoft account (MSA) consumer signing key and were able to use it to forge tokens for accessing accounts in Microsoft's cloud. I reported on this in the blog post China hacker (Storm-0558) accessed Outlook accounts in Microsoft's cloud and followed up on the case in further blog posts. To this day, Microsoft does not know how the hackers obtained the MSA key.



In January 2024, it became known that the suspected Russian hacker group Midnight Blizzard had compromised an account on a test system through password spraying attacks and had been inside since November 2023. From this test system, the account was used to access Microsoft's internal mail system and read the emails of executives and other employees (see Microsoft hacked by Russian Midnight Blizzard; emails exfiltrated since Nov. 2023).

Passwords and other confidential information must also have been intercepted. Microsoft had to admit that the Midnight Blizzard hackers had access to Microsoft source code and probably continued to have access to the systems, or attempted to gain it, even after the discovery and alleged lockout (see Microsoft confirms: Russian spies (Midnight Blizzard) stole source code while accessing systems).

The incident outlined above is just the icing on the cake, confirming that Microsoft is a "shoddy store" that should be given a wide berth. Well, that was unfair polemics – but it is now highly official: the US Cyber Safety Review Board was alarmed by the Storm-0558 hack and conducted an investigation into the security incident. The report published a few days ago reveals that Microsoft does not have security under control and that a chain of errors made the Storm-0558 cloud hack possible in the first place (see Microsoft slammed for a cascade of faults that leads to Storm-0558 cloud hack).

The recommendation to Microsoft was to get its own house in order before embarking on new developments. What comes to mind is the half-finished Copilot, which is currently being paraded around as the next big thing.

Microsoft comments …

SOCRadar security researcher Yoleri told TechCrunch that the data exposed in this incident could help malicious actors to attack other systems on which Microsoft stores its internal files. The identification of these storage locations "could lead to major data leaks and potentially jeopardize the services used," said Yoleri.

TechCrunch had asked Microsoft for a statement before publication, but Microsoft initially remained silent – another familiar tactic from Redmond, and one that was explicitly reprimanded in the US Cyber Safety Review Board's report. In a statement shared after publication on Wednesday, Microsoft's Jeff Jones told TechCrunch: "Though the credentials should not have been exposed, they were temporary, accessible only from internal networks, and disabled after testing. We thank our partners for responsibly reporting this issue."

More details

After I had covered the topic in the German edition of this article, based on the TechCrunch report, Can Yoleri, security researcher at the Turkish branch of the security company SOCRadar, spoke with the German site heise and revealed more details.

During a routine Internet-wide scan, Can Yoleri came across an unprotected area in Azure Blob Storage containing over a million files in various formats. He found source code in various Windows scripting languages from BAT to PowerShell, as well as JSON and Excel files and source code in other formats.
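To illustrate what "freely accessible to anyone" means for Azure Blob Storage: if a container is configured for anonymous public access, no login is required at all to enumerate its contents. The following minimal sketch uses the azure-storage-blob Python SDK with made-up account and container names – it is purely illustrative and not the server SOCRadar found.

```python
# Minimal sketch with hypothetical names – NOT the affected Microsoft server.
# A container whose public access level allows listing can be enumerated by
# anyone on the Internet without passing any credential.
from azure.storage.blob import ContainerClient

container = ContainerClient.from_container_url(
    "https://exampleaccount.blob.core.windows.net/example-container"
)

for blob in container.list_blobs():
    print(blob.name, blob.size)  # names and sizes of all publicly listed files
```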

The researcher told heise that he also found access data for databases and protected APIs – not only in configuration files intended for this purpose, but often also hard-coded in source code, for example in Python scripts, excerpts of which were made available to the editorial team. User names and passwords were also found in batch files, such as ones that allow access to a Microsoft-internal Docker container registry.
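To make clear why hard-coded credentials are such a problem (a hypothetical illustration, not an excerpt from the leaked files): a secret embedded directly in a script becomes usable by anyone the moment the file lands in a publicly readable storage blob, whereas a secret resolved at runtime does not travel with the code.

```python
import os

# Hypothetical example – not the leaked Microsoft code.
# Anti-pattern: the credential is exposed together with the file itself.
DB_CONN = "Server=sql.corp.example;User Id=svc_account;Password=Sup3rS3cret!"

# Less risky pattern: the script only references a secret injected at runtime
# (environment variable here; a dedicated secrets vault would be better still).
db_conn = os.environ.get("INTERNAL_DB_CONN")  # assumed variable name
```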

The openly accessible data collection comprised a total of 1,138,558 files. Yoleri can only speculate about the purpose of the data found. According to heise, the security researcher does not want to comment on whether third parties, possibly cyber criminals, have also accessed the storage blob.

After the SOCRadar security researcher contacted the MSRC (Microsoft Security Response Center) at the beginning of February 2024, the data leak was confirmed at the beginning of March 2024. But the SOCRadar security researchers are still waiting for an official acknowledgment on Microsoft's security site.

Regarding Microsoft's statement that the credentials were only temporary, accessible only from internal networks, and deactivated after testing, Yoleri still sees a need for action for some of the samples. The Azure Storage blob in question is still accessible from the outside, even if particularly critical files have been removed. So it took Microsoft nearly a month from the report until things were (partly) fixed – and the blob is still reachable from the Internet. Is that the Microsoft way of security?
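Whether a container still answers anonymous requests can be checked with a single unauthenticated call to the Azure Blob Storage REST "List Blobs" operation. The sketch below uses a hypothetical URL, not the affected server; a 200 response would mean the listing is still open to anyone.

```python
# Rough check with a hypothetical URL – not the affected Microsoft server.
# Anonymous "List Blobs" only succeeds if the container allows public listing.
import requests

url = "https://exampleaccount.blob.core.windows.net/example-container"
resp = requests.get(url, params={"restype": "container", "comp": "list"}, timeout=10)

if resp.status_code == 200:
    print("Container listing is still publicly accessible.")
else:
    print(f"Anonymous listing rejected (HTTP {resp.status_code}).")
```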

Similar articles:
China hacker (Storm-0558) accessed Outlook accounts in Microsoft's cloud
Microsoft hacked by Russian Midnight Blizzard; emails exfiltrated since Nov. 2023
Microsoft confirms: Russian spies (Midnight Blizzard) stole source code while accessing systems
Microsoft slammed for a cascade of faults that leads to Storm-0558 cloud hack
Data leak: Microsoft AI researchers lose 38 TBytes of internal data via GitHub/Azure cloud storage

