Multiple privilege escalation issues in Microsoft Azure’s cloud-based Health Bot service opened the platform to server-side request forgery (SSRF) and could have allowed access to cross-tenant resources.

The vulnerabilities, identified by Tenable Research, were quickly patched by Microsoft but highlight inherent security risks in chatbot platforms, researchers warned.

The Azure AI Health Bot Service enables healthcare organizations to build their own virtual health assistants to interact with patients and manage administrative workloads. They can integrate any manner of internal processes and information into those workloads, meaning that the chatbots potentially have privileged access to extremely sensitive health information.

“Risk for any given customer of the health bot service is entirely dependent on the information they have made available to the service,” says Jimi Sebree, senior staff research engineer at Tenable.

Azure Chatbots & Cross-Tenant Access

If a malicious actor had exploited the issues, they would have been granted management capabilities for hundreds of resources belonging to other Azure customers, Tenable warned.

According to a blog post released today, exploiting the bugs allowed researchers to reach the service's underlying Instance Metadata Service (IMDS) and subsequently obtain access tokens allowing for the management of cross-tenant resources.
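For context, Azure's IMDS exposes a documented token endpoint at a fixed link-local address, which is why reaching it via SSRF is so valuable. Below is a minimal sketch of how such a token request is formed; the endpoint and required `Metadata` header are documented Azure behavior, but the helper name is ours, and actually receiving a token requires the request to originate from inside an Azure-hosted workload:

```python
# Sketch: forming a managed-identity token request to Azure's Instance
# Metadata Service (IMDS). Illustrative only -- not Tenable's proof of concept.
import urllib.parse
import urllib.request

# IMDS is only reachable from inside an Azure VM/workload at this
# link-local address; an SSRF lets an outsider make that request.
IMDS_TOKEN_URL = (
    "http://169.254.169.254/metadata/identity/oauth2/token?"
    + urllib.parse.urlencode({
        "api-version": "2018-02-01",
        # Audience for the token: the Azure Resource Manager API,
        # i.e., the "management of cross-tenant resources" plane.
        "resource": "https://management.azure.com/",
    })
)

def build_imds_token_request() -> urllib.request.Request:
    # IMDS rejects requests that lack this header -- a lightweight
    # SSRF mitigation, since naive server-side fetches usually
    # won't add it. The Health Bot flaw bypassed such safeguards.
    return urllib.request.Request(IMDS_TOKEN_URL, headers={"Metadata": "true"})
```

The `Metadata: true` requirement is precisely the kind of safeguard Sebree describes below: it blocks casual request forwarding, but not every SSRF path.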

“Based on the level of access granted, it’s likely that lateral movement to other resources in customer environments would have been possible,” Sebree says. “This is common in cloud services such as this, and safeguards are put in place to prevent cross-tenant access. The vulnerabilities discovered by Tenable Research are essentially bypasses of these safeguards.”

The researchers found that the issues affect endpoints within the Data Connections feature, which allows developers to integrate external APIs, including the endpoint that supports the Fast Healthcare Interoperability Resources (FHIR) data-exchange format.

In a nutshell, the attack involved configuring a data connection to a malicious external host, then setting that host up to answer any query from the platform with a 301 ("moved permanently") or 302 ("found") HTTP redirect. Those redirects pointed the platform at the IMDS, which in turn responded with metadata that leaked the access tokens.

“Exploitation of these issues was trivial, and no prior knowledge beyond general usage of the health bot service was required for exploitation,” Sebree says.

Rushed AI Development Risky

Sebree also explains that the vulnerabilities detailed in Tenable's analysis of the Health Bot service showcase the risks introduced by rushed development and deployment cycles for these interactive services.

“Instead of prioritizing being first to market, businesses must prioritize taking the time to ensure their product security and customer security,” Sebree says.

According to the Tenable blog post, “The vulnerabilities raise concerns about how chatbots can be exploited to reveal sensitive information. In particular, the vulnerabilities involved a flaw in the underlying architecture of the chatbot service, highlighting the importance of traditional Web app and cloud security in the age of AI chatbots.”

This is especially important because the global healthcare industry, now undergoing a transformational wave of digitalization and adoption of AI-powered applications, is a consistent target for cybercriminals owing to the extremely valuable personal information health records contain.

Fortunately, there are efforts underway to bolster healthcare security in the cloud and AI realm and beyond. In May, the Advanced Research Projects Agency for Health (ARPA-H) announced it was investing $50 million into its Upgrade program to enhance healthcare cybersecurity through automation, allowing providers to concentrate more on patient care.

Healthcare providers and medical device manufacturers are also being encouraged to improve data security across medical devices through closer cooperation.

Source: www.darkreading.com