Ambient clinical intelligence (ACI) solutions already support clinicians by capturing patient-provider conversations and contextualizing them to create accurate clinical documentation. But in the future, ACI will go much further, analyzing thousands of voice characteristics to detect disease, injury, and mental illness, delivering powerful patient insights and decision support to clinicians at the point of care.

As I write this, I’m in Orlando, Florida, at the conclusion of the HIMSS Global Health Conference & Expo. It’s an important annual event for us, giving us the opportunity to unveil our latest AI-driven ambient clinical intelligence (ACI) innovations, and connect with the healthcare community.

At the start of the conference, I took part in a media tour with one of our customers, Dr. Hal Baker, SVP and Chief Digital & Information Officer at WellSpan Health. We spoke about how our ACI solution—the Dragon Ambient eXperience (DAX)—helps improve the patient-clinician experience and reduce clinician burnout.

One of the highlights of HIMSS 2022 was a very well-attended educational session with Greg Moore, Corporate VP of Microsoft Health & Life Sciences, and Joe Petro, Nuance EVP and Chief Technology Officer. They gave us a fascinating insight into how ACI technology is being used today—and what’s next for ambient AI in patient care. In this article, I’ve captured some of the key points of their session.

Ambient clinical intelligence alleviates burnout and improves experiences

The healthcare industry is under enormous pressure, perhaps more than ever before. Amid a worsening labor shortage, clinician burnout is rising, while patient expectations have never been higher. One of the biggest contributing factors to burnout is the huge documentation burden placed on clinicians, but voice AI and ACI solutions are helping reduce that burden significantly.

AI-powered speech recognition solutions have been widely used by clinicians for some time, enabling them to capture the patient story using only their voice. But more recently, ACI technology has taken those automatic documentation capabilities a step further.

ACI solutions like DAX securely record patient-provider conversations and automatically create accurate clinical notes directly in the EHR. This dramatically reduces the time clinicians must spend documenting care, helping alleviate burnout.
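To make that flow concrete, here is a purely conceptual sketch of the stages such a pipeline implies: capture audio, transcribe it, draft a structured note, and file it in the EHR for clinician review. The function names, note structure, and stub behavior are my own illustration; DAX’s internals are not public, and this is not Nuance’s implementation.

```python
# Illustrative sketch only: record -> transcribe -> draft note -> file in EHR.
# Names and structure are hypothetical, not Nuance's DAX implementation.
from dataclasses import dataclass

@dataclass
class DraftNote:
    transcript: str     # full conversation transcript
    summary: str        # contextualized clinical summary
    needs_review: bool  # clinician reviews and signs before it is final

def transcribe(audio: bytes) -> str:
    """Speech recognition over the securely recorded conversation (stub)."""
    return "<transcript of patient-provider conversation>"

def draft_note(transcript: str) -> DraftNote:
    """Contextualize the conversation into a draft clinical note (stub)."""
    return DraftNote(transcript, summary="<structured clinical note>", needs_review=True)

def file_in_ehr(note: DraftNote, encounter_id: str) -> None:
    """Write the draft note against the encounter in the EHR (stub)."""
    print(f"Filed draft for encounter {encounter_id}; review={note.needs_review}")

def ambient_documentation(audio: bytes, encounter_id: str) -> None:
    file_in_ehr(draft_note(transcribe(audio)), encounter_id)

ambient_documentation(b"", "enc-001")
```

The key design point is the last step: the system drafts, but the clinician remains the author of record, reviewing and signing the note before it becomes final.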

Freed from the demands of manually documenting patient encounters, clinicians can focus instead on the patient in front of them, delivering a much better patient experience and rediscovering the joy of practicing medicine. Plus, the time saved allows clinicians to see more patients—without increasing their documentation burden.

As Dr. Baker said in our conversations at HIMSS, “DAX helps you wait for your patients instead of your patients waiting for you.” And one of his colleagues put it another way, saying, “DAX makes it feel less like an interview and more like a conversation.”

But this is just the beginning for ACI. In the future, AI-driven ambient clinical intelligence will become clinically aware, doing much more on behalf of the care team and the patient.

What if we could do more with conversation than documentation?

Speech is the product of a complex system. Every time we speak, we use our lungs, vocal cords, tongue, lips, nasal passages, and brain. And a disease, injury, or medical event involving any of these systems may leave diagnostic clues (biomarkers) that ACI solutions can detect in a patient’s voice.

There are more than 2,500 biomarkers in the sub-language elements of human speech, and they can offer all kinds of insights into a patient’s health and wellbeing.

For example, patients with Parkinson’s often have weak, soft voices with characteristics such as breathiness. Patients with Alzheimer’s use shorter words and more sentence fragments. And children with ADD speak louder and faster than their peers.
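As a concrete, heavily simplified illustration of the kind of signal analysis involved, the sketch below uses the open-source librosa library to pull a few acoustic features (a loudness proxy, pitch variability, and the voiced-frame ratio) from a hypothetical recording. This is not DAX’s pipeline, and real biomarker models use far richer feature sets.

```python
# Minimal acoustic-feature sketch using librosa on a hypothetical file
# "visit.wav". Not a diagnostic tool; just illustrates the raw signals
# that vocal-biomarker research builds on.
import numpy as np
import librosa

y, sr = librosa.load("visit.wav", sr=16000)  # mono waveform at 16 kHz

# Loudness proxy: short-time root-mean-square energy per frame.
rms = librosa.feature.rms(y=y)[0]

# Fundamental-frequency (pitch) track; F0 variability is one of the
# characteristics studied in Parkinson's voice research.
f0, voiced_flag, _ = librosa.pyin(
    y, fmin=librosa.note_to_hz("C2"), fmax=librosa.note_to_hz("C6"), sr=sr
)
f0_voiced = f0[voiced_flag]

features = {
    "mean_rms": float(np.mean(rms)),
    "f0_mean_hz": float(np.nanmean(f0_voiced)),
    "f0_std_hz": float(np.nanstd(f0_voiced)),     # pitch variability
    "voiced_ratio": float(np.mean(voiced_flag)),  # rough breathiness proxy
}
print(features)
```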

Right now, data scientists, researchers, and AI developers are working on ways to use sensory and signal data in patient voice samples to detect disease, injury, mental illness, and even environmental conditions. Their goal is to use ACI technology to give clinicians real-time patient insights and decision support that will fundamentally transform the way we deliver healthcare—and have a huge positive impact on patient outcomes.

The potential clinical applications for these AI-driven solutions are almost limitless, but I’m going to zoom in on two use cases that already show significant promise.

Detecting depression and anxiety from a patient’s voice

Several companies are now validating vocal biomarkers related to symptoms of disease. For example, one of our potential partners, Ellipsis Health, is using vocal biomarkers to give clinicians insights into patients’ emotional state. The company’s clinical support tool uses machine learning algorithms to analyze both the words people say and how they say them, measuring and monitoring the severity of depression and anxiety at scale as clinically validated vital signs.

It’s a great example of using voice-based AI to provide real-time, evidence-based clinical decision support. Just imagine the societal impact we can have if a clinician receives a proactive notification of depression symptoms and can then intervene to get the patient help that would otherwise have gone untreated. This is where we’re going with ACI, both through internal development between Nuance and Microsoft and through partnerships with third-party AI services that leverage the DAX signal to bring additional clinical intelligence back to the care team.
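To make the idea tangible, here is a toy sketch of how lexical features ("the words people say") and acoustic features ("how they say them") might be combined into a single severity score that triggers a clinician notification. Every detail (feature names, weights, threshold) is a hypothetical illustration, not Ellipsis Health’s or Nuance’s clinically validated models.

```python
# Toy severity score combining lexical and acoustic features. Weights and
# threshold are invented for illustration; real systems learn them from
# clinically labeled data (e.g., against PHQ-8 / GAD-7 scales).
from dataclasses import dataclass

@dataclass
class EncounterFeatures:
    negative_word_ratio: float  # lexical: share of negative-affect words
    mean_pause_seconds: float   # acoustic: average pause length
    f0_std_hz: float            # acoustic: pitch variability

def severity_score(x: EncounterFeatures) -> float:
    """Linear toy score clipped to [0, 1]."""
    z = 4.0 * x.negative_word_ratio + 0.8 * x.mean_pause_seconds + 0.01 * x.f0_std_hz
    return min(1.0, max(0.0, z / 5.0))

if severity_score(EncounterFeatures(0.25, 2.5, 30.0)) > 0.5:
    print("Flag for clinician review: possible elevated depression/anxiety symptoms")
```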

Identifying social determinants of health from a conversation

Another high-value use case for ACI in the future is identifying social determinants of health (SDoH)—factors like socioeconomic status, employment, food security, education, and community cohesion that can have a profound impact on healthcare outcomes.

Future ACI solutions may be able to capture SDoH insights from conversations and, when necessary, help mitigate their effects on patient populations. Armed with that awareness, clinicians and other stakeholders can make better-informed decisions about patient treatment and support. And if healthcare organizations can incorporate SDoH into care plans to treat the whole person rather than just a disease, patient outcomes should improve significantly.
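As a simple illustration of what such a capability might look like, the sketch below runs a rule-based pass over an encounter transcript and flags possible SDoH mentions for review. The categories and cue phrases are invented for the example; a production system would use trained NLP models and map findings to standard codes such as the ICD-10-CM Z55–Z65 range.

```python
# Illustrative rule-based SDoH flagging over a transcript. Categories and
# cue phrases are hypothetical examples, not a validated clinical lexicon.
SDOH_CUES = {
    "food_insecurity": ["skipped meals", "food bank", "can't afford groceries"],
    "housing": ["eviction", "staying with friends", "no permanent address"],
    "employment": ["laid off", "lost my job", "between jobs"],
    "transportation": ["no way to get", "missed the bus", "can't drive"],
}

def flag_sdoh(transcript: str) -> dict[str, list[str]]:
    """Return SDoH categories whose cue phrases appear in the transcript."""
    text = transcript.lower()
    return {
        category: [cue for cue in cues if cue in text]
        for category, cues in SDOH_CUES.items()
        if any(cue in text for cue in cues)
    }

print(flag_sdoh("I got laid off last month, so we've skipped meals some weeks."))
# -> {'food_insecurity': ['skipped meals'], 'employment': ['laid off']}
```

Note that the output is a flag for clinician review, not an automated determination; the value lies in surfacing context that might otherwise never make it into the care plan.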

ACI opens an interconnected AI ecosystem and a world of limitless opportunities

Since the launch of DAX, clinicians have captured millions of patient voiceprints that are a goldmine for teams researching powerful new applications for ACI technologies. Over time, DAX will open a vast clinical intelligence ecosystem, where healthcare organizations can select from AI-powered solutions to help care teams improve care delivery and patient outcomes.

Discover the Dragon Ambient eXperience

Learn more about how AI-driven ambient clinical intelligence helps improve care delivery and outcomes.


About Kenneth Harper

Kenneth Harper is the Vice President and General Manager of Nuance’s Healthcare Virtual Assistants and Ambient Clinical Intelligence business. Kenn has been working in the conversational AI industry for 15+ years, helping to shape virtual assistant solutions across mobile phones, TVs, cars, wearables, robotics, and most recently healthcare systems. Kenneth leads Nuance’s Healthcare Virtual Assistant business, which leverages an advanced suite of technologies combined with purpose-built hardware to streamline interactions with the EHR and the creation of clinical documentation, allowing physicians to remain 100% focused on the patient without technology getting in the way. Kenn holds a B.S. in human factors engineering from Cornell University and an M.S. in human factors from Bentley University.