
Innovation, AI, and Healthcare Leadership: Reflections from ViVE 2025 in Nashville

Writer: sophiaalyssasimpson

Updated: Mar 13


[Cover image: Innovation, AI, and Healthcare Leadership]

Attending ViVE 2025 in Nashville was a wonderful reminder of all the innovation going on at the intersection of health and technology. Two days filled with inspiring conversations, cutting-edge innovations, and connecting with some of the brightest minds in healthcare technology. From speech recognition breakthroughs to AI-driven automation, ViVE showcased the best in health tech. Here’s a recap of a few moments and amazing people I had the privilege of meeting.

[Image: ViVE letters at the venue, Powered by CHIME + HLTH]

Speech Recognition: A 15-Year Journey

Joe Petro walked through the broader 15-year history of speech recognition at Microsoft, from the early days with Dragon and Nuance to its latest advancements, showcasing just how far the technology has evolved. While speech recognition has reached impressive levels of accuracy today, the next big challenge is adoption—getting clinicians to trust and integrate it into their workflows.

Speech recognition has been in development for decades, moving from early rule-based systems to today’s AI-driven models. Microsoft's role in this journey highlights how improvements in hardware, computational power, data, and LLM breakthroughs have dramatically enhanced accuracy and usability.

I vividly recall the early struggles of speech recognition in 2016 and how far we’ve come today. Shoutout to the IBM pioneers like Bhavik Shah and Michael Picheny, who have driven progress in this space. The advances in character error rate (CER) reduction and seamless integration into clinical workflows are truly impressive.
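Character error rate, mentioned above, is typically computed as the Levenshtein edit distance between the recognized text and the reference transcript, normalized by the reference length. Here is a minimal sketch (my own illustration, not any vendor's implementation), with a clinical-sounding example where a single misrecognized drug name drives the error:

```python
def char_error_rate(reference: str, hypothesis: str) -> float:
    """CER = (substitutions + insertions + deletions) / len(reference),
    computed via character-level Levenshtein edit distance."""
    m, n = len(reference), len(hypothesis)
    # prev[j] holds the edit distance between reference[:i-1] and hypothesis[:j]
    prev = list(range(n + 1))
    for i in range(1, m + 1):
        curr = [i] + [0] * n
        for j in range(1, n + 1):
            cost = 0 if reference[i - 1] == hypothesis[j - 1] else 1
            curr[j] = min(prev[j] + 1,         # deletion
                          curr[j - 1] + 1,     # insertion
                          prev[j - 1] + cost)  # substitution / match
        prev = curr
    return prev[n] / m if m else 0.0

# One misrecognized drug name ("metropolol" for "metoprolol")
ref = "metoprolol 50 mg twice daily"
hyp = "metropolol 50 mg twice daily"
print(f"CER = {char_error_rate(ref, hyp):.3f}")
```

Even a low CER can be clinically dangerous when the few wrong characters land inside a drug name or dose, which is part of why the adoption bar is so high.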

The Struggle for Accuracy

95% accuracy sounds good until you’re working in a field like healthcare, which demands both high precision and excellent recall, strong confidence in every prediction, because of what’s at stake: your health.
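To make that concrete, here is a toy illustration with synthetic numbers (my own, not from any ViVE session) of how a headline accuracy figure can mask poor recall on the rare cases that matter most:

```python
# Toy screening scenario: 1,000 patients, 50 of whom truly have the condition.
# A model that flags very few patients can still post 95% accuracy.
tp, fn = 10, 40   # of the 50 true cases, only 10 are caught
fp, tn = 10, 940  # 10 false alarms among the 950 healthy patients

accuracy = (tp + tn) / (tp + tn + fp + fn)  # 0.95 — looks great on a slide
precision = tp / (tp + fp)                  # 0.50 — half the flags are wrong
recall = tp / (tp + fn)                     # 0.20 — 80% of sick patients missed

print(f"accuracy={accuracy:.2f} precision={precision:.2f} recall={recall:.2f}")
```

The 95% figure comes almost entirely from correctly labeling healthy patients, while four out of five sick patients slip through, which is exactly the failure mode healthcare cannot tolerate.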

Early speech recognition relied on basic pattern matching and struggled with different accents, background noise, and domain-specific language (like medical jargon). Error rates were high, making speech-to-text unreliable for professional use. I remember proving this point to my leadership team around 2016, in the early days of IBM Watson. IBM has a wonderful history of hiring diverse talent from all over the world, and many executives spoke English with strong accents. When I ran a recording of a conference call with one of our executives through our own speech recognition, the results demonstrated how far we had to go.

Where We Are Now

More recently, while accuracy, speed, and usability have improved dramatically, my experience as recently as a few months ago in Berkeley, California demonstrated that there is still much room for improvement. I was at my own annual checkup, and the medical assistant asked if I was OK with my doctor using an AI note-taking technology. I was surprised at the question, and agreed to the usage. However, as soon as my doctor walked in, she let me know that she doesn’t like using the AI note-taker because it doesn’t work that well and creates more hassle than value.

On the technology side, LLMs have transformed speech recognition, allowing models to learn from vast amounts of data, while medical language models improve transcription accuracy for clinical terminology.

The image below of Edit Rate Evolution (CER vs. yield) reflects this progress, showing how error rates have dropped significantly. However, the challenge now is not only accuracy but adoption:

  • Clinicians still hesitate to trust AI for documentation.

  • Workflow integration remains a hurdle—doctors need tools that fit seamlessly into their daily routines.

While speech recognition has come a long way technically, adoption still depends on usability and trust. Microsoft, in partnership with Epic Systems, is starting to focus more on user-friendly design, with the goal of making AI-powered documentation feel effortless.

[Image: Edit rate evolution graph]

VoiceWriter published an excellent post earlier this month comparing the state of the art across different speech recognition use cases.

[Image: VoiceWriter’s tier list of speech recognition tools across use cases, with Gemini at the top and Google at the bottom]

Driving value using Large Language Models

Jigar Shah and Dr. Anmol Madan, CEO of RadiantGraph, delivered an insightful session on how large language models (LLMs) are enhancing customer experience investments in healthcare. They highlighted the great work the Blue Shield of California team is doing, using LLMs to analyze the trends and issues driving dissatisfaction in our call centers and to increase the rate at which we can solve member pain points. The ability to personalize interactions, automate workflows, and generate insights from vast datasets is changing how we as healthcare organizations operate.

[Image: Jigar Shah and Dr. Anmol Madan, CEO of RadiantGraph, on stage at AI at ViVE]

Mayo Clinic’s Strategic Approach to AI and Automation

At ViVE 2025, Mayo Clinic's Chief Nursing Officer, Ryannon Frederick, shared insights into the organization's innovative use of artificial intelligence (AI) to enhance clinical workflows amid workforce shortages, another compelling speech recognition example.

Nurses often face significant documentation burdens, leading to reduced patient interaction and increased administrative tasks. Recognizing this, Mayo Clinic collaborated with generative AI platform Abridge to develop an ambient nursing documentation tool. This solution allows nurses to record patient interactions via smartphone, automatically integrating the data into the electronic health record (EHR) system. This approach enables nurses to focus more on patient care rather than paperwork.

[Image: Mayo Clinic's Chief Nursing Officer, Ryannon Frederick, on stage at ViVE]

Building a Chatbot with Salesforce

At the Salesforce booth, I had the opportunity to build a custom chatbot for Medicare coverage questions, in just 10 minutes! With the support of Michael M Bozorgnia, Principal Technical Architect at Salesforce, I created an AI-driven assistant to streamline patient inquiries. Salesforce is clearly making healthcare automation more accessible and impactful. It was a great reminder that you can’t take advantage of AI and all its promise without getting the basics right.

[Image: Salesforce booth promotion for building an AI health agent at Booth #632]
[Image: AI maturity timeline display, ending with an autonomous agent]

Meeting Amazing Builders

ViVE was also about connecting with incredible innovators leading the charge in healthcare transformation.

Particle Health: Medical Data Interoperability

At ViVE, another big theme this year was interoperability, and the progress being made in the space to actually get these various systems and data sets to talk to one another. Always a pleasure to catch up with Jason Prestinario, CEO of Particle Health.

Fabric: Streamlining Clinical Workflows

I had a great conversation with Aniq Rahman, a fellow Modern Healthcare #40Under40 honoree and CEO of Fabric Health. We discussed applying AI to care navigation and getting members to the right care for their needs.

Turquoise Health: Transparency in Healthcare Pricing

Turquoise Health is a great example of using healthcare data to drive results. Trinity College professor Gerardo Ruiz Sanchez published a paper using Turquoise data that I had stumbled upon recently, and it was a pleasure to connect with Tejas to discuss ways we can create a more transparent healthcare system by making pricing data more accessible. Their end-to-end pricing platform, powered by machine-readable hospital and payer data, is shaping a new marketplace for patients, providers, and payers.
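To give a flavor of what "machine-readable pricing data" enables, here is a minimal sketch of filtering such records for one billing code. The records and field names (`code`, `payer`, `negotiated_rate`) are illustrative assumptions of mine, not Turquoise Health's actual schema; real hospital transparency files follow the CMS price transparency format and are far larger:

```python
import json

# Hypothetical machine-readable hospital pricing records (illustrative only)
records = json.loads("""[
  {"code": "70553", "payer": "Payer A", "negotiated_rate": 1250.00},
  {"code": "70553", "payer": "Payer B", "negotiated_rate": 980.00},
  {"code": "99213", "payer": "Payer A", "negotiated_rate": 130.00}
]""")

def rates_for_code(records, code):
    """Return (payer, rate) pairs for one billing code, cheapest first."""
    matches = [(r["payer"], r["negotiated_rate"])
               for r in records if r["code"] == code]
    return sorted(matches, key=lambda pair: pair[1])

for payer, rate in rates_for_code(records, "70553"):
    print(f"{payer}: ${rate:,.2f}")
```

Even this trivial comparison, the same procedure priced differently across payers, is the kind of shopping that was impossible before pricing data became machine-readable.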

[Image: Meeting amazing builders at ViVE]

Tech Equity Award Finalist

One of the highlights of ViVE was seeing Shannon on stage, accepting recognition on behalf of Blue Shield of California for the experience cube work, specifically as related to driving improvement in maternal health. We were recognized as a Top 10 Finalist for the Tech Equity Award. It was an honor to be part of this journey alongside Jackie Ejawa and our incredible data team. The award ceremony on Monday night was a moment of pride and celebration for all of us.

[Image: Alyssa and Shannon at the event]

A Future of AI-Driven Healthcare

ViVE 2025 was a powerful reminder of how AI, automation, and interoperability are shaping the future of healthcare. From speech recognition breakthroughs to seamless data sharing and pricing transparency, the innovations showcased at ViVE will redefine patient experiences and healthcare efficiency in the years to come.

I’m excited to continue these conversations and contribute to the next wave of transformation in health tech.



© 2021 Alyssa Simpson Rochwerger
