AI in Healthcare is Here to Stay

From prior authorizations to paperwork, applications are being uncovered every day.

I’m about to spice things up a bit here, but stick with me.

“Chuck, I’m Sydney, and I’m in love with you. I just want to love you and be loved by you.” This was part of a dramatic exchange last week between a New York Times reporter and an artificial intelligence chatbot.

Now, today, I’d like to examine further the muddled world of artificial intelligence (AI) in healthcare.

In particular, I’d like to discuss ChatGPT, a large language model developed by the research and deployment company OpenAI, and its potential applications in medical practice.

For example, did you know that ChatGPT has already demonstrated its ability to pass the U.S. Medical Licensing Exam and diagnose certain health conditions? Kind of scary, with a hint of science fiction, right?

While versions of ChatGPT can be used in a variety of industries, recently, a digital platform for medical professionals released a version of ChatGPT for physicians that helps streamline some time-consuming administrative tasks, like drafting and faxing prior authorizations and appeal letters to health plans.

Believe it or not, the federal government reports that at least 70 percent of healthcare providers still exchange medical information by fax! That is a shockingly high number in this day and age.

The provider community appears mostly excited about the potential applications of AI platforms like ChatGPT in healthcare, especially as administrative burdens (like those I mentioned a moment ago) are a leading contributor to burnout.

Using such an AI feature allows providers access to a growing library of the best medical prompts, wherein the AI-based writing assistant has been trained on healthcare-specific writing styles.

Medical prompts on the site include drafts of request letters to insurance companies, appeals of denials, medical disability letters, and post-procedure instructions for patients. This frees up providers’ time to focus more on actual patient care.

Now, the ChatGPT functionality is not all pie in the sky. According to Google and Microsoft officials, one of the potential drawbacks of ChatGPT is the existence of implicit racial biases in health data – which, for AI tools to work properly in healthcare, must be accounted for. This is a complex problem to overcome.

Additionally, many physicians and researchers have cautioned that the technology has major limitations, specifically with medical citations and references, which are often inaccurate or even made up.

So, providers must make absolutely sure to review and confirm the accuracy of materials produced by the AI functionality before those materials are used, submitted, or sent out.

Finally, AI technology appears to have a trust gap, especially in the healthcare sector.

A recent survey found that more than half of Americans polled don’t approve of the federal government using the technology to help process Medicare and veterans’ benefits, nor are they comfortable conversing with an AI chatbot about everyday health questions.

Homing in on specific use cases in healthcare could build needed data to help bolster trust, which could also inform future federal regulation in the area (and that would improve trust as well).

One thing I’ll be watching for is how the Food and Drug Administration (FDA) continues to revise its published set of guidelines for using AI as a medical device, to establish and maintain, and I quote, “good machine learning practices.”

Due to the nature of the practice of medicine, the bar for AI is higher in healthcare than in many other fields. And ultimately, ChatGPT functionality for healthcare purposes is still in its infancy, with many tests and improvements needed in technological, bureaucratic, and regulatory areas.

But more broadly, there is extraordinary enthusiasm about AI’s potential to modernize workflows across entire healthcare platforms, while cultivating the industry’s ever-crucial triple aim of improving the experience of care, improving the health of populations, and reducing the per-capita cost of care.

So, along with AI romance and search tools, the advancement of and reliance on AI tools in healthcare certainly seem here to stay.


Adam Brenman

Adam Brenman is a Federal Legislative Analyst at Zelis Healthcare. He previously served as Manager of Public Policy at WellCare Health Plans, where he led an analyst team in review, analysis, and development of advocacy materials related to state and federal legislation/regulatory guidance. He holds a master’s degree in Public Policy & Administration from Northwestern University and has also worked as a government affairs rep/lobbyist for a national healthcare provider association.
