States and Industry Guiding Healthcare AI Use in the Absence of Federal Regulation

As artificial intelligence (AI) becomes increasingly embedded in the U.S. healthcare system, the lack of comprehensive federal regulation has created something of a vacuum – one that some industry leaders are stepping in to fill.

And amid growing concerns about patient safety, algorithmic bias, and ethical boundaries, states such as Illinois are establishing their own standards for responsible AI use.

But despite AI’s rapid expansion into clinical workflows, diagnostics, patient-facing tools, and back-office processes like billing and payment, there is no single or predominant federal law governing its use in healthcare.

Agencies such as the Food and Drug Administration (FDA) have issued guidance on AI-based medical devices, and the Office of the National Coordinator for Health Information Technology (ONC) has promoted AI transparency – for example, through proposed criteria requiring developers of federally certified health IT to give clinical users consistent information about the AI algorithms they use – but these efforts fall short of comprehensive governance.

The result is a fragmented oversight landscape, in which AI systems are sometimes deployed with limited testing, transparency, or accountability – all under a presidential administration that continues to push for, in its own words, “progress without regulation.”

Mental health applications in particular have raised alarms, as several AI chatbots and “therapy assistants” have been shown to deliver inappropriate or misleading responses without adequate human oversight or safeguards to maintain ethical standards in mental healthcare.

In response, Illinois passed the Wellness and Oversight for Psychological Resources (WOPR) Act, signed into law by the Governor last week. The legislation is the first in the country to explicitly ban AI systems from the delivery of mental health treatment and from making related clinical decisions. Under the new law:

  • AI cannot perform therapy or interact with patients as a replacement for licensed professionals;
  • AI tools may only be used in administrative or supplemental roles, such as scheduling, reminders, or drafting non-therapeutic communication; and
  • Any AI use in mental health services must occur under the direct supervision of a licensed provider.

In the absence of federal rules, healthcare providers and tech companies are also increasingly taking matters into their own hands. Several leading health systems have established AI oversight committees to evaluate safety, equity, and efficacy before adopting new technologies.

Meanwhile, companies like Google and Microsoft have published AI ethics frameworks to guide product development, including commitments to transparency and “human-in-the-loop decision-making.”

Some healthcare organizations have even created internal review boards to assess algorithmic bias and clinical risk, though participation remains voluntary and standards vary widely across the industry.

Third-party entities have also stepped up: the Utilization Review Accreditation Commission (URAC) has developed accreditation and certification standards, and the Coalition for Healthcare AI has published flexible guidelines and playbooks.

Some experts caution that without federal leadership, the nation could face a regulatory Wild West of sorts, where inconsistent oversight allows potentially harmful tools to reach patients in less regulated areas of the country.

On Capitol Hill, healthcare advocates have been urging Congress to act. Several federal proposals have been introduced, focusing on a variety of AI issues, from transparency to data privacy to the need for clinical validation. However, nothing has passed into law, leaving states like Illinois and the industry to lead in the meantime.

Illinois’ WOPR Act demonstrates how state governments can move decisively to protect patients when federal action lags. And as AI continues to shape the future of healthcare, the responsibility for ensuring appropriate use increasingly falls to those willing to step in – until national standards catch up.




Adam Brenman

Adam Brenman is a Sr. Gov’t Affairs Liaison at Zelis Healthcare. He previously served as Manager of Public Policy at WellCare Health Plans, where he led an analyst team in review, analysis, and development of advocacy materials related to state and federal legislation/regulatory guidance. He holds a master’s degree in Public Policy & Administration from Northwestern University and has also worked as a government affairs rep/lobbyist for a national healthcare provider association.
