Navigating the regulatory pitfalls of AI charting
AI charting tools bring key workflow advantages for clinicians as well as possible regulatory pitfalls.
On Feb. 4, Epic rolled out its AI Charting application, which offers a built-in set of AI capabilities for drafting clinician notes and recommending orders. Seven healthcare organizations had adopted the tool upon its launch, including the nonprofit health plan Group Health Cooperative of South Central Wisconsin.
AI scribes listen to conversations between patients and providers and generate clinical documentation in real time; ambient AI, by contrast, works continuously in the background, surfacing information for clinicians to follow up on. But the term AI charting offers the most comprehensive definition, explained Julie McGuire, managing director for the Center for Healthcare Excellence and Innovation at accounting and advisory firm BDO.
"AI charting encompasses that entire process of using AI to create, update and manage patient charts and clinical documentation, health alerts or maintenance alerts within the EHR," McGuire said. She noted that an EHR includes not just medical records but also billing, legal and health information management processes.
Epic originally announced AI Charting, a component of Art, the vendor's AI tool for clinicians, last August at its annual Users Group Meeting. At that time, it also introduced Penny, an AI revenue cycle tool, and Emmie, an AI tool that helps patients with tasks such as scheduling appointments and making payments.
Epic says AI Charting allows clinicians to stay focused on caring for patients, while the AI tool builds a complete record of a patient visit. Similar tools include Abridge, an ambient AI tool that generates clinical documentation with context; Suki, an ambient intelligence tool that assists providers throughout their workflow with tasks such as voice-enabled pre-charting; and ModMed, which has added features such as intelligent note reconciliation to resolve conflicts with duplicate notes.
With health systems now facing regulatory pitfalls around AI, they must tread carefully from a compliance standpoint when using AI charting and other ambient scribing tools.
Understanding the regulatory pitfalls associated with AI charting
Although AI tools can decrease administrative burden, health systems must ensure that the data they send to CMS or private payers is accurate. If healthcare organizations use AI tools for billing and documentation, they need proper oversight before submitting a bill or claim to avoid upcoding, McGuire advised.
Additionally, tools such as Epic's AI Charting are subject to HIPAA compliance as well as privacy and security concerns, McGuire noted.
Health systems must also ensure System and Organization Controls 2 (SOC 2) compliance. SOC 2 is a data privacy standard developed by the American Institute of CPAs based on its Trust Services Criteria. It provides guidelines on how organizations should manage client data while protecting security, availability, confidentiality, processing integrity and privacy.
Health systems are also using AI documentation tools amid changing FDA guidelines. Some of these apps qualify as medical devices and need FDA clearance or approval; others do not. AI Charting, a native tool within Epic's EHR platform, is likely not considered a medical device, according to McGuire. But if an AI tool is considered a medical device, it could face greater regulatory scrutiny from the FDA, which maintains a database of AI-enabled medical devices on its website.
"Make sure you know the tools that you're using, and that you're following the ever-changing FDA regulations we're seeing, no matter who the administration is," McGuire suggested.
Best practices for investing in AI charting
When deciding whether to invest in AI clinical documentation and ambient listening tools, consider several factors, including when to involve clinicians and how to maintain equity in the use of the technology. Here are some strategies to consider when investing in AI charting:
Update patient consent forms. To ensure regulatory compliance with guidelines such as HIPAA when using AI Charting and ambient scribes, organizations should update their patient consent forms to indicate that they're using these tools, McGuire advised. Patient consent forms have been updated for many purposes over the last 10 to 15 years, including to address data sharing and use of new tools, she added.
Keep a human in the loop. Healthcare organizations should maintain a "human in the loop" and have clinicians authenticate AI tools as well as weigh in on selecting and implementing solutions, McGuire said.
"Have your clinicians at the table when you're making those decisions, not only selecting but the implementation," she added. "I think that's a mistake where folks or organizations may miss out."
Organizations should also vet the Medicaid reimbursement codes that AI charting tools generate. These regulations are getting more complicated under the One Big Beautiful Bill Act, and AI charting could potentially lead to upcoding, making human oversight critical, McGuire suggested.
Understand state requirements as well as federal. Venson Wallin, managing director and national healthcare compliance and regulatory leader at BDO USA, stressed the need to study state guidelines regarding AI tools compared with federal guidelines like HIPAA.
"It is not a one-size-fits-all environment, and just because you may meet HIPAA requirements does not mean you are 'safe,'" Wallin said.
Keep an eye on AI bias. Data collection practices over time can introduce bias, and organizations must understand how AI automation could amplify it, according to McGuire. Tackling bias could mean monitoring how health systems record ethnicities in patient populations; the automation may need to be adjusted so that it neither skews toward one ethnicity nor overlooks bias against another, she warned.
"It's really understanding your data source that these tools sit upon, removing any bias, and making sure you have the right baseline data to help suggest clinical accuracy and reliability," she said.
Minimizing bias also requires strong data governance. McGuire advised that healthcare organizations have a quality-control process for implementing AI tools. Health systems experience problems when they lack routine checks to validate AI tools and ensure they work as expected.
Create a heat map. Heat maps can help evaluate the level of regulatory compliance in different areas of your organization when it comes to AI use. For instance, green could indicate compliance is being met, while yellow could mean the organization should tighten up compliance, McGuire said.
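The green/yellow rating approach McGuire describes can be sketched in a few lines of code. This is a minimal illustration, not a recommended implementation; the compliance areas and statuses below are hypothetical examples, and a real heat map would draw on an organization's own audit findings.

```python
# Minimal sketch of a compliance "heat map" for AI use.
# Statuses follow the article's scheme: green = compliant,
# yellow = tighten up; red is added here as a hypothetical gap rating.
STATUS_ORDER = {"green": 0, "yellow": 1, "red": 2}

# Hypothetical ratings an organization might assign after an internal review.
areas = {
    "Patient consent forms": "green",
    "HIPAA / privacy controls": "green",
    "Billing and coding oversight": "yellow",
    "Vendor BAA coverage": "yellow",
    "AI bias monitoring": "red",
}

def heat_map(ratings):
    """Sort areas worst-first so remediation priorities surface at the top."""
    return sorted(ratings.items(), key=lambda kv: -STATUS_ORDER[kv[1]])

for area, status in heat_map(areas):
    print(f"{status.upper():6} {area}")
```

Sorting worst-first keeps the highest-risk areas at the top of the report, so compliance teams see where to act before reviewing areas already rated green.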
Understand how the vendor uses patient data. Vetting vendors and their AI software is essential to protect both patients and the healthcare organization, Wallin noted.
"The key here is to make sure that any AI software that is utilized in coding and/or any other part of the healthcare organization goes through a complete and detailed vetting process, including the vendor and the software itself, to minimize any risk that a patient's personal health information may be inappropriately disclosed that could lead to financial and reputational risk to the healthcare organization," Wallin said.
Vendors usually sign a business associate agreement related to privacy, McGuire noted.
"It's really, just truly, understanding the data flow and the full ecosystem and the security posture of everyone that's involved in touching the data," McGuire said.
Brian T. Horowitz started covering health IT news in 2010 and the tech beat overall in 1996.