How health systems can navigate the rise of the consumer AI chatbot
Healthcare AI chatbots aren't going anywhere, so provider organizations need to determine the best path forward to enable patient engagement -- and patient safety.
As consumer-facing health AI chatbots burst onto the scene, industry professionals are quickly finding themselves in one of two camps: those excited about the chatbots' potential and those worried about the damage they could cause.
But healthcare professionals, regardless of their AI skepticism, need to understand that AI is here to stay, if for no other reason than that it fills a key patient engagement gap.
"People have at some time or another experienced difficulty understanding a doctor's instructions," according to Foluke Omosun, Ph.D., an assistant professor of communication and media studies at Sacred Heart University, where she recently published data about consumer health AI habits.
"Patients can use these tools to help them understand instructions, their symptoms and what questions to ask their doctors," Omosun said in an interview.
Earlier this year, OpenAI released a report on its flagship product showing that users send ChatGPT almost 2 million healthcare-focused messages each week.
Omosun's data showed that patients use these tools because the current slate of patient engagement resources falls short of their needs. About a third of patients already use these tools to research medical topics, and the survey showed they're willing to use them even more for care coordination.
Although there's a clear market for these tools, healthcare professionals' advice remains highly sought after. Gallup data shows that 73% of patients foremost consult their doctors for medical information, compared to just 16% who primarily turn to chatbots.
Doctors still have significant power in patient education and care navigation. To accept that patient-facing AI is here to stay, healthcare organizations must fold these tools into their digital front doors and encourage clinicians to discuss best practices with their patients.
How healthcare organizations can integrate AI chatbots
It'd be a mistake for healthcare organizations to ignore AI's popularity.
"If you do, your organization becomes out of touch," Nicole Lamoureaux, the CEO of the National Association of Free and Charitable Clinics, also said in an interview. "The toothpaste is out of the tube already. If organizations don't start looking at these, you could lose trust with your patients, as well."
Rather, the solution might be to work with AI. The technology boom is an opportunity for healthcare organizations to learn how to integrate the most popular platforms into their technology stacks, thereby gaining ownership and oversight capabilities.
This is a fledgling space, but Forrester analyst Shannon Germain Farraher insists that health systems need to take the lead on patient-facing AI bots.
"Healthcare organizations can't let health tech continue to lead full throttle in this initiative," she said in a previous interview. "They have to step up, take ownership and understand that this is not going away."
There's a clear appetite among consumers for these types of tools.
"Now, health organizations have to meet consumers where they're at," Germain Farraher continued. "They are the experts on healthcare. That is their core expertise."
Therefore, these technologies need to be clinician-led with oversight, management and understanding on the part of health systems, she said.
Organizations should consider embedding AI chatbots into the health system's patient engagement suite. Instead of having patients access chatbots on third-party websites, integrating ChatGPT Health and similar services into a provider website or app lets the health system connect those tools to the facility's IT systems and exercise better oversight.
That said, it's essential that these AI tools -- no matter who developed them -- be subject to rigorous oversight. According to Omosun's data, this is exactly what patients want.
"There are good sides to technology," she noted. "It's when there's no oversight that we run into problems."
Patients want medical professionals -- doctors, physician assistants/associates, nurses and public health professionals -- to conduct oversight of these tools, not the government or health IT companies, Omosun's data showed. This is reflective of the trust patients place in their clinicians.
Clinicians must discuss AI best practices with patients
For all the promise of AI chatbots, there's also peril.
Security concerns abound, as patients have the option to upload their digital medical records to systems that are not yet HIPAA-compliant. In an age of low digital health literacy, it's unlikely patients will seek out more information about how these tools protect their data.
What's more, there's a big question about the accuracy of the information AI chatbots provide. Can they really know as much as a doctor? Can they truly put information into the appropriate context?
Preliminary data casts doubt on both counts: the first report on OpenAI's ChatGPT Health found glaring gaps in its ability to accurately triage patients. Although the tool is not explicitly intended for triage, the researchers said the results underscore the importance of third-party vetting.
Medical professionals will need to help patients understand these limitations and outline the best use cases for AI chatbots, experts advise.
Doing so isn't exactly new for clinicians. When patient-facing medical websites like WebMD first cropped up, healthcare professionals worked to understand how patients used these resources to self-diagnose. AI chatbots present a similar opportunity.
"Healthcare providers should pay attention to the role these new tools play in the lives of their patients," Omosun said. "It also boils down to health education."
Most experts agree that patients need to be better informed about what AI chatbots can and cannot do. This needs to be a doctor-led conversation, according to Lamoureaux.
"Clinicians need to ask if patients have used ChatGPT or another AI to look up their symptoms," she suggested.
This creates the opportunity for clinicians to learn how patients might be using the tools and address any potential risks.
Empathy will be essential for these conversations. Patient-facing AI has become popular because there aren't enough reliable, accessible avenues for health information. Providers who might be justifiably skeptical of AI chatbots should avoid shaming patients who use the tech, even if the tool surfaces inaccurate or unhelpful information.
"Be empathetic and say, 'I hear you. I see where you got your information. Now, as the medical professional, I'm going to tell you why I don't believe that that's accurate,'" Germain Farraher advised.
It'd also be helpful to have some signage around the facility about AI chatbots. These messages -- which, ironically, could probably be written by AI, Lamoureaux pointed out -- should explain to patients the best use cases for AI, how it can help them ask more detailed questions and what AI can't do to support their health.
But organizations need to be mindful not to let AI chatbots deepen digital divides. Promoting an AI chatbot as the key to better patient engagement risks excluding patients who lack the devices, broadband access or digital health literacy these tools require.
There's also the challenge of time. Most clinicians only have 15 or 20 minutes to spend with patients, so integrating conversations about AI chatbots will require workflow changes. There isn't a great answer to those limitations right now, but addressing them should be a priority as organizations contend with AI chatbots.
After all, these tools aren't going anywhere.
"AI is here to stay, and it has good sides to it," Omosun concluded. "I am really optimistic about the future of healthcare, especially because this is going to create empowered patients who will be better engaged with their healthcare. It's all about people being aware of misinformation and not replacing the expert with AI."
Sara Heath has reported news related to patient engagement and health equity since 2015.