
Are AI Chatbots, ChatGPT the Solution to Healthcare’s Empathy Problem?

AI chatbots like ChatGPT can answer patient queries with accuracy and empathy, which could help providers answer secure direct messages on the patient portal.

ChatGPT's answers to patient queries were rated as empathetic almost 10 times as often as clinicians' answers, according to a new JAMA Internal Medicine report, signaling the utility of these tools for patient engagement and symptom-checking while also meeting patients’ communication needs.

That heightened level of empathy—coming from a robot, no less—is largely possible because ChatGPT has the gift of time, while most providers find themselves stretched thin between patient interactions and documentation requirements, the authors said.

ChatGPT, an artificial intelligence (AI) chatbot, has taken the world by storm this year. For the healthcare field in particular, the chatbot’s ability to understand and communicate medical information has been highlighted as a possible solution for the industry’s biggest woes.

This most recent study examined ChatGPT’s patient-provider communication skills, assessing whether the information the tool provides to patients is high-quality and whether it is communicated with empathy, a key element of the patient-provider relationship.

The researchers scoured Reddit’s r/AskDocs forum and found 195 real patient queries that each came with a physician answer. The team documented those physician answers and then posed the same 195 questions to ChatGPT.

After that, a team of healthcare professionals reviewed both answers and used a five-point scale to rate the answers by the quality of the information (very poor, poor, acceptable, good, or very good) and the level of empathy conveyed (not empathetic, slightly empathetic, moderately empathetic, empathetic, or very empathetic).

Overall, ChatGPT performed better than physicians on both measures.

The AI chatbot was 3.6 times more likely to give a “good” or “very good” answer compared to the physician, with 78.5 percent of ChatGPT answers getting that distinction compared to 22.1 percent of physician answers.

Moreover, ChatGPT did so with more empathy, often by simply telling the patient it was sorry to hear they felt unwell. ChatGPT was 9.8 times more likely to convey empathy or high empathy in its responses, with 45.1 percent of its responses earning that distinction, compared to just 4.6 percent of physician responses rated empathetic or highly empathetic.
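The headline multipliers follow directly from the reported percentages. A quick arithmetic check (the variable names are illustrative; the percentages are the ones reported above):

```python
# Percentages reported in the JAMA Internal Medicine study, as cited above.
quality = {"chatgpt": 78.5, "physician": 22.1}  # % rated "good" or "very good"
empathy = {"chatgpt": 45.1, "physician": 4.6}   # % rated empathetic or very empathetic

# Each ratio is simply the ChatGPT percentage divided by the physician percentage.
quality_ratio = quality["chatgpt"] / quality["physician"]
empathy_ratio = empathy["chatgpt"] / empathy["physician"]

print(f"quality ratio: {quality_ratio:.1f}x")  # → 3.6x
print(f"empathy ratio: {empathy_ratio:.1f}x")  # → 9.8x
```

Dividing the two prevalence figures reproduces the 3.6x and 9.8x figures the study reports.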

Notably, AI chatbots deliver more empathetic patient communication than providers not because they are better, but because they are not under the same time pressures as clinicians. Clinician burnout is at an all-time high, as providers find themselves bogged down by documentation demands and tight appointment slots.

It is difficult for providers to connect with patients personally during every interaction, including during online interactions.

Still, these results are promising for secure direct messaging, a key function within the patient portal. Secure direct messaging allows patients to ask their doctors questions and get advice without having to make a full office visit, but it is also a source of provider burnout, as clinicians need to take time out of their days to attend to their inboxes.

This study indicates that ChatGPT can help streamline that process for providers by helping to craft some responses prior to provider review.

“If more patients’ questions are answered quickly, with empathy, and to a high standard, it might reduce unnecessary clinical visits, freeing up resources for those who need them,” the researchers said. “Moreover, messaging is a critical resource for fostering patient equity, where individuals who have mobility limitations, work irregular hours, or fear medical bills, are potentially more likely to turn to messaging.”

In an accompanying commentary, the researchers added that ChatGPT could also aid in patient health literacy, especially in the age of open clinical notes as mandated in the 21st Century Cures Act. These notes are often laden with complex medical jargon—a concern for some providers and something shown to confuse patients—and AI chatbots could help interpret the information.

“We are entering a new era amidst an abundance of information but a scarcity of time and human connection,” the researchers wrote in their commentary.

“The practice of medicine is much more than just processing information and associating words with concepts; it is ascribing meaning to those concepts while connecting with patients as a trusted partner to build healthier lives,” they concluded. “We can hope that emerging AI systems may help tame laborious tasks that overwhelm modern medicine and empower physicians to return our focus to treating human patients.”
