There has been a predictable response to the introduction of ChatGPT, a new artificial intelligence (AI) engine. Some have been eager to give it a try, some have touted it as the new panacea (a cure for everything), and some have decried it as the end of civilization as we know it. History tells us that none of these reactions has proved true for previous new technologies. In my lifetime, television was touted as a replacement for radio and movies, but as we now know, we have all three. Likewise, when personal computers were introduced, they were supposed to spell the end of books, newspapers, and paper. It almost seems a rule that new technology may replace older technology as the dominant form, with attendant economic impact, but almost never makes the older technology completely obsolete. Apparently, vinyl LP records are the hot new thing even with streaming and digital formats!
All of this is background to headlines in late April 2023 reporting a study from JAMA Internal Medicine in which a chatbot's responses to patient questions were compared with physicians' responses.1 The headline conclusion: "The chatbot responses were preferred over physician responses and rated significantly higher for both quality and empathy." You can find a range of reactions with a quick Google search. Most point out that the responding physicians were not the treating physicians and tended to see the posts as requests for information, which may have shaped their responses.
The issue was anticipated in a scholarly paper published last year, which argued that empathy is not possible for AI:2
“It is generally assumed in AI research that there are no “in principle” or a priori limitations about the applications and range of AI. In contrast, we argue that empathic AI is impossible, immoral, or both. Empathy is an in principle limit for AI. While the current argument is likely to generalize to other professions that rely on empathy, our attention here is confined to outcomes improved by empathy in clinical medicine and why AI cannot achieve these. Thus, our argument is not dependent on practical considerations and is limited in scope. Since it is an in principle problem,
considerations about architecture, design, computer power or other pragmatic issues are not helpful in addressing it. But given that we consider primarily the area of patient care, rather than other aspects of medical applications of AI such as diagnostics, resource optimization or data gathering, in which AI has enormous potential for improving medical services, the difficulty we present does not amount to a categorically general objection to the use of AI in medicine…
Clinical empathy, the use of empathy by nurses, doctors, therapists, etc., is emotion-guided imagining of what a particular moment, or slice of life (a time-slice or segment of one's life), feels like or means to the patient…(Empathy) enables not only more meaningful but more effective medical care for at least three reasons. Getting an accurate history is crucial for medical diagnosis…Replicated empirical studies show that patients disclose their histories selectively to physicians according to how emotionally attuned and empathic in real time they perceive their physician to be. Studies show that they do not reveal information at first, but give emotional hints—saying that "my headache just kept coming back" with a lot of anxiety, till they sense that their physician is resonating with the importance of this moment in their story. When they sense this attunement, they reveal information; when they don't, they don't disclose.
Second, effective medical care depends on patients adhering to treatment—the biggest cause of poor results in medicine (for people with access) is that approximately half of medical recommendations, including prescriptions, are not followed or taken as prescribed. The biggest predictor of adherence to treatment is trust in the physician, and it turns out that a major predictor (in some studies, the biggest predictor) of trust is the patient's perception that the physician is genuinely worried when they talk about something worrisome, that the physician is empathically accompanying them in real time.
Third, a big part of medical care is helping patients cope with bad news and regain agency to take necessary next steps to help themselves. Oncology patients who sense that their physician was empathizing with them when discussing their cancer diagnosis cope better, seeking treatment and support groups more actively than patients who did not feel so accompanied and had longer periods of confusion and anxiety after receiving difficult information."
1 Ayers JW, Poliak A, Dredze M, et al. Comparing Physician and Artificial Intelligence Chatbot Responses to Patient Questions Posted to a Public Social Media Forum. JAMA Internal Medicine, 28 April 2023. doi:10.1001/jamainternmed.2023.1838.
2 Montemayor C, Halpern J, Fairweather A. In Principle Obstacles for Empathic AI: Why We Can't Replace Human Empathy in Healthcare. AI & Society 2022; 37:1353-1359. doi:10.1007/s00146-021-01230-z. Accessed online at https://link.springer.com/article/10.1007/s00146-021-01230-z.
Historical experience suggests that what we want to do is hand many of the computerized tasks currently imposed on physicians by EMR and management systems over to a personalized AI system, and free the clinician to do the task only he or she can do: spend time with the patient and establish rapport. This need not be restricted to life-threatening diagnoses; it applies to "simple" things like getting a young man to acknowledge and accept the need to begin maintenance medication for hypertension.
I often think medicine is conflicted by the forces embodied in the original Star Trek series by Spock and Dr. McCoy. Spock would frequently respond to McCoy's suggestions by saying, "But that is not logical." McCoy would then reply, "People aren't logical." I also find it suggestive that Dr. McCoy had computerized "AI" to make diagnoses and administer treatments. His job was to explain them to the patient in a way that made sense. AI is not going to replace clinicians, but used properly, it can free them up for the tasks that only people can do. In other words, we need to make caring as important as science in health care.
15 May 2023
Is empathy the value we have tossed out as part of "improving" health care?