
I am an oncologist. Can ChatGPT help me deliver bad news to a patient?

Ranjana Srivastava

The AI tool won’t offer a healing touch or resolve existential grief – but I will still be telling trainees to consult the chatbot in trying times

Any nausea?

No.

How about your appetite?

OK.

Tired?

I’m managing.

Anyone could administer the checklist, but I know what she really hopes: that with her as my passenger, I might steer the ship of uncertainty to a safe harbour.

“Will I see you again?”

Even my response comes from a checklist. “We can try, but the public system is not always accommodating.”

My patient is stoic on the outside, but no doubt suffering on the inside. I can’t let her leave the room like this.

“Is there anything else you would like to talk about?”

“You don’t have time.”

“I have time.”

Suddenly the tears are coming.

“I guess you can’t say how long I have got.”

My throat catches. How easy it would have been to fulfil the transactional routine and see her go home with the real issue untouched.

Although I have broken bad news countless times, I cast around in my mind for the right words. An old memory jars me. I am a trainee attending a communication skills program funded by the National Cancer Institute. In small groups led by expert faculty, we are taught how to be more compassionate and communicative oncologists. Professional actors simulate patients at different stages of illness: we learn to talk to them using roadmaps that attend to patients’ priorities, spot opportunities for empathy and use silence with skill.

On the final day, there is an exam. My simulated patient is a middle-aged man with advanced cancer who has exhausted all treatment options. I have got this, I think. Break the news gently but honestly to avoid confusion later. Pause. Look for cues. Use empathy. I can see this must be hard for you. Be honest. I too wish things were different.

Above all, don’t be clever, just be human.

The clock starts.

“So, doc, there is nothing else?”

The man’s face crumples, and he starts crying. Actual tears. My pulse quickens.

“I am sorry.”

“But I have so much to live for.”

We could look at other treatments elsewhere. Stop, I can’t say that.

Silence. Interminable silence.


“And the holiday with my grandkids …”

Seize the cue. Give him hope. Ask what he might do on the holiday.

The learning is there but my words stick. We dance around metaphors. Thank God, I think, the man isn’t truly sick.

The feedback is brutal. From the time he met me, my patient felt bad: unable to connect, burdened by my discomfort. How easy it was to destroy the human spirit, even while pretending.

Today, a gradually acquired repertoire of language and experience allows me to hold my nerve and help my patient discover a glimmer of hope, even relief. But I reflect on all the times I must have let patients down while learning through trial and error.

Communication errors are the most commonly cited underlying cause of complaints in the Australian healthcare system. In one survey, 85% of patients valued compassion over cost and waiting time. In the same survey, doctors agreed that compassion in medicine trumped cognitive prowess, observing that doctors who communicated well were more likely to have compliant patients.

Given the unquestionable importance of doctor-patient communication, it is surprising how little time goes into teaching doctors to do it well. Training programs are scarce and considered optional; participants are self-selected; and sporadic instruction tends to have only a temporary effect.

Despite the evidence that communication is a learned skill, an ossified belief that doctors “either have it or they don’t” allows institutions to avoid tackling the issue with as much energy as, say, falls prevention or hand hygiene.

If I were a young doctor navigating today’s challenging world of medicine, I would desperately want someone to help me get the communication right, knowing it is the key to patient satisfaction and professional longevity.

As an older doctor, I would love an occasional coach and critic to save me from complacency and bad habits. Alas, to advocate for this kind of help for patients, amid a host of competing priorities, is like wanting a trip to the moon when the trains are down.

Enter ChatGPT.

I confess I am a latecomer to ChatGPT, cautiously viewing it as a competing columnist, although I am satisfied that it is not yet replacing me as an oncologist.

But after reading a recent article, I recall my patient and type in a series of prompts: “I am an oncologist, help me deliver bad news. What can I get wrong with my communication? I need tips to support my patients receiving bad news.”

The responses are detailed and helpful. They contain reminders to take time, avoid jargon, acknowledge emotion and be sensitive. There is sound advice but also specific language to consider, my favourite being: “Before we proceed, I want to make sure you are comfortable having this conversation now. Make sure to stop me. We can take this at your pace.”

I am embarrassed that I can’t remember the last time I said those words, the patient’s agenda easily hijacked by something else.

Some might warn against an overreliance on artificial intelligence for innately human tasks, and indeed the essence of good medicine is human connection. But everywhere you look, the obstacles in the way of that connection are causing moral distress.

Sure, ChatGPT won’t fix the doctor shortage. It won’t resolve existential grief, offer a healing touch, or sense the tears and be ready with the tissues. But so long as there is no wave of humans with the time and expertise to teach doctors how to get better at giving bad news, I will be telling my trainees to open another browser and chat to ChatGPT in times of need.

Rather a patient rescued with a little help from a chatbot than one devastated by a doctor.

Ranjana Srivastava is an Australian oncologist, award-winning author and Fulbright scholar. Her latest book is called A Better Death.
