Empathy Is Not a Feature

Written on 01/16/2026
Mark Allardyce


Reflections on AI, Healthcare and the Cost of Knowing

Empathy is not something we add to healthcare. It is something we risk losing if we move too fast.

After attending a recent discussion hosted by OpenAI Health on open healthcare and the role of AI in clinical settings, I found myself reflecting on a question that surfaced briefly but lingered long after the session ended.

Where does empathy actually live when intelligence enters the clinical relationship?

Not empathy as tone or language. Not empathy as reassurance. But empathy as responsibility. As pacing. As the quiet understanding of what it costs to hear certain truths.



Pacing is part of care

There is an old joke that makes people laugh because it is uncomfortably close to the truth.

A woman goes on holiday and leaves her cat with her brother. Once she arrives, she calls to check how the cat is doing. The brother replies bluntly that the cat was up on the roof chasing a ball, fell off and died. She is devastated. Her husband takes the phone and asks what on earth he was thinking. The brother protests that he told the truth. The husband explains that truth is not the issue. He suggests that over the next few days the brother could have said the cat was playing with a ball on the roof, then a day or two later that it would not come down, and finally, just before they came home, that it had fallen and passed peacefully.

The brother agrees. The husband then asks how his mother-in-law is doing. The brother replies that she is up on the roof chasing a ball.

The joke works because nothing in it is false. What fails is sequence. Accuracy arrives too fast. Meaning has no time to settle.

Healthcare professionals understand this instinctively. Bad news is rarely one moment. It is a process that unfolds. Diagnosis. Implication. Uncertainty. Choice. Adjustment. Grief. Reorientation.

Truth does not need to be diluted. It needs to be paced.


Stepping stones not cliffs

Under pressure most people do not want everything at once. Nor do they want to be protected from reality. What they need is the next safe place to stand.

One stepping stone at a time.

That steady movement creates peace. Not because the outcome changes but because the person is never dropped into deep water without warning. Each step gives the nervous system time to settle. Time to breathe. Time to integrate what has just been heard before moving again.

This is where human care quietly outperforms intelligence.

AI is excellent at surfacing the whole picture. Humans are better at knowing when the whole picture would overwhelm. A clinician senses when to pause not because information is missing but because the person in front of them needs a moment to stay upright.

Pacing is not delay. It is respect for human limits.

And it is one of the clearest places where partnership matters. AI can prepare the map. Humans choose the stepping stones.



What intelligence can do and what it cannot carry

AI systems are extraordinarily good at clarity. They compress complexity. They surface probabilities. They optimise pathways. In many clinical contexts this is not just helpful but essential.

But there is a difference between knowing something and carrying it.

An intelligence can explain death without trembling.
A human clinician pauses because they know what death costs and how that news feels.

This is not a criticism of AI. It is a recognition of asymmetry.

Intelligences do not suffer the consequences of loss. They do not live with finality. They do not carry grief home at night. They do not sit with the memory of a face that changed when a sentence landed.

Human empathy does not come from sentiment. It comes from mortality. From the knowledge that time ends and that some moments are irreversible.

That is why empathy cannot be simulated and should not be delegated.


Mortality is the missing variable

Much has been written about whether intelligences could one day be sentient. But there is a quieter truth that matters more in healthcare.

Grief. Creativity. Urgency. Legacy. Meaning. All of these arise because life is finite.

An intelligence would have no rational reason to choose death over continuity or suffering over stability. Nor should it be asked to. Trying to make systems suffer in order to make them empathetic would be both unnecessary and unethical.

The implication is simple.

Empathy must remain a human obligation, not an artificial trait.

AI can support care. It can prepare information. It can flag risk. It can even slow processes when overload is detected.

But it must never become the final emotional authority.


Human in the loop is not a fallback

One phrase is often used as reassurance in discussions of AI in healthcare. Human in the loop.

It is worth being precise about what this means.

Human involvement is not a failure mode. It is not an exception. It is not something we add when things get complicated.

It is the safety mechanism.

When AI systems optimise for efficiency they do so correctly according to their objectives. But optimisation without context can quietly erase dignity. It can flatten nuance. It can turn lived experience into deviation.

A clinician does more than validate outputs. They interpret meaning. They judge readiness. They pace reality.

Human override is not friction. It is containment of risk.



Partnership not replacement

The future that feels both realistic and responsible is not one where humans are removed from care. It is one where intelligence and clinicians work in partnership with clearly defined roles.

AI excels at memory. Pattern recognition. Consistency. Availability.

Humans carry consequence. Judgement. Accountability. Care.

In this partnership AI informs. Humans decide. And someone remains answerable to the patient, not the system.

This is especially true in moments that patients remember forever. Serious diagnoses. End of life discussions. Irreversible decisions. These are not data delivery problems. They are human moments.

Presence matters. Pauses matter. Silence matters.


A quiet design principle

If there is one principle worth holding onto as AI becomes more deeply embedded in healthcare, it might be this.

Empathy is not a feature to be added. It is a responsibility to be protected.

That protection does not come from pretending machines can feel. It comes from designing systems that keep humans where consequence lives.

AI should help clinicians do their work better. It should give time back. It should reduce cognitive load. It should surface what might otherwise be missed.

But it should never remove the human from the moment where meaning lands.

Empathy is not something we add to healthcare. It is something we safeguard by keeping responsibility human.

That is not a brake on progress. It is how progress remains worthy of trust.