
There’s more than one way to lose our humanity

How we treat immigrants and how AI might treat humans weighs on the mind of George Pitcher.

George is a visiting fellow at the London School of Economics and an Anglican priest.

The Bibby Stockholm accommodation barge in Portland Harbour.
Ashley Smith, CC BY-SA 4.0, via Wikimedia Commons.

“The greatness of humanity,” said Mahatma Gandhi, “is not in being human, but in being humane.” At first glance, this is something of a truism. But actually Gandhi neatly elides the two meanings of humanity in this tight little phrase. 

Humanity means both the created order that we know as the human race and its capacity for self-sacrificial love and compassion. In the Christian tradition, we celebrate at Christmas what we call the incarnation – the divine sharing of the human experience in the birth of the Christ child.  

Our God shares our humanity and in doing so, shows his humanity in the form of a universal and unconditional love for his people. So, it’s an act both for humanity and of humanity. 

This Christmas, there are two very public issues in which humanity has gone missing in both senses. And it’s as well to acknowledge them as we approach the feast. That’s in part a confessional act; where we identify a loss of humanity, in both its definitions, we can resolve to do something about it. Christmas is a good time to do that. 

The first is our loss of humanity in the framing of legislation to end illegal immigration to the UK. The second is the absence of humanity in the development of artificial intelligence. The former is about political acts that are inhumane and the latter goes to the nature of what it is to be human. 


There is a cynical political line that the principal intention of the government’s Safety of Rwanda (Asylum and Immigration) Bill, voted through the House of Commons this week, is humane, in that it’s aimed at stopping the loss of life among migrants exploited by criminal gangs. But it commodifies human beings, turning them into cargo to be exported elsewhere. That may not be a crime – the law has yet to be tested – but it is at least an offence against humanity. 

Where humanity, meaning what it is to be human, is sapped, hope withers into despair. When a human being is treated as so much freight, not only does their value diminish in the eyes of others, but so does their sense of self-worth. The suicide of an asylum seeker on the detention barge Bibby Stockholm in Portland Harbour is a consequence of depreciated humanity. Not that we can expect to hear any official contrition for that. 

To paraphrase Gandhi, when we cease to be humane we lose our humanity. And we have literally lost a human to our inhumanity, hanged in a floating communal bathroom. It’s enough to make us look away from the crib, shamed rather than affirmed in our humanity. 

That’s inhumanity in the sense of being inhumane. Turning now to humanity in the sense of what it means to be human, we’re faced with the prospect of artificial intelligence which not only replicates but replaces human thought and function.  


The rumoured cause of the ousting of CEO Sam Altman last month from OpenAI (before his hasty reinstatement just five days later) was his involvement in a shadowy project called Q-star, GPT-5 technology that is said to push dangerously into the territory of human intelligence. 

But AI’s central liability is that it lacks humanity. It is literally inhuman, rather than inhumane. We should take no comfort in that because that’s exactly where its peril lies. Consciousness is a defining factor of humanity. AI doesn’t have it and that’s what makes it so dangerous. 

To “think” infinitely quicker across unlimited data and imitate the best of human creativity, all without knowing that it’s doing so, makes for a daunting technology. It begins to look like a future in which humanity becomes subservient to its technology – and that’s indeed dystopian. 

But we risk missing a point when our technology meets our theology. It’s often said that AI has the potential to take on God-like qualities. This relates to the prospect of its supposed omniscience. Another way of putting that is that it has the potential to be all-powerful. 

The trouble with that argument is that it takes no account of the divine quality of being all-loving too, which in its inhumanity AI cannot hope to replicate. In the Christmastide incarnation, God (as Emmanuel, or “God with us”) comes to serve, not to be served. If you’ll excuse the pun, you won’t find that mission on a computer server. 

Furthermore, to be truly God-like, AI would need to allow itself to suffer and to die on humanity’s behalf, albeit to defeat its death in a salvific way. Sorry, but that isn’t going to happen. We must be careful with AI precisely because it’s inhuman, not because it’s too human. 

Part of what we celebrate at Christmas is our humanity and, in doing so, we may re-locate it. We need to do that if we are to treat refugees with humanity and to re-affirm that humanity’s intelligence is anything but artificial. Merry Christmas. 


Here’s why we play judge and jury on social media

Discovering the truth about celebrity feuds.

Rosie studies theology in Oxford and is currently training to be a vicar.

Lively and Baldoni face off.

Depending on your Instagram algorithm, you might have seen that Hollywood actors Blake Lively and Justin Baldoni continue to make news with their ongoing feud, which is soon to reach litigation in the US civil courts. Then again, maybe you haven’t – in which case kudos to your scrolling habits for avoiding celebrity clickbait (unlike me). 

What interests me about their dispute – and others that have gone before it – is how it spotlights our need, as the general public, to search out the truth. And to make ourselves judge and jury on the matter. 

After the pair starred together last summer in It Ends With Us, Lively accused Baldoni of sexual harassment and of orchestrating a smear campaign against her during the film’s press tour. Baldoni responded by suing the New York Times for libel, and Lively for civil extortion and defamation. Cue biased media reporting and conflicting evidence released by their legal teams, and both actors’ reputations have been significantly damaged by the dispute.  

With their accounts remaining at complete odds with each other, the question Instagram’s pundits keep coming back to is: which one of them is telling the truth? 

The reality is we’ll probably never fully know (and, obviously, it’s not actually any of our business, so I won’t speculate).  

But it makes me reflect on how, in lots of instances of conflict, the answer can be blurrier than we’d like. 


So often, in disagreements and disputes, both parties’ accounts have a seed of truth in them. But as we ruminate on the event afterwards, the risk is that we re-interpret it according to our values, biases, and past experiences. That seed of truth is watered by the stories we tell ourselves, growing and morphing into something that can become hard to untangle. 

Over time, as we centre ourselves in the narrative, we become the ultimate arbiters of our truth.  

But when the stories we tell ourselves become the stories we also tell others, and we discover that our respective truths are in fundamental conflict with each other, it exposes how our perception of a situation might differ from its reality. 

Which is why, so often, we have to defer to impartial third parties to search out the ultimate truth. Judges and juries who seek to understand each person’s story but who also inhabit the fuller narrative, and who can untangle the layers of interpretation we unknowingly heap onto our experiences. 

The judges and juries of Instagram rarely, if ever, offer us this kind of impartiality in their search for the truth. 

But they remind us that truth is, ultimately, found outside of ourselves. And that, in discovering the truth, we can also find the justice we’re so often longing for. 

Maybe we’re all just suckers for a bit of clickbait. But perhaps the need to make ourselves judge and jury also points to a deeper part of our humanity. We’re all seeking after truth in this world – if only we can find it. 
