
One Life's relevance to today

One Life is a historic story retold for today's audience, highlighting the responses of individuals, families and leaders. Krish Kandiah ponders what it can teach us about sanctuary.

Krish is a social entrepreneur partnering across civil society, faith communities, government and philanthropy. He founded The Sanctuary Foundation.

Anthony Hopkins plays Nicholas Winton.
BBC Film.

There’s an elderly man with thick-rimmed glasses sitting in the studio audience of a popular 1980s television programme. The camera lingers on him as the presenter on the stage, in her signature blue dress, opens up a scrapbook detailing a hitherto unknown mission on the eve of the Second World War that rescued 669 Jewish children from the Nazi genocide.

The man in the audience was the force behind this rescue mission, and the camera is focussed on him because one of the greatest moments in television history is about to unfold. Unbeknown to him, he is sitting next to a lady whose life he once saved. As Esther Rantzen reveals the connection, a look of shock, wonder and amazement crosses his face.

The story that was kept secret for nearly a lifetime was broken in front of a live television audience of millions. I’ve watched the recording a hundred times; it never fails to make me tear up. I’ve spoken to people who were on the production team of that show who say that this programme was the highlight of their careers. It was a truly brilliant piece of television. 

Nearly four decades later, I am sitting in the Royal Festival Hall next to another elderly gentleman. We have just watched Anthony Hopkins’s incredible performance as Nicholas Winton, that man in the studio, in the new movie One Life. The director of the movie, James Hawes, makes his way to the front and asks if there is anyone in the audience who is alive today because of Nicholas Winton. The elderly gentleman beside me stands, along with hundreds of others. Some of those standing were on the Kindertransport in 1939. Others were their children, grandchildren and great-grandchildren.

It was an immense privilege to spend some time with these survivors. Many had their original identity photographs with them. It was an emotional evening as I heard stories from those who remembered boarding the trains in Czechoslovakia in 1939 and saying goodbye to their parents for the last time. 

Many of the Kindertransport descendants had met Nicholas Winton personally before he died and were astounded by Hopkins’s ability to capture his likeness and his story.

I never met him myself, but as I watched One Life, I felt like I was in the room with him. The audience meets him as a young man discovering the terrible situation for Jews in Europe and deciding to take action. We journey through the many obstacles to the rescue mission. At first, nobody would take in the Jewish children because of the misconception that migrants would overwhelm local services at a difficult time for the country. Yet through savvy use of media, meticulous administration and sheer, unrelenting persistence, Winton and his mother (Helena Bonham Carter) were able to get a system running that meant hundreds of temporary foster parents not only came forward but paid for the privilege of helping to save the lives of these children.

Many of the children lost their families to the horrors of the gas chambers and could never be reunited with them, so a large number were adopted by their foster carers and grew up in the UK. Some went on to greatness; others lived quiet lives of service. The 91-year-old man who sat next to me at the premiere had dedicated his life to the church, and to making sure the next generation forgot neither the horrors of the Holocaust nor the hospitality of ordinary people.

One Life is a deeply inspirational film. As I reflected afterwards, I couldn’t help but draw parallels with the world I live in now, where terrible wars are in full swing. I wondered what Nicholas Winton would do for the children being slaughtered today. What would a modern equivalent of the Kindertransport look like? Who could step forward to inspire our nation once again to offer sanctuary, protection and hope to the world’s most vulnerable children?

  

Watch
Krish Kandiah reports from the One Life premiere: https://youtu.be/8u1UAc7GKek


AI will never codify the unruly instructions that make us human

The many exceptions to the rules are what make us human.
Jean Valjean and the candlesticks, in Les Misérables.

On average, students with surnames beginning in the letters A-E get higher grades than those who come later in the alphabet. Good looking people get more favourable divorce settlements through the courts, and higher payouts for damages. Tall people are more likely to get promoted than their shorter colleagues, and judges give out harsher sentences just before lunch. It is clear that human judgement is problematically biased – sometimes with significant consequences. 

But imagine you were on the receiving end of such treatment, and wanted to appeal your overly harsh sentence, your unfair court settlement or your punitive essay grade: is Artificial Intelligence the answer? Is AI intelligent enough to review the evidence, consider the rules, ignore human vagaries, and issue an impartial, more sophisticated outcome?  

In many cases, the short answer is yes. Conveniently, AI can review 50 CVs, conduct 50 “chatbot”-style interviews, and identify which candidates best fit the criteria for promotion. But is the short and convenient answer always what we want? In their recent publication, As If Human: Ethics and Artificial Intelligence, Nigel Shadbolt and Roger Hampson discuss research which shows that, if wrongly condemned to be shot by a military court but given one last appeal, most people would rather appeal in person to a human judge than have the facts of their case reviewed by an AI computer. Likewise, terminally ill patients indicate a preference for doctors’ opinions over computer calculations about when to withdraw life-sustaining treatment, even though a computer has greater predictive power to judge when someone’s life might be coming to an end. This preference may seem counterintuitive, but the cold impartiality, and at times the impenetrability, of machine logic may work for promotions yet fails to satisfy the desire for human dignity when it comes to matters of life and death.

In addition, Shadbolt and Hampson make the point that AI is actually much less intelligent than many of us tend to think. An AI machine can be instructed to apply certain rules to decision-making, and can apply those rules even in quite complex situations, but the determination of those rules can only happen in one of two ways: either the rules must be invented or predetermined by whoever programmes the machine, or they must be observable to a “large language model” AI when it scrapes the internet for common and typical patterns of human behaviour.

The former option, deciding the rules in advance, is by no means straightforward. Humans abide by a complex web of intersecting ethical codes, often slipping seamlessly between utilitarianism (what achieves the greatest good for the greatest number of people?), virtue ethics (what makes me a good person?) and theological or deontological ideas (what does God or wider society expect me to do?). This complexity, as Shadbolt and Hampson observe, means that:

“Contemporary intellectual discourse has not even the beginnings of an agreed universal basis for notions of good and evil, or right and wrong.”  

The solution might be option two – to ask AI to do a data scrape of human behaviour and use its superior processing power to determine if there actually is some sort of universal basis to our ethical codes, perhaps one that humanity hasn’t noticed yet. For example, you might instruct a large language model AI to find 1,000,000 instances of a particular pro-social act, such as generous giving, and from that to determine a universal set of rules for what counts as generosity. This is an experiment that has not yet been done, probably because it is unlikely to yield satisfactory results. After all, what is real generosity? Isn’t the truly generous person one who makes a generous gesture even when it is not socially appropriate to do so? The rule of real generosity is that it breaks the rules.  

Generosity is not the only human virtue which defies being codified – mercy falls at exactly the same hurdle. AI can never learn to be merciful, because showing mercy involves breaking a rule without having a different rule or sufficient cause to tell it to do so. Stealing is wrong: this is a rule we almost all learn in childhood. But in the famous opening to Les Misérables, Jean Valjean, a destitute convict, steals some silverware from Bishop Myriel, who has provided him with hospitality. Valjean is soon caught by the police and faces a lifetime of imprisonment and forced labour for his crime. Yet the Bishop shows him mercy, falsely informing the police that the silverware was a gift and even adding two further candlesticks to the swag. Stealing is, objectively, still wrong, but the rule is temporarily suspended, or superseded, by the bishop’s wholly unruly act of mercy.
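To see why, a deliberately toy sketch may help (written in Python purely for illustration; the rule names and the judge function are invented for this example, not drawn from any real system). The machine below can let one rule supersede another, but only because that supersession was itself written in as a rule beforehand; nowhere does it have room for the bishop’s unearned mercy.

```python
# A toy, hand-coded "rule engine": every judgement it can reach must already
# exist as a rule. Names and cases are invented purely for illustration.

RULES = [
    # (rule name, condition, verdict) - predetermined by whoever programmes it
    ("stealing_is_wrong", lambda case: case.get("stole", False), "guilty"),
    ("owner_calls_it_a_gift", lambda case: case.get("owner_says_gift", False), "not guilty"),
]

def judge(case: dict) -> str:
    """Apply the rules in order; a later rule may supersede an earlier one."""
    verdict = "not guilty"
    for _name, condition, outcome in RULES:
        if condition(case):
            verdict = outcome  # one rule overriding another is still just a rule
    return verdict

# Valjean's case: the verdict only flips because a "gift" rule was coded in
# advance - there is no rule the machine could break to show mercy unprompted.
print(judge({"stole": True}))                           # guilty
print(judge({"stole": True, "owner_says_gift": True}))  # not guilty
```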

Teaching his followers one day, Jesus stunned the crowd with a catalogue of unruly instructions. He said, “Give to everyone who asks of you,” and “Love your enemies” and “Do good to those who hate you.” The Gospel writers record that the crowd were amazed, astonished, even panicked! These were rules that challenged many assumptions about the “right” way to live – many of the social and religious “rules” of the day. And Jesus modelled this unruly way of life too – actively healing people on the designated day of rest, dining with social outcasts and having contact with those who had “unclean” illnesses such as leprosy. Overall, the message of Jesus was loud and clear: people matter more than rules.

AI will never understand this, because to an AI people don’t actually exist, only rules exist. Rules can be programmed in manually or extracted from a data scrape, and one rule can be superseded by another rule, but beyond that a rule can never just be illogically or irrationally broken by a machine. Put more simply, AI can show us in a simplistic way what fairness ought to look like and can protect a judge from being punitive just because they are a bit hungry. There are many positive applications to the use of AI in overcoming humanity’s unconscious and illogical biases. But at the end of the day, only a human can look Jean Valjean in the eye and say, “Here, take these candlesticks too.”   
