Essay
Comment
Morality
5 min read

Oppenheimer, my father, and the bomb

One week after its release, Christopher Nolan's latest blockbuster has left Luke Bretherton pondering an unresolved disagreement with his late father and the theology of Oppenheimer's creation.

Luke Bretherton is a Professor of Moral and Political Theology and senior fellow of the Kenan Institute for Ethics at Duke University in Durham, North Carolina.


I went to see the film Oppenheimer on its opening night at my local, community-run cinema in Acton in west London. It was packed. The event felt more like going to church than to the movies. The film itself is a biopic of the scientist J. Robert Oppenheimer, who was a pivotal figure in leading the development of nuclear weapons during World War II.

Reflecting on the film afterwards, I found it brought to mind a difficult and never-resolved argument with my late father. I realised I was still haunted by our dispute.

Our argument centred not on whether it was right to drop the bomb. Our argument was about whether it was Christian.

My father was 18 in 1945 when atomic bombs were detonated over the Japanese cities of Hiroshima and Nagasaki, killing over 200,000 souls. He was conscripted into the British Army that year and stationed in India. If the war had not ended, he would have been among those deployed to invade Japan.

Our argument was not just about whether it was right to drop the bomb. It was also about whether it was Christian. My father was an ardent believer who converted to Christianity in the 1950s. His Christian commitments deeply shaped every aspect of his life and work. I followed in his footsteps, and at the time of our argument I was doing a PhD in moral philosophy and theology. In part I was trying to make sense of what it meant to be a Christian in the aftermath of events like the Holocaust and the dropping of nuclear weapons over Hiroshima and Nagasaki, events in which it seemed Christian beliefs and practices played a key part. In the film, this is marked by the stark symbolism of Oppenheimer naming the first test of the prototype nuclear weapon “Trinity” – a key and often-used way in which Christians name God.

I had been learning about just war theory when the argument with my father erupted. I was having dinner with my mum and dad at their house. To give a bit of context, my father and I had a long history of sometimes bitter arguments over political matters. These began in the 1980s when I was a teenager. He thought Mrs Thatcher a hero. I did not.

I was telling them about just war theory and its history in Christian thought and practice. As with most of our arguments, we stumbled into it. I made a throwaway remark about how, in the light of just war theory, nuclear weapons were immoral and that their use in 1945 was wrong. And yes, I was probably being pompous and annoying like all those possessed of a little new knowledge and a lot of self-righteous certitude and fervour.

My dad replied with anger that I did not know what I was talking about. Didn’t I realise that if the bombs hadn’t been dropped many more would have died, including him, which meant I would not exist? A version of this argument appears in the film and was often used by Oppenheimer to justify his own involvement in developing atomic weapons.

At the time, I replied with a procedural point that nuclear weapons do not distinguish between combatants and non-combatants, a key distinction in determining the morality or otherwise of targets in war. To use nuclear weapons is to deliberately intend the indiscriminate killing of the innocent. This constitutes murder and not, as the euphemism has it, unintended collateral damage. I added insult to injury by declaring that my dad’s argument was also deeply unchristian as it was a version of the ends justify the means. Was it ever right to do evil even if good might be the result? This upset my father still further. For him it was personal. It was existential. The bombs saved his life. The bombs made our life possible.

The meal, like the argument, did not end well. We had both upset my mother. She banned us from ever talking politics at the family dinner table again. It was a lifetime ban.

What dawned on me was that the question of whether it was moral to possess, let alone use, nuclear weapons was also an existential question for me. 

Afterwards I thought more about our row. I replayed the script in my head, trying to think of what I should have said. In my immaturity, I never thought to consider how I should have said it.

What dawned on me was that the question of whether it was moral to possess, let alone use, nuclear weapons was also an existential question for me. It was a question of what kind of existence warranted anyone possessing nuclear weapons. To use the language of the Cold War of which I was a child: was it better to be red than dead? Was it better to be invaded and taken over by Communists and see capitalism abolished and the British nation subordinated to a foreign power, or to deter this possibility by possessing nuclear weapons, weapons that threatened to destroy all life on this planet? In other words, was my way of life really worth the threat of nuclear annihilation? Was any way of life or ideology or commitment or abstract principle worth that? I concluded that it was not and promptly joined the Campaign for Nuclear Disarmament (CND).

I have not attended a CND rally for many years. And what happened in 1945 is more complicated than I used to think. But I still disagree with my dad and think Oppenheimer was deeply misguided. And what happened after 1945 with the advent of the nuclear arms race is not complicated. The film portrays Oppenheimer as anticipating and trying to forestall the process of one-upmanship that developing the A-bomb and then the H-bomb set in motion. He was right to do what he could to stop the arms race, even though, as the film portrays, the authorities tried to silence and marginalise him for his efforts.

Today, if my father and I were able to have the argument again, I would approach it very differently. I hope I would be less pompous, annoying, and self-righteous. But mostly, I would be more theological. I would ask him whether he thought Jesus would drop a nuclear bomb to save a life, or whether Jesus’s own life, death, and resurrection pointed in a different direction. And then see where that conversation took us.

Column
Comment
4 min read

There’s more than one way to lose our humanity

How we treat immigrants and how AI might treat humans weighs on the mind of George Pitcher.

George is a visiting fellow at the London School of Economics and an Anglican priest.

A grey multi-story accommodation barge floats beside a dock.
The Bibby Stockholm accommodation barge in Portland Harbour.
Ashley Smith, CC BY-SA 4.0, via Wikimedia Commons.

“The greatness of humanity,” said Mahatma Gandhi, “is not in being human, but in being humane.” At first glance, this is something of a truism. But actually Gandhi neatly elides the two meanings of humanity in this tight little phrase. 

Humanity means both the created order that we know as the human race and its capacity for self-sacrificial love and compassion. In the Christian tradition, we celebrate at Christmas what we call the incarnation – the divine sharing of the human experience in the birth of the Christ child.  

Our God shares our humanity and in doing so, shows his humanity in the form of a universal and unconditional love for his people. So, it’s an act both for humanity and of humanity. 

This Christmas, there are two very public issues in which humanity has gone missing in both senses. And it’s as well to acknowledge them as we approach the feast. That’s in part a confessional act; where we identify a loss of humanity, in both its definitions, we can resolve to do something about it. Christmas is a good time to do that. 

The first is our loss of humanity in the framing of legislation to end illegal immigration to the UK. The second is the absence of humanity in the development of artificial intelligence. The former is about political acts that are inhumane and the latter goes to the nature of what it is to be human. 

We have literally lost a human to our inhumanity, hanged in a floating communal bathroom. It’s enough to make us look away from the crib, shamed rather than affirmed in our humanity. 

There is a cynical political line that the principal intention of the government’s Safety of Rwanda (Asylum and Immigration) Bill, voted through the House of Commons this week, is humane, in that it’s aimed at stopping the loss of life among migrants exploited by criminal gangs. But it commodifies human beings, turning them into cargo to be exported elsewhere. That may not be a crime – the law has yet to be tested – but it is at least an offence against humanity. 

Where humanity, meaning what it is to be human, is sapped, hope withers into despair. When a human being is treated as so much freight, their value not only diminishes objectively; so does their sense of self-worth. The suicide of an asylum seeker on the detention barge Bibby Stockholm in Portland Harbour is a consequence of depreciated humanity. Not that we can expect to hear any official contrition for that. 

To paraphrase Gandhi, when we cease to be humane we lose our humanity. And we have literally lost a human to our inhumanity, hanged in a floating communal bathroom. It’s enough to make us look away from the crib, shamed rather than affirmed in our humanity. 

That’s inhumanity in the sense of being inhumane. Turning now to humanity in the sense of what it means to be human, we’re faced with the prospect of artificial intelligence which not only replicates but replaces human thought and function.  

To be truly God-like, AI would need to allow itself to suffer and to die on humanity’s behalf. 

The rumoured cause of the ousting of CEO Sam Altman last month from OpenAI (before his hasty reinstatement just five days later) was his involvement in a shadowy project called Q-star, GPT-5 technology that is said to push dangerously into the territory of human intelligence. 

But AI’s central liability is that it lacks humanity. It is literally inhuman, rather than inhumane. We should take no comfort in that because that’s exactly where its peril lies. Consciousness is a defining factor of humanity. AI doesn’t have it and that’s what makes it so dangerous. 

A technology that can “think” infinitely quicker across unlimited data and imitate the best of human creativity, all without knowing that it is doing so, is daunting. It points to a future in which humanity becomes subservient to its technology – and that is indeed dystopian. 

But we risk missing a point when our technology meets our theology. It’s often said that AI has the potential to take on God-like qualities. This relates to the prospect of its supposed omniscience. Another way of putting that is that it has the potential to be all-knowing. 

The trouble with that argument is that it takes no account of the divine quality of being all-loving too, which in its inhumanity AI cannot hope to replicate. In the Christmastide incarnation, God (as Emmanuel, or “God with us”) comes to serve, not to be served. If you’ll excuse the pun, you won’t find that mission on a computer server. 

Furthermore, to be truly God-like, AI would need to allow itself to suffer and to die on humanity’s behalf, albeit to defeat its death in a salvific way. Sorry, but that isn’t going to happen. We must be careful with AI precisely because it’s inhuman, not because it’s too human. 

Part of what we celebrate at Christmas is our humanity and, in doing so, we may re-locate it. We need to do that if we are to treat refugees with humanity and to re-affirm that humanity’s intelligence is anything but artificial. Merry Christmas.