
The moral machine: algorithms that give a window into the soul

In TikTok’s algorithm, Graham Tomlin saw something that got him thinking. Could it lead to moral health rather than harm?

Graham is the Director of the Centre for Cultural Witness and a former Bishop of Kensington.

An abstract grid of colourful cubes with arrows and crosses, viewed from above and at an angle.
Champ Panupong Techawongthawon's illustration of artificial intelligence.
Google DeepMind on Unsplash.

A few years ago, I was thinking of buying a camera for my wife as a birthday present. I lazily browsed a couple of websites to check out the options. Then something odd started to happen. Somehow, my laptop seemed to think this was a good idea and sprang into action. Whenever I went onto Amazon, eBay or any other website selling stuff, it kept pushing adverts for cameras at me. Canon or Nikon? Point and Shoot or DSLR? How did it know? Could it read my mind?

It was the first time I noticed the power of the algorithm.

A bit later on, thinking I ought to get up to speed with the regions of social media that I had little clue about, I opened up TikTok and started to swipe upwards (apparently that is the way to do it). This time I was determined not to like or unlike anything, follow anyone or be followed by anyone. Yet mysteriously, it still worked out what I liked and kept pushing short, addictive videos at me, enticing clips of football, mountains and music along with other random stuff mixed in. How did it know me so well?

It is as old as the hills. The algorithm simply takes the desires of your heart and amplifies them.

Of course, better informed people than me know how all this works. The algorithm figures out which accounts you follow, any comments you’ve posted, clips you’ve liked or shared, and in particular, videos you watched all the way to the end. So, if you linger over a video, it knows you like it. If you rush on quickly to the next one, it makes a mental note that you’re not so keen.
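The signals described above can be sketched as a toy ranking function. This is purely illustrative: the weights and field names are invented for the sketch, and real recommender systems are vastly more complex.

```python
# A toy sketch of engagement-based ranking. The weights below are
# invented for illustration; real systems learn them from data.

def engagement_score(watch_fraction, liked, shared, commented):
    """Combine viewing signals into a single interest score.
    Watching a video to the end counts most heavily."""
    score = watch_fraction  # 0.0 (skipped at once) to 1.0 (watched fully)
    if liked:
        score += 0.5
    if shared:
        score += 0.7
    if commented:
        score += 0.3
    return score

def rank_videos(candidates, history):
    """Rank candidate videos by how much the user has lingered
    over each topic in the past: more of the same, amplified."""
    interest = {}
    for view in history:
        s = engagement_score(view["watch_fraction"], view["liked"],
                             view["shared"], view["commented"])
        interest[view["topic"]] = interest.get(view["topic"], 0.0) + s
    # Feed back whatever has accumulated the highest interest.
    return sorted(candidates,
                  key=lambda v: interest.get(v["topic"], 0.0),
                  reverse=True)
```

Even this crude version shows the dynamic the article describes: nothing in the code has a moral opinion, yet every pass through the loop pushes the user further in the direction they are already headed.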

It all feels a little sinister, yet very clever. You often read dark theories about social media and the way it is re-wiring our brains. Yet when you look a little closer, it is as old as the hills. The algorithm simply takes the desires of your heart and amplifies them. So, if you like or linger over certain videos expressing a particular cultural or political opinion, it will send you more of the same. The result is that we get confirmed in our own frameworks, which never get challenged by others. It's part of why we are so polarised as societies these days. When you ask why ‘the other side’ cannot see the obvious truth that you see, the answer is that they literally don’t see it. They don't see it because the algorithm doesn't feed them the same things as it feeds you.

As a result, TikTok or Facebook is an alarming mirror into the soul – see what it sends you and it just may be that it tells you more about yourself than you would like to know.  

Social media like TikTok, Facebook and Twitter (or X) learn to recognise what your heart really desires (not just what you say you do). They notice what you linger over, what catches your fancy, and send you more of the same. They are, apparently, studiously neutral on moral questions. They seem to have no moral designs on you to school or form your soul in particular ways, but are simply a reflection of your own longings. What TikTok, Facebook, Instagram and the others all do is to propel you further in the moral direction in which you are already headed. Which for most of us, is not a great idea.

Now of course there would be howls of protest if TikTok announced a moral code – that it was about to encourage virtue and discourage vice by deliberately sending us improving videos, material that the mysterious people who run it think is good for us. And that is not because we think virtue is bad and vice is good, but because we can’t decide on what virtues we want to encourage or what vices to stamp out. We draw a line at cruelty to children and extreme violence, but not much else. It is also because we hold as sacrosanct the freedom of the (adult) individual to choose his or her own way in life, as long as they don’t hurt anyone else.

Such sites are exemplars and vehicles of expressive individualism – not just in the myriads of people who show off their dance moves, sing their songs or act out half-funny scenes on a golf course, but in that they confirm me in my own wishes. They don’t tell me what to want. But they give me more of what I want. As a result, TikTok or Facebook is an alarming mirror into the soul – see what it sends you and it just may be that it tells you more about yourself than you would like to know.

It matters what we feed our souls with. It matters what stories we allow ourselves to be told.  

Such sites appear to be morally neutral. They don’t seem to aim to educate or form you in any particular direction. Or at least they are not supposed to. But of course nothing is entirely neutral.

Funnily enough, it’s not how we bring up children, or educate ourselves. When we bring up a child, most of us have some kind of vague or not so vague moral code in mind. We reward kind and helpful behaviour, and we punish selfish and mean actions. We don’t tend to give more of the same to a child who has eaten the first half of the packet of biscuits, or encourage a brother to hit his sister yet again. We have a goal of some form of moral formation in mind.

Yet, despite our confusion over which virtues to encourage, we need some kind of moral guidance for our wandering and flawed hearts, linked as they are to eyes that are tempted to feast on things that fascinate but are not good for us. Like a glutton who cannot stop eating, even if these sites don’t themselves push extreme violence, pornography or aggression, they offer enough of the soft version of these to draw you in. And it’s not hard to find sites that will take you deeper into the darkness. Those sites will already know the way you are thinking and desiring, and are ready to pull you deeper into the mire.

The problem is not so much with the algorithm. It is with us. Netflix’s documentary ‘The Social Dilemma’ quotes an alarming statistic: fake news spreads six times faster than the truth. The reason is not hard to find. We are fascinated by the sensational and alarming rather than something a little more ordinary yet which happens to be true. As one person in the documentary put it: “The internet has a bias towards false information. Because false information makes more money. The truth is boring.”

The moral philosopher Gilbert Meilaender wrote:

“Successful moral education requires a community which does not hesitate to inculcate virtue in the young, which does not settle for the discordant opinions of alternative visions of the good, which worries about what the stories of its poets teach.”

It matters what we feed our souls with. It matters what stories we allow ourselves to be told.

The purveyors of social media are not innocent in this, as they do exploit our worst tendencies, but in the end they simply confirm us in our own moral confusion. Yet this does point up the problem with the liberal ideal of leaving ethical decisions entirely up to the individual: entirely free choice without any guidance will, with our crooked hearts, always end up feeding the darker sides of our characters, without a corresponding pull in the other direction, something Christians call divine grace.

St Paul wrote to the small group of Christians in Philippi, surrounded by the highly sexualised and violent culture of the Roman empire: “whatever is true, whatever is honourable, whatever is just, whatever is pure, whatever is pleasing, whatever is commendable, if there is any excellence and if there is anything worthy of praise, think about these things.”

I’m not saying don’t watch TikTok. But here’s an idea. Why not try to make it into a morally forming tool, shaped by what you want to be, not what you are? Exercise a bit of moral direction yourself. If you see a video which you know in your conscience is not good, or is spreading lies, swipe it away quickly. If you see something positive, dwell on it.

If you approach it this way, you might just be able to persuade the algorithm to shape you in good ways and not the bad. It could become a means of growing in goodness, but only if you want it to be.


AI deadbots are no way to cope with grief

The data we leave in the cloud will haunt and deceive those we leave behind.


A tarnished humanoid robot rests its head to the side, its LED eyes looking to the camera.
Nicholas Fuentes on Unsplash.

What happens to all your data when you die? Over the years, like most people, I've produced a huge number of documents, letters, photos, social media posts and recordings of my voice, all of which exist somewhere out there in the cloud (the digital one, not the heavenly one). When I die, what will happen to it all? I can't imagine anyone taking the time to climb into my Dropbox folder or Instagram account and delete it all. Does all this stuff remain out there, cluttering up cyberspace like defunct satellites orbiting the earth?

The other day I came across one way it might have a future: the idea of ‘deadbots’. Apparently, AI has now developed to such an extent that it can simulate the personality, speech patterns and thoughts of a deceased person. In centuries past, most people did not leave behind much record of their existence. Maybe a small number of possessions, memories in the minds of those who knew them, perhaps a few letters. Now we leave behind a whole swathe of data about us. AI is now capable of taking all this data and creating a kind of animated avatar representing the deceased person, known as a ‘deadbot’ or, even more weirdly, a ‘griefbot’.

You can feel the attraction. An organisation called ‘Project December’ promises to ‘simulate the dead’, offering a ghostly video centred around the words ‘it’s been so long: I miss you.’ For someone stricken with grief, wondering whether there's any future in life now that their loved one has gone, feeling the aching space in the double bed, breakfast alone, the silence where conversation once filled the air, the temptation to be able to continue to interact and talk with a version of the deceased might be irresistible. 

There is already a developing ripple of concern about this ‘digital afterlife industry’. A recent article in Aeon explored the ethical dilemmas. Researchers at Cambridge University have already called for safety protocols against the social and psychological damage that such technology might cause. They focus on the potential for unscrupulous marketers to spam surviving family or friends with the message that they really need XXX because ‘it's what Jim would have wanted’. You can imagine the bereaved ending up being effectively haunted by the ‘deadbot’, and unable to deal with grief healthily. It can be hard to resist for those whose grief is all-consuming and persistent.

Yet it's not just the financial dangers, the possibility of abuse, that troubles me. It's the deception involved, which seems to me to operate in a number of ways. And it's theology that helps identify the problems.

The offer of a disembodied, AI-generated replication of the person is a thin paltry offering, as dissatisfying as a Zoom call in place of a person-to-person encounter. 

An AI-generated representation of a deceased partner might provide an opportunity for conversation, but it can never replicate the person. One of the great heresies of our age (one we got from René Descartes back in the seventeenth century) is the utter dualism between body and soul. It is the idea that we have some kind of inner self, a disembodied soul or mind which exists quite separately from the body. We sometimes talk about bodies as things that we have rather than things that we are. The anthropology taught within the pages of the Bible, however, suggests we are not disembodied souls but embodied persons, so much so that after death, we don't dissipate like ethereal ‘software’ liberated from the ‘hardware’ of the body, but we are to be clothed with new resurrection bodies continuous with, but different from the ones that we possess right now. 

We learned about the importance of our bodies during the COVID pandemic. When we were reduced to communicating via endless Zoom calls, we realised that while they were better than nothing, they could not replicate the reality of face-to-face bodily communication. A Zoom call couldn't pick up the subtle messages of body language. We missed the importance of touch and even the occasional embrace. Our bodies are part of who we are. We are not souls that happen to temporarily inhabit a body, inner selves that are the really important bit of us, with the body an ancillary, malleable thing that we don't ultimately need. The offer of a disembodied, AI-generated replication of the person is a thin paltry offering, as dissatisfying as a virtual meeting in place of a person-to-person encounter. 

Another problem I have with deadbots is that they fix a person in time, like a fossilised version of the person who once lived. AI can only work with what that person has left behind - the recordings, the documents, the data which they produced while they were alive. And yet a crucial part of being human is the capacity to develop and change. As life continues, we grow, we shift, our priorities change. Hopefully we learn greater wisdom. That is part of the point of conversation: we learn things, and it changes us in interaction with others. There is the possibility of spiritual development, of maturity, of redemption. A deadbot cannot do that. It cannot be redeemed, it cannot be transformed, because it is, to quote U2, stuck in a moment, and you can’t get out of it.

This is all of a piece with a general trajectory in our culture, which is to deny the reality of death. For Christians, death is an intruder. Death - or at least the form in which we know it, that of loss, dereliction, sadness - was not part of the original plan. It doesn't belong here, and we long for the day when it will be banished for good. You don’t have to be a Christian to feel the pain of grief, but paradoxically it's only when you have a firm sense of hope that death is a defeated enemy that you can take it seriously as a real enemy. Without that hope, all you can do is minimise it, pretend it doesn't really matter, hold funerals that try to be relentlessly cheerful, denying the inevitable sense of tragedy and loss that they were always meant to express.

Deadbots are a feeble attempt to try to ignore the deep gulf that lies between us and the dead. In one of his parables, Jesus once depicted a conversation between the living and the dead:  

“between you and us a great chasm has been fixed, so that those who might want to pass from here to you cannot do so, and no one can cross from there to us.”  

Deadbots, like ‘direct cremations’, where the body is disposed of without any funeral, denying the bereaved the chance to grieve, and like the language around assisted dying that death is ‘nothing at all’ and therefore can be deliberately hastened, are an attempt to bridge that great chasm, which, this side of the resurrection, we cannot do.

Deadbots in one sense are a testimony to our remarkable powers of invention. Yet they cannot ultimately get around our embodied nature, offer the possibility of redemption, or deal with the grim reality of death. They offer a pale imitation of the source of true hope - the resurrection of the body, the prospect of meeting our loved ones again, transformed and fulfilled in the presence of God, even if that means painful yet hopeful patience and waiting until that day.
