
It's our mistakes that make us human

What we learn distinguishes us from tech.

Silvianne Aspray is a theologian and postdoctoral fellow at the University of Cambridge.

A man staring at a laptop grimaces and holds his hands to his head.
Francisco De Legarreta C. on Unsplash.

The distinction between technology and human beings has become blurry: AI seems to be able to listen, answer our questions, even respond to our feelings. It becomes increasingly easy to confuse machines with humans. In this situation, it is all the more important to ask: What makes us human, in distinction from machines? There are many answers to this question, but for now I would like to focus on just one aspect of what I think is distinctively human: As human beings, we live and learn in time.

To be human means to be intrinsically temporal. We live in time and are oriented towards a future good. We are learning animals, and our learning is bound up with the taking of time. When we learn to know or to do something, we necessarily make mistakes, and we need practice. But keeping in view something we desire – a future good – we keep going.

Let’s take the example of language. We acquire language in community over time. Toddlers make all sorts of hilarious mistakes when they first try to talk, and it takes them a long time even to get single words right, let alone to form sentences. But they keep trying, and they eventually learn. The same goes for love: Knowing how to love our family or our neighbours near and far is not something we are good at instantly. It is not the sort of learning where you absorb a piece of information and then you ‘get’ it. No, we learn it over time, we imitate others, we practise, and even when we have learned, in the abstract, what it is to be loving, we keep getting it wrong.

This, too, is part of what it means to be human: to make mistakes. Not the sort of mistakes machines make, when they classify some information wrongly, for instance, but the very human mistake of falling short of your own ideal. Of striving towards something you desire – happiness, in the broadest of terms – and yet falling short, in your actions, of that very goal. But there’s another very human thing right here: Human beings can also change. They – we – can have a change of heart, be transformed, and at some point in time actually start to do the right thing – even against all the odds. Statistics of past behaviours do not always correctly predict future outcomes. Part of being human means that we can be transformed.

Transformation sometimes comes suddenly, when an overwhelming, awe-inspiring experience changes somebody’s life as if by a bolt of lightning. Much more commonly, though, such transformation takes time. Through taking up small practices, we can form new habits, gradually acquire virtue, and do the right thing more often than not. This is so human: We are anything but perfect. As Christians would say: We have a tendency to entangle ourselves in the mess of sin and guilt. But we also bear the image of the Holy One who made us, and by the grace and favour of that One, we are not forever stuck in the mess. We are redeemed: given the strength to keep trying despite the mistakes we make, and given the grace to acquire virtue and become better people over time. All of this is to say that being human means to live in time, and to learn in time.


Now compare this to the most complex of machines. We say that AI is able to “learn”. But what does it mean for AI to learn? Machine learning is usually categorised into supervised, unsupervised, and self-supervised learning. Supervised learning means that a model is trained for a specific task on correctly labelled data. For instance, if a model is to predict whether a mammogram image contains a cancerous tumour, it is given many example images, each correctly classed as ‘contains cancer’ or ‘does not contain cancer’. That way, it is “taught” to recognise cancer in unlabelled mammograms. Unsupervised learning is different: here, the system looks for patterns in the dataset it is given, clustering and grouping data without relying on predefined labels. Self-supervised learning borrows from both: the system uses part of the data itself as a kind of label – predicting, for instance, the upper half of an image from its lower half, or the next word in a given text. This is the predominant paradigm for how contemporary large-scale AI models “learn”.
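To make the “next word” example concrete, here is a minimal, purely illustrative sketch in Python. The language, the toy corpus and the helper name predict_next are my own choices for illustration, not anything drawn from the article, and no real AI system is remotely this simple. It shows the self-supervised idea in miniature: the text supplies its own labels, because each word serves as the “answer” for the word that precedes it.

from collections import Counter, defaultdict

# A toy corpus: in self-supervised learning the data labels itself,
# since every next word is the "correct answer" for the word before it.
corpus = "we live and learn in time and we learn to love in time".split()

# Count, for each word, which word tends to follow it (a tiny bigram model).
next_word_counts = defaultdict(Counter)
for current, following in zip(corpus, corpus[1:]):
    next_word_counts[current][following] += 1

def predict_next(word):
    """Return the most frequent follower of `word` in the past text, or None."""
    followers = next_word_counts.get(word)
    return followers.most_common(1)[0][0] if followers else None

print(predict_next("learn"))   # 'in' – a pattern lifted from the past text
print(predict_next("future"))  # None – the model has never seen this word

Even in this toy form, the point stands: the “prediction” can only echo patterns already present in the data it was given; nothing in it reaches toward a future good.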

In each case, AI’s learning is necessarily based on data sets. Learning happens with reference to pre-given data, and in that sense with reference to the past. It may look like such models can consider the future and have future goals, but only insofar as they have picked up patterns in past data, which they use to predict future patterns – as if the future were nothing but a repetition of the past.

So this is a real difference between human beings and machines: Human beings can, and do, strive toward a future good. Machines, by contrast, are always oriented towards the past of the data that was fed to them. Human beings are intrinsically temporal beings, whereas machines are defined by temporality only in a very limited sense: it takes time to upload data, and for the data to be processed, for instance. Time, for machines, is nothing but an extension of the past, whereas for human beings it is an invitation to, and the possibility of, being transformed for the sake of a future good. We, human beings, are intrinsically temporal, living in time towards a future good – something machines do not do.

In the face of new technologies, we need a sharpened sense of the strange and awe-inspiring species that is the human race, and we need to cultivate a new sense of wonder about humanity itself.


Let 2025 be a year of cultural Christianity – celebs and all

New epiphanies challenge traditional authority.

George is a visiting fellow at the London School of Economics and an Anglican priest.

A man in a suit stands in front of an orchestra, by a lectern, gesturing while talking.
Tom Holland.
x.com/TheRestHistory.

Monday is the Feast of the Epiphany, marking the end of the 12 days of Christmas. It’s rather good when Christmas falls on a Wednesday, so that Epiphany is on a Monday – the start of a new working week. There has been no need to return from holiday during the previous week, for me at least: the only “work” I’m doing before Epiphany is the writing of this column.

Historically, Epiphany celebrated the nativity and/or the baptism of the Christ, but in the western Church we’ve moved those festivals, not least to mark the birth of the Christ child at Christmas. The Greek root of Epiphany means “manifestation”, and in popular, polytheistic religions nature is full of local manifestations of the gods.

A deity might manifest in a divine human, a monarchical figure or a miracle worker. In ancient Greek philosophy, epiphany-religion is the foundation of a natural theology discerning these manifestations of the divine in all things, which we might call pantheism. 

We’re more reticent about direct epiphanies of God in our biblical religion, but they do occur, notably for Moses in his witness of the burning bush (and thereafter in his regular audiences with the Godhead as he leads his people from Egypt). 

Such manifestations invariably come by way of a promise, supremely in Christianity in the incarnation at Christmas. Our theology might hold that not until the epiphany of the Christ at the end of history can we speak of the Feast of the Epiphany in any fulfilled sense. 

But there is another, more immediate side to the Epiphany. The coming of the magi – or sages – from the East to pay homage to the Christ child has long been interpreted as the manifestation of God to the Gentiles in the poverty of a Judean stable. 

There is nothing actually in the scriptural story to indicate that these grandees are not Jews of the Diaspora, journeying back to their homeland to witness the Christ. But, importantly, this has come to represent the gift of a new covenant to the world, rather than the Mosaic covenant exclusive to the Jews.    

And it’s with that idea that I suggest our Epiphany has very current cultural implications. The point of Epiphany is that it’s not ours, it’s everyone’s. It’s not owned here, it’s out there. It’s not multicultural, it’s transcultural, even supercultural.  

So the Christian faith defies ownership, as the magi demonstrate. That’s a vital notion at a time when the phrase “cultural Christianity” has gained fresh traction. It’s customary at this point to list who most famously identifies as culturally Christian. So, briefly, here goes. 

The professional atheist (he has earned a living from it) Richard Dawkins claims such status; former Dutch MP Ayaan Hirsi Ali has moved from Islam to atheism to Christian faith, claiming it as a bulwark against cultures that threaten us; and the historian Tom Holland has expended a substantial proportion of his scholarship demonstrating that western civilisation is built on Christian foundations. Elon Musk has chimed in as a cultural Christian. So has rocker Nick Cave. There are many more.

The response from what might be called pro-am Christians isn’t always edifying. At best, it’s condescending – these starlets really don’t get it and need to study and qualify properly as card-carrying Christians. At worst, it’s belligerent – these charlatans want the fruit from our tree, but attack its roots. 

It’s fair to observe that Christianity has always been cultural, not just through its initial and expedient spread through the trade routes of the Mediterranean until its adoption under Roman emperor Constantine, but in its very genesis in Jerusalem. The insurgent Nazarene movement showed far more interest in the lived experience of the new faith than in establishing an alternative Temple authority with it. 

It’s a saying misattributed to St Francis of Assisi that evangelists are to go out into the world and spread the gospel and, if they have to, use words. It’s about actions in the Christian life, not words of intent. In that, the former US president Jimmy Carter, who has just died aged 100, is a worthy exemplar.

By contrast, we have the modern versions of the corrupt and self-serving Temple of Jerusalem in our Christian Churches. Elites who believe in a primacy of status, marching around with sticks, bear as much fruit as the withered fig tree of the gospel. More than arguably, it must be worth turning away from them and towards the cultural Christians mentioned above. 

For I’m finding that I may have more in common with them than with archbishops and priests, endlessly debating how to improve their woeful Church. This is my epiphany.

So, for me, let 2025 be a year of cultural Christianity. Let them say we pick the fruit and ignore the roots. Because perhaps that’s preferable to sticking with a thick trunk that despises those fruits.  
