Article
AI
Comment
4 min read

It's our mistakes that make us human

What we learn distinguishes us from tech.

Silvianne Aspray is a theologian and postdoctoral fellow at the University of Cambridge.

A man staring at a laptop grimaces and holds his hands to his head.
Francisco De Legarreta C. on Unsplash.

The distinction between technology and human beings has become blurry: AI seems to be able to listen, answer our questions, even respond to our feelings. It becomes ever easier to confuse machines with humans. In this situation, it is increasingly important to ask: What makes us human, in distinction from machines? There are many answers to this question, but for now I would like to focus on just one aspect of what I think is distinctively human: As human beings, we live and learn in time.

To be human means to be intrinsically temporal. We live in time and are oriented towards a future good. We are learning animals, and our learning is bound up with the taking of time. When we learn to know or to do something, we necessarily make mistakes, and it takes practice. But keeping in view something we desire – a future good – we keep going.

Let’s take the example of language. We acquire language in community over time. Toddlers make all sorts of hilarious mistakes when they first try to talk, and it takes them a long time even to get single words right, let alone to form sentences. But they keep trying, and they eventually learn. The same goes for love: Knowing how to love our family or our neighbours near and far is not something we are good at instantly. It is not the sort of learning where you absorb a piece of information and then you ‘get’ it. No, we learn it over time, we imitate others, we practice, and even when we have learned, in the abstract, what it is to be loving, we keep getting it wrong.

This, too, is part of what it means to be human: to make mistakes. Not the sort of mistakes machines make, when they classify some information wrongly, for instance, but the very human mistake of falling short of your own ideal. Of striving towards something you desire – happiness, in the broadest of terms – and yet falling short, in your actions, of that very goal. But there’s another very human thing right here: Human beings can also change. They – we – can have a change of heart, be transformed, and at some point in time actually start to do the right thing – even against all the odds. Statistics of past behaviours do not always correctly predict future outcomes. Part of being human means that we can be transformed.

Transformation sometimes comes suddenly, when an overwhelming, awe-inspiring experience changes somebody’s life as by a bolt of lightning. Much more commonly, though, such transformation takes time. Through taking up small practices, we can form new habits, gradually acquire virtue, and do the right thing more often than not. This is so human: We are anything but perfect. As Christians would say: We have a tendency to entangle ourselves in the mess of sin and guilt. But we also bear the image of the Holy One who made us, and by the grace and favour of that One, we are not forever stuck in the mess. We are redeemed: we are given the strength to keep trying, despite the mistakes we make, and the grace to acquire virtue and become better people over time. All of this is to say that being human means to live in time, and to learn in time.


Now compare this to the most complex of machines. We say that AI is able to “learn”. But what does it mean to learn, for AI? Machine learning is usually divided into supervised, unsupervised, and self-supervised learning. Supervised learning means that a model is trained for a specific task on correctly labelled data. For instance, if a model is to predict whether a mammogram image contains a cancerous tumour, it is given many example images which are correctly classed as ‘contains cancer’ or ‘does not contain cancer’. That way, it is “taught” to recognise cancer in unlabelled mammograms. Unsupervised learning is different: here, the system looks for patterns in the dataset it is given, clustering and grouping data without relying on predefined labels. Self-supervised learning combines elements of both: the system uses parts of the data itself as a kind of label – predicting, for instance, the upper half of an image from its lower half, or the next word in a given text. This is the predominant paradigm for how contemporary large-scale AI models “learn”.
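To make these three paradigms concrete, here is a deliberately toy-scale sketch in Python. Everything in it – the numbers, the threshold rule, the little bigram “model” – is invented purely for illustration and bears no resemblance to a real medical or language model; the point is only to show where, in each paradigm, the labels come from.

```python
# A toy illustration (not a real model) of the three learning paradigms.
from collections import Counter, defaultdict

# Supervised learning: learn a rule from correctly labelled examples.
# Each "scan" is reduced to one made-up feature value plus a given label.
labelled = [(0.9, "contains cancer"), (0.2, "no cancer"),
            (0.8, "contains cancer"), (0.1, "no cancer")]
threshold = sum(x for x, _ in labelled) / len(labelled)  # crude decision rule

def predict(x):
    return "contains cancer" if x > threshold else "no cancer"

# Unsupervised learning: find structure in unlabelled data.
# Assign each point to the nearer of two centres taken from the data itself.
points = [0.1, 0.15, 0.2, 0.8, 0.85, 0.9]
lo, hi = min(points), max(points)
clusters = {p: "low" if abs(p - lo) < abs(p - hi) else "high" for p in points}

# Self-supervised learning: the data supplies its own labels.
# Here the "label" for each word is simply the word that follows it.
text = "we live and learn in time and we learn to love".split()
bigrams = defaultdict(Counter)
for prev, nxt in zip(text, text[1:]):
    bigrams[prev][nxt] += 1

def next_word(w):
    return bigrams[w].most_common(1)[0][0]

print(predict(0.7))        # -> contains cancer
print(clusters[0.85])      # -> high
print(next_word("learn"))  # -> in
```

In all three cases, notice that whatever the system “knows” is extracted entirely from the data it was given – which is exactly the point the next paragraph develops.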

In each case, AI’s learning is necessarily based on datasets. Learning happens with reference to pre-given data, and in that sense with reference to the past. It may look as though such models can consider the future and have future goals, but only insofar as they have picked up patterns in past data, which they use to predict future patterns – as if the future were nothing but a repetition of the past.

So this is a real difference between human beings and machines: Human beings can, and do, strive toward a future good. Machines, by contrast, are always oriented towards the past of the data that was fed to them. Human beings are intrinsically temporal, whereas machines are defined by temporality only in a very limited sense: it takes time to upload data and for the data to be processed, for instance. Time, for machines, is nothing but an extension of the past, whereas for human beings it is an invitation to, and the possibility of, being transformed for the sake of a future good. We, human beings, live in time towards a future good – machines do not.

In the face of new technologies we need a sharpened sense of the strange and awe-inspiring species that is the human race, and we need to cultivate a new sense of wonder about humanity itself.

Explainer
Creed
5 min read

Creator or creature – a centuries-old question of identity

Why does a 1,700-year-old creed still matter?

Frances Young is Emeritus Professor of Theology at the University of Birmingham. 

An abstract depiction of The Creation shows an aperture in a cloud-like formation over water.
The Creation, James Tissot.
James Tissot, Public domain, via Wikimedia Commons.

2025 will mark the 1,700th anniversary of the Nicene Creed. In October 2024, Prof. Frances Young gave the inaugural lecture of the McDonald Agape Nicaea Project at St Mellitus College.


In the year 325 CE the first-ever “ecumenical” (that is, “worldwide”) council of bishops assembled at Nicaea, near Constantinople (now Istanbul). It was summoned by Constantine, the first Roman Emperor to convert to Christianity and patronize the Church. Why does this seventeenth centenary of an obscure discussion around complex words matter to us today?

The outcome of the Council was agreement to the text of a creed, and the banishment of a pesky priest named Arius, whose bishop disapproved of his teaching. Unfortunately, some other bishops remained sympathetic to something like Arius’ viewpoint, and for political reasons Constantine was desperate for Church unity. Argument over the issues went on for half a century, until another Council in 381 CE reaffirmed the position established in 325 CE and agreed the version labelled “the Nicene Creed”, which is still used in Church liturgies across the world today.

The controversy was basically about the identity of the pre-existent Word or Son of God incarnate in Jesus Christ. Nicaea established that the Son was “of one substance” (homoousios) with the Father – in other words, he was fully God in every sense of the word. But for many traditional believers at the time this was difficult to accept. 

The common sense of the culture was shaped by the idea of a “chain of being.” Most people in the Roman Empire were polytheists – there were loads of gods: Mars, god of war, Neptune, god of the sea, and so on. Each city, each ethnic group, had its own god, as did every family, every interest group, every burial society – you name it. But generally there was a sense that above all these was the Supreme God, who was worshipped indirectly through the worship of these lower gods. Below them were all sorts of nature spirits and daemons, benign and malign, then souls incarnate in human persons, then animals, even vegetables as living entities, and finally inert matter like earth and stones, at the bottom of the hierarchy or chain of being.

Jews identified their God with the Supreme God and insisted the one God alone should be worshipped. But they also imagined a heavenly court of archangels and angels, then below that the souls of the righteous, and so on in a somewhat parallel hierarchy. No surprise then that Christians assumed a similar picture: God, then the Son of God, then the Holy Spirit, then archangels and angels, then souls, and so on in a hierarchical ladder. 

But in the second century Christians had argued their way to the idea of “creation out of nothing.” Many non-Jewish thinkers, including some early Christians, followed Plato, conceiving creation as the outcome of Mind (the Demiurge or Craftsman) shaping Matter into whatever Forms or Ideas were in mind. But other Christian thinkers argued that God was not a mere Craftsman who needed stone or wood to work on like a sculptor – God produced the Matter in the first place. This then triggered a full-blown critique: God did not create out of pre-existent Matter or there would be two first principles; God did not create from God’s own self or everything would be divine; so God must have created out of nothing. 

Now try to fit that to the chain of being: where do you draw the line between God the Creator and everything else made out of nothing? This was the issue which surfaced in the so-called Arian controversy. What we might call the “mainstream” remained wedded to the hierarchy, not least because of earlier controversies about God’s monarchia. The word did mean “monarchy” – single sovereignty; but arche could mean “rule” or “beginning,” so monarchia also referred to the single first principle of all that is. It was natural to attribute monarchia to God the Father, a view that sat comfortably with the hierarchy. But some had suggested that the one God “changed mode”, as it were, appearing now as Father, now as Son, now as Holy Spirit, taking different roles in the overarching scriptural story. This suggestion was mocked as all too similar to the pagan god Proteus, who in mythology kept changing shape. It is even possible that the key word homoousios had been condemned along with this “Modalist” view.

Traditionalists were suspicious. The first historian of the Church, Eusebius of Caesarea, was present at Nicaea, and wrote a somewhat embarrassed letter to his congregation explaining how he had come to agree to this formula. Even Athanasius – the one who would come to be regarded as the staunch defender of Nicaea – largely avoided the term for a quarter of a century, though that does not mean he did not identify the principal issue. He campaigned hard and ended up in exile five times over. The fundamental issue was whether Christ was God incarnate or some kind of divinised superman, or a semi-divine mediating figure, a created Creator. Arius is supposed to have said that “there was a when he was not,” even though the Son was “the first and greatest of the creatures” through whom God created everything else.

So why does it still matter? Four simple reasons:

Because it was basically about identity, and the question of Christ’s identity still matters. 

Because we still find people treating Jesus Christ as superhuman – not really one of us – or as semi-divine – not God in the same sense as God the Father. If we are to be ecumenical, across different denominations today but also across time, we need to affirm that God’s Son and Spirit are truly of the one God. As early as the second century the first great Christian theologian, Irenaeus, characterized the Word and the Spirit as God’s two hands – we can imagine the Trinity reaching out first to create and then to embrace us with God’s redeeming love.

Because it means we can look to Jesus and there catch a glimpse of God’s very own loving face – not just a dim image but the reality itself.

And because only God could recreate us in God’s own image and raise us to new life. 


To find out more about the McDonald Agape Nicaea Project at St Mellitus College in London, come and join the public lectures, or look out for other Nicene celebrations in 2025.

For more information or to register for these events, you can visit the Nicaea Project website.

Watch the lecture