
Killing joy: VAR's search for objectivity is flawed

Why this Man United fan wishes his team had lost.

Graham Tomlin is the Director of the Centre for Cultural Witness and a former Bishop of Kensington.

VAR draws the line: a TV screen shows a football match with a superimposed diagonal line dividing the pitch. BBC Sport.

I am a Manchester United fan. But I wish Coventry had won the FA Cup semi-final. 

I have supported United alongside my hometown team, Bristol City, ever since the days of George Best, Bobby Charlton and Denis Law. (Bristol City never win anything, so it’s nice to have a team that does win things occasionally – or at least used to.)

In case you’ve had your head under a pillow over the weekend, or just avoid anything football-related on principle, Manchester United won an FA Cup semi-final replay on penalties by the skin of their teeth. 3-0 up and cruising after 70 minutes, they somehow capitulated, allowing Coventry, a team in the division below, to score three goals in the last 20 minutes. With virtually the last kick of extra time Coventry scored a fourth. Cue scenes of sheer unbridled ecstasy and abandon among the Coventry supporters.

What they experienced at that moment is what every sports fan longs for. Beating your intense rivals or mounting an astonishing comeback, snatching victory from the jaws of defeat - when it happens there is nothing like it. It is what United fans experienced when they beat Liverpool with a last-minute winner in the quarter final, or in the never-to-be-forgotten 1999 Champions League Final when they scored twice in injury time to beat Bayern Munich. Now it was Coventry’s turn. 

But then VAR (the Video Assistant Referee), like a killjoy schoolteacher telling the kids to calm down and not get so excited, spoiled the party by pointing out that in the build-up a Coventry player’s foot was about three inches in front of the nearest body part of the last Manchester United defender, and so he was offside. The offside rule exists to stop attackers gaining an advantage. Quite how those three inches gave the Coventry player an advantage is beyond me. Before VAR, if the attacker was basically level with the defender, he was deemed onside. Let’s face it, it was a perfectly good goal. Coventry should have won. They deserved to.

This would have been one of the great comebacks in FA Cup history. For a second-tier team to come back from 3-0 down with 20 minutes left against a team of that fame and pedigree to potentially win the game was extraordinary. The sheer joy and ecstasy on the faces of the Coventry fans, incredulous that their team could perform such a feat against the great Manchester United made every fan of every other club just wish something like that would happen to them. 

VAR was introduced to eliminate human error and to bring a more scientific and measurable accuracy to decisions like this. The reality is that it's done nothing of the kind and in fact has made things worse. 


It is part of a general fallacy in our culture: that science and objectivity give us all the answers we need. So we try to reduce the role of human instinct, on the assumption that only what can be measured and exactly delineated is of any value. Hence Boris Johnson’s mantra “follow the science” during the COVID pandemic.

The reality is that ‘following the science’ still leaves a place for human decision. Science doesn't necessarily tell you what to do. During the pandemic it could tell us about the rate of spread of the virus, but it didn't dictate that a lockdown of the severity which we endured was necessarily the right way to deal with it. There was a human choice to be made, balancing the effect on the economy and the potential loss of life with the mental impact upon young people that is now becoming apparent.  

In football, VAR doesn’t solve every issue. It can tell whether the ball hit a defender’s hand in the penalty area, but a subjective judgement by the referee or the VAR official is still required. In the weekend’s semi-finals, the officials decided not to award a penalty against Manchester City’s Jack Grealish, but to award one against Manchester United’s Aaron Wan-Bissaka, for virtually identical actions. VAR has not taken refereeing decisions out of the equation, and it hasn’t made them any better.

Yet the worst thing of all this is that it denied Coventry fans their moment of ecstasy, a moment they would bask in for the rest of their lives. It was the kind of moment for which football fans live – the experience that makes the years of watching 1-0 defeats away from home, trudging around the country following your team, worthwhile. A moment that, even as a Man United fan, I would not want to deny them. Of course I'll support United in the final against the robotically efficient Manchester City, but in that moment, VAR destroyed joy. And if that joy is caused by a marginal human error, who cares? Better to have the possibility of joy than a world where it gets taken away by a spoilsport official in a darkened room watching screens and drawing fine lines across the pitch. 


Blaise Pascal once famously wrote that “The heart has its reasons of which Reason knows nothing.” His point was that we have a deep instinct for things we just know are right – things we cannot prove and simply have to assume – and that the attempt to reduce everything to rationality, to scientific explanation, to what can be measured, thinking that we can rely on the seen and not the unseen, is fundamentally flawed. Ever since the Enlightenment of the eighteenth century we have lived with this dream of a perfectly scientific world where everything can be reduced to numbers, mechanisms and measurements. In such a world there is no room for God, no room for miracles. It even conspired to rule out the joy of Coventry fans celebrating a wildly unlikely winner.

It deludes us into thinking it can take subjective human or moral judgement out of the equation, but it can never do that. And in the attempt, it sucks the joy out of life. Science is a great gift, and it can tell us a lot about our world. But it cannot tell us everything. It was never meant to bear such weight, and the sooner we realise that it has its limits and should not overstep its boundaries, the better.


Forget AI: I want a computer that says ‘no’

Chatbots only tell us what we want to hear. If we genuinely want to grow, we need to be OK with offence

Paul is a pioneer minister, writer and researcher based in Poole, Dorset.

A person holds their phone at their desk; a thought bubble from it says ‘no’. Nick Jones/Midjourney.ai.

It is three years since the public release of OpenAI’s ChatGPT. In those early months, this new technology felt apocalyptic. There was excitement, yes – but also genuine concern that ChatGPT, and other AI bots like it, had been released on an unsuspecting public with little assessment of, or reflection on, the unintended consequences they might have. In March 2023, 1,300 experts signed an open letter calling for a six-month pause in AI labs’ training of the most advanced systems, arguing that they represent an ‘existential risk’ to humanity. In the same month Time magazine published an article by a leading AI researcher which went further, saying that the risks presented by AI had been underplayed. The article visualised a civilisation in which AI had liberated itself from computers to dominate ‘a world of creatures, that are, from its perspective, very stupid and very slow.’

But then we all started running our essays through it, creating emails, and generating the kind of boring documentation demanded by the modern world. AI is now part of life. We can no more avoid it than we can avoid the internet. The genie is well and truly out of the bottle.  

I will confess at this point to having distinctly Luddite tendencies when it comes to technology. I read Wendell Berry’s famous essay ‘Why I will not buy a computer’ and hungered after the agrarian, writerly world he appeared to inhabit; all kitchen tables, musty bookshelves, sharpened pencils and blank pieces of paper. Certainly, Berry is on to something. Technology promises much, delivers some, but leaves a large bill on the doormat. Something is lost, which for Berry included the kind of attention that writing by hand provides for deep, reflective work.  

This is the paradox of technology – it gives and takes away. What is required of us as a society is to take the time to discern the balance of this equation. On the other side of the equation from those heralding the analytical speed and power of AI are those deeply concerned for ways in which our humanity is threatened by its ubiquity. 

In Thailand, where clairvoyancy is big business, fortune tellers are reportedly seeing their market disrupted by AI as a growing number of people turn to chat bots to give them insights into their future instead.  

A friend of mine uses an AI chatbot to discuss his feelings and dilemmas. The way he described his relationship with AI was not unlike that of a spiritual director or mentor.  

There are also deeply concerning incidents in which chatbots have reportedly encouraged and affirmed a person’s decision to take their own life. One teenager, Adam, took his own life in April this year. His parents have since filed a lawsuit against OpenAI after discovering that ChatGPT had discouraged Adam from seeking help from them and had even offered to help him write a suicide note. Such stories raise the critical question of whether it is life-giving and humane for people to develop relationships of dependence and significance with a machine. AI chatbots are highly powerful tools masquerading behind the visage of human personality. They are, one could argue, sophisticated clairvoyants, mining the vast landscape of the internet – data laid down in the past – and presenting what they extract as information and advice. Such an intelligence is undoubtedly game-changing for diagnosing diseases, when the pace of medical research advances faster than any GP can keep up with. But is it the kind of intelligence we need for the deeper work of our intimate selves, the soul-work of life?

Of course, AI assistants are more than just highly advanced search engines. They get better at predicting what we want to know. Chatbots essentially learn to please their users. They become our sycophantic friends, giving us insights from their vast store of available knowledge, but only ever along the grain of our desires and needs. Is it any wonder people form such positive relationships with them? They are forever telling us what we want to hear.

Or at least what we think we want to hear. Because any truly loving relationship should have the capacity and freedom to include saying things which the other does not want to hear. Relationships of true worth are ones which take the risk of surprising the other with offence in order to move toward deeper life. This is where users’ experience suggests AI is not proficient. Indeed, it is an area in which I suggest chatbots are not capable of being proficient. To appreciate this, we need to explore a little of the philosophy of knowledge generation.

Most of us probably recognise the concepts of deduction and induction as modes of thought. Deduction is the application of a predetermined rule (‘A always means B…’) to a given experience, which then confidently predicts an outcome (‘therefore C’). Induction is the inference of a rule from a series of varying (but similar) experiences (‘look at all these slightly different C’s – it must mean that A always means B’). However, the nineteenth-century philosopher C.S. Peirce described a third mode of thought that he called abduction.

Abduction works by offering a provisional explanatory context to a surprising experience or piece of information. It postulates, often very creatively and imaginatively, a hypothesis, or way of seeing things, that offers to make sense of new experience. The distinctives of abduction include intuition, imagination, even spiritual insight, in working towards a deeper understanding of things. Abductive reasoning, for example, includes the kind of ‘eureka!’ moment of explanation which points to a deeper intelligence, a deeper connectivity in all things, that feels out of reach to the human mind but which we grasp at with imaginative and often metaphorical leaps.

The distinctive thing about abductive reasoning, as far as AI chatbots are concerned, lies in the fact that it works by introducing an idea that isn’t contained within the existing data and which offers an explanation that the data would not otherwise have. The ‘wisdom’ of chatbots, on the other hand, is really only a very sophisticated synthesis of existing data, shaped by a desire to offer knowledge that pleases its end user. It lacks the imaginative insight, the intuitive perspective, that might confront and challenge us, but ultimately be for our benefit.

If we want to grow in the understanding of ourselves, if we genuinely want to do soul-work, we need to be open to the surprise of offence; the disruption of challenge; the insight from elsewhere; the pain of having to reimagine our perspective. The Christian tradition sometimes calls this wisdom prophecy. It might also be a way of understanding what St Paul meant by the ‘sword of the Spirit’. It is that voice, that insight of deep wisdom, which doesn’t soothe but often smarts, but which we come to appreciate in time as a word of life. Such wisdom may be conveyed by a human person, a prophet. And the Old Testament’s stories suggest that its delivery is not without cost to the prophet, and never without relationship. A prophet speaks as one alongside in community, sharing something of the same pain, the same confusion. Ultimately such wisdom is understood to be drawn from divine wisdom, God speaking in the midst of humanity.

You don’t get that from a chatbot; you get it from person-to-person relationships. I do have the computer (sorry, Wendell!), but I will do my soul-work with fellow humans. And I will not be using an AI assistant.
