
Our language use is leading to a cultural abyss

We are witnessing a profound loss of commitment and discernment in the use of language, writes Oliver Wright.

After 15 years as a lawyer in London, Oliver is currently doing a DPhil at the University of Oxford.

Rugby players wait upon Wayne Barnes' word. RFU.

The 2023 Rugby Union World Cup Final was one of the most iconic international matches in living memory, contested by two of the most iconic teams – the All Blacks and the Springboks. It is not surprising that retirements should follow such a pinnacle of a sporting career. But two retirements caught my eye: not players, but referees – Wayne Barnes, the main match official and the most experienced international referee in the world, and Tom Foley, the Television Match Official, also highly experienced. Why? Barnes's statement is particularly gracious and thoughtful. But the reason he shares with Foley, and with many others in similarly high-pressure roles in the public eye, is worrying: online abuse. After the cup final, death threats were even sent to the school attended by Foley's children.

Online abuse has become an endemic, worldwide problem. Real people issue these threats and abuse, and real people receive them and respond in some way. Of course, there is also the problem of online 'bots'. But bots succeed in their abuse only because they imitate real abusers.

It's worth asking why, because we can go beyond the helpless handwringing about 'the perils of being online'. There are philosophical and indeed theological reasons for the abyss, and, I suggest, philosophical and theological ways of climbing out of it.

In fact, all words ‘act’ in some way. Even plain truth-describers assert something, such that an interlocutor can learn or discern for themselves.

Let’s go back to the 1950s, when two important advances in the philosophy of language and in religious language occurred. The first came from Oxford, and the White’s Professor of Philosophy, J.L. Austin. The second came from Durham, and its then Bishop, Ian Ramsey.  

Austin, whose remarkable life and work have now been brilliantly documented for the first time in Mark Rowe's biography (published by OUP, 2023), was a decorated Second World War veteran of the intelligence corps, widely recognised as one of the masterminds of the success of the D-Day Landings. On his return to Oxford in the late 1940s he perceived, with great dissatisfaction, a philosophical move that accorded the greatest importance in language to words and phrases which described things – which indicated some form of empirical truth about the world. To be sure, the argument ran, there were other uses of language – religious language, emotional language, and so on – but these were fairly worthless. Describing cold, hard scientific truth was the true utility of language.

Austin’s most famous response was in his book How To Do Things With Words. The function of language goes way beyond the scientific description of the world. Language acts; it does things. We promise, we name, we cajole, we threaten, we apologise, we bet. There is no real ‘truth’ as such conveyed in such ‘speech-acts’. Their importance lies, rather, in what is thereby done – the act initiated by the words themselves. Or, in Austinian jargon, the ‘illocution’ within the ‘locution’.

But Austin realised something even more important as he investigated this form of language – these performative utterances. In fact, all words ‘act’ in some way. Even plain truth-describers assert something, such that an interlocutor can learn or discern for themselves. What matters is how ‘forceful’ the relevant act of speech is in each case. Sometimes the speech-act is very simple and limited. In other cases, such as threats, the performative aspect of the utterance is most forceful indeed.

Austin’s student John Searle took the idea of performative language to America and developed it considerably. Most notable for our purposes, however, was his departure from Austin's idea: the separation of speech from act. By analysing the conventions and circumstances which surround the performance of a speech act – a baptism service, for instance – we can observe how and why the act occurs, and how and why such an act might go wrong. But the debate thereby became divorced from the context of speakers themselves performing such actions – from the integrity of speaker and action. The philosophical problem we then hit, therefore, is that a spoken word and its associated act (‘locution’ and ‘illocution’) become two entirely separate ‘acts’.

Let’s move now from Oxford to the great cathedral city of Durham. At the same time as Austin was teaching in Oxford, the Bishop of Durham, Ian Ramsey – apparently unaware of Austin’s new theory of performatives – investigated religious language to try to get to grips with both how religious language does things and what it says of its speakers and writers. Ramsey developed a two-fold typology for religious language: that of commitment and discernment. First, religious language implies two forms of commitment: there is the speaker or writer’s commitment of communicability – a desire to communicate, to be comprehensible, to ‘commune through language’; and the speaker or writer of religious language also entertains prior commitments in the language adopted – language is rarely neutral when it comes to religion. Second, religious language implies a form of discernment about the words that are being invoked and for what purpose. They are not universals, but carry special meanings according to the particular conventions involved. Commitment and discernment.

But this innovation in the philosophy of religious language was likewise taken up and developed away from Ramsey’s idea – particularly in the much more famous work of John Macquarrie, a Scottish philosophical theologian who spent much time teaching both in the States and in Oxford. In Macquarrie, writing at the height of the influence of thinkers such as Heidegger and Bultmann, Ramsey’s ‘commitment’ and ‘discernment’ were subsumed into existentialism and myth. The religious speech act became merely an event or an act for the self – a personal matter which might involve transformation, but might not.

These two strands – the philosophy of language as taken up by Searle and his American counterparts, and the philosophy of religious language as taken up by Macquarrie – have predominated for some time now. Only recently have scholars on both sides begun to perform a ressourcement, both of Austin and of the nature of religious language in the wake of Bultmann.

The Twitter-sphere seems irrevocably to have severed the bonds that tie speakers to their acts. In these fertile conditions, abuse flourishes.

We can now return to the cases of Wayne Barnes and Tom Foley, and of many others in many different walks of life just like them. Undoubtedly, the emotional, existential, and physical distance afforded by interacting online has created the conditions for online abuse to flourish. But at a deeper level, what we are witnessing is a profound loss of commitment and discernment in the use of language, in society as a whole and also in the Church. Real people feel free to use language oblivious to any inherent act contained within it. The Twitter-sphere seems irrevocably to have severed the bonds that tie speakers to their acts. In these fertile conditions, abuse flourishes. Similarly, in the Church, the commitment and discernment which have lain behind millennia of liturgical and doctrinal language have become a private spiritual matter; or have been neglected in public when religious witness has not matched word with deed.

How do we walk back from this cultural abyss? There is an ethical and, potentially, a religious choice to make. The ethical choice is to think about what our language does to those who read (or hear) it, and to change the way we speak or write accordingly – to recover Ramsey's modes of ‘commitment’ and ‘discernment’. The religious dimension is to recognise that our words bind us to a system of belief, whether we like it or not. Saying one thing and doing another in a religious context diminishes the value of language for all concerned, not just for the private life of the individual believer.

Actions speak louder with words.  


We don’t have an over-diagnosis problem, we have a society problem

Suzanne O’Sullivan's question is timely
Maxim Berg on Unsplash.

Rates of diagnoses for autism and ADHD are at an all-time high, whilst NHS funding remains in a perpetual state of squeeze. In this context, consultant neurologist Suzanne O’Sullivan, in her recent book The Age of Diagnosis, asks a timely question: can getting a diagnosis sometimes do more harm than good? Her concern is that many of these apparent “diagnoses” are not so much wrong as superfluous; in her view, they risk harming a person’s sense of wellbeing by encouraging self-imposed limitations or prompting them to pursue treatments that may not be justified. 

There are elements of O’Sullivan’s argument that I am not qualified to assess. For example, I cannot look at the research into preventative treatments for localised and non-metastatic cancers and tell you what proportion of those treatments is unnecessary. However, even from my layperson’s perspective, it does seem that if the removal of a tumour brings peace of mind to a patient, however benign that tumour might be, then O’Sullivan may be oversimplifying the situation when she proposes that such surgery is an unnecessary medical intervention.

But O’Sullivan devotes a large proportion of the book to the topics of autism and ADHD – and on this I am less of a layperson. She is one of many people proposing that these conditions are being over-diagnosed due to parental pressure and social contagion. Her particular concern is that a diagnosis might become a self-fulfilling prophecy, limiting one’s opportunities in life: “Some will take the diagnosis to mean that they can’t do certain things, so they won’t even try.” Notably, O’Sullivan persists with this argument even though the one autistic person whom she interviewed for the book actually told her the opposite: getting a diagnosis had helped her interviewee, Poppy, to re-frame a number of the difficulties that she was facing in life and realise they were not her fault.

Poppy’s narrative is one with which we are very familiar at the Centre for Autism and Theology, where our team of neurodiverse researchers have conducted many, many interviews with people of all neurotypes across multiple research projects. Time and time again we hear the same thing: getting a diagnosis is what helps many neurodivergent people make sense of their lives and to ask for the help that they need. As theologian Grant Macaskill said in a recent podcast:  

“A label, potentially, is something that can help you to thrive rather than simply label the fact that you're not thriving in some way.” 

Perhaps it is helpful to remember how these diagnoses come about, because neurodivergence cannot be identified by any objective means such as by a blood test or CT scan. At present the only way to get a diagnosis is to have one’s lifestyle, behaviours and preferences analysed by clinicians during an intrusive and often patronising process of self-disclosure. 

Despite the invidious nature of this diagnostic process, more and more people are willing to subject themselves to it. Philosopher Robert Chapman looks to late-stage capitalism for the explanation. Having a diagnosis means that one can take on what is known as the “sick role” in our societal structures. When one is in the “sick role” in any kind of culture, society, or organisation, one is given social permission to take less personal responsibility for one’s own well-being. For example, if I have the flu at home, then caring family members might bring me hot drinks, chicken soup or whatever else I might need, so that I don’t have to get out of bed. This makes sense when I am sick, but if I expected my family to do things like that for me all the time, then I would be called lazy and demanding! When a person is in the “sick role” to whatever degree (it doesn’t always entail being consigned to one’s bed) then the expectations on that person change accordingly.  

Chapman points out that the dynamics of late-stage capitalism have pushed more and more people into the “sick role” because our lifestyles are bad for our health in ways that are mostly out of our own control. In his 2023 book, Empire of Normality, he observes,  

“In the scientific literature more generally, for instance, modern artificial lighting has been associated with depression and other health conditions; excessive exposure to screen time has been associated with chronic overstimulation, mental health conditions, and cognitive disablement; and noise annoyance has been associated with a twofold increase in depression and anxiety, especially relating to noise pollution from aircraft, traffic, and industrial work.” 

Most of this we cannot escape, and on top of it all we live life at a frenetic pace where workers are expected to function like machines, often subordinating the needs and demands of the body. Thus, more and more people begin to experience disablement, where they simply cannot keep working, and they start to reach for medical diagnoses to explain why they cannot keep pace in an environment that is constantly thwarting their efforts to stay fit and well. From this arises the phenomenon of “shadow diagnoses” – this is where “milder” versions of existing conditions, including autism and ADHD, start to be diagnosed more commonly, because more and more people are feeling that they are unsuited to the cognitive, sensory and emotional demands of daily working life.  

When I read in O’Sullivan’s book that a lot more people are asking for diagnoses, what I hear is that a lot more people are asking for help.

O’Sullivan rightly observes that some real problems arise from this phenomenon of “shadow diagnoses”. It does create a scenario, for example, where autistic people who experience significant disability (e.g., those who have no perception of danger and therefore require 24-hour supervision to keep them safe) are in the same “queue” for support as those for whom being autistic doesn’t preclude living independently.

But this is not a diagnosis problem so much as a society problem – health and social care resources are never limitless, and a process of prioritisation must always take place. If I cut my hand on a piece of broken glass and need to go to A&E for stitches, I might find myself in the same “queue” as a 7-year-old child who has done exactly the same thing. Like anyone, I would expect the staff to treat the child first, knowing that the same injury is likely to be causing a younger person much more distress. Autistic individuals are just as capable of recognising that others within the autism community may have needs that should take priority over their own.

What O’Sullivan overlooks is that there are some equally big positives to “shadow diagnoses” – especially as our society runs on such strongly capitalist lines. When a large proportion of the population starts to experience the same disablement, it becomes economically worthwhile for employers or other authorities to address the problem. To put it another way: If we get a rise in “shadow diagnoses” then we also get a rise in “shadow treatments” – accommodations made in the workplace/society that mean everybody can thrive. As Macaskill puts it:  

“Accommodations then are not about accommodating something intrinsically negative; they're about accommodating something intrinsically different so that it doesn't have to be negative.” 

This can be seen already in many primary schools: where once it was the exception (and highly stigmatised) for a child to wear noise-cancelling headphones, they are now routinely made available to all students, regardless of neurotype. This not only reduces stigma for the one or two students who may be highly dependent on headphones; it also means that many more children can benefit from a break from the deleterious effects of constant noise.

When I read in O’Sullivan’s book that a lot more people are asking for diagnoses, what I hear is that a lot more people are asking for help. I suspect the rise in people identifying as neurodivergent reflects a latent cry of “Stop the world, I want to get off!” This is not to say that those coming forward are not autistic or do not have ADHD (or other neurodivergence) but simply that if our societies were gentler and more cohesive, fewer people with these conditions would need to reach for the “sick role” in order to get by.  

Perhaps counter-intuitively, if we want the number of people asking for the “sick role” to decrease, we actually need to be diagnosing more people! In this way, we push our capitalist society towards adopting “shadow-treatments” – adopting certain accommodations in our schools and workplaces as part of the norm. When this happens, there are benefits not only for neurodivergent people, but for everybody.
