
Cutting language learning is a moral failure

Learning someone’s tongue is a deeply humble and empathetic act

When you go abroad, how do you navigate language differences? Do you just run everything through Google Translate? Or put a few weeks into Duolingo before you go? Or maybe you just speak a bit louder in the hope that volume will somehow smooth over any misunderstandings?

Recently, my wife and I went to Italy for a week. Neither of us can speak a word of Italian, and we were taking our toddler Zachary with us (who can speak even less Italian), so we booked into a big resort where we knew staff would be able to speak some English if we needed anything for Zach. Even so, we tried learning a few words and phrases:

‘please’,  

‘thank you’,  

‘could I have …?’,  

‘where is the …?’,

‘please forgive my toddler, he hasn’t learned to regulate his emotions yet’. 

That sort of thing. Just some basics to get by.  

Of course, what happened was exactly what happens every time I speak another language. I try my best to make an effort, people immediately realise I’m struggling, and they put me out of my misery by replying in English anyway.

All this reinforces the importance of deep and rigorous language learning in society. It also makes the continued diminishment of university modern language programmes rather odd, and more than a little unsettling.

The University of Nottingham has announced it is terminating the employment of casual staff at its Language Centre. This will see the end of numerous classes for students and others in many languages, both ancient and modern, including British Sign Language.  

Nottingham is not alone in this. The news comes in the immediate aftermath of a review into the University of Aberdeen’s decision to scrap modern language degrees in 2023, which found the decision “hurried, unstructured, and dominated by immediate financial considerations.” (Not that we needed a review to tell us this.) The University of Aberdeen has partially reversed the decision, continuing its provision of joint honours degrees, if not single honours language degrees.

Elsewhere, in January, Cardiff University announced plans to cut 400 academic staff, eliminating its entire modern language provision in the process. In May, the University revealed that it would reverse these plans, with modern languages continuing to be offered (for now), albeit in a revised and scaled-down manner.

The situation is bleak. As a theology lecturer who works for a Church of England college, I’m all too aware of the precarity my friends and colleagues in university arts and humanities departments face across the sector. But I was also naïve enough to think that languages might be one of the subjects able to survive the worst of education’s deepening malaise, given their clear importance. How wrong I was.

There are obvious causes for despair at the news of language department cuts. On the one hand, there is the human element of all this. People are losing their jobs. Moreover, as casual workers, the University had no obligation to consult them about the changes or provide any notice period, and so it didn’t, because why would a university demonstrate courtesy towards its staff unless it absolutely had to? As well as losing jobs and whole careers, people will lose sleep, and perhaps even homes and relationships, as a direct result of the financial and emotional toll this decision will take on staff. My heart breaks for those affected.

And yet, the move is also evidence – as if more were needed – of the increasing commercialisation of Higher Education. A statement from the University said the decision to cut languages in this way was the result of the Language Centre not running at a “financial surplus.” The cuts will instead allow the University to focus on “providing a high-quality experience for our undergraduate and postgraduate students.”

And there we have it. Not even a veneer of pretence that universities operate for the pursuit of truth or knowledge. No, nothing so idealistic. A university is a business, thank you very much, here to offer an “experience”. And when parts of businesses become financially unsustainable, they’re tossed aside.


But cutting language offerings isn’t just a personal and a societal loss, it’s also a huge spiritual and moral failure. And that’s because of what language fundamentally is. Let me explain.  

It can be tempting to think of words as simply ‘labels’ we assign to objects in the world, with different languages using a different set of ‘labels’ to describe the same objects. As a native English speaker, I might see something with four legs and a flat surface on top and call it a ‘desk’. Someone else with a different native language might call it a Schreibtisch, or a bureau, or a scrivania, or a tepu, or a bàn làm việc. You get the point: we might be using different labels, but we’re all ‘seeing’ the same thing when we use those ‘labels’, right?

Well, it’s a bit more complicated than that. Languages aren’t just ways of describing the world we see, they’re also ways of seeing the world in the first place. As such, languages have the capacity to shape how we behave in response to the world, a world itself suggested to us in part by our language(s). As twentieth-century philosopher Ludwig Wittgenstein once wrote, “the limits of my language mean the limits of my world.” 

Let me give you just one example. English distinguishes tenses: past, present, future. I did, I do, I will do. Chinese does not; it expresses past, present, and future in the same way, meaning past and future feel as immediate and as pressing as the present. The result of ‘seeing’ the world through a ‘futureless’ language like this? According to economist Keith Chen, ‘futureless’ language speakers are 30 per cent more likely to save income than ‘futured’ language speakers (like English speakers). They also retire with more wealth, smoke less, practise safer sex, eat better, and exercise more. Because the future is experienced in such an immediate and pressing way, people invest more in behaviours that positively impact their future selves; their view of the world – and of their future selves’ place within it – is radically different because of their language.

Different languages lead to seeing the world differently, which leads to differences in behaviour. In other words, there are certain experiences and emotions – even certain types of knowledge and behaviour – that are only available to those fluent in certain languages. And this means that to learn another language is to increase our capacity for empathy. Forget walking a mile in someone’s shoes: if you want truly to know someone, learn their language.

In my day job as a lecturer, when I’m trying to encourage my students – most of whom are vicars-to-be – to learn biblical Greek and/or Hebrew, I tell them it will make them more empathetic people. It may make them better readers of the Bible, and it may even make them better writers too, but, more than anything else, students who learn languages will be better equipped to love their neighbour for having done so. They will get a better sense of the limits of their world, and a greater appreciation for the ways in which others see it too. Show me a society that is linguistically myopic, and I’ll show you one that’s deeply unempathetic. I can guarantee you that.

We ought to be deeply, deeply concerned about the diminishing language offerings in the UK’s Higher Education sector. To open oneself to other languages is to open oneself to other ways of seeing the world. It is to be shown the limits of one’s own ways of seeing. Learning a language is a deeply humble and empathetic act. And aren’t humility and empathy in desperately short supply at the moment?



We don’t have an over-diagnosis problem, we have a society problem

Suzanne O’Sullivan's question is timely

Rates of diagnoses for autism and ADHD are at an all-time high, whilst NHS funding remains in a perpetual state of squeeze. In this context, consultant neurologist Suzanne O’Sullivan, in her recent book The Age of Diagnosis, asks a timely question: can getting a diagnosis sometimes do more harm than good? Her concern is that many of these apparent “diagnoses” are not so much wrong as superfluous; in her view, they risk harming a person’s sense of wellbeing by encouraging self-imposed limitations or prompting them to pursue treatments that may not be justified. 

There are elements of O’Sullivan’s argument that I am not qualified to assess. For example, I cannot look at the research into preventative treatments for localised and non-metastatic cancers and tell you what proportion of those treatments is unnecessary. However, even from my lay-person’s perspective, it does seem that if the removal of a tumour brings peace of mind to a patient, however benign that tumour might be, then O’Sullivan may be oversimplifying the situation when she proposes that such surgery is an unnecessary medical intervention.

But O’Sullivan devotes a large proportion of the book to the topics of autism and ADHD – and on this I am less of a lay person. She is one of many people proposing that these conditions are being over-diagnosed due to parental pressure and social contagion. Her particular concern is that a diagnosis might become a self-fulfilling prophecy, limiting one’s opportunities in life: “Some will take the diagnosis to mean that they can’t do certain things, so they won’t even try.” Notably, O’Sullivan persists with this argument even though the one autistic person whom she interviewed for the book actually told her the opposite: getting a diagnosis had helped her interviewee, Poppy, to re-frame a number of the difficulties that she was facing in life and realise they were not her fault.

Poppy’s narrative is one with which we are very familiar at the Centre for Autism and Theology, where our team of neurodiverse researchers have conducted many, many interviews with people of all neurotypes across multiple research projects. Time and time again we hear the same thing: getting a diagnosis is what helps many neurodivergent people make sense of their lives and ask for the help that they need. As theologian Grant Macaskill said in a recent podcast:

“A label, potentially, is something that can help you to thrive rather than simply label the fact that you're not thriving in some way.” 

Perhaps it is helpful to remember how these diagnoses come about, because neurodivergence cannot be identified by any objective means, such as a blood test or CT scan. At present the only way to get a diagnosis is to have one’s lifestyle, behaviours and preferences analysed by clinicians during an intrusive and often patronising process of self-disclosure.

Despite the invidious nature of this diagnostic process, more and more people are willing to subject themselves to it. Philosopher Robert Chapman looks to late-stage capitalism for the explanation. Having a diagnosis means that one can take on what is known as the “sick role” in our societal structures. When one is in the “sick role” in any kind of culture, society, or organisation, one is given social permission to take less personal responsibility for one’s own well-being. For example, if I have the flu at home, then caring family members might bring me hot drinks, chicken soup or whatever else I might need, so that I don’t have to get out of bed. This makes sense when I am sick, but if I expected my family to do things like that for me all the time, then I would be called lazy and demanding! When a person is in the “sick role” to whatever degree (it doesn’t always entail being consigned to one’s bed) then the expectations on that person change accordingly.  

Chapman points out that the dynamics of late-stage capitalism have pushed more and more people into the “sick role” because our lifestyles are bad for our health in ways that are mostly out of our own control. In his 2023 book, Empire of Normality, he observes,  

“In the scientific literature more generally, for instance, modern artificial lighting has been associated with depression and other health conditions; excessive exposure to screen time has been associated with chronic overstimulation, mental health conditions, and cognitive disablement; and noise annoyance has been associated with a twofold increase in depression and anxiety, especially relating to noise pollution from aircraft, traffic, and industrial work.” 

Most of this we cannot escape, and on top of it all we live life at a frenetic pace where workers are expected to function like machines, often subordinating the needs and demands of the body. Thus, more and more people begin to experience disablement, where they simply cannot keep working, and they start to reach for medical diagnoses to explain why they cannot keep pace in an environment that is constantly thwarting their efforts to stay fit and well. From this arises the phenomenon of “shadow diagnoses” – this is where “milder” versions of existing conditions, including autism and ADHD, start to be diagnosed more commonly, because more and more people are feeling that they are unsuited to the cognitive, sensory and emotional demands of daily working life.  


O’Sullivan rightly observes that some real problems arise from this phenomenon of “shadow diagnoses”. It does create a scenario, for example, where autistic people who experience significant disability (e.g., those who have no perception of danger and therefore require 24-hour supervision to keep them safe) are in the same “queue” for support as those for whom being autistic doesn’t preclude living independently.

But this is not a diagnosis problem so much as a society problem – health and social care resources are never limitless, and a process of prioritisation must always take place. If I cut my hand on a piece of broken glass and need to go to A&E for stitches, I might find myself in the same “queue” as a 7-year-old child who has done exactly the same thing. Like anyone, I would expect the staff to treat the child first, knowing that the same injury is likely to be causing a younger person much more distress. Autistic individuals are just as capable of recognising that others within the autism community may have needs that should take priority over their own.

What O’Sullivan overlooks is that there are some equally big positives to “shadow diagnoses” – especially as our society runs on such strongly capitalist lines. When a large proportion of the population starts to experience the same disablement, it becomes economically worthwhile for employers or other authorities to address the problem. To put it another way: if we get a rise in “shadow diagnoses”, then we also get a rise in “shadow treatments” – accommodations made in the workplace and wider society that mean everybody can thrive. As Macaskill puts it:

“Accommodations then are not about accommodating something intrinsically negative; they're about accommodating something intrinsically different so that it doesn't have to be negative.” 

This can already be seen in many primary schools: where once it was the exception (and highly stigmatised) for a child to wear noise-cancelling headphones, they are now routinely made available to all students, regardless of neurotype. This means not only that stigma is reduced for the one or two students who may be highly dependent on headphones, but also that many more children can benefit from a break from the deleterious effects of constant noise.

When I read in O’Sullivan’s book that a lot more people are asking for diagnoses, what I hear is that a lot more people are asking for help. I suspect the rise in people identifying as neurodivergent reflects a latent cry of “Stop the world, I want to get off!” This is not to say that those coming forward are not autistic or do not have ADHD (or other neurodivergence) but simply that if our societies were gentler and more cohesive, fewer people with these conditions would need to reach for the “sick role” in order to get by.  

Perhaps counter-intuitively, if we want the number of people asking for the “sick role” to decrease, we actually need to be diagnosing more people! In this way, we push our capitalist society towards “shadow treatments” – adopting certain accommodations in our schools and workplaces as part of the norm. When this happens, there are benefits not only for neurodivergent people, but for everybody.
