
Beyond Charles: a radical case for the monarchy

In a culture that tends toward populism and moral relativism, what the coronation says is, ironically, radically prophetic, writes Nigel Biggar.

Nigel Biggar is Regius Professor Emeritus of Moral Theology at the University of Oxford and Distinguished Scholar in Residence at Pusey House, Oxford. 

In May 2022, the then Prince Charles delivers a speech in the House of Lords.
Copyright House of Lords 2022 / Photography by Annabel Moeller, CC BY 2.0, via Wikimedia Commons.

Judging by a recent YouGov poll, the monarchy remains popular among the British, with 58 per cent supporting its continuation and only 26 per cent preferring an elected head of state. But support drops dramatically among the young: 38 per cent of those aged 18-24 would like to abolish the monarchy, while only 32 per cent want to keep it.

If the monarchy is to survive beyond the reign of King Charles III, therefore, a strong case in its favour needs to be articulated. It needs to be justified in terms of political well-being. Can this be done? I believe so. Monarchy as we now have it—with its executive powers entirely transferred to elected members of parliament (except in case of constitutional crisis)—makes important contributions to political health. For sure, most of these are symbolic; but symbols can represent important truths and serve important functions. 

First, by embodying a reassuring continuity and stability, monarchy enables society to cope with change. Thus, far from fostering conservatism during her seventy-year reign, the late Queen Elizabeth actually presided over huge cultural, social, and political change.  

Second, the distinction between the monarchical head of state and the prime ministerial head of government makes it easier to tell criticism of government policy from a lack of patriotic loyalty—easier than in an American presidential system, where the symbolic head of the nation and the head of government are one and the same. Thanks to their monarchy, the British are spared the predicament of those Americans who loathed the politics of Donald Trump, while having to respect him as the symbolic representation of their nation.     

Next, it’s good to have a head of state who, being unelected, can transcend party politics and use their patronage to support civil society, thus reminding us (and politicians) that there is far more to public life than elections, parliamentary debates, and legislation.

But there is yet a further benefit, which is more principled, more Christian, and more fundamentally important than any of the others. A good political constitution certainly needs a part where rulers are made sensitive and accountable to those they rule—that is, an elected legislature that can hold government to account and stop it in its tracks. A good constitution needs a democratic element. After all, according to a biblical and Christian view, rulers exist to serve the ruled: kings are expected to be shepherds of their people. 

Nevertheless, a Christian view is not naïve about the people. It does not suppose that the popular will, as expressed in majority vote, is always right and just. After all, it was the people (the laos, as in ‘laity’) who bayed for Jesus’ blood in the Gospels, and it was the people (the demos, as in ‘democracy’) which, according to the Acts of the Apostles, responded to the Christian persecutor, Herod, by lauding him as a god (Acts 12.22). If kings can be sinners, then so can the people. Hitler, remember, came to power by due constitutional process.

What this means is that a healthy political constitution should be more than simply democratic. In addition to an elected House of Commons, it needs other parts too, to balance it. It needs to be mixed. For example, it needs a House of Lords composed of a wide range of experts and leaders of civil society (including the Church of England). That is, it needs an aristocracy of wisdom, not of land, which can only be secured by appointment, not popular election.  

And it also needs a monarch, who symbolises the accountability of the whole nation, rulers and ruled, kings and people, to the given principles of justice. At base these principles are not human inventions. They are not the passing creatures of popular whim or majority vote. They are given in and with the created nature of things. And this is exactly what the coronation ritual says, when the heir to the throne gets on his knees to receive the crown—the symbol of his authority—not from below but from above, not from the fickle people but from the constant God.  

Contrary to what now passes for democratic common sense, the moral legitimacy of government does not lie in popular consent. It cannot, since the will of the people can be corrupt. Rather, moral legitimacy lies in the conformity of law and policy to the given principles of justice and prudence—to which the people might or might not adhere. Popular consent is vital if law and government policy are to have any effective social authority, but it does not establish their moral legitimacy. This is a very important and fundamental political truth, which is rarely spoken nowadays, but which the coronation ritual speaks. And in a culture that tends toward populism and moral relativism, what the coronation says is, ironically, radically prophetic.

In sum, then, I do think that there are good reasons—some of them directly Christian—to support the kind of monarchy we now have. However, on the question of how much public money should be used to support it, or how many members of the royal family should be supported, I am agnostic. And I don’t suppose that a monarchical republic is the only decent kind of republic. Nonetheless, I do think that monarchy can confer some important and distinctive political benefits; and if we are to continue to enjoy them—if Prince George is to find a throne awaiting him—then we had better bring to mind what they are.


Cutting language learning is a moral failure

Learning someone’s tongue is a deeply humble and empathetic act

When you go abroad, how do you navigate language differences? Do you just stick everything through Google Translate? Or put a few weeks into Duolingo before you go? Or maybe you just speak a bit louder in the hope that that will somehow smooth over any misunderstandings?

Recently, my wife and I went to Italy for a week. Neither of us can speak a word of Italian and we were taking our toddler Zachary with us (who can speak even less Italian), so we booked into a big resort where we knew staff would be able to speak some English if we needed anything for Zach. Even so, we tried learning a few words and phrases:

‘please’,  

‘thank you’,  

‘could I have …?’,  

‘where is the …?’,

‘please forgive my toddler, he hasn’t learned to regulate his emotions yet’. 

That sort of thing. Just some basics to get by.  

Of course, what happened was exactly what happens every time I speak another language. I try my best to make an effort, people immediately realise I’m struggling, and they put me out of my misery by replying in English anyway.

All this reinforces the importance of deep and rigorous language learning in society, and makes the continued diminishment of university modern language programmes rather odd, and more than a little unsettling.

The University of Nottingham has announced it is terminating the employment of casual staff at its Language Centre. This will see the end of numerous classes for students and others in many languages, both ancient and modern, including British Sign Language.  

Nottingham is not alone in this. The news comes in the immediate aftermath of a review into the University of Aberdeen’s decision to scrap modern language degrees in 2023, which found the decision “hurried, unstructured, and dominated by immediate financial considerations.” (Not that we needed a review to tell us this.) The University of Aberdeen has partially reversed the decision, continuing its provision of joint honours degrees, if not single honours language degrees.

Elsewhere, in January, Cardiff University announced plans to cut 400 academic staff, cutting its entire modern language provision in the process. In May, the University revealed that it would reverse these plans, with modern languages continuing to be offered (for now), albeit in a revised and scaled-down manner.

The situation is bleak. As a theology lecturer who works for a Church of England college, I’m all too aware of the precarity my friends and colleagues in University Arts and Humanities departments face across the sector. But I was also naïve enough to think that languages might be one of the subjects able to survive the worst of education’s deepening malaise, given their clear importance. How wrong I was.

There are obvious causes for despair at the news of language department cuts. On the one hand, there is the human element of all this. People are losing their jobs. Moreover, as casual workers, the University had no obligation to consult them about the changes or provide any notice period, and so it didn’t, because why would a university demonstrate courtesy towards its staff unless it absolutely had to? As well as losing jobs and whole careers, people will lose sleep, and perhaps even homes and relationships, as a direct result of the financial and emotional toll this decision will take on staff. My heart breaks for those affected.

And yet, the move is also evidence – as if more were needed – of the increasing commercialisation of Higher Education. A statement from the University said the decision to cut languages in this way was the result of the Language Centre not running at a “financial surplus.” The cuts will instead allow the University to focus on “providing a high-quality experience for our undergraduate and postgraduate students.”

And there we have it. Not even a veneer of pretence that universities operate for the pursuit of truth or knowledge. No, nothing so idealistic. A university is a business, thank you very much, here to offer an “experience”. And when parts of businesses become financially unsustainable, they’re tossed aside.

But cutting language offerings isn’t just a personal and a societal loss, it’s also a huge spiritual and moral failure. And that’s because of what language fundamentally is. Let me explain.  

It can be tempting to think of words as simply ‘labels’ we assign to objects in the world, with different languages using a different set of ‘labels’ to describe the same objects. As a native English speaker, I might see something with four legs and a flat surface on top and call it a ‘desk’. Someone else with a different native language might call it a Schreibtisch, or a bureau‚ or a scrivania, or a tepu, or a bàn làm việc. You get the point: we might be using different labels, but we’re all ‘seeing’ the same thing when we use those ‘labels’, right? 

Well, it’s a bit more complicated than that. Languages aren’t just ways of describing the world we see, they’re also ways of seeing the world in the first place. As such, languages have the capacity to shape how we behave in response to the world, a world itself suggested to us in part by our language(s). As twentieth-century philosopher Ludwig Wittgenstein once wrote, “the limits of my language mean the limits of my world.” 

Let me give you just one example. English distinguishes tenses: past, present, future. I did, I do, I will do. Chinese does not. It expresses past, present, and future in the same way, meaning past and future feel as immediate and as pressing as the present. The result of ‘seeing’ the world through a ‘futureless’ language like this? According to economist Keith Chen, ‘futureless’ language speakers are 30 per cent more likely to save income compared to ‘futured’ language speakers (like English speakers). They also retire with more wealth, smoke less, practise safer sex, eat better, and exercise more. Because the future is experienced in a much more immediate and pressing way, people invest more in behaviours that benefit their future selves; their view of the world – and of their place within it – is radically different because of their language.

Different languages lead to seeing the world differently, which in turn leads to differences in behaviour. In other words, there are certain experiences and emotions – even certain types of knowledge and behaviour – that are only encounterable by those fluent in certain languages. And this means that to learn another language is to increase our capacity for empathy. Forget walking a mile in someone’s shoes: if you want truly to know someone, learn their language.

In my day job as a lecturer, when I’m trying to encourage my students – most of whom are vicars-to-be – to learn biblical Greek and/or Hebrew, I tell them it will make them more empathetic people. It may make them better readers of the Bible, and it may even make them better writers too; but, more than anything else, students who learn languages will be better equipped to love their neighbour for having done so. They will get a better sense of the limits of their world, and a greater appreciation for the ways in which others see it too. Show me a society that is linguistically myopic, and I’ll show you one that’s deeply unempathetic. I can guarantee you that.

We ought to be deeply, deeply concerned about the diminishing language offerings in the UK’s Higher Education sector. To open oneself to other languages is to open oneself to other ways of seeing the world. It is to be shown the limits of one’s own ways of seeing. Learning a language is a deeply humble and empathetic act. And aren’t humility and empathy in desperately short supply at the moment?
