
Beyond Charles: a radical case for the monarchy

In a culture that tends toward populism and moral relativism, what the coronation says is, ironically, radically prophetic, writes Nigel Biggar.

Nigel Biggar is Regius Professor Emeritus of Moral Theology at the University of Oxford and Distinguished Scholar in Residence at Pusey House, Oxford. 

[Image: In May 2022, the then Prince Charles delivers a speech in the House of Lords. Copyright House of Lords 2022 / photography by Annabel Moeller, CC BY 2.0.]

Judging by a recent YouGov poll, the monarchy remains popular among the British, with 58 per cent supporting its continuation and only 26 per cent preferring an elected head of state. But support is dramatically weaker among the young: 38 per cent of those aged 18–24 would like to abolish the monarchy, while only 32 per cent want to keep it.

If the monarchy is to survive beyond the reign of King Charles III, therefore, a strong case in its favour needs to be articulated. It needs to be justified in terms of political well-being. Can this be done? I believe so. Monarchy as we now have it—with its executive powers entirely transferred to elected members of parliament (except in case of constitutional crisis)—makes important contributions to political health. For sure, most of these are symbolic; but symbols can represent important truths and serve important functions. 

First, by embodying a reassuring continuity and stability, monarchy enables society to cope with change. Thus, far from fostering conservatism during her seventy-year reign, the late Queen Elizabeth actually presided over huge cultural, social, and political change.  

Second, the distinction between the monarchical head of state and the prime ministerial head of government makes it easier to tell criticism of government policy from a lack of patriotic loyalty—easier than in an American presidential system, where the symbolic head of the nation and the head of government are one and the same. Thanks to their monarchy, the British are spared the predicament of those Americans who loathed the politics of Donald Trump, while having to respect him as the symbolic representation of their nation.     

Next, it’s good to have a head of state who, being unelected, can transcend party politics and use their patronage to support civil society, thus reminding us (and politicians) that there is far more to public life than elections, parliamentary debates, and legislation.

But there is yet a further benefit, which is more principled, more Christian, and more fundamentally important than any of the others. A good political constitution certainly needs a part where rulers are made sensitive and accountable to those they rule—that is, an elected legislature that can hold government to account and stop it in its tracks. A good constitution needs a democratic element. After all, according to a biblical and Christian view, rulers exist to serve the ruled: kings are expected to be shepherds of their people. 

Nevertheless, a Christian view is not naïve about the people. It does not suppose that the popular will, as expressed in majority vote, is always right and just. After all, it was the people (the laos, as in ‘laity’) who bayed for Jesus’ blood in the Gospels, and it was the people (the demos, as in ‘democracy’) who, according to the Acts of the Apostles, responded to the Christian persecutor, Herod, by lauding him as a god (Acts 12.22). If kings can be sinners, then so can the people. Hitler, remember, came to power by due democratic process.

What this means is that a healthy political constitution should be more than simply democratic. In addition to an elected House of Commons, it needs other parts too, to balance it. It needs to be mixed. For example, it needs a House of Lords composed of a wide range of experts and leaders of civil society (including the Church of England). That is, it needs an aristocracy of wisdom, not of land, which can only be secured by appointment, not popular election.  

And it also needs a monarch, who symbolises the accountability of the whole nation, rulers and ruled, kings and people, to the given principles of justice. At base these principles are not human inventions. They are not the passing creatures of popular whim or majority vote. They are given in and with the created nature of things. And this is exactly what the coronation ritual says, when the heir to the throne gets on his knees to receive the crown—the symbol of his authority—not from below but from above, not from the fickle people but from the constant God.  

Contrary to what now passes for democratic common sense, the moral legitimacy of government does not lie in popular consent. It cannot, since the will of the people can be corrupt. Rather, moral legitimacy lies in the conformity of law and policy to the given principles of justice and prudence—to which the people might or might not adhere. Popular consent is vital if law and government policy are to have any effective social authority, but it does not establish their moral legitimacy. This is a fundamental political truth, rarely spoken nowadays, but the coronation ritual speaks it. And in a culture that tends toward populism and moral relativism, what the coronation says is, ironically, radically prophetic.

In sum, then, I do think that there are good reasons—some of them directly Christian—to support the kind of monarchy we now have. However, on the question of how much public money should be used to support it, or how many members of the royal family should be supported, I am agnostic. And I don’t suppose that a monarchical republic is the only decent kind of republic. Nonetheless, I do think that monarchy can confer some important and distinctive political benefits; and if we are to continue to enjoy them—if Prince George is to find a throne awaiting him—then we had better bring to mind what they are.


We don’t have an over-diagnosis problem, we have a society problem

Suzanne O’Sullivan's question is timely
[Image: A visualised glass head shows a swirl of pink across the face. Photo by Maxim Berg on Unsplash.]

Rates of diagnoses for autism and ADHD are at an all-time high, whilst NHS funding remains in a perpetual state of squeeze. In this context, consultant neurologist Suzanne O’Sullivan, in her recent book The Age of Diagnosis, asks a timely question: can getting a diagnosis sometimes do more harm than good? Her concern is that many of these apparent “diagnoses” are not so much wrong as superfluous; in her view, they risk harming a person’s sense of wellbeing by encouraging self-imposed limitations or prompting them to pursue treatments that may not be justified. 

There are elements of O’Sullivan’s argument that I am not qualified to assess. For example, I cannot look at the research into preventative treatments for localised and non-metastatic cancers and tell you what proportion of those treatments is unnecessary. However, even from a layperson’s perspective, it does seem that if the removal of a tumour brings peace of mind to a patient, however benign that tumour might be, then O’Sullivan may be oversimplifying the situation when she proposes that such surgery is an unnecessary medical intervention.

But O’Sullivan devotes a large proportion of the book to the topics of autism and ADHD – and on these I am less of a layperson. She is one of many who propose that these conditions are being over-diagnosed owing to parental pressure and social contagion. Her particular concern is that a diagnosis might become a self-fulfilling prophecy, limiting one’s opportunities in life: “Some will take the diagnosis to mean that they can’t do certain things, so they won’t even try.” Notably, O’Sullivan persists with this argument even though the one autistic person whom she interviewed for the book told her the opposite: getting a diagnosis had helped her interviewee, Poppy, to re-frame a number of the difficulties she was facing in life and to realise that they were not her fault.

Poppy’s narrative is one with which we are very familiar at the Centre for Autism and Theology, where our team of neurodiverse researchers have conducted many, many interviews with people of all neurotypes across multiple research projects. Time and time again we hear the same thing: getting a diagnosis is what helps many neurodivergent people make sense of their lives and to ask for the help that they need. As theologian Grant Macaskill said in a recent podcast:  

“A label, potentially, is something that can help you to thrive rather than simply label the fact that you're not thriving in some way.” 

Perhaps it is helpful to remember how these diagnoses come about, because neurodivergence cannot be identified by any objective means, such as a blood test or CT scan. At present the only way to get a diagnosis is to have one’s lifestyle, behaviours and preferences analysed by clinicians during an intrusive and often patronising process of self-disclosure.

Despite the invidious nature of this diagnostic process, more and more people are willing to subject themselves to it. Philosopher Robert Chapman looks to late-stage capitalism for the explanation. Having a diagnosis means that one can take on what is known as the “sick role” in our societal structures. When one is in the “sick role” in any kind of culture, society, or organisation, one is given social permission to take less personal responsibility for one’s own well-being. For example, if I have the flu at home, then caring family members might bring me hot drinks, chicken soup or whatever else I might need, so that I don’t have to get out of bed. This makes sense when I am sick, but if I expected my family to do things like that for me all the time, then I would be called lazy and demanding! When a person is in the “sick role” to whatever degree (it doesn’t always entail being consigned to one’s bed) then the expectations on that person change accordingly.  

Chapman points out that the dynamics of late-stage capitalism have pushed more and more people into the “sick role” because our lifestyles are bad for our health in ways that are mostly out of our own control. In his 2023 book, Empire of Normality, he observes,  

“In the scientific literature more generally, for instance, modern artificial lighting has been associated with depression and other health conditions; excessive exposure to screen time has been associated with chronic overstimulation, mental health conditions, and cognitive disablement; and noise annoyance has been associated with a twofold increase in depression and anxiety, especially relating to noise pollution from aircraft, traffic, and industrial work.” 

Most of this we cannot escape, and on top of it all we live at a frenetic pace in which workers are expected to function like machines, often subordinating the needs and demands of the body. Thus, more and more people begin to experience disablement: they simply cannot keep working, and they reach for medical diagnoses to explain why they cannot keep pace in an environment that constantly thwarts their efforts to stay fit and well. From this arises the phenomenon of “shadow diagnoses”, whereby “milder” versions of existing conditions, including autism and ADHD, are diagnosed more commonly, because more and more people feel unsuited to the cognitive, sensory and emotional demands of daily working life.

O’Sullivan rightly observes that some real problems arise from this phenomenon of “shadow diagnoses”. It does create a scenario, for example, where autistic people who experience significant disability (e.g., those who have no perception of danger and therefore require 24-hour supervision to keep them safe) are in the same “queue” for support as those for whom being autistic doesn’t preclude living independently.

But this is not a diagnosis problem so much as a society problem – health and social care resources are never limitless, and a process of prioritisation must always take place. If I cut my hand on a piece of broken glass and need to go to A&E for stitches, I might find myself in the same “queue” as a seven-year-old child who has done exactly the same thing. Like anyone, I would expect the staff to treat the child first, knowing that the same injury is likely to cause a younger person much more distress. Autistic individuals are just as capable of recognising that others within the autism community may have needs that should take priority over their own.

What O’Sullivan overlooks is that “shadow diagnoses” bring some equally big positives – especially as our society runs on such strongly capitalist lines. When a large proportion of the population starts to experience the same disablement, it becomes economically worthwhile for employers and other authorities to address the problem. To put it another way: if we get a rise in “shadow diagnoses”, then we also get a rise in “shadow treatments” – accommodations made in the workplace and in wider society that mean everybody can thrive. As Macaskill puts it:

“Accommodations then are not about accommodating something intrinsically negative; they're about accommodating something intrinsically different so that it doesn't have to be negative.” 

This can be seen already in many primary schools: where once it was the exception (and highly stigmatised) for a child to wear noise-cancelling headphones, they are now routinely made available to all students, regardless of neurotype. This not only reduces stigma for the one or two students who may be highly dependent on headphones; it also means that many more children can benefit from a break from the deleterious effects of constant noise.

When I read in O’Sullivan’s book that a lot more people are asking for diagnoses, what I hear is that a lot more people are asking for help. I suspect the rise in people identifying as neurodivergent reflects a latent cry of “Stop the world, I want to get off!” This is not to say that those coming forward are not autistic or do not have ADHD (or other neurodivergence) but simply that if our societies were gentler and more cohesive, fewer people with these conditions would need to reach for the “sick role” in order to get by.  

Perhaps counter-intuitively, if we want the number of people asking for the “sick role” to decrease, we actually need to be diagnosing more people! In this way, we push our capitalist society towards adopting “shadow-treatments” – adopting certain accommodations in our schools and workplaces as part of the norm. When this happens, there are benefits not only for neurodivergent people, but for everybody.
