
Assisted dying is not a medical procedure; it is a social one

Another vote, and an age-related amendment, highlight the complex community of care.

The Scottish Parliament’s Assisted Dying Bill will go to a stage one vote on Tuesday 13th May, with some amendments having been made in response to public and political consultation. These include the age of eligibility, originally proposed as 16 years. In the new draft of the bill, those requesting assistance to die must be at least 18.

MSPs have been given a free vote on this bill, which means they can follow their consciences. Clearly, amongst those who support it, there is a hope that raising the age threshold will calm the troubled consciences of some who are threatening to oppose it. According to The Times, when asked whether this age amendment was a response to weakening support, one unnamed “seasoned parliamentarian” agreed, and commented:

“The age thing was always there to be traded, a tactical retreat.”  

The callousness of this language chills me. Whilst it is well known that politics is more of an art than a science, there are moments when our parliamentarians literally hold matters of life and death in their hands. How can someone speak of such matters as if they are bargaining chips or military manoeuvres? But my discomfort aside, there is a certain truth in what this unnamed strategist says.  

When Liam McArthur MSP first proposed the bill, he suggested that the age limit would be a point of debate, accepting that there were “persuasive” arguments for raising it to 18. Fortunately, McArthur’s language choices were more appropriate to the subject matter. “The rationale for opting for 16 was because of that being the age of capacity for making medical decisions,” he said, while acknowledging that in other countries where similar assisted dying laws are already in operation, the age limit is typically 18.

McArthur correctly observes that at 16 years old young people are considered legally competent to consent to medical procedures without needing the permission of a parent or guardian. But surely there is a difference, at a fundamental level, between consenting to a medical procedure that is designed to improve or extend one’s life and consenting to a medical procedure that will end it?  

Viewed philosophically, it would seem to me that Assisted Dying is actually not a medical procedure at all, but a social one. This claim is best illustrated by considering one of the key arguments given for protecting 16- and 17-year-olds from being allowed to make this decision: the risk of coercion. The adolescent brain is highly social; therefore, some argue, a young person might be particularly sensitive to the burden that their terminal illness is placing on loved ones. Or worse, socially motivated young people may be particularly vulnerable to pressure from exhausted caregivers, applied subtly and behind closed doors.

Whilst 16- and 17-year-olds are considered to have legal capacity, guidance for medical staff already indicates that under-18s should be strongly advised to seek a parent or guardian’s advice before consenting to any decision that would have major consequences. Nothing gets more major than consenting to die, but sadly, some observe, we cannot be sure that a parent or guardian’s advice in that moment will always be in the young person’s best interests. All of this discussion implies that we know we are asking young people to make not just a medical decision that impacts their own body, but a social one that impacts multiple people in their wider networks.

For me, this further raises the question of why 18 is even considered to be a suitable age threshold. If anything, the more ‘adult’ one gets, the more one realises one’s place in the world is part of a complex web of relationships with friends and family, in which one is not the centre. Typically, the more we grow up, the more we respect our parents, because we begin to learn that other people’s care of us has come at a cost to themselves. This is bound to affect how we feel about needing other people’s care in the case of disabling and degenerative illness. Could it even be argued that the risk of feeling socially pressured to end one’s life early actually increases with age? Indeed, there is as much concern about this bill leaving the elderly vulnerable to coercion as there is for young people, not to mention disabled adults. As MSP Pam Duncan-Glancey (a wheelchair-user) observes, “Many people with disabilities feel that they don’t get the right to live, never mind the right to die.” 

There is a fundamental flaw in the logic of equating Assisted Dying with a medical procedure: a medical procedure concerns the mode of one’s existence in this world, whereas Assisted Dying concerns the very fact of it. The more we grow, the more we learn that we exist in communities – communities in which sometimes we are the caregiver and sometimes we are the cared-for. The legalisation of Assisted Dying will impact our communities in ways which cannot be undone, but none of that is accounted for if Assisted Dying is construed as nothing more than a medical choice.

As our parliamentarians prepare to vote, I pray that they really will listen to their consciences. This is one of those moments when our elected leaders literally hold matters of life and death in their hands. Now is not the time for ‘tactical’ moves that might simply sweep the cared-for off the table, like so many discarded bargaining chips. As MSPs consider making this fundamental change to the way our communities in Scotland are constituted, they are not debating the mode of the cared-for’s existence; they are debating their very right to it.


The myth of secular neutrality

Where academia went wrong.

Alex Stewart is a lawyer, trustee and photographer.  


In the recent horror-thriller Heretic, Hugh Grant plays Mr. Reed, a sharp-witted psychopath who imprisons two missionaries, subjecting them to ceaseless diatribes about the supposed irrationality of all religions.  Mr. Reed is also a terribly smug, self-righteous bore, a caricature of the fervent atheist who dismisses faith as mere superstition while assuming atheism is objective and neutral.  

This kind of assumption lies behind the criticisms directed by secularists at those who argue from a position of faith, as we saw recently with the debates on the Assisted Dying Bill. Yet the notion of secular objectivity is itself a fallacy. Secularism, like any worldview, is a perspective, and ironically one deeply indebted to Christianity; humanity’s history of abandoning faith and its moral foundation has had disastrous consequences.

Secularism is a bias, often grounded in an ethical vanity, whose supposedly universal principles have very Christian roots. Concepts like personal autonomy stem from a tradition that views life as sacred, based on the belief that humans are uniquely created in God's image. Appeals to compassion reflect Jesus’ teachings and Christian arguments for social justice throughout history. Claims that the Assisted Dying Bill was "progressive" rely on the Judaeo-Christian understanding of time as linear rather than cyclical. Even the separation of the secular and sacred is derived from Jesus’ teaching to “render to Caesar what is Caesar’s and to God what is God’s”. Authors like Tom Holland in Dominion and Glen Scrivener in The Air We Breathe have shown how Western societies, though often disconnected from their Christian roots, still operate within frameworks shaped by centuries of Christianity.

A political secularism began to emerge after the seventeenth-century European religious wars, but the supposed historical conflict between science and religion, in which the former triumphs over superstition and a hostile Church, is a myth. Promoted in the nineteenth century by figures like John William Draper and Andrew Dickson White, this ‘conflict thesis’ persists even though it has been comprehensively debunked by works such as David Hutchings and James C. Ungureanu’s Of Popes and Unicorns and Nicholas Spencer’s Magisteria. Historians now emphasize the complex, often collaborative relationship between faith and science.

Far from opposing intellectual inquiry, faith was its foundation. Medieval Christian Europe birthed the great universities; this was not simply because the Church had power and wealth but because knowledge of God was viewed as the basis for all understanding. University mottos reflect this view: Oxford’s “Dominus illuminatio mea” (The Lord is my light), Yale’s “Lux et Veritas” (Light and Truth), and Harvard’s original “Veritas Christo et Ecclesiae” (Truth for Christ and the Church). This intertwining of faith and academia fuelled the Scientific Revolution and the Enlightenment, when scientists like Boyle, Newton, and Kepler approached the study of creation (what Calvin described as “the theatre of God’s glory”) as an affirmation of the divine order of a God who delighted in His creatures “thinking His thoughts after Him”.

Their Christian beliefs not only provided an impetus for rigorous exploration but also instilled in them a humility about human intellect. Unlike modernity's view of the mind as a detached, all-seeing eye, they believed man’s cognitive faculties had been diminished, both morally and intellectually, by Adam’s fall, which made perfect knowledge unattainable. Blaise Pascal captures this struggle with uncertainty in his Pensées.  

“We desire truth, and find within ourselves only uncertainty... This desire is left to us, partly to punish us, partly to make us perceive from whence we have fallen.”

For Pascal and his believing contemporaries, the antidote to human pride and self-deception was to be found in the Almighty. Ironically, it was this humility, rooted in a very theological concern about human cognitive fallibility, that gave birth to the scientific method: the process of systematic experimentation based on empirical evidence that later became central to Enlightenment thinking.

Although many of its leading lights were believers, the Enlightenment era hastened a shift away from God and towards man as the centre of understanding and ethics. Philosophers like David Hume marginalized or eliminated God altogether, paving the way for His later dismissal as a phantom of human projection (Freud) or as a tool of exploitation and oppression (Marx), while Rousseau popularised the appealing idea that, rather than being inherently flawed, man was naturally good and only his environment made him do bad things.

But it took the nihilist Nietzsche, the son of a Lutheran pastor, to predict the moral vacuum created by the death of God and its profound consequences. Ethical boundaries became unstable, allowing new ideologies to justify anything in pursuit of their utopian ends. Nietzsche’s prophecies about the rise of totalitarianism and the competing ideologies that were to characterise the twentieth century were chillingly accurate. German universities provided the intellectual justification for Nazi atrocities against the Jews, while the Marxist-inspired revolutions and policies of the Soviet and Chinese Communist regimes led to appalling suffering and the deaths of between 80 and 100 million people. Devoid of divine accountability, these human-centred pseudo-religions amplified human malevolence and man’s destructive impulses.

By the early 1990s, the Soviet Union had collapsed, leading Francis Fukuyama to opine from his ivory tower that secular liberal democracy was the natural end point in humanity’s socio-political evolution and that history had ‘ended’. But his optimism was short-lived. The events of 9/11 and the resurgence of a potent Islamism gave the lie to the idea that everyone wanted a western-style secular liberal democracy, while back in the west a repackaged version of the old Marxist oppressor narrative began to appear on campuses, its deceitful utopian siren song, that man could be the author of his own salvation, bewitching the academy. This time it came in the guise of divisive identity-based ideologies overlaid with post-modern power narratives that seemed to defy reality and confirm Chesterton’s view that when man ceased to believe in God he was capable of believing in anything.

As universities promoted ideology over evidence and conformity over intellectual freedom, George Orwell’s critique of intellectual credulity and the dark fanaticism it often fosters, epitomized in 1984, where reality itself is manipulated through dogma, seemed more relevant than ever. Orwell was not alone in thinking that some ideas were so foolish that only intellectuals believed them. Other commentators like Thomas Sowell are equally sceptical, critiquing the tenured academics whose lives are insulated from the suffering of those who have to live under their pet ideologies, and who prefer theories and sophistry to workable solutions. Intellect, he notes, is not the same thing as wisdom. More recently, the American writer David Brooks, writing in The Atlantic, has questioned the point of elite educational systems that overemphasize cognitive ability at the expense of other qualities, suggesting they tend to produce a narrow-minded ruling class blind to its own biases and false beliefs.

It was intellectual over-confidence that led many institutions to abandon their faith-based origins. Harvard shortened its motto from “Veritas Christo et Ecclesiae” to plain “Veritas” and introduced a tellingly symbolic change to its shield. The original shield depicted three books: two open, symbolizing the Old and New Testaments, and one closed, representing a knowledge that required divine revelation. The modern shield shows all three books open, reflecting a human-centred worldview that was done with God.

However, secular confidence seems to be waning. Since the peak of New Atheism in the mid-2000s, there has been a growing dissatisfaction with worldviews limited to reason and materialism. Artists like Nick Cave have critiqued secularism’s inability to address concepts like forgiveness and mercy, while figures like Ayaan Hirsi Ali and Russell Brand have publicly embraced Christianity. The longing for the transcendent and a world that is ‘re-enchanted’ seems to be widespread.  

Despite the Church’s struggles, the teaching and person of Christ, the One who claimed not to point towards the truth but to be the Truth, the original Veritas the Puritan founders of Harvard had in mind, remains as compelling as ever. The story of fall, forgiveness, cosmic belonging and His transforming love is the narrative that most closely maps to our deepest human longings and lived experience, whilst simultaneously offering us the hope of redemption and, with divine help, of becoming better versions of ourselves, the kind of people that secularism thinks we already are.
