The myth of secular neutrality

Where academia went wrong.

Alex Stewart is a lawyer, trustee and photographer.  

A phrenology head with its eyes closed. David Matos on Unsplash.

In the recent horror-thriller Heretic, Hugh Grant plays Mr. Reed, a sharp-witted psychopath who imprisons two missionaries, subjecting them to ceaseless diatribes about the supposed irrationality of all religions.  Mr. Reed is also a terribly smug, self-righteous bore, a caricature of the fervent atheist who dismisses faith as mere superstition while assuming atheism is objective and neutral.  

This kind of assumption lies behind the criticisms directed by secularists at those who argue from a position of faith, as we saw recently with the debates on the Assisted Dying Bill. Yet, the notion of secular objectivity is itself a fallacy. Secularism, like any worldview, is a perspective, ironically one that is deeply indebted to Christianity, and humanity’s history of abandoning faith and its moral foundation has had disastrous consequences.  

Secularism is a bias, often grounded in an ethical vanity, whose supposedly universal principles have very Christian roots. Concepts like personal autonomy stem from a tradition that views life as sacred, based on the belief that humans are uniquely created in God's image. Appeals to compassion reflect Jesus’ teachings and Christian arguments for social justice throughout history. Claims that the Assisted Dying Bill was "progressive" rely on the Judaeo-Christian understanding of time as linear rather than cyclical. Even the separation of the secular and sacred is derived from Jesus’ teaching to “render to Caesar what is Caesar’s and to God what is God’s”. Authors like Tom Holland in Dominion and Glen Scrivener in The Air We Breathe have shown how Western societies, though often disconnected from their Christian roots, still operate within frameworks shaped by centuries of Christianity.

A political secularism began to emerge after the seventeenth-century European religious wars, but the supposed historical conflict between science and religion, in which the former triumphs over superstition and a hostile Church, is a myth. Promoted in the nineteenth century by figures like John Draper and Andrew White, this ‘conflict thesis’ persists even though it has been comprehensively debunked by works such as David Hutchings and James C. Ungureanu’s Of Popes and Unicorns and Nicholas Spencer’s Magisteria. Historians now emphasize the complex, often collaborative relationship between faith and science. 

Far from opposing intellectual inquiry, faith was its foundation. Medieval Christian Europe birthed the great universities; this was not simply because the Church had power and wealth but because knowledge of God was viewed as the basis for all understanding. University mottos reflect this view: Oxford’s “Dominus illuminatio mea” (The Lord is my light), Yale’s “Lux et Veritas” (Light and Truth), and Harvard’s original “Veritas Christo et Ecclesiae” (Truth for Christ and the Church). This intertwining of faith and academia fuelled the Enlightenment, when scientists like Boyle, Newton, and Kepler approached the study of creation (what Calvin described as ‘the theatre of God’s glory’) as an affirmation of the divine order of a God who delighted in His creatures “thinking His thoughts after Him”. 

Their Christian beliefs not only provided an impetus for rigorous exploration but also instilled in them a humility about human intellect. Unlike modernity's view of the mind as a detached, all-seeing eye, they believed man’s cognitive faculties had been diminished, both morally and intellectually, by Adam’s fall, which made perfect knowledge unattainable. Blaise Pascal captures this struggle with uncertainty in his Pensées.  

“We desire truth, and find within ourselves only uncertainty… This desire is left to us, partly to punish us, partly to make us perceive from whence we have fallen.” 

For Pascal and his believing contemporaries, the antidote to human pride and self-deception was to be found in the Almighty.  Ironically, it was this humility, rooted in a very theological concern about human cognitive fallibility, that gave birth to the scientific method, the process of systematic experimentation based on empirical evidence, and which later became central to Enlightenment thinking. 

Although many of its leading lights were believers, the Enlightenment era hastened a shift away from God and towards man as the centre of understanding and ethics. Philosophers like David Hume marginalized or eliminated God altogether, paving the way for His later dismissal as a phantom of human projection (Freud) or as a tool of exploitation and oppression (Marx), while Rousseau popularized the appealing idea that, rather than being inherently flawed, man was naturally good and only his environment made him do bad things. 

But it took Nietzsche, the prophet of nihilism and the son of a Lutheran pastor, to predict the moral vacuum created by the death of God and its profound consequences. Ethical boundaries became unstable, allowing new ideologies to justify anything in pursuit of their utopian ends. Nietzsche’s prophecies about the rise of totalitarianism and the competing ideologies that were to characterize the twentieth century were chillingly accurate. German universities provided the intellectual justification for Nazi atrocities against the Jews, while the Marxist-inspired revolutions and policies of the Soviet and Chinese Communist regimes led to appalling suffering and the deaths of between 80 and 100 million people. Devoid of divine accountability, these human-centred pseudo-religions amplified human malevolence and man’s destructive impulses. 

By the early 1990s, the Soviet Union had collapsed, leading Francis Fukuyama to opine from his ivory tower that secular liberal democracy was the natural end point in humanity’s socio-political evolution and that history had ‘ended’. But his optimism was short-lived. The events of 9/11 and the resurgence of a potent Islamism gave the lie to the idea that everyone wanted a western-style secular liberal democracy, while back in the west a repackaged version of the old Marxist oppressor narrative began to appear on campuses, its deceitful utopian Siren song, that man could be the author of his own salvation, bewitching the academy. This time it came in the guise of divisive identity-based ideologies overlaid with post-modern power narratives that seemed to defy reality and confirm Chesterton’s view that when man ceased to believe in God he was capable of believing in anything. 

As universities promoted ideology over evidence and conformity over intellectual freedom, George Orwell’s critique of intellectual credulity and the dark fanaticism it often fosters, epitomized in 1984, where reality itself is manipulated through dogma, seemed more relevant than ever. Orwell was not alone in thinking that some ideas were so foolish that only intellectuals believed them. Other commentators like Thomas Sowell are equally sceptical, critiquing the tenured academics whose lives are insulated from the suffering of those who have to live under their pet ideologies, and who prefer theories and sophistry to workable solutions. Intellect, he notes, is not the same thing as wisdom. More recently, the American writer David Brooks, writing in The Atlantic, has questioned the point of elite educational systems that overemphasize cognitive ability at the expense of other qualities, suggesting they tend to produce a narrow-minded ruling class blind to its own biases and false beliefs. 

It was intellectual over-confidence that led many institutions to abandon their faith-based origins. Harvard shortened its motto from “Veritas Christo et Ecclesiae” to plain “Veritas” and introduced a tellingly symbolic change to its shield. The original shield depicted three books: two open, symbolizing the Old and New Testaments, and one closed, representing a knowledge that required divine revelation. The modern shield shows all three books open, reflecting a human-centred worldview that was done with God. 

However, secular confidence seems to be waning. Since the peak of New Atheism in the mid-2000s, there has been a growing dissatisfaction with worldviews limited to reason and materialism. Artists like Nick Cave have critiqued secularism’s inability to address concepts like forgiveness and mercy, while figures like Ayaan Hirsi Ali and Russell Brand have publicly embraced Christianity. The longing for the transcendent and a world that is ‘re-enchanted’ seems to be widespread.  

Despite the Church’s struggles, the teaching and person of Christ, the One who claimed not to point towards the truth but to be the Truth, the original Veritas the Puritan founders of Harvard had in mind, remains as compelling as ever. The story of fall, forgiveness, cosmic belonging and His transforming love is the narrative that most closely maps to our deepest human longings and lived experience, whilst simultaneously offering us the hope of redemption and, with divine help, of becoming better versions of ourselves, the kind of people that secularism thinks we already are. 


The UN promised freedom of belief — but 80 years later, it’s still elusive

Flawed, fragile but still vital to those without a voice

Steve is news director of Article 18, a human rights organisation documenting Christian persecution in Iran.

Trump addresses the 80th session of the United Nations General Assembly. The White House.

It’s been 80 years since the United Nations was founded, at the end of the Second World War, primarily in an attempt to avoid a third global conflict. 

So on that score, at least, I suppose one must accept that the UN has achieved its primary objective. But why, then, does the overall feeling towards the organisation today seem negative? 

The UN’s founding charter outlined three other major goals alongside maintaining “international peace and security”: developing “friendly relations” among nations; international cooperation in solving economic, social, cultural or humanitarian problems; and respect for human rights and fundamental freedoms, “without distinction as to race, sex, language or religion”. 

Given that the UN comprises 193 countries, it is perhaps little wonder that “friendly relations” and “cooperation” between all sides have not always been forthcoming, and that instead clear cliques have formed between Western countries on the one hand, and much of the rest of the world on the other. (Perhaps the clearest such clique at the moment is the 2021-founded “Group of Friends in Defence of the UN Charter”, whose membership (China, North Korea, Iran, Russia, Venezuela, et al) may lead one to wonder what exactly it is in the UN charter they wish to defend. Short answer: “sovereignty”, code for doing whatever they wish, without interference.) 

As for the pursuit of “human rights”, my primary focus as an employee of an NGO, perhaps the greatest obstacle remains the lack of a truly united consensus over which rights should be included in the definition. 

The closest that the nations of the world have come to an agreement on this score was the adoption in 1948, three years after the founding of the UN, of the Universal Declaration of Human Rights (UDHR), which was backed by 48 of the 58 member states at the time, but which failed to secure the support of others, including apartheid South Africa, the Soviet bloc, and Saudi Arabia. 

A primary objection in the case of Saudi Arabia was to Article 18 of the declaration, the bit about religious freedom, which includes the claim that everyone should have the right to change their religion or belief, an issue that remains problematic for many of the not-so-united nations of the world today. 

The UK, meanwhile, was happy to ratify the UDHR but expressed frustration at its lack of legal force, and it was nearly 20 years before another treaty, the 1966 International Covenant on Civil and Political Rights, attempted to correct this.  

But while the 174 signatories to the ICCPR, including Iran, Russia, Cuba and China (though the latter two have never ratified the treaty), are at least on paper legally obliged to uphold this international treaty, the challenge of enforcement remains. For example, while the signatories of the ICCPR are obliged to provide freedom of religion as defined by Article 18 of the covenant, which closely resembles the same article of the UDHR, few practical tools exist to hold to account any state that fails to meet its obligations.  

In the case of persistent violators like Iran, the focus of my work, it seems the best we can currently hope for is to see a “resolution” passed by the majority of member states, outlining the ways in which the particular violator has failed to provide its citizens with the religious freedom (among other things) that should be their right according to the international treaties it has signed, and calling on it to do better.  

But when pariahs like Iran can merely continue to deny that such failures exist, dismiss the resolutions as “biased” and “political”, and all the while deny access to the country to the independent experts (“Special Rapporteurs”) best able to ascertain the veracity of the allegations, such “resolutions” can at times appear rather hollow. 

At the same time, for advocates of human rights in non-compliant countries like Iran, the public shaming offered by such resolutions at least provides an opportunity for otherwise voiceless victims to be heard on the international stage. And when real change inside the country can sometimes appear nigh-on impossible, you tend to take the small wins, such as hearing the representatives of member states mentioning the names of individual victims or groups in the public arena. 

Many mentions are made, for example, of the plight of the Baha’is during every UN discussion of human rights in Iran, and while it is less common to hear about my own area of interest, the persecution of Christians in Iran, there is usually at least one mention, which for us advocates (and, we hope, also the victims we represent) provides some comfort and hope for future change. 

So 80 years since the establishment of the UN, it is clear the organisation has much room for improvement, but I remain persuaded by the argument that if we didn’t have the UN, we’d have to invent it. 

“Friendly relations” (a helpfully loose term) between our disunited nations will always be a challenge, but increased economic ties globally over the past 80 years have also provided potential pressure points for those who fail to follow the rules. (If, for example, Iran wishes to see sanctions removed, Western countries can and should continue to demand improvements in the area of human rights.) 

As for the UN’s endeavour to see increased “respect for human rights and fundamental freedoms”, the question of what such rights and freedoms should entail will continue to be debated, with persistent areas of challenge including not only religious conversion but also abortion and same-sex relations. 

It is not uncommon, for example, to hear representatives of Muslim states such as Iran questioning what Western nations really mean by “human rights” and accusing them of using the term only as a “pretext” for their own “biased” agendas. 

But for all its challenges, 80 years after its establishment the UN continues to offer the only forum today where countries of contrasting beliefs can come together to discuss their differences on the world stage.  

Whether that is a worthwhile exercise remains a matter for debate, but to the degree that it is, the UN remains the primary channel through which such conversations can take place. 
