
Calls to revive the Enlightenment ignore its own illusions

Returning to the Age of Reason won’t save us from post-truth

Alister McGrath retired as Andreas Idreos Professor of Science and Religion at Oxford University in 2022.

Enlightened disagreement (with apologies to Henry Raeburn). Nick Jones/Midjourney.ai.

Is truth dead? Are we living in a post-truth era where forcefully asserted opinions overshadow evidence-based public truths that once commanded widespread respect and agreement? Many people are deeply concerned about the rise of irrational beliefs, particularly those connected to identity politics, which have gained considerable influence in recent years. It seems we now inhabit a culture where emotional truths take precedence, while factual truths are relegated to a secondary status. Challenging someone’s beliefs is often portrayed as abusive, or even as a hate crime. Is it any surprise that irrationality and fantasy thrive when open debate and discussion are so easily shut down? So, what has gone wrong—and what can we do to address it? 

We live in an era marked by cultural confusion and uncertainty, where a multitude of worldviews, opinions, and prejudices vie for our attention and loyalty. Many people feel overwhelmed and unsettled by this turmoil, often seeking comfort in earlier modes of thinking—such as the clear-cut universal certainties of the eighteenth-century “Age of Reason.” In a recent op-ed in The Times, James Marriott advocates for a return to this kind of rational thought. I share his frustration with the chaos in our culture and the widespread hesitation to challenge powerful irrationalities and absurdities out of fear of being cancelled or marginalized. However, I am not convinced that his proposed solution is the right one. We cannot simply revert to the eighteenth century. Allow me to explain my concerns.

What were once considered simple, universal certainties are now viewed by scholars as contested, ethnocentric opinions. These ideas gained prominence not because of their intellectual merit, but due to the economic, political, and cultural power of dominant cultures. “Rationality” does not refer to a single, universal, and correct way of thinking that exists independently of our cultural and historical context. Instead, global culture has always been a bricolage of multiple rationalities. 

The great voyages of discovery of the early seventeenth century made it clear that African and Asian understandings of morality and rationality differed greatly from those in England. These accounts should have challenged the emerging English philosophical belief in a universal human rationality. However, rather than recognizing a diverse spectrum of human rationalities—each shaped by its own unique cultural evolution—Western observers dismissed these perspectives as “primitive” or “savage” modes of reasoning that needed to be replaced by modern Western thought. This led to forms of intellectual colonialism, founded on the questionable assumption that imposing English rational philosophies on other peoples was a civilizing mission intended to improve the world.

Although Western intellectual colonialism was often driven by benign intentions, its consequences were destructive. The increasing influence of Charles Darwin’s theory of biological and cultural evolution in the late nineteenth century led Darwin’s colleague, Alfred Russel Wallace, to conclude that intellectually and morally superior Westerners would “displace the lower and more degraded races,” such as “the Tasmanian, Australian and New Zealander”—a process he believed would ultimately benefit humanity as a whole. 

We can now acknowledge the darker aspects of the British “Age of Reason”: it presumed to possess a definitive set of universal rational principles, which it then imposed on so-called “primitive” societies, such as its colonies in the south Pacific. This reflected an ethnocentric illusion that treated distinctly Western beliefs as if they were universal truths. 

A second challenge to the idea of returning to the rational simplicities of the “Age of Reason” is that its thinkers struggled to agree on what it meant to be “rational.” This insight is often attributed to the philosopher Alasdair MacIntyre, who argued that the Enlightenment’s legacy was the establishment of an ideal of rational justification that ultimately proved unattainable. As a result, philosophy relies on commitments whose truth cannot be definitively proven and must instead be defended on the basis of assumptions that carry weight for some, but not for all. 

We have clearly moved beyond the so-called rational certainties of the “Age of Reason,” entering a landscape characterized by multiple rationalities, each reasonable in its own unique way. This shift has led to a significant reevaluation of the rationality of belief in God. Recently, Australian atheist philosopher Graham Oppy has argued that atheism, agnosticism, and theism should all be regarded as “rationally permissible” based on the evidence and the rational arguments supporting each position. Although Oppy personally favours atheism, he does not expect all “sufficiently thoughtful, intelligent, and well-informed people” to share his view. He acknowledges that the evidence available is insufficient to compel a definitive conclusion on these issues. All three can claim to be reasonable beliefs. 

The British philosopher Bertrand Russell contended that we must learn to accept a certain level of uncertainty regarding the beliefs that really matter to us, such as the meaning of life. Russell’s perspective on philosophy provides a valuable counterbalance to the excesses of Enlightenment rationalism: “To teach how to live without certainty, and yet without being paralyzed by hesitation, is perhaps the chief thing that philosophy, in our age, can still do for those who study it.” 

Certainly, we must test everything and hold fast to what is good, as St Paul advised. It seems to me that it is essential to restore the role of evidence-based critical reasoning in Western culture. However, simply returning to the Enlightenment is not a practical solution. A more effective approach might be to gently challenge the notion, widespread in some parts of our society, that disagreement equates to hatred. We clearly need to develop ways of modelling respectful and constructive disagreement, in which ideas can be debated and examined without diminishing the value and integrity of those who hold them. This is no easy task, yet we must find a way of doing it if we are to avoid fragmenting into cultural tribes and losing any sense of a “public good.”



Our social problems need theology – here’s why

Taking the god’s-eye view develops critical skills

At secondary school level, Religious Studies continues to attract strong numbers. On the surface, this looks like a healthy sign for the subject. Yet critics argue that appearances can be deceiving: many faith-based schools make the subject compulsory, artificially pushing up participation. The result is a stark disconnect when students progress to higher education. Here, interest appears to drop off sharply, and several universities have been forced to close their single-honours degrees in Theology and Religious Studies due to unsustainable student numbers.

But this presents a misleading picture – even at tertiary level, students are far more interested in Theology and Religious Studies than the statistics suggest. While few undergraduates commit to a full degree in Theology (in Scotland called Divinity) or Religious Studies, partly because career pathways outside ordained ministry and teaching can seem unclear, many are eager to sample the subject alongside their main studies. At the University of Aberdeen, this gives the Department of Divinity a different kind of relevance. Thanks to Aberdeen’s flexible degree structure, it is not unusual to find law, sociology, psychology, anthropology, and even physics students sitting in on our undergraduate modules. This interdisciplinary mix brings a distinctive energy to classroom discussions, as well as a few challenges… and challengers.

Some students arrive never having opened a Bible, never having heard a word from the Qur’an, and never having engaged with any other religious text. Many are openly ambivalent about the existence of God, some downright hostile, and more than a few admit that they were drawn in by the promise of coursework-based assessment rather than traditional exams. Yet, once in the room, most engage with surprising enthusiasm, and even the challengers play a vital role.  

What emerges is a lively space where students approach theology less as a matter of personal faith and more as an intellectual exercise, grappling with life’s big questions, testing out ideas, and engaging seriously with the prospect that God exists. Far from diminishing the subject, this shift gives the Divinity department a new role: not as a training ground for clergy, but as a forum for critical thinking across disciplines.

In one of our courses, for example, students are asked to debate this question: if a human chooses to go wild swimming in a crocodile’s natural habitat, does the crocodile have a right to kill and eat that human, as it would any other prey that strayed into its path? Or, if a person with profound physical and intellectual disability is not able to live out many of the rights and responsibilities envisaged by the United Nations’ Universal Declaration of Human Rights, on what grounds are they still reckoned to be a human person? As we tease out the (multiple) possible answers to these questions, many of them turn out to be surprisingly theological. Whilst some students become better able to affirm and articulate their own atheism, others are surprised to discover that they have been living out a deistic morality all along; on the quiet, their internal moral compass believes in God.


Further to that, in an open letter the Theos think tank recently highlighted the role of theology in the ethical and cultural development of communities. They argue that theological study equips people to engage thoughtfully with different groups and traditions, to develop skills in interfaith dialogue, and to promote communication across cultural barriers. Put simply:

“In an increasingly polarised world, it helps us understand other points of view.” 

This insight is highly relevant to our students as they set out on varied career paths in an increasingly complex world. The skills honed in our Divinity classrooms – empathy, critical thinking, close observation, and clear writing – are both essential and transferable. Theology degrees do not lead only to ordination or teaching; they can open doors to careers in journalism, diplomacy, politics, community work, authorship, and screenwriting, among many others. As Gordon Lynch, Professor of Religion, Society and Ethics at the University of Edinburgh, observed at a recent panel discussion:

“It’s very difficult to think about a major geopolitical issue at the moment in which religion isn’t deeply implicated in some way.” 

The relevance of theological training extends far beyond traditional disciplines. For example, law students will need not only to recognise that a person with profound disability is a human person, but also to understand the deeper ethical and theological reasons why society judges this to be so. International Relations students will need to appreciate why resolving the Israel/Palestine conflict is not as simple as drawing lines on a map, but is rooted in long histories of faith, identity, and belonging – histories whose influence will reach far into the future as well as the present. Sports science and physiotherapy students will need to empathise with the human drive to become ever faster and stronger, while discerning when to help people recognise their limits before injury occurs.

So, we gather all these students and more into our Divinity courses, and work with them as they develop such skills. By discussing these matters as though God exists, in a space where there is unapologetic openness to confessional or deistic ways of looking at the world, students are freed to adopt a third-person standpoint, a “god’s-eye view” if you like, which allows them to critically examine both their own and other people’s perspectives. When this freedom becomes apparent, it is often the challengers who find themselves being challenged, and hostility soon morphs into vibrant dialogue. Also, for those who want “an easy A”, it quickly becomes apparent that coursework-based assessment is in no way easier than traditional exams – if anything, it can be the opposite! Getting your ideas down on paper, coherently, and with relevant references to research from across disciplines is a sophisticated competency. But my sense is that even if students don’t walk out with an easy A, they walk out with a set of skills that is, in the long run, far more valuable.

With an eye to business models and balance sheets, many universities have decided they no longer need their theology departments, and given the current financial precarity of the higher education sector, on paper this may be true. But society is crying out for nuanced ways forward through complex situations, and the problems of social division are more apparent than ever. Whilst it is clear that fewer and fewer students are choosing to take whole theology degrees, it is also clear that the world still needs theologians.
