
Why’s there a pope in Rome?

A modern gathering sheds light on an ancient question: who brings the church together?
A pope wearing a white skull cap and white robe, viewed from behind. Coronel Gonorrea on Unsplash.

On 1st March, an event called Gather25 tried something it described as “unprecedented”: uniting global Christianity in a 25-hour worship broadcast. Each session was led by a different nation, which sent its top pastors out to bat. It was slickly organised and technologically sophisticated; I wished the event well, and prayed for it. While in more institutionalised church settings people are waiting on a new Archbishop of Canterbury, or praying for the Pope to recover, here was action. 

It got me thinking, though: when can it be said that the Church has properly met? Who decides to meet? Who sends the invites? Who confirms the decisions? Anyone reading the Gather25 website will have noticed its nods to the ‘Council of Nicaea’, the first important gathering of the ancient Church, in 325 AD. By citing this, Gather25 positioned itself downstream of a very prestigious meetup, perhaps hoping it would prove similarly ecumenical (from a Greek word to do with the whole household). Did Gather25 have the same Nicene status? 

Sadly, no. Gather25 was openhearted and dynamic, but it was a very particular slice of Christianity meeting for a historically specific form of worship. The difficulty is that it takes more than even the buzziest PR, or all our modern advances in communication and streaming, to summon something as untameable as the Church to order. What would we need for a gathering at which the Church is truly represented, and able to act officially with the “mind of Christ” (as St Paul puts it)? Wouldn’t you need something, or someone, able to steer the entire thing?  

At the Council of Nicaea in 325 AD, nothing less than the most powerful man on the planet would do. The Emperor Constantine alone had the clout to draw in Christian leaders from around the world, make them sit together, and demand an official settlement of a difficult question: how exactly was God the Son, Jesus, linked to God the Father? Even with such heavyweight patronage, church leaders did not produce an absolutely finished answer at the time - it took a further council, in Constantinople in 381 AD, to confirm the confession still known today as the Nicene Creed, which begins “I believe in one God, the Father Almighty”.  

Yet a problem had been touched on. Should a Caesar really have this kind of upper hand over a sacred institution? Some lines of thought cast the emperor as a kind of ‘living law’ who represented God to the Christian people he ruled over. But this could not be squared with key parts of a tradition in which Jesus had radically authorised servant leaders from among simple Galilean fishermen, known as the Twelve Apostles, or ‘sent ones’. Who should rule? 


The problem has not really gone away. If we ignore this question of legitimacy, we are at the mercy of raw power. Either we seek an authoritative means of unifying Christians, or it becomes a case of who happens to have the most cash, or the most Instagram followers. It has never been the case that the Church has just organically ‘met up’ without a protos, a first name on the team sheet. Indeed, when has this ever happened, in any sphere of life? It would be like a parliament forming without the invitation of the sovereign.  

It is against this bigger problem that the rise of the Christian leader in Rome must be viewed. Rome was, of course, the centre of an Empire during the first few Christian centuries, but it quickly gained a distinctly Christian prestige. St Paul’s letter to the congregations there continues to be one of the most sizzling documents in the New Testament; he was also martyred there, along with his fellow leader St Peter, one of the original Twelve. Rome was a big deal, and sources from as early as the first century show the city’s leading clergyman (or ‘bishop’), a man called Clement, being asked to weigh in on a dispute over 600 miles away. 

For Catholics like me, it is clear the Church was onto something. It would go on to discover that there was more to the Bishop of Rome than merely his occupation of a well-to-do area: a development would take place. Many church doctrines, after all, are the result of reflection, Scriptural deep dives, and the need for clearer, unified doctrine and practice - the Trinity, for example.  

And the Bishop of Rome’s role developed in a particular setting: while the figure of the emperor loomed ever larger in the East, a parallel momentum gathered around the leading cleric of the city where St Peter had passed on his mantle. For St Peter had, after all, been singled out by Jesus during his earthly ministry. In the Bible, the Gospel according to St Matthew depicts Jesus giving “the keys of the kingdom” to his follower Simon, whom he renames ‘Peter’, meaning rock, the rock on which he vows to build his church. Not only this - Peter is to strengthen his brothers (Luke 22:32) and to feed the Lord’s sheep (John 21). The Bishop of Rome was increasingly thought to have this Peter-like quality.  

Was this a bit of overreading, designed to give a senior cleric a Scriptural trump card to play against a secular leader in a petty power showdown? It is an accusation hard to shake off completely, sinful humanity being what it is. In the Middle Ages, a decree supposedly issued by the Roman Emperor, handing all his power over to the Bishop of Rome, the new pontifex maximus, was conveniently ‘discovered’ - it was, of course, a complete phoney designed to assert church power over secular rulers. But for Catholics, despite patchy moments, there has always been more to be said for the Pope (from Papa, ‘father’) as a legitimate consolidation of Jesus’ vision for the leadership of the Church he founded: a brotherhood, headed by a type of St Peter, the rock on which the Church is built.  

Not headed by St Peter’s successor as a flawless demigod, it should be said. For it is also part of Christian tradition about St Peter that he was capable of tremendous human weakness - on the night of Jesus’ arrest and trial, he denied three times that he ever knew him. Some Popes have sadly been downright wicked or self-serving. Nor is the Church headed by the Pope as a tyrant. St Peter confirms early doctrinal pronouncements for the Church - he declares, for example, that food laws should not prevent Jewish believers from enjoying table fellowship with other ethnic groups. But he is also frankly challenged by other leaders during the early meetings in Jerusalem. The Pope teaches not as a lone ranger, but always within a bigger fraternity of fellow bishops.  

In 2024, a Vatican department released a new document pondering what role the Pope could play in bringing together the separated brethren of world Christianity. I hope the offer is taken seriously. Because what remains compelling for Catholics is a figure who makes it possible, at the most foundational level, to say that the Church is One, as per Jesus’ prayer in John 17:21 - and all without needing to rely on good digital marketing. It is not mere pious sentiment for a Catholic to say they are genuinely connected to a global family of as many as 1.4 billion people, because they share a pastor who claims to serve the whole thing, the ‘servant of the servants of God’. Any critique of the Papacy - and there are many intelligible ones, raking over the sordid moments or disputing the Scriptural evidence - must still reckon with that question: what really keeps us together, then? The Catholic insistence has always been that answering ‘Jesus’ or ‘the Holy Spirit’ really amounts to saying “what I think Jesus wants; what I think the Holy Spirit is saying” - and that is, in effect, many popes instead of just one. 



Calls to revive the Enlightenment ignore its own illusions

Returning to the Age of Reason won’t save us from post-Truth

Alister McGrath retired as Andreas Idreos Professor of Science and Religion at Oxford University in 2022.

Enlightened disagreement (with apologies to Henry Raeburn). Nick Jones/Midjourney.ai.

Is truth dead? Are we living in a post-truth era where forcefully asserted opinions overshadow evidence-based public truths that once commanded widespread respect and agreement? Many people are deeply concerned about the rise of irrational beliefs, particularly those connected to identity politics, which have gained considerable influence in recent years. It seems we now inhabit a culture where emotional truths take precedence, while factual truths are relegated to a secondary status. Challenging someone’s beliefs is often portrayed as abusive, or even as a hate crime. Is it any surprise that irrationality and fantasy thrive when open debate and discussion are so easily shut down? So, what has gone wrong—and what can we do to address it? 

We live in an era marked by cultural confusion and uncertainty, where a multitude of worldviews, opinions, and prejudices vie for our attention and loyalty. Many people feel overwhelmed and unsettled by this turmoil, often seeking comfort in earlier modes of thinking, such as the clear-cut universal certainties of the eighteenth-century “Age of Reason.” In a recent op-ed in The Times, James Marriott advocates a return to this kind of rational thought. I share his frustration with the chaos in our culture and the widespread hesitation to challenge powerful irrationalities and absurdities out of fear of being cancelled or marginalised. However, I am not convinced that his proposed solution is the right one. We cannot simply revert to the eighteenth century. Allow me to explain my concerns. 

What were once considered simple, universal certainties are now viewed by scholars as contested, ethnocentric opinions. These ideas gained prominence not because of their intellectual merit, but due to the economic, political, and cultural power of dominant cultures. “Rationality” does not refer to a single, universal, and correct way of thinking that exists independently of our cultural and historical context. Instead, global culture has always been a bricolage of multiple rationalities. 

The great voyages of navigation of the early seventeenth century made it clear that African and Asian understandings of morality and rationality differed greatly from those in England. These accounts should have challenged the emerging English philosophical belief in a universal human rationality. However, rather than recognizing a diverse spectrum of human rationalities—each shaped by its own unique cultural evolution—Western observers dismissed these perspectives as “primitive” or “savage” modes of reasoning that needed to be replaced by modern Western thought. This led to forms of intellectual colonialism, founded on the questionable assumption that imposing English rational philosophies was a civilizing mission intended to improve the world. 

Although Western intellectual colonialism was often driven by benign intentions, its consequences were destructive. The increasing influence of Charles Darwin’s theory of biological and cultural evolution in the late nineteenth century led Darwin’s colleague, Alfred Russel Wallace, to conclude that intellectually and morally superior Westerners would “displace the lower and more degraded races,” such as “the Tasmanian, Australian and New Zealander”—a process he believed would ultimately benefit humanity as a whole. 

We can now acknowledge the darker aspects of the British “Age of Reason”: it presumed to possess a definitive set of universal rational principles, which it then imposed on so-called “primitive” societies, such as its colonies in the south Pacific. This reflected an ethnocentric illusion that treated distinctly Western beliefs as if they were universal truths. 

A second challenge to the idea of returning to the rational simplicities of the “Age of Reason” is that its thinkers struggled to agree on what it meant to be “rational.” This insight is often attributed to the philosopher Alasdair MacIntyre, who argued that the Enlightenment’s legacy was the establishment of an ideal of rational justification that ultimately proved unattainable. As a result, philosophy relies on commitments whose truth cannot be definitively proven and must instead be defended on the basis of assumptions that carry weight for some, but not for all. 

We have clearly moved beyond the so-called rational certainties of the “Age of Reason,” entering a landscape characterized by multiple rationalities, each reasonable in its own unique way. This shift has led to a significant reevaluation of the rationality of belief in God. Recently, Australian atheist philosopher Graham Oppy has argued that atheism, agnosticism, and theism should all be regarded as “rationally permissible” based on the evidence and the rational arguments supporting each position. Although Oppy personally favours atheism, he does not expect all “sufficiently thoughtful, intelligent, and well-informed people” to share his view. He acknowledges that the evidence available is insufficient to compel a definitive conclusion on these issues. All three can claim to be reasonable beliefs. 

The British philosopher Bertrand Russell contended that we must learn to accept a certain level of uncertainty regarding the beliefs that really matter to us, such as the meaning of life. Russell’s perspective on philosophy provides a valuable counterbalance to the excesses of Enlightenment rationalism: “To teach how to live without certainty, and yet without being paralyzed by hesitation, is perhaps the chief thing that philosophy, in our age, can still do for those who study it.” 

Certainly, we must test everything and hold fast to what is good, as St Paul advised. It seems to me essential to restore the role of evidence-based critical reasoning in Western culture. However, simply returning to the Enlightenment is not a practical solution. A more effective approach might be to gently challenge the notion, widespread in some parts of our society, that disagreement equates to hatred. We clearly need to develop ways of modelling respectful and constructive disagreement, in which ideas can be debated and examined without diminishing the value and integrity of those who hold them. This is no easy task, yet we must find a way of doing it if we are to avoid fragmenting into cultural tribes and losing any sense of a “public good.” 
