
Confessions of an atheist philosopher. Part 3: the secret about truth I learned at seminary

In the third of a series, philosopher Stefani Ruper recalls learning a crucial lesson about her knowledge and her truth claims.

Stefani Ruper is a philosopher specialising in the ethics of belief and an Associate Member of Christ Church, Oxford. She received her PhD from the Faculty of Theology and Religion at the University of Oxford in 2020.

An unfocused view down onto stacks of books in an old library.
Jana Kowalewicz on Unsplash.

My name is Stefani. I was a committed atheist for almost my entire life. I studied religion to try to figure out how to have spiritual fulfilment without God. I tried writing books on spirituality for agnostics and atheists, but I gave up because the answers were terrible. Two years after completing my PhD, I finally realised that that’s because the answer is God.

Today, I explain how and why I decided to walk into Christian faith.  

Here at Seen and Unseen I am publishing a six-article series highlighting key turning points or realisations I made on my walk into faith. It tells my story, and it tells our story too.  

 

For the first 20 years of my life, I thought religion was for stupid and weak people. I carried a copy of Richard Dawkins’s The God Delusion in my purse. I studied science as a way to defeat religion. 

But one day, while I was titrating an iron solution in a laboratory, a sudden realisation crashed over me. I remember just staring at the orange solution simmering in the beaker, thinking, “oh no, oh no, oh no, oh no, oh no.”

The realisation was that I had dismissed religion as stupid without ever engaging with it. I had never even asked religious people what they thought! I had done all this while priding myself on open-mindedness.

This struck me as deeply hypocritical. I had always thought that one of the hallmarks of a good thinker was being able to defend the ideas of your enemies. I wasn’t even close.

So, I printed 500 pages from Zygon: Journal of Religion and Science. I sat down with a cup of tea. And after reading just two pages, I set the stack of paper back down on the desk and thumped my head down on top of them.

Oh no.  

The theologians had a point.

To seminary 

Twelve months later, I dropped my duffel bag on the floor of my new room in Theology House. Theology House was the residence of the most earnest students training to be pastors at the Boston University School of Theology. 

I was an atheist, but the seminary administrators gave me the benefit of the doubt when I told them I wanted to be as immersed in the world of faith as possible. We had a house dinner planned for that night, and school was to begin on Monday. I couldn’t wait. I was going to get a master’s degree in theology as an atheist.

I spent the next two years proving my old self wrong. It was delightful. Every day was a new opportunity to unearth another bias I didn’t know I had, or to discover another philosophical approach I hadn’t known existed. It was occasionally difficult to let go of certain cherished ideas, but it was more than worth it. The intellectual richness of faith blew my mind over and over. 

About six months into my studies, I ran into a secular friend I used to sit around and bash religion with.  

“So, what have you learned at seminary?” he asked me, grimacing. I told him the simple but life-changing truth: Christianity is intellectually rigorous. It’s reasonable. It can even be beautiful.  

“Did you become a believer?” he asked. “No,” I said, shrugging. “But I’m beginning to understand why other people do.”

Why do we believe what we believe?  

The most important question I ended up asking at seminary was about the nature of belief itself. I needed to understand: how could my roommates and I all work so hard to be reasonable, but still believe such different things?  

Rationality, I learned, is always contextual. All of us would like to think that what we believe—what seems to us the obvious, “rational” conclusion—is the truth. But it’s not. There are eight billion people on this planet and every single one of us thinks we are right about everything.  

Each conclusion each of us draws comes from applying our best possible reasoning to the model of reality that lives in our heads. These models are always under revision; they are the result of the model of one minute ago plus whatever happened in that minute. This process stretches all the way back to before birth, since exposure to different sounds and nutrients in the womb shaped how we began making sense of the world. Then we were born into contexts that came pre-laden with various metaphysical presuppositions, attitudes, and values. Throughout life we did and continue to do our best to reason within these models and to steer their development.

This “best reasoning” is never pure intellect. There is no such thing as reason unbiased by feeling. It is now widely accepted in cognitive science that thought and feeling are always intertwined.

Indeed, rationality itself may be best thought of as a feeling. The philosopher William James argued that we deem things true when they give us the “sentiment of rationality”—that is, a feeling of satisfaction or harmony that occurs when an idea fits well with our current model of reality. This doesn’t mean reason and reasonableness don’t exist; it means that, contrary to the popular myth that quality thinking is free of emotion, emotional awareness is a key element of it.

My friends and I were all reasonable while believing different things because we each made a sincere effort to improve our reasoning as thought-feelers born into different models of reality. None of us could claim with 100 per cent certainty that we were correct. What we could do was welcome new insights into ourselves, one another, and the world that would help us keep developing our models in the direction of truth.

The path to truth  

By the time I graduated from seminary, I hadn’t changed my mind on God. I remained a firm atheist. 

But I had learned a crucial lesson: my knowledge and truth claims were far from perfect. If I wanted to say true things or to keep getting closer to the truth—which I very much did, my loyalty to truth still my highest value—I needed to do two things:  

First, I needed to keep untangling my own personal history, thoughts, and feelings. Only through self-awareness could I unpack my own biases, hone my capacities to reason amidst emotion, and discern the elements of my worldview worth keeping or leaving behind.  

Second, I needed to keep engaging people who were different from me. Only through exposure to new ideas could I expand or develop my own.  

 Today, my model of reality includes something I thought it never would: God. But this change took twelve years of the most careful, self-aware, humble, prudent, and open-minded quest for truth I could manage. 

I’m not done revising the model, and I won’t ever be. God will almost surely remain a part of it, but I’m open to the possibility He will not. I’ll keep learning about myself; I’ll keep learning about others; I’ll keep steering my model as responsibly as I am able. 

The ultimate truth of things beats at the heart of all our eight billion different perspectives; the best any of us can do is keep working to beat in harmony with it. 

  

Related articles

https://www.seenandunseen.com/confessions-atheist-philosopher-part-1-born-be-atheist-born-be-anxious  

Confessions of an atheist philosopher. Part 2: The making of rage against religion | Seen & Unseen

  


Calls to revive the Enlightenment ignore its own illusions

Returning to the Age of Reason won’t save us from post-truth

Alister McGrath retired as Andreas Idreos Professor of Science and Religion at Oxford University in 2022.

In the style of a Raeburn portrait, a set of young people lounge around on their phones looking diffident.
Enlightened disagreement (with apologies to Henry Raeburn).
Nick Jones/Midjourney.ai.

Is truth dead? Are we living in a post-truth era where forcefully asserted opinions overshadow evidence-based public truths that once commanded widespread respect and agreement? Many people are deeply concerned about the rise of irrational beliefs, particularly those connected to identity politics, which have gained considerable influence in recent years. It seems we now inhabit a culture where emotional truths take precedence, while factual truths are relegated to a secondary status. Challenging someone’s beliefs is often portrayed as abusive, or even as a hate crime. Is it any surprise that irrationality and fantasy thrive when open debate and discussion are so easily shut down? So, what has gone wrong—and what can we do to address it? 

We live in an era marked by cultural confusion and uncertainty, where a multitude of worldviews, opinions, and prejudices vie for our attention and loyalty. Many people feel overwhelmed and unsettled by this turmoil, often seeking comfort in earlier modes of thinking—such as the clear-cut universal certainties of the eighteenth-century “Age of Reason.” In a recent op-ed in The Times, James Marriott advocates for a return to this kind of rational thought. I share his frustration with the chaos in our culture and the widespread hesitation to challenge powerful irrationalities and absurdities out of fear of being cancelled or marginalised. However, I am not convinced that his proposed solution is the right one. We cannot simply revert to the eighteenth century. Allow me to explain my concerns.

What were once considered simple, universal certainties are now viewed by scholars as contested, ethnocentric opinions. These ideas gained prominence not because of their intellectual merit, but due to the economic, political, and cultural power of dominant cultures. “Rationality” does not refer to a single, universal, and correct way of thinking that exists independently of our cultural and historical context. Instead, global culture has always been a bricolage of multiple rationalities. 

The great voyages of navigation of the early seventeenth century made it clear that African and Asian understandings of morality and rationality differed greatly from those in England. These accounts should have challenged the emerging English philosophical belief in a universal human rationality. However, rather than recognising a diverse spectrum of human rationalities—each shaped by its own unique cultural evolution—Western observers dismissed these perspectives as “primitive” or “savage” modes of reasoning that needed to be replaced by modern Western thought. This led to forms of intellectual colonialism, founded on the questionable assumption that imposing English rational philosophies was a civilising mission intended to improve the world.

Although Western intellectual colonialism was often driven by benign intentions, its consequences were destructive. The increasing influence of Charles Darwin’s theory of biological and cultural evolution in the late nineteenth century led Darwin’s colleague, Alfred Russel Wallace, to conclude that intellectually and morally superior Westerners would “displace the lower and more degraded races,” such as “the Tasmanian, Australian and New Zealander”—a process he believed would ultimately benefit humanity as a whole. 

We can now acknowledge the darker aspects of the British “Age of Reason”: it presumed to possess a definitive set of universal rational principles, which it then imposed on so-called “primitive” societies, such as its colonies in the South Pacific. This reflected an ethnocentric illusion that treated distinctly Western beliefs as if they were universal truths.

A second challenge to the idea of returning to the rational simplicities of the “Age of Reason” is that its thinkers struggled to agree on what it meant to be “rational.” This insight is often attributed to the philosopher Alasdair MacIntyre, who argued that the Enlightenment’s legacy was the establishment of an ideal of rational justification that ultimately proved unattainable. As a result, philosophy relies on commitments whose truth cannot be definitively proven and must instead be defended on the basis of assumptions that carry weight for some, but not for all. 

We have clearly moved beyond the so-called rational certainties of the “Age of Reason,” entering a landscape characterised by multiple rationalities, each reasonable in its own unique way. This shift has led to a significant re-evaluation of the rationality of belief in God. Recently, the Australian atheist philosopher Graham Oppy has argued that atheism, agnosticism, and theism should all be regarded as “rationally permissible” based on the evidence and the rational arguments supporting each position. Although Oppy personally favours atheism, he does not expect all “sufficiently thoughtful, intelligent, and well-informed people” to share his view. He acknowledges that the evidence available is insufficient to compel a definitive conclusion on these issues. All three can claim to be reasonable beliefs.

The British philosopher Bertrand Russell contended that we must learn to accept a certain level of uncertainty regarding the beliefs that really matter to us, such as the meaning of life. Russell’s perspective on philosophy provides a valuable counterbalance to the excesses of Enlightenment rationalism: “To teach how to live without certainty, and yet without being paralyzed by hesitation, is perhaps the chief thing that philosophy, in our age, can still do for those who study it.” 

Certainly, we must test everything and hold fast to what is good, as St Paul advised. It seems to me that it is essential to restore the role of evidence-based critical reasoning in Western culture. However, simply returning to the Enlightenment is not a practical solution. A more effective approach might be to gently challenge the notion, widespread in some parts of our society, that disagreement equates to hatred. We clearly need to develop ways of modelling respectful and constructive disagreement, in which ideas can be debated and examined without diminishing the value and integrity of those who hold them. This is no easy task—yet we need to find a way of doing it if we are to avoid fragmenting into cultural tribes and losing any sense of a “public good.”
