
The Cold War, the Internet and America’s nones

How does a culture lose religion so rapidly? Stephen Bullivant investigates the American phenomenon of ‘nonversion’.

Stephen Bullivant is a professor of theology and sociology at St Mary’s University, UK, and the University of Notre Dame, Australia.

Image from the series South Park

Those even passingly familiar with American religious trends over the past couple of decades will have read or heard the phrase ‘rise of the nones’ countless times. And with good reason. While in Britain the proportion of people telling pollsters they have ‘no religion’ grew rapidly over the twentieth century – it was already 43% when the British Social Attitudes survey began in 1983; it tipped 50% a little over a decade later – in America the figure stayed stubbornly low. According to Gallup polls, only 5% of American adults identified with no religion in 1975. Twenty years later, in 1995, it was still 5%.

But then, seemingly very suddenly, things started to change. Beginning in the late nineties, then rapidly accelerating in the early 2000s, each new survey showed the nones getting bigger and bigger. Depending on which survey one looks at, nones now account for somewhere between a quarter and a third of American adults. Even at the lower end, that amounts to some 60 million people. And they’re still growing.

This raises a natural question: Why now? Or rather, what is it about the nineties and early 2000s that pushed or pulled large swathes of Americans out of thinking of themselves as religious? Various ways of measuring American religiosity all indicate that something significant must have happened around then. But what?

A prior, deeper puzzle

That, at least, is the obvious way of approaching things. And to be fair, it has much to recommend it: something, or rather a combination of somethings, certainly did happen to American religion in those critical years. But this in itself raises a prior, deeper puzzle: why hadn’t the numbers of American nones already risen before the late nineties or early noughties? In all manner of other, quasi-comparable countries – Britain, Canada, Australia, France – the nones started growing steadily from the 1960s onwards. Yet while the sixties had all manner of disruptive and destabilizing effects on American culture, society, politics, and religion, the proportion of nones grew only a little, then stopped.

At the risk of gross oversimplification, if one were to look for a sufficiently big ‘something’ within American society, mostly absent from those other countries, which could plausibly have kept non-religiosity artificially low in these decades, then there is an obvious candidate: the Cold War. Or more specifically, the precise and peculiarly religious way in which it was framed in the USA. 

A final, all-out battle

We Brits were as up to our necks in the Cold War as anyone. But only in America, I think, was the Cold War ever popularly framed as a “final, all-out battle between communistic atheism and Christianity”, to quote Republican Senator Joseph McCarthy. Remember too that it was only in the mid-1950s that Congress adopted “In God We Trust” as America’s official motto, and “under God” was added to the Pledge of Allegiance. During the Pledge debates in Congress, the Democrat Louis C. Rabaut summed up a common view on both sides of the aisle:

“You may argue from dawn to dusk about differing political, economic, and social systems but the fundamental issue which is the unbridgeable gap between America and Communist Russia is a belief in almighty God.”

This wasn’t just an issue with wide bipartisan and popular support; it was thoroughly ecumenical too. While McCarthy and Rabaut were Catholics, it was a Presbyterian president, Eisenhower, who signed the “under God” bill into law. As Eisenhower himself put it during his 1952 election campaign:

“What is our battle against communism if it is not a fight between anti-God and a belief in the Almighty?”

Embellishing the city on a hill

It was also during the Cold War that presidents began likening America to the biblical “city built on a hill” – all the better positioned, one presumes, to scour the horizon for incoming Soviet missiles. Kennedy was the first US president to use it. Reagan, adding his own embellishment of “shining”, would make it his, and many of his countrymen’s, own. Taken together, all this helped lay down a deep, implicit association between being un-religious and being un-American. Atheism itself bore the brunt of this, but the association more generally ruled out related ideas and identities – including thinking of oneself as having “no religion” – as live options for the great majority of Americans.

Riven fault lines

Meanwhile, the cultural fault lines that began so visibly opening up in the late sixties – gender equality, sexual liberation – kept on widening, with new generations socialized into ever more liberal baselines. This created a growing values gap between traditional Christian views and the wider mainstream culture, on topics that were very personal to, and thus felt very deeply by, people on all sides. This meant that, while churches tended to be most visible on the ‘conservative’ side of various battlegrounds, they were also often deeply riven by internal versions of the same debates. Not surprisingly, church attendance, at least within Catholic and mainline churches, started falling steadily in the seventies and (except where immigration has helped fill the pews) has never really stopped.

The Internet of ideas and identities

On this basic account – and there is much that could be, and elsewhere has been, added to it – the thawing of the Cold War is obviously significant. Note that it is the Millennial generation, only the oldest of whom are able to remember the Cold War (and even then mostly from holiday reruns of Red Dawn and Rocky IV), who were at the vanguard of the rise of the nones. They were also the first generation of true digital natives, opening many of them up to a much wider range of ideas and identities than hitherto available. This has been especially effective at chipping away at the walls of some of America’s stronger religious subcultures. My ex-Mormon interviewees, especially, cite “the wonderful thing called the internet” as being “the game-changer”.

Serious discussion and South Park

The Millennials started coming of age, and indeed responding to pollsters’ surveys, in the early 2000s. This was also around the time when, arguably for the first time since the hugely popular writer and speaker Robert “The Great Agnostic” Ingersoll a century before, unbelief was being seriously discussed everywhere from primetime talkshows to episodes of South Park. The bestselling books of the New Atheists – principally Sam Harris, Richard Dawkins, Daniel Dennett, and Christopher Hitchens – evidently hit upon some long pent-up demand. They were also, significantly here, able to position atheism, and ‘no religion’ more generally, as an antidote to a world awash with religion. Harris, for example, makes much of how he started writing The End of Faith on September 12th. Dawkins made no secret of wanting to run adverts with an image of the Twin Towers and the tagline “Imagine no religion…”.

Cultural space opens

Whatever one makes of such arguments, similar rhetorical moves would have had far less intuitive appeal to earlier American generations, who had learned to duck and cover from atheists’ H-bombs: the stuff of Americans’ nightmares was now those with too much religion, rather than not enough. While the long-term impact of the not-so-New Atheism is hard to judge – many nones are keen to distance themselves from what they see as its “dogmatism” and “extremism”, even while agreeing with much of it – it certainly helped open up a ‘cultural space’ for being both American and non-religious that the Cold War had (outside of various enclaves, such as college towns and certain big cities) largely sealed shut. As we have seen, it is a space that a good quarter of American adults are now quite comfortable placing themselves within.

So yes, new somethings indeed happened in the final years of the twentieth century and the first years of the twenty-first: and these helped drive the uptick of nones. But they happened just as the none-inhibiting effects of a much earlier something had more or less worn off, especially among the emerging generations most affected by the new somethings. It is this combination of factors – akin to lifting one foot off the brake while slamming the other down on the accelerator – that explains why the nones rose so suddenly and (seemingly) out of nowhere.

 


It’s a dreadful thing when we regard the disabled, the dependent, and the different as disposable

An MND sufferer reflects on the historic vote to legalise assisted dying.
MPs await the result of the vote in a crowded House of Commons. Credit: Parliament TV.

I can’t say I’m surprised, but I am disappointed. The euthanasia juggernaut has been gathering momentum throughout the western world. In this country it appeared as the Voluntary Euthanasia Society, later rebranded as the richly endowed Dignity in Dying. It has been beavering away for decades, with well-publicised personal stories and legal cases that have been very effective in persuading general opinion that dying is frequently nasty and that we should have the right to choose when and how to die. The organisation resisted using the term ‘suicide’, which is what it advocates, realising that the word opens it up to the accusation of devaluing life. So, I’m not surprised that MPs have, after an impassioned debate, by a narrow majority, eventually given way to the pressure.

A fortnight ago, I had my annual check-up at the motor neurone disorder clinic and subsequently received the GP letter.

“Date seen 02/06/2025…  Diagnosis (this visit) Primary Lateral Sclerosis…  Symptom onset 2000”.

I well remember the year 2000, my voice deteriorating, my balance starting to fail me, resulting finally a year later in the consultant’s verdict, “You have a motor neurone disorder.”

I knew what that meant, as at the time Diane Pretty, backed and publicised by the Voluntary Euthanasia Society, was fighting through the courts, as far as the European Court of Human Rights, for the right to have her husband take her to the Dignitas “clinic” in Switzerland to commit suicide. It was a frightening time to receive an MND diagnosis, and it still is today. The normal progression is both swift and relentless. However, the Motor Neurone Disease Association does say that “in the majority of cases, death with MND is peaceful and dignified”.

At that time I could have been depressed; I could have dwelt on how much care I would need, and how much it might eat into our savings; I could have feared the physical and emotional toll it would take on my wife; I could have been desperate about the future. Certainly I was vulnerable. Fortunately, I was of an optimistic nature and had plenty of reasons for living.

But it could easily have been otherwise. I might well have panicked and opted for a doctor to help me die, had the law debated in the Commons today been in effect. Then I wouldn’t have seen two sons getting married, nor grandchildren being born and growing up. I would have missed out on twenty years of an increasingly restricted but paradoxically fulfilled life.

Of course you might argue that I’m ‘lucky’ to have, as became clear over the years, an exceptionally rare and slow form of MND, but I wasn’t to know that – as indeed none of us can, despite our doctors’ best predictions. Indeed I am lucky to be alive.

However, it was my experience that brought me face to face with the fact of my own mortality and the issue of assisted dying. There seemed to me to be four main drivers: first, the desire for autonomy; second, the insistence on independence; third, a sort of compassion; and fourth, finance. There were two further factors: fear of death and fear of being “a burden”.

Autonomy

It’s a modern western concept that humans are by nature autonomous beings, meaning that choice is an inalienable right. I once co-wrote a book with the title, I Choose Everything, based on a quote of Thérèse of Lisieux. It came from a childhood incident, but it did not mean she reserved the right to total autonomy – rather the opposite. As she later wrote, “I fear only one thing: to keep my own will; so take it, for ‘I choose all!’ that you (God) will!”

Absolute choice is not a virtue. Even choosing where to drive your car is not an unfettered right, since your driving can endanger other road users. There are many such limitations on freedom, or taboos, that protect others in a society. The prohibition on taking someone’s life, directly or indirectly, is a universal one. Individuals submitting to a higher authority is what holds a community and a nation together.

Independence

Another related modern heresy is the ideal of independence. How utterly fatuous this is! None of us is born independent. We’re born relational. All of our lives we are interdependent. Being cared for is not to be lacking in dignity. Being 100% dependent does not deprive someone of their human dignity. Even the most disabled person is a human being made in the image of God. It is a dreadful thing when a society regards the disabled, the dependent, the different, the mentally deficient and the declining as inferior and potentially disposable. Of course the advocates of the Bill would vehemently deny that they or it implied any such thing. Yet the history of the twentieth century bears witness to how subtly a society can be seduced by the pernicious philosophy of eugenics.

Compassion

It is a modern paradox that medical advances have contributed to the illusion that death is to be feared. Yes, death has always been the last enemy and, yes, we hope it will be peaceful. But we shall all die. Contrary to received wisdom, the compassionate response to that fact of life is not to “put someone out of their misery”; compassion (literally, ‘suffering with’) means being with them in their suffering. This is what good palliative care provides, making the end of life dignified, worth living and even pain free.

As former Prime Minister Gordon Brown pertinently asked, “When only a small fraction of the population are expected to choose assisted dying, would it not be better to focus all our energies on improving all-round hospice care to reach everyone in need of end-of-life support?”

Finance

Of course palliative care costs more than helping patients to take their own lives. According to the Daily Mail, “Legalising assisted dying would save the taxpayer £10million in NHS costs in its first year, rising to £60million after a decade, according to grim new estimates published by the government.” The estimates are indeed grim, but also attractive to politicians straining to balance the national budget. Yet they raise the fundamental question: do we want to live in a society which values money over life?

That is the most fundamental of all the issues: the sanctity of life has been a core principle of all the Abrahamic faiths, which undergird our culture and way of life. In the words of Job on hearing of the death of all his children, “The Lord gave and the Lord has taken away.” The start and end of life are not ours to determine. We lack the wisdom of God.

Apparently the majority of our parliamentarians have decided to place that prerogative into the hands of suggestible and distinctly fallible human beings. We or our children shall, I fear, reap the whirlwind.

As an afterthought: I have a number of friends who disagree with me, often after the personal experience of watching a loved one die. I sympathise, and I suppose I must be glad for them that MPs have represented their wishes. And I would never condemn them if they decided to choose the route of assisted dying for themselves. I hope they won’t have to.

Meanwhile I trust that, when the Bill comes to the upper house, their Lordships will fulfil their function of revising it wisely and effectively. They certainly have relevant expertise, for example in the field of palliative care – which is in danger of being squeezed following this Bill.
