
The Cold War, the Internet and America’s nones

How does a culture lose religion so rapidly? Stephen Bullivant investigates the American phenomenon of ‘nonversion’.

Stephen Bullivant is a professor of theology and sociology at St Mary’s University, UK, and the University of Notre Dame, Australia.

Image from the series South Park

Those even passingly familiar with American religious trends over the past couple of decades will have read or heard the phrase ‘rise of the nones’ countless times. And with good reason. While in Britain the proportion of people telling pollsters they have ‘no religion’ grew rapidly over the twentieth century – it was already 43% when the British Social Attitudes survey began in 1983; it tipped 50% a little over a decade later – in America the figure stayed stubbornly low. According to Gallup polls, only 5% of American adults identified with no religion in 1975. Twenty years later, in 1995, it was still 5%.

But then, seemingly very suddenly, things started to change. Beginning in the late nineties, then rapidly accelerating in the early 2000s, each new survey showed the nones getting bigger and bigger. Depending on which survey one looks at, nones now account for somewhere between a quarter and a third of American adults. Even at the lower end, that amounts to some 60 million people. And they’re still growing.

This raises a natural question: Why now? Or rather, what is it about the nineties and early 2000s that pushed or pulled large swathes of Americans out of thinking of themselves as religious? Various ways of measuring American religiosity all indicate that something significant must have happened around then. But what?

A prior, deeper puzzle

That, at least, is the obvious way of approaching things. And to be fair, it has much to recommend it: something, or rather a combination of somethings, certainly did happen to American religion in those critical years. But this in itself raises a prior, deeper puzzle: why hadn’t the numbers of American nones already risen before the late nineties or early noughties? In all manner of other quasi-comparable countries – Britain, Canada, Australia, France – the nones started growing steadily from the 1960s onwards. Yet while the sixties had all manner of other disruptive and destabilizing effects on American culture, society, politics, and religion, the proportion of nones grew only a little bit, then stopped.

At the risk of gross oversimplification, if one were to look for a sufficiently big ‘something’ within American society, mostly absent from those other countries, which could plausibly have kept non-religiosity artificially low in these decades, then there is an obvious candidate: the Cold War. Or more specifically, the precise and peculiarly religious way in which it was framed in the USA. 

A final, all-out battle

We Brits were as up to our necks in the Cold War as anyone. But only in America, I think, was the Cold War ever popularly framed as a “final, all-out battle between communistic atheism and Christianity”, to quote Republican Senator Joseph McCarthy. Remember too that it was only in the mid-1950s that Congress adopted “In God We Trust” as America’s official motto, and that “under God” was added to the Pledge. During the Pledge debates in Congress, the Democrat Louis C. Rabaut summed up a common view on both sides of the aisle:

“You may argue from dawn to dusk about differing political, economic, and social systems but the fundamental issue which is the unbridgeable gap between America and Communist Russia is a belief in almighty God.”

This wasn’t just an issue with wide bipartisan and popular support; it was thoroughly ecumenical too. While McCarthy and Rabaut were Catholics, it was a Presbyterian president, Eisenhower, who signed the “under God” bill into law. As Eisenhower himself put it during his 1952 election campaign:

“What is our battle against communism if it is not a fight between anti-God and a belief in the Almighty?”

Embellishing the city on a hill

It was also during the Cold War that presidents began likening America to the biblical “city built on a hill” – all the better positioned, one presumes, to scour the horizon for incoming Soviet missiles. Kennedy was the first US president to use it. Reagan, adding his own embellishment of “shining,” would make it his, and many of his countrymen’s, own. Taken together, all this helped lay down a deep, implicit association between being un-religious and being un-American. Atheism itself bore the brunt of this, but it more generally ruled out associated ideas and identities – including thinking of oneself as having “no religion” – as live options for the great majority of Americans.

Riven fault lines

Meanwhile, the cultural fault lines that began so obviously opening up in the late sixties – gender equality, sexual liberation – kept on widening, with new generations socialized into ever more liberal baselines. This created a growing values gap between traditional Christian views and the wider mainstream culture, on topics that were very personal to, and thus felt very deeply by, people on all sides. This meant that, while churches tended to be most visible on the ‘conservative’ side of various battlegrounds, they were also often deeply riven by internal versions of the same debates. Not surprisingly, church attendance, at least within Catholic and mainline churches, started falling steadily in the seventies and (except where immigration has helped fill the pews) has never really stopped.

The Internet of ideas and identities

On this basic account – and there is much that could be, and elsewhere has been, added to it – the thawing of the Cold War is obviously significant. Note that it is the Millennial generation, only the oldest of whom are able to remember the Cold War (and even then mostly from holiday reruns of Red Dawn and Rocky IV), who were at the vanguard of the rise of the nones. They were also the first generation to be true digital natives, opening many of them up to a much wider range of ideas and identities than hitherto available. This has been especially effective at chipping away at the walls of some of America’s stronger religious subcultures. My ex-Mormon interviewees, especially, cite “the wonderful thing called the internet” as being “the game-changer”.

Serious discussion and South Park

The Millennials started coming of age, and indeed responding to pollsters’ surveys, in the early 2000s. This was also around the time when, arguably for the first time since the hugely popular writer and speaker Robert “The Great Agnostic” Ingersoll a century before, unbelief was being seriously discussed everywhere from primetime talkshows to episodes of South Park. The bestselling books of the New Atheists – principally Sam Harris, Richard Dawkins, Daniel Dennett, and Christopher Hitchens – evidently hit upon some long pent-up demand. They were also, significantly here, able to position atheism, and ‘no religion’ more generally, as a panacea for a world awash with religion. Harris, for example, makes much of how he started writing The End of Faith on September 12th. Dawkins made no secret of wanting to run adverts with an image of the Twin Towers and the tagline “Imagine no religion…”.

Cultural space opens

Whatever one makes of such arguments, similar rhetorical moves would have had less intuitive appeal to earlier American generations, learning to duck and cover from atheists’ H-bombs: the stuff of Americans’ nightmares was now those with too much religion, rather than not enough. While the long-term impact of the not-so-New Atheism is hard to judge – many nones are keen to distance themselves from what they see as its “dogmatism” and “extremism”, even while agreeing with much of it – it certainly helped open up ‘cultural space’ for being both American and non-religious that the Cold War had (outside of various enclaves, such as college towns and certain big cities) largely sealed shut. As we have seen, it is one that a good quarter of American adults are quite comfortable placing themselves within.

So yes, new somethings indeed happened in the final years of the twentieth century and the first years of the twenty-first: and these helped drive the uptick of nones. But these happened at the same time as the none-inhibiting effects of a much earlier something had more or less worn off, especially among the emerging generations most affected by the new somethings. It is this combination of factors – akin to pulling one foot off the brake as you slam the other down on the accelerator – that explains quite why the nones rose so suddenly and (seemingly) out of nowhere.

 


The consequences of truth-telling are so severe our leaders can’t admit their mistakes

When accountability means annihilation, denial is the only way to survive
A woman talks in an interview.
Baroness Casey.
BBC.

Why do our leaders struggle so profoundly with admitting error? 

The media and public inquiries regularly report on such failures in the NHS, the Home Office, the Department for Work and Pensions, HMRC, the Metropolitan Police, the Ministry of Defence, and many more public institutions, often accompanied by harrowing personal stories of the harm done.

In a recent white paper (From harm to healing: rebuilding trust in Britain’s publicly funded institutions), I defined “harm” as a holistic concept, arising where physical injury or mental distress is inflicted and sustained, and explained that harm is generally something that is caused, possibly resulting in injury or loss of life.

When we look at harm from an institutional perspective, structural power dynamics inevitably oppress certain groups, limit individual freedoms, and negatively affect the safety and security of individuals. But when we look at it through the lens of the individuals who run those institutions, we see people who often believe they are acting in good faith, who believe their decisions won’t have a significant impact, who don’t have time to think about the decisions they are making or, worse still, who prefer to protect their own interests.

Even well-intentioned leaders can become complicit in cycles of harm – not through malice, but through a lack of self-awareness and an unwillingness to put themselves in the shoes of the person on the receiving end of their decisions.

Martin Luther King Jr famously said that “the ultimate measure of a man is not where he stands in moments of comfort and convenience, but where he stands at times of challenge and controversy.” In contemporary politics, leaders are neither selected for their humility nor, for the most part, retained because of it. Humility is treated as synonymous with weakness, and showing weakness must be avoided at all costs. Responsibility is perceived as something that lies outside of us, rather than something we can take ownership of from within.

So, why do leaders struggle so profoundly with admitting error? 

The issue is cultural and three-fold. 

First, we don’t quantify or systematically address human error, allowing small mistakes to escalate. 

We then enable those responsible to evade accountability through institutional protection and legal barriers. 

Finally, we actively discourage truth-telling by punishing whistle-blowers rather than rewarding transparency. Taken together, these create the very conditions that transform errors into institutional harm.  

Nowhere is this plainer than in Baroness Casey’s recent report on Group-based Child Sexual Exploitation and Abuse, which prompted the Government to announce a grooming gangs inquiry. In this case, the initial harm was compounded by denial and obfuscation, resulting not just in an institutional failure to protect children, but in system-wide failures that have enabled the so-called “bad actors” to remain in situ.

Recently, this trend was bucked at the Countess of Chester Hospital, where police arrested three hospital managers as part of the Lucy Letby investigation. Previously, senior leadership had been protected, allowing them to evade accountability. Humble leadership would look like acting when concerns are raised, before they become scandals. In this case, however, leadership did act – but chose to bury the truth rather than believe the whistle-blowers.


The answer to our conundrum is obvious. In Britain, accountability is conflated with annihilation. Clinging on to power is the only option because admitting error has become synonymous with career suicide and legal liability, and is tantamount to being hanged on the gallows of social media. We have managed to create systems of governing where the consequences of truth-telling are so severe that denial is the only survival mechanism left. We have successfully weaponised accountability rather than understanding it as the foundation of trust.

If Rotherham Metropolitan Borough Council had admitted even half of the failures that Alexis Jay OBE identified in her 2014 report and that Baroness Casey identifies in her 2025 audit, leaders would face not only compensation claims but media storms, regulatory sanctions, and individual prosecutions. Putting someone through that is so unthinkable that we shrink back, empathising with why they might not speak up. But this is not justice. Justice is what the families of Hillsborough have been seeking in the Public Authority (Accountability) Bill: legal duties of candour, criminal offences for those who deliberately mislead investigations or cover up service failures, legal representation, and appropriate disclosure of documentation.

Regardless of your political persuasion, it has to be right that when police misconduct occurs, officers should fear disciplinary action and criminal charges. When politicians admit mistakes, they should face calls for their resignation. Public vilification is par for the course. Being ejected from office is the bare minimum required to take accountability for their actions.

The white paper shows that the cover-up always causes more damage than the original error. Institutional denial – whether relating to the Post Office sub-postmasters, the infected blood scandal victims, grooming gang victims, Grenfell Tower victims, Windrush claimants, or Hillsborough families – compounds the original harm exponentially.

In a society beset by blame, shame, and fame, it is extraordinary that this struggle to admit error is so pervasive. Survivors can and will forgive human fallibility. What they will not forgive is the arrogance of institutions that refuse to acknowledge when they have caused harm.

The white paper sets out a four-fold restorative framework that starts with acknowledgment, not punishment. The courage to say “we were wrong” is merely the first step. Next come apology and accountability, followed by amends. The framework recognises that healing – not just legal resolution – must be at the heart of justice, treating both those harmed and those who caused the harm as whole human beings deserving of dignity.

Until we separate admission of error from institutional destruction, we will continue to incentivise the very cover-ups that erode public trust. I was recently struck by Baroness Onora O’Neill, who insisted that we must demand trustworthiness in our leaders. We cannot have trustworthiness without truth-telling, and we cannot have that without valuing the act of repairing harm over reputation management. True authority comes from service, through vulnerability rather than invulnerability; strength comes through the acknowledgement of weakness, not the projection of power.

We must recognise that those entrusted with power have a moral obligation to those they serve. That obligation transcends institutional self-interest. Thus, we must stop asking why leaders struggle to admit error and instead ask why we have made truth-telling so dangerous that lies seem safer.