Explainer
America
Comment

The Cold War, the Internet and America’s nones

How does a culture lose religion so rapidly? Stephen Bullivant investigates the American phenomenon of ‘nonversion’.

Stephen Bullivant is a professor of theology and sociology at St Mary’s University, UK, and the University of Notre Dame, Australia.

Image from the series South Park

Those even passingly familiar with American religious trends over the past couple of decades will have read or heard the phrase ‘rise of the nones’ countless times. And with good reason. While in Britain the proportion of people telling pollsters they have ‘no religion’ grew rapidly over the twentieth century – it was already 43% when the British Social Attitudes survey began in 1983; it tipped 50% a little over a decade later – in America the figure stayed stubbornly low. According to Gallup polls, only 5% of American adults identified with no religion in 1975. Twenty years later, in 1995, it was still 5%.

But then, seemingly very suddenly, things started to change. Beginning in the late nineties, then rapidly accelerating in the early 2000s, each new survey showed the nones getting bigger and bigger. Depending on which survey one looks at, nones now account for somewhere between a quarter and a third of American adults. Even at the lower end, that amounts to some 60 million people. And they’re still growing.

This raises a natural question: why now? Or rather, what was it about the nineties and early 2000s that pushed or pulled large swathes of Americans out of thinking of themselves as religious? Various ways of measuring American religiosity all indicate that something significant must have happened around then. But what?

A prior, deeper puzzle

That, at least, is the obvious way of approaching things. And to be fair, it has much to recommend it: something, or rather a combination of somethings, certainly did happen to American religion in those critical years. But this in itself raises a prior, deeper puzzle: why hadn’t the numbers of American nones already risen before the late nineties or early noughties? In all manner of other, quasi-comparable countries – Britain, Canada, Australia, France – the nones started growing steadily from the 1960s onwards. Yet while the sixties had all manner of other disruptive and destabilizing effects on American culture, society, politics, and religion, the proportion of nones grew only a little bit, then stopped.

At the risk of gross oversimplification, if one were to look for a sufficiently big ‘something’ within American society, mostly absent from those other countries, which could plausibly have kept non-religiosity artificially low in these decades, then there is an obvious candidate: the Cold War. Or more specifically, the precise and peculiarly religious way in which it was framed in the USA. 

A final, all-out battle

We Brits were as up to our necks in the Cold War as anyone. But only in America, I think, was the Cold War ever popularly framed as a “final, all-out battle between communistic atheism and Christianity”, to quote Republican Senator Joseph McCarthy. Remember too that it was only in the mid-1950s that Congress adopted “In God We Trust” as America’s official motto, and “under God” was added to the Pledge of Allegiance. During the Pledge debates in Congress, the Democrat Louis C. Rabaut summed up a common view on both sides of the aisle:

“You may argue from dawn to dusk about differing political, economic, and social systems but the fundamental issue which is the unbridgeable gap between America and Communist Russia is a belief in almighty God.”

This wasn’t just an issue with wide bipartisan and popular support; it was thoroughly ecumenical too. While McCarthy and Rabaut were Catholics, it was a Presbyterian president, Eisenhower, who signed the “under God” bill into law. As Eisenhower himself put it during his 1952 election campaign:

“What is our battle against communism if it is not a fight between anti-God and a belief in the Almighty?”

Embellishing the city on a hill

It was also during the Cold War that presidents began likening America to the biblical “city built on a hill” – all the better positioned, one presumes, to scour the horizon for incoming Soviet missiles. Kennedy was the first US president to use it. Reagan, adding his own embellishment of “shining,” would make it his, and many of his countrymen’s, own. Taken together, all this helped lay down a deep, implicit association between being un-religious and being un-American. Atheism itself bore the brunt of this, but it more generally ruled out associated ideas and identities – including thinking of oneself as having “no religion” – as live options for the great majority of Americans.

Riven fault lines

Meanwhile, the cultural fault lines that began obviously opening up in the late sixties – gender equality, sexual liberation – kept on widening, with new generations socialized into ever more liberal baselines. This created a growing values gap between traditional Christian views and the wider mainstream culture, on topics that were very personal to, and thus felt very deeply by, people on all sides. This meant that, while churches tended to be most visible on the ‘conservative side’ of various battlegrounds, they were also often deeply riven by internal versions of the same debates. Not surprisingly, church attendance, at least within Catholic and mainline churches, started falling steadily in the seventies and (except where immigration has helped fill the pews) has never really stopped.

The Internet of ideas and identities

On this basic account – and there is much that could be, and elsewhere has been, added to it – the thawing of the Cold War is obviously significant. Note that it is the Millennial generation, only the youngest of whom are able to remember the Cold War (and even then mostly from holiday reruns of Red Dawn and Rocky IV), who were at the vanguard of the rise of the nones. They were also the first generation to be true digital natives, opening many of them up to a much wider range of ideas and identities than hitherto available. This has been especially effective at chipping away at the walls of some of America’s stronger religious subcultures. My ex-Mormon interviewees, especially, cite “the wonderful thing called the internet” as being “the game-changer”.

Serious discussion and South Park

The Millennials started coming of age, and indeed responding to pollsters’ surveys, in the early 2000s. This was also around the time when, arguably for the first time since the hugely popular writer and speaker Robert “The Great Agnostic” Ingersoll a century before, unbelief was being seriously discussed everywhere from primetime talkshows to episodes of South Park. The bestselling books of the New Atheists – principally Sam Harris, Richard Dawkins, Daniel Dennett, and Christopher Hitchens – evidently hit upon some long pent-up demand. They were also, significantly here, able to position atheism, and ‘no religion’ more generally, as a panacea for a world awash with religion. Harris, for example, makes much of how he started writing The End of Faith on September 12th. Dawkins made no secret of his wanting to run adverts with an image of the Twin Towers and the tagline “Imagine no religion…”.

Cultural space opens

Whatever one makes of such arguments, similar rhetorical moves would have had less intuitive appeal to earlier American generations, learning to duck and cover from atheists’ H-bombs: the stuff of Americans’ nightmares was now those with too much religion, rather than not enough. While the long-term impact of the not-so-New Atheism is hard to judge – many nones are keen to distance themselves from what they see as its “dogmatism” and “extremism”, even while agreeing with much of it – it certainly helped open up ‘cultural space’ for being both American and non-religious that the Cold War had (outside of various enclaves, such as college towns and certain big cities) largely sealed shut. As we have seen, it is a space that a good quarter of American adults are now quite comfortable placing themselves within.

So yes, new somethings indeed happened in the final years of the twentieth century and the first years of the twenty-first: and these helped drive the uptick of nones. But these happened at the same time as the none-inhibiting effects of a much earlier something had more or less worn off, especially among the emerging generations most affected by the new somethings. It is this combination of factors – akin to pulling one foot off the brake as you slam the other down on the accelerator – that explains quite why the nones rose so suddenly and (seemingly) out of nowhere.

 

Article
Comment
Digital
General Election 24
Politics

Are we really our vote?

Elections exacerbate the worst of our digital personalities.

Jamie is Vicar of St Michael's Chester Square, London.

An AI-generated montage shows two politicians back to back, surrounded by like, share and angry icons.
The divide
Nick Jones/Midjourney.ai.

All the world’s a stage. Never more so than in a general election. Amidst the usual stunts and gimmicks of political leaders in election season (and much of the drama unintended or badly scripted) we too have become the performers. It doesn’t matter that Rishi and Keir are ‘boring’ – the digital space has created platforms for us also to posture and present our political positions. But in acting for the crowd, I worry that we’re losing a sense of who we are.

If fame is the mask that eats the face of its wearer, then we’re all at risk of losing ourselves. Absurd! You might say, I’m not famous! But we have become mini celebrities to our tens and tens, if not hundreds or thousands of followers. Every post, story, or reel is an opportunity to project who we are and what we’re about, and what we think. Times columnist James Marriott goes so far as to write that ‘the root of our modern problem is the way opinion has become bound up with identity. In the absence of religious or community affiliations our opinions have become crucial to our sense of self.’ 

A recent study by New York University shows that many people in America are starting with politics as the basis of their identity: they say, “I’m a Democrat or a Republican first and foremost”, and then shift parts of their identity around, like ethnicity and religion, to suit their political identity. I’ve stopped being surprised when I see someone’s Twitter bio listing their ideology before anything else that might be core to their identity. But are we really our vote, or is there more to us than that?


If politics is the mask that we are presenting to the world, then we are engaging in a hollowing out of our representative democracy. Who needs an MP if we’re all directly involved? Don’t get me wrong – I’m not in favour of apathy, inaction, or even lack of protest. But we elect members of parliament because we can’t all be directly engaged all of the time. Speaking all the time, about all of the things. Strong opinions used to be the possessions of those who had too much time on their hands… now you can busily watch and pass on a meme in a matter of seconds, without proper reflection or engagement. And so we’ve imported the very worst of student politics into our everyday digital lives and identities.

Student politics is the often-formative, immature peacocking of ideologies one way or the other. It also often reduces others to caricatures, and the campus culture has increasingly become one that cancels rather than listens and illuminates. And so, the loudest voices dominate and intimidate others to comply. Someone I barely know recently sent me an invitation to reshare a strong opinion on social media. We’ve never spoken about this topic, and they have no idea whether I’ve in fact developed an opinion on it. Marriott writes, ‘For many, an opinion has achieved the status of a positive moral duty… the implication: to reserve judgement is to sin.’ And without a merciful judge, sin means shame: not just what I do is bad, but who I am is bad too.

The dopamine hit we get from these short bursts of antisocial media use is killing us. Martin Amis said that ‘Being inoffensive, and being offended, are now the twin addictions of the culture.’ That was 1996. Engaging in politics in the era of the smartphone, we are addicted to the current age’s dichotomy of being offended and being inoffensive. Like the drug that it is, wrongly used, it will disfigure us as it propels us to play the roles the crowds want. The platform is a precarious place to position yourself, as is the harsh glare of the smartphone blue light.


Countless commentators have offered the wisdom that you are who you are when nobody’s watching. But we’re all watching, all the time. First, we had the Twitter election, then the Facebook election, and now political parties have recently launched accounts on TikTok (all the while wondering if they are going to try to ban it). What we need is a post-social media election. If the world is facing impending doom, then we don’t need doomscrolling to help. Whether it’s activism or slacktivism, our politics need not be our identity. We need a greater light source that reveals our truest selves, and helps us to be fully ourselves. This ‘audience of one’ is a much simpler, if not easier, way to live. 

After all, a secret ballot means nobody’s watching, and we don’t have to broadcast our vote, unless we really want to. On 4 July, the ‘only poll that matters’ is private. We step out of the spotlights of our screens, and we cast a vote for the kind of leaders we want. Every general election transforms the wooden floorboards of school halls into holy ground.

We’d do well to treat the online world as a sacred space too, and each person as a sacred person. Perhaps it’s time not only for a general election, but also a personal election: to step out of the spotlight, and the light of our phones, and quietly cast a vote for who we want to be.