
The Cold War, the Internet and America’s nones

How does a culture lose religion so rapidly? Stephen Bullivant investigates the American phenomenon of ‘nonversion’.

Stephen Bullivant is a professor of theology and sociology at St Mary’s University, UK, and the University of Notre Dame, Australia.

Image from the series South Park

Those even passingly familiar with American religious trends over the past couple of decades will have read or heard the phrase ‘rise of the nones’ countless times. And with good reason. While in Britain the proportion of people telling pollsters they have ‘no religion’ grew rapidly over the twentieth century – it was already 43% when the British Social Attitudes survey began in 1983; it tipped 50% a little over a decade later – in America the figure stayed stubbornly low. According to Gallup polls, only 5% of American adults identified with no religion in 1975. Twenty years later, in 1995, it was still 5%.

But then, seemingly very suddenly, things started to change. Beginning in the late nineties, then rapidly accelerating in the early 2000s, each new survey showed the nones getting bigger and bigger. Depending on which survey one looks at, nones now account for somewhere between a quarter and a third of American adults. Even at the lower end, that amounts to some 60 million people. And they’re still growing.

This raises a natural question: Why now? Or rather, what is it about the nineties and early 2000s that pushed or pulled large swathes of Americans out of thinking of themselves as religious? Various ways of measuring American religiosity all indicate that something significant must have happened around then. But what?

A prior, deeper puzzle

That, at least, is the obvious way of approaching things. And to be fair, it has much to recommend it: something, or rather a combination of somethings, certainly did happen to American religion in those critical years. But this in itself raises a prior, deeper puzzle: why hadn’t the numbers of American nones already risen before the late nineties or early noughts? In all manner of other, quasi-comparable countries – Britain, Canada, Australia, France – the nones started growing steadily from the 1960s onwards. Yet while the sixties had all manner of other disruptive and destabilizing effects on American culture, society, politics, and religion, the proportion of nones grew only a little, then stopped.

At the risk of gross oversimplification, if one were to look for a sufficiently big ‘something’ within American society, mostly absent from those other countries, which could plausibly have kept non-religiosity artificially low in these decades, then there is an obvious candidate: the Cold War. Or more specifically, the precise and peculiarly religious way in which it was framed in the USA. 

A final, all-out battle

We Brits were as up to our necks in the Cold War as anyone. But only in America, I think, was the Cold War ever popularly framed as a “final, all-out battle between communistic atheism and Christianity”, to quote Republican Senator Joseph McCarthy. Remember too that it was only in the mid-1950s that Congress adopted “In God We Trust” as America’s official motto, and “under God” was added to the Pledge of Allegiance. During the Pledge debates in Congress, the Democrat Louis C. Rabaut summed up a common view on both sides of the aisle:

“You may argue from dawn to dusk about differing political, economic, and social systems but the fundamental issue which is the unbridgeable gap between America and Communist Russia is a belief in almighty God.”

This wasn’t just an issue with wide bipartisan and popular support; it was thoroughly ecumenical too. While McCarthy and Rabaut were Catholics, it was a Presbyterian president, Eisenhower, who signed the “under God” bill into law. As Eisenhower himself put it during his 1952 election campaign:

“What is our battle against communism if it is not a fight between anti-God and a belief in the Almighty?”

Embellishing the city on a hill

It was also during the Cold War that presidents began likening America to the biblical “city built on a hill” – all the better positioned, one presumes, to scour the horizon for incoming Soviet missiles. Kennedy was the first US president to use it. Reagan, adding his own embellishment of “shining”, would make it his, and many of his countrymen’s, own. Taken together, all this helped lay down a deep, implicit association between being un-religious and being un-American. Atheism itself bore the brunt of this, but it more generally ruled out associated ideas and identities – including thinking of oneself as having “no religion” – as live options for the great majority of Americans.

Riven fault lines

Meanwhile, the cultural fault lines that began so obviously opening up in the late sixties – gender equality, sexual liberation – kept on widening, with new generations socialized into ever more liberal baselines. This created a growing values gap between traditional Christian views and the wider mainstream culture, on topics that were very personal to, and thus felt very deeply by, people on all sides. This meant that, while churches tended to be most visible on the ‘conservative side’ of various battlegrounds, they were also often deeply riven by internal versions of the same debates. Not surprisingly, church attendance, at least within Catholic and mainline churches, started falling steadily in the seventies and (except where immigration has helped fill the pews) has never really stopped.

The Internet of ideas and identities

On this basic account – and there is much that could be, and elsewhere has been, added to it – the thawing of the Cold War is obviously significant. Note that it is the Millennial generation, only the oldest of whom are able to remember the Cold War (and even then mostly from holiday reruns of Red Dawn and Rocky IV), who were at the vanguard of the rise of the nones. They were also the first generation to be true digital natives, opening many of them up to a much wider range of ideas and identities than hitherto available. This has been especially effective at chipping away the walls of some of America’s stronger religious subcultures. My ex-Mormon interviewees, especially, cite “the wonderful thing called the internet” as being “the game-changer”.

Serious discussion and South Park

The Millennials started coming of age, and indeed responding to pollsters’ surveys, in the early 2000s. This was also around the time when, arguably for the first time since the hugely popular writer and speaker Robert “The Great Agnostic” Ingersoll a century before, unbelief was being seriously discussed everywhere from primetime talkshows to episodes of South Park. The bestselling books of the New Atheists – principally Sam Harris, Richard Dawkins, Daniel Dennett, and Christopher Hitchens – evidently hit upon some long pent-up demand. They were also, significantly here, able to position atheism, and ‘no religion’ more generally, as a panacea for a world awash with religion. Harris, for example, makes much of how he started writing The End of Faith on September 12th. Dawkins made no secret of his wanting to run adverts with an image of the Twin Towers and the tagline “Imagine no religion…”.

Cultural space opens

Whatever one makes of such arguments, similar rhetorical moves would have had less intuitive appeal to earlier American generations, learning to duck and cover from atheists’ H-bombs: the stuff of Americans’ nightmares was now those with too much religion, rather than not enough. While the long-term impact of the not-so-New Atheism is hard to judge – many nones are keen to distance themselves from what they see as its “dogmatism” and “extremism”, even while agreeing with much of it – it certainly helped open up ‘cultural space’ for being both American and non-religious that the Cold War had (outside of various enclaves, such as college towns and certain big cities) largely sealed shut. As we have seen, it is one that a good quarter of American adults are quite comfortable placing themselves within.

So yes, new somethings indeed happened in the final years of the twentieth century and the first years of the twenty-first: and these helped drive the uptick of nones. But these happened at the same time as the none-inhibiting effects of a much earlier something had more or less worn off, especially among the emerging generations most affected by the new somethings. It is this combination of factors – akin to pulling one foot off the brake as you slam the other down on the accelerator – that explains quite why the nones rose so suddenly and (seemingly) out of nowhere.

 


Our language use is leading to a cultural abyss

We are witnessing a profound loss of commitment and discernment in the use of language, writes Oliver Wright.

After 15 years as a lawyer in London, Oliver is currently doing a DPhil at the University of Oxford.

Rugby players wait upon Wayne Barnes' word. RFU.

The 2023 Rugby Union World Cup Final was one of the most iconic international matches in living memory, contested by two of the sport’s most storied teams – the All Blacks and the Springboks. It’s not surprising that retirements should follow such a pinnacle of a sporting career. But two retirements caught my eye. Not from players, but from referees: Wayne Barnes, the main match official and the most experienced international referee in the world, and Tom Foley, the also highly experienced Television Match Official. Why? Wayne Barnes’s retirement statement is particularly gracious and thoughtful. But the reason he gave in common with Tom Foley, and indeed with many others in similar high-pressure roles in the public eye, is worrying: online abuse. After the cup final, death threats were even sent to the school attended by Foley’s children.

Online abuse has become an endemic, worldwide problem. There are real people issuing these threats and abuse; and there are real people receiving them, and responding in some way. Of course, there is also the problem of online ‘bots’. But they only succeed in their abuse because of their imitation of real abusers.  

It’s worth asking why, because we can go beyond the helpless handwringing about ‘the perils of being online’. There are philosophical and indeed theological reasons for it, and philosophical and theological ways, I suggest, of climbing out of the abyss.


Let’s go back to the 1950s, when two important advances in the philosophy of language and in religious language occurred. The first came from Oxford, and the White’s Professor of Philosophy, J.L. Austin. The second came from Durham, and its then Bishop, Ian Ramsey.  

Austin, whose remarkable life and work have now been brilliantly documented for the first time in the biography by Mark Rowe (published by OUP, 2023), was a decorated Second World War veteran in the intelligence corps who was widely recognised as one of the masterminds of the success of the D-Day Landings. On his return to Oxford in the late 1940s he perceived, with great dissatisfaction, a certain philosophical move which accorded the greatest importance in language to words and phrases which described things – which indicated some form of empirical truth about the world. For sure, this argument continued, there were other kinds of language use – religious language, emotional language, and so on. But that was fairly worthless. Describing cold, hard scientific truth was the true utility of language.

Austin’s most famous response was in his book How To Do Things With Words. The function of language goes way beyond the scientific description of the world. Language acts; it does things. We promise, we name, we cajole, we threaten, we apologise, we bet. There is no real ‘truth’ as such conveyed in such ‘speech-acts’. Their importance lies, rather, in what is thereby done, the act initiated by the words themselves. Or, in the Austinian jargon, the ‘illocution’ within the ‘locution’.

But Austin realised something even more important as he investigated this form of language – these performative utterances. In fact, all words ‘act’ in some way. Even plain truth-describers assert something, such that an interlocutor can learn or discern for themselves. What matters is how ‘forceful’ the relevant act of speech is in each case. Sometimes the speech-act is very simple and limited. In other cases, such as threats, the performative aspect of the utterance is most forceful indeed.

Austin’s student John Searle took the idea of performative language to America, and developed it considerably. Most notable for our purposes, however, was his separation, over against Austin’s idea, of speech from act. By analysing the conventions and circumstances which surround the performance of a speech act – a baptism service, for instance – we can observe how and why the act occurs, and how and why such an act might go wrong. But the debate was thereby divorced from the context of speakers themselves performing such actions – from an integrity of speaker and action. The philosophical problem we then hit, therefore, is that a spoken word and the associated act (‘locution’ and ‘illocution’) are two entirely separate ‘acts’.

Let’s move now from Oxford to the great cathedral city of Durham. At the same time as Austin was teaching in Oxford, the Bishop of Durham, Ian Ramsey – apparently unaware of Austin’s new theory of performatives – investigated religious language to try and get to grips with both how religious language does things, and what it says of its speakers and writers. Ramsey developed a two-fold typology for religious language – that of commitment and discernment. First, religious language implies two forms of commitment: there is the speaker/writer’s commitment of communicability, a desire to communicate, to be comprehensible, to ‘commune through language’; and the speaker/writer of religious language also entertains prior commitments for the language adopted – language is rarely neutral when it comes to religion. Second, religious language implies a form of discernment about the words that are being invoked and for what purpose. They are not universals, but carry special meanings according to the particular conventions involved. Commitment and discernment.

But this innovation in the philosophy of religious language was likewise taken up and developed away from Ramsey’s idea – particularly in the much more famous work of John Macquarrie, a Scottish philosophical theologian who spent much time teaching both in the States and in Oxford. In Macquarrie, writing at the height of the influence of thinkers such as Heidegger and Bultmann, Ramsey’s ‘commitment’ and ‘discernment’ got subsumed into existentialism and myth. The religious speech act became merely an event or an act for the self, a personal matter which might involve transformation, but might not.

These two strands – the philosophy of language as it got taken up by Searle and his American counterparts, and the philosophy of religious language as it got taken up by Macquarrie – have for some time now predominated. And it is only recently that scholars on both sides have begun to perform a ressourcement, both on Austin and on the nature of religious language in the wake of Bultmann.


We can now return to the cases of Wayne Barnes and Tom Foley, and many others in many different walks of life just like them. Undoubtedly, the emotional, existential, and physical distance secured by interacting online has created the conditions for online abuse to flourish. But at a deeper level, what we are witnessing is a profound loss of commitment and discernment in the use of language, in society as a whole and also in the Church. Real people feel free to use language oblivious to any inherent act contained within it. The Twitter-sphere seems irrevocably to have severed the bonds that tie speakers to their acts. In these fertile conditions, abuse flourishes. Similarly, in the Church, the commitment and discernment which have lain behind millennia of liturgical and doctrinal language have become a private spiritual matter; or indeed have been neglected in public when religious witness in word has not been matched by deed.

How do we walk back from this cultural abyss? There is an ethical, and, potentially, a religious choice to make. The ethical choice is to think about what our language does to those who read (or hear) it, and to change the way we speak or write accordingly – Ramsey’s modes of ‘commitment’ and ‘discernment’. The religious dimension is to recognise that our words bind us to a system of belief, whether we like it or not. Saying one thing and doing another in a religious context implies a diminution in the value of language for all concerned, not just for the private life of the individual believer.

Actions speak louder with words.