
The Cold War, the Internet and America’s nones

How does a culture lose religion so rapidly? Stephen Bullivant investigates the American phenomenon of ‘nonversion’.

Stephen Bullivant is a professor of theology and sociology at St Mary’s University, UK, and the University of Notre Dame, Australia.

Image from the series South Park

Those even passingly familiar with American religious trends over the past couple of decades will have read or heard the phrase ‘rise of the nones’ countless times. And with good reason. While in Britain the proportion of people telling pollsters they have ‘no religion’ grew rapidly over the twentieth century – it was already 43% when the British Social Attitudes survey began in 1983; it tipped 50% a little over a decade later – in America the figure stayed stubbornly low. According to Gallup polls, only 5% of American adults identified with no religion in 1975. Twenty years later, in 1995, it was still 5%.

But then, seemingly very suddenly, things started to change. Beginning in the late nineties, then rapidly accelerating in the early 2000s, each new survey showed the nones getting bigger and bigger. Depending on which survey one looks at, nones now account for somewhere between a quarter and a third of American adults. Even at the lower end, that amounts to some 60 million people. And they’re still growing.

This raises a natural question: Why now? Or rather, what is it about the nineties and early 2000s that pushed or pulled large swathes of Americans out of thinking of themselves as religious? Various ways of measuring American religiosity all indicate that something significant must have happened around then. But what?

A prior, deeper puzzle

That, at least, is the obvious way of approaching things. And to be fair, it has much to recommend it: something, or rather a combination of somethings, certainly did happen to American religion in those critical years. But this in itself raises a prior, deeper puzzle: why hadn’t the numbers of American nones already risen before the late nineties or early noughties? In all manner of other, quasi-comparable countries – Britain, Canada, Australia, France – the nones started growing steadily from the 1960s onwards. Yet while the sixties had all manner of other disruptive and destabilizing effects on American culture, society, politics, and religion, the proportion of nones grew only a little bit, then stopped.

At the risk of gross oversimplification, if one were to look for a sufficiently big ‘something’ within American society, mostly absent from those other countries, which could plausibly have kept non-religiosity artificially low in these decades, then there is an obvious candidate: the Cold War. Or more specifically, the precise and peculiarly religious way in which it was framed in the USA. 

A final, all-out battle

We Brits were as up to our necks in the Cold War as anyone. But only in America, I think, was the Cold War ever popularly framed as a “final, all-out battle between communistic atheism and Christianity”, to quote Republican Senator Joseph McCarthy. Remember too that it was only in the mid-1950s that Congress adopted “In God We Trust” as America’s official motto, and “under God” was added to the Pledge of Allegiance. During the Pledge debates in Congress, the Democrat Louis C. Rabaut summed up a common view on both sides of the aisle:

“You may argue from dawn to dusk about differing political, economic, and social systems but the fundamental issue which is the unbridgeable gap between America and Communist Russia is a belief in almighty God.”

This wasn’t just a view with wide bipartisan and popular support; it was thoroughly ecumenical too. While McCarthy and Rabaut were Catholics, it was a Presbyterian president, Eisenhower, who signed the “under God” bill into law. As Eisenhower himself put it during his 1952 election campaign:

“What is our battle against communism if it is not a fight between anti-God and a belief in the Almighty?”

Embellishing the city on a hill

It was also during the Cold War that presidents began likening America to the biblical “city built on a hill” – all the better positioned, one presumes, to scour the horizon for incoming Soviet missiles. Kennedy was the first US president to use it. Reagan, adding his own embellishment of “shining,” would make it his, and many of his countrymen’s, own. Taken together, all this helped lay down a deep, implicit association between being un-religious and being un-American. Atheism itself bore the brunt of this, but it more generally ruled out associated ideas and identities – including thinking of oneself as having “no religion” – as live options for the great majority of Americans.

Riven fault lines

Meanwhile, the cultural fault lines that began opening up so obviously in the late sixties – gender equality, sexual liberation – kept on widening, with new generations socialized into ever more liberal baselines. This created a growing values gap between traditional Christian views and the wider mainstream culture, on topics that were very personal to, and thus felt very deeply by, people on all sides. This meant that, while churches tended to be most visible on the 'conservative side' of various battlegrounds, they were also often deeply riven by internal versions of the same debates. Not surprisingly, church attendance, at least within Catholic and mainline churches, started falling steadily in the seventies and (except where immigration has helped fill the pews) has never really stopped.

The Internet of ideas and identities

On this basic account – and there is much that could be, and elsewhere has been, added to it – the thawing of the Cold War is obviously significant. Note that it is the Millennial generation, only the oldest of whom are able to remember the Cold War (and even then mostly from holiday reruns of Red Dawn and Rocky IV), who were at the vanguard of the rise of the nones. They were also the first generation to be true digital natives, opening many of them up to a much wider range of ideas and identities than hitherto available. This has been especially effective at chipping away the walls of some of America’s stronger religious subcultures. My ex-Mormon interviewees, especially, cite “the wonderful thing called the internet” as being “the game-changer”.

Serious discussion and South Park

The Millennials started coming of age, and indeed responding to pollsters’ surveys, in the early 2000s. This was also around the time when, arguably for the first time since the hugely popular writer and speaker Robert “The Great Agnostic” Ingersoll a century before, unbelief was being seriously discussed everywhere from primetime talkshows to episodes of South Park. The bestselling books of the New Atheists – principally Sam Harris, Richard Dawkins, Daniel Dennett, and Christopher Hitchens – evidently hit upon some long pent-up demand. They were also, significantly here, able to position atheism, and 'no religion' more generally, as a panacea for a world awash with religion. Harris, for example, makes much of how he started writing The End of Faith on September 12th. Dawkins made no secret of wanting to run adverts with an image of the Twin Towers and the tagline “Imagine no religion…”.

Cultural space opens

Whatever one makes of such arguments, similar rhetorical moves would have had less intuitive appeal to earlier American generations, learning to duck and cover from atheists’ H-bombs: the stuff of Americans’ nightmares was now those with too much religion, rather than not enough. While the long-term impact of the not-so-New Atheism is hard to judge – many nones are keen to distance themselves from what they saw as its “dogmatism” and “extremism”, even while agreeing with much of it – it certainly helped open up ‘cultural space’ for being both American and non-religious that the Cold War had (outside of various enclaves, such as college towns and certain big cities) largely sealed shut. As we have seen, it is one that a good quarter of American adults are quite comfortable placing themselves within.

So yes, new somethings indeed happened in the final years of the twentieth century and the first years of the twenty-first: and these helped drive the uptick of nones. But these happened at the same time as the none-inhibiting effects of a much earlier something had more or less worn off, especially among the emerging generations most affected by the new somethings. It is this combination of factors – akin to pulling one foot off the brake as you slam the other down on the accelerator – that explains quite why the nones rose so suddenly and (seemingly) out of nowhere.

 


Cutting language learning is a moral failure

Learning someone’s tongue is a deeply humble and empathetic act
A check list shows 'thank you' in different languages.

When you go abroad, how do you navigate language differences? Do you just stick everything through Google Translate? Or put a few weeks into Duolingo before you go? Or maybe you just speak a bit louder in the hope that it will somehow smooth over any misunderstandings?

Recently, my wife and I went to Italy for a week. Neither of us can speak a word of Italian and we were taking our toddler Zachary (who can speak even less Italian) with us, so we booked into a big resort where we knew staff would be able to speak some English if we needed anything for Zach. Even so, we tried learning a few words and phrases:

‘please’,  

‘thank you’,  

‘could I have …?’,  

‘where is the …?’,

‘please forgive my toddler, he hasn’t learned to regulate his emotions yet’. 

That sort of thing. Just some basics to get by.  

Of course, what happened was exactly what happens every time I speak another language. I try my best to make an effort, people immediately realise I’m struggling, and they put me out of my misery by replying in English anyway.

All this only reinforces the importance of deep and rigorous language learning in society – and makes the continued diminishment of university modern language programmes rather odd, and more than a little unsettling.

The University of Nottingham has announced it is terminating the employment of casual staff at its Language Centre. This will see the end of numerous classes for students and others in many languages, both ancient and modern, including British Sign Language.  

Nottingham is not alone in this. The news comes in the immediate aftermath of a review into the University of Aberdeen’s decision to scrap modern language degrees in 2023, which found the decision “hurried, unstructured, and dominated by immediate financial considerations” (not that we needed a review to tell us this). The University of Aberdeen has partially reversed the decision, continuing its provision of joint honours degrees, if not single honours language degrees.

Elsewhere, in January, Cardiff University announced plans to cut 400 academic staff, cutting their entire modern language provision in the process. In May, the University revealed that it would reverse these plans, with modern languages continuing to be offered (for now), albeit in a revised and scaled-down manner.

The situation is bleak. As a theology lecturer who works for a Church of England college, I’m all too aware of the precarity my friends and colleagues in University Arts and Humanities departments face across the sector. But I was also naïve enough to think that languages might be one of the subjects that would be able to survive the worst of education’s deepening malaise, given their clear importance. How wrong I was.

There are obvious causes for despair at the news of language department cuts. On the one hand is the human element of all this. People are losing their jobs. Moreover, because they are casual workers, the University had no obligation to consult them about the changes or provide any notice period, and so it didn’t, because why would a university demonstrate courtesy towards its staff unless it absolutely had to? As well as losing jobs and whole careers, people will lose sleep, and perhaps even homes and relationships, as a direct result of the financial and emotional toll this decision will take on staff. My heart breaks for those affected.

And yet, the move is also evidence – as if more were needed – of the increasing commercialization of Higher Education. A statement from the University said the decision to cut languages in this way was the result of the Language Centre not running at a “financial surplus.” The cuts will instead allow the University to focus on “providing a high-quality experience for our undergraduate and postgraduate students.” 

And there we have it. Not even a veneer of pretence that universities operate for the pursuit of truth or knowledge. No, nothing so idealistic. A university is a business, thank you very much, here to offer an “experience”. And when parts of businesses become financially unsustainable, they’re tossed aside.

But cutting language offerings isn’t just a personal and a societal loss, it’s also a huge spiritual and moral failure. And that’s because of what language fundamentally is. Let me explain.  

It can be tempting to think of words as simply ‘labels’ we assign to objects in the world, with different languages using a different set of ‘labels’ to describe the same objects. As a native English speaker, I might see something with four legs and a flat surface on top and call it a ‘desk’. Someone else with a different native language might call it a Schreibtisch, or a bureau, or a scrivania, or a tepu, or a bàn làm việc. You get the point: we might be using different labels, but we’re all ‘seeing’ the same thing when we use those ‘labels’, right?

Well, it’s a bit more complicated than that. Languages aren’t just ways of describing the world we see, they’re also ways of seeing the world in the first place. As such, languages have the capacity to shape how we behave in response to the world, a world itself suggested to us in part by our language(s). As twentieth-century philosopher Ludwig Wittgenstein once wrote, “the limits of my language mean the limits of my world.” 

Let me give you just one example. English distinguishes tenses: past, present, future. I did, I do, I will do. Chinese does not. It expresses past, present, and future in the same way, meaning past and future feel as immediate and as pressing as the present. The result of ‘seeing’ the world through a ‘futureless’ language like this? According to economist Keith Chen, speakers of ‘futureless’ languages are 30 per cent more likely to save money than speakers of ‘futured’ languages (like English). They also retire with more wealth, smoke less, practice safer sex, eat better, and exercise more. Because the future is experienced in a much more immediate and pressing way, people invest more in behaviours that benefit their future selves: their view of the world – and of their future selves’ place within it – is radically different because of their language.

Different languages lead to seeing the world differently, which leads to differences in behaviour. In other words, there are certain experiences and emotions – even certain types of knowledge and behaviours – that are only encounterable for those fluent in certain languages. And this means that to learn another language is to increase our capacity for empathy. Forget walking a mile in someone’s shoes: if you want truly to know someone, learn their language.

In my day job as a lecturer, when I’m trying to encourage my students – most of whom are vicars-to-be – to learn biblical Greek and/or Hebrew, I tell them it will make them more empathetic people. It may make them better readers of the Bible, and it may even make them better writers too, but, more than anything else, students who learn languages will be better equipped to love their neighbour for having done so. They will get a better sense of the limits of their world, and a greater appreciation for the ways in which others see it too. Show me a society that is linguistically myopic, and I’ll show you one that’s deeply unempathetic. I can guarantee you that.

We ought to be deeply, deeply concerned about the diminishing language offerings in the UK’s Higher Education sector. To open oneself to other languages is to open oneself to other ways of seeing the world. It is to be shown the limits of one’s own ways of seeing. Learning a language is a deeply humble and empathetic act. And aren’t humility and empathy in desperately short supply at the moment?
