
The Cold War, the Internet and America’s nones

How does a culture lose religion so rapidly? Stephen Bullivant investigates the American phenomenon of ‘nonversion’.

Stephen Bullivant is a professor of theology and sociology at St Mary’s University, UK, and the University of Notre Dame, Australia.

Image from the series South Park

Those even passingly familiar with American religious trends over the past couple of decades will have read or heard the phrase ‘rise of the nones’ countless times. And with good reason. While in Britain the proportion of people telling pollsters they have ‘no religion’ grew rapidly over the twentieth century – it was already 43% when the British Social Attitudes survey began in 1983; it tipped 50% a little over a decade later – in America the figure stayed stubbornly low. According to Gallup polls, only 5% of American adults identified with no religion in 1975. Twenty years later, in 1995, it was still 5%.

But then, seemingly very suddenly, things started to change. Beginning in the late nineties, then rapidly accelerating in the early 2000s, each new survey showed the nones getting bigger and bigger. Depending on which survey one looks at, nones now account for somewhere between a quarter and a third of American adults. Even at the lower end, that amounts to some 60 million people. And they’re still growing.

This raises a natural question: Why now? Or rather, what is it about the nineties and early 2000s that pushed or pulled large swathes of Americans out of thinking of themselves as religious? Various ways of measuring American religiosity all indicate that something significant must have happened around then. But what?

A prior, deeper puzzle

That, at least, is the obvious way of approaching things. And to be fair, it has much to recommend it: something, or rather a combination of somethings, certainly did happen to American religion in those critical years. But this in itself raises a prior, deeper puzzle: why hadn’t the numbers of American nones already risen before the late nineties or early noughties? In all manner of other, quasi-comparable countries – Britain, Canada, Australia, France – the nones started growing steadily from the 1960s onwards. Yet while the sixties had all manner of disruptive and destabilizing effects on American culture, society, politics, and religion, the proportion of nones grew only a little, then stopped.

At the risk of gross oversimplification, if one were to look for a sufficiently big ‘something’ within American society, mostly absent from those other countries, which could plausibly have kept non-religiosity artificially low in these decades, then there is an obvious candidate: the Cold War. Or more specifically, the precise and peculiarly religious way in which it was framed in the USA. 

A final, all-out battle

We Brits were as up to our necks in the Cold War as anyone. But only in America, I think, was the Cold War ever popularly framed as a “final, all-out battle between communistic atheism and Christianity”, to quote Republican Senator Joseph McCarthy. Remember too that it was only in the mid-1950s that Congress adopted “In God We Trust” as America’s official motto, and “under God” was added to the Pledge of Allegiance. During the Pledge debates in Congress, the Democrat Louis C. Rabaut summed up a common view on both sides of the aisle:

“You may argue from dawn to dusk about differing political, economic, and social systems but the fundamental issue which is the unbridgeable gap between America and Communist Russia is a belief in almighty God.”

This wasn’t just an issue with wide bipartisan and popular support; it was thoroughly ecumenical too. While McCarthy and Rabaut were Catholics, it was a Presbyterian president, Eisenhower, who signed the “under God” bill into law. As Eisenhower himself put it during his 1952 election campaign:

“What is our battle against communism if it is not a fight between anti-God and a belief in the Almighty?”

Embellishing the city on a hill

It was also during the Cold War that presidents began likening America to the biblical “city built on a hill” – all the better positioned, one presumes, to scour the horizon for incoming Soviet missiles. Kennedy was the first US president to use it. Reagan, adding his own embellishment of “shining,” would make it his, and many of his countrymen’s, own. Taken together, all this helped lay down a deep, implicit association between being un-religious and being un-American. Atheism itself bore the brunt of this, but it more generally ruled out associated ideas and identities – including thinking of oneself as having “no religion” – as live options for the great majority of Americans.

Riven fault lines

Meanwhile, the cultural fault lines that began opening up so visibly in the late sixties – gender equality, sexual liberation – kept on widening, with new generations socialized into ever more liberal baselines. This created a growing values gap between traditional Christian views and the wider mainstream culture, on topics that were deeply personal to, and thus deeply felt by, people on all sides. This meant that, while churches tended to be most visible on the ‘conservative’ side of various battlegrounds, they were also often deeply riven by internal versions of the same debates. Not surprisingly, church attendance, at least within Catholic and mainline churches, started falling steadily in the seventies and (except where immigration has helped fill the pews) has never really stopped.

The Internet of ideas and identities

On this basic account – and there is much that could be, and elsewhere has been, added to it – the thawing of the Cold War is obviously significant. Note that it is the Millennial generation, only the oldest of whom are able to remember the Cold War (and even then mostly from holiday reruns of Red Dawn and Rocky IV), who were at the vanguard of the rise of the nones. They were also the first generation of true digital natives, opening many of them up to a much wider range of ideas and identities than was hitherto available. This has been especially effective at chipping away at the walls of some of America’s stronger religious subcultures. My ex-Mormon interviewees, especially, cite “the wonderful thing called the internet” as being “the game-changer”.

Serious discussion and South Park

The Millennials started coming of age, and indeed responding to pollsters’ surveys, in the early 2000s. This was also around the time when, arguably for the first time since the hugely popular writer and speaker Robert “The Great Agnostic” Ingersoll a century before, unbelief was being seriously discussed everywhere from primetime talk shows to episodes of South Park. The bestselling books of the New Atheists – principally Sam Harris, Richard Dawkins, Daniel Dennett, and Christopher Hitchens – evidently hit upon some long pent-up demand. They were also, significantly here, able to position atheism, and ‘no religion’ more generally, as a panacea for a world awash with religion. Harris, for example, makes much of how he started writing The End of Faith on September 12th. Dawkins made no secret of wanting to run adverts with an image of the Twin Towers and the tagline “Imagine no religion…”.

Cultural space opens

Whatever one makes of such arguments, similar rhetorical moves would have had less intuitive appeal to earlier American generations, who had learned to duck and cover from atheists’ H-bombs: the stuff of Americans’ nightmares was now those with too much religion, rather than not enough. While the long-term impact of the not-so-New Atheism is hard to judge – many nones are keen to distance themselves from what they see as its “dogmatism” and “extremism”, even while agreeing with much of it – it certainly helped open up ‘cultural space’ for being both American and non-religious that the Cold War had (outside of various enclaves, such as college towns and certain big cities) largely sealed shut. As we have seen, it is a space that a good quarter of American adults are now quite comfortable placing themselves within.

So yes, new somethings indeed happened in the final years of the twentieth century and the first years of the twenty-first, and these helped drive the uptick of nones. But they happened at the same time as the none-inhibiting effects of a much earlier something had more or less worn off, especially among the emerging generations most affected by the new somethings. It is this combination of factors – akin to pulling one foot off the brake as you slam the other down on the accelerator – that explains quite why the nones rose so suddenly and (seemingly) out of nowhere.



The careless conflation of independence, autonomy and dignity

As Jersey begins to legalise assisted dying, there’s keyword confusion.
An elderly woman in a care home stands and places her hands on the shoulders of a seated woman.
Eberhard Grossgasteiger on Unsplash.

Reviewing Canada’s legislation on assisted dying, one article raises the concern: “Does it make dying with dignity easier than living with dignity?” This insightful question cuts to the centre of the debate: dignity. Or more particularly, the unwitting conflation of dignity with independence, and of independence with autonomy.  

As Jersey becomes the first place in the British Isles to begin the process of legalising assisted dying, I feel we should listen carefully to how and where these terms are being used, both in the formal debate and in the commentary that surrounds it. The States Assembly in Jersey voted to allow the development of assisted dying legislation for those with six months to live (or twelve months if their condition is neurodegenerative). A second vote, to make assisted dying available more broadly to those whose conditions entail “unbearable suffering”, was defeated by a narrower margin. Reading the flurry of press releases that followed the vote, these keywords – autonomy, independence, and dignity – are everywhere. But are we really thinking about what these words communicate?

People in positions of wealth and power have more independence and autonomy, more choices and freedoms, but it is we who ascribe dignity to those in that position.

The word dignity comes from the Latin word dignus, meaning ‘worthy’, and this is still the primary definition given to the English word dignity today. The Oxford English Dictionary has it as “the quality of being worthy or honourable”, immediately followed by reference to “honourable or high estate”. If this is so, then dignity is not something that can be bought, nor assumed – it is a status conferred upon someone by the esteem in which other people hold them. The haughtiest person in the world can still be esteemed undignified, as can the richest. Moreover, the opposite is also true: we are never prevented from conferring dignity upon, and esteeming the worthiness of, those who live the humblest of lives.

And yet, if we are honest with ourselves – do many of us not quietly associate the idea of becoming rich and powerful with becoming dignified? Do we not tend to assume the worthiness of those in high office – at least until we meet them and realise pretty quickly that they all put their trousers on one leg at a time, the same as the rest of us? This association happens because we have such a tendency to conflate dignity with independence (the ability to live without assistance from others) and autonomy (the ability to make one’s own decisions, and not have those decisions limited or interfered with). People in positions of wealth and power have more independence and autonomy, more choices and freedoms, but it is we who ascribe dignity to those in that position. It is society that sees the autonomy of those of high status, and esteems it as dignified.

Does this not unwittingly suggest that choosing to live in a state of extreme dependence on palliative care is, by implication, undignified? 

Ancient wisdom in the Bible repeatedly warns us not to assume that dignity comes with the freedom of wealth or power. All the great ‘heroes’ of that book suffer their indignities. Fresh from the success of his Ark project, Noah gets drunk and exposes himself. Elated by a victory over an enemy, King David dances half-naked through the streets. These are just two examples from the catalogue of embarrassments and mishaps that beset nearly all the kings and leaders whose stories are told as part of the Christian story. One after another, they stumble and struggle with life and leadership. The apostle Paul explains that this is because God uses the foolish things of this world to shame human pride, “for even the foolishness of God is still wiser than human wisdom.” Therefore, Paul argues, God chooses to speak to us through the weak and lowly things and people of this world. Never was this demonstrated so clearly as when Jesus was born in a draughty stable, lived a life of poverty, and died a criminal’s death on a cross.

But what has all this to do with the debate over assisted dying? Well, I am struck by how often the idea of losing one’s independence (through disabling or terminal illness) is conflated with losing one’s dignity, and so dying through personal choice (autonomy) is presented as regaining it. One campaign group that speaks to this debate even calls itself ‘Dignity in Dying’ – but does this not unwittingly suggest that choosing to live in a state of extreme dependence on palliative care is, by implication, undignified?  

Independence is not possible for everybody, or not possible to the same degree. And dignity? Well, dignity is possible for anyone. 

The Dean of Jersey, the Very Reverend Mike Keirle, has spoken of his concern that the change in legislation will make vulnerable people feel pressured to end their lives. Examples from Canada, where physician-assisted dying is already available, show that his concern is not unfounded. In 2022, the Canadian veteran and Paralympian Christine Gauthier phoned her caseworker to chase up the overdue installation of her new wheelchair ramp. She describes how she was horrified to find herself being advised to consider assisted dying instead.

"It is remotely just what they're doing,” says Gauthier, “exhausting us to the point of no return. […] I was like, 'Are you serious?' Like that easy, you're going to be helping me to die but you won't help me to live?"

Gauthier is not alone – she spoke out when she learned that four other Canadian veterans had reported similar experiences. In these unhappy moments, one can see how dangerous the assumption can be – the assumption that no one would want to live a life of needing help. Here are disabled people who do want to live, and this assumption, this careless conflation of independence, autonomy, and dignity, leaves them fighting for their right to do so. Why should anyone have to fight, or even speak up, for their right not to commit suicide? It is little wonder that the disabled actress Liz Carr describes assisted dying legislation as “terrifying” for disabled people.

I respect that there are terminally ill people, and those who love them, who speak from a desire to end their suffering; it is clear that people on all sides of the debate need to have this difficult and emotionally charged conversation. But whatever the eventual outcome in terms of legislation, we must be careful that it is not based on careless assumptions, or on the conflation of one thing with an entirely different other. Independence is not possible for everybody, or not possible to the same degree. And dignity? Well, dignity is possible for anyone – it is a state that can be conferred whenever, and upon whomever society chooses to confer it. Autonomy is the matter in question – we are talking about autonomy in dying. And whatever happens, we should by no means legislate in a way that leaves disabled people esteemed unworthy, left open to the indignity of fighting for their right to live.