
The Cold War, the Internet and America’s nones

How does a culture lose religion so rapidly? Stephen Bullivant investigates the American phenomenon of ‘nonversion’.

Stephen Bullivant is a professor of theology and sociology at St Mary’s University, UK, and the University of Notre Dame, Australia.

Image from the series South Park

Those even passingly familiar with American religious trends over the past couple of decades will have read or heard the phrase ‘rise of the nones’ countless times. And with good reason. While in Britain the proportion of people telling pollsters they have ‘no religion’ grew rapidly over the twentieth century – it was already 43% when the British Social Attitudes survey began in 1983; it tipped 50% a little over a decade later – in America the figure stayed stubbornly low. According to Gallup polls, only 5% of American adults identified with no religion in 1975. Twenty years later, in 1995, it was still 5%.

But then, seemingly very suddenly, things started to change. Beginning in the late nineties, then rapidly accelerating in the early 2000s, each new survey showed the nones getting bigger and bigger. Depending on which survey one looks at, nones now account for somewhere between a quarter and a third of American adults. Even at the lower end, that amounts to some 60 million people. And they’re still growing.

This raises a natural question: why now? Or rather, what was it about the nineties and early 2000s that pushed or pulled large swathes of Americans out of thinking of themselves as religious? Various ways of measuring American religiosity all indicate that something significant must have happened around then. But what?

A prior, deeper puzzle

That, at least, is the obvious way of approaching things. And to be fair, it has much to recommend it: something, or rather a combination of somethings, certainly did happen to American religion in those critical years. But this in itself raises a prior, deeper puzzle: why hadn’t the numbers of American nones already risen before the late nineties or early noughties? In all manner of other, quasi-comparable countries – Britain, Canada, Australia, France – the nones started growing steadily from the 1960s onwards. Yet while the sixties had all manner of other disruptive and destabilizing effects on American culture, society, politics, and religion, the proportion of nones grew only a little, then stopped.

At the risk of gross oversimplification, if one were to look for a sufficiently big ‘something’ within American society, mostly absent from those other countries, which could plausibly have kept non-religiosity artificially low in these decades, then there is an obvious candidate: the Cold War. Or more specifically, the precise and peculiarly religious way in which it was framed in the USA. 

A final, all-out battle

We Brits were as up to our necks in the Cold War as anyone. But only in America, I think, was the Cold War ever popularly framed as a “final, all-out battle between communistic atheism and Christianity”, to quote Republican Senator Joseph McCarthy. Remember too that it was only in the mid-1950s that Congress adopted “In God We Trust” as America’s official motto, and “under God” was added to the Pledge of Allegiance. During the Pledge debates in Congress, the Democrat Louis C. Rabaut summed up a common view on both sides of the aisle:

“You may argue from dawn to dusk about differing political, economic, and social systems but the fundamental issue which is the unbridgeable gap between America and Communist Russia is a belief in almighty God.”

This wasn’t just an issue with wide bipartisan and popular support; it was thoroughly ecumenical too. While McCarthy and Rabaut were Catholics, it was a Presbyterian president, Eisenhower, who signed the “under God” bill into law. As Eisenhower himself put it during his 1952 election campaign:

“What is our battle against communism if it is not a fight between anti-God and a belief in the Almighty?”

Embellishing the city on a hill

It was also during the Cold War that presidents began likening America to the biblical “city built on a hill” – all the better positioned, one presumes, to scour the horizon for incoming Soviet missiles. Kennedy was the first US president to use it. Reagan, adding his own embellishment of “shining,” would make it his, and many of his countrymen’s, own. Taken together, all this helped lay down a deep, implicit association between being un-religious and being un-American. Atheism itself bore the brunt of this, but it more generally ruled out associated ideas and identities – including thinking of oneself as having “no religion” – as live options for the great majority of Americans.

Riven fault lines

Meanwhile, the cultural fault lines that began opening up so obviously in the late sixties – gender equality, sexual liberation – kept on widening, with new generations socialized into ever more liberal baselines. This created a growing values gap between traditional Christian views and the wider mainstream culture, on topics that were very personal to, and thus felt very deeply by, people on all sides. This meant that, while churches tended to be most visible on the ‘conservative side’ of various battlegrounds, they were also often deeply riven by internal versions of the same debates. Not surprisingly, church attendance, at least within Catholic and mainline churches, started falling steadily in the seventies and (except where immigration has helped fill the pews) has never really stopped.

The Internet of ideas and identities

On this basic account – and there is much that could be, and elsewhere has been, added to it – the thawing of the Cold War is obviously significant. Note that it is the Millennial generation, only the oldest of whom are able to remember the Cold War (and even then mostly from holiday reruns of Red Dawn and Rocky IV), who were at the vanguard of the rise of the nones. They were also the first generation to be true digital natives, opening many of them up to a much wider range of ideas and identities than hitherto available. This has been especially effective at chipping away at the walls of some of America’s stronger religious subcultures. My ex-Mormon interviewees, especially, cite “the wonderful thing called the internet” as being “the game-changer”.

Serious discussion and South Park

The Millennials started coming of age, and indeed responding to pollsters’ surveys, in the early 2000s. This was also around the time when, arguably for the first time since the hugely popular writer and speaker Robert “The Great Agnostic” Ingersoll a century before, unbelief was being seriously discussed everywhere from primetime talk shows to episodes of South Park. The bestselling books of the New Atheists – principally Sam Harris, Richard Dawkins, Daniel Dennett, and Christopher Hitchens – evidently hit upon some long pent-up demand. They were also, significantly here, able to position atheism, and ‘no religion’ more generally, as an antidote to a world awash with religion. Harris, for example, makes much of how he started writing The End of Faith on September 12th, 2001. Dawkins made no secret of his wanting to run adverts with an image of the Twin Towers and the tagline “Imagine no religion…”.

Cultural space opens

Whatever one makes of such arguments, similar rhetorical moves would have had less intuitive appeal to earlier American generations, who had learned to duck and cover from atheists’ H-bombs: the stuff of Americans’ nightmares was now those with too much religion, rather than not enough. While the long-term impact of the not-so-New Atheism is hard to judge – many nones are keen to distance themselves from what they see as its “dogmatism” and “extremism”, even while agreeing with much of it – it certainly helped open up ‘cultural space’ for being both American and non-religious that the Cold War had (outside of various enclaves, such as college towns and certain big cities) largely sealed shut. As we have seen, it is a space that a good quarter of American adults are now quite comfortable placing themselves within.

So yes, new somethings indeed happened in the final years of the twentieth century and the first years of the twenty-first, and these helped drive the uptick of nones. But they happened at the same time as the none-inhibiting effects of a much earlier something had more or less worn off, especially among the emerging generations most affected by the new somethings. It is this combination of factors – akin to pulling one foot off the brake as you slam the other down on the accelerator – that explains quite why the nones rose so suddenly and (seemingly) out of nowhere.

 


Paying attention to ADHD – is it really just a fad?

Media fixation with ADHD caught Henna Cundill’s eye, so she decided to investigate its struggles and superpowers.
From a darkly shadowed face, a single illuminated eye stares.
Brands&People on Unsplash.

In an ironic twist, attention deficit hyperactivity disorder (commonly known as ADHD) is now getting a lot of attention. For example, between 28 and 31 January The Times newspaper published one article per day about ADHD. Intrigued, I looked back over the past few months, and I found that The Times has averaged 8 to 10 articles per month that are either partly or exclusively about this topic. These range from celebrity diagnoses to handwringing over the “troubling rise” in the incidence of the condition, to concerns about parents gaming the system to get their children disability payments or extra time in exams.

With all this media hype, it is little wonder that some commentators are inclined to dismiss ADHD as a fad. Scroll through the comments beneath each article, and you will reliably find the rallying cry of, “We didn’t have ADHD in my day!” followed by the patient responses of those who try to correct this fallacy.  

While the high public profile of ADHD is new, the condition itself is not. As early as the mid-1700s a Professor of Medicine called Melchior Adam Weikard was describing patients who were “unwary, careless, and flighty” – behaving in ways governed by impulse, showing poor punctuality and accuracy, and having an inability to complete tasks, to the detriment of their mental health. His description is of its day. For example, and somewhat amusingly, Weikard (himself German, but at this point living in Russia) also described his patients as follows:

“Compared to an attentive and considerate person such a jumpy person may act like a young Frenchman does in comparison to a mature Englishman.”

Even so, Weikard did not uncritically adopt all the prejudices and stereotypes of his context: he broke firmly with the existing medical consensus when he diagnosed these patients as having a “dysregulation in cerebral fibres” – rather than attributing their difficulties to astrological misalignments or demon possession.

By characterising ADHD as a brain-based condition, Weikard was ahead of his time, and we’ve come a long way since then. This is not the place to chart the whole biography of ADHD; suffice it to say that when someone rolls their eyes and declares dismissively, “We didn’t have ADHD in my day…”, they are either over 250 years old or not talking like a mature Englishman, even if they read The Times.

Another thing that is not new, despite what cynical commentators might seek to imply, is the treatment of some aspects of ADHD with medication.  

Doctors have been prescribing amphetamines to patients with ADHD since at least the 1950s. Yet now those medications are in short supply. Contrary to the media hype, fewer than 1 in 10 people with an ADHD diagnosis take prescribed medication, but for some of those who do it can be a lifeline – calming down a washing machine mind that is stuck on constant spin.  

One acquaintance of mine has taken to anxiously touring the local pharmacies, driving to neighbouring towns and villages, desperate to get her prescription filled.  

Another is passing her own tablets on to her son, whose prescribed supply ran out sooner. Sharing prescription medication is, I am duty-bound to add, an illegal practice – but it is hard to expect a parent to medicate themselves whilst seeing their own child struggle to attend school, to complete exam papers and to just generally feel (and I quote) “like a normal person.”  

People who have ADHD sometimes describe the negative side of the condition as being in a constant fight with one’s own thoughts and senses – these are doughty opponents: they always know where to find you, and they only sleep when you do.

This is not to overlook that there are positives to ADHD too – it is often pointed out that the condition entails a degree of “superpower.” A person living with ADHD may have an incredible ability to focus on one difficult problem to the exclusion of all else, and thus solve it, perhaps devising creative solutions that elude those with a more pedestrian style of thought.  

Also, it is common for people who live with ADHD to be dynamic conversationalists, with high social intelligence and empathy, priming them for success at tasks like broadcasting and debating. Many elite athletes also live with ADHD and say that they are able to strive for excellence due to their restless energy and resilience in the face of tough training regimes.

Given the mixed bag of struggles and superpowers, there is a raging debate about whether ADHD should even be considered a pathology, or just a neurodivergent way of being human. I suspect there is no right or wrong answer to this – for each person who lives with ADHD, it depends on their own experience and how they feel it helps or hinders them to live the life they choose. Neither is it a binary choice: more than one of my own acquaintances who live with ADHD has described themselves as being in a “love-hate relationship” with their neurodivergence.

Neurodiversity, like any kind of diversity, challenges the way we live together in communities, choosing or refusing to show empathy towards those who are perceived as ‘other’. There are several places in the Bible where human interconnectedness is likened to the human body – made up of many different parts, with each member dependent on the others for the wellbeing of the body as a whole. In one of his letters, St Paul wrote, “If the whole body were an eye, where would the hearing be? Or if the whole body were an ear, where would the sense of smell be?” Society needs problem solvers, communicators, and high achievers, even while it also needs people who can structure, plan and maintain consistency – and above all, society needs these different neurotypes to work together with a certain amount of mutual understanding and trust.

Reflecting further on the body metaphor, Paul also wrote this: “If one part of the body suffers, the whole body suffers with it.” It is estimated that about 5 per cent of people in the UK have ADHD, so it is likely that this includes someone you know. The majority don’t take regular meds, but if you are connected to someone who is usually reliant on these, the next few months may be a time of particular stress and anxiety, as the current medication shortage is expected to continue into late spring. This affects not just those living with ADHD, but all of us, as we live together in our families, communities, and networks. Not everyone chooses to be open about having an ADHD diagnosis, but if they are, now might be a good time to ask them how they experience this condition, both its positives and its negatives, and how you can support them if they are managing without their usual prescription.

The body metaphor, and Paul’s teaching around it, reminds us that diversity is no accident: God has always been attentive to those who feel divergent or far from the centre, as Jesus affirmed when he announced that his ministry would be for the poor, the prisoners, the disabled and the oppressed. The psalmist, too, observes that God’s attention and concern for us is so complete that one is “…hemmed in, before and behind” – even if one strays to the very ends of the Earth, or drives to the pharmacy in the next village. Thus, while the media circus may be new, we can be sure that God has always been attentive to those with ADHD, and wider society is called to be likewise.

Writing for The Times, Esther Walker describes ADHD as “…the health story that keeps unfolding.” Well, certainly every time I unfold my newspaper, there it is again. But ADHD challenges me to unfold my mind too – to become ever more aware and appreciative of the fact that there are many ways to be human: usually complex, sometimes difficult, often brilliant, and always interconnected.