
Why smell jumps the queue when it comes to memories

Smells hardwire deep into the brain, writes Henna Cundill, as she explores why they jump-start such vivid memories.
An autumnal scene of a church yard and church framed by leafless trees.
'The smell of dust and damp stone will always cry “safety!”'
Jakub Pabis on Unsplash.

When I was a 22-year-old undergraduate, my mother died quite suddenly. I can't remember the name of the undertakers we used, nor that of the chaplain who took her funeral. I can no longer visualise any of their faces. I know I visited the chaplain’s house to plan the funeral, but I can't remember exactly where that house was. What sticks is that the day of the funeral was a sultry summer's day, and both the chaplain and the undertakers smelt of perspiration. To this day there are moments when I catch that same whiff of man-sweat in some other location, and for a fleeting second, I am a bewildered 22-year-old once more. 

Here is another memory. I attended a tiny, rural Church of England primary school in the middle of England. At the end of each school year, all of us donned our little Wellington boots, which smelt faintly of slurry (since this was dairy-farming country) and sweaty feet. Then we lined up in a crocodile and trudged through the bluebell-wood (damp leaves) and skirted the edge of fields (silage, which stings the nose) covering the mile or so between our school building and the village church. 

We would enter the church grounds through the back field, hurrying through an eerily muffled graveyard with tombstones towering far above our heads and the grass disturbingly lumpy beneath our little feet. To the chidings of “Quickly!” and “Quietly!” we children scurried down a gravel path, away from this unsettling place of death, to reach the cool sanctuary of a little church, and the comforting smells (for me, at least) of damp stone and dusty hymnbooks. 

Others may not have the same associations, but for me the smell of dust and damp stone will always cry “safety!” and the reassurance that “there are no ghosts in here!” in contrast to that troubling graveyard. From death to life. Yet, at the same time, getting stuck with my nose close to some man’s whiffy armpit on the Tube will forever insinuate that I am just a child pretending to be a grown-up, out of my depth, overwhelmed with one thousand decisions to make (“What flowers do you want for her coffin?”) and no-one to advise. In the midst of life, death again.  

Of course, I am not 22 years old and lost anymore, no matter what that man’s armpit tries to tell me. My rational mind knows better, but my rational mind doesn’t get a say – or doesn’t get the first say anyway. This is because smell is the only one of our senses that bypasses the thalamus (the brain’s ‘filtering gate’ that decides which part of the brain needs to respond to sensory input) and goes straight to the limbic system, where emotional memory is stored.  

Sometimes it is very obvious that this is taking place, as in the examples given above. On reflection I know that my emotions are being manipulated by my nose, in ways which are more or less helpful depending on the circumstances. But it can happen in more subtle ways too. Supermarkets infamously pump out smells to influence our buying choices, and we’re trying to sell our house right now, so we’ve been brewing a whole lot more coffee than we ever usually would.   

Intriguingly, scientists don’t really know why the human sense of smell jumps the queue when it comes to cognitive processing. There are biological theories, such as the idea that the smell of a predator could wake our ancestors while they slept, or allow them to follow a scent trail quickly when fleeing danger or seeking food. There are social theories too, such as the suggestion that we don’t have many good words to describe smells, so the brain simply doesn’t bother trying to analyse them. Whatever the truth of the matter, the reality is that (whether we like it or not) our noses are an emotional trip-hazard.  

I can’t help wondering what this tells me about my religious practice. Do I go to church because I have made a cognitive decision to worship God each Sunday? Or do I go to church because I am following my nose, getting away from a world full of armpits and responsibilities to a place where I am a seven-year-old girl, all gingham dress and wellies, feeling safe? If so, does it matter?    

Truth is, my mind can give me a dozen reasons not to go to church every single week. In fact, two dozen reasons. More. It has always been a busy week; I’m always behind on work. The house always needs a sort-out and the car is never washed. But because certain congregation members are normally counting on me for certain things, and because I’m still pretending to be a grown-up, I typically drag myself out the door, and off to church I go.  

And week on week, without fail, when I walk through those great oak doors there is a moment, a glitch in the matrix, when the unmistakable smell of church hits my nose. Dust, damp… a little hint of mouse. My body registers this before my mind; my shoulders drop a little of their tension. Even if it’s just for a fleeting moment, I start to feel that I know for sure what is absolutely real in my life and what is just pretend.  

Is this knowledge irrational – since it doesn’t come from the cognitive part of my mind? Or is there a God who knows that the cognitive part of my mind sometimes tells me all sorts of untrue and unhelpful things? Is there a God who is choosing to reach out to me in more subtle, more ancient ways?  

I can only wonder if I have been following my nose all this time, without even noticing. Drawn along by an ancient scent trail that leads me time and time again…this way…and that way…until I reach a place where there is safety, and bread. 


We don’t have an over-diagnosis problem, we have a society problem

Suzanne O’Sullivan's question is timely
A visualised glass head shows a swirl of pink across the face.
Maxim Berg on Unsplash.

Rates of diagnoses for autism and ADHD are at an all-time high, whilst NHS funding remains in a perpetual state of squeeze. In this context, consultant neurologist Suzanne O’Sullivan, in her recent book The Age of Diagnosis, asks a timely question: can getting a diagnosis sometimes do more harm than good? Her concern is that many of these apparent “diagnoses” are not so much wrong as superfluous; in her view, they risk harming a person’s sense of wellbeing by encouraging self-imposed limitations or prompting them to pursue treatments that may not be justified. 

There are elements of O’Sullivan’s argument that I am not qualified to assess. For example, I cannot look at the research into preventative treatments for localised and non-metastatic cancers and tell you what proportion of those treatments is unnecessary. However, even from my lay-person’s perspective, it does seem that if the removal of a tumour brings peace of mind to a patient, however benign that tumour might be, then O’Sullivan may be oversimplifying the situation when she proposes that such surgery is an unnecessary medical intervention.  

But O’Sullivan devotes a large proportion of the book to the topics of autism and ADHD – and on this I am less of a lay person. She is one of many people proposing that these conditions are being over-diagnosed due to parental pressure and social contagion. Her particular concern is that a diagnosis might become a self-fulfilling prophecy, limiting one’s opportunities in life: “Some will take the diagnosis to mean that they can’t do certain things, so they won’t even try.” Notably, O’Sullivan persists with this argument even though the one autistic person whom she interviewed for the book actually told her the opposite: getting a diagnosis had helped her interviewee, Poppy, to re-frame a number of the difficulties that she was facing in life and realise they were not her fault.  

Poppy’s narrative is one with which we are very familiar at the Centre for Autism and Theology, where our team of neurodiverse researchers have conducted many, many interviews with people of all neurotypes across multiple research projects. Time and time again we hear the same thing: getting a diagnosis is what helps many neurodivergent people make sense of their lives and to ask for the help that they need. As theologian Grant Macaskill said in a recent podcast:  

“A label, potentially, is something that can help you to thrive rather than simply label the fact that you're not thriving in some way.” 

Perhaps it is helpful to remember how these diagnoses come about, because neurodivergence cannot be identified by any objective means such as by a blood test or CT scan. At present the only way to get a diagnosis is to have one’s lifestyle, behaviours and preferences analysed by clinicians during an intrusive and often patronising process of self-disclosure. 

Despite the invidious nature of this diagnostic process, more and more people are willing to subject themselves to it. Philosopher Robert Chapman looks to late-stage capitalism for the explanation. Having a diagnosis means that one can take on what is known as the “sick role” in our societal structures. When one is in the “sick role” in any kind of culture, society, or organisation, one is given social permission to take less personal responsibility for one’s own well-being. For example, if I have the flu at home, then caring family members might bring me hot drinks, chicken soup or whatever else I might need, so that I don’t have to get out of bed. This makes sense when I am sick, but if I expected my family to do things like that for me all the time, then I would be called lazy and demanding! When a person is in the “sick role” to whatever degree (it doesn’t always entail being consigned to one’s bed) then the expectations on that person change accordingly.  

Chapman points out that the dynamics of late-stage capitalism have pushed more and more people into the “sick role” because our lifestyles are bad for our health in ways that are mostly out of our own control. In his 2023 book, Empire of Normality, he observes,  

“In the scientific literature more generally, for instance, modern artificial lighting has been associated with depression and other health conditions; excessive exposure to screen time has been associated with chronic overstimulation, mental health conditions, and cognitive disablement; and noise annoyance has been associated with a twofold increase in depression and anxiety, especially relating to noise pollution from aircraft, traffic, and industrial work.” 

Most of this we cannot escape, and on top of it all we live life at a frenetic pace where workers are expected to function like machines, often subordinating the needs and demands of the body. Thus, more and more people begin to experience disablement, where they simply cannot keep working, and they start to reach for medical diagnoses to explain why they cannot keep pace in an environment that is constantly thwarting their efforts to stay fit and well. From this arises the phenomenon of “shadow diagnoses” – this is where “milder” versions of existing conditions, including autism and ADHD, start to be diagnosed more commonly, because more and more people are feeling that they are unsuited to the cognitive, sensory and emotional demands of daily working life.  

O’Sullivan rightly observes that some real problems arise from this phenomenon of “shadow diagnoses”. It does create a scenario, for example, where autistic people who experience significant disability (e.g., those who have no perception of danger and therefore require 24-hour supervision to keep them safe) are in the same “queue” for support as those for whom being autistic doesn’t preclude living independently. 

But this is not a diagnosis problem so much as a society problem – health and social care resources are never limitless, and a process of prioritisation must always take place. If I cut my hand on a piece of broken glass and need to go to A&E for stitches, I might find myself in the same “queue” as a 7-year-old child who has done exactly the same thing. Like anyone, I would expect the staff to treat the child first, knowing that the same injury is likely to be causing a younger person much more distress. Autistic individuals are just as capable of recognising that others within the autism community may have needs that should take priority over their own.   

What O’Sullivan overlooks is that there are some equally big positives to “shadow diagnoses” – especially as our society runs on such strongly capitalist lines. When a large proportion of the population starts to experience the same disablement, it becomes economically worthwhile for employers or other authorities to address the problem. To put it another way: If we get a rise in “shadow diagnoses” then we also get a rise in “shadow treatments” – accommodations made in the workplace/society that mean everybody can thrive. As Macaskill puts it:  

“Accommodations then are not about accommodating something intrinsically negative; they're about accommodating something intrinsically different so that it doesn't have to be negative.” 

This can be seen already in many primary schools: where once it was the exception (and highly stigmatised) for a child to wear noise-cancelling headphones, they are now routinely made available to all students, regardless of neurotype. This means not only that stigma is reduced for the one or two students who may be highly dependent on headphones, but also that many more children can benefit from a break from the deleterious effects of constant noise. 

When I read in O’Sullivan’s book that a lot more people are asking for diagnoses, what I hear is that a lot more people are asking for help. I suspect the rise in people identifying as neurodivergent reflects a latent cry of “Stop the world, I want to get off!” This is not to say that those coming forward are not autistic or do not have ADHD (or other neurodivergence) but simply that if our societies were gentler and more cohesive, fewer people with these conditions would need to reach for the “sick role” in order to get by.  

Perhaps counter-intuitively, if we want the number of people asking for the “sick role” to decrease, we actually need to be diagnosing more people! In this way, we push our capitalist society towards adopting “shadow-treatments” – adopting certain accommodations in our schools and workplaces as part of the norm. When this happens, there are benefits not only for neurodivergent people, but for everybody.
