
Put poppy politics in the past and give Remembrance a hopeful future

Memory without hope will lead us to a dead-end.

Mark is a research mathematician who writes on ethics, human identity and the nature of intelligence.

A woman walks along a war memorial wall covered in red poppies.
War memorial in Canberra.
Raelle Gann-Owens on Unsplash.

Remembrance Day is complicated. A nation shows its gratitude for the service and sacrifice of its armed forces and tries to connect to its history. Never far away are poppy politics, along with anxiety about identity and forgetting, and fears about nationalism and militarism. Is this the way to remember?

Last November, protests in solidarity with Gaza dominated the headlines. On Armistice Day, hundreds of thousands of people marched through central London to demand a ceasefire. In the preceding weeks, there was vigorous debate about whether the march should be cancelled. The calls for cancellation had several motivations: genuine fears of violence and extremism, and of disruption at the Cenotaph, but also questions about whether marching on Armistice Day was inappropriate or disrespectful.

The march itself was organised to minimise the risk of disrupting public commemorations of Remembrance. It started several hours after the two-minute silence and followed a route several miles from the Cenotaph. It was mostly peaceful, although there were arrests for anti-Semitism, open support for terrorism and violent attacks on police officers. Armistice Day did see violence around the Cenotaph, but this came from the self-described ‘Cenotaph Defenders’, who had organised a counter-demonstration against the Gaza march. The group of football hooligans and far-right EDL members gathered with poppy-emblazoned banners declaring ‘Have some respect for British Heroes’. Within a few hours, the calls for respect had degenerated into violent attacks on serving uniformed officers, in this case the police.

The far-right’s adoption of remembrance symbolism can be seen as an extreme form of a wider entanglement of poppies and politics. The red paper poppy is a symbol of remembrance, but it has other connotations. For some it invokes patriotism and pride in their country; for others it represents conformity and militarism. Whether television news presenters wear them attracts disproportionate attention. In 2019, one Australian TV network aired a tasteless segment denouncing a rival station whose newscasters had failed to wear poppies. The non-poppy-wearing hosts were accused of failing in their duty to respect their country and to help preserve its culture and traditions. Regardless of the presenters’ actual reasons, this feels like a lot of baggage to load onto the delicate poppy, a symbol of quiet remembrance and gratitude.

Unsurprisingly, this has led many to question whether Remembrance Day has become detached from its original purposes. Twelve years after the death of the last British First World War veteran, there is little living connection to either of the two world wars. With this passage of time, there is a growing danger of mistaking the symbols of ceremonial Remembrance for the thing itself.  

The focus of remembrance can shift away from the sheer horrors of war, from awe at the sacrifice of our forebears, and from the resolve of ‘never again’, and fix instead on the processed goods: the ceremonial silence, the poppies themselves and even the quality of our own emotional response.

Some commentators have suggested that organised Remembrance has served its purpose and is best forgotten, and that too much remembering is a bad thing, fuelling grudges and sectarian conflicts. Personally, I’m not convinced, but I do think our current Remembrance is missing something. 

Reflecting on the importance and difficulty of memory, the writer and Holocaust survivor Elie Wiesel emphasised the importance of hope. Despite the horrific experiences of the twentieth century, for Wiesel it is hope that “summons the future”. Memory without hope would lead us to a dead-end, where we grip onto the past while feeling it slip like sand through our fingers. Many of the anxieties around Remembrance point to a hope deficit. 

How can we remember with hope?  

We need to broaden our perspective and engage better with our shared national story. We need to be grounded in our history, stories and myths, but we also need to be drawn forward by the good things we have and will have. If this story is big enough, it will be a large tapestry of interwoven strands, and we will be able to incorporate new strands into it generously, weaving in other cultures with their own relationships to the past. Our remembering will also be better prepared to deal with difficult questions about our nation’s history. With a strong grounding in a shared past and a common hope, we would talk frankly about the times our country has fallen short without a sense of betraying our history or identity. Hope would connect us better to our neighbours overseas and to the men and women who risk their lives to serve their country.

Last Remembrance Sunday, I helped our church’s under-7s make big paper poppies out of red paint and paper plates. The older children made origami peace cranes, and both the big red poppies and the peace cranes were placed by the altar. Here the focus is on remembering, but not just on our own memory. For me and countless other Christians, God’s memory is the real focus. God remembers us in our broken and war-torn world and, as Jesus, chose to join us in it, experiencing the worst of suffering while dying a painful death. All our personal and collective stories of pain, loss and sadness are met in this sacrifice. More than this, in the promises of restoration Jesus gave when He rose from the dead, they find a concrete hope.

What does Remembrance look like when it’s really grounded in hope? I think there would be a few noticeable signs. It would be less precious about itself. It would be more open to different emphases of remembrance, such as the white poppy of the Peace Pledge Union, and excited about new creative expressions of remembrance like the ‘poppy walks’ organised by the Royal British Legion. It would be more patient with the concerns of those who find the religious elements of Remembrance difficult, and more integrated into our attitudes to current and ongoing conflicts around the world. Most of all, I hope it would make us really hungry both for peace and for righteousness.

We don’t have an over-diagnosis problem, we have a society problem

Suzanne O’Sullivan’s question is timely.

A visualised glass head shows a swirl of pink across the face.
Maxim Berg on Unsplash.

Rates of diagnoses for autism and ADHD are at an all-time high, whilst NHS funding remains in a perpetual state of squeeze. In this context, consultant neurologist Suzanne O’Sullivan, in her recent book The Age of Diagnosis, asks a timely question: can getting a diagnosis sometimes do more harm than good? Her concern is that many of these apparent “diagnoses” are not so much wrong as superfluous; in her view, they risk harming a person’s sense of wellbeing by encouraging self-imposed limitations or prompting them to pursue treatments that may not be justified. 

There are elements of O’Sullivan’s argument that I am not qualified to assess. For example, I cannot look at the research into preventative treatments for localised and non-metastatic cancers and tell you what proportion of those treatments is unnecessary. However, even from my layperson’s perspective, it does seem that if the removal of a tumour brings peace of mind to a patient, however benign that tumour might be, then O’Sullivan may be oversimplifying the situation when she proposes that such surgery is an unnecessary medical intervention.

But O’Sullivan devotes a large proportion of the book to the topics of autism and ADHD – and on this I am less of a layperson. She is one of many people proposing that these conditions are being over-diagnosed due to parental pressure and social contagion. Her particular concern is that a diagnosis might become a self-fulfilling prophecy, limiting one’s opportunities in life: “Some will take the diagnosis to mean that they can’t do certain things, so they won’t even try.” Notably, O’Sullivan persists with this argument even though the one autistic person whom she interviewed for the book actually told her the opposite: getting a diagnosis had helped her interviewee, Poppy, to re-frame a number of the difficulties that she was facing in life and realise they were not her fault.

Poppy’s narrative is one with which we are very familiar at the Centre for Autism and Theology, where our team of neurodiverse researchers have conducted many, many interviews with people of all neurotypes across multiple research projects. Time and time again we hear the same thing: getting a diagnosis is what helps many neurodivergent people make sense of their lives and ask for the help they need. As theologian Grant Macaskill said in a recent podcast:

“A label, potentially, is something that can help you to thrive rather than simply label the fact that you're not thriving in some way.” 

Perhaps it is helpful to remember how these diagnoses come about, because neurodivergence cannot be identified by any objective means, such as a blood test or CT scan. At present the only way to get a diagnosis is to have one’s lifestyle, behaviours and preferences analysed by clinicians during an intrusive and often patronising process of self-disclosure.

Despite the invidious nature of this diagnostic process, more and more people are willing to subject themselves to it. Philosopher Robert Chapman looks to late-stage capitalism for the explanation. Having a diagnosis means that one can take on what is known as the “sick role” in our societal structures. When one is in the “sick role” in any kind of culture, society, or organisation, one is given social permission to take less personal responsibility for one’s own well-being. For example, if I have the flu at home, then caring family members might bring me hot drinks, chicken soup or whatever else I might need, so that I don’t have to get out of bed. This makes sense when I am sick, but if I expected my family to do things like that for me all the time, then I would be called lazy and demanding! When a person is in the “sick role” to whatever degree (it doesn’t always entail being consigned to one’s bed) then the expectations on that person change accordingly.  

Chapman points out that the dynamics of late-stage capitalism have pushed more and more people into the “sick role” because our lifestyles are bad for our health in ways that are mostly out of our own control. In his 2023 book, Empire of Normality, he observes,  

“In the scientific literature more generally, for instance, modern artificial lighting has been associated with depression and other health conditions; excessive exposure to screen time has been associated with chronic overstimulation, mental health conditions, and cognitive disablement; and noise annoyance has been associated with a twofold increase in depression and anxiety, especially relating to noise pollution from aircraft, traffic, and industrial work.” 

Most of this we cannot escape, and on top of it all we live life at a frenetic pace where workers are expected to function like machines, often subordinating the needs and demands of the body. Thus, more and more people begin to experience disablement, where they simply cannot keep working, and they start to reach for medical diagnoses to explain why they cannot keep pace in an environment that is constantly thwarting their efforts to stay fit and well. From this arises the phenomenon of “shadow diagnoses” – this is where “milder” versions of existing conditions, including autism and ADHD, start to be diagnosed more commonly, because more and more people are feeling that they are unsuited to the cognitive, sensory and emotional demands of daily working life.  

O’Sullivan rightly observes that some real problems arise from this phenomenon of “shadow diagnoses”. It does create a scenario, for example, where autistic people who experience significant disability (e.g., those who have no perception of danger and therefore require 24-hour supervision to keep them safe) are in the same “queue” for support as those for whom being autistic doesn’t preclude living independently.

But this is not a diagnosis problem so much as a society problem – health and social care resources are never limitless, and a process of prioritisation must always take place. If I cut my hand on a piece of broken glass and need to go to A&E for stitches, I might find myself in the same “queue” as a 7-year-old child who has done exactly the same thing. Like anyone, I would expect the staff to treat the child first, knowing that the same injury is likely to be causing a younger person much more distress. Autistic individuals are just as capable of recognising that others within the autism community may have needs that should take priority over their own.

What O’Sullivan overlooks is that there are some equally big positives to “shadow diagnoses” – especially as our society runs on such strongly capitalist lines. When a large proportion of the population starts to experience the same disablement, it becomes economically worthwhile for employers or other authorities to address the problem. To put it another way: If we get a rise in “shadow diagnoses” then we also get a rise in “shadow treatments” – accommodations made in the workplace/society that mean everybody can thrive. As Macaskill puts it:  

“Accommodations then are not about accommodating something intrinsically negative; they're about accommodating something intrinsically different so that it doesn't have to be negative.” 

This can be seen already in many primary schools: where once it was the exception (and highly stigmatised) for a child to wear noise-cancelling headphones, they are now routinely made available to all students, regardless of neurotype. This not only reduces stigma for the one or two students who may be highly dependent on headphones; it also means that many more children can benefit from a break from the deleterious effects of constant noise.

When I read in O’Sullivan’s book that a lot more people are asking for diagnoses, what I hear is that a lot more people are asking for help. I suspect the rise in people identifying as neurodivergent reflects a latent cry of “Stop the world, I want to get off!” This is not to say that those coming forward are not autistic or do not have ADHD (or other neurodivergence) but simply that if our societies were gentler and more cohesive, fewer people with these conditions would need to reach for the “sick role” in order to get by.  

Perhaps counter-intuitively, if we want the number of people asking for the “sick role” to decrease, we actually need to be diagnosing more people! In this way, we push our capitalist society towards adopting “shadow-treatments” – adopting certain accommodations in our schools and workplaces as part of the norm. When this happens, there are benefits not only for neurodivergent people, but for everybody.
