
Why hold on to Veganuary anymore?

As commercial promotion of plant-based diets falters, Trystan Owain Hughes digs for the real roots of a ‘reverence for life’.

For many people, the month of January has been rechristened 'Veganuary’. Through this global campaign, which is celebrating its tenth anniversary this year, numerous people have embraced a plant-based diet.  

Founded by a married couple from York, Veganuary has become a worldwide phenomenon, with more than 700,000 people making the pledge last year. A YouGov poll suggests that numbers participating informally are far higher, perhaps as many as 4 per cent of Brits, 7 per cent of Americans, and almost 10 per cent of Germans. The campaign has also gained celebrity backing, with Paul McCartney, Joaquin Phoenix, Deborah Meaden, and Billie Eilish amongst the many star names endorsing the movement in recent years.

Yet there are some signs that the vegan bubble may have finally burst. The pace of interest in non-animal diets has started to level off, and some analysts believe that “peak vegan” in the UK was way back in 2019. Figures from consumer intelligence company NIQ seem to confirm this. UK sales of both chilled and frozen meat alternatives have fallen sharply in recent years, and prominent companies, including Oatly, Nestlé, Innocent and Heck, have withdrawn various vegan products.

Recent years have also seen an increasing number of posts and memes on social media feeds that are antagonistic towards the vegan lifestyle. It seems attitudes towards animals are slowly becoming incorporated into the culture wars, with veganism often regarded as part of an over-righteous, so-called “woke” ideology.

Some Christians subscribe to such an attitude and are hostile to those who embrace plant-based diets. Others, though, take a very different stance when considering their scriptures and theological traditions, emphasising the absolute necessity of a holistic awareness of diet, not least in light of animal cruelty, uncompassionate means of food production, and environmental concern. There are, after all, numerous affirmations of the precious and holy nature of the created order in the Bible. This would have differed profoundly from non-Judaic teaching in the Ancient Near East.

When he was surrounded by suffering and death... he came to regard a transcendent ‘reverence for life’ as the only way of living that made sense. 

The moral imperative to care for the environment and value all creatures is clear from the very first pages of the Bible. After each day in the Genesis account of creation, God regards what he has formed as tov, a Hebrew word meaning good, pleasurable, and delightful. At the end of the creative process, God then looks at the whole of his handiwork, and he sees that the wonderful harmony of the complex, intricate, and balanced ecosystem is tov me’od, meaning ‘very good’. Later, in the New Testament, Jesus asserts that only God himself is good. If creation is good, and goodness belongs to God alone, then it follows that creation can, in some way, reveal the goodness of God directly.

And so there are many Christians who are drawn to an awareness that everything in this wonderful world has value and significance – the strangers we pass on the street, our pets who share our houses, the squirrels who dart across our paths in the park, the snowdrops breaking through the soil in our gardens, and even the slugs in our flowerbeds. No wonder the biblical images of the glorious eschatological, heavenly future are ones in which the natural world is at harmony.

The German phrase that theologian Albert Schweitzer used to express the ramifications of the biblical concept of the goodness of creation is ‘Ehrfurcht vor dem Leben’, which is often translated as ‘reverence for life’. The word Ehrfurcht, however, expresses far more than its English translation implies. It suggests an attitude of awe and ultimate respect, and so carries with it an overwhelming sense of moral responsibility towards creation. For Schweitzer this was no abstract intellectualism. His principle of ‘reverence for life’ came to him as he worked among the sick in the heart of tropical Africa. While prominent atheists like Richard Dawkins and Stephen Fry maintain that cruelty in nature is one of the main stumbling blocks of belief in the divine, it was not a sanitised version of nature that led Schweitzer to his God-centred conclusion. Rather, when he was surrounded by suffering and death, both in the hospital in which he worked and in the unforgiving natural world of the jungle around him, he came to regard a transcendent ‘reverence for life’ as the only way of living that made sense.

We are not only shockingly merciless towards each other, but we extend our cruelty to the creatures with which we share the planet. 

Nature may well be ‘red in tooth and claw’, to use Lord Tennyson’s phrase, but humanity has been gifted with the potential to bring compassion and love to a world of pain and suffering. Most people already regard human life as inherently precious, but Christianity stands alongside other faiths in challenging people to consider the value of the lives of non-human creatures. After all, Schweitzer suggested that every creature holds to the importance of its own life, albeit unconsciously, and this should lead people to solidarity with all forms of life. In this sense, an individual’s relationship with nature is far more intimate than we might think. ‘Wherever you see life,’ he wrote, ‘that is yourself!’

This recognition of humankind’s profound bond with other living creatures allowed Schweitzer to apply Jesus’ core teaching on love to the wider world – ‘the ethic of love widened into universality’, as he put it. This stands in marked contrast to the present status quo which views the only real value of non-human life to be its usefulness. No wonder that so many animals in modern industrial farming experience what Richard Holloway describes as a ‘double-dying’, as their everyday existence is as pitiful as their death. They live out wretched lifespans in disease-prone torture before being transported hundreds of miles in overcrowded trucks to their slaughter. But our society continues to turn a blind eye towards heartless factory farming practices. They are not only tolerated but justified with the argument that animals are little more than unfeeling machines who don’t really ‘suffer’ in the human sense of the word. 

Such attitudes contribute to what the 1995 papal encyclical Evangelium Vitae refers to as the ‘culture of death’ of the modern world. We are not only shockingly merciless towards each other, but we extend our cruelty to the creatures with which we share the planet. In the large global corporations that dominate the food industry, animals are viewed as products to be reared to supply fast-food outlets. They are bred specifically for death. While nature itself is cruel, each creature is endowed with a fighting instinct for survival and a means to achieve it through armour, speed, disguise, poison or odour. We humans, though, offer no chance for such defensive capabilities to be utilised. Nothing is as uncaring and ruthless in nature as the hungry human. 

Not that this recognition necessarily leads us to a purely plant-based diet. Even Schweitzer himself, who was a proponent of vegetarianism, ate meat on occasions. Perhaps the indigenous hunting communities of our world today can help us to bridge the gap between reverence for life and the killing of animals for food. While they are principally carnivores, many of these communities appreciate their utter dependence on the animals that are sacrificed so they might live and thrive. There is, therefore, a deep empathy and affection towards the hunted. In fact, compassionate ceremonies and rituals are often performed to show gratitude to the animals for the gift of their lives. The tribesmen of the Kalahari Desert will, for example, symbolically enter into the suffering of their dying prey by enacting their final death throes. Contrast this with our own food system, which is largely controlled by a small group of multinational corporations who attempt to hide the truth about what we are eating and the harsh treatment of both animals and workers in their factories. 

In a YouGov survey, participants in Veganuary were asked to list their incentives for taking part. The main reason given, above environmental regard and personal health, was animal welfare. The concept of 'reverence for life’ speaks into this concern. As such, in embracing the conviction that all life is equally worthy of our attention, respect, and love, Christians could have so much to offer contemporary debate. Their perspective could have huge implications for the moral and ethical matters that we face today – climate change, food production, health care, emerging technologies, animal care, AI, and energy development. ‘Do not do any injury, if you can possibly avoid it,’ the great Welsh Celtic saint Teilo is purported to have said while reflecting on creation. The anthropocentric, human-centred paradigm does not, then, reflect a truly Christian worldview. Instead, Christianity holds that every part of creation reflects God’s goodness and non-human life deserves respect for its own sake, not simply because of its usefulness. The whole, wonderful web of life is considered to be valued and loved by God, not merely one strand of it, and the daily call of the Christian is to live out the compassion, care, and love that such an awareness demands.


We don’t have an over-diagnosis problem, we have a society problem

Suzanne O’Sullivan's question is timely

Rates of diagnoses for autism and ADHD are at an all-time high, whilst NHS funding remains in a perpetual state of squeeze. In this context, consultant neurologist Suzanne O’Sullivan, in her recent book The Age of Diagnosis, asks a timely question: can getting a diagnosis sometimes do more harm than good? Her concern is that many of these apparent “diagnoses” are not so much wrong as superfluous; in her view, they risk harming a person’s sense of wellbeing by encouraging self-imposed limitations or prompting them to pursue treatments that may not be justified. 

There are elements of O’Sullivan’s argument that I am not qualified to assess. For example, I cannot look at the research into preventative treatments for localised and non-metastatic cancers and tell you what proportion of those treatments is unnecessary. However, even from my lay-person’s perspective, it does seem that if the removal of a tumour brings peace of mind to a patient, however benign that tumour might be, then O’Sullivan may be oversimplifying the situation when she proposes that such surgery is an unnecessary medical intervention.

But O’Sullivan devotes a large proportion of the book to the topics of autism and ADHD – and on this I am less of a lay person. She is one of many people proposing that these conditions are being over-diagnosed due to parental pressure and social contagion. Her particular concern is that a diagnosis might become a self-fulfilling prophecy, limiting one’s opportunities in life: “Some will take the diagnosis to mean that they can’t do certain things, so they won’t even try.” Notably, O’Sullivan persists with this argument even though the one autistic person whom she interviewed for the book actually told her the opposite: getting a diagnosis had helped her interviewee, Poppy, to re-frame a number of the difficulties that she was facing in life and realise they were not her fault.

Poppy’s narrative is one with which we are very familiar at the Centre for Autism and Theology, where our team of neurodiverse researchers have conducted many, many interviews with people of all neurotypes across multiple research projects. Time and time again we hear the same thing: getting a diagnosis is what helps many neurodivergent people make sense of their lives and to ask for the help that they need. As theologian Grant Macaskill said in a recent podcast:  

“A label, potentially, is something that can help you to thrive rather than simply label the fact that you're not thriving in some way.” 

Perhaps it is helpful to remember how these diagnoses come about, because neurodivergence cannot be identified by any objective means such as by a blood test or CT scan. At present the only way to get a diagnosis is to have one’s lifestyle, behaviours and preferences analysed by clinicians during an intrusive and often patronising process of self-disclosure. 

Despite the invidious nature of this diagnostic process, more and more people are willing to subject themselves to it. Philosopher Robert Chapman looks to late-stage capitalism for the explanation. Having a diagnosis means that one can take on what is known as the “sick role” in our societal structures. When one is in the “sick role” in any kind of culture, society, or organisation, one is given social permission to take less personal responsibility for one’s own well-being. For example, if I have the flu at home, then caring family members might bring me hot drinks, chicken soup or whatever else I might need, so that I don’t have to get out of bed. This makes sense when I am sick, but if I expected my family to do things like that for me all the time, then I would be called lazy and demanding! When a person is in the “sick role” to whatever degree (it doesn’t always entail being consigned to one’s bed) then the expectations on that person change accordingly.  

Chapman points out that the dynamics of late-stage capitalism have pushed more and more people into the “sick role” because our lifestyles are bad for our health in ways that are mostly out of our own control. In his 2023 book, Empire of Normality, he observes,  

“In the scientific literature more generally, for instance, modern artificial lighting has been associated with depression and other health conditions; excessive exposure to screen time has been associated with chronic overstimulation, mental health conditions, and cognitive disablement; and noise annoyance has been associated with a twofold increase in depression and anxiety, especially relating to noise pollution from aircraft, traffic, and industrial work.” 

Most of this we cannot escape, and on top of it all we live life at a frenetic pace where workers are expected to function like machines, often subordinating the needs and demands of the body. Thus, more and more people begin to experience disablement, where they simply cannot keep working, and they start to reach for medical diagnoses to explain why they cannot keep pace in an environment that is constantly thwarting their efforts to stay fit and well. From this arises the phenomenon of “shadow diagnoses” – this is where “milder” versions of existing conditions, including autism and ADHD, start to be diagnosed more commonly, because more and more people are feeling that they are unsuited to the cognitive, sensory and emotional demands of daily working life.  

When I read in O’Sullivan’s book that a lot more people are asking for diagnoses, what I hear is that a lot more people are asking for help.

O’Sullivan rightly observes that some real problems arise from this phenomenon of “shadow diagnoses”. It does create a scenario, for example, where autistic people who experience significant disability (e.g., those who have no perception of danger and therefore require 24-hour supervision to keep them safe) are in the same “queue” for support as those for whom being autistic doesn’t preclude living independently.

But this is not a diagnosis problem so much as a society problem – health and social care resources are never limitless, and a process of prioritisation must always take place. If I cut my hand on a piece of broken glass and need to go to A&E for stitches, I might find myself in the same “queue” as a 7-year-old child who has done exactly the same thing. Like anyone, I would expect the staff to treat the child first, knowing that the same injury is likely to be causing a younger person much more distress. Autistic individuals are just as capable of recognising that others within the autism community may have needs that should take priority over their own.

What O’Sullivan overlooks is that there are some equally big positives to “shadow diagnoses” – especially as our society runs on such strongly capitalist lines. When a large proportion of the population starts to experience the same disablement, it becomes economically worthwhile for employers or other authorities to address the problem. To put it another way: If we get a rise in “shadow diagnoses” then we also get a rise in “shadow treatments” – accommodations made in the workplace/society that mean everybody can thrive. As Macaskill puts it:  

“Accommodations then are not about accommodating something intrinsically negative; they're about accommodating something intrinsically different so that it doesn't have to be negative.” 

This can be seen already in many primary schools: where once it was the exception (and highly stigmatised) for a child to wear noise-cancelling headphones, they are now routinely made available to all students, regardless of neurotype. This means not only that stigma is reduced for the one or two students who may be highly dependent on headphones, but also that many more children can benefit from a break from the deleterious effects of constant noise.

When I read in O’Sullivan’s book that a lot more people are asking for diagnoses, what I hear is that a lot more people are asking for help. I suspect the rise in people identifying as neurodivergent reflects a latent cry of “Stop the world, I want to get off!” This is not to say that those coming forward are not autistic or do not have ADHD (or other neurodivergence) but simply that if our societies were gentler and more cohesive, fewer people with these conditions would need to reach for the “sick role” in order to get by.  

Perhaps counter-intuitively, if we want the number of people asking for the “sick role” to decrease, we actually need to be diagnosing more people! In this way, we push our capitalist society towards adopting “shadow-treatments” – adopting certain accommodations in our schools and workplaces as part of the norm. When this happens, there are benefits not only for neurodivergent people, but for everybody.
