
Bled dry: some red flags for those who hope to date a vampire

A philosopher's guide to undying love.

Ryan is the author of A Guidebook to Monsters: Philosophy, Religion, and the Paranormal.

A modern vampire stares at the face of his girlfriend.
Kristen Stewart and Robert Pattinson in Twilight.
Lionsgate.

Writing from his new book, A Guidebook to Monsters, Ryan Stark delves into humanity’s undying passion for all things gothic. In the first of a two-part series, he asks what is so irresistible about vampires, what we want from them, and what the deal is with the armadillos.

 

Historians point to John Polidori’s The Vampyre as that vital moment in Western vampire lore when the grisly undead creature becomes instead a Casanova. London, 1819. Lord Ruthven, the suave vampire in question, seduces young women and orchestrates chaos in the lives of others—all for his own carnal pleasure. Importantly, he does this by way of persuasion, not rote coercion, which illustrates a key aspect of the modern vampires’ modus operandi. They prefer romance to compulsion, seduction to force. They prefer thrall, almost to the end, at which point the monster fully emerges and the victims fully grasp that their good senses have been compromised. But by then it is too late. 

“None are more hopelessly enslaved than those who falsely believe they are free,” Goethe once observed. Similarly, none are more hopelessly enslaved than those who believe themselves to be dating vampires. 

To resist, however, is easier said than done. Even Buffy the vampire slayer succumbs to thrall, so much so that she invites Dracula to bite her. And he happily obliges, if “happily” is possible in the mind of a vampire. Later, having sobered up from the ordeal, Buffy stakes the villain, but we are nonetheless left with an uneasy feeling. Despite all her experience, despite all her kung-fu knowhow, Buffy still crumbles in the wake of thrall, at least temporarily, putting herself in grave danger and eliciting from us a pressing set of questions. How could this have happened so easily? Will this happen again? Are women attracted to men in capes? 

Much like kryptonite, vampire magic also affects Superman. Two vampires have so far succeeded in hypnotizing him. Crucifer, not fortunate in name, enthralls our protagonist and sends him on several errands, until the Man of Steel has a moment of clarity, as the alcoholics call it, at which point he punches the ancient vampire through the heart. Dracula, too, disguised as an aristocrat named Rominoff, charms our superhero rather easily and then bites him on the neck, only to explode—hilariously—on the premise that Superman’s blood is tinged with sunlight. A moment of dream logic used to subvert the expectation that superblood might somehow benefit the Count. 

Lord Ruthven of Polidori fame also wanders into the DC Comic Book Universe and, per usual, charms his way through problems, until he inadvertently skewers himself on a war memorial. Before this happens, however, we get the strong impression that Ruthven could beguile Superman with ease, if given the chance: that pens are mightier than swords and always have been. 


Psychoanalysts observe that to empathize with sociopaths is to negate the self most dangerously. They are right, I think, and right—too—that self-erasure proves difficult to recognize at times, because it feels like love. Such is the predicament of those who hope to rendezvous with vampires. Perhaps they have a death wish, some will say, or maybe a savior syndrome, as if they are to save the brooding scoundrels. As if they can. Regardless, the monsters have another plan entirely. As an early church father once explained, those who dine with the devils should bring long spoons. 

Not that vampires are particularly good at banquets. They inevitably exaggerate, like the Macbeths as they welcome King Duncan to the castle: “All our service,” the lady says, “in every point twice done and then done double.”  

Or recall the embroidered hospitality of Bela Lugosi’s 1931 Dracula, caught between silent film and sound: “I bid you welcome,” he says, acting out the part as if the audience must see the motive on his face. A perfect moment when the silent cinema and talking pictures conspire to produce the quintessential vampire ethos, an overstated affectation framed for the modern age. The bow of pretended humility, the elongated gesture, the monumental gravity. The outfit.   

Some speculate that if vampires were able to see themselves in mirrors, they would reconsider their wardrobes. We have reason to think otherwise. Of course, the true gothic is not the vampire aesthetic, because the true gothic always points to Heaven, as in Notre Dame Cathedral, for instance, or Westminster Abbey. On the contrary, vampires have a long history of not pointing to Heaven. Instead, they gild the lily. In their attempt to out-gothic the gothic, they turn their style inwardly upon themselves, incurvatus in se, which signals not grandeur but rather self-apotheosis. In essence, vampires are their own cathedrals, and with this premise proceed accordingly, candelabras in tow. 


Longinus, in On the Sublime, uses the term “frigidity” to describe the emotional effect produced by such false grandeur. He means to convey both rhetorical and metaphysical coldness, as does Dante, who places the Devil in a block of ice at the Inferno’s gaudy center. As does Stanley Kubrick, too, who freezes the possessed Jack in the maze at the end of The Shining. And somewhere near the frozen middle of Hell we find the vampires, those who betrayed the strangers in their midst and preyed upon the lonely and the desperate. But now they only devour themselves. We are punished by our sins, not for them. 

On a side note, and concerning the vampire’s many choristers, the opening scene of Lugosi’s Dracula features three armadillos. They wander about the castle and mind their own business, it seems, as wolves howl and spiders weave their webs. How they got there, we do not know, but the armadillos further confirm Longinus’s additional point that the ridiculous and the sublime bear a family resemblance. 

What, then, are we to make of the vampires who sparkle and the vampires with souls? Or, if not in the direction of the dreamy, then in the theater of the absurd: Count Chocula, the mascot for a popular breakfast cereal, or the puppet Count von Count from the children’s program Sesame Street, who teaches viewers how to add and subtract—hitting all the numbers with his heavy Transylvanian accent. We might deem these manifestations too unserious to be taken seriously, but in fairness to the spirit of Count Chocula, perhaps something else happens here. Namely, we find more variations upon the culture-making effort to rehabilitate the demonic, and the almost demonic, as the case might be.  

If the vampire could only find pleasure in chocolate, if he could laugh with children, if he could be loved like Bella loves Edward in The Twilight Saga, then maybe there is hope enough in the world for all of us. Indeed, maybe some vampires have grown tired of being vampires. That said, we do well to heed the old Transylvanian proverb, lest we over-empathize with the villains: the sane would do no good if they made themselves monsters to help the monsters. 

A recent meme depicts the real Dracula in the company of Count Chocula, Count von Count, The Twilight Saga’s Edward, and several other less-than-scary princes of darkness, at which point Dracula laments that the vampires have lost their edge. 

And, true, I have yet to comment on psychic vampires and flaming extroverts, which is an oversight to be sure. As a corrective, and by way of conclusion, I observe the following: for twenty-seven dollars, one can buy a beaker of psychic vampire repellent from Gwyneth Paltrow’s Goop Store in Beverly Hills, California. The Paper Crane Apothecary makes the product, which—with an essential blend of rosemary, lavender, and juniper—protects against the fiends who corner people at parties. At present, however, shipping will be difficult: the website tells me “This item is sold out.” 

  

From A Guidebook to Monsters, Ryan J. Stark.  Used by permission of Wipf and Stock Publishers. 


We don’t have an over-diagnosis problem, we have a society problem

Suzanne O’Sullivan's question is timely
A visualised glass head shows a swirl of pink across the face.
Maxim Berg on Unsplash.

Rates of diagnoses for autism and ADHD are at an all-time high, whilst NHS funding remains in a perpetual state of squeeze. In this context, consultant neurologist Suzanne O’Sullivan, in her recent book The Age of Diagnosis, asks a timely question: can getting a diagnosis sometimes do more harm than good? Her concern is that many of these apparent “diagnoses” are not so much wrong as superfluous; in her view, they risk harming a person’s sense of wellbeing by encouraging self-imposed limitations or prompting them to pursue treatments that may not be justified. 

There are elements of O’Sullivan’s argument that I am not qualified to assess. For example, I cannot look at the research into preventative treatments for localised and non-metastatic cancers and tell you what proportion of those treatments is unnecessary. However, even from my layperson’s perspective, it does seem that if the removal of a tumour brings peace of mind to a patient, however benign that tumour might be, then O’Sullivan may be oversimplifying the situation when she proposes that such surgery is an unnecessary medical intervention.  

But O’Sullivan devotes a large proportion of the book to the topics of autism and ADHD – and on this I am less of a layperson. She is one of many people proposing that these conditions are being over-diagnosed due to parental pressure and social contagion. Her particular concern is that a diagnosis might become a self-fulfilling prophecy, limiting one’s opportunities in life: “Some will take the diagnosis to mean that they can’t do certain things, so they won’t even try.” Notably, O’Sullivan persists with this argument even though the one autistic person whom she interviewed for the book actually told her the opposite: getting a diagnosis had helped her interviewee, Poppy, to re-frame a number of the difficulties that she was facing in life and realise they were not her fault.  

Poppy’s narrative is one with which we are very familiar at the Centre for Autism and Theology, where our team of neurodiverse researchers have conducted many, many interviews with people of all neurotypes across multiple research projects. Time and time again we hear the same thing: getting a diagnosis is what helps many neurodivergent people make sense of their lives and to ask for the help that they need. As theologian Grant Macaskill said in a recent podcast:  

“A label, potentially, is something that can help you to thrive rather than simply label the fact that you're not thriving in some way.” 

Perhaps it is helpful to remember how these diagnoses come about, because neurodivergence cannot be identified by any objective means such as by a blood test or CT scan. At present the only way to get a diagnosis is to have one’s lifestyle, behaviours and preferences analysed by clinicians during an intrusive and often patronising process of self-disclosure. 

Despite the invidious nature of this diagnostic process, more and more people are willing to subject themselves to it. Philosopher Robert Chapman looks to late-stage capitalism for the explanation. Having a diagnosis means that one can take on what is known as the “sick role” in our societal structures. When one is in the “sick role” in any kind of culture, society, or organisation, one is given social permission to take less personal responsibility for one’s own well-being. For example, if I have the flu at home, then caring family members might bring me hot drinks, chicken soup or whatever else I might need, so that I don’t have to get out of bed. This makes sense when I am sick, but if I expected my family to do things like that for me all the time, then I would be called lazy and demanding! When a person is in the “sick role” to whatever degree (it doesn’t always entail being consigned to one’s bed) then the expectations on that person change accordingly.  

Chapman points out that the dynamics of late-stage capitalism have pushed more and more people into the “sick role” because our lifestyles are bad for our health in ways that are mostly out of our own control. In his 2023 book, Empire of Normality, he observes,  

“In the scientific literature more generally, for instance, modern artificial lighting has been associated with depression and other health conditions; excessive exposure to screen time has been associated with chronic overstimulation, mental health conditions, and cognitive disablement; and noise annoyance has been associated with a twofold increase in depression and anxiety, especially relating to noise pollution from aircraft, traffic, and industrial work.” 

Most of this we cannot escape, and on top of it all we live life at a frenetic pace where workers are expected to function like machines, often subordinating the needs and demands of the body. Thus, more and more people begin to experience disablement, where they simply cannot keep working, and they start to reach for medical diagnoses to explain why they cannot keep pace in an environment that is constantly thwarting their efforts to stay fit and well. From this arises the phenomenon of “shadow diagnoses” – this is where “milder” versions of existing conditions, including autism and ADHD, start to be diagnosed more commonly, because more and more people are feeling that they are unsuited to the cognitive, sensory and emotional demands of daily working life.  


O’Sullivan rightly observes that some real problems arise from this phenomenon of “shadow diagnoses”. It does create a scenario, for example, where autistic people who experience significant disability (e.g., those who have no perception of danger and therefore require 24-hour supervision to keep them safe) are in the same “queue” for support as those for whom being autistic doesn’t preclude living independently. 

But this is not a diagnosis problem so much as a society problem – health and social care resources are never limitless, and a process of prioritisation must always take place. If I cut my hand on a piece of broken glass and need to go to A&E for stitches, I might find myself in the same “queue” as a 7-year-old child who has done exactly the same thing. Like anyone, I would expect the staff to treat the child first, knowing that the same injury is likely to be causing a younger person much more distress. Autistic individuals are just as capable of recognising that others within the autism community may have needs that should take priority over their own.   

What O’Sullivan overlooks is that there are some equally big positives to “shadow diagnoses” – especially as our society runs on such strongly capitalist lines. When a large proportion of the population starts to experience the same disablement, it becomes economically worthwhile for employers or other authorities to address the problem. To put it another way: If we get a rise in “shadow diagnoses” then we also get a rise in “shadow treatments” – accommodations made in the workplace/society that mean everybody can thrive. As Macaskill puts it:  

“Accommodations then are not about accommodating something intrinsically negative; they're about accommodating something intrinsically different so that it doesn't have to be negative.” 

This can be seen already in many primary schools: where once it was the exception (and highly stigmatised) for a child to wear noise-cancelling headphones, they are now routinely made available to all students, regardless of neurotype. This means not only that stigma is reduced for the one or two students who may be highly dependent on headphones, but also that many more children can benefit from a break from the deleterious effects of constant noise. 

When I read in O’Sullivan’s book that a lot more people are asking for diagnoses, what I hear is that a lot more people are asking for help. I suspect the rise in people identifying as neurodivergent reflects a latent cry of “Stop the world, I want to get off!” This is not to say that those coming forward are not autistic or do not have ADHD (or other neurodivergence) but simply that if our societies were gentler and more cohesive, fewer people with these conditions would need to reach for the “sick role” in order to get by.  

Perhaps counter-intuitively, if we want the number of people asking for the “sick role” to decrease, we actually need to be diagnosing more people! In this way, we push our capitalist society towards adopting “shadow-treatments” – adopting certain accommodations in our schools and workplaces as part of the norm. When this happens, there are benefits not only for neurodivergent people, but for everybody.
