
Zombies: a philosopher's guide to the purpose-driven undead

Don’t dismiss zombiecore as lowbrow.

Ryan is the author of A Guidebook to Monsters: Philosophy, Religion, and the Paranormal.

Lily James in Pride and Prejudice and Zombies. Lionsgate.

Writing from his new book, A Guidebook to Monsters, Ryan Stark delves into humanity’s fascination with all things monstrous. In the second of a two-part series, he asks what and where zombies remind us of, and why they caught the eyes of C.S. Lewis and Salvador Dalí.

 

How Frankenstein’s monster came to life, nobody knows for sure, but he is more urbane than zombies tend to be. Nor do Jewish golems and Frosty the Snowman count as zombiecore. The latter sings too much, and both are wrongly formulated. Frosty comes from snow, obviously, and the golems—from mere loam, not what the Renaissance playwrights call “gilded loam,” that is, already pre-assembled bodies, which is a zombie requirement. Tolkien’s orcs function likewise as golem-esque monsters, cast from miry clay and then enlivened by the grim magic of Mordor. We do not, for instance, discover scenes with orc children.

And neither is Pinocchio a zombie, nor Pris from Blade Runner, but dolls, automatons, and C-3POs border upon the land of zombies insofar as they all carry a non-human tint. Zombies, however, carry something else as well, a history of personhood, and so in their present form appear as macabre parodies of the human condition writ large. They are gruesome undead doppelgangers, reminding us of who we are not and perhaps—too—of where we are not. Hell is a place prepared for the Devil and his angels, Christ tells us in the book of Matthew. And maybe, subsequently, for zombies.

Kolchak, in an episode of Kolchak: The Night Stalker aptly titled “The Zombie,” correctly discerns the grim scenario at hand: “He, sir, is from Hell itself!”  

C.S. Lewis pursues a similar line of thinking in The Problem of Pain: “You will remember that in the parable, the saved go to a place prepared for them, while the damned go to a place never made for men at all. To enter Heaven is to become more human than you ever succeeded in being on earth; to enter Hell is to be banished from humanity. What is cast (or casts itself) into Hell is not a man: it is ‘remains.’” Lewis makes an intriguing point, which has as its crescendo the now-famous line about the doors of Hell: “I willingly believe that the damned are, in one sense, successful, rebels to the end; that the doors of Hell are locked on the inside by zombies.” I added that last part about zombies. 


Not everyone believes in Hell, of course, yet most concede that some people behave worse than others, which also helps our cause. Indeed, part of zombiecore’s wisdom is to show that bad people often produce more horror than the zombies themselves. Such is the character of Murder Legendre, a case in point from the film White Zombie. Not fortunate in name, Mr. Legendre runs a dark satanic mill populated by hordes of zombie workers, which is the film’s heavy-handed critique of sociopathic industrialization. The truth to be gleaned, here, is that zombies did not invent the multinational corporation; rather, they fell prey to it.

We might think, too, of Herman Melville’s dehumanized characters from Bartleby the Scrivener: Nippers, Turkey, Ginger Nut, and the other functionaries whose nicknames themselves indicate their functions. From an economic standpoint, their value becomes a matter of utility, not essence, which is Melville’s reproach of the despairing corporate drive to objectify personhood—of which zombies are an example beyond the pale. They might as well be fleshy mannequins, in fact, and as such provide the perfect foil for the human being properly conceived.

Here, then, is why we do not blame zombies for eating brains, nor do we hold them accountable for wearing white pants after Labor Day, as some inevitably do. They cannot help it—in ethics and in fashion. Perhaps especially in fashion. The best we can hope for in the realm of zombie couture is Solomon Grundy, the quasi-zombie supervillain who holds up his frayed pants with a frayed rope, a fashion victory to be sure, however small it might be, though “zombie fashion” is a misnomer in the final analysis. They wear clothes, but not for the same reasons we do. 

The point holds true for Salvador Dalí’s zombies as well, most of whom find themselves in nice dresses. I make this point—in part—to correct those in the cognoscenti who dismiss zombies as a subject too lowbrow for serious consideration. Not so. Exhibit A: the avant-garde Dalí, darling of the highbrow, or at least still of the middlebrow, now that his paintings appear on t-shirts and coffee mugs. The Burning Giraffe. Mirage. Woman with a Head of Roses. All zombies, too ramshackle and emaciated to live, never mind the missing head on the last one, and yet there they are, posed for the leering eye, not unlike those heroin-chic supermodels from Vogue magazine in the late 1990s. Necrophilia never looked so stylish.


But never let it be said that zombies are lazy. They are tired, to be sure. Their ragged countenances tell us this, but they are not indolent. Zombies live purpose-driven undead lives. They want to eat brains, or any human flesh, depending on the mythos, and their calendars are organized accordingly. No naps. No swimming lessons. Just brains.  

But we quickly discern that no amount of flesh will satisfy. There is always one more hapless minimart clerk to ambush, one more sorority girl in bunny slippers to chase down the corridor. In this way, the zombie’s gloomy predicament bears a striking resemblance to that of the Danaids in the classical underworld, those sisters condemned to fill a sieve with water for all eternity, an emblem of the perverse appetite unchecked, which has at its core the irony of insatiable hunger. And as the pleasure becomes less and less, the craving becomes more and more. The law of diminishing returns. So it is with all vices. The love of money demands more money, and the love of brains, more brains.

And so, in conclusion, a prayer. God bless the obsessive-compulsive internet shoppers, the warehouse workers on unnecessarily tight schedules, and the machine-like managers of the big data algorithms. God bless the students who sedate themselves in order to survive their own educations, taking standardized test after standardized test. And God bless the Emily Griersons of the world, who keep their petrified-boyfriend corpses near them in the bedroom, an emblem of what happens when one tries too mightily to hold on to the past. And God help us, too, when we see in our own reflections a zombie-like affectation, the abyss who stares back at us and falsely claims that we are not the righteousness of God, as Paul says we are in 2 Corinthians. And, finally, Godspeed to Gussie Fink-Nottle from the P.G. Wodehouse sagas: “Many an experienced undertaker would have been deceived by his appearance, and started embalming on sight.”  

  

From A Guidebook to Monsters by Ryan J. Stark. Used by permission of Wipf and Stock Publishers.


Will AI’s attentions amplify or suffocate us?

Keeping attention on the right things has always been a problem.

Mark is a research mathematician who writes on ethics, human identity and the nature of intelligence.

Robots - always cuter than AI. Alex Knight on Unsplash.

Taking inspiration from human attention has made AI vastly more powerful. Can this focus our minds on why attention really matters? 

Artificial intelligence has been developing at a dizzying rate. Chatbots like ChatGPT and Copilot can automate everyday tasks and effortlessly summarise information. Photorealistic images and videos can be generated from a couple of words, and medical AI promises to revolutionise both drug discovery and healthcare. The technology (or at least the hype around it) gives an impression of boundless acceleration.

So far, 2025 has been the year AI has become a real big-ticket political item. The new Trump administration has promised half a trillion dollars for AI infrastructure and UK prime minister Keir Starmer plans to ‘turbocharge’ AI in the UK. Predictions of our future with this new technology range from doom-laden apocalypse to techno-utopian superabundance. The only certainty is that it will lead to dramatic personal and social change. 

This technological impact feels even more dramatic given the relative simplicity of its components. Huge volumes of text, images and videos are converted into vast arrays of numbers. These grids are then pushed through repeated processes of addition, multiplication and comparison. As more data is fed into this process, the numbers (or weights) in the system are updated and the AI ‘learns’ from the data. With enough data, meaningful relationships between words are internalised and the model becomes capable of generating useful answers to questions.
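To make that loop concrete, here is a toy sketch in Python. It is illustrative only, not any production system: the data, the hidden pattern and the learning rate are all invented, but it shows exactly what the paragraph describes: numbers going in, predictions made by multiplication and addition, a comparison against the data, and a small nudge to the weights each time round.

    import numpy as np

    # Invent some data: 100 examples, each encoded as 3 numbers.
    rng = np.random.default_rng(1)
    X = rng.normal(size=(100, 3))
    true_w = np.array([2.0, -1.0, 0.5])    # the hidden pattern in the data
    y = X @ true_w

    w = np.zeros(3)                        # the model's weights, initially ignorant
    for _ in range(200):
        pred = X @ w                       # multiplication and addition
        error = pred - y                   # comparison with the data
        w -= 0.1 * (X.T @ error) / len(X)  # nudge the weights to shrink the error

    print(np.round(w, 2))                  # approaches [ 2. -1.  0.5]: it has 'learned'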

So why have these algorithms become so much more powerful over the past few years? One major driver has been taking inspiration from human attention. An ‘attention mechanism’ allows very distant parts of texts or images to be associated with one another. This means that when processing a passage of conversation in a novel, the system is able to take cues about the mood of the characters from earlier in the chapter. This ability to attend to the broader context of the text is what has enabled the current wave of ‘large language models’ or ‘generative AI’. In fact, these models, which go by the technical name ‘Transformer’, were developed by stripping away other features and concentrating only on the attention mechanism. The design was first published in 2017, in the memorably named ‘Attention Is All You Need’ paper written by scientists working at Google.
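As a rough sketch of that mechanism (heavily simplified: real Transformers add learned projections, many attention heads and far larger sizes, and the numbers here are made up), the core can be written in a few lines of Python. Every position in a sequence scores its affinity with every other position, however distant, and those scores decide how much of each to blend in.

    import numpy as np

    def softmax(x, axis=-1):
        # Turn raw scores into weights that are positive and sum to 1.
        e = np.exp(x - x.max(axis=axis, keepdims=True))
        return e / e.sum(axis=axis, keepdims=True)

    def attention(Q, K, V):
        # Each query row asks 'what am I looking for?'; each key row says
        # 'what do I contain?'. Dot products score every pair of positions.
        scores = Q @ K.T / np.sqrt(K.shape[-1])
        weights = softmax(scores)   # how strongly each position attends to each other
        return weights @ V          # blend the values by those weights

    # Five token positions, each represented by a vector of 4 numbers.
    rng = np.random.default_rng(0)
    tokens = rng.normal(size=(5, 4))
    out = attention(tokens, tokens, tokens)   # self-attention over the sequence
    print(out.shape)   # (5, 4): each position now carries context from all the others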

If you’re wondering whether this machine replication of human attention has much to do with the real thing, you might be right to be sceptical. That said, this attention-imitating technology has profound effects on how we attend to the world. On the one hand, it has shown the ability to focus and amplify our attention, but on the other, to distract and suffocate it. 


A radiologist acts with professional care for her patients. Armed with a lifetime of knowledge and expertise, she diligently checks scans for evidence of malignant tumours. New AI tools can amplify her expertise and attention: they can automatically detect suspicious patterns in the image, including very fine detail that a human eye could miss. This additional pair of eyes frees her professional attention for other aspects of the scan, and of the job.

Meanwhile, a government acts with obligations to keep its spending down. It decides to automate welfare claim handling using a “state of the art” AI system. The system flags more claimants as being overpaid than the human employees used to. The politicians and senior bureaucrats congratulate themselves on the system’s efficiency and resolve to extend it to other types of payments. All the while, hundreds of thousands of people are being forced to pay non-existent debts. This is not hypothetical: with echoes of the British Post Office Horizon scandal, the 2017-2020 Australian Robodebt scandal was caused by flaws in the algorithm used to calculate the debts. A properly functioning welfare safety net needs public scrutiny, and here a misplaced deference to machines and algorithms suffocated the attention that was needed.

These examples illustrate the interplay between AI and our attention, but they also show that human attention has a broader meaning than just being the efficient channelling of information. In both cases, attention is a moral act, directed towards care for others. There are many other ways algorithms interact with our attention – how social media is optimised to keep us scrolling, how chatbots are being touted as a solution to loneliness among the elderly, but also how translation apps help break language barriers. 

Algorithms are not the first thing to get in the way of our attention, and keeping our attention on the right things has always been a problem. One of the best stories about attention and noticing other people is Jesus’ parable of the Good Samaritan. A man lies badly beaten on the side of the road after a robbery. Several respectable people walk past without attending to the man. A stranger stops. His people and the injured man’s people are bitter enemies. Despite this, he generously attends to the wounded stranger. He risks the danger of stopping – perhaps the injured man will attack him? He then tends the man’s wounds and uses his money to pay for an indefinite stay in a hotel. 

This is the true model of attention. Risky, loving “noticing” that is as much action as intellect. A model of attention better than even the best neuroscientist or programmer could come up with, one modelled by God himself. In this story, the stranger, the Good Samaritan, is Jesus, and we all sit wounded and in need of attention.

But not only this: we are born to imitate the Good Samaritan’s attention to others. Just as we can receive God’s love, we can also attend to the needs of others. This mirrors our relationship to artificial intelligence: just as our AI tools are conduits of our attention, we can be conduits of God’s perfect loving attention. This is what our attention is really for, and if we remember this while being prudent about the dangers of technology, then we might succeed in elevating our attention-inspired tools to make AI an amplifier of real attention.
