
Tech has changed: it’s no longer natural or neutral

The first in a three-part series exploring the implications of technology.

James is Canon Missioner at Blackburn Cathedral. He researches technology and theology at Oxford University.


My son was born in February last year and it seems that every day he is developing new skills, facial expressions and adorable quirks. Just the other day he was playing with some wooden blocks and, when they inevitably fell over, he let out the most adorable giggle. As you can guess, I immediately reached for my phone so that I could capture the moment. Moments like this happen all the time in the life of a modern parent: we want to share with our spouse, family, and friends, or just capture the moment for ourselves because it’s something we treasure. And yet, in this series of articles I would like to consider this moment, and the thousands like it that take place in a technological society, and ask: is everything as benign as it seems?

There are two ideas that often come up whenever people talk about technology. The first is that technology is basically ‘neutral’: technology only becomes good or bad depending on what you are doing with it. “Look at a hammer,” someone might say, “there is nothing intrinsically good or bad about this hammer; only the end result is good or bad, depending on whether I’m using it to hit nails or people!” On this reading of technology, the only important questions relate to the consequences of use.

If technology is neutral, then the primary concern for users, legislators and technologists is the consequences of technology, and not the technology itself. The only way to ensure that the technology is used for good is to ensure, somehow, that more good people use the technology for good things than bad people use it for bad things. Often this idea presents itself as a conversation about competing freedoms: very few people (with some important exceptions; see this article from Ezra Klein) are debating whether there is something intrinsically problematic about the app formerly known as Twitter; most discussion revolves around how to maintain the freedom of good users while curtailing the freedom of bad users.

We assume that these tools of social interaction like Facebook and Instagram are, in and of themselves, perfectly benign. We are encouraged to think this by massive corporations who have a vested interest in maintaining our use of their platforms, and at first glance, they seem completely harmless: what could possibly be the problem with a website in which grandma can share photos of her cat? And while the dark underbelly of these platforms has violent real-world consequences – like the rise of antisemitism and anti-Muslim hatred – the solution is primarily imagined as a matter of dealing with ‘bad actors’ rather than anything intrinsically problematic with the platforms themselves. 


The second idea is related but somewhat different: advocates of modern technology will suggest that humanity has been using technology for as long as there have been humans, and that all this modern technology is therefore nothing to worry about. “Yes, modern technology looks scary,” someone might say, “but it’s really nothing to worry about; humans have been using tools since the Stone Age, don’t you know!” This view proposes that because hammers are technology, and all technology is the same, there is no difference between a hammer and the internet, or between the internet and a cyborg.

This second idea tends to be accompanied by an emphasis on the slow and steady evolution of technology, and by the observation that at every major technological advancement there have been naysayers decrying the latest innovation. (Even Plato was suspicious of writing when it was invented.) Taken as part of a very long view of human history, the technological innovations of the last 100 years seem a normal and natural part of the evolution of our species, which has always set itself apart from the rest of the animal kingdom in its use of technology.

Steve Jobs gives a good example of this in an interview he gave about the development of the PC:

“I think one of the things that really separates us from the high primates is that we’re tool builders. I read a study that measured the efficiency of locomotion for various species on the planet. The condors used the least energy to move a kilometer. And humans came in with a rather unimpressive showing about a third of the way down the list… not too proud of a showing for the crown of creation… But then somebody at Scientific American had the insight to test the efficiency of locomotion for a man on a bicycle. And a human on a bicycle blew the condor away – completely off the top of the charts. 

And that’s what a computer is to me… It’s the most remarkable tool we’ve ever come up with… It’s the equivalent of a bicycle for our minds”  

Notice that Jobs here draws a straight-line comparison between the bicycle and the PC. As far as Jobs is concerned, there is no difference in kind between the two tools: one is more complex than the other, but otherwise they are just technologies that expand human capacity. “A bicycle for our minds” is a fascinating way to describe a computer because it implies that nothing about our minds will be changed; they’ll just be a little bit faster.

And yet, despite the attempts of thought leaders like Jobs to convince us that modern technology is entirely benign, many of us are left with a natural suspicion that there is more going on. As a priest in the Church of England, I often have conversations with parishioners and members of the public who are looking for language, or a framework, to describe the instinctive recognition that something has changed, fairly recently, about the nature of the technology that we use and the way that it influences our lives: that modern technology is not simply the natural extension of the sorts of tools that humans have been using since the Stone Age, and that modern technology is not neutral but in significant ways has already had an effect regardless of how we might use it. How do we respond to articulate and thoughtful people such as Steve Jobs who make a compelling case that modern technology is neutral and natural?


Thinking back to that moment with my son, when he giggled and I took a photo of him, at first glance it seems completely innocuous. But what resources are available if I did want to think more carefully about that moment, and the many like it that suffuse my daily life? Thankfully there is a growing body of literature from philosophers and theologians who are thinking about the impact of modern technology on the human condition. In the next two articles I would like to introduce the work of Martin Heidegger and outline his criticism of modern technology, showing how he challenges the idea that technology is simply a natural extension of human capacity or a neutral tool.

Heidegger is a complex character in philosophy and in Western history. There is no getting around the fact that he was a supporter of the Nazi Party during the Second World War. His politics have been widely condemned, and rightly so; nevertheless, his analysis of the nature of modern technology continues to be useful to this day. His claim is that modern technology essentially and inevitably changes our relationship with the world in which we live, and even with ourselves. It is this claim, and Heidegger’s suggested solution, that I will unpack in the next two articles.


Our social problems need theology, here’s why

Taking the god’s-eye view develops critical skills

At secondary school level, Religious Studies continues to attract strong numbers. On the surface, this looks like a healthy sign for the subject. Yet, critics argue that appearances can be deceiving: many faith-based schools make the subject compulsory, artificially pushing up participation. The result is a stark disconnect when students progress to higher education. Here interest appears to drop off sharply, and several universities have been forced to close their single-honours degrees in Theology and Religious Studies due to unsustainable student numbers. 

But this presents a misleading picture – even at tertiary level, students are far more interested in Theology and Religious Studies than the statistics seem to suggest. While few undergraduates commit to a full degree in Theology (in Scotland this is called Divinity) or Religious Studies, partly because career pathways outside of ordained ministry and teaching can seem unclear, many are eager to sample the subject alongside their main studies. This means that at the University of Aberdeen, the department of Divinity finds a different kind of relevance. Thanks to Aberdeen’s flexible degree structure, it is not unusual to find law, sociology, psychology, anthropology, and even physics students sitting in on our undergraduate modules. This interdisciplinary mix brings a distinctive energy to classroom discussions, as well as a few challenges… and challengers.

Some students arrive never having opened a Bible, never having heard a word from the Qur’an, and never having engaged with any other religious text. Many are openly ambivalent about the existence of God, some downright hostile, and more than a few admit that they were drawn in by the promise of coursework-based assessment rather than traditional exams. Yet, once in the room, most engage with surprising enthusiasm, and even the challengers play a vital role.  

What emerges is a lively space where students approach theology less as a matter of personal faith and more as an intellectual exercise, grappling with life’s big questions, testing out ideas, and engaging seriously with the prospect that God exists. Far from diminishing the subject, this shift gives the Divinity department a new role: not as a training ground for clergy, but as a forum for critical thinking across disciplines.

In one of our courses, for example, students are asked to debate this question: if a human chooses to go wild swimming in a crocodile’s natural habitat, does the crocodile have a right to kill and eat that human, as it would any other prey item that strayed into its path? Or, if a person with profound physical and intellectual disability is not able to live out many of the rights and responsibilities envisaged by the United Nations Convention on Human Rights, on what grounds are they still reckoned to be a human person? As we tease out the (multiple) possible answers to these questions, many of them turn out to be surprisingly theological. Whilst some students work towards becoming better able to affirm and articulate their own atheism, others are surprised to discover that they have been living out a deistic morality all along; on the quiet, their internal moral compass believes in God.


Further to that, in an open letter the Theos think tank recently highlighted the role of theology in the ethical and cultural development of communities. They argue that theological study equips people to engage thoughtfully with different people groups and traditions, to develop skills in interfaith dialogue, and to promote communication across cultural barriers. Put simply: 

“In an increasingly polarised world, it helps us understand other points of view.” 

This insight is highly relevant to our students as they set out on varied career paths in an increasingly complex world. The skills honed in our Divinity classrooms – empathy, critical thinking, close observation, and clear writing – are both essential and transferable. Theology degrees do not lead only to ordination or teaching; they can open doors to careers in journalism, diplomacy, politics, community work, authorship, and screenwriting, among many others. As Professor Gordon Lynch, Professor of Religion, Society and Ethics at the University of Edinburgh, observed at a recent panel discussion: 

“It’s very difficult to think about a major geopolitical issue at the moment in which religion isn’t deeply implicated in some way.” 

The relevance of theological training extends far beyond traditional disciplines. For example, law students will need to recognise not only that a person with profound disability is a human person, but also to understand the deeper ethical and theological reasons why society judges this to be so. International Relations students will need to appreciate why resolving the Israel/Palestine conflict is not as simple as drawing lines on a map, but is rooted in long histories of faith, identity, and belonging – histories whose influence will reach far into the future as well as the present. Sports science and physiotherapy students will need to empathise with the human drive to become ever faster and stronger, while discerning when to help people recognise their limits before injury occurs.

So, we gather all these students and more into our divinity courses, and work with them as they develop such skills. By discussing these matters as though God exists, in a space where there is unapologetic openness to confessional or deistic ways of looking at the world, students are freed to adopt a third-person standpoint, a “god’s-eye view” if you like, which allows them to examine critically both their own and other people’s perspectives. When this freedom becomes apparent, it is often the challengers who find themselves the ones being challenged, and hostility soon morphs into vibrant dialogue. Also, for those who want “an easy A”, it quickly becomes apparent that coursework-based assessment is in no way easier than traditional exams – if anything, it can be the opposite! Getting your ideas down on paper, coherently, and with relevant references to research from across disciplines is a sophisticated competency. But my sense is that even if students don’t walk out with an easy A, they walk out with a set of skills that is, in the long run, far more valuable.

With an eye to business models and balance sheets, many universities don’t think they need their theology departments anymore, and with the current financial precarity faced by the higher education sector, on paper this may be true. But society is crying out for complex ways forward through complex situations, and the problems of social division are more apparent than ever. Whilst it is clear that fewer and fewer students are choosing to do whole theology degrees, it is also clear that the world still needs theologians.
