
The wisdom of living with the question not googling the quick answer

Are we trading wisdom for apparent certainty?

Elizabeth Wainwright is a writer, coach and walking guide. She's a former district councillor and has a background in international development.


In much of the work I’ve been involved in, whether writing, coaching, hillwalking, local politics, or international development, I’ve learned to ask questions I don’t have answers to, and sometimes neither do the people I’m with. We sit with the question, decide whether it’s the right one, try to discern what else emerges in our peripheral vision as we focus on it. It takes effort to come to something like an answer, and in doing so, we peel off layers of unknowing. It has taken practice, and it can be slow work. But in searching for good questions, I see they can be an entry point into not just information but wisdom too. And there are many places that are hungry for wisdom.  

I longed for better questions and more curiosity when I was a district councillor. Curiosity that made space for residents to share their stories and opinions, curiosity about different political positions and what might happen if we worked across divides, about what might be possible if we could get past the way things had always been and imagine what they could become. But the desire to save face, to be seen to be in control, was strong, and I felt it often got in the way of real conversation. To be committed to the process more than the product takes courage, I think. The courage of uncertainty, of saying, “I don’t know”, of putting humility and honesty before status. Sitting with questions can be difficult, perhaps even feeling like a luxury, but they show us ourselves and the world a bit more clearly, offering a pathway to relationship, to collaboration, to humanity, to wisdom.

I have been thinking about the temptation to trade the wisdom of questions for the apparent certainty of instant answers, even wrong answers. In our age of one-click everything and carefully managed image, that temptation is only quickening. I have wondered whether it began with that old temptation in the Garden of Eden. Staring at a painting of Adam and Eve in the Prado gallery a while back, I wondered whether that original temptation set us on a path of instant information but also of depleted wisdom.

As I peered into the painting, a thought sparked: what if God told Adam and Eve they could eat fruit from any tree except the tree of the knowledge of good and evil not because he wanted us to stay ignorant or innocent (something Philip Pullman explores in his His Dark Materials trilogy) but because he knew it was too easy for us to eat from that tree? He wanted us to live, and to search for knowing and wisdom ourselves. Eating the fruit would bypass experience; there’d be no need to develop muscles of thinking and discernment. And he wanted us to be wise, to keep creating and tending the world with him. When those first pulled-from-the-earth humans ate the fruit, it was like us still-dependent-on-the-earth humans asking Artificial Intelligence to write us an essay: we might get what we want, but we’ve bypassed the experience of thinking, creating, discerning what’s ours to say.

This analogy creaks when pulled too far, but it lingers all the same. There’s a quote I’ve long appreciated, from the biologist E. O. Wilson:

“We are drowning in information, while starving for wisdom. The world henceforth will be run by synthesisers, people able to put together the right information at the right time, think critically about it, and make important choices wisely.”  

Picking the fruit, becoming reliant on AI, gives us information but perhaps not the ability to think, and not the wisdom to make good choices. God wants us to be wise. The Bible’s Book of James says we can ask for wisdom. It is not withheld from us; it is not hidden. It’s everywhere, waiting to be called on.  


It’s so easy to find answers now — Google solves problems and democratises access to information, unless of course you’re in a part of the world that has no digital access. In rural mid Devon and in rural Zambia, both places where I’ve worked deeply with communities, you can’t simply join an online meeting or look up the answer to a question you might have. Sometimes this feels a life-giving challenge: it increases the need for relationship, for trust, for community conversation. Other times it hinders progress: it means people can’t access jobs, or basic health knowledge, or government decisions that affect them. Google has changed who can access the world, how we interact with it, how we think and learn. Historically, people memorised poetry and scripture and news. The printing press changed that; words were pulled from minds and printed on paper. Our online existence has accelerated that: I don’t need to stretch my memory if I don’t want to — I can find and store what I need digitally. We’ve outsourced our memory, and I wonder whether we’re also outsourcing our capacity to think and discern.

In doing so, we risk disconnecting from ourselves, our relationships, our communities, our places. No longer do we need to rely on each other for knowing and wisdom — we can trust faceless digital forces that profit from our doing so. We also risk our unique ability to think creatively, to discern good sources, to think deeply and with nuance about a topic. If AI learns from everything that has been, it can synthesise and perhaps even extrapolate from that and project forward, but it can’t creatively imagine. It can’t reflect and speak wisdom.

There is an ease and convenience to Google, to AI. There was an ease and convenience to picking the fruit to gain knowledge. But we are not called to ease and convenience. I think we are called to love, to care for our neighbours, and these things are necessarily inconvenient. Digital access to information is a tool, a resource, a gift that benefits many of us in many ways. But it could easily blunt our humanity, becoming a temptation that bypasses the work of truly living.  There are no digital shortcuts to the difficult work of community, no AI-shortcut to loving well, just as there was never a shortcut to complete knowledge of good and evil. With information available at the tug of a fruit — a click, a download, a request to an artificial intelligence — I am curious how our ability to sit with questions will change, whether we’ll feel beauty or fear in not having all the answers, whether we’ll lose our ability to discern, and to “have faith in what we do not see.”  


Jesus calls us to questions, to relationship, to love, not to answers that might be easily won but little interrogated. He knew that questions, not answers, were often the best response to the questions he was asked. Questions to sit with, to hold up as a mirror, to walk as a path to wisdom. He asked a lot of them. Who do you say I am? How many loaves do you have? Do you love me? What do you want? Why are you afraid? The Bible records Jesus asking questions, and sometimes offering answers too. But the point often seems to be the question itself, giving people endless chances to question their assumptions and their judgements, and to deepen their faith and make it personal. In doing so, Jesus offered a path to deeper and more meaningful knowledge of God, the world, others, and ourselves. And by asking questions he gave dignity to people, listening deeply to them, loving them, calling them into themselves.

Sitting with questions, with curiosity, is, I think, an entry point to faith and to mystery. And we have companions as we do this: Jesus, early Christian mystics, prayer, the Psalms, each other – these are all places I turn to dig deeper into the knowing that comes through unknowing. To live with questions and within mystery, to listen deeply to each other, to speak the language of soul rather than certainty, might be difficult and countercultural. But in an age where the future is becoming less certain despite the whole world seemingly being at our fingertips, I think it is where our hope is. After all, “what good is it for a man to gain the whole world but forfeit his soul?”


Will AI’s attentions amplify or suffocate us?

Keeping attention on the right things has always been a problem.

Mark is a research mathematician who writes on ethics, human identity and the nature of intelligence.

Robots - always cuter than AI.
Alex Knight on Unsplash.

Taking inspiration from human attention has made AI vastly more powerful. Can this focus our minds on why attention really matters? 

Artificial intelligence has been developing at a dizzying rate. Chatbots like ChatGPT and Copilot can automate everyday tasks and effortlessly summarise information. Photorealistic images and videos can be generated from a couple of words, and medical AI promises to revolutionise both drug discovery and healthcare. The technology (or at least the hype around it) gives an impression of boundless acceleration.

So far, 2025 has been the year AI has become a real big-ticket political item. The new Trump administration has promised half a trillion dollars for AI infrastructure and UK prime minister Keir Starmer plans to ‘turbocharge’ AI in the UK. Predictions of our future with this new technology range from doom-laden apocalypse to techno-utopian superabundance. The only certainty is that it will lead to dramatic personal and social change. 

This technological impact feels even more dramatic given the relative simplicity of its components. Huge volumes of text, images and videos are converted into vast arrays of numbers. These arrays are then pushed through repeated processes of addition, multiplication and comparison. As more data is fed into this process, the numbers (or ‘weights’) in the system are updated and the AI ‘learns’ from the data. With enough data, meaningful relationships between words are internalised and the model becomes capable of generating useful answers to questions.
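The loop of multiplication, addition and comparison described above can be sketched in a few lines. Every number here is invented for illustration; a real model has billions of weights, not two.

```python
import numpy as np

# Toy sketch: data becomes numbers, and repeated rounds of multiplication,
# addition and comparison update the 'weights' until the output fits the data.
x = np.array([1.0, 0.0])        # a tiny numeric input (stand-in for encoded text)
w = np.array([0.5, -0.2])       # the model's weights, initially arbitrary
target = 0.8                    # the output the training data says we should get

for _ in range(100):
    prediction = w @ x            # multiplication and addition
    error = prediction - target   # comparison against the data
    w -= 0.1 * error * x          # the weights are updated: the model 'learns'

print(round(float(w @ x), 3))  # → 0.8: the prediction now matches the data
```

The point is only that nothing mysterious happens at each step; the apparent intelligence emerges from repeating this simple arithmetic at enormous scale.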

So why have these algorithms become so much more powerful over the past few years? One major driver has been to take inspiration from human attention. An ‘attention mechanism’ allows very distant parts of texts or images to be associated together. This means that when processing a passage of conversation in a novel, the system is able to take cues on the mood of the characters from earlier in the chapter. This ability to attend to the broader context of the text has allowed the success of the current wave of ‘large language models’ or ‘generative AI’. In fact, these models with the technical name ‘Transformer’ were developed by removing other features and concentrating only on the attention mechanisms. This was first published in the memorably named ‘Attention is All You Need’ paper written by scientists working at Google in 2017. 
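The attention mechanism described above can be sketched in its scaled dot-product form, the version used in Transformer models. The three token vectors below are made up for illustration; real systems attend over thousands of learned embeddings.

```python
import numpy as np

# Minimal sketch of scaled dot-product attention: every position scores every
# other position, so a token late in a text can draw on ('attend to') tokens
# from far earlier in its context.
def attention(queries, keys, values):
    scores = queries @ keys.T / np.sqrt(keys.shape[-1])   # compare all pairs of positions
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True)        # softmax: scores become proportions
    return weights @ values                               # blend the values by those proportions

# Three 2-dimensional vectors standing in for words spread across a chapter.
tokens = np.array([[1.0, 0.0],
                   [0.0, 1.0],
                   [1.0, 1.0]])
out = attention(tokens, tokens, tokens)   # self-attention: the text attends to itself
print(out.shape)  # → (3, 2): one context-blended vector per token
```

Because every token is compared with every other, distance in the text no longer limits what can inform the output, which is the ability the article credits for the success of large language models.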

If you’re wondering whether this machine replication of human attention has much to do with the real thing, you might be right to be sceptical. That said, this attention-imitating technology has profound effects on how we attend to the world. On the one hand, it has shown the ability to focus and amplify our attention, but on the other, to distract and suffocate it. 


A radiologist acts with professional care for her patients. Armed with a lifetime of knowledge and expertise, she diligently checks scans for evidence of malignant tumours. New AI tools can amplify her expertise and attention: they can automatically detect suspicious patterns in the image, including very fine detail that a human eye could miss. These additional pairs of eyes free her professional attention for other aspects of the scan, or of the job.

Meanwhile, a government acts with obligations to keep its spending down. It decides to automate welfare claim handling using a “state of the art” AI system. The system flags more claimants as being overpaid than the human employees used to. The politicians and senior bureaucrats congratulate themselves on the system’s efficiency and resolve to extend it to other types of payments. Meanwhile, hundreds of thousands of people are forced to pay non-existent debts. With echoes of the British Post Office Horizon scandal, the Australian Robodebt scandal of 2017-2020 was due to flaws in the algorithm used to calculate the debts. A properly functioning welfare safety net needs public scrutiny, and a misplaced deference to machines and algorithms suffocated the attention that was needed.

These examples illustrate the interplay between AI and our attention, but they also show that human attention has a broader meaning than just being the efficient channelling of information. In both cases, attention is a moral act, directed towards care for others. There are many other ways algorithms interact with our attention – how social media is optimised to keep us scrolling, how chatbots are being touted as a solution to loneliness among the elderly, but also how translation apps help break language barriers. 

Algorithms are not the first thing to get in the way of our attention, and keeping our attention on the right things has always been a problem. One of the best stories about attention and noticing other people is Jesus’ parable of the Good Samaritan. A man lies badly beaten on the side of the road after a robbery. Several respectable people walk past without attending to the man. A stranger stops. His people and the injured man’s people are bitter enemies. Despite this, he generously attends to the wounded stranger. He risks the danger of stopping – perhaps the injured man will attack him? He then tends the man’s wounds and uses his money to pay for an indefinite stay in a hotel. 

This is the true model of attention. Risky, loving “noticing” which is action as much as intellect. A model of attention better than even the best neuroscientist or programmer could come up with, one modelled by God himself. In this story, the stranger, the Good Samaritan, is Jesus, and we all sit wounded and in need of attention. 

But not only this: we are born to imitate the Good Samaritan’s attention to others. Just as we can receive God’s love, we can also attend to the needs of others. This mirrors our relationship to artificial intelligence: just as our AI tools are conduits of our attention, we can be conduits of God’s perfect loving attention. This is what our attention is really for, and if we remember this while being prudent about the dangers of technology, then we might succeed in elevating our attention-inspired tools to make AI an amplifier of real attention.

Join with us - Behind the Seen

Seen & Unseen is free for everyone and is made possible through the generosity of our amazing community of supporters.

If you’re enjoying Seen & Unseen, would you consider making a gift towards our work?

Alongside other benefits (book discounts etc.), you’ll receive an extra fortnightly email from me sharing what I’m reading and my reflections on the ideas that are shaping our times.

Graham Tomlin

Editor-in-Chief