
Seen in Beijing: what’s it like in a surveillance society?

Cameras and controls remind a visitor to value freedom.
Image: a guard stands behind a barrier across an entrance to a station escalator. A Beijing station gate and guard.

The recent Archers’ storyline wouldn’t have worked in Beijing. Here, great gantries of traffic cameras see into cars and record who is driving, so a court case which hinged on who was behind the wheel would not play out in months of suspense. The British press periodically runs stories on how much tracking and surveillance we are subject to, while the success of the TV series Hunted showed just how hard it is to evade detection, and how interested we are in the possibility—but how often do we stop to think about the tensions inherent in the freedoms we enjoy?

It is difficult to explain to someone in China just how free life in Britain is; it is equally difficult to convey to people at home how precious, and how conducive to social good, that freedom is. Take my recent experiences. Prior to entry at Beijing airport, I was randomly chosen for a health check and required to give a mouth swab. This may have been a benign Covid testing programme, but it was impossible to tell from the questions we had to answer on screen, and a mouth swab certainly hands DNA to the authorities. At the university where I was studying, face scans are required for entry at every gate, and visitors must be registered with state ID in advance. Despite not having been in China since before Covid restrictions, my face had been pre-programmed into the system and an old photograph flashed up on screen as the barriers opened.

The first time I used a rental bike to cycle back to campus (the local Boris-bikes come on a monthly scheme, linked to a registered phone number), a message flashed up on my phone telling me that I had gone the wrong way down a one-way bike lane. The banner appeared twice, and the system would not let me lock the bike until I had acknowledged my error. The fact that the GPS system tracks the bikes so closely it knew I had gone against the traffic flow for a couple of hundred metres to avoid cycling across a four-lane street was a surprise. Since that phone is registered to a Chinese friend, such infractions are also potentially a problem for him. What was less surprising is the systemic nature of China’s ability to track its people at all times.

Walking through my university campus, where every junction has three or four cameras covering all directions, I occasionally wonder where students find space to have a quick snog. 

No one uses cash in cities in China; in many outlets and places cash isn’t even accepted. Everyone uses apps like WeChat or Alipay to pay for goods—even at food trucks and casual stalls the vendor has a machine to scan a phone QR code. WeChat is WhatsApp and Facebook and a bank debit card and a travel service and news outlet rolled into one; Alipay, its only effective rival, offers similar. To obtain either account, a phone number is needed, and phone numbers themselves have to be registered. And to pay for anything, a bank account in the name of the individual must be linked to the account. In other words, the government can choose to know every purchase I make, and its exact time and place. A friend who works in a bank says he uses cash where possible because he doesn’t want his colleagues in the bank to see what he’s been buying.

Transport is also heavily regulated. To enter a train station, a national ID card is needed, which is scanned after bags are x-rayed. To purchase a high-speed train ticket, a national ID card—or passport for foreigners—is required. It might be possible to purchase a ticket anonymously in cash from a ticket window outside the station for an old-fashioned slow train, but one would still need an ID card corresponding to the face being scanned to make it to the platform—and the train station has, of course, cameras at every entrance and exit. 

Cameras are pervasive. Walking through my university campus, where every junction has three or four cameras covering all directions, I occasionally wonder where students find space to have a quick snog. The only place I have not yet noticed cameras is the swimming pool changing rooms, which are communal, and in which I am the only person not to shower naked. There are cameras in the church sanctuary, and cameras on street crossings.  

Imagine being constantly reminded by human overseers that your activity in person and online is both seen and heard.

Even when not being watched, out in the countryside, the state makes its presence felt. On a recent hike in the hills, our passage triggered a recording every few hundred metres: “Preventing forest fires is everyone’s responsibility.” Once or twice is common sense; ten or twenty times a stroll is social intrusion. One can, of course, learn to ignore the posters, the announcements, the security guards on trains playing their pre-recorded notices as they wander up the aisles, and the loudspeaker reminders that smoking in the toilets or boarding without a ticket would affect one’s social credit score and imperil future train travel, but white noise shapes perception.

As a (mostly) upright citizen, I find there are many upsides to constant surveillance. People leave their laptops unattended on trains, since they will not be stolen. Delivery packages are left strewn by the roadside or by a doorway: anyone stealing them will be quickly found. There is almost no graffiti. I can walk around at night safe in the knowledge that I am exceedingly unlikely to be a victim of petty theft, let alone knife or gun crime. Many Chinese have horrified tales of pickpockets in European cities or crime rates in the UK, while young friends are so used to the state having access to phone data and camera logs that they barely notice. Most Chinese I know are very happy with the trade-off of surveillance for safety—and the longer I spend in Beijing, the more appealing that normality seems.

To those who have lived outside, however, the restrictions make for a more Orwellian existence. Any church group wanting to hold an online service must apply for a permit. A friend was recently blocked from his WeChat account for a period after using a politically sensitive term in a family group-chat. Not being able to access certain foreign websites, search engines or media (no Google, no WhatsApp and no Guardian without an illegal virtual private network) might be an irritation for a foreign resident but means a lifetime of knowingly limited information for a citizen. Not being able to access information freely means, ultimately, not being able to think freely, a loss that cannot be quantified. The elite can skip over the firewall, but many cannot.  

We have seen the dangers recently in the UK of limited information flow, and of social media interference by hostile players. Imagine never being able to know whether the information you are receiving is trustworthy—or being constantly reminded by human overseers that your activity in person and online is both seen and heard. Christians may believe in the benevolent and watchful gaze of God—but are rightly wary of devolving that omniscience to fellow humans.


Will AI’s attentions amplify or suffocate us?

Keeping attention on the right things has always been a problem.

Mark is a research mathematician who writes on ethics, human identity and the nature of intelligence.

Image: a cute-looking robot with big eyes stares up at the viewer. Robots - always cuter than AI. Alex Knight on Unsplash.

Taking inspiration from human attention has made AI vastly more powerful. Can this focus our minds on why attention really matters? 

Artificial intelligence has been developing at a dizzying rate. Chatbots like ChatGPT and Copilot can automate everyday tasks and effortlessly summarise information. Photorealistic images and videos can be generated from a couple of words, and medical AI promises to revolutionise both drug discovery and healthcare. The technology (or at least the hype around it) gives an impression of boundless acceleration.

So far, 2025 has been the year AI has become a big-ticket political item. The new Trump administration has promised half a trillion dollars for AI infrastructure, and UK prime minister Keir Starmer plans to ‘turbocharge’ AI in the UK. Predictions of our future with this new technology range from doom-laden apocalypse to techno-utopian superabundance. The only certainty is that it will lead to dramatic personal and social change.

This technological impact feels even more dramatic given the relative simplicity of its components. Huge volumes of text, images and videos are converted into vast arrays of numbers. These arrays are then pushed through repeated rounds of addition, multiplication and comparison. As more data is fed into this process, the numbers (or ‘weights’) in the system are updated and the AI ‘learns’ from the data. With enough data, meaningful relationships between words are internalised and the model becomes capable of generating useful answers to questions.
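For readers curious what “the weights are updated” looks like in practice, here is a deliberately tiny sketch: a single weight nudged repeatedly by gradient descent until it fits some toy data. All numbers and names here are illustrative only; real models adjust billions of weights in exactly this spirit.

```python
# Toy illustration of learning by updating weights.
# Data follows y = 2x; the loop should drive w towards 2.0.
data = [(1.0, 2.0), (2.0, 4.0), (3.0, 6.0)]  # (input x, target y)
w = 0.0                                       # the single 'weight'
lr = 0.05                                     # learning rate (step size)

for _ in range(200):                          # repeated passes over the data
    for x, y in data:
        pred = w * x                          # the model's guess
        grad = 2 * (pred - y) * x             # gradient of squared error w.r.t. w
        w -= lr * grad                        # the 'learning' update

print(round(w, 3))                            # prints 2.0
```

The whole of deep learning is, at heart, this loop repeated at colossal scale: compare a prediction to the data, compute which direction to nudge each number, and nudge.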

So why have these algorithms become so much more powerful over the past few years? One major driver has been to take inspiration from human attention. An ‘attention mechanism’ allows very distant parts of texts or images to be associated together. This means that when processing a passage of conversation in a novel, the system is able to take cues on the mood of the characters from earlier in the chapter. This ability to attend to the broader context of the text has allowed the success of the current wave of ‘large language models’ or ‘generative AI’. In fact, these models with the technical name ‘Transformer’ were developed by removing other features and concentrating only on the attention mechanisms. This was first published in the memorably named ‘Attention is All You Need’ paper written by scientists working at Google in 2017. 
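For the technically curious, the scaled dot-product attention that the ‘Attention is All You Need’ paper builds on can be sketched in a few lines. This is a simplified single-head version using numpy; the names (queries, keys, values) follow the paper’s terminology, and everything else is illustrative rather than a real model.

```python
import numpy as np

def softmax(x, axis=-1):
    # Subtract the row max before exponentiating, for numerical stability.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def attention(queries, keys, values):
    # Score every position against every other position, however far
    # apart: this is what lets a model link a word to context from
    # much earlier in the text.
    d_k = keys.shape[-1]
    scores = queries @ keys.T / np.sqrt(d_k)
    weights = softmax(scores)        # each row sums to 1
    return weights @ values          # weighted blend of the values

# Toy example: 4 token positions, each an 8-dimensional vector.
rng = np.random.default_rng(0)
tokens = rng.normal(size=(4, 8))
out = attention(tokens, tokens, tokens)   # self-attention
print(out.shape)                          # prints (4, 8)
```

Each output position is a mixture of all the input positions, weighted by how relevant the model judges them to be; stacking many such layers (plus learned projections, omitted here) gives a Transformer.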

If you’re wondering whether this machine replication of human attention has much to do with the real thing, you might be right to be sceptical. That said, this attention-imitating technology has profound effects on how we attend to the world. On the one hand, it has shown the ability to focus and amplify our attention, but on the other, to distract and suffocate it. 

Attention is a moral act, directed towards care for others.

A radiologist acts with professional care for her patients. Armed with a lifetime of knowledge and expertise, she diligently checks scans for evidence of malignant tumours. New AI tools can amplify her expertise and attention: they can automatically detect suspicious patterns in the image, including very fine detail that a human eye could miss. These additional pairs of eyes can free her professional attention for other aspects of the scan, or other aspects of the job.

Meanwhile, a government acts with obligations to keep its spending down. It decides to automate welfare claim handling using a “state of the art” AI system. The system flags more claimants as being overpaid than the human employees used to. The politicians and senior bureaucrats congratulate themselves on the system’s efficiency and resolve to extend it to other types of payments. Meanwhile, hundreds of thousands of people are being forced to pay non-existent debts. With echoes of the British Post Office Horizon scandal, Australia’s 2017-2020 Robodebt scandal was due to flaws in the algorithm used to calculate the debts. A properly functioning welfare safety net needs public scrutiny, and a misplaced deference to machines and algorithms suffocated the attention that was needed.

These examples illustrate the interplay between AI and our attention, but they also show that human attention has a broader meaning than just being the efficient channelling of information. In both cases, attention is a moral act, directed towards care for others. There are many other ways algorithms interact with our attention – how social media is optimised to keep us scrolling, how chatbots are being touted as a solution to loneliness among the elderly, but also how translation apps help break language barriers. 

Algorithms are not the first thing to get in the way of our attention, and keeping our attention on the right things has always been a problem. One of the best stories about attention and noticing other people is Jesus’ parable of the Good Samaritan. A man lies badly beaten on the side of the road after a robbery. Several respectable people walk past without attending to the man. A stranger stops. His people and the injured man’s people are bitter enemies. Despite this, he generously attends to the wounded stranger. He risks the danger of stopping – perhaps the injured man will attack him? He then tends the man’s wounds and uses his money to pay for an indefinite stay in a hotel. 

This is the true model of attention. Risky, loving “noticing” which is action as much as intellect. A model of attention better than even the best neuroscientist or programmer could come up with, one modelled by God himself. In this story, the stranger, the Good Samaritan, is Jesus, and we all sit wounded and in need of attention. 

But not only this: we are born to imitate the Good Samaritan’s attention to others. Just as we can receive God’s love, we can also attend to the needs of others. This mirrors our relationship to artificial intelligence: just as our AI toys are conduits of our attention, we can be conduits of God’s perfect loving attention. This is what our attention is really for, and if we remember this while being prudent about the dangers of technology, then we might succeed in elevating our attention-inspired tools to make AI an amplifier of real attention.
