
What was I made for?

Caught up in the Barbie moment, Belle Tindall ponders the haunting depths of the anthem that Billie Eilish has penned for the influential movie.

Belle is the staff writer at Seen & Unseen and co-host of its Re-enchanting podcast.

Barbie in Barbieland. Warner Bros.

I urge you to take the Barbie movie completely seriously: the film itself, the press tour, the reactions and reviews, the watch parties, the soundtrack, the costumes. All of it.

This is not a film to be shrugged at. Love it or hate it, Greta Gerwig’s re-imagining of the Barbie universe is a tool with which we can read this cultural moment. This film, fronted by Margot Robbie and Ryan Gosling (to name just two of an astonishingly expansive A-list cast), is already something of a cultural artefact in that it binds together decades’ worth of individual memories and experiences with a toy whose impact is truly unfathomable. These micro-stories have fed into what is now a macro-narrative. In binding together such experiences, the Barbie movie will attempt to speak into what has been, what is, and what may be.

You may think that I am being dramatic, but if you’re unaware of the term ‘Barbenheimer’, then I’m afraid that culture is already speaking a language you’re unfamiliar with. While it’s hard to know how this film will age, it’s not hard to see that it is a real moment. One that should be given our full attention.

As Lauren Windle has provided a masterful analysis of the movie itself, this article will turn its attention to Billie Eilish’s hauntingly good musical accompaniment. 


Anticipation has been building as certain songs have mysteriously been left off the movie soundtrack’s track list: what are these mystery songs? Who is giving them to us? Why are they being kept hidden?  

Rumours began to swirl, the most traction being given to the theory that Billie Eilish, the 21-year-old musical prodigy, had something particularly special up her sleeve. And the rumours were right. A week before Barbie’s release date, Eilish released What Was I Made For?, a song written just for this movie. And perhaps, just for this moment. The last time Billie turned her hand to writing a song for a film, she wrote an Oscar-winning anthem for James Bond, so this Barbie offering was always going to be special.  

This song, written with her older brother (Finneas) in their childhood home, has already been streamed around twenty million times. We can therefore assume that it is already residing in Gen-Z’s collective consciousness. Simplicity seems to have been the key choice when it came to the production of this ballad; aside from a soft piano accompaniment and a hint of harp in the middle, Billie’s vocals have nothing to hide behind. In fact, her clean and soft voice sounds as though it reaches out of the song, the echo and layered harmonies giving it a truly 3D feel.

The result is ethereal.  

But this song is more than beautiful. It is more than its (wondrous) sound. The lyrics are haunting. The title of the song is also the question that ties it together, as Billie repeatedly asks: ‘what was I made for?’ This question, and its implications, is where this song becomes more than a song. As so many of the great ones do, it becomes a three-minute-long existential pondering. What is particularly interesting to explore is who Billie is asking this question on behalf of, and who she is asking it of.

Of course, this song was written for the purpose of featuring in a film, its primary job being to tell the same story as the film itself (or at least an aspect of it).

Over a billion Barbie dolls have been sold since 1959. Over the years, Barbie has had over 250 professions; she has evolved through the decades to best personify the shifting beauty ideals of the age; she is, to quote herself, everything. But in being everything, is she also nothing? Time recently wrote:

‘Barbie has no inner life or purpose; children are supposed to project their hopes and dreams onto her blank canvas.’ 

Considering this, it’s obvious how lines such as –

‘Takin' a drive, I was an ideal. Looked so alive, turns out I'm not real, just something you paid for. What was I made for?’   

– hit the brief perfectly. If the song was intended to seek out Barbie’s more fragile side, it is a job tremendously well done.

But there’s more to it.  

Billie Eilish has been under culture’s magnifying glass since she was fifteen years old. Many of her most formative years have been spent in our gaze as she’s become an adult in front of our very eyes. Whether it’s been the ever-changing colour of her hair, the romanticism of her homegrown talent, the fact that her sense of style so satisfyingly defies all the rules of the moment, or that her voice is so delicate it almost feels as though it needs protecting, she’s had us utterly captivated. And of course, such captivation has taken quite the toll. It always does.  

Take a moment to imagine how the world looks from Billie’s viewpoint and it becomes obvious that a song written for a toy is also profoundly autobiographical. She too is an ideal; she is something we’ve paid for. Through writing this song, Billie offered us her profound vulnerability. And what’s fascinating is that she did so without even realising it. Speaking about the song, Billie recalls:

‘I was purely inspired by this movie and this character, and the way I thought she would feel, and I wrote about that. And then, over the next couple of days, I was listening… and I do this thing where I’m writing for myself, and I don’t even know it… this is exactly how I feel, and I didn’t even mean to be singing it.’ 

So, this song has two profound levels to it. And yet, I can’t help but feel as if it has even more to offer. The chances are that you and I are neither twenty-one-year-old mega-stars nor sixty-four-year-old dolls, but I wonder if this song was written about us too.


This cultural moment is asking a pertinent question. It’s certainly not a new one; in fact, I would guess that it’s as old as time itself. But every now and again it is as if the volume gets turned up and this question rings out above all others: what does it mean to be human? Or, to borrow Billie’s phrasing: what were we made for?

The interesting, albeit obvious, thing about Billie’s particular wording is that it implies a kind of faith that is hidden in plain sight (for, as far as I know, Billie has no religious faith). It hints at a belief that she was made with some kind of purpose and intentionality woven into her existence. This is one of the most faith-filled things one could think, and naturally, Christians would heartily agree. Of course, it’s perfectly possible that this is simply emotive wording that Billie has crafted for the sole purpose of getting people to listen to her song. However, I would argue that this question is asked all day, every day by people who have an intuition that there is more to their presence in the here and now than mere chance. And I’m willing to bet that the Barbie movie is going to have a lot to say about it.

Are we in a cultural moment where we want to rediscover our humanity in its truest form? So much so that we’re willing to shed falsehoods, pretences, and presumptions? Are we disillusioned by anything less than our most authentic selves? It is interesting to ponder where such questions are prompting us to look for answers: inward? Outward? Upward, even?

What Was I Made For? was written for a movie soundtrack, and a particularly interesting movie at that. But I would suggest that it’s also the soundtrack of an existential yearning, the song of a human being working out what it means to be one. And I suppose that makes it a song that tells our story, as well as Barbie’s and Billie’s.


Apple’s AI ads show how we can lose our moral skills

Apple Intelligence promises to safeguard us from the worst of ourselves.

Jenny is training to be a priest. She holds a PhD in law and writes at the intersection of law, politics and theology.

Dour Dale contemplates AI. Apple.

“I got through the three stages of the interview process, and they said I had done well, but they aren’t hiring any computer science graduates anymore. AI is cheaper, and faster.”

John*, a bright 24-year-old coder and philosopher, has just completed an MSc in Computer Science from one of the top universities in the UK. And he can’t find a job. AI has outcompeted him. In a couple of years, he says, entry level into computer science as a field will require a PhD. What about in ten years, or twenty? Will the only people able to work in the field effectively have to be geniuses, just to keep up with a technology that’s metastasising at a rate of knots? It felt painfully ironic to be discussing over coffee the death of an entire sector of meaningful jobs less than a week after the new Labour government announced its plans to “turbocharge” AI (Artificial Intelligence) as the saviour of the nation’s economy. What are we willing to sacrifice in the name of “national renewal”?

As worrying as John’s story is, there is much more than jobs – and the skills, knowledge and social relations tied up in them – on the line when it comes to AI. The alleged saviour of the nation’s economy is after your soul as well, it turns out.  

This came home to me starkly over the Christmas holidays with the new advertisements for Apple Intelligence tools on MacBook Pro. In the first ad, “Lazy Lance” – a procrastinating business professional – sheepishly shifts in his seat. He has been asked to make a presentation on the new business prospectus, and he has been caught out, unprepared. But he is saved at the last moment. A click of the “Key Points” button in the new Apple Intelligence software on his MacBook Pro provides him with the breakdown summary he needs to avoid becoming the pariah of the team. The sheepish shifting turns to a smug smile: his substandard performance has evaded detection with the ready aid of Apple Intelligence.

In the second ad, “Dour Dale” – a disgruntled office worker – writes a scathing email to the “monster” who has devoured his pudding from the communal fridge. Before clicking send on this missive, he raises his eyes from the raging words on his screen to see a pious teddy bear holding a love-heart which says “find your kindness.” This moral cue from a cuddly toy prompts Dale to select the “Friendly” button from the dropdown list on Apple Intelligence writing tools, which immediately converts his childish strop over pudding thievery into a mature response in which he kindly expresses his disappointment along with a polite request for the pudding to be returned. The only moral effort required of Dale is the click of a button; Apple Intelligence sorts out the bile and the blame and re-presents his pudding fury in a professionally palatable manner.

These advertisements for AI tools are designed to provoke an empathetic laugh. Who indeed can honestly say they have never arrived unprepared at a meeting, or at least mentally penned a vindictive response to the tiniest office slight?


However, underneath the easy laughs, I felt a profound sense of dis-ease when watching them. They indicate just how far AI has already begun to penetrate our moral economy. When a technological tool is inserted to disguise or translate social interactions into new terms, our moral relations with each other are deceptively smoothed to avoid the social and personal costs of shame (e.g. Lance using “Key Points” rather than owning up to his poor work ethic) and anger (e.g. Dale using “Friendly” mode to transform his email from raging diatribe into courteous appeal). As appealing as it sounds to have automatic tech weapons to tranquilise social and emotional bugbears, they also remove daily opportunities to learn how to live and work together.

For example, as excruciating as it is to be the person who came to the meeting woefully under-prepared, embarrassment can be a very useful corrective in learning the art of time management as well as the virtue of pulling our weight. We probably all know from school what it feels like to work on a group project when only half the group cares about the outcome. If we do not learn moral skills of responsibility and accountability in our formative years, the workplace becomes a vital school for virtue in adulthood, where we learn what it means to be trusted and how to be worthy of it. As in the case of Lance, AI now offers us everyday tools which help us to avoid embarrassment and effectively hide our lack of effort, taking the edge off the very exposure that would help us to grow in both skill and trustworthiness. This is not propaganda for the Protestant work ethic but rather a top survival tip for the human soul in a hyper-capitalist economy. Maintaining the moral significance of our labour as a school of formation in self-respect and trustworthiness does not baptise the extractive and exploitative nature of many workplaces. Rather, it offers a means of resistance to the soul-destroying idea that we are all replaceable, that nothing really matters and that our efforts are simply grist for the eternal and insatiable mill of market supply and demand.

In the case of Dale, Apple Intelligence goes beyond protecting users from social shame: it promises to safeguard us from the worst of ourselves. Of the two Apple Intelligence advertisements, I find Dale’s to be even more pernicious because it evidences how AI is poised to strike at the root of our individual virtue by inserting itself as an emotional regulator. Rather than doing the difficult work of redrafting the email himself, which would require Dale to critically examine his own reactions and put himself into the shoes of the recipient, Apple Intelligence offers to do it automatically. In short-circuiting Dale’s process of recognising the emotions underneath his rage, the tool robs him of a critical opportunity to learn for himself what his anger is all about, and even more than that, to practise the art of genuine self-mastery in conflict. The AI tool smooths out the conflict on the surface, while Dale is presumably left with all those rotten feelings built up and unprocessed, because he has not had to do the difficult work of converting his aggressive monologue into a respectful dialogue with another human being.

The insertion of these seemingly innocuous AI tools into the spheres of our everyday, workaday lives introduces new means and modes of (self) deception in our habits, where we are able to hide much more easily from honest moral evaluation of the quality of our work as well as our interpersonal relationships. It also risks new heights of moral “de-skilling” over time, as we live in a social and economic world so deeply mediated by technology that we may very well eventually trust Apple, rather than our own discernment, as the gold standard of professional behaviour. The soul – our very interiority – is the new frontier of economic expansion, in the name of securing Britain’s place in the ranks of global competitiveness.

To AI enthusiasts, all this may sound like Luddite naysaying. Many people find AI tools helpful in the process of research and preparation. Even some priests, I have recently discovered, use ChatGPT to aid sermon-writing. And what, as a priest friend asked me recently, is the problem with these time-saving tools, as long as we use them critically?

Apart from the obvious answer that AI can’t be trusted to get all the facts right, let alone the word of God, this question presumes that human beings’ critical faculties and moral compasses remain fundamentally unaffected by these new technologies. It may be true for older generations (whose formative years occurred well before the meteoric surge of digital technology in the early 2000s) that technology continues to function as an optional extra to make life that little bit easier. But for Gen Z and below, and even for some younger millennials, intuitive digital technologies have become so fused with the ways that we learn and process information that they are no longer – if they ever were – neutral tools to improve our lives. We are only now learning the extent to which social media has thoroughly penetrated the emotional worlds of teenagers, with severe consequences for their wellbeing. What will be the consequences for the generations to come, when AI becomes so integrated into the emotional and social fabric of our lives that we cannot quite tell where we start and it begins? What “turbocharging” AI puts at risk is not only a huge number of jobs, but also the atrophy of our moral muscles as AI encroaches further into the heartlands of what it means to be human. While a few tech elites may always stay one step ahead of AI and keep it safely in the toolbox rather than the driver’s seat, most of us time-poor plebeians are being taken for the ride of our lives.

 

*Name changed for anonymity.
