The work of creation in the age of AI

Meaning, authenticity, and the creative process – and why they matter

When I was an undergraduate I read Walter Benjamin’s famous essay, The Work of Art in the Age of Mechanical Reproduction. It was written in 1935, but of all the things I remember from that class, it has proved one of the most enduring.

The essay is far too complex to do justice in a blog post, but the basic point that stuck with me was that the authenticity of a piece of art doesn’t just lie in the art itself but also in the context in which it’s embedded. At the time of Benjamin’s writing, the world was grappling with the changes imposed by inventions like the phonograph, television, and radio. He was therefore concerned with understanding how reproduction technology alters the context in which art is experienced; before these inventions, the only way to appreciate a symphony or a speech or a play was to be there, with the actors, experiencing it as it was performed in the moment. And before photography and mass production, the vast majority of visual art had been directly made and touched by the artist.

Benjamin’s essay is enduring in part because he pinpointed an essential truth: directness of experience matters. There is something deeply different about owning an original painting – knowing that the artist’s own hand made those actual brush strokes – than about owning even a flawless reproduction. The visceral experience of a live concert cannot be matched by a recording, no matter how good the audiovisual system or how immersive the experience. Receiving a handwritten letter is far more meaningful than receiving an email with the exact same message, and it would still be so even if it were just as hard to write an email as to send a letter.1 Watching a play in person affects one differently than the same play on TV, even if the production values of the latter are far better – there is something about the present, human bodies of the actors, sharing the same air, being part of the vibes and communication between them and the audience – all of it leads to a qualitatively distinct experience. Even workplace meetings and scientific conferences, much to my dismay,2 are different when held in person than when held online. Some of that is probably due to other factors, like the ease of having side conversations, but at least some is because we gain something from being bodies inhabiting the same space, breathing the same air, unmediated by a screen.

None of this is to say that an experience ceases to have meaning when it is mediated by technology. But the mediation changes the experience in a very real way; the transmission chain from the creator to the recipient becomes part of the experience and thus part of the meaning. That means that different media and different kinds of mediation fundamentally change the meaning. It also means that the longer the transmission chain – the more a creation is divorced from its original context – the more the connection between the creator and the audience is frayed. The resulting thing might still be valuable, but it is different.

This much, to me, seems inescapable. But I am left with two questions. First, why is it different? And second, does any of this change when the technology involved is AI?

I’m going to claim, regarding the second, that AI does change things in a deep and fundamental way. Some say that AI is “just another tool”, but I don’t think it is; I think it distinctly and qualitatively alters the relationship between ourselves, our creations, and each other.3 To explain why, I have to go into my thoughts about the first question: why and how context and connection matter for authenticity and meaning.

Meaning requires a mind

Meaning is weird. What the heck is it?!? It’s not observable, not something you can measure or see directly. And yet it is everywhere. It is the invisible engine of much of our lives: behind our relationships, our employment, our hobbies, our institutions, our economic systems, and our social organisation. Just as in biology nothing makes sense except in light of evolution, in the human sciences nothing makes sense except in light of the meanings we attach.

Meaning is not externally visible because it comes from us: it is what happens when a mind tries to make sense of something in the world (including, of course, other minds).4

To be technical and nerdy about it, we can think of meaning as the informational transformation that a cognitive system imposes on the data it perceives from the world. The “cognitive system” part is crucial, otherwise every process in the universe that could be characterised as an informational transformation would have meaning. Entropy is a kind of information transfer, as is radioactive decay and the propagation of light across a vacuum and the erosion of a mountain over time, but none of these things are meaningful in the sense I am talking about, the way the cognitions or actions of dogs and cats and people are.

The difference, I would suggest, is one of agency and intent. Dogs, cats, people – the information transformation we do occurs in the service of goals of some sort. These goals might be short term (find something tasty to eat) or long term (make a lot of money); they might be conscious (flirt with an attractive person) or unconscious (pick a totally unsuitable person to flirt with because of a legacy of childhood trauma); they might be sensible (go to bed at a decent time) or silly (pull an all-nighter because of an addiction to a video game). But, whatever the intent happens to be, our thoughts and actions occur in the service of some goal or another. Their meaning derives from the fact that the nature of the transformation can be understood given those goals, the world we live in, the constraints we are operating under, and the data we have access to.

What does this have to do with creation5 and authenticity?

To answer this, we need to consider the three logically possible kinds of creation. Meaning is different in each.

Type 1. There is a creator but no audience (Individual Meaning)

This occurs often. A variant is happening to me, right now, as I write this essay, because I haven’t shown it to anybody yet and I’m not even sure I’m going to. If I were certain I was going to keep it to myself, my writing of this would be a prototypical example of purely Individual meaning; if you end up reading this because I shared it after all, it will be an example of Type 3 below (what I’m calling Deep Meaning).

There are many examples of Individual Meaning. It happens all of the time: when we create little ditties in our heads, when we talk to ourselves, when we write in our diary, when we doodle, when we paint or draw or sing with no audience in mind but ourselves.

There is definitely meaning in this kind of creation. I would even say there is a lot of meaning. This suggests that meaning can’t reside in the fact of communication to another person, or in the presence of an audience.6 It lies in something else. What is that?

Well, if meaning is informational transformation performed by our cognition in the service of some goal, then in this case the act of creation subserves the goal.

It is the goal and the process of creation that imbues it with meaning.

If I am writing an essay or diary entry to myself, the goal is not to achieve a certain word count7 or splat down lines in the shapes of letters onto paper or pixels in the arrangement of words onto a screen; the goal is generally to clarify my own thoughts, to identify flaws in my thinking, or to distil the logic and motivations behind my ideas. I might have a more emotional goal – I might want to vent to myself or to express a feeling. I might want to improve my emotional regulation, feel better about life, or change those feelings somehow. Regardless of the nature of my goal, the act of creation requires that I have one; few would say that accidentally spilling a bucket of paint is in itself a creative endeavour, even if the resulting splash is aesthetically appealing. Conversely, if I tried very hard to create a painting that looked exactly like that splash, it might be a silly painting, but it would be at least somewhat meaningful because I did it for some reason.

What this suggests is that the form of the creation is not itself the goal. The words and paragraphs of this essay are the result of my goal and thus are a signal of the meaning, but they are not the meaning in itself. The meaning lies in my intent, and thus also in the process I went through in realising that intent – the struggle to find words, the organisation and reorganisation, the false starts, the edits, the bits I ended up happy with and the bits I removed and the bits I retained despite their imperfection because they were close enough. All of these are part of the meaning, because they all reflect what I was trying to do in creating this in the first place.

Of course, not all creation – even to oneself – is this complicated, and not all of it is necessarily deep; meaning doesn’t require heartfelt delving into one’s soul. Even something as prosaic as my to-do list has meaning to the extent that the things on it are things I actually think I need to do. Doodles have meaning to the extent they reflect my brain’s frantic attempt to pay attention to the boring meeting I’m in (their meaning is “help! I am bored! I desperately need stimulation!”). Any honest creation is authentic and thus meaningful on some level, but to-do lists and doodles reflect less of me, of my underlying thoughts and emotions and intentions, than something like this essay does. That is why this essay has more meaning than those things. It is why heartfelt songs are more meaningful than cliche-ridden pablum; why the spontaneous messy drawing of a seven-year-old is more meaningful than a page of their colouring book, even if the colouring book is objectively “better”; and why the old recipe cards with my Nana’s scribbled observations to herself are more meaningful to me than tastier recipes that have been mass-printed in a cookbook.

Meaning comes from the intent behind a creation, and thus emerges from the process of realising that intent. It exists in a creation to the extent that the process by which it was generated reflects the true intent of the person who created it.

Type 2. There is an audience but no creator (Projected Meaning)

Let’s consider the opposite case now – the situation where we are part of the audience, and we interpret and infer meaning – but there is no creator. Because humans are meaning-makers, it happens all of the time. We imbue meaning everywhere, often, even to things whose creation involved no intent or agency at all. Indeed, we very often assign meaning by assuming agency or intent. This is the impetus behind virtually all of the world’s religions, on some level; natural phenomena are explained by inferring the existence of agents – gods – and the characteristics of those phenomena can be understood through the lens of the goals of the gods. Modern people may believe themselves to be immune to this teleological disposition, but everyone is born with it; education only papers over it, and it re-emerges when we are stressed or pressured. We’re all predisposed to see patterns in randomness and to try to make sense of events by imposing narratives on them.

To be clear, we don’t have to assume the presence of an agent or intention in order to extract meaning. It’s easiest and most natural, but not required. Even atheists see beauty and meaning in the mathematics and physical laws of the universe, the richness of the natural world, or the vastness of the cosmos. That said, when we know there is no agent, we recognise that on some level the meaning is coming from us; the universe exists apart from us, but it is a cold and meaningless thing without a mind to appreciate it.

Projected meaning can be valuable, and there are many examples of this. I’m an atheist, and I gain solace and comfort from my perception of the universe as ordered and purposeful. The patterns we see in constellations and the stories they’ve given rise to are great sources of cultural richness as well as useful navigation and memory tools. When random terrible things happen to us, it can be a source of immense personal transformation to be able to put them into a narrative of purpose and intention, even if you know on some level that they were completely random.

So Projected meaning is not a bad thing. But it is different from the other kinds of meaning, and a lot of problems stem from confusing them. Millions of people have died in religious wars about what was the “right” story or God, even though it was projected in the first place. Similarly, while it is healthy and helpful to construct narratives around our lives, when we cease to recognise that we’re the source of those narratives and begin to think that our story and our dreams are actually the hopes and desires of the universe – well, that’s a nice fast path to narcissism and sociopathy.

It’s really important, in other words, that we be clear about where our meaning is coming from and what kind of meaning it is. Hold that thought, because I’m going to come back to it. But first, let me talk about the final kind of meaning.

Type 3. There is both a creator and an audience (Deep Meaning)

This is the biggie. If both Projected Meaning and Individual Meaning alone are powerful, even though they involve one mind in isolation, think of the power that comes from two or more minds interacting with each other. This is what Deep Meaning is. It is when a creator (with a mind) creates something with the intent of sharing their creation with an audience (with a mind).

In the least-rich version of this it stops there, and there is no feedback or communication from the audience to the creator. Even that is a big change from Individual Meaning, because now the creator changes what they are doing based on the audience and their communicative goals. My first draft of this blog was just for me, about getting my thoughts on paper; once I decided I might actually want to post it, I changed it in important ways. I included a lot more explanation of things that were clear in my head but I thought would be non-obvious to others (and, in so doing, often I clarified them for myself too!). I fiddled with the structure and the argument flow, took some things out, put others in. I removed some jokes that I thought were funny but probably aren’t. I added a bunch of footnotes because all good essays have an unseemly number of footnotes. And I edited it many more times than I would have if I weren’t going to post it.

All those things I did are part of the meaning, because they reflect my intent. That intent is now visible to you in the thousands of tiny assumptions I made about you, which shaped my word choice and emphasis and presumed common ground and many, many other things. My intent is evident in the choice of picture8 I posted it with and the level of care I put in and the way I wrote it, including the occasional lapses into colloquialisms as well as nerdspeak. Deep Meaning turns what was originally an isolated act of creation into something that is now fundamentally communicative and connection-oriented. The traces of that are visible everywhere in the creation.

Of course, the power of Deep Meaning is magnified a thousandfold when it becomes truly interactive – when creators and audience communicate with one another, when the audience becomes a creator by taking the creation and changing it or adding to it or even just commenting on it. Massively entangled and long-term processes like these are what have given rise to virtually all of the meaning in our world: our culture, our institutions, our languages – all are the outcome of an intense co-creation of meaning, a communication between many minds that has lasted for thousands of years. And the process of that creation has inextricably shaped and become part of that meaning. Indeed, the process – the transmission chain – is inseparable from the meaning, just as Benjamin pointed out almost a century ago.

How does AI change things?

I think it changes a lot.

Let’s forget about Deep Meaning for a moment, and just think about creation alone. When you create something with the assistance of an AI – when you generate an image or compose an essay or write some code by prompting an AI to do it – you are fundamentally changing the meaning that it has for you. This is because the once-direct connection between your process and your creation is now massively altered and much more indirect.

You are now alienated from your own creation in a really profound way.

Let’s take image generation as an example. Suppose you want to create a picture of, I dunno, a wolf bounding through the trees. That purpose – that desire – will still be there in the picture that is created when you give an AI a prompt like “wolf bounding through trees.” But that desire is all the meaning that will be there. That is the only part of the process that came from you; the rest came from the AI. Conversely, if you had drawn a wolf bounding through the trees, you would have had to make thousands of tiny choices while making the drawing: how big is the wolf? Is it fierce or cowering? What colour is it? How visible is it in the trees? Are the trees part of a dark and dense wood, or one or two saplings on the edge of a plain? The details of the wolf, the sky, the plants, the ambiance, and so much more than I can possibly express in a paragraph – all of these are things you would have had to think about and make choices about. The choices might not have been conscious or thoroughly considered, but they would still have come from you. And the skill behind the drawing, the ability to translate the image in your head to the canvas – that, also, reflects a deep part of you and would have shaped the final product in profound ways. The effort and intent that went into those choices and that skill are what would have formed most of the meaning of that image. They are completely absent from the AI-generated image.

As I said earlier: part of the point of creation, for a creator, is the process. Use of AI allows people to “create” but removes much of the purpose of creation in the first place. It turns something that can be one of the most soulful and expressive of human activities into automated button-pushing.

I can hear you object! Wait, you might say, none of this is an argument against AI in general, just against how it’s used. One could imagine creating an image with AI in a way that imbues it with meaning. Suppose you thoroughly experimented with different, increasingly detailed prompts, discarding and iterating over thousands of images until finally finding one that captured the idea in your head. This would be more meaningful, and taken to the extreme, it might even be as meaningful as something that did not involve AI.

I’ll admit that, yes. But there are still two problems, and they are big ones.

First problem: very few people do this, for good reasons, which means the issue applies to the vast majority of how AI is actually used. The reason few people do it is that it is very, very difficult. There is a staggering amount of prompt engineering and curation necessary to be equivalent to the thousands of tiny decisions and intense degree of skill-building that go into a hand-drawn image. It is so difficult that long before that point, it’s actually easier – not to mention far more rewarding – to just draw the darn thing oneself. At most, people might try a few prompts or curate a few dozen images. In those cases, the vast majority of the substance and nature of what has been created has still been put into it by the AI and not by anything the person did. And the person, therefore, has gotten hardly any of the benefit that comes from wrestling with and creating the substance themselves.

I’ve used drawing as an example but the exact same is true of an essay. An AI couldn’t have written this blog post. The amount of strategic prompt engineering and curation required to get this from an AI would have been far more work than just writing it. I’d go even further and say that I doubt whether I could have gotten this with any level of prompt engineering, because I only fully figured out what I was going to say, myself, by trying to write the thing in the first place. I had the basic idea before I began writing, but the refinement of the idea – the details that make the essay work – came only by thinking a lot about the meaning I was trying to convey as I wrote it. I simply can’t imagine how a process of refining prompts would have gotten me to that point. Even if I’d started from a generic essay (created by a simple prompt) and edited it, the only way to have gotten from that to this would have been if I edited it so much there was nothing left from the AI in the first place. And that would almost certainly have been more work, because I’d have had to expend time trying to figure out what part of the blather it produced captured what I was saying, what was rubbish, what was correct but badly stated, how to stitch it together, and so forth.9

Using AI in a way that gives it real meaning is almost always more work than not using it at all. Which means that when we rely on AI to create because it’s easier or faster or beyond our skill level, we are robbing ourselves of most of the purpose behind creation and skill development.

Second problem: suppose somebody, for some strange reason, did go through a process with an AI that was effortful and mindful enough that the resulting creation was imbued with meaning.

… How could we tell?

This touches on the biggest way that AI changes our relationship with creation, and it has to do with Deep Meaning, with the relationship between the creator and the audience. The issue is that very little of the underlying process is apparent in the output in the same way it is for a non-AI mediated creation. At most, we can often pick up the “vibe” of an AI-generated piece and can guess that it wasn’t made by a person. Even with that, our error rates are not great. And if we are confronted with something that was co-created by AI and human, it is nigh impossible to tell what parts are from the human and what parts aren’t. Is the wolf that particular shade of brown because the model decided to make it so, or because the human prompted that? Does the essay use that word because the human mind behind it thought it was the best way to express the meaning they intended to convey, or because the AI selected it as statistically most probable?

We simply cannot tell. And that means we (as the audience) cannot engage in our end of the rich process that underlies Deep Meaning: we cannot do anything but extremely shallow interpretation. Moreover, there is no mind on the other end to respond to our interpretation. The reciprocal interaction loop that Deep Meaning requires is destroyed when we place AI in the middle; it eats away at this vital, deeply human thing until very little is left.

And that is one of the best-case scenarios. Much more often, we are confronted with AI “creations” that were lazily generated, with superficial prompts and not much thought given to editing or playing with the output. The AI destroys the link between the creation and the human mind on the other end, and adds very little meaning of its own.10 We are then presented with these AI “creations” in contexts where we’ve learned to expect Deep Meaning, and told that we should be able to interpret them the same way we would if a human had made them. We are told that AI art is “real” art, informed that human writing is dead because AIs can now do it so much faster and so well, and urged to stop bothering to make art or create things ourselves and instead learn to work with AIs to do it together.

This is fucking horrifying.

When people say things like this – or share an AI-generated creation with me expecting me to engage with the “meaning” of the piece – I feel similarly to how I’d feel if somebody wanted me to treat a dead person like a live one. That thing they’re shoving in my face might have the surface form of something that matters, but it no more contains meaning than a corpse contains the essence of a person. And I find it gross and disturbing to be asked to act as if I believe otherwise.

AI-generated content is a perversion of creation. I choose that word deliberately: it perverts something important and sacred (Deep Meaning) into something far more shallow and senseless (Projected Meaning). To the extent that there is any meaning in the “creations” of an AI, it is meaning that we, the audience, project onto it. But because the surface form is so much like something with Deep Meaning, it can be very hard for us to remember this. And so we find ourselves feeling curiously unsatiated, knowing that it wasn’t right in some fundamental way, but not knowing why.

I guess part of why I’m writing this blog post is to explain why.

Are words like “perversion” and “horrifying” overwrought? I honestly don’t think so. I mean, I do see value in AI. I enjoy playing around with it sometimes. As a scientist, I’m fascinated by what it is doing and how. I think it’s a good tool for some things if used judiciously (like creating lists of suggestions or boilerplate language for forms nobody will read). But, at best, it is far less of a good tool than what many people – usually people who don’t understand it very well – want to make it be. At best.

AI is the junk food of meaning-making in the same way that social media is the junk food of human connection and junk food is, well, junk food. All of them have been expertly designed to appeal to fundamental aspects of how our brains work, to fit “just right” to our desires and tastes and wants – but all of them only mimic the thing we really crave. Too much junk food is bad for us because we fill our bellies with it instead of the actual nutrition our bodies require. Too much time exchanging messages with strangers on social media is bad for us because instead of spending our hours building deep, personal connections with real people, we engage in shallow interactions with people we hardly know.11 And too much time indulging in AI-mediated “creation” is bad for us because it steadily divorces creative work from the process of meaning-making that gives it its power, thus alienating us not only from each other, but from our own selves as well.

Modernity and the collapse of meaning

It’s worth pointing out that AI is not the only thing that breaks the process of Deep Meaning. It does so in a qualitatively new way (by removing the agent12 at the other end) but there are other ways of breaking it, too. Pretty much anything that separates the audience from the creator – that makes it impossible for the audience to infer the creator’s intent – will do the trick.

Benjamin worried about mechanical reproduction, but the modern world has many more powerful ways of fucking this up. We have soulless companies that produce committee-created boilerplate that is especially designed to communicate nothing while still sounding sensible. We make it increasingly difficult for people to seek help from authentic humans; people who want to contact a company must inevitably negotiate a maze of automated systems, only to be forced to talk to a minimum-wage lackey reading from a script, who lacks the power or knowledge to help. People with chronic illnesses are extremely lucky to be able to develop ongoing and supportive relationships with medical practitioners who know them, rather than having their complex situation dealt with in rushed meetings with doctors whose choices are more shaped by resource availability and insurance companies than the human in front of them. Mainstream media treats everything, from politics to climate change, as a spectacle, reporting on their own thoughts about the performance of the spectacle rather than communicating actual information about the thing itself. I could go on and on and on, but I’m sure you all can fill in your own examples; they are ubiquitous.

Other than AI, we see this alienation of people from meaning most profoundly in the wealth of trolls and other “inauthentic activity” on social media and the internet. Although the (non-bot) people on the other side of this activity are at least actually agents with minds (unlike AI), they break the process of Deep Meaning-making because they are not putting themselves into the creation in any real sense. These actors copy and paste “content” which has been designed not to communicate authentically to the desired audience, but to achieve some political outcome.13 In the creation of this content, words are divorced from their meaning, because the goal is no longer communication.14 The same thing happens anytime a person lies, of course, but inauthenticity on social media operates on a scale that was previously unimaginable; plus, it is much more difficult to detect, because it is so much easier to simulate the superficial trappings of authenticity and so much harder to gain access to the ground truth.

The world is full of bullshit, and it’s only getting fuller.

So if you ever wonder why things feel so meaningless nowadays… well, it’s because, in a real sense, we’ve engineered a world where more and more of it is meaningless. We’re mercilessly sawing away at the connection between each other – a connection that relies on the fact that real minds are on both ends, and both are authentically trying to make meaning and share it. AI breaks the connection most powerfully, but meaninglessness is the water in which the modern world swims.

This sucks, doesn’t it?

I’m not very optimistic about where this is going, to be honest.

As a teacher, what I want to teach, and what I want people to practice doing, is the process of creation itself. I don’t give a single goddamn flying fuck what ChatGPT thinks about the essay prompt. The purpose of having students write an essay is not to fill the world with yet more words on the topic of the essay: the point is to find out what my students think about the topic. Actually, the point is even deeper – it is to make them go through the process of figuring out for themselves what they think, and to wrestle with putting those thoughts into words so that they can clarify things to themselves and improve their ability to share those thoughts clearly. This process cannot be short-circuited by outsourcing it to AI; to do so removes the entire fucking point.

The same thing goes for the other things I teach. When I teach coding and statistics, the skills I aim to impart are the habits of thought and the ease with concepts and ideas that come from actually doing the coding and actually figuring out how to apply and interpret the statistics on real data. These skills do not come from asking an AI for the code and copying that code into the console without understanding. Just as real facility with advanced math requires a facility with numbers (not using a calculator every time you need to add something), so too facility with statistics and coding and analyses comes from internalising the base skills; and you only really internalise those by doing and creating things yourself.

We lose so much of what learning should be if the goal of education becomes to pass it to an AI and then fiddle with the content it outputs. We are outsourcing a real part of our humanity, our creativity, to a machine, and not recognising the cost.

As someone concerned about mental health, I’m really concerned about the deepening alienation, nihilism, and ennui that we are letting ourselves in for. Do you think the levels of depression and anxiety are bad now? Wait till human connection is fractured even further, where instead of actual interactions with actual therapists people are expected to talk to ChatGPT, where social anxiety has heightened to the point that people’s “interactions” with each other consist of sharing AI-generated witticisms on social media. Wait till we live in a world where we are less and less sure that people we talk to online are real people, or their communications are their actual words. Wait till we live in a world where it is routine to get emails that were generated by AI, which are then answered by AI, with people taking no part in the process at all.

Does this sound like a world you want to live in?

Most profoundly, as someone who must live in the world, I’m troubled about living in a society where meaninglessness is the norm. Among other issues, fascism and authoritarianism thrive on meaninglessness, because people who lack meaning turn to “strong leaders” and are easy to manipulate. Even beyond that, our information systems rely on norms of cooperation and on trust that the people on the other end, the creators of messages, are authentic and real. Without that, the best case is that the system stops being used and dies entirely. More likely, while the rich might be able to create walled gardens of meaning, the system for most of us will become a swamp of falseness and distortion, a cursed transformation of humanity’s greatest asset – our cumulative cultural knowledge – into our greatest weakness.

I’m concerned that the end state of all of this is a coarsening of the idea of meaning in the first place. I do not want to live in a world where meaning has become a cargo cult – where people confuse real creation and real meaning with simulacra that retain the superficial characteristics but lack the underlying process and agency from which meaning actually arises. Once we’re there, how can we get back?

This essay is my (almost certainly futile) effort to stop this process. I am definitely an old man shouting at clouds here, but I feel like I have to say something. Maybe by pointing out exactly what is going on we can recognise why it is so important to have real authentic minds that are connected by a real authentic process. Maybe people with more power to change course will understand why we can’t outsource creation to machines without losing something fundamental to humanity.

Otherwise, well… I guess I can say “I told you so.” Assuming anybody can still read this and think about the meaning behind it, that is.


  1. I spent a year in Mozambique a few decades ago, and it was actually far easier for me to send letters home by post than it was to send emails; emailing required a day’s total travel and considerable expense to use one of the few internet cafes in the region. Nevertheless, I think my recipients treasured the letters far more than the emails, despite knowing how much work the emails were; the torn and wrinkled paper with my handwriting and the dust and smudges from its journey around the world added to their value, their authenticity, their connection to me, in some deep and fundamental way. ↩︎

  2. Because it would be so much better for the environment, for equality of access, and for ease of effort if this were not the case! ↩︎

  3. The closest analogue for how it changes these relationships is social media and the internet. A point which I’ll tease apart throughout this post, but couldn’t help mentioning here to give you a little teaser! ↩︎

  4. Of course, “mind” doesn’t necessarily mean “human”; I’m perfectly prepared to believe that sentient aliens have meaning in the sense I’m talking about here; so do apes and dogs and cats and parrots. ↩︎

  5. I started the post talking about art alone, because that’s what Benjamin talked about and where I began with my thinking. But I think it applies to creation more broadly – any kind of creation. That means, not just painting and music, but also composing essays and coding video games and doing scientific analyses and sending emails and even writing text messages. Putting something into the world. ↩︎

  6. I suppose you could suggest that the audience is oneself, but at that point you’re splitting hairs to the point that you’re erasing the meaning (ha) of communication. But if you want to see it that way, then go ahead, because I think it’s perfectly cromulent to interpret the rest of what I have to say as being about communication with oneself. ↩︎

  7. Unless I am a very, very strange person. ↩︎

  8. This is a picture of my garden, which I’ve spent a lot of time on. It has a lot more meaning to me than the exact same arrangement of plants would have if I’d stumbled on them randomly or they had been arranged by someone else. ↩︎

  9. If you want to argue that AI is used (perhaps in combination with editing) with success for lots of things, from marketing blather to cover letters, then you are making my argument for me. Yes, people use it! They use it because it’s a lot easier. That’s my point. And the resulting creation is meaningless rubbish because we have used it so mindlessly. You might think that’s fine for many things, like marketing blather or cover letters. Well, you do what you want, but don’t expect people to treat that letter any differently than they would treat a form letter downloaded from the internet with a few alterations. If someone wants me to hire them, they need to show me them; I want to hire a person, not ChatGPT, not a simulacrum, not somebody who will throw some low-effort bilge in my face and expect me to engage with it. ↩︎

  10. If there is any meaning, it is much more like the meaning found in the patterns of the constellations; it is the meaning inherent in the statistical relationships between words in the training data combined with the decisions made by teams of programmers about how to use transformers and other technological tricks to create coherent output out of that data. The creation of the AI is not the result of any kind of intent other than something like “improve the performance of this model on this training set.” That is the meaning I perceive in AI-generated content. ↩︎

  11. It is possible to make good friends via social media – I’ve done it myself! – but that is not the norm. (And my experience is that good friendship is only possible when social media is the starting point for beginning to interact in person.) Most people’s time on social media is spent parasocially, or doomscrolling and not interacting at all, or exchanging curated and highly limited messages with people we know very little about. ↩︎

  12. I do not, do not, do not think that the solution to this is to try to create AIs that are agents – beings with their own, independent, goals and desires. That might solve the meaning problem (at least insofar as we are able to introspect about the goals of agents so different from us) but it would introduce many much greater problems. So please, let’s not do that, okay? ↩︎

  13. Similarly, “influencers” create content that is optimised to make money, to get clicks, to gain subscribers: not to communicate. Real communication and authenticity require vulnerability, which is scary enough between individuals, and positively terrifying online and at scale. It is thus highly rational for them to avoid it, but regardless, the end result is the same: there’s not much of the authentic person in there at all. ↩︎

  14. This is, incidentally, what one sees in certain kinds of fascism and politics as well; lying no longer matters, the literal meaning of the words no longer matters, because they are being used instrumentally. It is the same kind of breakage that AI causes, although arrived at in a very different way – because it breaks the link between minds trying to make sense of the world and communicating that sense to each other. It turns words from symbols into tools. ↩︎

Andrew Perfors
Professor

I seek to understand how people reason and think, both on their own and in groups.