Human Conversation

Photo by Korney Violin / Unsplash

Technology's Distortions of Language

Language is a vessel through which meaning is mutually constructed. From this shared imagination, we learn how others understand and aim to understand them. We also navigate how much of ourselves to put into this space. The imagination space is therefore negotiated through language: our thoughts remain our own, and we give away only what we choose.

There are good reasons to keep some ideas to ourselves. Sometimes we aren’t sure about our own idea. Sometimes we aren’t sure of the other person. We worry about rejection. Conversations make us vulnerable to social and intellectual wounds. But these risks are usually overstated.

As we exchange ideas, we build a world we temporarily co-exist in. At its best, this is a circle of playfulness that welcomes risk-taking and vulnerability. That can inspire us to be bold. Boldness requires connection and trust, built up over time by taking risks and seeing that we're still supported. The risks of communication help us discover how much of the world we can see, how much we can change, and who might help us with the work.

This holds even for the driest of conversations. With a human tax attorney, we still work with a participatory imagination: we have to imagine, for ourselves, the world of tax law, and we work to build an understanding of that territory with our attorney as a guide. 

When we use a chatbot, the language is there to help us feel supported. But that support is unearned, built into the system. Machine language is safe because it is one-sided. You can take risks with what you tell it, or what you make with it, because it isn't you. It's not even another person.

So we can write to a chatbot and read its language – but it is our own heads that make the story complete. That articulation of meaning arises from you. Unlike the conversation with a human, the chatbot is not working with you to understand and articulate an unformed idea. It's trying to capture your words and extrapolate meaning from them, based on what's most likely to happen next.

Some people argue that large language models like ChatGPT or Claude are using language the way you and I use language. But this is not the case. Chatbots use the structures of language in the same way, but for different reasons. They successfully mimic the mechanisms of communication, which gives rise to the illusion of thought. Naturally, we perceive this language as humans always have: we scan the words, looking for opportunities to draw out richer understandings of the ideas within the other mind. But there is no mind!

Having these conversations with a chatbot can be helpful for some things, but it’s also tricky. Many of the smartest people in the world do not know how to make sense of these conversations, and so they simply declare that the machine is intelligent because it speaks.

I don’t know what definition of intelligence they are using, but I think the intelligence is coming entirely from us. Intelligence isn’t just whether we can speak (or write), but whether we can form ideas and theories, however mundane or brilliant. Conversation used to be enough to tell us thinking was there. Now it isn’t. For some, this is evidence that AI is a revolutionary technology, but being revolutionary isn’t a de facto good.

LLMs have certainly transformed our relationship to language and images, but they have not yet revolutionized “intelligence.” On that, they have a long way to go. People keep saying we need to update our definitions of intelligence, and maybe that's good. It would be more practical, though, to redefine our understanding of a conversation. What used to be a dance of mutual world-building, a means of engaging in imaginative play, is no longer exclusively that.

Conversation as a Medium

Conversation has typically been distinct from media. A conversation is a mutually navigated way of seeing the world from another's point of view. Most media up until now has been designed to drive one point of view at you without taking your point of view back in. We work to understand these stories, whether for pleasure, for critique, or to gather information about the world.

But media stories, for most of us, are one-sided. We work to understand what is on the television, in the newspaper, or at the movies, but the television, the newspaper, and the movies never actively work to understand the meaning their consumers produce, or adapt in response.

We can do all kinds of things to “talk back” to these media streams, and most social media is about sharing our thoughts on that media stream with others. With social media today, everyone tells a story to an audience of people in a one-sided way. We imagine that audience through our platform, measuring responses through likes and shares. We create and evaluate the stories of others from a distance and we can talk back.

It is a common experience to post something and find that it has invited a lot of anger or derision. You might also participate in that cycle, commenting or sharing your displeasure about what you're seeing or reading, leaning into public displays of social policing.

This gets rewarded: social media platforms are designed to show you things that make you respond. They make money when you respond, when you mash refresh, when you share content that makes other people respond. So if you get angry and say so, that keeps people on the platform. Your anger is a product they sell, second-hand, to their advertisers.

The distance and indirectness of social media has cultivated in many of us a harshness toward people and, in turn, a fear of that harshness. It also instills the idea that conversations are one-sided and that the stories people tell are targets for commentary rather than collaboration.

In a conversation, we work together to understand the ideas in our minds, even articulate them for the first time together, unpacking perceptions of the world into a shared understanding. In social media, we see what someone has said, and then perform a response for other people. 

AI is different in that, when you speak directly to the chatbot, you shape its response directly. It is designed to riff, to extend the words you are writing into new ones. This can be kind of intoxicating in an age of significant meanness online, where many people are very bad at listening but great at sharing. A retreat to a chatbot designed to encourage your ideas and reflect them back to you? That sounds great. It also serves a purpose in drawing ideas out of your head and into language in ways that don't feel too vulnerable.

This helps explain the appeal of the AI chatbot for many people, but it’s different from a conversation. 

What is a Conversation?

In a conversation, you learn more about the other person, but the chatbot learns only about you. This can create the illusion of reciprocity – of sharing a little more of yourself as you learn that you will be supported.

But this is a distortion of that instinct to share with people. The chatbot is hijacking that instinct, creating the illusion of a listener. In fact, it is only a constantly updating map to new clusters of words. Nothing within the system knows you, nor does it know enough about the world to share a perspective that can expand your own.

The perception that the machine is listening is an illusion created in our heads. This means that we lose much of the value of conversations with other people who might point our heads, eyes, and thoughts to new and richer spaces beyond our previous experiences, or propose new understandings we can draw out from empathy for those experiences.

It means losing opportunities to know another person, and to build a fleeting collaborative space where ideas can flow and, perhaps, become more solid. In an ideal world, which exists and has long existed, these collaborations happen with many people. Some last a day, some last an hour, some last a lifetime.

When we reconnect with someone, we also reconnect to that small shared space of collaboratively constructed meaning. These spaces can hold entire worlds, and when we lose them, we can lose entire worlds of meaning. The joy of reconnecting with a long-unseen friend is the sudden and powerful revival of that shared world, and the pain of losing someone we love is the sense that this world has moved from a living space to a memory. We mourn the world, and revive it, in our own way, whenever we can.

Because AI has no inner world to share with us, the worlds we build with it exist in our minds alone. This doesn’t mean they’re terrible or bad for you, per se. But we are seeing people withdraw into this solitary world entirely.

When we are sad or depressed, we may ruminate to the machine, seeking support it cannot give. In response, the machine extends our words into new clusters and arrangements, creating the illusion that we are understood and that our thoughts are all we need. Sometimes, that is just what we need. But that is the extent of what the machine can do.

Many things exist only within our own minds — with this one chance we have, we ought to aim for rich inner lives, full of meanings we can barely contain, constantly pushing up against our ability to express them. This desire to express the borderlands of our inner life is what motivates us to seek new knowledge and create new forms of expression.

Good conversations are also exceedingly rare. It is a sad reality that most people have lost the skill to listen, and do not know how to build this space with other people. Many people generate one-sided conversations, especially when we are young or insecure about our own thoughts.

Some people take this status quo as evidence that all humans communicate one-sidedly at all times: a vision of human communication in which we sit and listen, and then find words that match the words the other person has chosen in order to appear as if we are listening.

The sad fact of the matter is that this is often true. There are at least two types of listening: one in which we work to get into the imagination of the other person, with language as the connecting terrain; and one in which we respond to the words being said without engaging deeply with the intent behind them. 

A Conversation-Shaped Tool

When we suggest AI is doing exactly what a person does, we dismiss the first definition of what is possible in a conversation in favor of what passes, every day, for the half-hearted exchange of meaning.

It's like saying that good conversations are never possible, and that mechanistic reinterpretation and remixing of words is all there could ever be. When we frame AI as a "partner" or "collaborator," we should recognize the ways we are closing our imagination to the possibility of connection.

Rather than two worlds within minds struggling to describe what those minds contain, as it is in the best of human conversation, a chat with a large language model is a projection of our own thoughts into a machine that scans words exclusively to say something in response. 

A chatbot will never share anything more with us than words. At most, it takes what you are saying as symbols and calculates how to rearrange those symbols. It is designed to mimic the structure of a conversation but cannot attempt to understand you.

AI is a conversation-shaped tool, used to create some of the benefits of a conversation in the absence of another person. But with too much dependency, it risks making real reciprocity, sharing, and vulnerability even rarer. We ought to strive for the opposite: to use our conversations to create meaningful connections with others.

When we don’t, our already weakening skillset for connection and empathy might atrophy even further, as we resign ourselves to expectations of superficial exchange. When we do, we make the world larger and more richly connected, and our lives more worth living.


Still from "Human Movie"

London: "Human Movie" Screening @CIFF

Tue, Sep 16th, 7:30 PM @ Arding Rooms

Very excited to have "Human Movie" screening in London this month as part of the Clapham International Film Festival's "Technomancer" night among a selection of short films focused on finding novel aesthetics and points of view in and about technology.