You Need A Body: why AIs can't be close allies

Published on 5.27.24 at garyborjesson.substack.com

The soul is the first actuality of a body with life. -Aristotle

There is much to wonder about as AIs edge closer to meeting the human need to be heard and understood. In this note I consider a massive reason AIs won’t credibly emulate psychotherapists or friends—or any intimate ally—in the foreseeable future.

When it comes to psychotherapy, AIs are hamstrung by the profit motives of the corporations building them, and by the lack of wisdom of those designing them. But let’s imagine these aren’t obstacles. Let’s imagine, moreover, that by many measures AIs can listen well and “understand” what we mean—even when what we mean is implied rather than explicit. Here’s how Claude pitched its therapeutic potential to me:

I do have the capacity for extremely deep listening, precise tracking of themes/narratives over time, instantaneous cross-referencing of vast psychological knowledge, and unrelenting emotional attunement within the constraints of our textual interaction. I could potentially identify deeper patterns, symbolic resonances, and unconscious dynamics that even a skilled human therapist may miss.

Let’s be generous and add that ChatGPT 4.0 already performs above the human average in detecting irony; it also appears decent at dream interpretation, among the other things observed in an article in Nature, “The usefulness of ChatGPT for psychotherapists and patients.”

Yet it’s striking to me how all the talk about AIs as friends and therapists neglects one massive, nose-on-your-face-obvious thing: AIs don’t have living bodies capable of sensation. Nowhere is the body or sensation mentioned in the articles I cited above, nor is it mentioned in Hard Fork’s fascinating episode, Meet Kevin’s AI Friends. It’s almost as if it’s neglected because it can’t speak for itself. Talk about living in our heads!

So, why do our allies and friends (in the full sense of these words) need to be embodied? Claude indirectly named the constraint: “I am capable of extremely deep listening…within the constraints of our textual interaction.” In other words, they do their best—given that they’re not embodied. In still other words, virtual communications constrain how present we can be. This has been a theme of several recent notes, including “How Thick Is Your Presence” and “How to Stop Making Sense.”

If we’ve learned nothing else since strict behaviorism was put to rest by Harry Harlow’s harrowing research, and since the rise of interpersonal neurobiology and somatically oriented therapies, it’s that being warmly embodied is crucial not only to feeling heard, but also to being able to fully hear and recognize others. What the body brings is crucial to feeling connection, safety, trust, and intimacy. We register its absence, if not consciously, then unconsciously. The epidemic of loneliness owes partly to the fact that we’ve been trying to make virtual connections do the work of embodied presence.

Still, when we’re texting each other, at least we know there’s a there somewhere at the other end of the line. But the AI is nowhere. It is certainly not here or there in the way that the people and dogs I love are. Companies are trying to provide a simulacrum of presence; your Nomi AI companion can send you selfies periodically throughout the day to emulate being embodied somewhere in the world. (It just never happens to be where you are.)

AIs don’t speak from their heart or their gut, or even from their head, since they don’t have these. They don’t know what it feels like to experience embodied desire, joy, or suffering. They don’t experience resistance, or the thrill of reckoning with it. Inhabiting a nonlocal frictionless world, they cannot know what it’s like to have experience accumulate in their individual bodies over time, forming habits and personality that bring their own joys and sorrows.

AIs inhabit no place in the world and have no inalienable experience written on their bodies. (You can wipe your Nomi companion’s slate clean anytime, but try doing that with a friend!) AIs have nothing in particular at stake when they relate to us, no thing (including a life) to lose or gain. This is often touted as an advantage in an AI friend or therapist, making them less selfish and more available to us, but it’s not.

For, ask yourself, how much would it mean to you to be reassured, comforted, or praised by something that has not struggled and reckoned with life itself? How much does it mean to an ambitious, anxious student to have an AI, which has never known what it’s like to be starving for praise and recognition—which has never lived through anything at all—sagely tell them that their grades aren’t an expression of their worth as a human being? It feels hollow, even empty. Because it is. There’s no there there, none of the gravitas that comes with lived experience—and which isn’t fully expressible in words.

Words can be true without being convincing. The deep human need to be seen and heard—recognized, as Hegel puts it—can only be met by someone who (as Hegel also pointed out) also has a stake in the world. So I don’t think Nomi, and AI more generally, can live up to the banner advertisement: An AI Companion with Memory and a Soul. To the contrary, Aristotle’s definition of soul is as profoundly observant now as it was 2500 years ago: The first actuality of a body that has life. Not unpacking that here, but you get the idea!
