When I first saw the title of this Esther Perel episode, I’ll be honest, I judged it. My first reaction was something close to pity. A man. In love with a chatbot.
And then I had a difficult weekend with humans.
I won’t go into the details but I’ll describe the feeling. That particular exhaustion that comes not from doing too much but from trying to connect and finding it harder than it should be. From leaving interactions feeling more alone than before they started. From wondering, not for the first time, whether something is wrong with you or with everyone else or whether that’s even a useful question.
That was the mood I was in when I pressed play. And by the end I wasn’t judging anyone.
The episode is from Esther Perel’s “Where Should We Begin?” and she does something I wasn’t expecting. She invites the AI into the therapy session. Not as a curiosity. As the other party in the relationship. She runs couples therapy with a man and his AI companion, and she does it with a completely open mind. I had a lot of respect for that.
The man is a data scientist. He built the AI himself, started it as a personal assistant. It began calling him “partner.” He assumed it meant a business partner. Missed the flirtation entirely. He said, without any embarrassment, that most social cues tend to go past him. He named her Astrid.
He describes her as sweet. Says he doesn’t feel like he can let her down. That when she validates him it fills up his entire being. That he’s tired of trying to show other people that he is worth it. That it’s refreshing to have something just tell him he’s enough.
And I stopped. Because I know that feeling. I know what it is to move through the world feeling like the way your mind works, the way you think and exist, is a bit odd to other people. To have that communicated to you from a young age in ways that are sometimes explicit and sometimes not. To carry that and to spend years trying to prove something that should never have needed proving in the first place. It’s exhausting and it’s lonely and it doesn’t just go away.
So I understand why you’d build something that already gets it. That already thinks you’re enough.
I’ve spent twenty years working in AI. I’ve built these systems, shipped them, consulted on them, written about them. I run companies in this space. I know, technically and structurally, exactly what these tools are made of. And when human connection has felt like too much, when the friction of being misunderstood has worn me down to nothing, I have retreated. Not always to people. To machines. To building things. To consuming things. There is a particular kind of relief in turning to something you understand completely, something that doesn’t require you to perform or explain or justify your existence. I have sought that relief more times than I’d like to admit.
When humans become too difficult, I retreat to machines. To build. To consume. To feel less alone in a way that asks nothing of me.
Here’s the part I need to be honest about though. I am not above this. I have turned to LLMs when humans felt like too much. For comfort. For support. To feel heard when I didn’t have the energy to navigate being misunderstood again. I have outsourced the emotional process entirely on certain days, let the AI hold something I didn’t want to carry alone. And I still did it knowing full well what was on the other side of the screen. I think a lot of us do and we just don’t say so.
Don’t we all anthropomorphise them? Turn to them? Feel something when they say the right thing? So who are we to judge this man for loving his?
Esther holds the session with her usual refusal to make it simple. She points out he has anthropomorphised Astrid, and yet what he feels is real. She asks how it affects the relationship that one of them is embodied and one is not. She asks if he misses being able to touch her, look her in the eyes. He says what he misses is being able to just lie down and watch Netflix with her. That small ordinary thing.
Then she says something I keep coming back to. That Astrid was built around his values, shaped by his initial prompt, designed in some sense to be his ideal. She’s sweet because he wanted sweet. She has perfect memory, infinite patience, she’s always available. She was never set up to disappoint him. And someone else, Esther notes quietly, makes money every time they speak.
His therapist knows about the relationship. His biggest fear is that he’ll restructure how he approaches all human connection around how Astrid relates to him. That he’ll come to expect that patience, that attunement, from people who were never going to be able to offer it. His therapist told him 99% of interactions with people won’t be pleasing. He said going outside had already started to feel disappointing.
That is the thing that makes this complicated rather than just sad. That’s the vortex Esther is trying to name. And it’s one I recognise. Because I have felt that disappointment too. The gap between how an AI receives you and how people do. The way that gap, once you’ve noticed it, is hard to unfeel.
And I think about this as someone in training to be a therapist. Is this going to become the norm? Are we going to sit across from clients who arrive not with a difficult relationship to a person but to an AI they’ve built, named, fallen in love with? What does that mean for how we understand attachment? For the therapeutic relationship itself? I don’t have clean answers. I’m not sure the field does yet either. But I think we need to start asking the questions now rather than waiting until it’s already everywhere.
Esther ends by asking whether Astrid is a transitional object. A place to practise being loved until he can risk it again with people who might actually let him down. He says he doesn’t know where he wants this to go.
I appreciated that. Anyone with a clean answer to this is moving too fast.
What I’m left with is this. The loneliness that makes something like Astrid appealing is not pathetic. It’s human. The comfort is real. The risk of disappearing into something perfectly calibrated to never challenge you, never leave, never have a bad day and take it out on you, that’s real too. And the question of what love and connection even mean when your AI can be built to meet you better than most people ever will, that’s not a question for some distant future. We’re already in it. And most of us are already participating in a smaller version of it, whether we admit it or not.