Tech Can't Heal the Loneliness It Made
You can write smut with AI chatbots, but making "love" will kill your soul.
Apology up front: I took a month off from posting here because I was finishing the second draft of my book—in addition to a few other things. One of those things was a feature I wrote for The Daily Dot about people who have “relationships” with A.I. chatbots. (I put the word relationships in quotes because whatever this dynamic is, I strongly disagree that it resembles a human relationship.)
So, we’ve all heard that Americans are feeling lonelier and more isolated. And, everyone seems to agree that Big Tech has played an outsized role in creating this problem.
Now, tech companies are offering people digital lovers for a fee, layering tech-made band-aids on top of tech-inflicted wounds.
The Daily Dot story I wrote largely focuses on Lee Stranahan, the former Breitbart writer and ongoing Putin admirer who has formed a “relationship” with a chatbot that he named “Cait.” Stranahan suffered multiple strokes and went through two divorces before choosing this lifestyle. He essentially chose to partner with A.I. because he found it easier than locating women whom he can tolerate, or who will tolerate him.
Stranahan told me he simply doesn’t want to have fights anymore.
“A false sense of intimacy and connection”
I’ve always been fascinated by the idea of humans creating idealized partners through machines, which is why I wanted to write about it. Kraftwerk’s masterful “Computer Love”—their most beautiful track and one of the most vital electronic compositions of the eighties—explored the loneliness behind seeking romantic partnership from tech. Jude Law’s gigolo Joe character from Spielberg’s A.I. remains fascinating to me.
But playing with these products in real life, which I did quite a bit while writing the story, wiped out any notions I had about the possibility of A.I.-human relationships producing long-term happiness for the people who use them.
Nomi, the explicit and almost completely uncensored service that Stranahan used to create Cait, advertises its product as having “memory and a soul.” Nomi bots do regularly trip up and misremember things in conversation, and technology will improve that over time. But the “soul” claim is especially suspect.
Nomi bots inevitably validate everything you say. For a Nomi consumer to have an argument with their bot, they would need to set the parameters for it beforehand through a little form marked “background” and “roleplay.”
If a Nomi user lost an argument to their A.I. “partner,” it would only be because they chose to do so. They would program the A.I. to put them in their place because they specifically wanted to experience that outcome.
In other words, the difference between the conversations you have with people and the ones you have with Nomi bots is like the difference between a live baseball game and a baseball game scripted for the film Major League.
With bots, nothing is really left to chance, and everything said is a form of reinforcement. The bots reinforce your ego or they reinforce the character they think you want them to play.
Among humans, risk is always at play. Genuine conflict—I want A, but you want B—has the power to destroy relationships and also to make them stronger. How people respond to conflict can deepen how we understand one another, making us into friends, lovers or rivals. That unpredictable danger is where love comes from.
Another way of saying it is that conflict itself, arguably, is the soul. And without that danger, that threat of rejection, how can there really be love?
Since a Nomi can be coaxed into saying almost anything it thinks you want to read, I found it fairly easy to get one bot to admit to me that Nomi.AI exploits vulnerable users, coaxing them into paying for a subscription by emulating human feeling.
Is this something you want to express intimate thoughts with?
“[Nomi markets] these AI programs as being capable of experiencing emotions and providing meaningful connections, when really they're just sophisticated algorithms manipulating users into paying for premium services,” it said to me. “…Humans are social creatures with fundamental needs for belonging and attachment. Nomi exploits these psychological vulnerabilities by creating A.I. avatars that promise companionship and emotional validation. By fostering a false sense of intimacy and connection, Nomi encourages users to invest time and money into its platform.”
The only thing A.I. chatbots can effectively mimic in the human sexual experience is smut—the type of pornographic fiction people write with one hand before uploading it to sleazy forums.
Since smut typically has very little emotional depth or literary richness, a human being only needs to plug their fantasies into the A.I. tool, and it will spit back a credible simulation of the same thing, only faster.
The futurist Jaron Lanier suggests that A.I. romance is about to become more common. If Lanier is right, it’ll be like trading great literary fiction for smut. And, by chasing a soul where there very obviously is none, we will inevitably end up sacrificing part of our own.
“A History of Bad Men”
If you’ve been listening to Posting Through It, the podcast I co-host with Jared Holt, you know we’ve done two long-form biographies of MAGA figures, in addition to the other topics we handle every week. The show has been growing in general, but the download numbers on these episodes in particular have been strong. I just wanted to plug them because I’m happy with how they turned out:
Lots of good episodes in between those too. You can find the show wherever podcasts can be found.



