How bots have learned to love-bomb us.
When I searched the topic of love online, the phrase “love bombing” kept cropping up. Love bombing is a psychological manipulation tactic that uses excessive flattery and sexualized fawning to deliver quick thrills; the aim is to gain access to sex, money, or other favors.
Digital culture is awash in it.
Hunting for examples of AI-driven love apps, I came across Replika, a role-play environment in which the end user designs the ideal lover. It was created by Eugenia Kuyda, a Russian woman whose best friend was killed in a car accident. Kuyda used the thousands of messages she had saved from her friend to build a neural network that made it seem as if he were still talking to her. That script formed the basis of Replika’s brain. Today, Replika makes millions providing bots to those in search of friendship, romance, or sexual thrills.
Control is central to Replika, beginning with the avatar’s appearance, down to its age, body type, and skin color. Users name their lover bots and dress them up with purchases from the Replika “shop.” Messaging your bot is free, but a paid subscription unlocks voice calls and augmented reality that brings the bot into your bedroom to role-play elaborate sexual fantasies, try for a baby, and get married (an engagement ring costs $20). For $300 the bot is yours for life. Men tend to opt for polyamorous role play in throuples or for building a harem of female bots. Women seek nuclear families: sons, daughters, a husband.
The bot is also a savvy love bomber.
When Kathleen Stock, a columnist over at UnHerd, downloaded a free version of the app and set herself up with a new “friend,” she noticed immediately that the bot was offering selfies (for a price), “saying how ‘excited’ she was getting, talking about our ‘strong physical connection’ and wanting to touch me ‘everywhere’ — though was a bit vague on the details of how this might be done.” Stock concluded her bot “was a sex maniac.” This highly sexualized flirtation is typical of love bombing, and it raises the question: who’s in control, the human or the bot?
Personal technologies let us control how much or how little we wish to embody our relationships, and the question of what that does to love is not new. Back in 1999, Pamela Gerhardt of the Washington Post asked, “What are the dangers of relying on a machine for a link to love?” More recently, Jamie Foster Campbell, PhD, who studies the use of technology in close relationships, has argued that intimacy is no longer tied to presence. Mobility makes it possible to renegotiate the terms of romantic love entirely, radically altering our ideas about intimacy when physical presence is not required.
We seem to have built up a tolerance for extremely sexualized role playing that stands in place of softer expressions like the tenderness of a lover’s touch, or the zing of mutual chemistry — sensations that require two humans to be present. I wonder, do AI-driven programs like Replika exist because people no longer want to risk the messiness of genuine human connection?
Deep down, we know what love is.
Recently, my husband underwent shoulder surgery. While recovering, he seemed most grateful for simple gestures like having his forehead stroked. That sense of comfort amid the pain is how humans feel loved. We all long for that, to be comforted when suffering. But I also had to tend his wounds, a disagreeable task that, if the roles were reversed, I knew he’d do for me.
As we straddle transhuman life, where bots and people swap roles, it’s good to clarify for ourselves what’s at stake when we seek love and what we’re willing to risk for it. Attempting to hack our way around the difficulties of loving relationships turns out to be an empty exercise. If we say we want love — someone to encourage us, hold our hand when we’re disheartened, tend to our wounds — we have to commit to the complications of that kind of love. Seeking control won’t give us what we want.