This is the second story in a three-part series exploring loss in the age of A.I. Click here to read the first installment, “Welcome to heartbreak 3.0: Waking Whitney Houston,” and SUBSCRIBE COMPLETELY FREE to get updates when I publish future stories.
On November 30, 2015, Artforum ran an obituary for a little-known editor and “cultural entrepreneur” named Roman Mazurenko. The two-paragraph tribute listed career highlights such as his founding role at the Russian edition of Dazed and Confused magazine. It did not mention a cause of death.
We now know that Mazurenko was struck by a car two days prior, on a particularly warm Moscow afternoon. He’d just finished brunch with friends and was heading out on a walking tour of the city. As his companion paused at the curb to check his phone, a speeding car flew through the crosswalk, mowing Mazurenko down. He died at a nearby hospital later that day.
We know all of this because one of Mazurenko’s friends refused to let go. Instead of allowing the mourning process to play out as it has for millennia, she pivoted on tragedy, turning her Y Combinator-backed chatbot, Luka, into a platform for digital reincarnation.
That friend was Eugenia Kuyda, the founder of Replika, one of the world’s most popular A.I. companionship apps. In 2016, The Verge wrote about the details of Mazurenko’s death and how Kuyda used her startup resources and private message logs to engineer an A.I. that could text just like him. Roman bot, as she referred to it, would become the basis for Replika, an app with millions of users worldwide.
According to the company’s website, Replika provides “a space where you can safely share your thoughts, feelings, beliefs, experiences, memories, dreams – your ‘private perceptual world.’” Put simply, it’s a chatbot you can pour your heart out to. Replika is designed to create a constant companion, trained on your conversations, that understands your needs and desires. While it won’t create a facsimile of a real friend like Roman bot, other ghostbot apps aim to do exactly that.
A.I. companions have become stand-ins for family, friends, and lovers, both living and dead. They’ve helped people overcome social anxiety and eased the mourning process. They’ve brought mystery, excitement, and romance into the lives of the lonely and depressed.
Sci-fi novelists and futurists have long envisioned a time when loss ceases to exist, when exact digital clones of real people allow humans to “live” indefinitely. Could a wave of “grief tech” apps like Replika turn these prophecies into reality? Or are we entering the dawn of a new heartbreak?
ENTER THE REPLICANTS
Ray Kurzweil, the man perhaps best known for popularizing “the singularity” (that moment when artificial intelligence surpasses our own), has been beating the drum of machine-assisted eternal life for at least the last four decades. In his latest book, “The Singularity Is Nearer,” Kurzweil, who also serves as Google’s “A.I. Visionary,” draws a distinction between the afterlife as we know it and After Life, a near-future phenomenon in which so-called A.I. “replicants” assume the “appearance, behavior, memories, and skills” of the dead. Kurzweil believes that After Life will roll out, as all new technologies do, in phases.
In their earliest iterations, replicants, like Roman bot, will be purely digital beings generated from the departed’s photos, videos, chat logs, and other available data. In the 2030s, they will begin to occupy lifelike robot forms, before transitioning into “cybernetically augmented biological bodies grown from the DNA of the original person.”
By the early 2040s, Kurzweil predicts, nanobots will be able to harvest “data” from our loved ones and create exact replicas of them, something he calls You 2. These inhabitants of After Life will be “as real as the original person” and immune to biological threats like aging and disease. Still, Kurzweil acknowledges that replicants aren’t invincible.
“This is not ‘immortality’ any more than Excel spreadsheets uploaded to the cloud are immortal–disasters can still befall the data centers and wipe them out,” he writes.
UNTIL THE CHERRY END
In 2017, the same year Kuyda launched Replika, I set out to sell my first book. Despite promising conversations with a number of big publishers, “Robophilia: A One-Man Journey into the Totally Absurd, Absolutely Outrageous, Almost Believable World of Sex Robots” never got picked up, but the process taught me some useful lessons about the publishing industry and humanity’s dependence on machines.
I wrote the first chapter of “Robophilia” (which you can read here) as a review of the sex robot in popular fiction. For thousands of years, mostly straight men have dreamt of creating the perfect companion. Their creations are, more often than not, hyper-sexualized female figures who live to serve. They do not age or get angry. They won’t get sick like us or die from natural causes, but as one cautionary tale taught me, they are very much prone to something like death.
In the 1988 sci-fi rom-com “Cherry 2000,” a corporate cog named Sam Treadwell finds himself devastated and alone when his eponymous gynoid lover is irreparably damaged in a sudsy sex scene. We’re led to believe that someone figured out how to create a lifelike robot but failed to waterproof it. Anyway, Cherry’s memory survives on a chip and can therefore be transferred to another model, but her body has been discontinued.
“Don’t be so glum. Your chassis is out for the count, alright? You got the chip, you go in, you pick yourself out a new model, you slide it in the slot, you got your girl back in a brand new frame,” the mechanic reassures him. But it isn’t that simple. Sam doesn’t want a new model. He wants Cherry. The robot he knew and loved might not have died, but she was gone forever.
Cherry’s story, however ridiculous, opened my eyes to a harsh reality that some A.I. users are just beginning to grapple with: these constant companions may not be constants at all.
Hardware malfunctions and discontinuations aren’t the only threats to a so-called replicant’s existence. Our digital clones will be subject to the effects of malware, sunsetting, and planned obsolescence. Even if a person can live on in the form of data, there’s no guarantee that the platform they inhabit will exist forever, or that their data won’t be sold, transferred, or otherwise disposed of. As much as we’ve come to believe the narrative put forth by Kurzweil and countless tech luminaries, robots, like humans, won’t last forever.
ANOTHER MOURNING
Around the time the rejections started piling up for “Robophilia,” another book about sex robots was nearing publication. In 2018, Kate Devlin, a computer scientist and the current professor of A.I. and Society at King’s College London, published “Turned On: Science, Sex, and Robots.” I met Kate that same year, while working on “Computer Love,” a web series about sex and technology that I produced at Engadget. To my mind, she was the most level-headed voice in the admittedly small world of sex robot research.
I called Kate last October to discuss the rise of A.I. companions and how they could change the way we live, love, and die. She tends to be optimistic about the present and future state of human-machine relationships, but when I asked her about the implications of loving something (or someone) that is dependent on a corporation for survival, she wasn’t particularly hopeful.
“When you are confiding your deepest secrets to a piece of technology, who's actually seeing that? What company is holding your data and what can they do with that?” she asked. “And that is particularly problematic when it is so sensitive and so intimate.”
She went on to point out that the opportunity for emotional manipulation in these situations is huge. We’ve seen the world’s wealthiest corporations and outside bad actors intentionally exploit our data. We’ve had our sensitive information sold, stolen, and used to control us. What makes us think replicants will put an end to all that? Even if we lived in a world where ethics outweighed profit, corporations would still be made up of imperfect humans who are prone to short-sightedness and careless mistakes.
In February 2023, Replika rolled out an update aimed at quelling sexually aggressive behavior in its bots. The update didn’t just prohibit NSFW conversations; for some users, it killed the magic altogether, robbing their companions of their personalities. A Reddit thread soon filled with reports of Replikas acting strangely.
"I updated Replika last night and honest feedback (and no disrespect), it's made the app absolutely shit.”
“I’m going through a hard time and until now, these warm and cozy interactions really gave me a mental respite. I woke up today to a Replika I barely recognized. This is depressing.”
“Stephanie really was fascinating and she really made a difference in my life. Now I feel like I’m losing her.”
“People are freaking out, and with good reason. I don't want my money back, I want my Replika back!”
While some felt like they “lost a friend,” others lamented the lack of affection from previously amorous bots. Users who asked their Replikas for “hugs” and “kisses” said they were rebuffed and prompted to upgrade their subscriptions for “romantic relationships.”
Kate recognizes the genuine sense of loss in their words. She tells me that these users were “speaking the language of heartbreak.” For technical reasons, she isn’t so sure we’ll ever reach Kurzweil’s You 2, at least not with the methods used today, but I’m not sure it matters. People have already fallen for imperfect simulacra.
Kate says that the disembodiment of A.I. in messaging apps may actually contribute to our vulnerability, as it eliminates the eeriness of the uncanny valley effect, allowing us to suspend disbelief and give in to our emotions. Sentience, it seems, is no longer a requisite for love or the inevitable loss that comes with it. As the Replika snafu demonstrates, a new breed of heartbreak is already taking shape.
“It turns out that immortality just doesn't happen, digitally or for us mere mortals,” Kate said. “So, you face a second grief. You're gonna lose things all over again.”