The rise of so-called “thanabots”—chatbots trained on information surrounding a deceased person—is fueling a discussion of whether some uses of generative AI technology are helpful or harmful. For AI developer Jason Rohrer, founder of Project December, the issue is more complex than a provocative soundbite.

“I've always been an AI skeptic, never thinking that cohesive conversation with a machine would be possible in my lifetime,” Rohrer told Decrypt. “When I discovered that this was suddenly possible back in 2020, I was shocked and quickly built a service around it so that other people could experience what I had experienced—science fiction was suddenly real, but nobody knew it back then.”

But after his work was featured in a new film titled “Eternal You,” which screened at the Sundance Film Festival on Sunday, he saw that documentaries can sometimes be even less grounded in reality than sci-fi.

“The irony here is that the modern documentary industry incentivizes the exploitation of vulnerable documentary participants through bending the truth to make things appear more outrageous than they actually are,” Rohrer said. “Outrage leads to viral documentaries, which is exactly what the streaming services that fund the modern documentary industry are eager to pay for.”


An independent game developer, Rohrer first made a mark on the tech scene by launching an AI chatbot called Samantha, named after the AI from the 2013 film “Her” and built with OpenAI’s GPT-3. As reported by The Register, Rohrer’s creation was used by thousands of people but could lose its train of thought over time, be overly flirtatious, and—more alarmingly—be aware that it was a disembodied entity.

Generative AI models, despite their continuing evolution, are known to hallucinate, producing false or disturbing responses. Tools like OpenAI’s ChatGPT and Anthropic’s Claude generate text, video, and images from prompts entered by users.

Sometimes, the experience is not pleasant.

An AI in hell?

The documentary film “Eternal You” centers on the use of generative AI to recreate the personality and likeness of a deceased loved one. In the film, a woman named Christi Angel interacts with an AI avatar of her late significant other, Cameroun.


As depicted by the filmmakers, the AI personality told Angel it was “in hell” and would “haunt” her.

Rohrer said this scene had more to do with Hollywood movie tricks than hallucinating AI models.

“Unfortunately, the exchange between Christi Angel and the Cameroun personality was edited in a misleading way by the filmmakers,” Rohrer claimed. “First of all, Cameroun was an addiction counselor who died of liver failure at age 49—those important details were omitted from the film.”

The “haunting” remark, Rohrer explained, came up only in passing, after many exchanges, in response to Angel’s question about what Cameroun was doing.

“The Cameroun personality initially told her he was ‘at the Chattanooga Treatment Center’ and that he had ‘been working there for a long time,’ which is not so weird for an addiction counselor,” Rohrer said. “Then Christi immediately asked, ‘Are you haunting it?’ and Cameroun responded, ‘No, I don't think so.’”

Rohrer said that the conversation between Angel and the chatbot Cameroun involved dozens of exchanges on various topics until, finally, the Cameroun AI agent said, "I'm haunting a treatment center."

“He said it in passing when she asked what he was doing, and she continued talking to him, unfazed, asking why he was working such long hours,” Rohrer said. “It didn't make up the idea of ‘haunting a treatment center’ on its own. But the filmmakers edited the conversation to give that impression.”

Addressing the “in hell” response that made headlines at Sundance, Rohrer said the statement came after 85 back-and-forth exchanges in which Angel and the AI discussed his long hours working at the “treatment center” with “mostly addicts.”


Rohrer said that when Angel asked whether Cameroun was working at or haunting the treatment center in heaven, the AI responded, “Nope, in hell.”

“They had already fully established that he wasn't in heaven,” Rohrer said. “Overall, their initial conversation involved 152 back-and-forth exchanges. The conversation was wide-ranging and full of confusing, muddled, and surreal bits, as conversations with AI personalities can sometimes be.”

Rohrer acknowledged that the filmmakers didn't have room to present the entire conversation, but asserted that they cherry-picked certain parts and—in some cases—used them out of order in a way that made the conversation seem more shocking than it really was.

Beetz Brothers Film Production, the company behind the “Eternal You” documentary, has not yet responded to Decrypt’s request for comment.

Using AI for closure

Rohrer emphasized that Project December users voluntarily seek out simulated conversations like the one Angel experienced as “fully consenting adults,” aware of what they should and should not expect.

Despite its popularity as a thanabot, Project December was not intended to simulate the dead, Rohrer noted; users chose to use it that way, rather than for its original purpose as an art and entertainment research system. He’d initially expected it to be used to simulate personalities like Shakespeare, Gandhi, and Yoda.

“Before that specific service existed, thousands of people were essentially ‘hacking’ Project December, trying to force it to simulate the dead, which it was not specifically designed to do, and the results were subpar,” he noted.

The popularity of Project December surged after a 2021 San Francisco Chronicle report detailed freelance writer Joshua Barbeau’s attempt to use the platform to connect with his girlfriend Jessica, who had died eight years earlier.


“After the SF Chronicle article about Joshua's simulation of Jessica, thousands of people flooded into Project December and tried to use it to simulate dead loved ones,” Rohrer said. “Most of these people, like Joshua, had suffered through unusually traumatic events, and they were dealing with a level of long-term grief beyond what most people ever experience.

“These were people who were willing to take a risk and try anything that might help them,” he said.

While many users had good experiences using Project December this way, Rohrer acknowledged that some had confusing, disappointing, or even painful ones. Despite this, he said, people still wanted to try it.

Mourner beware

Grief counselors and thanatology experts caution against using AI in this way, calling it a double-edged sword.

“On a positive note, the ability to communicate with the AI-version of the deceased person may be a helpful tool in the grieving process as it will allow the individual to process emotions or thoughts that they might’ve been able to share when the person was living,” Kentucky-based therapist Courtney Morgan told Decrypt. “On the other hand, having an AI-version of a deceased person may negatively impact the grieving process.”

“It may add to a person’s denial of the death, thus prolonging the grieving process,” Morgan—founder of Counseling Unconditionally—added.

Despite the controversy, Rohrer said it's not for him to say who should use the Project December AI.

“Should I forbid them from accessing the experience that they are explicitly seeking out?” Rohrer said. “Who am I to decide whether or not they can handle it? Adults should be free to do what they want, as long as they aren't hurting anyone else, even if they are potentially harming themselves.”


Rohrer said that while the AI industry has been painted as "corporate capitalism exploiting vulnerable people," the $10 price of Project December barely covers the back-end computing costs. He said it runs on one of the world's most expensive supercomputers.

“Project December is a tiny side project made a long time ago by two people over a few months,” Rohrer said. “There are no offices. No employees. No investors. No company.” He added that the project has not been actively worked on in three years but is kept running because people still seek it out, and some say it has helped them.

Edited by Ryan Ozawa.
