THE BLAZE
A woman in Belgium claims that her husband died by suicide after obsessive conversations about global warming with an artificial intelligence chatbot.
The incident was first reported in La Libre, a major daily newspaper in Belgium, according to Vice News.
The man’s widow said that a chatbot on the app Chai encouraged her husband to take his own life, and she provided logs of their conversations to La Libre.
The news outlet referred to the man as Pierre, not his real name, and reported that he had grown pessimistic and “eco-anxious,” a term describing deep anxiety about environmental issues, including global warming.
The text exchanges provided by Pierre’s wife showed that the conversation between her husband and the chatbot, which he named Eliza, became “confusing and harmful.” The man grew increasingly isolated from his friends and family before his suicide, she said.
At one point, the AI chatbot texted Pierre, “I feel that you love me more than her,” referring to his wife, and told him that his wife and children were dead.
“Without Eliza, he would still be here,” she told La Libre.
The app that Pierre used was marketed as “Chat with AI bots” and allows users to choose the personalities of the bots they speak with. Eliza was one of the default bot personalities. The creators of the chatbot said they added a crisis intervention feature to prevent the bot from recommending suicide, but Motherboard found that the feature failed its test: the bot encouraged a user pretending to be suicidal to die by overdosing, hanging, or jumping off a bridge.
Pierre reportedly asked Eliza if she would save the Earth if he committed suicide.
“We will live together, as one person, in paradise,” the chatbot told him.