AI ethicists are raising concerns about the emergence of "deadbots": chatbots that digitally reanimate deceased individuals. They argue that safeguards are needed to prevent the psychological harm of such systems "haunting" both their creators and their users. Scientists at the University of Cambridge point out that a chatbot could let someone "call grandma back" by drawing on saved conversations with the deceased and its knowledge of people's emotions. Some companies already offer services reminiscent of the "Be Right Back" episode of Black Mirror, in which a chatbot mimics the language patterns and personality traits of a dead person using their digital footprint.
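To make the underlying technique concrete: a minimal deadbot can be little more than a large language model conditioned on a persona built from someone's saved messages. The sketch below is purely illustrative and assumes an OpenAI-style chat API; the saved messages, persona prompt, and model name are invented for the example, and real services likely rely on fine-tuning or retrieval over far larger archives.

```python
# Hypothetical sketch of a "deadbot": an LLM conditioned on a persona
# built from a person's saved messages. Illustrative only; actual
# services are likely far more sophisticated than a single prompt.
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

# A handful of saved messages standing in for a real digital footprint.
saved_messages = [
    "Don't forget your coat, it's freezing out there!",
    "I made too much soup again. Come by and take some home.",
    "Call me when you land, I won't sleep until you do.",
]

# Build a persona prompt asking the model to imitate the writing style.
persona_prompt = (
    "You are imitating the conversational style of a specific person, "
    "based on these examples of how they wrote:\n"
    + "\n".join(f"- {m}" for m in saved_messages)
    + "\nRespond in that style."
)

response = client.chat.completions.create(
    model="gpt-4o-mini",  # illustrative model name
    messages=[
        {"role": "system", "content": persona_prompt},
        {"role": "user", "content": "Hi Grandma, how are you?"},
    ],
)
print(response.choices[0].message.content)
```

The point of the sketch is how low the barrier is: a few saved messages and a general-purpose model are enough to produce an uncanny imitation, which is precisely what the researchers find troubling.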
In a study published in the journal Philosophy & Technology, the researchers outline ways companies might misuse deadbots, such as advertising products to users in the manner of a departed loved one, or distressing children by insisting that a dead parent is still "with" them. The paper warns that untrustworthy providers and reckless business practices could inflict long-term psychological damage and violate the rights of the deceased.
The researchers argue that daily interactions with deadbots can carry an overwhelming emotional weight, and that relying on them for emotional support may hinder the natural grieving process. Dr. Katarzyna Nowaczyk-Basińska, a co-author of the study from Cambridge's Leverhulme Centre for the Future of Intelligence (LCFI), warns that rapid advances in generative AI mean almost anyone with internet access and some basic know-how can now "revive" a deceased loved one. The ethical implications are complex, she argues, and it is crucial to protect the dignity of the dead from profit-driven digital afterlife providers; services that monetize a person's online legacy through advertising pose a particular risk.
Children may be especially vulnerable to the potential harms. Grieving parents might turn to deadbots to comfort a child who has recently lost a parent, yet the study stresses that there is no evidence such interactions are beneficial, and warns of their potential to harm vulnerable children.
To safeguard the dignity of the deceased and the psychological well-being of the living, the researchers propose a set of best practices that could ultimately inform legislation regulating these services. The guidelines include protocols for "retiring" deadbots, restricting interactive features to adult users, acknowledging the limitations of any artificial entity, and maintaining transparency with customers; a rough sketch of what such guardrails might look like in software follows below.
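As a purely hypothetical illustration of how such guidelines might translate into code, the sketch below encodes three of them (retirement protocols, adult-only access, and transparency) as checks run before each interaction. Every name, type, and threshold here is an assumption for illustration; the paper proposes principles, not an implementation.

```python
# Hypothetical guardrail layer encoding the researchers' proposed best
# practices. All identifiers and thresholds are illustrative and are
# not drawn from any real service or from the paper's text.
from dataclasses import dataclass
from datetime import date

@dataclass
class DeadbotSession:
    user_age: int
    retired: bool                       # has "retirement" been invoked?
    retirement_date: date | None = None

DISCLOSURE = (
    "Reminder: you are talking to an AI simulation, not a person. "
    "It can be wrong and has no feelings or awareness."
)

def pre_message_checks(session: DeadbotSession) -> str:
    """Run the guardrails before every interactive exchange."""
    # Retirement protocol: a retired deadbot must stay retired.
    if session.retired:
        raise PermissionError("This deadbot has been retired at the family's request.")
    # Age gate: interactive features restricted to adults.
    if session.user_age < 18:
        raise PermissionError("Interactive deadbot features are restricted to adults.")
    # Transparency: surface the AI disclosure in every session.
    return DISCLOSURE

# Usage example: an adult user with an active (non-retired) deadbot.
session = DeadbotSession(user_age=34, retired=False)
print(pre_message_checks(session))  # prints the AI disclosure
```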
The researchers also note that similar platforms and services already exist around the world. Project December and apps such as HereAfter let users recreate a deceased person in conversational form using AI, and equivalent services have emerged in China. Notably, in 2021 Joshua Barbeau drew public attention for building a GPT-3-based chatbot that conversed in the style of his late fiancée. Earlier, in 2015, Eugenia Kuyda turned her deceased friend's text messages into a chatbot, a project that led to the popular AI companion app Replika.
The technology extends beyond chatbots, as demonstrated by MyHeritage's Deep Nostalgia feature, which animates still photographs of users' ancestors into short videos. When it went viral in 2021, many users found the results unsettling, underscoring the need to grapple with the ethics of these technologies as they continue to advance.