Study calls for cautious design of AI ‘deadbots’

News Excerpt: 

A new study urges caution in the development of Artificial Intelligence (AI) chatbots designed to mimic deceased loved ones, known as ‘deadbots’.

More about Study:

  • Deadbots, or griefbots, are AI-enabled digital representations of departed loved ones that simulate their language patterns and personality traits using their digital footprint, such as emails, social media posts, and voice recordings.
  • While the idea of conversing with a lost loved one may be appealing to those coping with grief, the study highlighted potential risks and the need for safety standards to ensure these technologies do not manipulate users or cause them psychological distress.
  • The world has witnessed a relatively marginalised niche of immortality-related technologies transition into a fully independent and autonomous market known as the “digital afterlife industry”, which is expected to grow further with the advent of generative AI.
  • The study presented three scenarios to highlight the potential risks of careless design of products that are technologically possible and legally realisable. 
    • These scenarios seem straight out of dystopian science fiction and underline the need for regulations and ethical frameworks to ensure these tools are used responsibly and prioritise the well-being of those grieving.
  • The first scenario describes a woman who uploads all the text and voice messages she received from her grandmother to the app to create a simulation.
    • She initially pays for premium services to chat with and call her deceased grandmother.
    • When the premium subscription expires, the deadbot begins sending her advertisements, leaving her distressed.
  • In the second scenario, a parent uploads all her data, including text messages, photos, videos, and audio recordings, and trains the bot through regular interactions, tweaking its responses and adjusting the stories it produces, so that her son can chat with it after she passes away.
    • However, the app sometimes provides odd responses that confuse the child.
      • For instance, when the son refers to his mother in the past tense, the deadbot corrects him, insisting that ‘Mom will always be there for you’.
  • The third scenario describes an elderly father who creates his own deadbot so that his grandchildren can know him better after he dies. However, he does not seek the consent of his children, whom he designates as the intended interactants for his deadbot.
    • One of his children does not engage with the deadbot and prefers coping with his grief by himself. But the deadbot sends him a barrage of additional notifications, reminders and updates, including emails. 
    • The other child begins to find herself increasingly drained by the daily interactions with the deadbot and decides to deactivate it. However, the company denies this request as the grandfather prepaid for a twenty-year subscription. 
  • The researchers provided design recommendations, such as developing procedures for 'retiring' deadbots, ensuring meaningful transparency about risks and capabilities, restricting access to adult users only, and following the principle of mutual consent from both data donors and recipients.
  • The researchers next plan to examine cross-cultural differences in approaches to digital immortality across three different locations, including Poland and India.
