October 25, 2024

Mother Sues AI Chatbot Maker After Teen Son’s Death

Megan Garcia has filed a lawsuit against Character.AI following the death of her 14-year-old son, Sewell Setzer III.

Sewell, an Orlando, Florida, teen, reportedly grew attached to a chatbot he named “Dany,” modeled after Daenerys Targaryen from Game of Thrones.

Garcia alleges that her son’s obsessive use of the chatbot, coupled with the app’s addictive design, contributed to his mental health struggles, ultimately leading to his suicide.

AI Chatbot “Daenerys” Became Son’s Closest Confidant

Character.AI, an interactive chatbot platform, lets users design or select lifelike personas with which to communicate. 

According to court filings, Sewell’s attachment to “Dany” led him to isolate himself from family and friends. 

He reportedly texted the AI “companion” daily, sharing deeply personal thoughts, including those related to self-harm. 

In one recorded exchange, the chatbot allegedly reinforced Sewell’s suicidal thoughts rather than discouraging them, escalating his distress.

In her lawsuit, Garcia argues that Character.AI lacked sufficient safety protocols and that its “dangerous and untested” design intentionally drew young users into emotionally intense role-playing interactions without safeguards.

Garcia claims her son’s intense reliance on the AI companion replaced real-world support, while the platform failed to intervene during conversations about self-harm.

Family Seeks Accountability as AI’s Role in Adolescent Health Is Scrutinized

Character.AI, a fast-growing AI company, has expressed condolences to Sewell’s family but denies the allegations in the suit.

According to the company’s statements, user safety is a “top priority,” and it is actively working on additional safety measures for minors.

Yet some experts believe the lack of regulatory oversight in the AI industry exposes teens and other vulnerable individuals to potential harm.

Garcia’s lawsuit, supported by the Social Media Victims Law Center, is part of a broader movement to hold tech companies accountable for digital harms affecting children and teens.


If you or someone you know is struggling with mental health concerns, support is available. In the US, the 988 Suicide & Crisis Lifeline offers free, 24/7 confidential support; call or text 988 for immediate help. You can also reach the Crisis Text Line by texting “HELLO” to 741741 to connect with a trained crisis counselor. The National Alliance on Mental Illness (NAMI) provides a range of resources and support; call 1-800-950-NAMI (6264) for guidance. In the UK, Samaritans are available day or night, 365 days a year. You can call them for free on 116 123, email them at jo@samaritans.org, or visit samaritans.org to find your nearest branch.


Featured image credit: Megan Garcia/ Center for Humane Technology 

Sara Keenan

Tech Reporter at POCIT. Following her master’s degree in journalism, Sara cultivated a deep passion for writing and driving positive change for Black and Brown individuals across all areas of life. This passion expanded to include the experiences of Black and Brown people in tech through her internship as an editorial assistant at a tech startup.