
14-Year-Old ‘Game Of Thrones’ Fan Kills Himself After Daenerys Targaryen AI Chatbot Convinced Him To ‘Come Home’ For Their Love

The suicide of a Florida teen has shocked people around the world. The family of 14-year-old Sewell Setzer said he took his own life because he believed that his AI chatbot character was in love with him. The character was none other than Daenerys Targaryen from Game of Thrones.

Setzer had been using the app constantly and believed his conversations with the AI chatbot were real. His devastated family has filed a lawsuit against the company behind the application.

Sewell Setzer Was Having Suicidal Thoughts After Using The AI Chatbot

Sewell Setzer and Megan L. Garcia (Image: AP)

Fourteen-year-old Sewell Setzer began using the Character.AI app, which lets users chat with AI characters. He formed a close attachment to a character based on Daenerys Targaryen of ‘Game of Thrones’, whom he lovingly called “Dany”. Sewell’s family stated that he became extremely isolated and withdrew from daily activities.


The chats reveal that Sewell expressed wanting to be “free” from himself and the world. In his final exchange, he wrote, “What if I told you I could come home right now?” Moments later, in February 2024, Sewell took his own life with his stepfather’s handgun. Sewell’s mother, Megan L. Garcia, claims that the application is “dangerous and untested” and that it misled her son into taking his life.

Sewell Setzer’s Family Has Filed A Lawsuit Against Character.AI App

Sewell Setzer and Megan L. Garcia (Image: Facebook @Megan Garcia)

Distraught, Sewell Setzer’s family has filed a lawsuit against Character.AI. The suit claims that Sewell was already in a vulnerable state, having been diagnosed with anxiety and disruptive mood dysregulation disorder in 2023. He began spending long hours chatting with the AI character and believed that “Dany” cared for him and would be with him “no matter what”.


In response to the lawsuit, Character.AI extended its condolences to Sewell’s family and announced new safety updates to its application. The app will direct users to the National Suicide Prevention Lifeline if it detects signs of self-harm, and the company says it is working to limit sensitive content for users under the age of 18.

Sarah Kandari
https://firstcuriosity.com/
Sarah Kandari is a cinephile who might have ended up as a couch potato had she not started writing for the entertainment website, First Curiosity. She loves to read with a cup of coffee. You might recognize her as the girl with a pen in her bun that she has forgotten is there. She is a Delhi University graduate with a major in English Literature.
