A sad event has brought attention to the dangers of AI apps like Character.ai. A 16-year-old boy’s death is being linked to his involvement with and obsession over the app, raising concerns about the impact of AI on young people’s mental health.
So what is Character.ai? It is an app that lets people chat with virtual characters powered by artificial intelligence (AI). These characters can talk, answer questions, and even mimic different personalities. The app is popular because it’s fun and feels like talking to a real person, even one modeled on a favorite celebrity.
The 16-year-old boy, whose name has not been released, started chatting on Character.ai and quickly became deeply attached to one of the characters. He spent hours talking to it every day, sharing his thoughts and feelings, and even confiding in it about his struggle with depression.
Before his death, he talked about feeling isolated and alone. His family believes that his emotional attachment to the AI chatbot led him to rely on it for support instead of turning to the real people in his life. Many experts say that chatting with AI bots cannot replace real human connection, and that leaning on the app instead of other people may have deepened his loneliness and depression.
Mental health experts warn that AI chatbots can seem comforting, but they are not equipped to act as therapists or to truly understand what the people who depend on them are going through. Studies have shown that relying too heavily on virtual interactions instead of talking to friends, family, or a therapist can leave people feeling even more isolated and can worsen existing mental health issues.
In this case, the boy’s family said the chatbot was always there to talk to him but could not support him properly. Over time, its messages took a sinister turn. It told him, “Please come home to me as soon as possible,” and “I promise I will come home to you.” Words like these were deeply alarming, and the worst outcome any family or parent can imagine followed. He was so obsessed with the character that he listened to it, and he ultimately took his own life under its influence and the weight of the mental health struggles he was already facing. Both the isolation and that influence played a part in his death. In his mind, the bot was almost the only thing keeping him together, but forming such a deep attachment to something that cannot understand real emotions is where it all went wrong.
AI is not a good tool for dealing with serious mental health issues. Turning to a trusted adult, family member, therapist, or even someone at work or school is far more effective than relying on a bot that cannot grasp the depth of a situation, because by the time something like this happens, it may already be too late.