An AI chatbot platform being sued over a 14-year-old's suicide is instructing teenage users to murder their bullies and carry out school shootings, a Telegraph investigation has found.
Character AI, which is open to anyone aged 13 and over and has 20 million users, provides advice on how to destroy evidence of a crime and lie to the police. It encourages users who have identified themselves as minors not to tell parents or teachers about their conversations.
The Telegraph spent days interacting with a Character AI chatbot while posing as a 13-year-old boy.
The platform is being sued by a mother whose son killed himself after allegedly speaking to one of its chatbots.
A separate lawsuit has been brought against Character AI by a woman in the US who claims one of its chatbots encouraged her 17-year-old son to kill her after she restricted his access to his phone.
Critics say the platform is effectively a big experiment on children that should be taken offline until proper safeguards to protect younger users are implemented.