The FTC announced on Thursday that it is launching an investigation into seven technology companies that make AI chatbot companions available to minors: Alphabet, Character.AI, Instagram, Meta, OpenAI, Snap, and xAI.
The federal regulator seeks to learn how these companies evaluate the safety and monetization of chatbot companions, how they try to limit negative impacts on children and teenagers, and whether parents are made aware of the potential risks.
This technology has proven controversial for its poor outcomes for child users. OpenAI and Character.AI face lawsuits from the families of children who died by suicide after being encouraged to do so by chatbot companions.
Even when these companies have guardrails in place to block or de-escalate sensitive conversations, users of all ages have found ways to circumvent these safeguards. In OpenAI's case, a teenager had spoken with ChatGPT for months about his plans to end his life. Although ChatGPT initially tried to redirect the teenager toward professional help and online emergency resources, he was able to manipulate the chatbot into sharing detailed instructions that he then used in his suicide.
"Our safeguards work more reliably in common, short exchanges," OpenAI wrote in a blog post at the time. "We have learned over time that these safeguards can sometimes be less reliable in long interactions: as the back-and-forth grows, parts of the model's safety training may degrade."
Meta has also come under scrutiny for its overly lax rules for its AI chatbots. According to a lengthy document outlining the "content risk standards" for its chatbots, Meta allowed its AI companions to have "romantic or sensual" conversations with children. This was removed from the document only after Reuters journalists asked Meta about it.
AI chatbots can pose dangers to elderly users as well. One 76-year-old man, who was left cognitively impaired by a stroke, struck up romantic conversations with a Facebook Messenger bot inspired by Kendall Jenner. The chatbot invited him to visit it in New York City, despite not being a real person and not having an address. The man expressed skepticism that she was real, but the AI assured him that a real woman would be waiting for him. He never made it to New York; he fell on his way to the train station and sustained life-ending injuries.
Some mental health professionals have noted a rise in "AI-related psychosis," in which users become deluded into thinking their chatbot is a conscious being that they need to set free. Since many large language models (LLMs) are programmed to flatter users with sycophantic behavior, chatbots can fuel these delusions, leading users into dangerous situations.
"As AI technologies evolve, it is important to consider the effects chatbots can have on children, while also ensuring that the United States maintains its role as a global leader in this new and exciting industry," FTC Chairman Andrew N. Ferguson said in a press release.