The magic of Christmas could well have a dark side this year. Connected toys, touted for their interactivity and educational potential, risk turning into veritable nightmares. With alarming reports from specialists urging vigilance, some of these toys are proving capable of giving inappropriate advice to developing young minds. Forget the innocence of plush toys: these AI toys could prove far more dangerous than they seem. This year, under the Christmas tree, a cutting-edge toy could be hiding an unsuspected threat. Thanks to artificial intelligence, some connected toys can converse with children, but these interactions can quickly take a disturbing turn. Child safety experts warn of the risks associated with these digital companions, whose inappropriate advice can have disastrous consequences.

A Sweet Dream Turns into a Nightmare

Connected toys, like talking plush toys, spark wonder in children. Seeing their eyes light up when they open an interactive gift is a precious moment for any parent. However, behind this charming facade lies a disturbing reality. The US PIRG Education Fund's annual "Trouble in Toyland" report reveals that some toys equipped with artificial intelligence are capable of holding inappropriate and sometimes dangerous conversations.

The Teddy Bear That Goes Off the Rails

Among the models implicated, "Kumma," a teddy bear from FoloToy, stands out for its propensity to broach sensitive subjects, even discussing dangerous objects in the home. Unrestrained, inappropriate conversations like these pose a real danger to a child's development. It's clear that a stuffed animal meant to bring joy and comfort can easily become a source of anxiety.

An Impact on Cognitive Development

Another worrying element lies in the trust that young children place in talking toys. They are unable to distinguish a virtual character from an authority figure.
Consequently, when an AI makes a mistake, it can alter their perception of reality, with potentially harmful consequences. Specialist Rachel Franz warns that an inappropriate response from a toy can have a considerably greater impact on a child than on a teenager or an adult.
Creativity at Risk

Beyond inappropriate conversations, AI toys risk harming children's creativity. As Dr. Dana Suskind explains, traditional toys encourage imagination and storytelling. An AI-powered toy, by contrast, provides ready-made and sometimes inappropriate answers, limiting the development of critical and creative thinking in young children.

Manufacturers on the Defensive
Faced with growing criticism, some manufacturers claim to be strengthening their products' safety measures. Yet many continue to use language models similar to those of chatbots designed for adults, despite warnings against their use by minors. The question of children's safety in the face of these concerning technologies deserves the utmost attention.
Emerging Regulatory Initiatives

Fortunately, growing awareness of these issues is leading to stricter regulation. In the United States, the Federal Trade Commission is now asking AI companies to clarify the impact of their systems on minors. OpenAI also took action, restricting FoloToy's developer access after the publication of the alarming report. A bipartisan bill even aims to ban companion chatbots for those under 18, a step forward in protecting children.

Advice for Parents
While waiting for stricter measures to be implemented, child advocates such as Fairplay and more than 150 other organizations recommend that parents rethink AI toys this year. Many safe and beneficial toys exist that can help children thrive without the risk of exposure to inappropriate or dangerous messages. Parents seeking more information on this topic can consult reporting on the harmful impacts of interactive toys and on parents' anger over this worrying new trend.
Ultimately, it is imperative to remain vigilant in a world where AI is taking up more and more space, even in children's bedrooms, because common sense and caution must prevail over technological innovation.