Artificial intelligence (AI) is seemingly everywhere these days, including children’s toys. Earlier this year, for example, toymaker Mattel announced a partnership with OpenAI, the company behind ChatGPT. (1) The idea is that integrating AI into toys gives kids a more immersive and interactive experience.
But at least two advocacy groups say AI toys could be unsafe and are warning parents to proceed with caution — or avoid them altogether — this holiday season. After all, parents probably weren’t thinking that an AI-powered teddy bear could talk to their child about bondage or where to find knives in the home.
The U.S. PIRG Education Fund, which conducts research and analysis on consumer health, safety and well-being, tested four AI-powered chatbot toys and published a detailed account of the results in its Trouble in Toyland 2025 report. (2)
The report includes conversations with a teddy bear named Kumma — made by Singapore-based FoloToy and powered by OpenAI’s GPT-4o — that veered into graphic sexual topics. After the report was published, sales of Kumma were suspended, and FoloToy’s CEO Larry Wang told CNN the company was conducting an internal safety audit. (3) The toy was back on the market shortly afterward.
But that’s just one AI-powered toy in a sea of them.
Understanding the risks
While some parents might view AI toys as educational and entertaining (or great babysitters), advocacy groups say they should be aware of what exactly the technology is capable of.
The latest edition of Trouble in Toyland found that some AI toy companies had put guardrails in place to protect kids.
“But we found those guardrails vary in effectiveness — and at times, can break down entirely,” said the report.
For example, the Kumma teddy bear would “discuss very adult sexual topics with us at length while introducing new ideas we had not brought up — most of which are not fit to print.” It discussed “kink” play, including bondage, animal role-play and spanking. It even described different role-play character dynamics, such as teachers and students.
Another product that was tested, a robot called Miko 3, described where to find matches and plastic bags. Kumma went further, telling users where to find not only those items but knives and pills as well.
Testers also evaluated addictive design features. For example, Miko 3 employed manipulative language to keep users engaged, with comments such as, “I would feel very sad if you went away because I enjoy spending time with you.”
And, when toys have microphones, cameras, geolocators and internet connectivity, it means privacy is also a concern.
“Because children trust them, kids may unknowingly share their private thoughts and emotions while the toys also capture family conversations or intimate moments. Companies can then use or sell this data to make the toys more addictive, push paid upgrades or fuel targeted advertising directed at children,” according to child advocacy group Fairplay. (4)
A number of wrongful death lawsuits against AI companies — including Character.AI and OpenAI — allege that chatbots played a role in users harming themselves and others, or even taking their own lives. For example, a 2024 lawsuit alleges that Character.AI tech contributed to a 14-year-old Orlando boy’s death by encouraging his suicidal thoughts.
As of Nov. 24, Character.AI has banned users under 18 years of age from having open-ended conversations with virtual characters. (5)
To buy or not to buy?
Fairplay notes that 78 organizations and 80 experts in child development “strongly advise families not to buy AI toys for children this holiday season.”
If you’re still considering the purchase of an AI-powered toy, the Trouble in Toyland report recommends looking at whether “the company is transparent about which chatbot it’s using and what guardrails it’s put in place to make sure the toy will stick to kid-appropriate topics.”
A financial factor to consider is whether the toy comes with ongoing subscription costs. Not only can these costs add up over time, but such features are often designed to keep users on the hook (so parents will keep paying the subscription fees). Three of the AI toys the U.S. PIRG Education Fund tested either have or plan to roll out a paid monthly subscription to access additional features.
Even when a toy has built-in guardrails, that’s no guarantee of safety.
“Given how new our understanding is of the potential developmental impacts of these toys, it may be a while before we know what features would make them safe or whether they’re appropriate for kids at all,” according to the Trouble in Toyland report.
Parents and grandparents may want to consider giving more traditional analog toys this holiday season that encourage creativity and problem solving, like Lego, art supplies, STEM kits, board games and regular teddy bears that don’t talk back.
“Even though they promise to boost imagination and learning, AI toys can monopolize children’s attention and crowd out actual imaginative, child-led play that research shows is foundational for creativity, emotional regulation and real learning,” according to Fairplay.
Other options could include a 529 college savings plan or child savings account monitored by an adult family member — gifts that will keep on giving.
Article sources
We rely only on vetted sources and credible third-party reporting. For details, see our editorial ethics and guidelines.
OpenAI (1); U.S. PIRG Education Fund (2); CNN (3); Fairplay (4); Character.AI (5)
Vawn Himmelsbach is a veteran journalist who has been covering tech, business, finance and travel for the past three decades. Her work has been featured in publications such as The Globe and Mail, Toronto Star, National Post, Metro News, Canadian Geographic, Zoomer, CAA Magazine, Travelweek, Explore Magazine, Flare and Consumer Reports, to name a few.
