Smart toys: Your child’s best friend or a creepy surveillance tool?

Image: Photo by Lucas Santos on Unsplash

Seth Bergeson
Project Fellow, Centre for the Fourth Industrial Revolution, World Economic Forum LLC
Kay Firth-Butterfield
Senior Research Fellow, University of Texas at Austin

This article is part of: Global Technology Governance Summit
  • The market for smart toys is rapidly expanding and could grow to $18 billion by 2023.
  • Smart toys can help with learning but pose risks if they are not designed to protect children’s data and safety.
  • Many companies are developing smart toys ethically and responsibly, with makers of AI-powered smart toys encouraged to apply to the Smart Toy Awards.

Imagine Sophie, a child born this year who will be surrounded by technology at every phase of her childhood. When she is three years old, her parents buy her a smart doll that uses facial recognition and artificial intelligence (AI) to watch, listen to, and learn from her.

Like many children, Sophie will come to love this toy. And like previous generations of children with a favorite doll or teddy bear, she will carry it around with her, talk with it, and sleep with it beside her for many years.

If the smart doll is designed responsibly, this toy could be her best friend; if not, it will be a surveillance tool that records her every move and every word spoken in its presence by her, her friends, and even her parents.

Smart toys use AI to learn about the child user and personalize the play or learning experience for them. They can learn a child’s favorite color and song, and they can learn to recognize that child and other familiar people in the child’s life. While this may sound futuristic, many smart toys already provide these capabilities. The market for these toys is rapidly expanding and could grow to $18 billion by 2023.

To address this urgent use of AI, the World Economic Forum recently launched the Smart Toy Awards to recognize ethically and responsibly designed AI-powered toys that create an innovative and healthy play experience for children.

Smart toys provide enormous promise for children. They can customize learning based on data they gather about children; they can teach computer programming skills to children; and they can help children with disabilities develop cognitive, motor, and social skills.

But at the same time, smart toys pose significant risks if they are not designed to protect children’s data, safety, and cybersecurity.

A cautionary tale

The example of Sophie’s smart doll is not far-fetched. In 2017, My Friend Cayla – an early smart toy that used facial and voice recognition – was declared an illegal surveillance tool in many countries.

When the Cayla doll was connected to a phone, data was sent to the manufacturer and a third-party company for processing and storage. And anyone with the My Friend Cayla app on their phone within 30 feet of a toy could connect to it and listen to the child using it.

Germany issued a “kill order” for the doll and required parents to destroy it “with a hammer.” Today, the only surviving Cayla dolls in Germany reside in the Spy Museum in Berlin.

The risks posed by smart toys

Sophie applies to college when she is 18 years old. If her smart doll collected data on her from the ages of three to nine, the company that built the toy could know her better than her parents do. Without adequate data protections, the company could also sell this data to the colleges she is applying to, or to other third parties.

After college, Sophie applies for a job. If the employer bought data gathered on Sophie as a child, it could learn about her strengths and weaknesses. What if Sophie bullied her younger sister, yelled at her parents, or refused to do her homework as a child? All of these actions, conducted in the privacy of the family’s home, could be known to the company and sold to third parties that could use this information to discriminate against Sophie. The family’s life is no longer private.

Today, data is gold, but gathering data on children is inherently problematic. As a company gathers data about a child through a toy like Sophie’s doll, it may have a responsibility to act or intervene. Imagine that Sophie tells her doll about suicidal thoughts and self-harm. Should the company be required to alert her parents and call 911?

The more data a smart toy gathers, the more complex the scenarios its maker will likely face. Every company designing a smart toy capable of gathering this information must consider these worst-case scenarios as it develops toys that protect the safety of the child user and those around them.

The market for smart toys is rapidly expanding and could grow to $18 billion by 2023. Image: Statista

Developing responsible and ethical smart toys

Despite these significant risks, ethical and responsible smart toys are being developed. The Smart Toy Awards have developed four key governance criteria for companies developing AI-powered toys: data privacy and cybersecurity; accessibility and transparency; age appropriateness; and healthy play.

Sophie’s smart doll illustrates the importance of strong data privacy and of clearly communicating to the adults buying the toy what the smart doll does and how it operates. This must be communicated in the Terms of Service in language understandable to non-technologically literate audiences. At a minimum, smart toys should meet COPPA requirements in the US and GDPR requirements in the EU.

Parents and guardians should understand with whom children’s data is being shared and for what purpose. Companies should empower parents, guardians, and children to make their own decisions about how children’s data is being used. And companies should not sell children’s data to third parties.
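
As a concrete illustration of that last point, here is a minimal sketch, in Python, of a per-purpose consent model a smart toy could use: every category of data sharing is off by default, and nothing is shared until a parent or guardian explicitly opts in. The names (ConsentSettings, can_share) and the purpose categories are hypothetical, not drawn from any real toy's software.

```python
from dataclasses import dataclass


@dataclass
class ConsentSettings:
    """Per-purpose sharing choices made by a parent or guardian (all off by default)."""
    allow_analytics: bool = False        # usage analytics sent to the toymaker
    allow_personalization: bool = False  # cloud processing to personalize play
    allow_third_party: bool = False      # sharing with third parties

    def can_share(self, purpose: str) -> bool:
        """Return True only if a guardian has opted in to this purpose."""
        return {
            "analytics": self.allow_analytics,
            "personalization": self.allow_personalization,
            "third_party": self.allow_third_party,
        }.get(purpose, False)  # unknown purposes are always denied


# Nothing is shared until a guardian opts in to a specific purpose.
settings = ConsentSettings()
assert not settings.can_share("analytics")

settings.allow_personalization = True
assert settings.can_share("personalization")
assert not settings.can_share("third_party")
```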

Data privacy is a foundation for ethical and responsible smart toys, but they must also be designed to be accessible, transparent, and age appropriate, and to promote healthy play and children’s mental health.

The future of childhood

Sophie’s doll doesn’t have to be a cause for concern for her and her parents, and the data collected on her won’t hinder her future if it is carefully protected. In the EU, the GDPR provides the right to be forgotten. A similar policy could allow children like Sophie to request, when they turn 18, that all the data their smart toys collected on them as children be deleted, giving them a fresh start as they begin adulthood.
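
To make such a policy concrete, here is a minimal sketch, in Python, of a retention rule that deletes a toy's records once the data subject turns 18. The record layout and helper names (age_in_years, purge_adult_records) are hypothetical and purely illustrative.

```python
from datetime import date

ADULT_AGE_YEARS = 18


def age_in_years(birthdate: date, today: date) -> int:
    """Whole years elapsed between birthdate and today."""
    years = today.year - birthdate.year
    if (today.month, today.day) < (birthdate.month, birthdate.day):
        years -= 1
    return years


def purge_adult_records(records: list, today: date) -> list:
    """Keep only records whose data subject is still under 18; drop the rest."""
    return [
        r for r in records
        if age_in_years(r["owner_birthdate"], today) < ADULT_AGE_YEARS
    ]


# Example: by mid-2021 the child born in 2003 has turned 18, so only the
# younger child's record is retained.
sample = [
    {"owner_birthdate": date(2003, 5, 1), "payload": "voice clip"},
    {"owner_birthdate": date(2015, 9, 12), "payload": "favorite color"},
]
print(purge_adult_records(sample, today=date(2021, 6, 1)))
```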

Sophie and all children should have a fair shot at childhood, education, careers, and life. The data collected on them as children should not be used to discriminate against them in the future.

Smart toys like Sophie’s doll can play a pivotal role in childhood, catalyzing creativity and critical thinking skills. Many companies are developing smart toys with careful consideration for ethics and responsibility. We urge companies to adopt our governance criteria as they design and develop smart toys.

Childhood is a sacred time and parents will do everything they can to protect their children’s experiences. This won’t be possible unless stakeholders work together across the private, public, and nonprofit sectors to develop ethical, responsible, and innovative smart toys that protect and foster the essence of childhood.

Learn more, apply, and watch the Smart Toy Awards:

Makers of AI-powered smart toys can apply to the Smart Toy Awards now using the “Apply” link on our website by 9 April 2021.

We encourage everyone interested in smart toys and childhood to watch the live-streamed virtual awards show on 22 May 2021, when winners will be announced with special guest and judge will.i.am.
