How can we create actionable guidelines to educate, empower and protect children and youth in the age of artificial intelligence?
Children and young people today grow up in an increasingly digital age in which technology pervades every aspect of their lives. From robotic toys and social media to the classroom and home, artificial intelligence (AI) is ubiquitous in the daily life of a child.
Despite the many promising benefits of AI for children, special precautions must be taken to protect society’s most vulnerable demographic. In 2017, Germany banned an early AI-powered smart toy because of surveillance risks to children, while regulators elsewhere in the world are beginning to confront questions about the control, use and monetization of children’s data.
Issues of privacy are compounded by questions about the impact of AI-enabled toys on cognitive development. Is it necessary to protect traditional creative play, or is early exposure to AI useful for children who will grow up engaging with AI in school and the workplace? What is developmentally appropriate technology? AI-enabled devices are increasingly able to manipulate users and foster addictive behaviour, and children are particularly susceptible. This is especially concerning given the prevalence of bias in AI systems, which children are less equipped than adults to recognize.
In the absence of clear guidelines, parents and caregivers are left to make decisions about products with incomplete information and complex implications for their children’s health and privacy. As more AI for children comes onto the market, all relevant stakeholders need to construct mechanisms that protect children while enabling “precision education” and other benefits of AI.
The World Economic Forum’s Platform for Shaping the Future of Technology Governance: Artificial Intelligence and Machine Learning is partnering with the United Nations Children’s Fund (UNICEF), LEGO, Canadian Institute for Advanced Research (CIFAR), and others to address this urgent challenge. We have worked closely with UNICEF to develop its Policy Guidance on AI for Children. This multistakeholder community of governments, academics, businesses, civil society organizations, and youth representatives is working together to develop best practices for the governance of AI used by children and young people.
The project’s multiple activities focus on three strategic pillars:
Educate: Develop frameworks and toolkits to educate and inspire children, adolescents, parents and guardians around the responsible use of AI.
Empower: Empower children and young people with AI skills to create their own technology to improve the state of the world with an emphasis on underrepresented voices.
Protect: Protect and expand children’s human rights and civil liberties when encountering AI in their homes, schools and public places.
The inaugural AI Youth Council is a global group comprising young people interested in AI. Members serve as part of the project community and work to co-create governance guidelines. The Forum is also establishing the Smart Toy Awards to recognize ethically and responsibly designed AI-powered toys that create an innovative and healthy play experience for children. Using governance criteria established by the Generation AI community, the Smart Toy Awards aim to inspire toy- and game-makers to harness the developmental promise of AI to promote adaptive creativity and individualized learning.
How to engage
Project community: Nominate experts, policy-makers or senior executives who can help guide this project by providing regular input as projects develop.
Fellow: Nominate an individual from your company to work full- or part-time at the World Economic Forum’s Centre for the Fourth Industrial Revolution to play an integral role in shaping this initiative.
For more information, contact Seth Bergeson, PwC Fellow, firstname.lastname@example.org.