Risks to kids online are growing. Here's what we can do 

Farah Lalani
Global Vice President, Trust and Safety Policy, Teleperformance
Bincheng Mao
Global Shaper, New York Hub

  • Some major tech companies have delayed rollout of platforms aimed at children amidst child safety concerns.
  • Introducing specific safety features could help ensure these platforms are used beneficially to educate and entertain children.
  • Here's what we know about safety for children online - and how to improve it.

Discussion of the risks to kids online – and the wider implications for society of mitigating those risks – is currently center stage.

Apple recently delayed its rollout of child safety features on US phones and computers, and Instagram delayed its rollout of Instagram for Kids. In response to accusations that its platforms harm children, Meta, which owns Instagram, this week said it would introduce new safety controls, including asking kids to take a break when using Instagram and steering them away from content that isn't conducive to well-being.

With children making up a significant portion of internet users – one in three internet users is a child under 18 years old – such decisions will notably shape the future of the web for kids.

How risky is it for kids to be online?

The risks to kids online are significant and growing. While the risks vary by age, gender, online exposure and other factors, at an aggregate level, exposure to sexual content is currently the largest online risk.

UNICEF research shows that risks vary significantly by country and that the extent of harm caused by these risks depends on a number of factors, such as children’s resilience, parental guidance/support, and digital skills (for example, the ability to manage privacy settings).

Percentage of children who have been exposed to online risks. Image: UNICEF

The risk to young girls is even more significant. The State of the World’s Girls 2020 report by Plan International, which surveyed over 14,000 girls and young women in 31 countries, found that more than half of respondents had been harassed and abused online, and that one in four felt physically unsafe as a result.

What is the industry doing about it?

Many companies are making improvements to protect children online. Some video streaming apps, like TikTok, have versions for users under 13 where they can create but not post videos. Others, like YouTube, have "Approved content only" modes, in which children can only watch videos, channels and/or collections that parents have selected and approved themselves. Instagram prevents adults from messaging children who do not follow them and has defaulted all child accounts to private.

Nevertheless, several challenges must be overcome to keep children safe online whilst enabling them to benefit from all the opportunities of digital engagement.

Key tensions and barriers to child safety online

Age Verification

Many kids are being exposed to content they shouldn’t be seeing, in places they shouldn’t be visiting at all. The largest survey of its kind points to a traumatic reality: 70% of respondents first saw child sexual abuse material when they were under 18.

In the US, many platforms do not permit users under 13 to use their services, in order to comply with the Children’s Online Privacy Protection Act. Many other parts of the world operate similarly. Yet research from CyberSafeKids estimates that a staggering 82% of children aged 8 to 12 have profiles on social media and messaging apps.

Research shows that young people can easily sidestep age verification measures when signing up to some popular social media apps. New regulations, such as the UK’s Age-Appropriate Design Code, set out requirements for how digital platforms should protect kids online. However, they do not prescribe how companies should verify user age.

Age verification is an important safety issue. Innovations in third-party technologies can offer solutions that are not privacy-invasive. For example, dating app Tinder has rolled out an ID verification process, which boosts user confidence that matches are real people. Improving age and identity verification processes through technology, and coupling these with regulatory standards, is critical for user safety.
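
To make this concrete, here is a minimal sketch in Python of one privacy-preserving pattern such third-party services can follow; all names and keys are hypothetical, and a production system would likely use public-key signatures rather than a shared secret. A verifier checks a user's ID once and issues a signed claim containing only an age bracket, so the platform can enforce its age rules without ever seeing the identity document.

```python
import hashlib
import hmac
import json

# Hypothetical shared secret between the verifier and the platform; a real
# deployment would likely use public-key signatures instead of a shared key.
VERIFIER_KEY = b"demo-secret-shared-with-platform"

def issue_attestation(user_is_over_13: bool) -> dict:
    """Third-party verifier: checks an ID document once (not shown here),
    then signs a claim containing only the age bracket."""
    claim = json.dumps({"over_13": user_is_over_13})
    tag = hmac.new(VERIFIER_KEY, claim.encode(), hashlib.sha256).hexdigest()
    return {"claim": claim, "tag": tag}

def platform_accepts(attestation: dict) -> bool:
    """Platform: verifies the signature and learns nothing beyond the age
    bracket. No name, birthdate or document ever reaches the platform."""
    expected = hmac.new(VERIFIER_KEY, attestation["claim"].encode(),
                        hashlib.sha256).hexdigest()
    return (hmac.compare_digest(expected, attestation["tag"])
            and json.loads(attestation["claim"])["over_13"])

print(platform_accepts(issue_attestation(True)))   # True: sign-up proceeds
print(platform_accepts(issue_attestation(False)))  # False: under-13 blocked
```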

Privacy

Heightened privacy measures can protect kids from unwanted contact on social applications. They also help children manage the visibility of photos, videos and other shared content. Some privacy measures, however, can complicate children’s safety. For example, end-to-end encryption limits the ability of platforms and authorities to detect illegal material, including child sexual abuse material (CSAM), because encrypted user content can't be scanned, monitored or filtered.
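
To see why, consider how platforms typically detect known CSAM today: by matching the hash of an uploaded file against a list of hashes of previously identified material. The short Python sketch below is a simplification – real systems such as Microsoft's PhotoDNA use perceptual hashes that survive resizing and recompression, and the hash list here is hypothetical – but it shows why this server-side check stops working once the platform can only see ciphertext.

```python
import hashlib
import os

# Hypothetical hash list of previously identified abuse imagery. Real systems
# (e.g. Microsoft's PhotoDNA) use perceptual hashes; SHA-256 is a stand-in.
KNOWN_HASHES = {hashlib.sha256(b"bytes of a known illegal image").hexdigest()}

def server_side_scan(content: bytes) -> bool:
    """Flag content whose hash matches the known list. This only works on
    content the platform can actually read."""
    return hashlib.sha256(content).hexdigest() in KNOWN_HASHES

upload = b"bytes of a known illegal image"
print(server_side_scan(upload))      # True: plaintext uploads can be matched

# Under end-to-end encryption the platform relays only ciphertext. A one-time
# pad (XOR with random bytes) stands in here for a real E2EE protocol.
pad = os.urandom(len(upload))
ciphertext = bytes(a ^ b for a, b in zip(upload, pad))
print(server_side_scan(ciphertext))  # False: the same content is now opaque
```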

Many companies are moving toward encrypted environments nonetheless. The National Center for Missing & Exploited Children (NCMEC) estimates that 70% of child abuse reports could be lost with the rollout of end-to-end encryption on messaging platforms. While many experts continue to tout the value of encryption, the industry is struggling to come up with viable alternatives for fighting CSAM in these environments.

A balance must be found between privacy and safety. Company decisions about product features, content moderation, user profile settings and other such areas will require a deep understanding of the implications for all stakeholders. This will ensure that proportionate and effective measures are put in place.

Risk Exposure

Some research has highlighted the negative impacts of social media on kids. These apps may worsen body image and self-perception issues, especially among young girls. Apps targeted at kids should be designed with the varied experiences and mental health realities of such vulnerable groups in mind.

Research also points to the benefits that social media have for children. A majority of parents whose children watch videos on platforms like YouTube say it helps their children learn new things and keeps them entertained. It is important, though, that parents keep a close eye on what children encounter on these sites: Pew Research found that at least 46% of parents say their children have encountered inappropriate content online.

The benefits of these apps must be weighed against the potential risks. The ongoing conversation highlights the nuances involved and the individual factors that parents may want to consider when making such decisions.

Image: Pew Research Center

Preventative Interventions

In general, preventative measures have not gained the same traction as measures to detect and remove harmful content once it is already circulating. While a number of interventions could help improve safety for kids online, understanding their effectiveness in real-world settings is key to avoiding unintended consequences.

An analysis of 39 studies found that 12% of young people had forwarded a sexually explicit image or video without consent, while 8.4% had had such imagery of themselves forwarded without consent. This needs to be addressed. Research has shown that warnings about these types of privacy violations often backfire, producing the opposite of the intended effect.

A multi-stakeholder approach is greatly needed to intervene adequately on these challenges. Governments, civil society, researchers and parents must work together to ensure robust interventions based on diverse inputs from all stakeholders.

Information Inequalities

It isn't always easy to assess safety measures, given stakeholders' differing and evolving understanding of the technologies, processes and issues involved. When it comes to critically assessing changes, such as those proposed by companies like Apple, stakeholders often lack the data and/or the understanding to do so effectively.

Building mechanisms to share information, enable audits and independently verify claims with stakeholders is critical to gaining buy-in. Doing so in a privacy-preserving way that doesn't compromise company IP can bridge the information gap. This is necessary to develop more agile, effective and widely acceptable policies.

How do we address digital risks?

A holistic approach to safety that considers both the risks and opportunities of digital engagement is needed to move forward. This can only be done with a user-centric approach.

A user-centric framework for safety. Image: World Economic Forum

The World Economic Forum supports the development of a cyber regulatory ecosystem that always considers the safety of vulnerable people, especially children. We believe in the importance of navigating the safety tensions associated with new technologies, policies, or products - and how these may affect all members of society. The World Economic Forum’s Global Coalition for Digital Safety aims to develop principles and practices that help address such trade-offs in a proportionate manner, knowing that solutions will need to be tailored to a company’s size, its role in the digital ecosystem, the types of users it serves, and many other factors. While there isn’t a one-size-fits-all solution, there is a path forward to build solutions that enable kids to take advantage of the internet whilst staying safe.
