
Here's how the Digital Services Act changes content moderation

The Digital Services Act will regulate how platforms moderate content, advertise, and use algorithms for recommendation systems


Akash Pugalia
Global President, Media, Entertainment, Gaming and Trust and Safety, Teleperformance
Farah Lalani
Global Vice President, Trust and Safety Policy, Teleperformance

  • The Digital Services Act aims to provide clearer and more standardized rules for digital content.
  • Companies will need to consider content removal and take a proactive and transparent approach to moderation.
  • With enforcement throughout Europe, companies need to ensure the right systems and processes are in place.

Platform governance has been at the forefront of the tech and media world recently, given the rise in cyberbullying, hate speech and other online harms. With the UK recently dropping the ‘legal but harmful’ clause from its Online Safety Bill, many are wondering how new regulation across the globe will evolve to affect digital safety. The Digital Services Act (DSA) is of particular interest.

With the ethos of 'what is illegal offline should be illegal online', the DSA aims to provide clearer and more standardized rules for large and small digital service providers across the European market. In particular, the DSA will regulate how platforms moderate content, advertise, and use algorithms for recommendation systems. This will apply not just to large platforms but also to small businesses, across online marketplaces, social networks, app stores, travel and accommodation platforms, and many others.

Preparing for Digital Services Act enforcement

Currently, there are more than 10,000 platforms in the EU, 90% of which are small and medium-sized enterprises. But with 27 different sets of national rules, the cost of compliance could be prohibitive for small businesses. The DSA aims to ensure that small online platforms are not disproportionately affected but that they remain accountable. So, what should companies consider in preparing for enforcement of the Digital Services Act? Below are some areas to keep in mind:

Content Removal

The Digital Services Act harmonizes the process by which platforms are notified of illegal content and must take subsequent action on it. More concretely, once notified by trusted flaggers, platforms will have to remove illegal content ‘expeditiously’. The DSA also stipulates that users must be informed about, and can contest, the removal of their content, with access to dispute resolution mechanisms in their own country.

While the Digital Services Act doesn’t set specific timelines for content removal, companies need to be prepared to remove content quickly and to have the right processes and capacity in place to act on notifications from trusted flaggers. In addition, if platforms are not currently providing explanations to users about their removal decisions, this process will need to be instituted across the board.
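
As a purely illustrative sketch in Python, the snippet below shows one way a platform might structure its notice intake so that trusted-flagger reports are handled first and every decision is recorded and communicated to the affected user. All class names, fields and the placeholder review logic are hypothetical, not anything prescribed by the DSA.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone


@dataclass
class Notice:
    """A report of allegedly illegal content. Field names are hypothetical."""
    notice_id: str
    content_id: str
    reason: str                  # why the flagger believes the content is illegal
    from_trusted_flagger: bool   # trusted-flagger notices get priority handling
    received_at: datetime = field(default_factory=lambda: datetime.now(timezone.utc))


@dataclass
class ModerationDecision:
    notice_id: str
    content_id: str
    action: str                  # e.g. "remove", "no_action"
    decided_at: datetime
    user_notified: bool = False  # the affected user must be told and able to contest


def review_content(notice: Notice) -> str:
    """Placeholder review step; a real pipeline routes to moderators and/or classifiers."""
    return "remove" if "illegal" in notice.reason.lower() else "no_action"


def notify_user(decision: ModerationDecision) -> bool:
    """Placeholder for sending the uploader an explanation they can dispute."""
    print(f"Notified uploader of {decision.content_id}: action={decision.action}")
    return True


def handle_notices(queue: list[Notice]) -> list[ModerationDecision]:
    """Process pending notices, handling trusted-flagger reports first, oldest first."""
    decisions = []
    for notice in sorted(queue, key=lambda n: (not n.from_trusted_flagger, n.received_at)):
        decision = ModerationDecision(
            notice_id=notice.notice_id,
            content_id=notice.content_id,
            action=review_content(notice),
            decided_at=datetime.now(timezone.utc),
        )
        decision.user_notified = notify_user(decision)
        decisions.append(decision)
    return decisions
```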

The explanation given to the user must include specific pieces of information, such as:

  • whether the action involves removal or demonetization of the content;
  • whether the action was taken in response to a submitted notice or based on a voluntary own-initiative investigation;
  • whether automation was used in making the decision;
  • a reference to the legal ground relied on for illegal content, or to the community guideline relied on for a policy violation;
  • the redress mechanisms available to the user.
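
A minimal sketch of how those items could be captured in a single record per decision follows; the field names are illustrative and mirror the list above rather than the regulation's wording.

```python
from dataclasses import dataclass
from typing import Optional


@dataclass
class RemovalExplanation:
    """Illustrative record of the explanation owed to the affected user."""
    content_id: str
    action_taken: str                # e.g. "removal" or "demonetization"
    triggered_by_notice: bool        # True if acting on a submitted notice,
                                     # False for a voluntary own-initiative investigation
    automation_used: bool            # whether automated means contributed to the decision
    legal_ground: Optional[str]      # legal basis cited when content is illegal
    policy_ground: Optional[str]     # community guideline cited for a policy violation
    redress_options: list[str]       # e.g. internal complaint, out-of-court dispute settlement


example = RemovalExplanation(
    content_id="post-123",
    action_taken="removal",
    triggered_by_notice=True,
    automation_used=False,
    legal_ground="Illegal hate speech under applicable national law",
    policy_ground=None,
    redress_options=["internal complaint", "out-of-court dispute settlement"],
)
```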

Proactivity

The Digital Services Act makes it clear that platforms and other intermediaries are not liable for users’ unlawful behaviour unless they are aware of illegal acts and fail to remove them. This shield from legal liability aims to encourage companies to be more proactive in moderating the content on their platforms. Only if the flagged content is manifestly illegal can such notices give rise to ‘actual knowledge’. According to the text of the Digital Services Act (recital 63), “Information should be considered to be manifestly illegal content and notices or complaints should be considered manifestly unfounded where it is evident to a layperson, without any substantive analysis, that the content is illegal or, respectively, that the notices or complaints are unfounded.”

This helps resolve a potential disincentive around voluntary measures taken by small platforms: the Digital Services Act reinforces that diligent platforms are not liable for illegal content they detect themselves. Small platforms can therefore invest further in robust content moderation practices, which can act as a competitive differentiator in the market, without worrying about potential legal implications.
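
Purely as an illustration of how that ‘manifestly illegal’ threshold might be operationalized in an intake pipeline, the hypothetical triage below separates reports a layperson could recognize as illegal, which trigger expedited action, from those that need substantive review; the categories are invented for the example.

```python
# Hypothetical categories a layperson could recognize as illegal without
# substantive analysis; the list is illustrative, not taken from the regulation.
MANIFESTLY_ILLEGAL_CATEGORIES = {"csam", "terrorist_content", "credible_violent_threat"}


def triage_notice(reported_category: str) -> str:
    """Route a flagged item based on whether the report is plausibly manifestly illegal."""
    if reported_category in MANIFESTLY_ILLEGAL_CATEGORIES:
        # The notice can give rise to 'actual knowledge': act expeditiously
        # to keep the liability exemption.
        return "expedited_removal_queue"
    # Not evident to a layperson: route to substantive review; the notice alone
    # does not establish actual knowledge of illegality.
    return "standard_review_queue"


print(triage_notice("csam"))                # expedited_removal_queue
print(triage_notice("alleged_defamation"))  # standard_review_queue
```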

Notice and Action

The Digital Services Act highlights the need for a notice and action mechanism, covering how complaint systems are created and how notices are actioned. Notices flagging purportedly illegal content must be sufficiently precise (recital 22): providers should be able to determine illegality from the notice itself, without a substantive analysis or legal review.
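
To make "sufficiently precise" concrete, a platform might check that each incoming notice carries enough information to locate the content and understand the claim before it enters the review queue; the required fields in this sketch are assumptions for illustration, not a restatement of the regulation.

```python
# Fields assumed here purely for illustration; they are not quoted from the DSA.
REQUIRED_NOTICE_FIELDS = ("content_url", "explanation", "submitter_contact")


def validate_notice(notice: dict) -> tuple[bool, list[str]]:
    """Check that a notice carries enough detail to act on without a legal review."""
    missing = [f for f in REQUIRED_NOTICE_FIELDS if not notice.get(f)]
    return (len(missing) == 0, missing)


ok, missing = validate_notice({
    "content_url": "https://example.com/post/123",
    "explanation": "Listing offers counterfeit branded goods.",
    "submitter_contact": "reporter@example.com",
})
print(ok, missing)  # True []
```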

When actioning these notices, platforms need to ensure that responses are targeted so that fundamental rights, including free expression and the others outlined in the EU Charter of Fundamental Rights, are upheld. The Digital Services Act also foresees measures against misuse, allowing online platforms to suspend individuals who misuse the notice and action mechanism and/or complaint-handling systems, provided the decision to suspend is assessed on a case-by-case basis.
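
One way to keep such suspensions case-by-case rather than automatic is to track a submitter's record of manifestly unfounded notices and surface the account for human review only once a threshold is crossed; the counter and threshold below are invented for illustration.

```python
from collections import defaultdict

UNFOUNDED_NOTICE_THRESHOLD = 5  # illustrative threshold, not taken from the regulation

# Running count of manifestly unfounded notices per submitter.
unfounded_counts: defaultdict[str, int] = defaultdict(int)


def record_notice_outcome(submitter_id: str, manifestly_unfounded: bool) -> str:
    """Track unfounded notices and escalate to a human instead of auto-suspending."""
    if manifestly_unfounded:
        unfounded_counts[submitter_id] += 1
    if unfounded_counts[submitter_id] >= UNFOUNDED_NOTICE_THRESHOLD:
        # A reviewer examines the submitter's history before any temporary suspension,
        # keeping the decision case-by-case rather than automatic.
        return "escalate_for_case_by_case_review"
    return "no_action"
```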

Transparency and due diligence

Increased transparency is a theme that runs throughout the Digital Services Act, whether around how to report illegal content, the explanations given for content removal, platforms’ terms and conditions, the role of algorithms in recommending content, and much more.

Companies will need to be comprehensive, systematic and diligent in their transparency efforts related to content moderation and the decisions surrounding it. Providers of hosting services are required to notify law enforcement of any suspicion that a criminal offence involving a threat to the life or safety of a person has taken place or is likely to take place. From an e-commerce perspective, online marketplaces can be required to trace their traders under the “know your business customer” principle in order to better track down sellers of illegal goods, and manipulative practices such as dark patterns are restricted. From an advertising perspective, greater transparency on ad targeting (who sponsored the ad, and how and why it targets a user) is required, as well as a ban on certain targeted adverts. Moreover, platforms will need to provide clear information on why content is recommended to users.
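
On the advertising side, the kind of per-ad disclosure described above could be captured in a simple record shown to users alongside each ad, as in this illustrative sketch (the field names are assumptions, not the regulation's wording).

```python
from dataclasses import dataclass


@dataclass
class AdDisclosure:
    """Illustrative per-ad disclosure record; field names are not taken from the DSA text."""
    ad_id: str
    sponsor: str                      # who paid for the ad
    advertiser: str                   # on whose behalf it is displayed, if different
    targeting_parameters: list[str]   # main parameters used to select the viewer


disclosure = AdDisclosure(
    ad_id="ad-42",
    sponsor="Example Retail GmbH",
    advertiser="Example Retail GmbH",
    targeting_parameters=["approximate location: Germany", "interest: running shoes"],
)
print(f"Sponsored by {disclosure.sponsor}; "
      f"shown because of: {', '.join(disclosure.targeting_parameters)}")
```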

These are just some of the areas companies need to be aware of when shaping their trust and safety strategy ahead of the Digital Services Act. The European Commission has published a summary of all the new obligations and who they apply to.

How will the DSA be enforced and what does it mean for digital service providers?

As an EU Regulation, the Digital Services Act is directly applicable in every member state; enforcement will be split between national regulators and the European Commission. As Commissioner Thierry Breton outlined in his sneak peek at enforcement of the new legislation, the Directorate-General for Communications Networks, Content and Technology will play a key role in enforcement, with oversight of due diligence obligations including risk assessments, independent audits and data access, amongst other areas.

Each EU member state will assign a national authority to the role of Digital Services Coordinator (DSC), which will be responsible for all matters related to supervision and enforcement of the Digital Services Act at the national level. Many countries are tasking existing audiovisual media regulators with the DSC role, but are also considering assigning specific tasks to electronic communications regulators, consumer protection authorities or other relevant bodies.

Given that the new rules are expected to be applied seamlessly and uniformly throughout Europe, companies should be taking steps now to understand the requirements in detail and shore up the trust and safety capabilities needed to comply. Compliance with the Digital Services Act will, however, only set the new minimum safety baseline; companies will need to think even more proactively about their platform policies, enforcement, use of automation versus people, and closed-loop feedback mechanisms to get ahead of the risks to come.

The information provided in this article does not, and is not intended to, constitute legal advice; instead, all content is for general informational purposes only. Views expressed in this article are those of the individual authors writing in their individual capacities, not those of their employer.
