How online platforms must respond to a new era of internet governance
Theos Evgeniou
Professor of Decision Sciences and Technology Management, INSEAD; WEF Academic Partner on AI; co-founder, Tremau; external expert, BCG Henderson Institute
- Governments across the world have started designing new frameworks to regulate online platforms.
- The coming era of regulated internet governance will require tech companies to adapt and develop new risk-management processes and tools.
- Ultimately, complying with the new regulations will require a shift towards a culture of responsibility.
Almost two decades after the rise of online platforms, governments across the world, including in the European Union, the United Kingdom, Canada and Australia, have started designing new frameworks to regulate these services. Debate and controversy have been rife, from the recent Congressional hearing of the Facebook whistle-blower Frances Haugen to Apple's reversal of its plan to scan images and videos on its devices to detect child sexual abuse material (CSAM). Increased regulation will have important implications both for these businesses and for wider society.
Democratic governments face numerous challenges in attempting to regulate the online space. They grapple with sometimes contradictory objectives as they try to balance keeping the internet safe with protecting fundamental rights, including freedom of speech.
Challenges for regulators
1. The sheer volume of content posted online makes it practically impossible to ensure judicial review prior to every removal of content. Governments will need to rely on setting obligations for the private sector to moderate illegal content based on specific regulatory principles. The more stringent the rules, the higher the risk of over-removal; the more lenient the rules, the higher the risk of illegal or harmful content spreading.
2. Legislators must define 'illegal content' broadly enough to cover the targeted harms, but specifically enough to avoid the risks of 'censorship creep'. Impractically broad definitions could leave substantial 'grey zones', forcing companies to decide on content removal at their sole discretion. An ambiguous definition of illegal content increases the risk of over-censorship, with potentially serious repercussions for freedom of expression.
3. Regulators need to define minimum new obligations for online platforms without creating barriers to innovation or market entry.
The regulatory approaches taken by policy-makers can be divided into two broad categories:
A content-specific approach
Legislation is designed to target one specific type of online harm, such as copyright infringement, terrorist content, CSAM, illegal hate speech or illegal products, and focuses on its timely removal. Examples include the EU Terrorist Content Online Regulation, the French law on disinformation, the German Network Enforcement Act (NetzDG) and the Directive on Copyright in the Digital Single Market.
A systemic approach
This aims to provide a cross-harm legal framework in which Intermediary Service Providers (ISPs) must demonstrate that their policies and processes are designed and implemented to counter abuses of their services. This is the direction proposed by the EU in its Digital Services Act (DSA), which does not define what constitutes illegal content. Instead, it sets new due diligence obligations requiring ISPs to act effectively on content once they become aware of its illegality. Failure to comply may result in penalties of up to 6 percent of global turnover.
Challenges for businesses
Complying with complex, and sometimes contradictory, cross-jurisdictional demands while maintaining customer trust will pose new challenges for all ISPs. It will require them to develop new risk management frameworks and create new roles and responsibilities, ultimately leading to organizational and cultural change. There are useful lessons to draw from other sectors, such as the financial industry.
For example, according to the first draft of the DSA, ISPs will need to put in place adequate processes to act on illegal content notified to them by a range of different sources. To track these notices, the subsequent decisions and the communications with users, ISPs will need to develop robust content moderation management processes. These processes must also fulfil obligations such as 'statements of reasons', complaint handling and transparency reporting. Collecting data across this complex network of stakeholders will require dedicated tools that are secure and kept up to date to ensure compliance across different jurisdictions.
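To make this concrete, here is a minimal sketch of what a single tracked notice might look like as a data structure, assuming a simplified notice-and-action workflow. The class, field and status names (Notice, Decision, statement_of_reasons and so on) are hypothetical illustrations, not terminology mandated by the DSA.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone
from enum import Enum


class Decision(Enum):
    PENDING = "pending"
    REMOVED = "removed"
    RESTRICTED = "restricted"
    NO_ACTION = "no_action"


@dataclass
class Notice:
    notice_id: str
    source: str                     # e.g. user report, trusted flagger, court order
    content_id: str
    alleged_violation: str          # legal ground cited by the notifier
    received_at: datetime
    decision: Decision = Decision.PENDING
    statement_of_reasons: str = ""  # explanation owed to the affected user
    complaint_filed: bool = False   # user appeal via internal complaint handling
    audit_log: list[str] = field(default_factory=list)

    def decide(self, decision: Decision, reasons: str) -> None:
        """Record a moderation decision and the explanation sent to the user."""
        self.decision = decision
        self.statement_of_reasons = reasons
        self.audit_log.append(
            f"{datetime.now(timezone.utc).isoformat()}: {decision.value}: {reasons}"
        )
```

Keeping an append-only audit log alongside each notice would let a platform produce transparency reports by aggregating these records, rather than reconstructing decisions after the fact.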
Online platforms will also need to abide by new transparency obligations for online advertising, requiring them to tell users who is behind each advert and how it was targeted. In addition, online marketplaces will be required to set up 'Know Your Business Customer' policies to collect identification information from sellers operating on their platform, an obligation also largely inspired by the financial industry. This, too, will require new processes and tools.
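As an illustration, a 'Know Your Business Customer' check could be as simple as verifying that a seller's record is complete before the account is allowed to trade. This is a minimal sketch under that assumption; the SellerRecord fields and the completeness rule are hypothetical, not the DSA's actual requirements.

```python
from dataclasses import dataclass


@dataclass
class SellerRecord:
    name: str
    address: str
    email: str
    trade_registry_number: str  # e.g. a national business register entry
    bank_account: str           # payment details, echoing financial-sector KYC

    def is_complete(self) -> bool:
        """A marketplace might only activate sellers whose details are all present."""
        return all([self.name, self.address, self.email,
                    self.trade_registry_number, self.bank_account])
```

In practice the collected details would also need to be verified against official registers, but even this simple gate shows how the obligation translates into an onboarding process.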
The largest online platforms will be subject to further requirements, including an obligation to conduct annual assessments of systemic risks. Here again, these companies may benefit from the compliance risk assessment methodologies and best practices developed by the financial sector.
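One such borrowed practice is the likelihood-times-impact scoring matrix common in financial-sector compliance. The toy sketch below ranks a few of the systemic risk areas discussed in the DSA draft; the 1-5 scales and scores are invented for illustration, and the regulation itself prescribes no particular scoring method.

```python
# (likelihood, impact) on invented 1-5 scales for a few DSA-style risk areas
RISKS = {
    "dissemination of illegal content": (4, 5),
    "negative effects on fundamental rights": (3, 5),
    "intentional manipulation of the service": (3, 4),
}

# Rank risks by likelihood x impact, highest first, as a crude prioritization
for risk, (likelihood, impact) in sorted(
        RISKS.items(), key=lambda kv: kv[1][0] * kv[1][1], reverse=True):
    print(f"{risk}: score {likelihood * impact}")
```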
All these changes will unavoidably add costs for businesses: according to European Commission estimates, annual compliance costs could reach tens of millions of euros for the largest players.
A new culture
Beyond these processes and tools, effective risk management will require a well-balanced enterprise organization and risk management culture, one that aligns regulatory obligations, business models and reputational risk management, and possibly sets up independent compliance functions, much as in the financial sector.
Organizational changes will not be sufficient, however. As they grow, online platforms will need to adopt a different culture, in which regulatory risks are understood by all and employees are empowered to make responsible decisions. This will pave the way towards a new and more responsible use of technology.