The impact of GenAI on the creative industries, and the ethics and governance we must put in place

- Generative artificial intelligence (GenAI) is revolutionizing the media, entertainment and sports industries.
- The boom in AI brings opportunities for the creative industries but raises several challenges that must be addressed. These challenges require a human-centric and holistic approach.
- The Industries in the Intelligent Age – Media, Entertainment, and Sport report aims to offer guidance to business leaders and other stakeholders on fostering innovation alongside developing robust governance and multistakeholder collaborations.
The advent of the Internet digitalized our interaction with content and, around 15 years ago, user-generated content platforms ushered in a new era. This new paradigm developed alongside the mass media model.
This revolution is now entering its next phase with the advent of generative artificial intelligence (GenAI), which can substantially change how we produce and engage with content.
The extent of the ongoing change is comparable to the advent of the printing press, which allowed written material to be distributed at scale for the first time.
That revolution prompted society to hotly debate which institutions and frameworks the new world required: how to reward creativity, which gave rise to intellectual property frameworks and copyright, and how to ensure information was trustworthy and transparent to readers, which gave rise to the institutions of authorship and publishing, among others.
The questions we face today are of the same magnitude.
GenAI touches the core of the deeply human creative process, impacting the lives of all consumers and creators, whether amateurs or professionals. This has implications for how we express ourselves, consume information, learn and enjoy our free time with entertainment content.
GenAI will surely expand the canvas of possibility, empowering more people, including those without deep technical or artistic skills, to join the ranks of creators. Ensuring that the advent of this technology benefits humanity is nonetheless of paramount importance.
In this new era, the value of human creativity should be preserved, as uniquely human elements, such as personal stories, emotional resonance and cultural nuance, will remain.
We are at the beginning of this journey, with individuals and companies still experimenting with the technology and adoption in its early stages.
Innovation and responsibility
Among GenAI’s most promising applications is the use of conversational interfaces to generate new content or to translate and convert existing material, for example, generating videos or podcasts from articles and blog posts, or assisting ideation by producing variations of a script or storyboard so creators can explore options faster.
Companies can leverage GenAI to dynamically tailor content to user input and data (provided users give informed consent), constantly refining and enhancing it through insights gathered at every stage of the creative process. The emergence of AI-powered search experiences could also change how people access information, much as the arrival of search engines and social media once did.
However, alongside these advancements lie substantial challenges that can negatively affect society and hinder widespread adoption. Concerns emerge about data privacy, intellectual property infringement, ethical implications, the negative impact on the information ecosystem, bias and inaccuracy risks, and potential job displacement.
The recent spread of deepfake technologies has heightened concerns about their improper use to produce misinformation or other harmful content. Of particular concern is the increase in the volume of non-consensual adult imagery or deepfakes.
Additionally, the Hollywood strikes have drawn attention to fears among creatives that AI could devalue their contributions or use their likenesses without proper consent. Concerns that GenAI’s advent could negatively impact human artists or undermine their roles have fueled societal and industry-wide debates.
While fostering innovation is vital to unlocking GenAI’s full potential, robust governance is needed to address emerging challenges.
Regulations such as the European Union’s AI Act or the US Deepfakes Accountability Act are being introduced, yet the speed and approach vary across geographies. Broad questions, such as those related to accountability frameworks, remain unanswered.
As AI sophistication grows and regulatory landscapes evolve, self-governance can be used to foster agility, mitigate risks, and complement regulatory efforts.
In such a context, the industry has an opportunity to come together and build common principles and frameworks, share best practices and lessons learned, enable collaborations across sectors and inform future regulatory efforts.
Scalable GenAI solutions
Recent developments showcase media organizations’ different approaches. Some pursue commercial litigation to obtain fair remuneration for using content to train AI algorithms and for intellectual property infringements.
Recent cases include The New York Times’ lawsuit against OpenAI and the Wall Street Journal’s lawsuit against Perplexity AI. Other organizations are striking bespoke agreements; relevant examples include Axel Springer SE and News Corp, which have signed deals with OpenAI.
Developing a common licensing framework for training data and similar shared mechanisms would enable scalable solutions, including for smaller players, such as local news providers.
A reference for this approach can be found in the Nordics, where major outlets partnered with smaller ones, creating a shared framework across layers of collaboration, from tech to product management and licensing.
For example, in 2020, Schibsted partnered with Polaris Media, the owner of 42 media houses in Norway, to provide its proprietary technology platform, which supports the entire editorial workflow, from content production to online and mobile publishing.
Last year, Schibsted joined forces with the Norwegian Research Centre for AI Innovation (NorwAI), inviting all Norwegian media companies to collaborate. As part of this initiative, they contributed thousands of articles to train Norway’s first generative language model, aiming to create a local alternative to ChatGPT.
In this complex environment, the Industries in the Intelligent Age – Media, Entertainment, and Sport report aims to offer guidance to business leaders and other stakeholders on fostering innovation alongside developing robust governance and multistakeholder collaborations.
The potential impact of this emerging technology requires engaging a broad set of stakeholders across industry, governments, civil society and labour, as well as experts from scientific and humanistic disciplines that study the human mind and society, such as psychology, sociology and anthropology.
Only by undertaking a holistic and human-centred approach can we responsibly harness GenAI’s power to build a sustainable and responsible future for the industry and society.