Forum in Focus

How we use AI on Forum Stories


Balancing innovation with human judgment, editorial integrity and transparency. Image: Unsplash / Dollar Gill

Gayle Markovitz
Head, Written and Audio Content, World Economic Forum
Maxwell Hall
Creative Editorial Lead, World Economic Forum
  • The World Economic Forum uses generative AI on its Forum Stories platform as one of a number of tools for research, editing, production and publishing.
  • We aim to balance innovation with human judgment, and integrate editorial integrity, transparency and good governance.
  • Our approach to AI-assisted content creation will continue to evolve as the technology and use cases change.

Artificial intelligence is changing how content is created, edited and shared. Like many organizations, we are exploring what these tools mean for our platform. We've created these guidelines to ensure our editorial values remain consistent even as the way we create content is transformed. These guidelines are meant to enable a process rather than define the product.

Our editorial values: Human judgment comes first

Forum Stories is a place for ideas, solutions and analysis of the world's biggest issues. Our content aims to spark dialogue on complex global challenges, bringing diverse voices into the global conversation.

The platform is available to partners, constituents and expert networks of the Forum, who are invited to share their thought leadership with a public audience. Authors include Nobel laureates, heads of international organizations and UN bodies, labour leaders, top CEOs and business leaders, academics, religious figures and heads of state.

Our editorial approach seeks to establish trust. It's grounded in human judgment and an appreciation of complexity and context. AI supports that work. It helps us structure ideas, summarize information, translate content and explore creative formats.

Every piece published on Forum Stories is shaped by human authors and editors. AI may assist, but it doesn't decide what we say or how we say it – and we ask the same of our contributors.

Our editing team comprises experienced editors and journalists who review, fact-check and apply our editorial principles within an AI-enhanced workflow that is structured around human accountability and oversight.

3 principles we apply in practice

We apply three principles to guide how we use AI across Forum Stories. It's a careful balance that is not always easy to get right.

1. Keep it human

We use AI to support creativity, not to flatten it. That means ensuring our content reflects human intuition, tone and responsibility. If something feels generic or disconnected from real-world nuance, we treat that as a signal to rework it.

Lived experience, editorial reflection and the element of surprise remain essential to the content.

2. We don't outsource the thinking

We are careful not to outsource thinking. So we treat AI-generated material as incomplete. It is part of the process rather than a finished product. In practice, that means developing a clear argument before using AI, verifying facts, data and sources independently (and we may use AI to help with this), and ensuring that analysis reflects human expertise and consultation.

The goal is not to do less thinking. It’s to do better thinking, with support where it helps.

3. Transparency and accountability

Every piece of content on Forum Stories has a human author who is accountable for it, a human editor who collaborates with the author to evolve it, and a senior editor who takes responsibility for publishing it.

We do not cite AI as a source or publish AI-generated claims without verification. And we do not treat AI outputs as inherently reliable.

Where AI has meaningfully contributed to a piece, whether in drafting, translation or visuals, we ask our contributors to disclose that clearly and proportionately. We also take care with how tools are used: protecting confidential information, avoiding bias and ensuring that individuals are not replicated or simulated without consent.

What this looks like on Forum Stories

These principles translate into clear boundaries for how AI is used on our platform:

Writing and editing

AI can support early stages of the process, such as structuring an article, summarizing research or refining language, but does not replace reporting, sourcing or editorial judgment. Our standards of impartiality and accuracy remain fixed.

Visuals and storytelling

AI can help explore concepts or generate visuals where appropriate. But we prioritize authenticity and clarity. We do not use AI to simulate real-world events or create misleading imagery. And we avoid outputs that feel overly stylized or inconsistent with our editorial standards.

All visuals are reviewed to ensure they are accurate, appropriate and aligned with our brand.

Experimentation and new formats

We are testing new ways of using AI, from translation and summarization to more interactive or visual storytelling formats. Where experimentation carries higher risk, we apply additional oversight, and a human editor always reviews the content before publishing.

This will evolve

AI is developing quickly, and so is our understanding of how best to use it. Our approach on Forum Stories is not fixed. It is something we will continue to test, learn from and refine. What matters is that we remain anchored by the same core principles: keep it human; don't outsource the thinking; be transparent.

We see this as an ongoing conversation with our contributors, our partners and our readers, and we're keen to hear from you.

Take part in the poll

How comfortable are you with AI being used in editorial content (with human oversight)?

Portions of this content were developed with the assistance of generative AI. All materials were reviewed and approved by the author(s).

License and Republishing

World Economic Forum articles may be republished in accordance with the Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International Public License, and in accordance with our Terms of Use.

The views expressed in this article are those of the author alone and not the World Economic Forum.
