Turning measurement into momentum so agile governance can keep pace with AI

Tracking global AI governance to create enforceable systems that give businesses and citizens clarity and confidence. Image: Getty Images/Cecilie_Arcurs

Kelly Ommundsen
Head, Digital Inclusion, Member of the Executive Committee, World Economic Forum
This article is part of: Centre for AI Excellence
  • Artificial intelligence (AI) requires robust regulation and must be developed according to sound governance rules.
  • Various indices have been created to assess AI governance, the latest being the AI Governance International Evaluation (AGILE) Index 2025.
  • These measures can help build trust in AI by tracking its development and underpinning enforceable systems that provide clarity and confidence.

Good regulation starts with good information, but in artificial intelligence (AI) such information is rare. Data on how AI systems are built, trained and deployed remains fragmented — scattered across private companies, buried in proprietary processes or simply never collected at all.

This means governments trying to regulate AI often have to work from incomplete snapshots based on voluntary disclosures and inconsistent reporting standards, which offer limited visibility into what is actually happening inside these systems. The result is that regulators are asked to manage risks such as algorithmic bias and data misuse without the evidence they need to understand them, let alone address them effectively or consistently across borders.

Various indices and studies have emerged in recent years to help governments and industry assess where AI governance stands. These include the OECD’s AI Policy Observatory, the AI Readiness Index by Oxford Insights, Stanford University’s AI Index Report and the Global AI Index from Tortoise Media. Each contributes valuable insights, but differs in focus. Some emphasize innovation and investment, while others assess ethics, governance or human capital readiness.

The AI Governance International Evaluation (AGILE) Index is the latest addition to this landscape. Covering more than 40 countries, it offers a comparative snapshot of how legal frameworks, institutional capacity and societal safeguards are evolving – and where they’re not.

The real value of an index like this isn't in ranking countries, but in revealing the gaps and opportunities in AI development. Some countries have strong strategy documents but weak enforcement. Others have clear mandates but fragmented oversight. In many countries, innovation policies are moving faster than legal or ethical guardrails are being established.

These differences matter. Without a shared evidence base, AI regulation risks becoming reactive or symbolic, rather than enabling and effective.

From principle to performance

Regulatory aspiration often develops faster than the institutions designed to uphold it, and operational maturity varies widely. High-income countries tend to have stronger regulatory structures and technical capacity, for example, while many middle-income countries show public awareness and policy intent but struggle with enforcement and coordination. In some contexts, regulatory conversations have taken place but have not yet been translated into mandates, oversight or institutional coordination.

But ambition without infrastructure is just aspiration. AI governance must go beyond principles to cover enforcement, implementation and accountability.

The AGILE Index 2025 was developed by a consortium of research institutions in China to assess national AI governance across 43 legal, institutional and societal indicators. It groups countries into four profiles ranging from those with more mature frameworks, called “all-round leaders”, to those still laying basic foundations, referred to as “foundation seekers”.
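
To make that structure concrete, the short sketch below shows one way per-dimension indicator scores could be rolled up into a composite and mapped to a profile tier. It is a minimal, hypothetical illustration: the dimension weights, cut-offs and the two middle profile labels are assumptions made for the example, not the AGILE Index's published methodology.

```python
# Hypothetical illustration of a composite governance index.
# Weights, cut-offs and the two middle profile labels are assumptions,
# not the AGILE Index's actual methodology.
from dataclasses import dataclass

WEIGHTS = {"legal": 0.40, "institutional": 0.35, "societal": 0.25}  # assumed weights, sum to 1.0

PROFILES = [                    # composite-score cut-offs (assumed)
    (80, "all-round leader"),
    (60, "steady builder"),     # hypothetical label
    (40, "emerging adopter"),   # hypothetical label
    (0,  "foundation seeker"),
]

@dataclass
class CountryScores:
    name: str
    legal: float          # average of legal indicators, 0-100
    institutional: float  # average of institutional indicators, 0-100
    societal: float       # average of societal indicators, 0-100

def composite(scores: CountryScores) -> float:
    """Weighted average of the three dimension scores."""
    return sum(w * getattr(scores, dim) for dim, w in WEIGHTS.items())

def profile(scores: CountryScores) -> str:
    """Map a composite score to the first profile whose cut-off it clears."""
    c = composite(scores)
    return next(label for cutoff, label in PROFILES if c >= cutoff)

if __name__ == "__main__":
    example = CountryScores("Exampleland", legal=72, institutional=65, societal=58)
    print(f"{example.name}: composite={composite(example):.1f} -> {profile(example)}")
```

The real index aggregates 43 indicators with its own weighting scheme; the sketch only illustrates how indicators, dimensions and profile tiers relate.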

The index aims to capture the structural reality of AI policy systems worldwide, and the latest edition offers three takeaways in this respect:

1. AI transparency is power

By laying bare national strengths and blind spots, the AGILE Index encourages governments to move past press releases and into capability-building. The index shows that countries with publicly available AI governance data — such as regulatory frameworks and accountability mechanisms — score markedly higher across institutional maturity indicators. Transparency itself becomes a driver of progress, enabling peer benchmarking and policy learning.

2. AI preparedness is uneven

There are deep asymmetries in institutional preparedness for AI across regions and income groups, and even within individual governments. Some innovation ministries surge ahead while their countries' legal bodies lag behind. Elsewhere, policymakers hesitate to act on AI because they lack tested frameworks to do so.

AGILE’s institutional dimension highlights a gap of more than 40 percentage points between high- and middle-income countries in regulatory implementation capacity. Even within advanced economies, oversight and data protection bodies often trail innovation agencies, revealing the internal fragmentation the index quantifies.

3. AI governance can be agile

Several countries, including Singapore, the UK and South Korea, have demonstrated that agile governance is not just theoretical but operational. That progress, however, must be shared, studied and scaled.

These countries are top performers in the AGILE Index because they have embedded agile mechanisms — from regulatory sandboxes to ethics-by-design frameworks — that translate principles into practice. Their experience offers a template for others to build flexible but accountable AI systems.

From measurement to momentum

Governance must evolve as rapidly as the innovations it seeks to guide. Launched in July 2025, the Global Regulatory Innovation Platform (GRIP) aims to encourage this by bringing together public and private stakeholders to learn from best practices around the world, co-develop practical frameworks and build communities of practice around agile, inclusive governance.

Initiatives like the AGILE Index provide the dashboard for this, but GRIP provides the steering wheel. While indices can offer measurement, GRIP enables movement, building the connective tissue between data, policy and real-world reform. Whether it's through policy roundtables, shared toolkits or national self-assessment pilots, governments must move faster from principles to prototypes, and from working papers to working policies.

The road ahead for AI governance

Measuring maturity is important; building it is essential. This means being clear-eyed about the distance between ambition and capability. Trust in AI must be secured not only through principles but also through enforceable systems that give businesses and citizens clarity and confidence. Countries that are willing to measure their progress in these areas are already ahead of the curve.

For those ready to act, the door is open. In the intelligent age, AI governance that stands still is already falling behind.
