Generative AI has disrupted education. Here’s how it can be used for good – UNESCO

Schools, colleges and universities are having to rapidly adjust to generative AI. Image: Unsplash/MD Duran

Madeleine North
Senior Writer, Forum Agenda

  • Schools, colleges and universities are having to rapidly adjust to generative AI.
  • Now UNESCO has produced the first-ever global guidance for the use of generative AI in education.
  • The World Economic Forum also addresses “educational shifts” in its Presidio Recommendations on Responsible Generative AI.

Barely recovered from the pandemic, schools and universities are beginning the 2023-24 academic year with a fresh challenge to contend with – generative AI.

While some educational institutions initially banned ChatGPT when it launched in November 2022 – mainly over concerns that it could enable cheating – the focus has now shifted to establishing ground rules and guidelines for generative AI use.

The first-ever global Guidance for Generative AI in Education and Research, a UNESCO initiative designed to “address the disruptions” caused by the technology, is now available to help this process. As countries continue to work out their governance approach, the guidance “will help policy-makers and teachers best navigate the potential of AI for the primary interest of learners”, says Audrey Azoulay, UNESCO Director-General.

The World Economic Forum and AI Commons have similar goals with their 30 action-oriented recommendations outlined in the Presidio Recommendations on Responsible Generative AI. These guidelines – the result of 100 thought leaders and practitioners coming together at a global summit in April 2023 – are centred around three key themes: responsible development of the technology, international collaboration, and social progress, which incorporates “educational shifts”.

Governance guardrails are essential to ensuring safe and inclusive use of generative AI in education. Image: Reuters/Jonathan Drake

What are UNESCO’s guidelines for AI in education?

Fewer than 10% of schools and universities currently have formal guidance on AI, says UNESCO. Its new guidance suggests eight specific measures educational institutions could adopt to ensure “quality education, social equity and inclusion”.

1. Promote inclusion, equity, linguistic and cultural diversity

Bias was identified early on as an issue with generative AI models, and the task of governments and institutions now is to level the AI playing field. To this end, UNESCO says steps must be taken to ensure everyone has internet connectivity and access to AI applications; criteria must be established to eliminate gender or cultural bias; and GenAI systems must evolve to include data in multiple languages, especially minority ones.

2. Protect human agency

There is a danger we may become too dependent on generative AI and lose sight of our human agency, compromising “the development of intellectual skills”, UNESCO warns. One approach the report advises to counteract this possibility is selective banning of the technology in situations “where it would deprive learners of opportunities to develop cognitive abilities and social skills through observations of the real world”.

3. Monitor and validate GenAI systems for education

This point acknowledges the need for ongoing monitoring of generative AI as the technology continues to evolve. Checks and balances must be put in place to ensure “GenAI systems used in education and research are free of biases”, and that children and “vulnerable learners” understand informed consent. The report adds that “institutions and educators should be willing and able to take swift and robust action to mitigate or eliminate … inappropriate content”.

4. Develop AI competencies including GenAI-related skills for learners

UNESCO says governments should commit to providing AI curricula in schools and colleges, “as well as for lifelong learning”. Tailored programmes should also be provided for older workers and citizens “who may need to learn new skills”.


5. Build capacity for teachers and researchers to make proper use of GenAI

It’s not just students who are making use of generative AI, of course. A recent US poll found that more teachers than pupils were using it, with one teacher admitting that MagicSchool – an educators’ generative AI tool – had enabled him to “take back his summer” after planning a year’s worth of lessons using the technology. Yet UNESCO says only seven countries are currently focused on AI training programmes for teachers and that future policies must “enable teachers to create specific GenAI-based tools to facilitate learning in the classroom and in their own professional development”.

6. Promote plural opinions and plural expressions of ideas

Learners and educators need to approach GenAI with the understanding it is “a fast but frequently unreliable source of information” mostly trained on “dominant worldviews”. Knowing how to objectively critique the technology’s responses is vital to protect “minority opinions and plural expressions of ideas”, says the report.


7. Test locally relevant application models and build a cumulative evidence base

Part of AI’s bias is that GenAI models are mostly trained on information from the northern hemisphere, meaning the Global South and indigenous communities are under-represented. UNESCO is calling for GenAI tools to “be made sensitive to the context and needs of local communities”, recommending that use cases in education and research accurately reflect educational priorities, “rather than novelty, myth or hype”.

8. Review long-term implications in an intersectoral and interdisciplinary manner

AI providers, educators, researchers, as well as parent and student representatives, need to collaborate on “system-wide adjustments” across curricula to both mitigate the risks and harness the potential of generative AI, says UNESCO.

“We all need to be cognizant that GenAI might … change the established systems,” concludes the report. AI tools should not undermine, conflict with or usurp us, states UNESCO. To ensure that doesn’t happen, “the transformation of education and research [by GenAI] should be rigorously reviewed and steered by a human-centred approach”.

License and Republishing

World Economic Forum articles may be republished in accordance with the Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International Public License, and in accordance with our Terms of Use.

The views expressed in this article are those of the author alone and not the World Economic Forum.
