How do we align the business of GenAI with educational goals?

Is GenAI helpful or detrimental to student learning?
Mandë Holford
Professor and Curator at Harvard University and the Museum of Comparative Zoology, City University of New York (CUNY)
- Generative artificial intelligence (GenAI) tools confront students, teachers and society with a fundamental question: What is the purpose of education?
- Educational tools that use GenAI need to clarify the alignment between the purpose of education and the capabilities of these platforms.
- Educator involvement is essential for building GenAI tools that promote meaningful learning.
Generative artificial intelligence (GenAI) – systems of algorithms trained on large datasets to generate new text, images or other content – has made its way into classrooms. GenAI tools are enticing: ask the machine to respond to an essay question, create a work plan or revise written text and, like magic, it does. Type 'provide a 500-word summary of Othello' into interfaces like OpenAI’s ChatGPT or Anthropic’s Claude and a concise summary will appear within seconds. Additionally, both companies offered college students free access to enhanced tiers of service in the spring of 2025 – just in time for finals.
These precisely timed promotions reveal how the corporations building AI understand the forces at work in educating students. Students turn to AI to cope with limited time, high pressure and assignments they see as low value. Recent anecdotal reports also highlight how, for some students, usage has become so ubiquitous and the results so irresistible that they can’t imagine doing schoolwork without GenAI. If students increasingly turn to GenAI to complete their assignments, are they actually receiving an education?
This is where educational goals and business incentives diverge. While corporations prioritize outcomes, large-scale usage and monetization, learning is a slow, deliberate and cyclical process. GenAI tools that shortcut this process can erode motivation and skill-building. Worse, if institutions lean on AI to scale assessment, offer centralized curricula infused with 'AI slop' or replace teaching functions, we risk designing tools that optimize for efficiency, not growth. As third-grade teacher Timothy Cook wrote: “We’re prioritizing the product of teaching over the process of learning.” Students’ participation in the process of learning is actually more valuable than any assignment they produce.
Aligning AI with learning, not efficiency
In AI research, the term ‘alignment’ refers to building GenAI systems that reflect the designer’s goals and values. This can include fairness, privacy and justice, with the intention of preventing sophisticated systems from enacting catastrophic harms. Educational tools using AI need to clarify the alignment between the purpose of education and the capabilities of these platforms. It is counterproductive for GenAI to write essays for students or confidently produce incorrect answers to calculus problems. Students may benefit from scalable, accessible and personalized feedback and support, and from learning to appraise the limitations of tools they will encounter beyond school – but if education merely trails the GenAI industry, those needs will never be met.
For a GenAI system to be truly aligned with the goals of education, it would have to:
1. Detect when students misuse it to bypass learning
2. Respond to re-engage students in the task
3. Respect the cognitive, emotional and social needs of students
Process over product
In education, process matters. Building critical thinking skills, like interpreting new information, analyzing the strengths and weaknesses of different arguments and drawing logical conclusions, will serve students no matter what major or career they eventually choose. Students build these skills through time-intensive cycles of practice and responding to feedback. Self-regulation is also a skill and gaining experience handling challenging, but manageable, tasks gives students the confidence and abilities to tackle challenges in and beyond school.
Current systems of student assessment undervalue the process of learning and educators’ roles as irreplaceable facilitators guiding students through it. When assessment focuses on a test score, the outcome is treated as more important than the process. Refocusing assessment to capture how knowledge and skills develop, transfer from one context to another and deepen over time would provide a better framework for aligning educational and GenAI goals.
This doesn’t mean GenAI has no role in education. It means we must build it with educators, for learners and against the grain of extractive business logic.
Co-design with teachers
Experienced teachers have a repertoire of skills and methods for enhancing the process of learning with technology. The AI sector would be wise to learn from them. To genuinely meet the needs of students, educators must be embedded in the co-design of GenAI tools – not just consulted, or given access to AI training and curricula as an afterthought. Educator involvement is not just ethical; it’s essential for building tools that promote meaningful learning. Co-designing with teachers aligns GenAI systems with the knowledge and best practices that have been earned in the classroom.
What values will we design GenAI tools to embody? Some GenAI tools and approaches to integrating GenAI in classrooms emphasize capabilities as a ‘thought partner’, structuring interactions to elicit student work, instead of producing it. Enhancing those capabilities by designing them in partnership with educators is one important component to meeting this sea change in educational technology.
We have demonstrated that tech in the classroom works when it engages educators, is student-driven, sparks curiosity and enables discovery. GenAI tools have been criticized for being inequitably accessible to users, for inappropriately using copyrighted texts in their training sets, for the intense energy and water resources they consume and for producing bad, biased, misleading or dangerous results. Perhaps each of these issues can be resolved with conscientious leadership and engineering know-how. But the project of education in society is not to efficiently deliver a set of correct answers. Designing GenAI for education can’t succeed without educators. If current trends continue to focus on student work outputs rather than learning processes then, to invoke computer scientist Stuart Russell, “we may get exactly what we asked for, but not what we want.”
Jessica Ochoa Hendrix, CEO and Co-Founder of Killer Snails, also contributed to this piece.
License and Republishing
World Economic Forum articles may be republished in accordance with the Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International Public License, and in accordance with our Terms of Use.
The views expressed in this article are those of the author alone and not the World Economic Forum.