Local communities should be part of designing humanitarian AI

A preparatory workshop, held in collaboration with the Red Cross, to develop a new humanitarian AI-powered tool to improve crisis response in Nepal. Image: Saurav Poudel

Kathy Peach
Director, Centre for Collective Intelligence Design, Nesta
Aleks Berditchevskaia
Principal Researcher, Centre for Collective Intelligence Design, Nesta

  • The humanitarian sector is increasingly trying to anticipate needs, including by using AI.
  • But AI tools are mostly built by and for large organizations, not local actors.
  • Humanitarian AI should be designed with crisis-affected communities.

When it comes to managing a crisis, timing is everything. Getting ahead of a disaster through predictions and early warning systems, or receiving timely and accurate information about the impact of a crisis, can make all the difference to hundreds of lives. That’s why the humanitarian sector is increasingly promoting anticipatory action as a key way to reduce humanitarian needs. One of the emerging tools at the heart of this change is artificial intelligence (AI).

In 2021, a World Bank analysis of AI innovation in disaster risk management highlighted that AI is being used for a wide range of tasks, including vulnerability mapping, modelling population movement, predicting risks and supporting damage assessment. These tools have unprecedented potential to equip both responders and communities affected by crises with the timely and relevant information they need to make smarter decisions.

But deploying humanitarian AI presents several challenges. Some of these, such as the risks of data bias, threats to privacy and poor interpretability of models, are common to all high-stakes domains, from medicine to social services. Others, such as the lack of accountability and the danger of reinforcing old power dynamics between international and local actors, pose a unique threat.

The risks posed by AI are in direct opposition to the Core Humanitarian Standard, the charter to which the aid sector strives to adhere, and also undermine sector commitments to give more agency to local organizations. An analysis of predictive analytics in the humanitarian sector showed that these tools are mostly used by large international agencies or industry. Our own research into existing AI tools for humanitarian action found that even when AI models are trained using locally generated data or insights, less than one-third are built to be used by local organizations or crisis-affected populations.

Unsurprisingly, AI tools are rarely designed or developed with the participation of the local people who would be affected by them. This puts communities at risk: AI models are notoriously bad at transferring to new contexts. If they are built by teams thousands of miles away without meaningful local input, they are likely to miss relevant insights about which problems to address, what data could prove useful, and how these technologies should be used.

Humanitarian AI: a novel approach

The solution to this challenge lies in building locally developed and locally owned humanitarian AI, using participatory AI methodologies. Tools should be created that harness the collective intelligence of crisis-affected communities, for use by local organizations and communities who are typically the first responders to any crisis.

With a grant from the UK Humanitarian Innovation Hub, Nesta’s Centre for Collective Intelligence Design and the International Federation of Red Cross and Red Crescent Societies recently tested out this new approach, developing two new tools in Nepal and Cameroon to improve frontline crisis response. This project included working with communities to build new, more diverse datasets on which to train AI models – helping to minimize the risk of biased data. It also meant working with communities and frontline responders to agree how the tools would be evaluated – ensuring the AI model was optimized for the outcomes that mattered to them. In addition, communities were involved in refining the AI models during the development phase. As a result of community workshops in Nepal, ethnicity was removed as one of the model inputs, while in Cameroon, Red Cross volunteers identified a blind spot in the way the model worked.

Overview of the methodology that influenced the development and testing of two humanitarian AI models that powered a tool for battling misinformation in Cameroon. Image: Nesta/Green-Doe Graphic Design Ltd
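
To make the participatory steps above more concrete, here is a minimal sketch in Python of what excluding a sensitive attribute from the model inputs and evaluating against a community-agreed criterion might look like. The file name, column names, model and metric are illustrative assumptions, not the project's actual pipeline.

```python
# Minimal sketch, not the project's actual pipeline: the file name, columns and
# metric below are hypothetical illustrations of the participatory steps above.
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import recall_score
from sklearn.model_selection import train_test_split

# Hypothetical locally collected survey data with numeric feature columns and a
# 0/1 target indicating whether a household needs assistance.
df = pd.read_csv("household_survey.csv")

# Ethnicity is dropped from the inputs, mirroring the outcome of the Nepal workshops.
X = df.drop(columns=["needs_assistance", "ethnicity"])
y = df["needs_assistance"]

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=0)
model = LogisticRegression(max_iter=1000).fit(X_train, y_train)

# A community-agreed success criterion might prioritise recall, i.e. not
# overlooking households that do need assistance.
print("Recall on held-out data:", recall_score(y_test, model.predict(X_test)))
```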

Though challenging, projects like this one prove that it is possible to build local AI with local data, local infrastructure and local talent, and that it is possible to build AI that responds to local priorities and values. In short, taking a participatory AI approach that foregrounds local context and affected communities can help create more robust and socially acceptable AI tools.

Making inclusive AI a mainstream practice rather than an experimental one-off will require a shift across the sector from funders, agencies and tech innovators. Investing in the right technical skills at a local level is an obvious starting point. In addition, we need technologists everywhere to develop AI models that work in resource-constrained, data-scarce settings (which is the reality for most countries facing humanitarian crises), rather than building large, supervised, data-hungry models.
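
As a rough, hypothetical illustration of what a data-frugal model can look like, the sketch below trains a small text classifier that could run on modest local hardware with only a handful of labelled examples. The messages, labels and task framing are assumptions made for illustration, not the models built in Nepal or Cameroon.

```python
# A minimal, hypothetical sketch of a lightweight, data-frugal model: a linear
# classifier over TF-IDF features, trainable on a small, locally labelled dataset.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Hypothetical community-reported messages; 1 = flag for volunteer verification.
messages = [
    "flood water is rising near the main bridge",
    "rumour that aid distribution has been cancelled",
    "the road to the health post is blocked by a landslide",
    "people are saying the water point makes children sick",
]
labels = [0, 1, 0, 1]

clf = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression(max_iter=1000))
clf.fit(messages, labels)

print(clf.predict(["someone claims the shelter will close tomorrow"]))
```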

We need humanitarian funders to invest in a coordinated approach to filling data gaps, and to build open datasets that are important to the operations of local humanitarian responders and communities, not just the big agencies. And we need to see more efforts to design AI tools with the participation and oversight of crisis-affected communities. To achieve this, it will be critical to upskill community engagement teams to become participatory AI practitioners – and to develop the skills and tools that can help bridge the gap between technologists and local communities.

Changing the prevailing logic of humanitarian tech development won't be easy, but what if every agency or funder simply started asking the question: How will affected communities participate in the design and oversight of this technology? This is surely the first move in creating humanitarian AI that is truly intelligent.
