Why tech needs to focus on the needs of marginalized groups

Participation in tech development is key to enabling equality of experience. Image: Noa Snir for the Digital Freedom Fund.

Nani Jansen Reventlow
Director, Digital Freedom Fund

  • Marginalized groups are not well represented in the design and implementation of new technology.
  • The dominant narrative suggests that technology can be "fixed" to address systemic issues prevalent in society.
  • What we need instead is inclusive participation that centres the concerns of these groups.

When discussing the impact of technology on marginalized groups, all too often the focus is on “fixing technology” to address harms against marginalized groups. This narrative is fundamentally flawed: it is based on the premise that technology is imposed on marginalized groups by some third party, at best unaware – at worst indifferent – to how the technology will affect these groups.

It also disregards that deploying even the most perfectly functioning piece of technology can negatively affect groups because of the context in which it is rolled out. It is time we considered the needs of marginalized groups when designing and implementing technology, including engaging, consulting and encouraging the participation of these groups during the process, instead of continuing to feed a false narrative about technological "fixes" to structural power dynamics.

The idea persists that technology that negatively impacts marginalized and racialized groups in our society can be set straight through "fixing". This idea is reinforced even when policymakers are seeking to prevent technology from reproducing society's power structures, as evidenced by the European Commission's recent draft regulation on Artificial Intelligence.

The Commission’s White Paper on Ethics and Artificial Intelligence, which preceded the draft regulation published in April 2021, stated that “the use of algorithms can perpetuate or even stimulate racial bias if data to train algorithms does not reflect the diversity of EU society”, and the draft regulation follows that line by seeking to address "bias" in AI technology.

This position, for self-evident reasons, is also very popular with the big tech companies. IBM, for example, states that the shortcoming of facial recognition technology – which research by Joy Buolamwini, then a researcher at the MIT Media Lab, showed to work accurately only for white, male faces – lies "not with the AI technology itself, per se, but with how the AI-powered facial recognition systems are trained".

Marginalized and racialized groups disproportionately bear the brunt of the increased use of technology in our societies, and this needs to be addressed urgently. However, the focus on technological fixes is not the conversation we should be having.

What's the problem and how can we fix it?

It's true that we have a "brogrammer" problem: the choices made by a mostly white, male, privileged pool of designers and engineers are ingrained in the technology they design, imposing their perception of the world. And, yes, there are serious problems with the datasets used to train technology, which reproduce the power structures in our societies that keep systems of oppression in place for racialized and marginalized groups. But focusing on these aspects alone feeds a false narrative that as long as we can get the technology right, as long as we can "fix" it, using that technology will be fine. This is a misconception: we are not going to undo centuries of structural oppression by tweaking a few datasets.
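To make the dataset problem concrete, here is a minimal sketch, not from the article: the data, the "group" labels and all numbers are synthetic assumptions for illustration, and it assumes Python with numpy and scikit-learn. It trains one classifier on data dominated by a single group and shows how underrepresentation alone can skew outcomes:

```python
# Illustrative sketch only: synthetic data, hypothetical "group" labels.
# Shows how underrepresentation in training data can, by itself, produce
# disparate error rates between groups.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

def make_group(n, shift):
    # Each group's features and labelling rule differ slightly,
    # standing in for different real-world contexts.
    X = rng.normal(loc=shift, scale=1.0, size=(n, 2))
    y = (X[:, 0] + X[:, 1] + rng.normal(scale=0.5, size=n) > 2 * shift).astype(int)
    return X, y

# Group A dominates the training set; group B is barely represented.
Xa, ya = make_group(5000, shift=0.0)
Xb, yb = make_group(100, shift=1.5)
model = LogisticRegression().fit(np.vstack([Xa, Xb]), np.concatenate([ya, yb]))

# Fresh test samples: the model fits group A's pattern, not group B's,
# so accuracy for group B comes out markedly lower.
for name, shift in [("group A", 0.0), ("group B", 1.5)]:
    X_test, y_test = make_group(2000, shift)
    print(f"{name}: accuracy = {model.score(X_test, y_test):.2f}")
```

Rebalancing the training data would roughly equalize the two scores, but, as argued here, that addresses only the measurable disparity, not the context and power structures into which such a system is deployed.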

As EDRi (European Digital Rights), Europe's biggest membership organization for digital rights, said in response to the EU Commission's draft regulation: "There is no ‘quick fix’, no risk assessment sophisticated enough, to undo centuries of systemic racism and discrimination. The problem is not just baked into the technology, but into the systems in which we live."

The deployment of technology can, in and of itself, constitute violence against certain groups. We need only look at the use of technology in the policing of migration and borders, in law enforcement, or in the provision of essential government services to find a myriad of examples of how it exacerbates hardship, further entrenches power structures and hampers, sometimes even blocks, access to justice.

To truly address this, we need a fundamental shift in the debate. Instead of accepting and reinforcing the notion of a power dynamic in which technology is designed by a privileged few to then be imposed on marginalized groups, we should focus on ensuring that technology is designed and built around the needs of, and in participation with, individuals and communities.

We already have examples of community participation in, and ownership of, technology that we can build on: from an app to safely record and store incidents of police violence, created by a collective led by family members who lost a loved one to police brutality in France (a more institutional example in the US is a similar app developed by the ACLU), and a decolonial approach to smart cities in Togo, to feminist infrastructure projects that, instead of focusing on the mechanics of technology, look at how technological practices can be emancipatory tools for all of us.

These practices show that there is a different avenue open to us. It is true, as Laurence Meyer recently wrote, that "technology is not made out of thin air ... it is the product of extraction, of exploitation, of displacement, of violence." But this current reality is human-made, and it can be changed by us. The point is to start making changes now, while we can still create a more just future for everyone in our society. Those setting the terms for policy debates on technology and human rights, regulators, those investing in technology, and those buying and implementing it should not waste any time.
