Best practices for using facial recognition in law enforcement

Policymakers are increasingly aware of the opportunities and risks of law enforcement’s use of facial recognition technology. Image: Pixabay

Lofred Madzou
Project Lead, Artificial Intelligence and Machine Learning, World Economic Forum LLC
John Riemen
Lead Biometric Specialist, Center for Biometrics, Netherlands Police
Luc Garcia
Face Examiner, International Criminal Police Organization (INTERPOL)
Odhran McCarthy
Programme Officer, Centre for AI and Robotics, UNICRI
Maria Eira
Information and Technology Officer, Centre for AI and Robotics, UNICRI

  • Facial recognition technology has the potential to help conduct faster investigations, bring offenders to justice and, thus, resolve, stop and prevent crimes.
  • Yet widespread use by law enforcement agencies raises concerns about wrongful arrests, surveillance and human rights violations.
  • A new white paper from the World Economic Forum, in partnership with the International Criminal Police Organization (INTERPOL), the Centre for Artificial Intelligence and Robotics of the United Nations Interregional Crime and Justice Research Institute (UNICRI) and the Netherlands police, offers a framework to ensure the responsible use of facial recognition technology.
  • Tests of this framework will start in January 2022.

In April 2021, the European Commission (EC) released its much-awaited Artificial Intelligence Act, a comprehensive regulatory proposal that classifies AI applications into distinct risk categories. Among the identified high-risk applications, remote biometric systems, which include facial recognition technology (FRT), were singled out as particularly concerning: in the absence of robust governance mechanisms, their deployment, specifically in the field of law enforcement, may lead to human rights abuses.

Law enforcement and facial recognition technology

Across jurisdictions, policymakers are increasingly aware of both the opportunities and risks associated with law enforcement’s use of FRT. Here, facial recognition refers to the (possible) recognition of a person by comparing a probe image (a photo, or a still taken from video footage, of a suspect or person of interest) against facial images of criminals and missing persons stored in one or more reference databases, in order to advance a police investigation.
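At its core, the probe-versus-reference comparison described above is a one-to-many search: the probe image is encoded into a feature vector and scored against every enrolled face, and any sufficiently similar candidates are returned for human review. The sketch below is purely illustrative and assumes embeddings have already been extracted by some face-encoding model; the vectors, names and threshold are invented for the example, not drawn from any real system.

```python
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine similarity between two face-embedding vectors."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def search_gallery(probe: np.ndarray, gallery: dict,
                   threshold: float = 0.6) -> list:
    """Return candidate identities whose similarity to the probe exceeds
    the threshold, best match first. In a responsible deployment these are
    investigative leads for a trained examiner, not identifications."""
    scores = [(name, cosine_similarity(probe, emb)) for name, emb in gallery.items()]
    candidates = [(n, s) for n, s in scores if s >= threshold]
    return sorted(candidates, key=lambda t: t[1], reverse=True)

# Illustrative 4-dimensional embeddings (real systems use hundreds of dimensions).
gallery = {
    "person_a": np.array([0.9, 0.1, 0.0, 0.2]),
    "person_b": np.array([0.1, 0.8, 0.3, 0.1]),
}
probe = np.array([0.85, 0.15, 0.05, 0.25])
print(search_gallery(probe, gallery))
```

The threshold is the key operational parameter: lowering it surfaces more candidates but increases the risk of false matches, which is why the white paper's concerns about performance and oversight attach directly to how such parameters are set and audited.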

On one hand, FRT has the potential to help resolve, stop and prevent crimes and bring offenders to justice. More specifically, it could be useful in various types of investigations, including identifying the perpetrator of ATM fraud, searching for a terrorist in public spaces, fighting child abuse or finding missing persons. On the other hand, early experience shows that without proper oversight, FRT can result in human rights abuses and harm to citizens.

In this context, striking the right balance appears difficult. Policymakers may explore various options ranging from an outright ban to the introduction of additional accountability mechanisms to limit the risk of wrongful arrests. In the US, cities such as San Francisco, Oakland and Boston have banned the use of FRT by public agencies, while the states of Washington, Virginia and Massachusetts have introduced legislation to regulate its use. In other regions, court decisions play an important role in shaping the policy agenda. The UK Court of Appeal ruled unlawful the deployment of FRT by the South Wales Police to identify wanted persons at certain events and public locations where crime was considered likely to occur.

At a more global level, the United Nations Office of the High Commissioner for Human Rights’ (OHCHR) recent report on the right to privacy in the digital age recommends that governments halt the use of real-time remote biometric recognition in public spaces until they can show there are no significant problems with accuracy or discriminatory effects. It also suggests that these AI systems must comply with robust privacy and data protection standards.
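The OHCHR's twin concerns, accuracy and discriminatory effects, can be made concrete with a standard audit question: does the false match rate (the fraction of different-person comparisons the system would wrongly accept) differ across demographic groups at the operating threshold? A minimal sketch of that check, assuming labelled impostor-comparison scores are available; the scores and group names below are invented for illustration:

```python
def false_match_rate(impostor_scores: list, threshold: float) -> float:
    """Fraction of impostor (different-person) comparison scores that meet
    or exceed the decision threshold, i.e. would be wrongly accepted."""
    if not impostor_scores:
        return 0.0
    return sum(s >= threshold for s in impostor_scores) / len(impostor_scores)

# Invented impostor similarity scores, grouped by demographic group.
scores_by_group = {
    "group_1": [0.12, 0.30, 0.55, 0.61, 0.20],
    "group_2": [0.15, 0.42, 0.65, 0.70, 0.68],
}
threshold = 0.6
fmr_by_group = {g: false_match_rate(s, threshold)
                for g, s in scores_by_group.items()}
# A large gap between groups at the same threshold signals a
# discriminatory effect that an oversight body should investigate.
print(fmr_by_group)  # → {'group_1': 0.2, 'group_2': 0.6}
```

In practice such audits require large, representative score sets per group; the point of the sketch is only that "discriminatory effects" is a measurable property, not an abstract worry.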

Facial recognition technology requires a robust governing structure

Despite these important developments, most governments around the world recognize the potential of facial recognition systems for national safety and security, yet they are still grappling with how to regulate FRT, in part because crucial considerations have been largely overlooked. If the proportional use of FRT for legitimate policing aims were authorized, which oversight body should assess the compliance of law enforcement activities with human rights and handle complaints from citizens? How might a high level of performance be maintained in the FRT solutions deployed? What procurement processes should law enforcement agencies have in place?

To address these challenges, the World Economic Forum – in partnership with the International Criminal Police Organization (INTERPOL), the Centre for Artificial Intelligence and Robotics of the United Nations Interregional Crime and Justice Research Institute (UNICRI) and the Netherlands police – has released a white paper that introduces a governance framework structured around two critical components:

  • A set of principles for action that defines what constitutes responsible use of facial recognition for law enforcement investigations by covering all relevant policy considerations;
  • A self-assessment questionnaire that details the requirements that law enforcement agencies must respect to ensure compliance with the principles for action.

As such, this initiative represents the most comprehensive policy response to the risks associated with FRT for law enforcement investigations, led by a global and multistakeholder community.

A new initiative represents the most comprehensive policy response to the risks associated with FRT for law enforcement investigations. Image: World Economic Forum

Moving forward

This project is now entering the pilot phase. During this period, we will test the governance framework to assess its achievability, relevance, usability and completeness, and we will update it based on the observed results.

The Netherlands police is the first law enforcement agency to agree to participate in the testing process. Given the sensitivity of this use case, we strongly encourage other law enforcement agencies to join us and contribute to this global effort. We also invite policymakers, industry players, civil society representatives and academics engaged in the global policy debate about the governance of facial recognition technology to join our initiative.

Once this pilot phase is completed, we will update the principles and the self-assessment questionnaire, and the final version will be published.

License and Republishing

World Economic Forum articles may be republished in accordance with the Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International Public License, and in accordance with our Terms of Use.

The views expressed in this article are those of the author alone and not the World Economic Forum.
