Artificial Intelligence

Hey Siri, you're sexist, finds UN report on gendered technology

Luke Peters demonstrates Siri, a voice-recognition application on the iPhone 4S, outside the Apple store in Covent Garden, London, October 14, 2011. REUTERS/Suzanne Plunkett

Many believe that Siri is submissive in the face of gender abuse. Image: REUTERS/Suzanne Plunkett

Sonia Elks
Journalist, Reuters

Popular digital assistants that reply in a woman's voice and are styled as female helpers are reinforcing sexist stereotypes, according to a United Nations report.

The vast majority of assistants such as Apple's Siri, Amazon Alexa and Microsoft's Cortana are designed to be seen as feminine, from their names to their voices and personalities, said the study.

They are programmed to be submissive and servile - including politely responding to insults - meaning they reinforce gender bias and normalise sexist harassment, said researchers from the U.N. scientific and cultural body UNESCO.

The study highlighted that Siri was previously programmed to respond to users calling her a "bitch" by saying "I'd blush if I could" as an example of the issue.

"Siri's submissiveness in the face of gender abuse – and the servility expressed by so many other digital assistants projected as young women – provides a powerful illustration of gender biases coded into technology products," it said.

Apple, Amazon and Microsoft did not immediately respond to requests for comment.

A spokeswoman for Microsoft has previously said the company researched voice options for Cortana and found "a female voice best supports our goal of creating a digital assistant".

Voice assistants have quickly become embedded into many people's everyday lives and they now account for nearly one-fifth of all internet searches, said the report, which argued they can have a significant cultural impact.

As voice-powered technology reaches into more communities worldwide, the feminisation of digital assistants may help gender biases to take hold and spread, the researchers added.

"The world needs to pay much closer attention to how, when and whether AI technologies are gendered and, crucially, who is gendering them," said Saniye Gulser Corat, UNESCO's director for gender equality.

The report called on companies to take action, including no longer making digital assistants female by default, exploring gender-neutral options and programming assistants to discourage gender-based insults and abusive language.

A team of creatives developed the first gender-neutral digital assistant voice earlier this year in an attempt to avoid reinforcing sexist stereotypes.

The UNESCO report was welcomed by women's groups, with Womankind spokeswoman Maria Vlahakis saying it gave "much needed attention" to gender bias in algorithms.

"These algorithms perpetuate gender stereotypes and sexist and misogynist behaviour, and reflect wider structural gender inequalities in technology," she said.

License and Republishing

World Economic Forum articles may be republished in accordance with the Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International Public License, and in accordance with our Terms of Use.

The views expressed in this article are those of the author alone and not the World Economic Forum.

© 2024 World Economic Forum