
What is Nightshade – the new tool allowing artists to ‘poison’ AI models?


Nightshade allows artists to change pixels in their work in a way that is invisible to the human eye. Image: Pexels/Google Deepmind

Victoria Masterson
Senior Writer, Forum Agenda

  • Nightshade is a “data poisoning tool” developed at the University of Chicago to confuse AI programs that generate images.
  • Artists can deploy it to try to stop AI from using their work without permission.
  • Generative AI ranks as the world’s second top emerging technology in the World Economic Forum’s Top 10 Emerging Technologies of 2023 report.

The word “nightshade” brings to mind the highly poisonous plant, deadly nightshade. But now it is also the name of a “data poisoning tool” helping artists fight back against unauthorized use of their work by artificial intelligence (AI).

Nightshade has been developed by computer scientists at the University of Chicago, who have published a paper on their work, Prompt-Specific Poisoning Attacks on Text-to-Image Generative Models.

What is Nightshade?

Nightshade allows artists to change pixels in their work in a way that is invisible to the human eye. This misleads AI programs, which interpret the image as showing something different from what human viewers see.

“Dogs become cats, cars become cows, and so forth,” the MIT Technology Review explains in its preview article on Nightshade.
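The researchers' method relies on carefully optimised perturbations, but the basic idea of an imperceptible pixel change can be sketched in a few lines. The toy Python example below is not Nightshade's actual algorithm; it simply adds a small, bounded random shift to an image array and checks that no pixel moves by more than a couple of values out of 255, far too little for the eye to notice.

```python
import numpy as np

# Toy illustration only: Nightshade computes carefully optimised perturbations,
# not random noise. This simply shows what a "visually imperceptible" pixel
# change means in practice.

rng = np.random.default_rng(0)

# Stand-in for an artwork: a 256x256 RGB image with pixel values in 0-255.
image = rng.integers(0, 256, size=(256, 256, 3), dtype=np.uint8)

# Perturbation budget: each pixel channel may shift by at most +/-2 out of 255.
epsilon = 2
perturbation = rng.integers(-epsilon, epsilon + 1, size=image.shape)

# Apply the shift and clamp back to the valid pixel range.
poisoned = np.clip(image.astype(int) + perturbation, 0, 255).astype(np.uint8)

# The two images look identical to a person, but the underlying data differs.
max_change = np.abs(poisoned.astype(int) - image.astype(int)).max()
print(f"Maximum per-pixel change: {max_change} / 255")
```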


This approach is known as “poisoning” data. It leads generative AI tools to produce images that are distorted or that don’t match the prompt.

AI tools extract data from billions of images online, so the more Nightshade is used, the more AI could become poisoned, MIT Technology Review adds.
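The cumulative effect can be illustrated with a deliberately simplified analogy, not the prompt-specific attack described in the paper: imagine a model that learns a concept as the average of the training examples carrying that label. As poisoned examples, labelled “dog” but with cat-like features, make up a growing share of the data, the learned concept drifts away from what “dog” should look like.

```python
import numpy as np

# Deliberately simplified analogy (not the paper's method): a "model" that
# learns a concept as the mean of the training examples labelled with it.
# Poisoned examples carry the label "dog" but features closer to "cat",
# so as their share grows, the learned "dog" concept drifts toward "cat".

rng = np.random.default_rng(1)

def learned_concept(clean_count, poison_count):
    # Clean "dog" examples cluster around feature value 0.0;
    # cat-like poison examples cluster around 1.0 (2-D toy features).
    clean = rng.normal(loc=0.0, scale=0.1, size=(clean_count, 2))
    poison = rng.normal(loc=1.0, scale=0.1, size=(poison_count, 2))
    return np.vstack([clean, poison]).mean(axis=0)

for poisoned in (0, 50, 200, 500):
    concept = learned_concept(clean_count=1000, poison_count=poisoned)
    print(f"{poisoned:>3} poisoned samples -> learned 'dog' concept at {concept.round(2)}")
```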

How Nightshade poisons the data that AI tools use. Image: University of Chicago

Why has Nightshade been developed?

AI companies disrespect artists’ copyright and intellectual property when they train their image-generating tools on artists’ work without permission, says Ben Zhao, the University of Chicago computer science professor leading the Nightshade project.

He hopes Nightshade will deter this behaviour and give some power back to artists.

Artists, performers and record labels have filed lawsuits against AI companies including OpenAI, the creator of ChatGPT, for training their AI on artists’ material without permission, reports tech news site VentureBeat.

How could Nightshade affect the AI industry?

Nightshade could have a significant impact if AI companies are forced to be more respectful of artists and their rights, for example by agreeing to pay out royalties, say contributors quoted in the MIT Technology Review article.

Some AI companies have mechanisms for artists to opt out of their work being used to train generative AI. But artists argue that this isn’t enough, and that AI companies still have too much power.


Generative AI is a global emerging technology

Generative AI is one of the top technologies listed in the World Economic Forum’s Top 10 Emerging Technologies of 2023 report.

The report highlights the technologies expected to have a positive impact on society in the next three to five years. Generative AI ranks in second place.

“While generative AI is still currently focused on producing text, computer programming, images and sound, this technology could be applied to a range of purposes, including drug design, architecture and engineering,” the Forum says.

Generative AI applications, though, should build public trust by meeting agreed professional and ethical standards.


License and Republishing

World Economic Forum articles may be republished in accordance with the Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International Public License, and in accordance with our Terms of Use.

The views expressed in this article are those of the author alone and not the World Economic Forum.

Related topics: Emerging Technologies, Arts and Culture