
Google's new AI knows which images you'll like - before you've even seen them


A Google algorithm has learned which pictures we'll find 'aesthetically near-optimal'. Image: REUTERS/Juan Medina

Rob Smith
Writer, Forum Agenda

Machines can now rank photos according to their aesthetic appeal, in much the same way humans do, thanks to a new artificial intelligence (AI) model created by researchers at Google.

Neural Image Assessment, or NIMA, is built on a machine learning architecture known as a convolutional neural network (CNN), trained on images that have been rated and labeled by humans.

Google says its new AI can predict which images will hold most appeal Image: Pixabay

Unlike existing models, which typically categorize images only by technical quality, NIMA can be trained to predict which images a human would rate as both technically good and aesthetically attractive.

NIMA achieves this by recognizing and classifying images based on a variety of characteristics humans associate with emotions and beauty, scoring each on a scale of one to 10, the researchers said in a blog post announcing the technology.

“This is more directly in line with how training data is typically captured, and it turns out to be a better predictor of human preferences when measured against other approaches,” the researchers claim.
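The scoring scheme described above can be sketched in a few lines: a NIMA-style model's classifier head outputs a probability for each rating bucket from one to 10, and the image's score is the mean of that predicted distribution. This is a minimal illustration with a made-up distribution, not Google's code; `mean_score` and the example probabilities are hypothetical.

```python
import numpy as np

def mean_score(probs):
    # Mean of a predicted rating distribution over the buckets 1..10.
    # `probs` stands in for the softmax output of a NIMA-style CNN head,
    # with one probability per possible human rating.
    scores = np.arange(1, 11)
    return float(np.sum(probs * scores))

# Hypothetical prediction for one image: most raters would give it a 7 or 8.
probs = np.array([0.00, 0.01, 0.02, 0.05, 0.10, 0.15, 0.30, 0.25, 0.09, 0.03])
print(round(mean_score(probs), 2))  # → 6.89
```

Predicting the whole distribution, rather than a single number, is what lets the model mirror how the training data was actually collected: many human ratings per image.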

To test the model’s capabilities, NIMA was pitted against photos from the large-scale Aesthetic Visual Analysis (AVA) database. Each AVA photo was scored by an average of 200 people as part of photography contests.

After training, the aesthetic ranking of many of these photos by NIMA closely matched the mean scores given by humans, as illustrated below.

Image: Google

According to Google, NIMA scores can be used to compare different versions of the same image that have been distorted in various ways. They can also be used to enhance an image’s perceptual quality, by finding “aesthetically near-optimal” settings for brightness, highlights and shadows.

NIMA can rate the quality of different versions of the same image. Image: Google
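One way such an enhancement step could work is a simple search over candidate settings that keeps whichever version the model scores highest. In the sketch below, the `score` function is a deliberately crude stand-in for NIMA (it simply prefers mid-brightness images), and `best_gain` is a hypothetical helper, not Google's implementation.

```python
def score(brightness):
    # Stand-in aesthetic scorer: rates images on a 1-10-like scale,
    # peaking when mean brightness (0..1) is near 0.5.
    return 10 - 20 * abs(brightness - 0.5)

def best_gain(base_brightness, gains):
    # Apply each candidate brightness gain (clamped to the valid range)
    # and keep the one the scorer rates highest.
    return max(gains, key=lambda g: score(min(base_brightness * g, 1.0)))

gains = [0.6, 0.8, 1.0, 1.2, 1.4]
print(best_gain(0.4, gains))  # a dark image gets brightened → 1.2
```

The same search-against-a-scorer pattern extends to other parameters the researchers mention, such as highlights and shadows.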

NIMA could also be used to improve photography in real-time and assist in the editing process, the researchers say.

Speaking with authority

Interestingly, NIMA is just one of a number of machine learning tools with near-human capabilities that Google has developed.

For example, a research paper published by the tech giant in January details a text-to-speech system called Tacotron 2, which generates speech from text with near-human naturalness.

The system can also handle hard-to-pronounce words and names, and alters its enunciation based on punctuation. For instance, capitalized words are stressed, as a speaker would stress a word to signal that it is an important part of a sentence.

Chess champion

Meanwhile, researchers from Google’s DeepMind AI lab recently developed AlphaZero, which taught itself chess from scratch in around four hours.

After being given only the rules of chess, AlphaZero mastered the game well enough to beat Stockfish, the highest-rated chess-playing program.


David Kramaley, CEO of chess science website Chessable, said: “It will no doubt revolutionize the game, but think about how this could be applied outside chess. This algorithm could run cities, continents, universes.”


License and Republishing

World Economic Forum articles may be republished in accordance with the Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International Public License, and in accordance with our Terms of Use.

The views expressed in this article are those of the author alone and not the World Economic Forum.
