Emerging Technologies

A low-cost, high-speed algorithm could put an end to needless animal testing

Patti Verbanas-Rutgers
Author, Futurity

Toxicity testing—determining the amount of exposure to a chemical that is unsafe for humans—is vital to the safety of millions of workers in various industries. But researchers have not comprehensively tested a majority of the 85,000 compounds in consumer products for safety.

Animal testing, in addition to its ethical concerns, can be too costly and time-consuming to meet this need, according to a new study in Environmental Health Perspectives.

“There is an urgent, worldwide need for an accurate, cost-effective and rapid way to test the toxicity of chemicals, in order to ensure the safety of the people who work with them and of the environments in which they are used,” says lead researcher Daniel Russo, a doctoral candidate at the Rutgers University-Camden Center for Computational and Integrative Biology. “Animal testing alone cannot meet this need.”

Previous efforts to solve this problem used computers to compare untested chemicals with structurally similar compounds whose toxicity is already known. But those methods could not assess structurally unique chemicals, and they were confounded by the fact that some structurally similar chemicals have very different levels of toxicity.

The researchers overcame these challenges by developing a first-of-its-kind algorithm that automatically extracts data from PubChem, a National Institutes of Health database of information on millions of chemicals. The algorithm compares chemical fragments from tested compounds with those of untested compounds, and uses multiple mathematical methods to evaluate their similarities and differences in order to predict an untested chemical’s toxicity.
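The fragment-comparison idea described above can be illustrated with a toy "read-across" predictor. Note that the fragment sets, the Tanimoto similarity measure, and the k-nearest-neighbour voting rule below are illustrative assumptions for the sketch, not the Rutgers group's actual algorithm.

```python
def tanimoto(frags_a, frags_b):
    """Jaccard/Tanimoto similarity between two sets of chemical fragments."""
    if not frags_a and not frags_b:
        return 0.0
    return len(frags_a & frags_b) / len(frags_a | frags_b)

def predict_toxicity(query_frags, training_data, k=3):
    """Predict a toxicity label for an untested compound by majority vote
    of the k tested compounds whose fragment sets are most similar."""
    scored = sorted(training_data,
                    key=lambda item: tanimoto(query_frags, item[0]),
                    reverse=True)
    top = [label for _, label in scored[:k]]
    return max(set(top), key=top.count)

# Hypothetical tested compounds: (fragment set, known oral-toxicity label)
training = [
    ({"C=O", "OH", "benzene"}, "toxic"),
    ({"C=O", "OH"}, "toxic"),
    ({"CH3", "NH2"}, "non-toxic"),
    ({"CH3", "NH2", "OH"}, "non-toxic"),
]

print(predict_toxicity({"C=O", "benzene"}, training))  # → toxic
```

A production system would derive fragments from real chemical structures (e.g. via fingerprinting) and combine several similarity measures rather than a single vote, but the core logic of scoring an untested compound against tested neighbours is the same.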

“The algorithm developed by Daniel and the Zhu laboratory mines massive amounts of data, and discerns relationships between fragments of compounds from different chemical classes, exponentially faster than a human could,” says coauthor Lauren Aleksunes, an associate professor at Rutgers’ Ernest Mario School of Pharmacy and the Environmental and Occupational Health Sciences Institute.

“This model is efficient and provides companies and regulators with a tool to prioritize chemicals that may need more comprehensive testing in animals before use in commerce.”

To fine-tune the algorithm, the researchers began with 7,385 compounds with known toxicity data and compared them with data on the same chemicals in PubChem. They then tested the algorithm on 600 new compounds.

For several groups of chemicals, the algorithm had a 62 percent to 100 percent success rate in predicting their level of oral toxicity. And by comparing relationships between sets of chemicals, the researchers shed light on new factors that can determine a chemical's toxicity.

Although the researchers directed the algorithm only to assess the chemicals' oral toxicity, they conclude that their strategy can be extended to predict other types of toxicity.

“While the complete replacement of animal testing is still not feasible, this model takes an important step toward meeting the needs of industry, in which new chemicals are constantly under development, and for environmental and ecological safety,” says coauthor Hao Zhu, an associate professor of chemistry.

Additional researchers from Rutgers, Integrated Laboratory Systems, the Johns Hopkins Bloomberg School of Public Health, and the University of Konstanz contributed to the work.

License and Republishing

World Economic Forum articles may be republished in accordance with the Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International Public License, and in accordance with our Terms of Use.

The views expressed in this article are those of the author alone and not the World Economic Forum.

© 2024 World Economic Forum