Testing chemicals, medicines, and cosmetics on animals may soon be a thing of the past thanks to this new, “more reliable” approach.
A recently published study shows that advanced algorithms working from large chemical databases can predict a new chemical’s toxicity better than standard animal tests.
Researchers at the Johns Hopkins Bloomberg School of Public Health mined a large database of known chemicals they had developed, mapping the relationships between chemical structures and toxic properties. They then showed that this map can be used to automatically predict the toxic properties of any chemical compound—more accurately than a single animal test.
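The core idea—predicting a new chemical's toxicity from its similarity to chemicals whose toxicity is already known—can be illustrated with a minimal sketch. This is a toy example of similarity-based ("read-across") prediction, not the authors' actual algorithm or data; the fingerprints and labels below are invented for illustration.

```python
# Toy sketch of read-across toxicity prediction: a new chemical
# inherits the label of its most structurally similar known neighbor.
# Fingerprints are sets of hypothetical structural-feature IDs.

def tanimoto(a, b):
    """Tanimoto (Jaccard) similarity between two binary fingerprints."""
    a, b = set(a), set(b)
    return len(a & b) / len(a | b)

def predict_toxicity(query_fp, known):
    """Return the toxicity label of the most similar known chemical."""
    best = max(known, key=lambda entry: tanimoto(query_fp, entry["fp"]))
    return best["toxic"]

# Hypothetical database entries (made up for this sketch):
known_chemicals = [
    {"fp": {1, 4, 7, 9}, "toxic": True},
    {"fp": {2, 3, 8},    "toxic": False},
    {"fp": {1, 4, 6},    "toxic": True},
]

print(predict_toxicity({1, 4, 7}, known_chemicals))  # closest match is the first entry -> True
```

The study's tool works over a far larger database and richer structure–property relationships, but the principle is the same: chemicals with similar structures tend to have similar toxic properties.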
The most advanced toxicity-prediction tool the team developed reproduced consensus animal-test results with about 87% accuracy on average, across nine common tests that together account for 57% of the world's animal toxicology testing. By contrast, repetitions of the same animal tests in the database were only about 81% accurate—in other words, any given test had, on average, only an 81% chance of returning the same toxicity result when repeated.
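The 81% reproducibility figure is a test-retest concordance: the fraction of repeated runs of the same test that agree with the original outcome. A tiny sketch of how such a figure is computed—with made-up outcomes, not the study's data:

```python
# Toy illustration of test-retest concordance: the fraction of
# (first_result, repeat_result) pairs that agree. Outcomes are invented.

def concordance(pairs):
    """Fraction of result pairs where the repeat matched the original."""
    agree = sum(1 for first, repeat in pairs if first == repeat)
    return agree / len(pairs)

# Hypothetical repeated outcomes for five chemicals:
repeats = [("toxic", "toxic"), ("toxic", "safe"), ("safe", "safe"),
           ("safe", "safe"), ("toxic", "toxic")]

print(concordance(repeats))  # 4 of 5 pairs agree -> 0.8
```

In the study, this kind of calculation over the database's repeated animal tests is what yields the ~81% baseline that the 87%-accurate computational tool is measured against.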
“These results are a real eye-opener—they suggest that we can replace many animal tests with computer-based prediction and get more reliable results,” says the study’s principal investigator Thomas Hartung.
The computer-based approach could also be applied to many more chemicals than animal testing, which could lead to wider safety assessments. Due to costs and ethical challenges, only a small fraction of the roughly 100,000 chemicals in consumer products have been comprehensively tested.
Animals such as mice, rabbits, guinea pigs and dogs annually undergo millions of chemical toxicity tests in labs around the world. Although this animal testing is usually required by law to protect consumers, it is opposed on moral grounds by large segments of the public, and is…