Stepping up toxicity testing means a safer environment for all

Posted on Tuesday, January 28, 2020

A view of a polluted city.

Experts report ‘substantial progress’ in shifting away from animal testing to speedier and less costly testing methods.

By Michelle Read

Humans are exposed to thousands of chemical substances every day. Yet alarmingly, the vast majority have not been tested in depth for their effects on the human body.

To date, more than 85 million chemicals have been synthesized, including 140,000 produced on a commercial scale, with even more occurring in natural products. However, only 10,000 to 20,000 have been tested for toxicity in humans, signalling a need to catch up with the growing numbers.

Fortunately, a group of risk assessment experts, led by Faculty of Medicine professor Daniel Krewski, reports that science has made “substantial progress” over the past decade in finding faster, more reliable methods of determining toxicity, including methods that don’t rely on animal testing.

In fact, US and international government agencies are beginning to incorporate the new methodologies into regulatory practice, with the US Environmental Protection Agency announcing that it will “stop conducting or funding toxicity studies on mammals by 2035.”

The international group of experts in risk assessment reported on their progress in a comprehensive review, published in the December 2019 Archives of Toxicology, titled Toxicity testing in the twenty-first century: progress in the past decade and future perspectives.

The review serves as a follow-up to the 2007 US National Research Council report, Toxicity Testing in the 21st Century: A Vision and a Strategy, produced by a committee chaired by Professor Krewski. This landmark report laid out a bold new vision for toxicity testing and defined more efficient testing strategies for the large number of chemical substances present in the environment.

This new paradigm in toxicity testing relies on high-throughput in vitro testing of human cells, as well as computer models. The vision also identifies a range of “new approach methodologies” that would greatly accelerate the rate of toxicity testing and expand the scientific toolbox available for testing.

Traditionally, animals have been used for toxicity testing, a method the authors note is both expensive and time consuming. Nor does it always give reliable information about toxicity in humans, which is why the new vision emphasizes alternative approaches.

In their December 2019 review, the authors identify the importance of gaining a better understanding of toxicity pathways in the human body, called “mapping the human toxome.”

“Mapping the human toxome will require a ‘big science’ effort,” say the authors, but is essential to understanding and predicting what chemicals could pose a toxic hazard.

Other factors have contributed to making toxicity testing speedier and less costly since 2007, says the group. Major advances in technology, increasing open-source data and more bioinformatic tools have allowed more rapid assessment of chemical toxicity.

“We must embrace new ways of thinking about risk assessment science,” the authors say, if we are to improve the safety assessment of environmental and industrial chemicals.

Collaborators in this latest review included both academic and regulatory scientists from Canada, the United States and Europe. The hope is that realizing the 2007 vision will dramatically reduce the potential health risks in the human environment.

“We expect this paradigm shift in toxicological risk assessment to provide greater assurance of the safety of the large number of agents we are exposed to,” concludes Dr. Krewski.

A view of a factory spewing pollution into a clear blue sky.

Of the 85 million chemicals that have been synthesized to date, only 10,000 to 20,000 have been tested for toxicity in humans – signalling a need to catch up to the growing numbers.

