
New and Innovative Ways to Use Our Mountains of Data: How About a Framework for Safer Chemicals?

By Maria Trainer posted 03-25-2015 08:58 AM

  

 

We have a lot of data. A LOT of data. And we’re really good at generating more and more of it every day: a single metabolomics study can generate a terabyte of data. But what does it mean? Being good at generating data isn’t enough. We need to learn to use these data in new and innovative ways so that we can develop new chemistries more effectively and regulate existing ones more efficiently. As a starting point, how about a framework for developing a “safer” chemical? That was the vision articulated during this well-attended workshop on Tuesday afternoon.

While definitions of “green chemistry” vary, the convergence of toxicology and chemistry to develop lower-toxicity chemicals was a recurring theme in this session. Several speakers clearly articulated the need for a common language that lets toxicologists and chemists communicate and interact more openly, and from an earlier point in the product development cycle.

Structure dictates pretty much everything about a molecule. Can we develop a framework that leverages structural information, along with data on bioavailability, mode (or mechanism) of action, QSAR (quantitative structure-activity relationship) models, toxicokinetics, and toxicodynamics, to design chemicals with reduced hazard profiles? This so-called “green toxicology” approach makes intuitive sense when we consider the tremendous advances we are now seeing across the breadth of disciplines that contribute to the vision that is Tox21.
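To make that idea a little more concrete, here is a minimal sketch of one layer such a framework might include: a structure-based screen that flags a few structural alerts and simple physicochemical cut-offs. It assumes the open-source RDKit toolkit is available, and the SMARTS alerts and thresholds are illustrative placeholders only, not a validated green-toxicology ruleset.

```python
# Minimal sketch of a structure-based hazard screen (assumes RDKit is installed).
# The alerts and thresholds below are illustrative, not a curated ruleset.
from rdkit import Chem
from rdkit.Chem import Descriptors

# Hypothetical structural alerts (SMARTS) loosely inspired by common toxicophores.
ALERTS = {
    "aromatic nitro": "c[N+](=O)[O-]",
    "epoxide": "C1OC1",
    "acyl halide": "C(=O)[Cl,Br,I]",
}

def screen(smiles: str) -> dict:
    """Flag illustrative structural alerts and simple property cut-offs for one molecule."""
    mol = Chem.MolFromSmiles(smiles)
    if mol is None:
        raise ValueError(f"Could not parse SMILES: {smiles}")
    hits = [name for name, smarts in ALERTS.items()
            if mol.HasSubstructMatch(Chem.MolFromSmarts(smarts))]
    logp = Descriptors.MolLogP(mol)  # crude proxy for bioaccumulation potential
    return {
        "alerts": hits,
        "logP": round(logp, 2),
        "mol_wt": round(Descriptors.MolWt(mol), 1),
        "flagged": bool(hits) or logp > 5,  # arbitrary illustrative threshold
    }

if __name__ == "__main__":
    for candidate in ("c1ccccc1[N+](=O)[O-]", "CCO"):  # nitrobenzene, ethanol
        print(candidate, screen(candidate))
```

In a real framework, each alert would be tied to a mode of action and weighed alongside bioavailability, toxicokinetic, and toxicodynamic evidence rather than used as a simple yes/no flag.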

Stephen DeVito of the US EPA neatly summarized the three broad categories of chemicals that were the focus of this workshop: pharmaceuticals, pesticides, and consumer chemicals, the third group loosely comprising commercial chemicals that don’t fall into either of the first two. Pesticides and pharmaceuticals have substantial pre-market regulatory testing requirements that are both data-intensive and very expensive. Consumer chemicals, in the U.S. at least, are not subject to the same level of regulatory oversight. As such, the drivers for change may be different and may come from different sources. And that’s OK.

For example, “fail early, fail cheap” is a mantra common to many of us who work in the pharmaceutical or pesticide industries. The pre-market testing requirements for a new product are substantial and very costly, so there is a need for cost-effective, high-throughput approaches to screen candidate molecules early in the development process. If we know something about the mechanism by which a chemical is toxic, an early candidate with an unfavorable toxicity profile might get a second chance if a structural modification could mitigate that toxicity.
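As a toy illustration of that “second chance,” the snippet below (again assuming RDKit, and using arbitrary example structures rather than real development candidates) compares a hypothetical nitroaromatic lead with an analog in which the nitro group has been swapped for an amide, printing whether the alert still fires and how the calculated logP shifts.

```python
# Toy "second chance" comparison (assumes RDKit); structures are arbitrary examples.
from rdkit import Chem
from rdkit.Chem import Descriptors

NITROAROMATIC = Chem.MolFromSmarts("c[N+](=O)[O-]")

candidates = [
    ("original", "Cc1ccc(cc1)[N+](=O)[O-]"),  # nitrotoluene-like lead
    ("modified", "Cc1ccc(cc1)C(N)=O"),        # nitro group replaced with an amide
]

for label, smiles in candidates:
    mol = Chem.MolFromSmiles(smiles)
    print(label,
          "| alert fires:", mol.HasSubstructMatch(NITROAROMATIC),
          "| calc logP:", round(Descriptors.MolLogP(mol), 2))
```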

This same principle could easily apply to existing consumer chemicals. Mitigating risk through structural modification isn’t a new concept, but in this age of increasing data availability, it is one that has surely come of age.

Thomas Hartung ended his talk with one of my favorite quotes: “The difficulty lies not in the new ideas but in escaping from the old ones.” 21st-century challenges need 21st-century solutions. We need to use the best available science to make decisions, and we need solutions that are fit for purpose. To do this, we need to figure out how to maximize our existing data and learn to leverage all the tools in our toolbox.

These are exciting times. I for one can’t wait to see what’s coming next!

 

This blog discusses highlights from the SOT Annual Meeting and ToxExpo Workshop “An Ounce of Prevention Is Worth a Pound of Cure: How 21st Century Toxicology Can Transform Product Safety Assessments and Design of Lower Toxicity Products.”
