Efforts Underway to Mine Big Data for Better Safety Assessments

By David Faulkner posted 03-22-2018 03:32 PM


[Image: 2018 SOT Annual Meeting in San Antonio banner]

Safety assessments are important. The process of evaluating human, animal, and environmental toxicity data is crucial for regulators, researchers, and companies that sell chemicals or biologics. The speakers for the 2018 Annual Meeting and ToxExpo Symposium Session “Application of Data from New Approaches in Regulatory and Product Safety Decisions” included employees of the Canadian government, US Environmental Protection Agency (US EPA) luminaries, pharmaceutical safety assessors, and Superfund researchers; they presented new tools, new web services to simplify assessment, and new paradigms for the use of animal testing data.

As the various online repositories of toxicological information accrue and high-throughput screening technology creates billions of data points as a matter of course, the problem of data analytics and interpretation looms large for toxicologists used to traditional toxicity studies. Although statistics and data analysis have long been key to the success of any scientific researcher, the emphasis on manipulating and extracting meaning from massive data sets presents a novel challenge for investigators who lack a background in data science.

I was pleased to note that many of the talks in this session were by employees of government agencies, indicating that public research and regulatory bodies are aware of these data challenges and taking them seriously. Alicia Frame, a Superfund researcher and US EPA veteran, introduced attendees to the US EPA Chemistry Dashboard, a powerful tool for synthesizing the myriad toxicological data streams available for more than 760,000 chemicals. Another US EPA researcher, the venerable Grace Patlewicz, also hinted at a forthcoming computational tool to assist researchers with read-across techniques and proposed a novel approach called generalized read-across, which consolidates the various and similar read-across methodologies into a single, simple scheme. Katie Paul Friedman, US EPA, presented a nuanced and thorough account of a new method for calculating point of departure (POD) values using in vivo guideline study data from the US EPA, Health Canada, European Chemicals Agency (ECHA), and European Food Safety Authority (EFSA), yielding more conservative POD estimates than conventional screening-level assessments.
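To give a flavor of what read-across means computationally: at its simplest, it estimates a toxicity value for a data-poor chemical from the measured values of its structurally similar analogues. The sketch below is a generic illustration only, not the US EPA's or Dr. Patlewicz's actual method; the fragment sets, similarity metric, and endpoint values are all hypothetical.

```python
# Minimal sketch of similarity-based read-across: predict a target
# chemical's toxicity endpoint as a similarity-weighted average of its
# k most similar analogues. All chemistry shown here is illustrative.

def jaccard_similarity(a, b):
    """Jaccard similarity between two sets of structural fragments."""
    return len(a & b) / len(a | b)

def read_across(target_fragments, analogues, k=2):
    """Similarity-weighted average over the k most similar analogues.

    analogues: list of (fragment_set, measured_endpoint_value) pairs.
    """
    scored = sorted(
        ((jaccard_similarity(target_fragments, frags), tox)
         for frags, tox in analogues),
        reverse=True,
    )[:k]
    total = sum(sim for sim, _ in scored)
    return sum(sim * tox for sim, tox in scored) / total

# Hypothetical analogue set: (structural fragments, measured endpoint)
analogues = [
    ({"benzene", "hydroxyl"}, 120.0),
    ({"benzene", "amine"}, 80.0),
    ({"alkane"}, 500.0),
]
estimate = read_across({"benzene", "hydroxyl", "amine"}, analogues)
```

Real read-across workflows add far more nuance (expert judgment, uncertainty characterization, and justification of analogue selection), which is precisely the kind of variability that a generalized, harmonized scheme aims to tame.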

Of course, the United States isn’t the only regulatory game in town, and Tara Barton-MacLaren, Health Canada, presented the Canadian government’s work under the Canadian Environmental Protection Act (CEPA) to narrow the universe of chemicals down to a more manageable (and regulate-able) scope. Using quantitative structure-activity relationships, adverse outcome pathways, and the thresholds of toxicological concern paradigm, Health Canada has winnowed its CEPA priority list to a modest 4,300 chemicals to be evaluated by 2020.

Reza Rasoulpour, Dow AgroSciences, the sole presenter from industry, delivered an engaging and thought-provoking talk about Dow’s efforts to streamline the chemical development process (also called “discovery”) to reduce the number of animals used and the effort wasted in the pursuit of new pesticides. Dr. Rasoulpour made a case for the use of liver toxicogenomic assays as a necessary—and possibly sufficient—testing tool in the early phases of the mammalian toxicity testing pipeline. How will regulators respond to this new testing strategy? How predictive is it in the long term? It’s tough to say, but it’s encouraging that the questions are at least being asked. Narrowing the testing process to a single assay would certainly reduce costs and data-processing load, but one wonders about the comprehensiveness of that assay; it’s a developing story worth keeping tabs on.

There’s a world of data out there, and it grows ever larger with each new ’omics technology, but in the sprint to develop ever greater data-generating techniques, it’s possible that many researchers and risk assessors have bitten off more than they can chew, necessitating the development of new and more effective data-processing tools. As new techniques and technologies emerge, no doubt new ways of processing their data will, too, and for that reason, this session will likely be evergreen, flowering each year in the spring with new models, tools, and ideas.

This blog was prepared by an SOT Reporter. SOT Reporters are SOT members who volunteer to write about sessions and events they attend during the SOT Annual Meeting and ToxExpo. If you are interested in participating in the SOT Reporter program in the future, please email SOT Communications Director Michelle Werts.
