The topic of uncertainty may be one of the most controversial yet relevant conversations in toxicology today. The 2018 Continuing Education course “Uncertainty Characterization in 21st-Century Toxicology: Current Practice and Practical Methods Supporting Regulatory Risk Assessment” addressed not only the well-recognized relevance of uncertainty in chemical risk assessment, but also barriers to its acceptance within the context of the course title.
Toxicologists have been assessing uncertainty for decades, but with the advent of new approach methodologies (NAMs), how to describe and assess the uncertainties of these methods has become critical. NAMs, alternative methodologies that do not require animal testing, include in vitro assays and in chemico, in silico, and quantitative approaches. Progressive bodies consider these methodologies the future of 21st-century toxicology.
Buzz around the term “21st-century toxicology” picked up after the National Research Council’s 2007 report Toxicity Testing in the 21st Century: A Vision and a Strategy and has not waned since. In fact, as discussed in the course, the push to conduct toxicology testing for chemical and pharmaceutical safety assessment with methodologies that are faster and more relevant to humans is very much an international conversation.
Here are three reasons all toxicologists need to be concerned about uncertainty in NAMs:
- You cannot get rid of it.
Uncertainty is an inherent part of all research. As outlined in the course, sources of uncertainty include 1) using data from one species or model to assess effects in a different species or model; 2) using data generated in a limited timeframe; and 3) using data for one population to infer effects in another.
- We are already comfortable accepting it.
What I appreciated most about this course was that some presentations addressed a barrier to acceptance that had nothing to do with science, but rather with culture. Scientists have become comfortable with the level of uncertainty we currently accept. Developing and adopting NAMs would mean not only quantifying uncertainty but also assessing it afresh. Inevitably, this will initially make regulatory acceptance of NAMs uncomfortable, but it does not make the transition impractical.
- Understanding is power.
One key take-home point throughout the session was the importance of defining uncertainty for transparent communication in risk assessment. Characterization of uncertainty in high-throughput toxicokinetics data, for example, will differ from the defined uncertainties of in vivo models. However, uncertainty assessments should not be levied against NAMs disproportionately to the uncertainty already accepted for in vivo models. Coming full circle back to point number 1, we cannot get rid of uncertainty, nor should we try. Understanding uncertainty as an important part of safety assessment will therefore help the scientific community embrace uncertainty characterization for 21st-century NAMs.
This CE course was elegantly done and well attended, with approximately 150 attendees (by crude eyeball count, so factor in the level of uncertainty). I strongly recommend a deeper dive into the topic when the recording becomes available as part of CEd-Tox, the SOT online Continuing Education program, in May 2018.
This blog was prepared by an SOT Reporter. SOT Reporters are SOT members who volunteer to write about sessions and events they attend during the SOT Annual Meeting and ToxExpo. If you are interested in participating in the SOT Reporter program in the future, please email SOT Communications Director Michelle Werts.