2024 Annual Meeting Report: Exploring the Practicality of Implementing Next-Generation Risk Assessment

By Mary Iheanacho posted 05-16-2024 11:30 AM

Next-generation risk assessment (NGRA) is a prospective approach to assessing the possible hazards associated with chemicals and product components. It uses new approach methodologies (NAMs) to characterize metabolic behavior, exposure, and bioactivity, with the goal of understanding a substance well enough to support safety decisions. NGRA is built on an integrated strategy that combines chemical-specific data with computational and in vitro approaches. It is acknowledged, however, that more NGRA education is needed, especially when it comes to choosing information sources and critically analyzing data to support decisions. Regulatory agencies are examining many case studies to confirm the effectiveness of NGRA.

To illustrate the practical application of NGRA principles, the 2024 SOT Continuing Education course “Putting Theory into Practice: Using Computational New Approach Methodologies in Next-Generation Risk Assessment,” chaired by Kristie Sullivan from the Institute for In Vitro Sciences and Gavin Maxwell from Unilever, offered a deep dive into computational approaches integral to NGRA.

We began by laying the groundwork, introducing NGRA concepts, and highlighting its regulatory progress. We then delved into human exposure modeling techniques crucial for estimating exposure across various settings. Subsequently, we explored computational methods to derive quantitative effect levels from in vitro data, drawing from real-world examples and insights from Health Canada. Finally, we discussed innovative endeavors aimed at bolstering confidence in NGRA methodologies, with a focus on computational tools.

John Wambaugh presented on the use of computational NAMs in human exposure modeling. A chemical's potential for harm depends on both its intrinsic toxicity and the likelihood and degree of exposure, which can be conceptualized in terms of consumer, occupational, and ambient sources. NAMs have enabled exposure assessment and estimation for thousands of chemicals. Complementary to toxicity NAMs, exposure NAMs encompass tools to detect and describe chemical exposure, supporting the estimation of daily intake rates in mg/kg/day (exposure) and the conversion of toxicity NAM points of departure (PODs) to daily equivalent doses (hazard). Exposure NAMs employ a range of methodologies, including toxicokinetics, machine learning models, chemical descriptors, and measurements. High-throughput exposure (HTE) models can handle numerous chemicals with minimal descriptive information, and several exposure NAM tools are publicly available, including SHEDS-HT, RAIDAR, and USEtox. Ensemble or consensus models aggregate predictions from multiple predictive models, providing both the most likely outcome and the range of possibilities (uncertainty). Examples of consensus models include CERAPP, COMPARA, and CATMoS, all accessible from OPERA. Similarly, various HTE models cover different aspects of exposure and can be integrated into consensus exposure predictions, as in the Systematic Empirical Evaluation of Models (SEEM). SEEM consensus predictions are accessible from the CompTox Chemicals Dashboard and NICEATM ICE. Governmental entities using exposure NAMs include the US EPA, the European Commission Joint Research Centre, the European Food Safety Authority, the US National Toxicology Program, and Health Canada.
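To make the consensus idea concrete, here is a minimal sketch in Python that combines log-scale exposure predictions from several models into a single weighted estimate with a crude uncertainty range. The model names, predictions, and weights are all hypothetical, and this simple weighted average only gestures at the SEEM approach, which calibrates its consensus against monitoring data.

```python
import numpy as np

# Hypothetical log10 exposure predictions (mg/kg/day) for one chemical
# from three high-throughput exposure models; values are illustrative.
predictions_log10 = {"model_a": -4.2, "model_b": -3.6, "model_c": -4.9}

# Illustrative weights, e.g., reflecting each model's past performance
# against monitoring data; they must sum to 1.
weights = {"model_a": 0.5, "model_b": 0.3, "model_c": 0.2}

preds = np.array([predictions_log10[m] for m in weights])
w = np.array([weights[m] for m in weights])

# Weighted consensus estimate on the log10 scale.
consensus_log10 = float(np.sum(w * preds))

# Across-model spread as a crude indication of uncertainty.
spread = float(preds.max() - preds.min())

print(f"Consensus exposure: {10**consensus_log10:.2e} mg/kg/day")
print(f"Models disagree over {spread:.1f} orders of magnitude")
```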

Bioactivity-based PODs can be combined with exposure estimates to create a risk-based assessment metric. Tara Barton-Maclaren highlighted the use of NAMs in bioactivity characterization, the importance of in vitro-to-in vivo extrapolation (IVIVE) for deriving human-relevant bioactivity-based PODs, and computational methods for turning in vitro data into quantitative bioactivity-based effect levels. These include pathway-specific bioactivity thresholds and nonspecific (general) bioactivity thresholds. In contrast to the traditional risk assessment approach, which relies on animal studies to derive no-observed-adverse-effect levels (NOAELs) and lowest-observed-adverse-effect levels (LOAELs), the bioactivity-based approach retrieves AC50 values (the concentration producing half-maximal activity) from ToxCast in vitro assays. It then adopts the 5th percentile AC50 as a nonspecific bioactivity threshold and conducts IVIVE using toxicokinetic modeling to derive an administered equivalent dose (AED) representing a "sensitive" population. The AED serves as the point of departure for risk characterization. Gene expression changes are examined using transcriptomics, and both pathway-specific and nonspecific bioactivity threshold techniques can be applied to concentration-response data to determine a transcriptomic point of departure.
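As a rough illustration of this workflow, the sketch below computes a nonspecific bioactivity threshold from a set of AC50 values and converts it to an AED by reverse dosimetry. Every number is a placeholder: real AC50 values would come from ToxCast, and the steady-state plasma concentration per unit dose would come from a toxicokinetic model, with an upper percentile standing in for a toxicokinetically "sensitive" individual.

```python
import numpy as np

# Illustrative AC50 values (uM) for one chemical across in vitro assays;
# real values would be retrieved from ToxCast.
ac50_uM = np.array([0.8, 1.5, 3.2, 4.0, 7.5, 12.0, 18.0, 25.0, 40.0, 60.0])

# Nonspecific bioactivity threshold: 5th percentile of the AC50 distribution.
threshold_uM = np.percentile(ac50_uM, 5)

# Reverse dosimetry (IVIVE): a toxicokinetic model would predict the
# steady-state plasma concentration (Css) produced by a 1 mg/kg/day dose.
# An upper-percentile Css represents a toxicokinetically "sensitive"
# individual. The value below is a placeholder.
css_uM_per_mg_kg_day = 2.5  # hypothetical upper-percentile Css

# Administered equivalent dose that would produce the threshold
# concentration in plasma; this serves as the POD.
aed_mg_kg_day = threshold_uM / css_uM_per_mg_kg_day

print(f"5th percentile AC50: {threshold_uM:.2f} uM")
print(f"AED (POD): {aed_mg_kg_day:.3f} mg/kg/day")
```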

Alistair Middleton discussed the use of computational NAMs in NGRA decision-making, covering how various computational approaches can be combined within a tiered decision framework for conducting safety assessments, how outputs from different models can be combined to estimate a bioactivity:exposure ratio (BER), and how the BER can support decision-making in risk assessment. Physiologically based kinetic (PBK) models estimate internal concentration-time profiles of substances, with parameterization influencing their accuracy. Dose-response models estimate effect levels from concentration-response data by fitting curves of different shapes and selecting the best-fitting model. Bayesian statistical models quantify uncertainty in key quantities (e.g., Cmax, PODs) by integrating prior knowledge to infer the plausibility of parameter values given the available data and hypotheses. By combining estimates of potency (PODs) and internal exposure (Cmax), a BER can be calculated to support safety decisions. These concepts and computational approaches apply across a spectrum of toxicities and were illustrated using exposure to caffeine under various scenarios as a case example.
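As a minimal sketch of the BER calculation, assume we already have uncertainty distributions for the internal POD and for Cmax; the lognormal samples below are illustrative stand-ins for posterior output from dose-response and PBK models.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 10_000

# Hypothetical uncertainty distributions (lognormal, parameters invented):
# the internal POD (uM) from dose-response modeling of in vitro data, and
# the predicted Cmax (uM) for the exposure scenario from a PBK model.
pod_uM = rng.lognormal(mean=np.log(5.0), sigma=0.5, size=n)
cmax_uM = rng.lognormal(mean=np.log(0.05), sigma=0.8, size=n)

# Bioactivity:exposure ratio; large values mean a wide margin between
# bioactive concentrations and expected internal exposure.
ber = pod_uM / cmax_uM

lo, med, hi = np.percentile(ber, [5, 50, 95])
print(f"BER median: {med:.0f} (90% interval {lo:.0f}-{hi:.0f})")

# A tiered decision rule might accept the scenario when the lower bound
# comfortably exceeds a predefined margin, and escalate otherwise.
```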

Nicole Kleinstreuer touched on building confidence in NAMs to support NGRA. She emphasized the importance of addressing diverse regulatory requirements, decision-making frameworks, and validation considerations. Dr. Kleinstreuer highlighted the significance of standards, guidance documents, and other resources in supporting the validation of computational and other NAMs. The plan to build trust in new methods includes using effective and adaptable strategies, encouraging federal agencies and regulated industries to adopt novel approaches, and helping end users steer the development of new techniques. Independent review is another essential step in fostering confidence. Models should have clearly defined purposes, demonstrate consistency with human data or across multiple methods, and undergo thorough technical characterization. Additionally, models should capture essential aspects of human biology or toxicity mechanisms. International initiatives are underway to accelerate the establishment of scientific confidence in NAMs and computational approaches supporting NGRA. These efforts involve collaboration among regulatory agencies, public-private partnerships, and organizations such as ICCVAM, OECD, and ICATM.

As confidence in NAMs grows, there is a crucial need for education to foster their acceptance and implementation. Ms. Sullivan spoke on educational needs and resources to bridge the gap. Education removes obstacles to acceptance, provides clarity for toxicologists and risk assessors, ensures the integrity of the science, accommodates diverse educational backgrounds, and raises the likelihood of method adoption and recognition. Ms. Sullivan highlighted NGRA training resources, including webinars from the OECD, the American Society for Cellular and Computational Toxicology (ASCCT), and the Physicians Committee for Responsible Medicine, as well as a series organized by the US Environmental Protection Agency, PETA Science Consortium International, the Institute for In Vitro Sciences, and the California Department of Pesticide Regulation (known collectively as EPIC).

This blog reports on the Continuing Education course titled “Putting Theory into Practice: Using Computational New Approach Methodologies in Next-Generation Risk Assessment” that was held during the 2024 SOT Annual Meeting and ToxExpo. All 2024 Continuing Education courses were recorded and are available for virtual viewing through the SOT CEd-Tox online library. SOT Postdocs and Students and individuals from select countries receive free access to all CE courses.

This blog was prepared by an SOT Reporter and represents the views of the author. SOT Reporters are SOT members who volunteer to write about sessions and events in which they participate during the SOT Annual Meeting and ToxExpo. SOT does not propose or endorse any position by posting this article. If you are interested in participating in the SOT Reporter program in the future, please email SOT Headquarters.


#2024AnnualMeeting
#Communique:AnnualMeeting
#SOTReporter
