Science Inventory

Progress in data interoperability to support computational toxicology and chemical safety evaluation

Citation:

Watford, S., S. Edwards, M. Angrish, R. Judson, and K. Friedman. Progress in data interoperability to support computational toxicology and chemical safety evaluation. TOXICOLOGY AND APPLIED PHARMACOLOGY. Academic Press Incorporated, Orlando, FL, 380:114707 (2019). https://doi.org/10.1016/j.taap.2019.114707

Impact/Purpose:

Toxicology is in a period of rapid change and growth to meet the challenge of safety assessment for tens of thousands of chemicals that have potential human exposures yet lack sufficient data for hazard identification. More than a decade after the publication of the seminal National Research Council report, Toxicity Testing in the 21st Century: A Vision and a Strategy, which called for advancements in the field of toxicology using new approach methodologies (NAMs) for hazard identification, substantial progress has been made. One issue with the development of these applications and data models is that information is siloed, which prevents easy integration and exchange of data (i.e., interoperability) and creates problems such as inconsistent versioning, lack of provenance, and unnecessary duplication. The ultimate consequence of this lack of data interoperability is that progress in understanding the biological and toxicological effects of chemical exposures is hampered despite an abundance of information. Indeed, data interoperability is an issue across all kinds of data that inform chemical safety evaluation. For toxicology, the lack of consensus on how the vast amount of concentration-response data collected from myriad heterogeneous in vitro platforms can be applied to regulatory applications has underscored the need for data management strategies that maximize interoperability. For instance, “big data” are being generated via whole-genome sequencing, high-content imaging, and high-throughput bioactivity screening, yet how these data are formatted, processed, analyzed, stored, and accessed differs between data types and data generators. These dissimilarities create an additional obstacle to integrating data to answer applied questions. Building consensus on reporting standards, both for assay design principles and for observed effects, would contribute to progress in the use of these data for regulatory applications. Data interoperability is a salient and critical need that must be addressed if computational toxicology is to succeed in supporting modern chemical safety evaluation and research in public health and toxicology.

Description:

New approach methodologies (NAMs) in chemical safety evaluation are being explored to address the current public health implications of human environmental exposures to chemicals with limited to no data for assessment. More than a decade after the push toward “Toxicity Testing in the 21st Century,” the field has focused on massive data generation efforts to inform computational approaches for preliminary hazard identification, adverse outcome pathways that link molecular initiating events and key events to apical outcomes, and high-throughput approaches that use risk-based ratios of bioactivity and exposure to inform relative prioritization and safety assessment. Projects like the interagency Tox21 and US EPA’s ToxCast program have generated dose-response information on thousands of chemicals, identified and aggregated information from legacy systems, and created tools for access and analysis. The resulting information has been used to develop computational models as viable options for regulatory applications. This progress has introduced challenges in data management that are new, but not unique, to toxicology. Several key questions require critical thinking and solutions to promote semantic interoperability, including: (1) identification of bioactivity information from NAMs that might be related to a biological process; (2) identification of legacy hazard information that might be related to a key event or apical outcome of interest; and (3) integration of these NAM and traditional data for computational modeling and prediction of complex apical outcomes such as carcinogenesis. This review serves as an update on efforts within toxicology to address issues specifically related to bioactivity and toxicological data interoperability that, in line with the goals established by the Findable, Accessible, Interoperable, and Reusable (FAIR) data principles, will help promote the use of both NAM and legacy information in data-driven toxicology applications. The views expressed in this article are those of the authors and do not necessarily represent the views or policies of the US EPA.

Record Details:

Record Type: DOCUMENT (JOURNAL / PEER REVIEWED JOURNAL)
Product Published Date: 10/01/2019
Record Last Revised: 11/17/2020
OMB Category: Other
Record ID: 350163