{"id":1490,"date":"2011-06-14T13:58:59","date_gmt":"2011-06-14T18:58:59","guid":{"rendered":"http:\/\/blogs.edf.org\/nanotechnology\/?p=1490"},"modified":"2026-04-02T10:48:34","modified_gmt":"2026-04-02T15:48:34","slug":"chemical-safety-evaluation-limitations-of-emerging-test-methods","status":"publish","type":"post","link":"https:\/\/blogs.edf.org\/health\/2011\/06\/14\/chemical-safety-evaluation-limitations-of-emerging-test-methods\/","title":{"rendered":"Chemical safety evaluation: Limitations of emerging test methods"},"content":{"rendered":"<p><a href=\"http:\/\/environmentaldefense.org\/page.cfm?tagID=62101\"><em>Jennifer McPartland, Ph.D.<\/em><\/a><em>, is a Health Scientist.<\/em> <em>Richard Denison, Ph.D.<\/em><em>, is a Senior Scientist.<\/em><\/p>\n<p><strong>Parts in this series:\u00a0\u00a0\u00a0\u00a0\u00a0 <\/strong><a href=\"http:\/\/blogs.edf.org\/nanotechnology\/2011\/03\/02\/epa-is-doing-the-%e2%80%9crobot%e2%80%9d-21st-century-style\/\"><strong>Part 1<\/strong><\/a><strong>\u00a0\u00a0\u00a0\u00a0 <\/strong><a href=\"http:\/\/blogs.edf.org\/nanotechnology\/2011\/03\/17\/chemical-safety-evaluation-packing-tox-tests-into-single-drops-of-liquid\/\"><strong>Part 2<\/strong><\/a><strong>\u00a0\u00a0\u00a0\u00a0 <\/strong><a href=\"http:\/\/blogs.edf.org\/nanotechnology\/2011\/05\/12\/chemical-safety-evaluation-potential-benefits-of-emerging-test-methods\/\"><strong>Part 3<\/strong><\/a><strong>\u00a0\u00a0\u00a0\u00a0 <\/strong><a href=\"http:\/\/blogs.edf.org\/nanotechnology\/2011\/06\/14\/chemical-safety-evaluation-limitations-of-emerging-test-methods\/\"><strong>Part 4<\/strong><\/a><\/p>\n<p>This is the fourth in a series of blog posts on new approaches that federal agencies are exploring to improve how chemicals are evaluated for safety.\u00a0 In this post, we\u2019ll discuss a number of current limitations and challenges that must be overcome if the new approaches are to fulfill their promise of transforming the current chemical safety 
testing paradigm.\u00a0 <!--more--><\/p>\n<ul>\n<li><em><span style=\"text-decoration: underline;\">In vivo<\/span><\/em><span style=\"text-decoration: underline;\"> versus <em>in vitro<\/em>.<\/span>\u00a0 In\u00a0our <a href=\"http:\/\/blogs.edf.org\/nanotechnology\/2011\/03\/17\/chemical-safety-evaluation-packing-tox-tests-into-single-drops-of-liquid\/\">second post<\/a> in this series, we addressed the question of whether high-throughput (HT) tests tell us anything different than we could learn from traditional tests in laboratory animals.\u00a0 We noted that traditional toxicity testing aims to determine whether a particular dose of a chemical results in an observable change in the health or normal functioning of the <em>whole animal<\/em> (<em>in vivo<\/em>), while HT tests look to see whether and by what mechanism a chemical induces changes at the <em>cellular or molecular level<\/em> (<em>in vitro<\/em>) that may be precursor events leading to an actual disease outcome.\u00a0 These types of changes can\u2019t be easily detected or measured in whole animals.\u00a0 However, the converse question must always be asked:\u00a0 Can tests conducted <em>in vitro<\/em> accurately reflect the effects that a chemical would have in the more complex and complete environment of a whole animal?<\/li>\n<\/ul>\n<p style=\"padding-left: 30px;\">Both EPA and the <a href=\"http:\/\/books.nap.edu\/openbook.php?record_id=11970&amp;page=47\">National Research Council<\/a> have acknowledged that high-throughput, <em>in vitro<\/em> methods may not capture processes affecting chemicals that may occur within the more complex biology of the whole organism or within or between tissues.\u00a0 To quote an EPA <a href=\"http:\/\/ehsehplp03.niehs.nih.gov\/article\/fetchArticle.action?articleURI=info%3Adoi%2F10.1289%2Fehp.0901392\">study<\/a>, \u201cThe most widely held criticism of this <em>in vitro<\/em>-to-<em>in vivo<\/em> prediction approach is that genes or cells are not organisms and 
that the emergent properties of tissues and organisms are key determinants of whether a particular chemical will be toxic.\u201d\u00a0 This concern frequently arises in <a href=\"http:\/\/www.ncbi.nlm.nih.gov\/pubmed\/21381051\">related conversations around chemical metabolites<\/a>.<\/p>\n<p style=\"padding-left: 30px;\">The toxicity posed by a chemical is not always derived from the chemical itself, but rather from the compounds generated from its breakdown inside an organism (called metabolites).\u00a0 A classic example is the polycyclic aromatic hydrocarbon, benzo[a]pyrene, the metabolites of which are mutagenic and carcinogenic.\u00a0 Metabolism can also work in reverse, of course, rendering a toxic chemical less or non-toxic.\u00a0 Many of the high-throughput assays utilized in ToxCast and other HT systems <a href=\"http:\/\/www.epa.gov\/ncct\/download_files\/posters\/Volarath_SOT11_2.pdf\">lack explicit<\/a> metabolizing capabilities.\u00a0 EPA is exploring ways to better incorporate whole animal capabilities <a href=\"http:\/\/www.epa.gov\/athens\/research\/process\/comptox\/\">such as metabolism<\/a>, in the context of HT tools, but until greater confidence in capturing these complexities exists, this issue will remain a factor limiting the extent to which <em>in vitro<\/em> HT test data can be considered fully predictive of <em>in vivo<\/em> effects.<\/p>\n<ul>\n<li><span style=\"text-decoration: underline;\">HT tests don\u2019t yet adequately cover the waterfront.<\/span>\u00a0\u00a0 Determining whether a chemical perturbs a biological pathway necessarily requires that the pathway is included within the battery of high-throughput assays.\u00a0 In other words, it\u2019s impossible to detect an adverse effect if it\u2019s not being tested for.\u00a0 Dr. 
Robert Kavlock, Director of the National Center for Computational Toxicology at the EPA, put this well <a href=\"http:\/\/ehp03.niehs.nih.gov\/article\/fetchArticle.action?articleURI=info%3Adoi%2F10.1289%2Fehp.trp030110#Podcast Transcript\">during an interview<\/a> with <em>Environmental Health Perspectives<\/em> on ToxCast, \u201cAnd then another lack that we have is we\u2019re looking at 467 [HT] assays right now. We may need to have 2,000 or 3,000 assays before we cover enough human biology to be comfortable that when we say something doesn\u2019t have an effect, that we\u2019ve covered all the bases correctly.\u201d<\/li>\n<\/ul>\n<p style=\"padding-left: 30px;\">Likewise, during the <a href=\"http:\/\/www.epa.gov\/risk\/nexgen\/workshops.htm\">NexGen Public Conference<\/a>, Dr. Linda Birnbaum, Director of the National Institute of Environmental Health Sciences (NIEHS), <a href=\"http:\/\/www.epa.gov\/risk\/nexgen\/docs\/Birnbaum_NexGen_Conf_Presentation_2-2011.pdf\">identified gene targets<\/a> relevant to disease pathways involved in diabetes that currently are not included in the HT battery of assays.\u00a0 These gene targets were suggested by experts during an <a href=\"https:\/\/blogs.edf.org\/nanotechnology\/2011\/01\/19\/do-these-chemicals-make-me-look-fat\/\">NIEHS workshop<\/a> on chemicals and their relationship to obesity and diabetes.\u00a0 It will be critical for ToxCast-like efforts to continuously mine and integrate the latest science into their HT assay regimes.<\/p>\n<ul>\n<li><span style=\"text-decoration: underline;\">Ability to account for diversity in the population.<\/span>\u00a0 Another challenge, not unique to newer testing methods, is the ability to account for real-world diversity among the human population that influences susceptibilities and vulnerabilities to toxic chemical exposures.\u00a0 Individual differences in our <a href=\"http:\/\/www.genome.gov\/glossary\/index.cfm?id=90\">genomes<\/a>, <a
href=\"https:\/\/commonfund.nih.gov\/epigenomics\/\">epigenomes<\/a>, life stage, gender, pre-existing health conditions and other characteristics are integral in determining the ultimate health effect of a chemical exposure.\u00a0 Traditional animal toxicity tests typically use inbred, genetically identical animal strains to extrapolate and predict a chemical\u2019s effect in humans.\u00a0 This experimental design presents shortcomings not only because animal data are being used to predict effects in humans, but also because data from a highly homogeneous population are being used to make predictions for a very diverse human population.<\/li>\n<\/ul>\n<p style=\"padding-left: 30px;\">Newer methods like HT testing will need to surmount similar constraints that arise from testing on homogeneous populations of cells or cell components.\u00a0 As we mentioned in the last post in this series, use of stem cells offers some ability to account for early life stages, and it may be possible to use multiple, genetically diverse cell lines to incorporate genetic variations.\u00a0 This challenge has not escaped the federal experts behind new testing initiatives.\u00a0 In fact, NIH Director Francis Collins confronted this issue in a <a href=\"http:\/\/rusynlab.unc.edu\/publications\/course_data\/Collins2008.pdf\">2008 publication<\/a>, commenting on federal research endeavors that involve testing thousands of compounds on different human cell lines to account for differential susceptibility to effects.<\/p>\n<ul>\n<li><span style=\"text-decoration: underline;\">Accounting for different patterns of exposure.<\/span>\u00a0 We know that we are not exposed to one chemical at a time; in fact, we are exposed to multiple chemicals at the same time.\u00a0 Along the same lines, the frequency, duration, and intensity of exposure to a chemical or mixture of chemicals vary among us.\u00a0 The ultimate impact of a toxic chemical exposure on our health may be quite different if that
exposure happens, for example, one time late in life and at a high dose than if exposure is continuous, starting at a young age, and at a low dose.\u00a0 Similar to the previous challenge, this is not a challenge specific to newer testing methodologies.\u00a0 Nevertheless, whether and, if so, how these types of issues can and will be addressed in more novel testing strategies should be articulated to stakeholders.\u00a0 Once again, this has been acknowledged by agency experts in a <a href=\"http:\/\/ehsehplp03.niehs.nih.gov\/article\/fetchArticle.action?articleURI=info%3Adoi%2F10.1289%2Fehp.0901392#Discussion\">peer-reviewed publication<\/a>:\u00a0 \u201cA related challenge is the understanding of what short-timescale (hours to days) <em>in vitro<\/em> assays can tell us about long-timescale (months to years) processes that lead to <em>in vivo<\/em> toxicity end points such as cancer.\u201d<\/li>\n<li><span style=\"text-decoration: underline;\">What level of perturbation is biologically significant?<\/span>\u00a0 At some point, an informed decision will need to be made as to what level of chemically-induced perturbation observed in an HT assay is considered sufficiently indicative or predictive of a toxic effect.\u00a0 In other words, even if an assay <em>performs<\/em> perfectly, determining how to interpret and translate HT data into a measure of actual toxicity to humans is a challenge, especially when one attempts to overlay issues like human variability.\u00a0 Not to beat a dead horse, but again this is not an issue unique to newer testing methods.\u00a0 With the newer methods, decision rules will be needed to govern extrapolation to humans, analogous to adjustment factors and other means used currently to extrapolate from animal studies, aided where possible by data from human epidemiological studies and other benchmark references.\u00a0 What\u2019s most important at this stage is that Federal efforts <a 
href=\"http:\/\/www.ncbi.nlm.nih.gov\/pubmed\/21384849\">continue<\/a> to confront this challenge head on and are transparent about the approaches used to translate HT assay data into measures of human toxicity.<\/li>\n<li><span style=\"text-decoration: underline;\">Insufficient accounting for epigenetic effects<\/span>.\u00a0 <a href=\"http:\/\/dels-old.nas.edu\/envirohealth\/newsletters\/newsletter1_epigenetics.pdf\">Epigenetics<\/a> is a burgeoning field of science that studies how gene expression and function can be altered by means other than a change in the sequence of DNA, i.e., a mutation.\u00a0 As we have noted in an <a href=\"https:\/\/blogs.edf.org\/nanotechnology\/2011\/04\/20\/could-these-chemicals-make-my-grandchild-look-fat\/\">earlier post<\/a>, epigenetic changes are critical to normal human development and function.\u00a0 For example, epigenetics is the reason why skin cells stay skin cells and don\u2019t change into kidney cells and vice versa.\u00a0 Evidence is increasing that <a href=\"https:\/\/blogs.edf.org\/nanotechnology\/2011\/01\/19\/do-these-chemicals-make-me-look-fat\/\">certain chemicals can interfere with normal epigenetic patterns<\/a>.\u00a0 For example, epigenetic changes induced by tributyltin have been shown to influence the programming of stem cells to become fat cells as opposed to bone cells.\u00a0 The current ToxCast battery of assays is limited in explicitly measuring epigenetic effects of chemicals; see <a href=\"http:\/\/ntp.niehs.nih.gov\/ntp\/ohat\/diabetesobesity\/presentations\/Tox21OverviewTice.pdf\">slide 13 here<\/a>, and <a href=\"http:\/\/0-pubchem.ncbi.nlm.nih.gov.opac.acc.msmc.edu\/assay\/assay.cgi?aid=1865\">this description of one of the few such assays<\/a> currently available.<\/li>\n<li><span style=\"text-decoration: underline;\">False Negatives\/False Positives.<\/span>\u00a0 Fundamental to the success of HT assays is their ability to correctly identify chemicals that are \u2013 and are not \u2013 of 
concern.\u00a0 Such concerns inform EPA\u2019s intense focus on validation of these methods.\u00a0 EPA\u2019s validation strategy largely involves testing chemicals with well-defined hazard characteristics in the high-throughput assays, and seeing if the HT assays appropriately identify their hazards.<\/li>\n<\/ul>\n<p style=\"padding-left: 30px;\">To the extent use of HT assays is presently discussed, it is generally within the context of screening chemicals for prioritization for further assessment.\u00a0 Within this context, a proclivity of such assays to allow \u201cfalse negatives\u201d would be of much greater concern than their yielding \u201cfalse positives.\u201d\u00a0 Why?<\/p>\n<p style=\"padding-left: 30px;\">If a truly hazardous chemical isn\u2019t \u201ccaught\u201d in HT assays, then it could be erroneously deemed low-priority and set aside indefinitely.\u00a0 While the converse could also happen \u2013 a chemical that is actually safe could be erroneously flagged as toxic and assigned a higher priority than warranted \u2013 such an error would almost certainly later be caught, as it would be subject to further scrutiny.<\/p>\n<p style=\"padding-left: 30px;\">Already, false negatives have arisen in ToxCast.\u00a0 For example, during last year\u2019s NIEHS Workshop on the \u201cRole of Environmental Chemicals in the Development of Obesity and Diabetes,\u201d experts examining organotins and phthalates noted that ToxCast high-throughput assays <a href=\"http:\/\/cerhr.niehs.nih.gov\/evals\/diabetesobesity\/presentations\/OrganotinsPhthalatesFinal_508.pdf\">did not successfully identify<\/a> chemicals known to interfere with PPAR\u2014a protein important for proper lipid and fatty acid metabolism\u2014in assays designed to flag this interference.<\/p>\n<p style=\"padding-left: 30px;\">Now, using HT tools to screen chemicals is a realistic near-term first routine use of these technologies.\u00a0 But we should proceed with caution, because
even prioritization is a decision with consequences. \u00a0As <a href=\"http:\/\/ehp03.niehs.nih.gov\/article\/fetchArticle.action?articleURI=info%3Adoi%2F10.1289%2Fehp.trp030110#top\">Dr. Kavlock put it<\/a>, \u201cYou want to have as few false negatives as possible in the system, because if you put something low in a priority queue, we may never get to it, and so you really want to have confidence that when you say something is negative, it really does have a low potential.\u201d<\/p>\n<ul>\n<li><span style=\"text-decoration: underline;\">Ultimate Challenge: Use in regulatory decision-making.<\/span>\u00a0 If federal agencies are serious about advancing the newer testing methods to a point where they can form the core of a new toxicity testing and assessment paradigm, by extension these methods will also need to be deemed a sufficient basis for regulatory decisions.\u00a0 In comparison to the other challenges discussed in this post, this one is the ultimate challenge: \u00a0to move the new methods from the research and development phase to serve as the basis for risk management and other regulatory determinations.\u00a0 A corollary implication is that data derived using the new methods must be able to meet statutory and regulatory standards governing how the safety of a chemical is to be determined.<\/li>\n<\/ul>\n<p style=\"padding-left: 30px;\">To meet this challenge, regulatory bodies will ultimately need to attain sufficient buy-in or acceptance from relevant stakeholders in the industry, NGO, and governmental sectors.\u00a0 And to achieve that buy-in, at a minimum each of the challenges we\u2019ve laid out in this post will need to be addressed.\u00a0 Now, all of this will not happen overnight, of course, and will likely take many years. 
But it is imperative that the ultimate challenge be kept in sight, guiding the development of newer testing strategies as they move forward.<\/p>\n<p><strong>Conclusion<\/strong><\/p>\n<p>New molecular and computational testing approaches should ultimately improve our ability to protect human health and the environment from toxic chemical exposures.\u00a0 To get there, however, scientific and stakeholder confidence in the capabilities of the new methods must be built.\u00a0 In addition, explicit recognition and communication of their limitations, the associated uncertainty, and appropriate and inappropriate applications are essential as these methods mature and evolve.<\/p>\n<p>Our impression from the <a href=\"http:\/\/www.epa.gov\/risk\/nexgen\/workshops.htm\">NexGen Public Conference<\/a> is that federal officials by and large embrace this perspective and forthrightly acknowledge some of the challenges we have discussed.\u00a0 EPA\u2019s earnestness in building confidence in the new approaches is evident in <a href=\"http:\/\/www.epa.gov\/ncct\/research_projects.html\">the intense research it is conducting<\/a> on assessing and testing their capabilities.\u00a0 A number of scientific publications by EPA personnel on high-throughput testing and related efforts are already available <a href=\"http:\/\/www.epa.gov\/ncct\/publications.html\">here<\/a>.<\/p>\n<p>EPA is also working on a series of <a href=\"http:\/\/epa.nexgen.icfi.com\/images\/NexGen%20Prototypes%20Workshop%20Summary%202-1-11b.pdf\">chemical-disease\/disorder prototypes<\/a>.\u00a0 The agency is using these prototypes as case studies to determine whether researchers can successfully \u201creverse-engineer\u201d well-established adverse outcomes for data-rich chemicals using the newer types of molecular, systems biology-based methods.\u00a0 The four initial draft prototypes underway are: \u00a01) lung injury caused by ozone, 2) developmental impairment caused by thyroid hormone disruptors, 3)
cancer caused by polycyclic aromatic hydrocarbons, and 4) cancer caused by benzene.<\/p>\n<p>We applaud EPA\u2019s and other agencies\u2019 efforts to improve chemical assessments and ultimately chemical risk management.\u00a0 The impetus for NexGen stems in part from the thousands of chemicals on the market that have not been adequately assessed for safety.\u00a0 We should all support developing means to assess chemical safety better, faster, at lower cost, and with fewer laboratory animals.\u00a0 This is part of the vision laid out in two seminal National Academy reports, <a href=\"http:\/\/www.epa.gov\/spc\/toxicitytesting\/\">Toxicity Testing in the 21<sup>st<\/sup> Century<\/a> and <a href=\"http:\/\/cfpub.epa.gov\/ncea\/cfm\/recordisplay.cfm?deid=202175\">Science and Decisions<\/a>.<\/p>\n<p>But at the same time, we can\u2019t get too far ahead of ourselves.\u00a0 New approaches must be validated as effective for their intended uses and contexts before being put to those uses.\u00a0 And every effort needs to be directed at clearly communicating uncertainty and limitations even as their use expands.\u00a0 Doing so will not only build stakeholder confidence in the project, but will also afford opportunities to work on solutions to address those uncertainties and challenges.<\/p>\n<p>For EDF\u2019s part, we believe the agency\u2019s recent flurry of activity\u2014ranging from <a href=\"http:\/\/www.epa.gov\/ncct\/\">CompTox<\/a> to <a href=\"http:\/\/www.epa.gov\/risk\/nexgen\/\">NexGen<\/a> to <a href=\"http:\/\/www.epa.gov\/ncct\/Tox21\/\">Tox21<\/a> to the recent rollout of the <a href=\"http:\/\/css.ideascale.com\/\">Chemical Safety for Sustainability Program<\/a>\u2014is important to follow, encourage and help shape.<\/p>\n","protected":false},"excerpt":{"rendered":"<p>Jennifer McPartland, Ph.D., is a Health Scientist. Richard Denison, Ph.D., is a Senior Scientist.
Parts in this series:\u00a0\u00a0\u00a0\u00a0\u00a0 Part 1\u00a0\u00a0\u00a0\u00a0 Part 2\u00a0\u00a0\u00a0\u00a0 Part 3\u00a0\u00a0\u00a0\u00a0 Part 4 This is the fourth in a series of blog posts on new approaches that federal agencies are exploring to improve how chemicals are evaluated for safety.\u00a0 In this post, &#8230;<\/p>\n","protected":false},"author":5105,"featured_media":0,"comment_status":"open","ping_status":"closed","sticky":false,"template":"","format":"standard","meta":{"_acf_changed":false,"footnotes":""},"categories":[56094,5009],"tags":[39167,39174,39169,39170,39199,39163,39184,91637,39186],"coauthors":[114134],"class_list":["post-1490","post","type-post","status-publish","format-standard","hentry","category-new-testing-methods","category-health-science","tag-computational-toxicology","tag-epigenetics","tag-in-vitro","tag-in-vivo","tag-national-institutes-of-environmental-health-sciences-niehs","tag-phthalates","tag-polycyclic-aromatic-hydrocarbons-pah","tag-toxcast","tag-tributyltin"],"acf":[],"aioseo_notices":[],"_links":{"self":[{"href":"https:\/\/blogs.edf.org\/health\/wp-json\/wp\/v2\/posts\/1490","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/blogs.edf.org\/health\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/blogs.edf.org\/health\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/blogs.edf.org\/health\/wp-json\/wp\/v2\/users\/5105"}],"replies":[{"embeddable":true,"href":"https:\/\/blogs.edf.org\/health\/wp-json\/wp\/v2\/comments?post=1490"}],"version-history":[{"count":3,"href":"https:\/\/blogs.edf.org\/health\/wp-json\/wp\/v2\/posts\/1490\/revisions"}],"predecessor-version":[{"id":13669,"href":"https:\/\/blogs.edf.org\/health\/wp-json\/wp\/v2\/posts\/1490\/revisions\/13669"}],"wp:attachment":[{"href":"https:\/\/blogs.edf.org\/health\/wp-json\/wp\/v2\/media?parent=1490"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/blogs.edf.org\/health\/wp-json\/wp\/v2\/
categories?post=1490"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/blogs.edf.org\/health\/wp-json\/wp\/v2\/tags?post=1490"},{"taxonomy":"author","embeddable":true,"href":"https:\/\/blogs.edf.org\/health\/wp-json\/wp\/v2\/coauthors?post=1490"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}