Variety is the spice of … accurate chemical testing

Rachel Shaffer is a research assistant.  Jennifer McPartland, Ph.D., is a Health Scientist.

There has been a lot of buzz in recent years about the federal government’s new chemical testing initiatives, ToxCast and Tox21 (see, for example, these articles in Scientific American and the New York Times). These programs are developing high-throughput (HT) in vitro testing to evaluate—and ultimately predict—the biological effects of chemicals. In contrast to the relatively slow pace of traditional animal testing, ToxCast and Tox21 use sophisticated robots to rapidly test thousands of chemicals at a time. As a result, they hold the potential to more efficiently fill enormous gaps in available health data, predict adverse effects, and shed light on exactly how chemicals interact and interfere with our biology. (For more on these potential benefits, see Section 5 of EDF’s Chemical Testing Primer).

Yet, among the key challenges that these new methods must address is one that traditional, animal-based methods have faced for decades: how can laboratory testing adequately account for the high degree of variability in the human population? The latest research suggests the exciting possibility that genetic diversity, at least, can be incorporated into emerging HT in vitro approaches.

In the real world, individual susceptibility to chemicals is mediated by a variety of factors, such as our genes, the expression of our genes (referred to as our epigenome), gender, age, pre-existing health conditions, and more. However, neither homogeneous, inbred laboratory animals nor the genetically identical cell lines that are typically used in the ToxCast and Tox21 programs sufficiently capture these critical differences.

Many scientists and risk assessors have recognized this challenge, and an article published last year in Toxicological Sciences (Lock et al., 2012) presents one attempt to address this issue by testing chemicals on a large number of genetically distinct cell lines.  The researchers found that genetically diverse cell lines can, in fact, respond differently to the same chemical exposure.  Their work also illustrates the feasibility of incorporating this type of diversity into HT in vitro chemical testing.

In this study, 81 different human lymphoblast cell lines (immature cells, originating in bone marrow, that develop into mature white blood cells) were exposed to 240 chemical compounds and then evaluated for two adverse effects: cytotoxicity (toxicity to cells) and apoptosis (self-induced cell death; in other words, programmed cell “suicide”). Some of the chemicals produced the same patterns of cytotoxicity and apoptosis in all of the cell lines, while other chemicals showed no effects in any of the cell lines. Most interesting, however, was that some chemicals caused different levels of cytotoxicity and apoptosis across the set of genetically distinct cell lines—a testament to the strong influence of genetics on toxicity. In some cases, a single chemical caused severe apoptosis in certain cell lines, low levels of apoptosis in other cell lines, and no apoptosis in yet another set of cell lines.
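The three response patterns just described can be sketched in a few lines of code. This is purely illustrative—it is not the analysis pipeline from Lock et al., and the function name, thresholds, and measurements below are all hypothetical:

```python
# Illustrative sketch (not the Lock et al. pipeline): classifying how a
# chemical's cytotoxic effect varies across genetically distinct cell lines.
# Thresholds and all measurements below are hypothetical.

def classify_response(viability_losses, effect_threshold=0.2, spread_threshold=0.3):
    """Classify a chemical's cytotoxicity profile across cell lines.

    viability_losses: fraction of cell viability lost in each line
    (0.0 = no effect, 1.0 = complete cell death).
    """
    if all(v < effect_threshold for v in viability_losses):
        return "no effect in any line"
    if max(viability_losses) - min(viability_losses) > spread_threshold:
        # Large spread across lines suggests a genetics-dependent response.
        return "variable across lines"
    return "similar effect in all lines"

# Hypothetical viability-loss measurements for three chemicals,
# each tested across five genetically distinct lymphoblast lines:
print(classify_response([0.05, 0.10, 0.02, 0.08, 0.04]))  # no effect in any line
print(classify_response([0.70, 0.75, 0.68, 0.72, 0.71]))  # similar effect in all lines
print(classify_response([0.90, 0.15, 0.60, 0.05, 0.85]))  # variable across lines
```

The third pattern—the same chemical producing severe, mild, and no effects depending on the cell line—is the one that a single-cell-line testing program would miss entirely.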

Because the ToxCast and Tox21 programs do not currently incorporate genetically diverse cells into routine HT testing, differences such as those described above may go undetected. This is especially problematic if results obtained from a single cell line are assumed to represent the entire population or if the lack of diversity is not expressly noted as a serious limitation in any communication or use of the data.

Furthermore, if these methods are used to prioritize a chemical or to inform a risk assessment, then individuals who may be more susceptible to an exposure may be inadequately protected. Uncertainty factors are commonly used in the practice of risk assessment to account for the limitations of traditional animal testing, including the lack of genetic diversity among highly inbred laboratory animals. It follows that if HT testing approaches utilize only a few cell lines, analogous uncertainty factors may be needed.
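To make the uncertainty-factor arithmetic concrete, here is a minimal sketch of the standard derivation of a reference dose (RfD) from an animal-study NOAEL (no-observed-adverse-effect level). The default factors of 10 are the conventional values; the NOAEL itself is hypothetical:

```python
# A minimal sketch of how uncertainty factors (UFs) are applied in
# traditional risk assessment. The NOAEL value used below is hypothetical.

def reference_dose(noael_mg_per_kg_day, interspecies_uf=10, intraspecies_uf=10):
    """Derive a reference dose (RfD) by dividing a NOAEL by uncertainty factors.

    interspecies_uf: accounts for extrapolating from animals to humans.
    intraspecies_uf: accounts for variability among humans, including
    genetic differences -- the same gap an analogous factor might cover
    if HT testing relied on only one or a few cell lines.
    """
    return noael_mg_per_kg_day / (interspecies_uf * intraspecies_uf)

# Hypothetical: a NOAEL of 50 mg/kg-day yields an RfD of 0.5 mg/kg-day.
print(reference_dose(50))  # 0.5
```

Each factor of 10 shrinks the allowable dose, so a testing approach that captured human variability directly could, in principle, replace part of this blunt division with real data.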

But we may not have to resort to the use of such uncertainty factors. This new study demonstrates not only the importance of considering individual genetic differences in in vitro testing but also the feasibility of doing so. The results obtained were generally quite reproducible and mirrored results from previous HT toxicity testing of the same chemicals by the National Toxicology Program. This consistency, as well as the availability of many genetically distinct human cell lines made possible through recently established international cell repository programs, may mean that genetic diversity could be integrated into ToxCast and Tox21 in the near future.

Yet, as always, questions and challenges remain. While we clearly need to incorporate genetic diversity into toxicity testing, how do we decide how much diversity is “enough”? Each one of us is distinct, but we can’t possibly all be represented in each chemical test.  So, then, what constitutes a representative sample of the genetic diversity of the human population?

Another question relates to what types of cells should be used in HT testing. Some chemicals exert their effects on specific “target organs,” while others exhibit more general “systemic” toxicity. In most cases, to fully understand the potential hazards of a chemical, it will be important to test it on a variety of cell types representing different organs, each with its own set of potential adverse effects. The study we describe in this post used lymphoblast cells; how might the results have differed if liver or mammary gland cells had been used instead? How should we decide how many and which cell types to include in testing?

Finally, as noted above, genetic diversity is but one of many factors that contribute to varying susceptibilities in the population. How can we also begin to address the individual differences in our epigenome, gender, age, and pre-existing health conditions?  Should we apply uncertainty factors to HT data in the way we have done for data derived from traditional animal testing?  Or, can we attempt to incorporate these additional dimensions of diversity directly into HT testing?

Clearly, we have a long way to go to reach an optimal system for comprehensive chemical testing. ToxCast, Tox21, and other EPA Computational Toxicology (CompTox) programs are important steps towards building this system. But as with all testing approaches, there are important challenges and limitations. Scientists are continuing to explore ways in which new testing technology can better incorporate genetic diversity (see, e.g., Zeise et al., 2013), but it is also essential that all stakeholders – including those from the public interest community that represent the interests of subpopulations who may be more susceptible to chemical exposures – engage in the development and application of EPA’s new testing approaches.

For additional information on EPA’s chemical testing programs, please visit these EDF webpages:

- Chemical Testing in the 21st Century: A Primer

- Chemical Testing in the 21st Century: Webinar Series

- Chemical Testing in the 21st Century: Additional Resources

 


2 Comments

  1. Andrew Rowan
    Posted January 10, 2013 at 3:59 pm

    The above posting illustrates the advantage that new high-throughput approaches offer to chemical risk assessment. These new high-throughput techniques allow one to generate data at least a million times faster (according to my crude, "back-of-the-envelope" calculations) than the traditional animal testing approaches. As a result, the new approaches open up tremendous opportunities for new bioinformatic analyses of this flood of new data, leading to a rapid expansion of both our understanding of toxicity pathways and the potential risks to humans and the environment. We now need a more coordinated approach to take advantage of these new approaches (and understanding?) and identify how we should spend, in the most effective way, the approximately $200+ million a year that is currently being devoted globally to developing and implementing new testing strategies and risk assessment systems.

  2. Rachel Shaffer
    Posted January 11, 2013 at 3:58 pm

    Thank you very much for your comment, Andrew. New testing methods certainly offer the opportunity to advance our understanding of how chemicals can affect our health, while also potentially reducing costs and the need for laboratory animals. However, they present challenges as well that will take time and effort to overcome. We couldn't agree more that collaborative efforts are needed to facilitate effective and appropriate development and use of these new methods.
