Chemical safety evaluation: EPA is doing the “Robot” 21st century style

Jennifer McPartland, Ph.D., is a Health Scientist.

Parts in this series: Part 1 | Part 2 | Part 3 | Part 4

Remember that once-new dance move from the 20th century?  Now don’t get too excited: EPA is not adding a dance category to its new sustainability research program.

No, the ‘Robot’ in my title refers to some of the impressive machines involved in EPA’s efforts to develop and apply new automated approaches to chemical toxicity testing.  These approaches integrate modern insights being gleaned from the biological sciences with advances in computation.  A new term has even been coined for all this:  Computational toxicology.

Though perhaps less of a draw than a dance-off featuring EPA staff, EPA’s exploration of new ways to better assess and address the safety of the tens of thousands of chemicals in use today is pretty exciting.   

A few weeks ago, I participated in a Public Dialogue Conference hosted by EPA on Advancing the Next Generation of Risk Assessment (NexGen).  NexGen is an EPA initiative launched in 2010 that now falls under EPA’s recently announced, restructured research program, the Chemical Safety for Sustainability Research Program (CSSRP).

NexGen is a collaborative effort across multiple federal agencies and California’s Environmental Protection Agency.  The partners are sharing resources to explore how recent advances in molecular biology and the computational sciences may improve, inform, and influence our understanding of chemical toxicity and risk.

What are these recent advances in molecular biology?  Recent research has begun to elucidate how chemicals act at the molecular and cellular levels to give rise to a particular effect or set of effects.  In science speak, we refer to this activity as a chemical’s mode of action (MOA).  Of primary interest are those chemical activities that negatively affect – or perturb – the biological pathways underlying “normal” cellular processes and functions.

Biological pathways and chemicals’ MOAs are quite complex, potentially involving many biological components (e.g., DNA, RNA, proteins) and influenced by other factors such as diet and health status.  Scientists dedicate entire careers trying to put the many pieces of these intricate puzzles together.  Here’s just one measure of the complexity:  In each one of our cells there are roughly 3 billion – that’s billion with a ‘b’ – base pairs of DNA!  Now consider trying to figure out what happens when a chemical enters a cell and comes face-to-face with that DNA.  There are equally complex sets of potential interactions at the cell surface and within the cell.  And with groups of cells that form our organs and other vital systems.  Quite a daunting task, but scientists have been learning a lot about those interactions!

Beyond identifying and characterizing the individual MOAs of chemicals at these levels, it is equally critical to place them in the context of the complex physiological interactions occurring in our bodies.  And we’re learning that these interactions vary significantly across the human population, depending on life stage, level and timing of exposures, genetic differences, and so on.  Each of us represents a unique biology in some respects.

That’s a lot for any human mind to grasp.   In fact, it’s impossible – and is the reason why EPA and its partners are investing equally in the development of the computational power necessary to handle and identify meaningful patterns within the massive amounts of biological data being generated.
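To give a flavor of what that pattern-finding looks like in practice, here is a minimal, purely illustrative sketch in Python.  It is not EPA’s actual software, and the chemicals, assays, and numbers are all made up; the idea is simply that chemicals with similar activity “fingerprints” across many rapid cell-based tests can be grouped together for closer scrutiny.

```python
# Toy illustration (not EPA code): group chemicals whose responses across a
# few hypothetical high-throughput assays look alike. Similar "fingerprints"
# can flag chemicals that may share a mode of action and deserve a closer look.
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

# Rows = chemicals, columns = normalized responses in four made-up assays.
chemicals = ["chem_A", "chem_B", "chem_C", "chem_D"]
responses = np.array([
    [0.90, 0.80, 0.10, 0.05],  # active in assays 1-2
    [0.85, 0.75, 0.15, 0.10],  # similar profile to chem_A
    [0.05, 0.10, 0.90, 0.95],  # active in assays 3-4 instead
    [0.10, 0.05, 0.85, 0.90],  # similar profile to chem_C
])

# Hierarchical clustering on the response profiles, then cut into two groups.
tree = linkage(responses, method="average", metric="euclidean")
groups = fcluster(tree, t=2, criterion="maxclust")

for name, group in zip(chemicals, groups):
    print(f"{name}: group {group}")
```

In the real programs, the data matrix covers thousands of chemicals and hundreds of assays (EPA’s ToxCast screening effort is one example), which is exactly why the computational side of the work matters as much as the biology.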

Enterprising industries have for some time been using these emerging understandings of biology and computational science for purposes ranging from drug discovery to green chemical design.  Now EPA is intensifying efforts to capitalize on new science to enhance its ability to more effectively and quickly assess chemical risk.

So the science is really fascinating, and it is spurring a flurry of business and government activity.  But at the end of the day, what does all this have to do with keeping us safe from dangerous chemicals?  Quite a bit, it turns out, which I’ll turn to in future posts in this series.  Stay tuned.


2 Comments

  1. J.J.
Posted March 8, 2011 at 12:55 am

If they are truly using non-animal testing, I think this is a step in the right direction and will reap better information than animal testing. And maybe in the process they’ll find out why so many medications such as Levaquin and Cipro cause horrible side effects.

  2. stephenie hendricks
Posted March 10, 2011 at 4:41 pm

    So if industry or the firms they hire – like ICF, Gradient, etc – receive contracts to assess chemicals in this way, can they skew data to show a chemical is harmless when in fact it can have hazardous effects?