ACC’s chemical prioritization tool: Helpful, but flawed and off the mark for EPA to use without TSCA reform

Richard Denison, Ph.D., is a Senior Scientist.

As I noted in my last post, the American Chemistry Council (ACC) issued its own “prioritization tool” in anticipation of the Environmental Protection Agency’s (EPA) public meetings to get input on the approach it will use to identify additional chemicals of concern under its Enhanced Chemicals Management Program.

In the context of TSCA reform, various actors in the industry have long called for prioritization, often saying they support EPA’s ability to get off to a quick start on identifying chemicals for further work – only to propose schemes that are more likely to do the opposite.

ACC itself has over time come off as a bit schizophrenic on prioritization, apparently being for it before it was against it.  ACC’s release of its tool puts it squarely back in the pro-prioritization camp, but just what is it proposing?  My sense is it’s after something quite different from what EPA proposes, and frankly, different from what EPA is currently capable of deploying, given its limited authority and resources under TSCA.  In this sense, ACC’s proposal is more relevant in the context of TSCA reform, where we presumably would have an EPA with a mandate to review all chemicals in commerce, the authority to readily get the data it needs, and the resources required to execute the kind of comprehensive prioritization scheme ACC proposes.

But setting that disconnect aside for the moment, let’s delve a bit deeper into the ACC proposal on its own merits. 

ACC’s proposal is welcome in several ways:  First, it’s substantive and specific (which I haven’t always been able to say about what ACC has offered in the past).  It’s so much easier to start working toward common ground when you know where the other guy is coming from.

Second, there are some refreshing elements and acknowledgments:

  • ACC at least implicitly notes (p. 2) that there are gaps in available hazard data for many chemicals – and that a chemical with such gaps should be elevated in priority to a high ranking.  (Unfortunately, ACC makes no such provision for what is arguably an even larger knowledge gap for chemicals:  data on use and exposure.  This is one of several ways in which ACC’s tool over-relies on limited exposure information.)
  • Chemicals with multiple uses would be assigned the overall exposure ranking corresponding to the use with the greatest potential exposure (p. 5) – an appropriately conservative approach.  (Unfortunately, this is not the approach ACC uses in other cases; more below).
  • ACC rightly criticizes at some length (p. 9) EPA’s reliance on presence in children’s products as insufficiently indicative of kids’ exposure – noting, for example, that products used in the home but not by children may well lead to higher exposures.  (Unfortunately, it relegates children’s exposure potential to a second-tier consideration in prioritization.)
  • ACC appropriately proposes using production volume (p. 6) as one of several surrogate measures of exposure – a bit ironic, given how much the industry railed against the European Union’s REACH Regulation for doing the same.  (Unfortunately, ACC reserves its “high” ranking for those few chemicals produced at the staggeringly high level of 100 million pounds per year – 100 times the threshold EPA uses to designate a chemical as high production volume (HPV).)
  • ACC proposes that EPA be able to use its professional judgment (p. 1) in certain aspects of prioritization – though it then appears to limit that allowance to hazard ranking (not exposure ranking!) and second-tier considerations.  (And as we’ll see below, little evidence of such flexibility is evident in the details of ACC’s proposal.)

There are also a number of quite problematic aspects of ACC’s proposal:

Overly rigid rules applied in lockstep

For an organization that has frequently asserted that the greatest strength of TSCA has been its flexibility, ACC has produced a remarkably rigid tool for prioritization.  With calculator-like precision, neatly assigned little numbers get tallied up in the ACC tool:  Each element gets a numeric score; the scores are then added up and banded to yield crisp overall totals, which are finally assigned high-, medium- or low-priority status.  But the real world is not quite so reducible to simple arithmetic.
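To make the concern concrete, here is a minimal sketch of the kind of additive score-and-band arithmetic the tool appears to employ.  The element names, weights, and band cutoffs below are hypothetical stand-ins for illustration, not ACC’s actual values.

```python
# Illustrative sketch of an additive score-and-band prioritization scheme.
# Element names and cutoffs are hypothetical, not ACC's actual values.

def band(total, high_cutoff, low_cutoff):
    """Band a summed score into a priority tier."""
    if total >= high_cutoff:
        return "high"
    if total <= low_cutoff:
        return "low"
    return "medium"

def prioritize(scores, high_cutoff=12, low_cutoff=6):
    """Sum per-element scores, then band the total."""
    return band(sum(scores.values()), high_cutoff, low_cutoff)

# A hypothetical chemical scored on four elements:
chemical = {"hazard": 4, "production_volume": 3, "use_pattern": 2, "pb": 2}
print(prioritize(chemical))  # "medium" under these illustrative cutoffs
```

The arithmetic is tidy, which is exactly the problem the surrounding discussion raises: every judgment about a chemical is flattened into one sum before any professional judgment can intervene.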

ACC’s tool demands EPA use certain “rules” in the name of sound science and consistency:

The “equal basis” rule:  Most prominent among these rules is that “the hazard and exposure elements should be applicable across all substances being evaluated” (p. 10), “rather than just those information elements available only for subsets of chemicals” (p. 1).

By this sleight of hand, ACC manages to rule out any type of information that may indicate a hazard or exposure of high concern unless it has been measured across basically all chemicals subject to prioritization.  This rule may well help to explain ACC’s relegation of any direct evidence of human or environmental exposure – e.g., biomonitoring and environmental release and media monitoring data – to second-tier consideration, because such data aren’t collected for all chemicals.  ACC instead would have EPA resort to extremely narrow and rigid definitions and measurements of persistence and bioaccumulation potential even where direct real-world exposure data exist (more on this below).

The lockstep application of this rule would have EPA ignore a chemical like perfluorooctanoic acid (PFOA) because it accumulates in blood rather than in fat tissue – the latter being the only kind of data that are available for many if not most chemicals and to which ACC restricts its bioaccumulation criterion.

ACC’s rule would also have EPA ignore other chemicals with unique or uncommon properties simply because either most chemicals haven’t been examined for those properties or because those properties actually distinguish certain chemicals from most others.  An example of the former might be a chemical deemed of concern because it is known to disrupt expression of a particular gene, while an example of the latter would be virtually all nanomaterials with unique size-dependent behavior that only shows up at the nanoscale.

High hazard and high exposure:  A second such rigid rule in ACC’s tool is that only chemicals for which high hazard and high exposure can be demonstrated warrant high priority.  While such chemicals certainly merit prioritization, applying this as a hard-and-fast rule is overly limiting of professional judgment.

I noted in my last post that ACC invokes the Canadian approach to categorization to support its tool.  But ACC fails to point out that the criteria Canada used to screen its inventory included separate criteria for hazard and exposure, and any chemical meeting either advanced to the next stage.

Given the large gaps in hazard and exposure data for many chemicals, it’s simply shortsighted to automatically set aside as low priority any chemical for which evidence of both high hazard and high exposure is lacking – without any regard for how high the hazard or exposure might be.  A potent developmental toxicant for which there is uncertainty about the extent to which pregnant women or infants are exposed may well warrant prioritization; likewise for a chemical released to the environment that is highly bioaccumulative and where there is suggestive but not definitive evidence of serious hazard.

This need is especially acute given that one of the key actions to be taken on chemicals that are prioritized is to get more information on their hazards, uses and exposures – a step that would be forgone if a high-hazard or high-exposure chemical were set aside indefinitely.

Over-relying on exposure information – appropriately called the “weakest link” in risk assessment – to relegate high-hazard chemicals to low priority is especially problematic – a topic on which I have blogged at some length earlier.

Persistent and bioaccumulative:  A third rigid rule relates both to how ACC defines these P and B properties and to how ACC would only assign high priority to chemicals that are both P and B.  Here again, in invoking Canada as the ideal approach, ACC fails to acknowledge that that country’s criteria captured chemicals that, in addition to being toxic, were found to be persistent or bioaccumulative.

ACC’s tool uses extremely narrow definitions of P and B, presumably due in part to the “equal basis” rationale that more data exist from tests based on the narrow definitions.  The B definition, for example, assumes that the only means by which chemicals bioaccumulate is by being taken up from water into the fat tissue of aquatic organisms.  Yet bioaccumulation can occur in other tissues (e.g., blood, bone) and by other routes, for example, through food-web uptake and accumulation by air-breathing animals.

Many chemicals that may not qualify as P or B using ACC’s narrow definitions are for all intents and purposes persistent or bioaccumulative.  This is often the case for chemicals that are frequently or even continuously released into the environment or to which people are routinely exposed.  Bisphenol A is not P or B, yet shows up in the bodies of more than 90% of the American population.  Why?  Because exposure to it is ubiquitous and ongoing, it’s being replaced as fast as it’s being eliminated.  It makes no sense for EPA to be required to ignore that fact.

There’s every reason to consider the data on P and B that ACC proposes be used – but there’s also every reason not to stop there.  Especially if data from biomonitoring and monitoring of environmental releases and media reveal direct evidence of persistence, EPA can and should consider this information in making prioritization decisions.  Yet ACC’s tool would relegate such data to at best second-class status.

Consistent use of the least conservative classification values

For toxicity, ACC proposes that EPA rely on classification criteria developed under the Globally Harmonized System (GHS) for Classification and Labeling (p. 1). 

(Now, I simply must stop here for a moment to flag a statement in ACC’s document (p. 2) that I can only hope is an inadvertent – if gross – misstatement.  ACC claims that “GHS classification information is readily available for all substances, as U.S. manufacturers have developed GHS classifications for their products to meet international requirements.”  But GHS does not require any company to generate any data where it doesn’t already exist; it simply provides a means of classifying already available data.  Because GHS provides criteria for dozens of different endpoints, I can imagine that most chemicals will have some data for some endpoints for which GHS provides classification criteria.  But it is simply untrue that companies have data for all such endpoints.  A very small number of chemicals have been tested for carcinogenicity, for example, yet GHS provides criteria for this endpoint.)

But back to ACC’s tool:  I generally support ACC’s proposal that EPA rely on GHS criteria, but with two caveats:  First, GHS does not include every endpoint of concern, and its use should not limit EPA’s ability to consider other health or environmental endpoints.

Second, GHS’ cutoff values must be used faithfully – and here, ACC fails badly.

ACC’s Table 2 (p. 3) lists what it says are cutoff values for repeat dose toxicity test data.  But the table neglects to specify the corresponding test duration or to note that the cutoff values depend on the duration of the repeat dose test.  Instead, ACC uses the least conservative values – those corresponding to the less commonly used 90-day test duration.  Applying ACC’s cutoff values to data from the much more commonly used 28-day repeat dose test would relegate what GHS would classify as a high-toxicity chemical to a lower ranking.  See Table 6, p. 17, in this EPA document summarizing the GHS repeat dose criteria.
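The duration dependence is easy to illustrate.  In the sketch below, the 90-day oral guidance values (Category 1 at 10 mg/kg bw/day, Category 2 at 100 mg/kg bw/day) are GHS’s stated values; scaling them for other study durations by the ratio of durations is a simplified rendering of GHS’s dose-time extrapolation guidance, used here only to show how applying fixed 90-day cutoffs to 28-day data under-ranks a chemical.

```python
# Hedged sketch: why repeat-dose cutoffs must be tied to study duration.
# The 90-day oral guidance values are GHS's; the linear duration scaling
# is a simplification of GHS's dose-time extrapolation guidance.

CUTOFFS_90_DAY = {"category_1": 10.0, "category_2": 100.0}  # mg/kg bw/day, oral

def classify_repeat_dose(effect_level, study_days):
    """Classify an oral repeat-dose effect level, scaling cutoffs to duration."""
    scale = 90.0 / study_days  # shorter study -> proportionally higher cutoffs
    if effect_level <= CUTOFFS_90_DAY["category_1"] * scale:
        return "Category 1"
    if effect_level <= CUTOFFS_90_DAY["category_2"] * scale:
        return "Category 2"
    return "not classified"

# The same 25 mg/kg bw/day finding: Category 1 when the cutoffs are scaled
# for a 28-day study, but only Category 2 if 90-day cutoffs are applied
# to it regardless of duration -- the under-ranking described above.
print(classify_repeat_dose(25, study_days=28))  # Category 1
print(classify_repeat_dose(25, study_days=90))  # Category 2
```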

This is not the only case in which ACC has selected the least conservative cutoff values:

  • Persistence:  ACC’s tool would designate any chemical with a degradation half-life in water, soil or sediment of less than 180 days as “non-persistent” (p. 7).  Yet EPA’s own PBT criteria, also used in the New Chemicals Program, would classify chemicals with half-lives all the way down to 16 days as at least moderately persistent!  Here are EPA’s criteria (from Table 12, p. 23 in this EPA document):
    • > 180 days half-life = Very highly persistent
    • 60-180 days half-life = Highly persistent
    • 16-59 days half-life = Moderately persistent
  • Bioaccumulation:  ACC’s tool would only designate a chemical as bioaccumulative if its fish bioaccumulation factor (BAF) or bioconcentration factor (BCF) exceeded 5,000 (p. 7).  Yet here again, EPA’s own PBT criteria, also used in the New Chemicals Program, would classify chemicals with BAF or BCF values all the way down to 100 to be at least moderately bioaccumulative!  Here are EPA’s criteria (from Table 13, p. 24 in this EPA document):
    • >5000 BAF/BCF = Very highly bioaccumulative
    • 1,000-5,000 BAF/BCF = Highly bioaccumulative
    • 100-1,000 BAF/BCF = Moderately bioaccumulative
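Expressed as simple lookup functions, the contrast between EPA’s graduated bands and ACC’s single bright lines becomes stark.  This sketch is just a restatement of the criteria quoted above, not either organization’s implementation.

```python
# EPA's PBT banding criteria as quoted above, expressed as lookups,
# set against ACC's single cutoffs (half-life < 180 days = "non-persistent";
# BAF/BCF <= 5,000 = not bioaccumulative).

def epa_persistence(half_life_days):
    """EPA banding for degradation half-life in water, soil, or sediment."""
    if half_life_days > 180:
        return "very highly persistent"
    if half_life_days >= 60:
        return "highly persistent"
    if half_life_days >= 16:
        return "moderately persistent"
    return "not persistent"

def epa_bioaccumulation(baf_or_bcf):
    """EPA banding for fish BAF/BCF."""
    if baf_or_bcf > 5000:
        return "very highly bioaccumulative"
    if baf_or_bcf >= 1000:
        return "highly bioaccumulative"
    if baf_or_bcf >= 100:
        return "moderately bioaccumulative"
    return "not bioaccumulative"

# Chemicals ACC's bright lines would wave through:
print(epa_persistence(120))       # highly persistent, yet "non-persistent" to ACC
print(epa_bioaccumulation(2000))  # highly bioaccumulative, yet below ACC's line
```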

ACC’s tool fails on two counts:  Not only does it use the least conservative values, which would relegate many P and B chemicals to low priority; it also takes two critical chemical properties that manifest themselves along a broad continuum and assigns each a single bright line – when all authoritative bodies have explicitly acknowledged the continuous nature of such properties by designating multiple classification categories, ranging from very high to low or very low.

Over-relying on limited exposure information and discounting evidence of hazard

In several subtle ways, ACC’s tool reflects its longstanding tendencies to over-rely on limited exposure information and discount evidence of hazard:

  • The tool collapses its health hazard and environmental hazard rankings into a single score (taking the higher of the two), whereas it combines the scores for its three exposure elements.  This means that a chemical that harms both people and other organisms only gets counted once, while a chemical that is low-volume, used only as an intermediate, and not P or B gets credit for being of low concern on all three attributes.
  • The tool’s scale for hazard runs only from 1-4, whereas its exposure scale runs from 1-5.  Because these scores ultimately get combined, it’s that much harder for a high-hazard chemical to get a high overall ranking than it is for a low-exposure chemical to get a low overall ranking.
  • High-exposure scenarios that occur in industrial and commercial settings get discounted (Table 3, p. 5).  Only chemicals with consumer exposure get a high ranking; this means that even if large numbers of workers are exposed to a very high-hazard chemical, that chemical automatically gets assigned a lower exposure priority.  Again, this approach is too rigid:  EPA needs to be able to elevate in priority a chemical where the risk to a subset of the population is disproportionately high.
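The scale mismatch in the first two bullets can be shown in a few lines.  The additive combination rule here is an assumption for illustration; the exact formula matters less than the asymmetric ranges themselves.

```python
# Sketch of the asymmetry described above: with hazard capped at 4 and
# exposure running to 5, the worst possible hazard contributes less to a
# combined score than the worst possible exposure can.  Simple addition
# is assumed as the combination rule for illustration.

HAZARD_MAX, EXPOSURE_MAX = 4, 5

def combined(hazard, exposure):
    """Assumed additive combination of the two rankings."""
    return hazard + exposure

worst_hazard = combined(HAZARD_MAX, 1)    # maximally hazardous, minimal exposure
worst_exposure = combined(1, EXPOSURE_MAX)  # minimal hazard, maximal exposure
print(worst_hazard, worst_exposure)  # 5 6 -- exposure alone outweighs hazard alone
```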


While ACC’s tool has some serious flaws and is not something that EPA has the authority or resources to utilize under current TSCA, ACC has put forth a serious proposal for prioritization that should help to raise the level of debate over this critical issue in TSCA reform. 

This entry was posted in Health policy, Regulation, TSCA reform.

One Comment

  1. alvord
    Posted September 22, 2011 at 4:14 pm

    Developing prioritization tools for chemical testing/regulation under TSCA is a fool’s errand. My proof? EPA/TSCA history. This type of prioritizing effort has all been done, one way or another, many times before. Two examples: the endocrine disruptor project or, going further back, the RM1/RM2 effort. Would anybody really call either of those two efforts to prioritize chemicals for testing and/or regulation a success, or worth the effort that was put into them? Those are just two of the countless efforts EPA and stakeholders have made to prioritize chemicals under TSCA for testing and/or regulation. Most efforts have been lost in the sands of time. Yet here EPA, industry and enviros are again, still trying to prioritize chemicals to test them or regulate them or something. I know people think that this time it will be different. No, it won’t be different. You will take a long time coming up with a scheme, put it into effect, perhaps begin to use it and then get bogged down in disagreements about the prioritization methodology or implementation or something else. That is the history of TSCA. To pursue prioritization as a goal in itself is a gift to those who want to delay and avoid chemical testing or regulation or those who want to avoid having to make a regulatory decision.

    Why doesn’t EPA try something different? How about EPA scientists, engineers, economists and others use their judgment and experience to select some chemicals to test or possibly regulate? Don’t get bogged down in developing a formal prioritization process. Then develop, for each chemical individually, a paper that lays out the rationale for the testing and/or regulation of that chemical. The important thing is that EPA provide an adequate rationale for testing and/or regulating the chemicals it selects, regardless of the chemicals’ “priority.” Keep doing this for more chemicals as long as there is the need and the funding is available. Don’t give those who want to avoid testing/regulation or those who want to avoid making regulatory decisions any more gifts.