O Canada!

Richard Denison, Ph.D., is a Senior Scientist.

Some time back, I promised a look at whether Canada’s Chemical Management Plan provides a model for TSCA reform.  This post will provide that look.  Bottom line:  While our neighbor to the north has undertaken and accomplished a great deal over the past decade, it has done so with one hand tied behind its back. 

Canada’s activism on chemical safety issues is remarkable, given that Canada has only about 2% of the global chemicals market and that a large fraction of the chemicals used there are imported.  Its population is also much smaller than that of the US.  Yet Canada was the first country in the world to tackle the huge legacy of un- and under-assessed chemicals that were – as with TSCA in the US – grandfathered into its regulatory system decades ago.

What did Canada do?

In the 1990s, concern was mounting among the Canadian public in response to increasing evidence of widespread exposure to hazardous chemicals, especially those that are persistent or bioaccumulative as well as toxic (PBT), which were accumulating in wildlife and people even in remote regions of the Arctic.  That concern culminated in amendments to the Canadian Environmental Protection Act (CEPA), adopted by Parliament in 1999.

CEPA 1999 mandated a novel approach to identifying and initiating government action on chemicals of concern.  It called for two government agencies – Health Canada and Environment Canada – to review all 23,000 previously unassessed chemicals listed on Canada’s Domestic Substances List (DSL), the counterpart to the US TSCA Inventory.

Relying on existing information, the government was to “categorize” all of these chemicals by developing and applying specific criteria to identify those that:

  • may present, to individuals in Canada, the greatest potential for exposure; or
  • are persistent or bioaccumulative and inherently toxic to human beings or to nonhuman organisms.

CEPA 1999 also mandated that this large task be done within seven years of enactment.  Given that legislative mandate and a strict deadline, it not only got done – it got done on time.

This effort remains the most ambitious initiative undertaken to date by any region of the world to examine large numbers of existing chemicals to identify those requiring further data development, assessment and management.

What was the outcome?

Going into the DSL Categorization, most observers expected at most a few hundred chemicals would be “categorized in,” i.e., meet one or more of the hazard or exposure criteria noted above.  Instead, more than 4,000 chemicals were found to have one or more of the characteristics of concern.  That, in my view, puts to rest any notion that only a small number of chemicals possess properties of potential concern.

The Canadian government then prioritized among these, identifying about 500 as high priority for action.  Most of these are PBTs, though some showed evidence of high human toxicity without being persistent or bioaccumulative.

CEPA 1999 also mandated that screening-level risk assessments be done on all chemicals that were “categorized in.”  It also required that:

  • affirmative decisions be made, based on such assessments, to take no further action, to place the chemical on the Priority Substance List, or to place it on the List of Toxic Substances;
  • for chemicals placed on the Priority Substance List, a decision be made, based on a more detailed assessment and within at most five years, whether or not to place the chemical on the List of Toxic Substances; and
  • for chemicals placed on the List of Toxic Substances, a management strategy be developed and proposed within two years, and finalized and its implementation begun within another 18 months.

Canada has been plugging away at these tasks since completing the DSL categorization in September 2006.

Mandatory assessments?  Affirmative decisions?  Mandatory management strategies?  And deadlines for all of the above?

I must be dreaming – none of this is required for existing chemicals under TSCA!

So, what’s not to like about Canada’s system?

Here’s the hand that was tied behind Canada’s back:  It was forced to rely on already available information, however limited it was.  And it was indeed limited.

Hazard data:  Overshadowed by how many chemicals Canada “categorized in” is the fact that there were also thousands of chemicals for which sufficient data did not exist to allow categorization.  For example, for the more than 11,000 organic substances examined, database searches found:

  • experimental bioaccumulation data for 410 substances, and one-quarter of these data were of acceptable quality;
  • experimental persistence data for 850 substances, and one-third of these data were of acceptable quality; and
  • experimental data on inherent toxicity to nonhuman organisms for 1,051 substances, and three-quarters of these data were of acceptable quality.

To try to make some headway in the face of these huge data gaps, and given its lack of a mandate to compel testing, Environment Canada had to rely heavily on estimation models.  And while it managed to cobble together enough data and estimates to categorize most chemicals, thousands either were not categorized or were categorized with less than high confidence.

Health Canada, in seeking evidence of toxicity to humans, employed a different approach, ranking health endpoints in a hierarchy from more to less serious.  Failure to identify data for a high-concern endpoint simply bounced the chemical down the hierarchy.  Unlike Environment Canada, Health Canada did not reveal the extent of its data gaps, but those gaps were surely large as well.

Unfortunately, given the large number of chemicals that were categorized in and the limited resources of the agencies (remember, Canada has a population one-tenth that of the US), scant attention has been paid to filling data gaps.

Exposure data:  Under CEPA 1999, Health Canada was to identify DSL chemicals posing the greatest potential for exposure (GPE) to humans.  But the data gap was even worse for production, use and exposure information than for hazard data.  The information available to Canada was exceedingly old, dating back to when the DSL was first developed, between 1984 and 1986.

Because of this, and the lack of a routine reporting requirement, Canada is finding that many of the chemicals it categorized in based on their hazards or high exposure potential are no longer manufactured or used in Canada.  Unfortunately, the same data gap raises the converse critical question:  How many chemicals that were not manufactured or used in significant quantities in the mid-1980s are manufactured or used in significant quantities today, and hence pose a risk of significant exposure that was not captured through the DSL Categorization?

At least TSCA has an updating mechanism, called the TSCA Inventory Update Rule (IUR).  But that system is infrequent, incomplete and riddled with exemptions, too-high reporting thresholds, and other loopholes; see here and here.

Conclusion

While much can be learned from the Canadian experience, it is not a sufficient model for US reform.  Industry likes it (see here, for example) – I suspect in large part because it puts very little burden on them, and a lot on government.  That’s one thing when you’re dealing primarily with imports, quite another when you’re dealing with producers – and the biggest market for chemicals of any country in the world.  We can and should do better.

When it comes to prioritization, the basic choice is between:

  • muddling through with whatever data can be cobbled together and then addressing whatever happens to rise to the top; OR
  • acknowledging that available data are insufficient to effectively prioritize among chemicals, and therefore first collecting a good baseline of data, and then using that to prioritize.

The former approach relies on incomplete and insufficient hazard data for the great majority of chemicals and on questionable, ill-informed assumptions about use and exposure (which sounds a lot like EPA’s ChAMP initiative before it was suspended).  It also risks focusing only on those chemicals about which we already know enough to know they’re problems, while ignoring the rest – the proverbial search for car keys lost somewhere in a dark parking lot, conducted only under the streetlights because the light is better there.

That approach also prevents us from identifying with confidence which chemicals could be safer and from avoiding so-called “regrettable substitutions”: even if the limited available data don’t show a problem, what we don’t know about such chemicals could still hurt us.

4 Comments

  1. Jen Sass
    Posted July 27, 2009 at 9:38 am

    Very useful summary. Thanks, as always, for helping to keep us all updated on chem policy here and across the border.

  2. tolga
    Posted July 27, 2009 at 10:35 am

    This is a great post, thank you.

  3. Richard Wiles
    Posted July 27, 2009 at 1:12 pm

    The key problem you identify with the Canadian system – reliance on old and incomplete data to generate lists of priority chemicals – is shared by just about all efforts at chemical policy reform passed to date. Indeed, what constitutes reform so far has basically been the creation of priority lists that are themselves compiled from existing lists like California’s Prop 65, the National Toxicology Program’s list of carcinogens, the TRI’s list of PBTs, or any number of other lists that have been assembled by environmental and health authorities.

    The critical missing ingredient in all of these lists is whether or not people are exposed to any of the chemicals on them.

    Last week, Maine published its list of 1,700 chemicals of high concern, derived from 12 authoritative lists of chemical hazards from around the world. This list sends an important signal to the industry that some compounds are in the cross hairs, particularly persistent bioaccumulators. But none of these 12 source lists provide any meaningful information on whether people are exposed to these chemicals.

    To set priorities for action we need authoritative data on which chemicals are in people.

    A review of the science from the CDC, our own work at EWG, and reports from many other scientists from around the world have produced a relatively short list of currently used chemicals found in people. But for most compounds on the priority lists in Canada, Maine, and elsewhere there are no reliable data at all on human exposure.

    Until we demand human exposure data on chemicals of concern we will be stuck with long and not very useful lists of chemicals with known hazard characteristics, but no sound basis for moving forward to protect the public from the chemicals that present the greatest risk to human health.

    Richard Wiles

  4. Posted August 7, 2009 at 11:59 am

    Richard:
    Thanks for your thoughts, with which I agree. A cornerstone of chemicals policy reform must be development of sound use and exposure information on chemicals in commerce, which otherwise is virtually always the weakest link. Relying on supposition and simplistic assumptions, rather than empirical data, about what uses of chemicals actually lead to exposure has led us astray time and again. That’s why we support both comprehensive use reporting and a significantly expanded biomonitoring program as key elements of chemicals policy reform.
    Best,
    Richard Denison