More than weather heating up in DC: Rush-Waxman House bill puts TSCA reform back on front burner

Richard Denison, Ph.D., is a Senior Scientist.

We’ve just moved another step closer to protecting Americans and our environment from dangerous chemicals.

The Toxic Chemicals Safety Act of 2010 (H.R. 5820) has been formally introduced by Congressmen Bobby Rush (D-IL) and Henry Waxman (D-CA).  The legislation would implement a top-to-bottom overhaul of the outmoded and ineffectual 1976 Toxic Substances Control Act (TSCA). 

Environmental Defense Fund, along with the 250 organizations that comprise the Safer Chemicals, Healthy Families coalition, welcomes the legislation and promises a vigorous campaign to advance it in Congress, along with companion legislation, the Safe Chemicals Act of 2010, sponsored by Senator Frank Lautenberg (D-NJ).

Introduction of the House bill represents the culmination of an intensive 3-month process to gather and incorporate feedback on a “discussion draft” that was introduced in mid-April.  Staff of the House Energy & Commerce Committee actively solicited and considered input from a wide array of stakeholders – all sectors of business and industry, health groups, parent groups, the religious community, animal protection organizations, labor, environmental justice and community organizations, and state and national environmental organizations.

While many of the details of the discussion draft have changed as a result, the new legislation goes a long way toward ameliorating the major structural flaws of TSCA.  The table below (updated from a version I posted when the discussion draft was released) shows how the new legislation would correct each of the major flaws in TSCA.

SAFETY DATA
Currently under TSCA: Few data call-ins are issued, even fewer chemicals are required to be tested, and no minimum data set is required even for new chemicals.
Under the Toxic Chemicals Safety Act of 2010: Up-front data call-ins for all chemicals would be required.  A minimum data set (MDS) on all new and existing chemicals, sufficient to determine safety, would have to be developed and made public.

BURDEN OF PROOF
Currently under TSCA: EPA is required to prove harm before it can regulate a chemical.
Under the Toxic Chemicals Safety Act of 2010: Industry would bear the legal burden of proving its chemicals are safe.

ASSESSMENT OF SAFETY
Currently under TSCA: No mandate exists to assess the safety of existing chemicals.  New chemicals undergo a severely time-limited and highly data-constrained review.
Under the Toxic Chemicals Safety Act of 2010: Both new and existing chemicals would be subject to safety determinations as a condition of entering or remaining on the market, using the best available science and relying on the advice of the National Academy of Sciences.

SCOPE OF ASSESSMENT
Currently under TSCA: In the rare case where a chemical assessment is undertaken, there is no requirement to assess all sources of exposure to a chemical, or to assess risk to vulnerable populations.  No guidance is provided on how to determine whether a chemical presents an “unreasonable risk.”
Under the Toxic Chemicals Safety Act of 2010: The safety standard would require EPA to account for aggregate and cumulative exposures across all uses and sources of a chemical, and to ensure protection of vulnerable populations that may be especially susceptible to chemical effects (e.g., children, the developing fetus) or subject to disproportionately high exposure (e.g., low-income communities living near contaminated sites or chemical production facilities).

REGULATORY ACTION
Currently under TSCA: Even chemicals of highest concern, such as asbestos, have not been able to be regulated under TSCA’s “unreasonable risk” cost-benefit standard.  Instead, assessments often drag on indefinitely without conclusion or decision.
Under the Toxic Chemicals Safety Act of 2010: Chemicals would be assessed against a health-based standard, and deadlines for decisions would be specified.  EPA would have authority to restrict production and use, or to place conditions on any stage of a chemical’s lifecycle, as needed to ensure safety.

CHEMICALS AND EXPOSURES OF HIGH CONCERN
Currently under TSCA: No criteria are provided for EPA to identify and prioritize chemicals or exposures of greatest concern, leaving such decisions to case-by-case judgments.
Under the Toxic Chemicals Safety Act of 2010: EPA would develop and apply criteria to identify toxic chemicals that persist and build up in the environment and in people, and would promptly mandate controls to reduce use of and exposure to such chemicals.  “Hot spots” where people are subject to disproportionately high exposures would be specifically identified and addressed.

INFORMATION ACCESS
Currently under TSCA: Companies are free to claim, often without providing any justification, that most information they submit to EPA is confidential business information (CBI), denying access to the public and even to state and local governments.  EPA is not required to review such claims, and the claims never expire.
Under the Toxic Chemicals Safety Act of 2010: All CBI claims would have to be justified up front.  EPA would be required to review them, and only approved claims would stand.  Approved claims would expire after a set period of time.  Other levels of government would have access to CBI.

RULEMAKING REQUIREMENTS
Currently under TSCA: To require testing or take other actions, EPA must promulgate regulations that take many years and substantial resources to develop.
Under the Toxic Chemicals Safety Act of 2010: In addition to the MDS requirement, EPA would have authority to issue an order, rather than a regulation, to require reporting of existing data or additional testing.
This entry was posted in Health policy and TSCA reform.

2 Comments

  1. Steffen Foss Hansen
    Posted July 26, 2010 at 2:36 am | Permalink

    Dear Richard

I know that nanomaterials are probably not the most important issue to address when it comes to revising TSCA, but I wonder what the situation is for nanomaterials and whether you believe they will be adequately covered despite the lack of a “formal” definition of nanomaterials in the proposed legislation.  Also, do you see any important nano-relevant differences between the two versions proposed in the House and in the Senate, respectively, and what kinds of issues do you see for the future in this regard that still need to be carved out?

    Yours,

    Steffen

  2. Charli
    Posted July 28, 2010 at 2:58 pm | Permalink

    Making industrial chemicals safer is something we can all get behind. However, if we want safer chemicals and a safer environment then we must use nonanimal methods of testing.

Currently, many toxicity tests are based on experiments in animals and use methods that were developed as long ago as the 1930s; they are slow, inaccurate, open to uncertainty and manipulation, and do not adequately protect human health. These tests take anywhere from months to years, and from tens of thousands to millions of dollars, to perform. More importantly, the current testing paradigm has a poor record of predicting effects in humans and an even poorer record of leading to actual regulation of dangerous chemicals.

The blueprint for the development and implementation of nonanimal testing is the National Academy of Sciences’ 2007 report, “Toxicity Testing in the 21st Century: A Vision and a Strategy.” This report calls for a shift away from the use of animals in toxicity testing. The report also concludes that human cell- and computer-based approaches are the best way to protect human health because they allow us to understand more quickly and accurately the varied effects that chemicals can have on different groups of people. They are also more affordable and more humane.

These methods are ideal for assessing real-world scenarios, such as mixtures of chemicals, which have proven problematic for animal-based test methods. And they’re the only way we can assess all chemicals on the market.