Climate 411

EPA’s Proposed Carbon Pollution Standards are Legally and Technically Sound

America is building cleaner cars, more efficient freight trucks, and smarter power systems.

Wind power was the top source of capacity additions for new electricity generation in 2012, with states like Oklahoma, Texas, Kansas, Iowa, Minnesota, and Colorado leading the way.

Yet even as American companies build cars that are leading the world in fuel economy and saving families money at the pump, and as innovative new wind turbines provide zero-emitting electricity for all of us and a stable income source for farmers and ranchers, supporters of high-emitting coal power claim that the coal industry is not capable of deploying advanced technologies to cut carbon pollution.

On September 20th, the U.S. Environmental Protection Agency (EPA) proposed Carbon Pollution Standards that will provide the first nationwide limits on carbon pollution from new power plants. The Carbon Pollution Standards could be met with clean renewable energy resources or with fossil fuel plants such as efficient combined-cycle natural gas plants or coal plants that use carbon capture and storage (CCS) technology to control their carbon emissions.

But coal’s boosters have attacked the long overdue EPA standards, asserting that coal is unable to use modern technologies. Last month, Majority members of the House Energy and Commerce Committee sent a letter to EPA asking the agency to withdraw the proposed standards.  The letter argues that because three of the coal plants currently being built to use CCS receive funding under the Energy Policy Act of 2005 (EPAct), EPA cannot rely on those plants to support its determination that CCS is an adequately demonstrated technology and the best system of emission reduction for coal-fired power plants.

As this legal analysis shows, EPA’s proposal is technically and legally sound.

Although EPAct provides that an innovative technology supported under that Act cannot by itself prove that the technology is adequately demonstrated, EPA relied on a broad body of evidence beyond the three EPAct-funded plants in identifying CCS as the best system of emission reduction for coal-fired power plants.

EPA’s finding that CCS is adequately demonstrated is in line with what the power industry itself has said.  American Electric Power’s former CEO and president Mike Morris had this to say about the company’s Mountaineer CCS project in 2011:

We’re encouraged by what we saw. We’re clearly impressed with what we learned and we feel that we have demonstrated to a certainty that carbon capture and storage is in fact viable technology for the United States and quite honestly for the rest of the world going forward.

There is no time to delay our transition to a clean energy economy. The United States experienced twelve separate climate disasters in 2012, each costing over a billion dollars, and climate change continues to impact the health and wellbeing of our families and communities every day. As the success of clean energy and energy efficiency programs across our country demonstrates, the solutions are at hand. We have but to deploy them.

While coal refuses to innovate, the world is turning toward cleaner energy. Earlier this year the U.S. and the World Bank announced that they would no longer finance dirty coal projects abroad. Meanwhile, wind farms continue to crop up across America’s heartland.

As a Midwesterner, I am thankful that there is a bolder vision for America – of engineers, welders, fabricators, and inventors, working together, who know that we can and we must make clean energy our future.  For our sake, and for our children and grandchildren.


Global climate change can make fish consumption more dangerous

Hundreds of thousands of babies are born in the U.S. each year with enough mercury in their blood to impair healthy brain development. As they grow, these children’s capacity to see, hear, move, feel, learn and respond can be severely compromised. Why does this happen? Mostly because a portion of the mercury emitted from local power plants and other global anthropogenic sources is converted to methylmercury, a neurotoxic and organic form of mercury that accumulates in fish.

In addition to poisoning the human diet, mercury continues to poison the Arctic. Despite a lack of major industrial sources of mercury within the Arctic, methylmercury concentrations have reached toxic levels in many Arctic species, including polar bears, whales, and dolphins, because of anthropogenic emissions at lower latitudes.

Relationship between mercury exposure and climate change: In its latest report to policymakers, the Intergovernmental Panel on Climate Change (IPCC) has made it clear that climate change and local high temperatures will worsen air pollution by increasing concentrations of ozone and PM2.5 in many regions. However, no scientific body has collectively assessed the potential impact of a changing climate on mercury, a dangerous pollutant that contaminates not just our air but also our soils and waters (and, as a result, the food supply of humans and wildlife).

After attending this summer’s International Conference on Mercury as a Global Pollutant (ICMGP) in Edinburgh, Scotland, I don’t have good news. In the past few months, I have talked to several leading scientists who do research on different aspects of the mercury cycle, and they all seemed to agree with many recently presented and published peer-reviewed studies (see a selected list below): climate change can significantly worsen mercury pollution. Even if the global anthropogenic mercury emission rate were somehow held constant, climate change could make fish consumption more dangerous because of the following:

Enhanced inorganic mercury release into waters — A combination of the following climate-related factors can lead to the release of higher amounts of mercury into waters:

  • Increased local precipitation under warmer conditions will cause more direct local deposition of emitted inorganic mercury onto our lakes and oceans than occurs under colder and drier conditions.
  • Run-off (i.e., the flow of mercury over land in a watershed that drains into a water body), an indirect but primary means by which mercury enters our local waters, will also increase under warmer and wetter conditions.
  • Extreme events (storms, hurricanes, forest fires, tornadoes, and alternating wetting-drying cycles) will erosively mobilize inorganic mercury and organic matter in soils and release them into coastal and open waters, where the mercury can be methylated.
  • Thawing of the enormous areas of northern frozen peatlands may release globally significant amounts of long-stored mercury and organic matter into lakes (including those in the Arctic), rivers, and the ocean.

Enhanced methylmercury production from inorganic mercury: In addition to the increased release of inorganic mercury into waters, that mercury may also be more likely to be converted to methylmercury.

  • In the open ocean, methylmercury is produced in regions known as “oxygen minimum zones.” Increased carbon dioxide concentrations in the atmosphere will cause higher primary productivity, which will widen the ocean’s existing oxygen-deficient zones, leading to enhanced production of methylmercury.
  • Continued melting of permafrost will release organic matter that naturally contains a high concentration of aromatic structures (structures similar to benzene rings). This kind of organic matter has been shown to enhance the rate of methylmercury production.

Enhanced methylmercury bioaccumulation in the fish:

  • For a given amount of methylmercury in the water, various factors control the concentration and bioaccumulation of methylmercury in the food chain. In a given water body, bigger fish accumulate more methylmercury than smaller fish. Because of climate change, ocean temperatures will be higher, and higher temperatures have been shown to increase the metabolic growth rate and size of fish. Therefore, for a given amount of inorganic mercury emitted into the atmosphere or water, more methylmercury will accumulate in fish (consequently increasing human exposure to methylmercury) as climate change becomes more severe.

These research results, combined with recent reports of higher genetic susceptibility of some people to mercury poisoning, suggest that in order to protect human and wildlife health from the negative effects of methylmercury exposure, it is essential to swiftly enact and implement stringent laws that reduce both global mercury and greenhouse gas emissions from all major sources, including coal power plants.

Governments across the globe now recognize that mercury is an extremely toxic metal that harms the health of millions of children and adults every year, and they have moved forward with an international treaty to address this toxic pollution, called the Minamata Convention. The Minamata Convention was recently opened for signatures after four years of negotiations and will come into effect as soon as the 50th nation ratifies it. It has already been signed by 93 nation-states, and I am happy to note that the United States was the first nation to ratify it. We await, however, ratification from 49 more countries before the treaty can go into effect.

As an organization, EDF has been educating consumers and seafood businesses about mercury in seafood for many years via our EDF Seafood Selector and our quantitative synthesis of mercury in commercial seafood. We also have expertise in the scientific, legal, and stakeholder processes that laid the groundwork for implementation of the Mercury and Air Toxics Standards in the U.S.; the health and economic implications of these emission standards; and the current state of technology available to reduce emissions from power plants in the U.S.

Thanks to your strong support, the U.S. has taken action to reduce mercury from power plants, the largest domestic source of mercury pollution. While many power plant companies are moving forward with investments to reduce mercury pollution, we need you to continue making your voices heard, because the mercury standards (MATS) are still being challenged in court.

References

  1. Kathryn R. Mahaffey, Robert P. Clickner, and Rebecca A. Jeffries (2009). Adult Women’s Blood Mercury Concentrations Vary Regionally in the United States: Association with Patterns of Fish Consumption (NHANES 1999–2004). Environ Health Perspect. 117(1): 47–53.
  2. Goacher, W. James and Brian Branfireun (2013). Evidence of millennial trends in mercury deposition in pristine peat geochronologies. Presented at the 11th International Conference on Mercury as a Global Pollutant; Edinburgh, Scotland.
  3. Dijkstra JA, Buckman KL, Ward D, Evans DW, Dionne M, et al. (2013). Experimental and Natural Warming Elevates Mercury Concentrations in Estuarine Fish. PLoS ONE 8(3): e58401. doi:10.1371/journal.pone.0058401.
  4. Webster, Jackson P., et al. (2013). The Effect of Historical and Recent Wildfires on Soil-Mercury Distribution and Mobilization at Mesa Verde National Park, Colorado, USA. Presented at the 11th International Conference on Mercury as a Global Pollutant; Edinburgh, Scotland.
  5. Blum et al. (2013). Methylmercury production below the mixed layer in the North Pacific Ocean. Nature Geoscience 6: 879–884.
  6. Stramma, Lothar (2010). Ocean oxygen minima expansions and their biological impacts. Deep Sea Research Part I: Oceanographic Research Papers 57: 587–595.
  7. Bjorn, Erik, et al. (2013). Impact of Nutrient and Humic Matter Loadings on Methylmercury Formation and Bioaccumulation in Estuarine Ecosystems. Presented at the 11th International Conference on Mercury as a Global Pollutant; Edinburgh, Scotland.
  8. Bedowski, Jacek, et al. (2013). Mercury in the coastal zone of Southern Baltic Sea as a function of changing climate: preliminary results. Presented at the 11th International Conference on Mercury as a Global Pollutant; Edinburgh, Scotland.
  9. Grandjean, Philippe, et al. (2013). Genetic vulnerability to MeHg. Presented at the 11th International Conference on Mercury as a Global Pollutant; Edinburgh, Scotland.
  10. Qureshi et al. (2013). Impacts of Ecosystem Change on Mercury Bioaccumulation in a Coastal-Marine Food Web. Presented at the 11th International Conference on Mercury as a Global Pollutant; Edinburgh, Scotland.

UN talks produce a strong agreement on forest protection, but otherwise déjà vu

This post originally appeared on EDF’s Climate Talks blog. 

Around midnight on Friday, November 22 – several hours after the annual UN climate conference was scheduled to have ended – I stood in the hallway of a temporary conference center erected on the soccer pitch of the National Stadium in Warsaw, watching the scrum of the climate talks in their final hours.


Nat Keohane is EDF’s Vice President for International Climate and a former economic adviser to the Obama administration.

NGO representatives were pitching stories and sharing intelligence with reporters, negotiators were huddling in groups or dashing off to last-minute bilateral meetings, and everyone was scrounging for coffee or late-night sandwiches to power another all-nighter.

The talks appeared on the brink of failure as countries deadlocked over the core questions of which countries should be obligated to reduce emissions and who should pay for it. In the end, as nearly always happens at these annual UN talks, an agreement was reached and the talks didn’t fall apart.

If the scene was familiar, the headlines that came out of the talks were familiar as well: Developing Nations Stage Protest at Climate Talks (NY Times); UN presses rich nations to act on climate funds (FT); Modest deal breaks deadlock at UN climate talks (AP); UN talks limp towards global 2015 climate deal (Reuters); Climate Finance Battle Shows Expectation Gap at UN Talks (Bloomberg).

But despite the dulling sense of déjà vu that Friday night in Warsaw, there was already reason for celebration. That’s because earlier that same evening – in a break with past years – the Conference of the Parties (or COP, as the talks are formally labeled) had already held the first part of its closing plenary to formally adopt decisions on areas in which negotiators could agree.

During that session, the COP adopted a comprehensive agreement on Reducing Emissions from Deforestation and forest Degradation (REDD+) – leading to what the UN, countries, media outlets and NGOs all identified as a bright spot in the negotiations.

Forest protection remains a crucial part of the climate action toolkit

With deforestation responsible for about 15% of the world’s manmade greenhouse gas emissions – that’s more than all the cars and trucks in the world – we can’t solve climate change without saving our forests. REDD+ creates economic incentives to reward countries and jurisdictions that reduce emissions from deforestation and degradation below rigorously defined baselines.

The Warsaw Framework for REDD+ Action, as it’s formally known, sets down deep roots for REDD+, and sends a clear signal that it will continue to be a crucial tool for protecting forests and the people who depend on them, by:

  1. ensuring a rigorous, transparent framework for measuring emissions reductions from reduced deforestation;
  2. affirming that financial flows will be “results-based,” meaning that REDD+ compensation will be tied to demonstrated results; and
  3. creating a structure for forest nations to share views on the effectiveness of REDD+ implementation.

The REDD+ outcome was a “big step forward,” my colleague and EDF REDD+ expert Chris Meyer told E&E News, explaining:

We had a foundation for the house; now we have the walls, the plumbing, the electricity and the roof for REDD+.

On the issue of forest protection, at least, the UN talks did exactly what they are supposed to do: they reaffirmed work that had been done in previous years, built upon it in negotiating sessions held over the past twelve months, and made the final push to resolve key issues of disagreement in the two weeks of talks in Warsaw.

This comprehensive package of decisions provides a structure for countries to develop REDD+ programs at a national level, and take advantage of the approximately $700 million per year already pledged for REDD+ program preparation and to pilot results-based payments.

The REDD+ agreement also opens a path for the International Civil Aviation Organization and other bodies that are considering developing market-based mechanisms, whether multilateral, national or regional, to bring REDD+ into their systems with the imprimatur of a multilateral standard.

Beyond REDD+, little formal progress

Outside of REDD+, the talks were notable more for what didn’t happen than what did. The talks didn’t make significant progress, although they managed not to collapse.

With two years until a new agreement is supposed to be reached in Paris, countries didn’t set a clear template for what they need to announce in terms of emissions reductions targets, or when they need to announce the targets. Nor did they make much progress on the key issue of climate finance – although surprisingly constructive talks on the difficult issue of compensating the world’s most vulnerable countries for the impacts of climate change reached a compromise agreement to create the Warsaw International Mechanism on Loss and Damage to address the issue going forward.

On two important but lower-profile issues, there appeared to be signs of common ground behind closed doors – but these didn’t translate into movement in the formal negotiations.

On the issue of agriculture, useful conversations occurred that could help integrate agriculture into a more holistic discussion of the role of the land sector in responding to climate change, even though no formal progress was made in the context of these negotiations.

On the critical question of how to construct an international climate architecture that promotes and supports ambitious national action through carbon markets, countries put some useful options on the table – but could not reach a decision, instead deferring further discussion until next June.

To be sure, we never expected much to happen at these Warsaw talks. They were always going to be more about headaches than headlines.

But it’s hard to escape the sense that countries spent two weeks reopening issues that we thought had been resolved and fighting the same battles that have been fought before, only to make a last-minute lunge in the final hours to finish barely ahead of where they started.

A good example is on the key question of participation. Since the 1992 UN Framework Convention on Climate Change, which listed the world’s advanced economies in an appendix or “annex,” the distinction between “Annex I” and “Non-Annex I” countries has been a central point of contention. Five years later, the Kyoto Protocol assigned emissions reductions only to “Annex I” countries. Eliminating the so-called “Kyoto firewall” has been a red line of the U.S. and other advanced economies, which point to the rapid growth in major emerging economies such as China and India, and the concomitant rise in their greenhouse gas emissions.

In 2011, at the UN talks in Durban, South Africa, countries declared that a new agreement, to be finalized in Paris in 2015, would be “applicable to all Parties” – a phrase widely understood to mean that the Annex I/Non-Annex I distinction would be erased. But the first draft of the negotiating text in Warsaw hardly referred to Durban and instead used the different term “broad participation.” That opening salvo didn’t last, and the final text reaffirmed the Durban agreement – but not before significant energy had gone into re-fighting that battle.

The world outside the UN talks

With little to show for their two weeks of long days and all-nighters, negotiators have left themselves a lot to do over the next two years to reach a meaningful outcome in Paris.

However, countries and other actors don’t need to wait for an international agreement in 2015 to start addressing climate change. It was clear, through events on the sidelines of the negotiations and conversations with other attendees at the conference, that cities, states, countries and regions around the world have already started moving to cut their emissions and adapt to climate change.

Some of the most interesting side events highlighted the progress made in China on provincial carbon trading pilots and explored how the Chinese experiments could learn from California’s experience in building a successful carbon market. And the Climate and Clean Air Coalition – a group of more than 70 state and nonstate partners working together to reduce short-lived super-pollutants like methane, black carbon, and HFCs – also announced important progress. Those side events were a reminder that the UN talks, while they remain important, are not the only game in town.

That’s a good thing, and a reason for optimism. Because with the damaging impacts of climate change already apparent in the United States and around the world, the world urgently needs near-term action to turn the corner on global emissions and put us on a downward trajectory toward climate safety.

Also read EDF’s press release on the outcome of the Warsaw negotiations: Strong agreement to protect forests highlight of UN climate talks.


Correcting the maths of the “50 to 1 Project”

A nine-minute video, released earlier this fall, argues that climate mitigation is 50 times more expensive than adaptation. The claims are based on calculations done by Christopher Monckton. We analyzed the accompanying “sources and maths” document. In short, the author shows a disconcerting lack of understanding of climate science and economics:

  1. Fundamental misunderstanding of basic climate science: Pre-industrial levels of carbon dioxide (CO2) were at around 280 parts per million (ppm).[i] One of the most commonly stated climate policy goals is to keep concentrations below 450 ppm CO2. Monckton, oddly, adds 280 and 450 to get 730 ppm as the goal of global stabilization efforts, making all the rest of his calculations wildly inaccurate (see the short sketch after this list).
  2. Prematurely cutting off analysis after ten years: Monckton calculates the benefits of the carbon tax over a ten-year time horizon. That is much too short to see the full effects of global warming or of the policy itself. Elevated carbon levels persist for hundreds to thousands of years.[ii]
  3. Erroneously applying Australian “cost-effectiveness” calculation to the world: This may be the most troubling aspect from an economist’s point of view. Monckton first calculates the effect of the Australia-only tax on global temperatures, which is unsurprisingly small, as Australia accounts for only 1.2% of world emissions. Next, he calculates the tax’s resulting “cost-effectiveness” — the cost of the Australia-only tax relative to its effect on global temperatures. No surprise once again: that influence is real, but Australia alone can’t solve global warming for the rest of us. Then, Monckton takes the Australia-only number and scales it up to mitigating 1ºC globally, resulting in a purported cost of “$3.2 quadrillion,” which he claims is the overall global “mitigation cost-effectiveness.” But this number simply represents the cost of avoiding 1ºC of warming by acting in Australia alone. Monckton has re-discovered the fact that global warming is a global problem! The correct calculation for a globally applied tax would be to calculate cost-effectiveness at the global level first. If Australia’s carbon price were applied globally, it would cut much more pollution at a much lower cost. And that, of course, is very much the hope. Australia, California, and the European Union are called “climate leaders” for a reason. Others must follow.
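
To make the first error concrete, here is a minimal arithmetic sketch in Python. The 280 ppm pre-industrial level and the 450 ppm stabilization goal come straight from point 1 above; the script only illustrates why treating 450 ppm as an increment rather than a total concentration inflates the supposed target.

```python
# Minimal sketch of the arithmetic behind point 1 above.
# Both concentrations are taken from the post; nothing else is assumed.

PRE_INDUSTRIAL_PPM = 280  # pre-industrial CO2 concentration
TARGET_PPM = 450          # commonly stated stabilization goal (a total concentration)

# The 450 ppm goal is an absolute concentration, so the allowed increase
# above pre-industrial levels is the difference, not the sum:
allowed_increase = TARGET_PPM - PRE_INDUSTRIAL_PPM
print(f"Allowed increase above pre-industrial: {allowed_increase} ppm")  # 170 ppm

# Monckton instead adds the two numbers, treating 450 ppm as an increment,
# which inflates the supposed stabilization goal and skews every downstream figure:
monckton_goal = PRE_INDUSTRIAL_PPM + TARGET_PPM
print(f"Monckton's implied goal: {monckton_goal} ppm")  # 730 ppm
```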

What’s the real cost of cutting carbon? The U.S. government’s estimate of the cost of one ton of CO2 pollution released today is about $40.[iii] That’s also the optimal price to make sure that each of us is paying for our own climate damages. Any policy with a lower (implied) carbon price—including the Australian tax—easily passes a benefit-cost test.
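
The benefit-cost logic in the paragraph above reduces to a one-line comparison. The following is a minimal sketch: the roughly $40-per-ton social cost of carbon comes from the post, while the implied policy price is a hypothetical placeholder rather than the actual Australian tax rate.

```python
# Minimal sketch of the benefit-cost test described above.
# SOCIAL_COST_PER_TON (~$40) comes from the post; the policy price below is a
# hypothetical placeholder chosen only to illustrate the comparison.

SOCIAL_COST_PER_TON = 40.0   # estimated climate damages per ton of CO2 emitted ($)
implied_policy_price = 25.0  # hypothetical implied carbon price of a policy ($/ton)

# Each ton abated avoids roughly $40 in climate damages. A policy whose implied
# carbon price is below that figure delivers benefits greater than its costs.
if implied_policy_price < SOCIAL_COST_PER_TON:
    print("Passes a benefit-cost test: each ton abated costs less than the damages it avoids.")
else:
    print("Implied carbon price exceeds the estimated damages per ton.")
```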

With all due respect, Lord Monckton, 3rd Viscount Monckton of Brenchley, your maths are way off.


[i] “Summary for Policymakers,” IPCC Fifth Assessment Report, Working Group I (2013).

[ii] Results differ across scenarios, but a rough rule of thumb suggests that approximately 70% of the ‘peak enhancement level’ over the pre-industrial level of 280 ppm persists after 100 years of zero emissions, while approximately 40% persists after 1,000 years of zero emissions (Solomon, Susan, Gian-Kasper Plattner, Reto Knutti and Pierre Friedlingstein, “Irreversible climate change due to carbon dioxide emissions,” Proceedings of the National Academy of Sciences 106, no. 6 (2009): 1704-1709). Note that this refers to the net increase in carbon dioxide in the atmosphere, not the exact molecule. Archer, David, Michael Eby, Victor Brovkin, Andy Ridgwell, Long Cao, Uwe Mikolajewicz, Ken Caldeira et al., “Atmospheric lifetime of fossil fuel carbon dioxide,” Annual Review of Earth and Planetary Sciences 37 (2009): 117-134, discusses these two often-confused definitions of carbon’s ‘lifetime’ and concludes that 20-40% of excess carbon levels remain hundreds to thousands of years (“2-20 centuries”) after the carbon is emitted. Each carbon dioxide molecule has a lifetime of anywhere between 50 and 200 years, according to the U.S. Environmental Protection Agency’s “Overview of Greenhouse Gases: Carbon Dioxide Emissions.” The precise number is under considerable scientific dispute and surprisingly poorly understood. (Inman, Mason, “Carbon is forever,” Nature Reports Climate Change, 20 November 2008)

[iii] The precise value presented in Table 1 of the Technical Update of the Social Cost of Carbon for Regulatory Impact Analysis Under Executive Order 12866 for a ton of carbon dioxide emitted in 2015, using a 3% social discount rate, is $38. For 2020, the number is $43; for 2030, it increases to $52. All values are in inflation-adjusted 2007 dollars. For a further exploration of this topic, see Nordhaus, William D., The Climate Casino: Risk, Uncertainty, and Economics for a Warming World, Yale University Press (2013), as only one of the latest examples summarizing this kind of analysis. Nordhaus concludes that the optimal policy, one that maximizes net benefits to the planet, would spend about 3% of global GDP.

Many thanks to Michelle Ho for excellent research assistance.


IPCC mention of geoengineering, though brief, opens window for discussion

The IPCC’s latest report includes a brief mention of geoengineering — a range of techniques for reducing global warming through intervention in the planet’s climate system. (Photo credit: NASA)

(Originally posted yesterday on EDF’s Climate Talks blog)

Just a few weeks ago, the United Nations Intergovernmental Panel on Climate Change (IPCC) released the first piece of its crucial fifth assessment report on global warming – and it confirms that our climate is changing. Key messages from the report include:

  • Warming of the climate is unequivocal
  • Human influence on the climate system is clear, and the evidence for human influence has only increased since the last IPCC report
  • Further changes in temperature, precipitation, weather extremes, and sea level are imminent

In short, humans are causing dramatic climate change—and we’re already witnessing the effects. Oceans are warming and acidifying. Weather patterns are more extreme and destructive. Land-based ice is declining—and leading to rising sea levels.

None of this should be surprising to those following the science of climate change. What has generated surprise amongst some, however, is the IPCC’s brief mention of the science of geoengineering, tucked into the last paragraph of the IPCC’s 36-page “Summary for Policymakers.”

Understanding the science of geoengineering

As communities and policymakers around the world face the risks presented by a rapidly changing climate, interest in the topic of “geoengineering” is growing.

Geoengineering refers to a range of techniques for reducing global warming through intervention in the planet’s climate system, by removing carbon dioxide from the atmosphere (carbon dioxide removal, or CDR) or by reflecting away a small percentage of inbound sunlight (solar radiation management, or SRM).

Some of these ideas have been proposed by scientists concerned about the lack of political progress in curbing the continued growth in global carbon emissions, and who are looking for other possibilities for addressing climate change if we can’t get emissions under control soon.

With the risks and impacts of rising temperatures already being felt, the fact that SRM would likely be cheap to deploy and fast-acting means that it has attracted particular attention as one possible short-term response to climate change.

The world’s governments tasked the IPCC with investigating these emerging technologies in its new report, and the IPCC summary rightly sounds a cautionary note on their potential utility, warning:

Limited evidence precludes a comprehensive quantitative assessment of both Solar Radiation Management (SRM) and Carbon Dioxide Removal (CDR) and their impact on the climate system…

Modelling indicates that SRM methods, if realizable, have the potential to substantially offset a global temperature rise, but they would also modify the global water cycle, and would not reduce ocean acidification. If SRM were terminated for any reason, there is high confidence that global surface temperatures would rise very rapidly to values consistent with the greenhouse gas forcing. CDR and SRM methods carry side effects and long-term consequences on a global scale.

So what does this mean? Three things are clear from the IPCC’s brief analysis:

  1. CDR and SRM might have benefits for the climate system, but they also carry risks, and at this stage it is unknown what the balance of benefits and risks may be.
  2. The overall effects of SRM for regional and global weather patterns are likely to be uncertain, unpredictable, and broadly distributed across countries. As with climate change itself, there would most likely be winners and losers if SRM technologies were to be used.
  3. Finally, and perhaps most importantly, SRM does not provide an alternative to reducing greenhouse gas emissions, since it does not address the rising emissions that are the root cause of ocean acidification and other non-temperature related climate change impacts.

This last point is particularly important. The most that could be expected from SRM would be to serve as a short-term tool to manage some temperature-related climate risks, if efforts to reduce global greenhouse gas emissions prove too slow to prevent severe disruption of the earth’s climate.

In that case, we need to understand what intervention options exist and the implications of deploying them. In other words, ignorance is our enemy.

Need for inclusive and adaptive governance of solar radiation management research

While much of the limited research on solar radiation management has taken place in the developed world – a trend likely to continue for the foreseeable future – the ethical, political, and social implications of SRM research are necessarily global. Discussions about governance of research should be as well.

But a transparent and transnationally agreed system of governance of SRM research (including norms, best practices, regulations and laws) does not currently exist. With knowledge of the complex technical, ethical, and political implications of SRM currently limited, an effective research governance framework will be difficult to achieve until we undertake a broad conversation among a diversity of stakeholders.

Recognizing these needs, The Royal Society, Environmental Defense Fund (EDF), and TWAS (The World Academy of Sciences) launched the Solar Radiation Management Governance Initiative (SRMGI) in 2010, an international NGO-driven initiative to explore how SRM research could be governed. SRMGI is neither for nor against SRM. Instead, it aims to foster inclusive, interdisciplinary, and international discussion on SRM research and governance.

SRMGI’s activities are founded on a simple idea: that early and sustained dialogue among diverse stakeholders around the world, informed by the best available science, will increase the chances of SRM research being handled responsibly, equitably, and cooperatively.

Connecting dialogues across borders

A key goal is to include people in developing countries who are vulnerable to climate change and typically marginalized in discussions about emerging science and technology issues, to explore their views on SRM, and to connect them in a transnational conversation about possible research governance regimes.

This month, for example, saw the launch of a report by the African Academy of Sciences and SRMGI describing the results from a series of three SRM research governance workshops held in Africa in 2012 and 2013. Convened in Senegal, South Africa, and Ethiopia, the workshops attracted more than 100 participants – including scientists, policymakers, journalists and academics – from 21 African nations to explore African perspectives on SRM governance.

To build the capacity for an informed global dialogue on geoengineering governance, a critical mass of well-informed individuals in communities throughout the world must be developed, and they must talk to each other, as well as to their own networks. An expanding spiral of distinct, but linked outreach processes could help build the cooperative bridges needed to manage potential international conflicts, and will help ensure that if SRM technologies develop, they do so cooperatively and transparently, not unilaterally.

The way forward

No one can predict how SRM research will develop or whether these strategies for managing the short-term implications of climate risk will be helpful or harmful, but early cooperation and transnational, interdisciplinary dialogue on geoengineering research governance will help the global community make informed decisions.

With SRM research in its infancy, but interest in the topic growing, the IPCC report reminds us that now is the time to establish the norms and governance mechanisms that ensure that where research does proceed, it is safe, ethical, and subject to appropriate public oversight and independent evaluation.

It’s worth remembering that the IPCC devoted only one paragraph of its 36-page summary report to geoengineering. So while discussion about geoengineering technologies and governance is necessary, the key message from the IPCC must not be lost: it’s time to recognize that the billions of tons of carbon pollution we put in our atmosphere every year are causing dangerous changes to our climate, and work together to find the best ways to reduce that pollution.


Setting the Record Straight — What this Week’s Supreme Court Order Really Means

This week the Supreme Court denied numerous legal attacks seeking further judicial review of the Environmental Protection Agency’s (EPA) determination that greenhouse gas emissions are dangerous to human health and welfare, and of other key aspects of EPA’s first generation of climate policies.

The Court agreed to hear arguments on one narrow issue, relevant to one specific Clean Air Act permitting program.

This marked the end of the road for years of sustained industry attacks on the scientific and legal foundation for addressing climate pollution under the Clean Air Act. This was a tremendous victory for science and the rule of law.

But some media reporting suggested just the opposite.

This was the lead of USA Today’s story:

Dealing a potential blow to the Obama administration and environmentalists, the Supreme Court agreed Tuesday to consider limiting the Environmental Protection Agency’s power to regulate greenhouse gases.

(We don’t mean to single out USA Today, which has a well-deserved reputation for excellent environmental reporting. Other media coverage was also confusing. We have more examples at the end of this post.)

Given all that, it seems like it might be helpful to look at the facts of what the Court did and did not do:

Fact One

Industry lawyers threw every attack they could think of at EPA’s science-based finding that greenhouse gas emissions endanger the public health and welfare of current and future generations due to intensifying smog levels, floods, drought, wildfires, and other dangerous climate impacts. The Supreme Court rejected every single industry challenge to the Endangerment Finding.

What this means

This is the end of the road for more than four years of industry regulatory, procedural, and legal attacks on the Endangerment Finding. The End.

But it means more than that. The reason why fossil fuel interests have been so desperate to discredit the Endangerment Finding is because it is the cornerstone for controlling climate pollution under the Clean Air Act — not just for the Clean Car Standards, but also for the forthcoming Carbon Pollution Standards for new and existing power plants and other major sources.

EPA’s Endangerment Finding reflects a vast body of peer-reviewed scientific research by thousands of scientists. Attempts to attack it through litigation have failed. This is a tremendous moment, and an unmistakable sign of the strength of the legal foundation for controlling climate pollution from cars and trucks, power plants, and other major sources under the Clean Air Act.

Fact Two  

The Supreme Court denied every legal challenge seeking review of the Clean Car Standards.

What this means

The landmark Clean Car Standards were strongly supported by U.S. automakers and the United Auto Workers. The Association of Global Automakers and the Alliance of Automobile Manufacturers helped to defend them in court.

These standards, combined with the second generation Clean Car Standards, mean the U.S. will achieve a fleet-wide average of 54.5 mpg by 2025, cut greenhouse gas pollution by six billion tons, avoid 12 billion barrels of oil imports, and save consumers $1.7 trillion at the gas pump — an average of $8,000 per vehicle by 2025.

Fact Three

The Supreme Court did grant review of a narrow question relevant to one specific (and important) Clean Air Act permitting program: did the regulation of greenhouse gases under the clean car program also make greenhouse gases subject to regulation under the program requiring pre-construction review permits for major stationary pollution sources?

What this means

We believe that the Clean Air Act is clear — on its face — that this permitting program applies to all pollutants, as EPA has implemented it.  We will vigorously defend this interpretation in front of the Supreme Court, and we believe that we will succeed.

Moreover, even some petitioners have recognized — as did U.S. Court of Appeals Judge Kavanaugh in his dissent below — that even if the permit program were limited in the way they assert, the requirement to adopt the best pollution controls for greenhouse gases would still apply to sources that are required to obtain permits due to their emissions of other airborne contaminants regulated under national ambient air quality standards.

What this does NOT mean

The question being reviewed by the Supreme Court is important. But it does not have any effect on the programs going forward to address carbon pollution from the two largest sources in our nation — power plants, under the forthcoming Carbon Pollution Standards, and transportation, under the Clean Car Standards.

Bottom Line

The Obama Administration’s vital plan to protect our communities and families from climate change has NOT been called into question by the Supreme Court’s review of one question related to the permitting program for major stationary sources of emissions.

By rejecting every petition challenging the Endangerment Finding and the Clean Car Standards, the Court has yet again indicated that EPA is fulfilling its statutory duty in addressing greenhouse gas emissions under the Clean Air Act.

Building on this firm foundation, EPA has a responsibility to protect Americans’ health and well-being from the threat of climate change. That includes establishing limits on carbon pollution from power plants — the single largest source of climate destabilizing emissions in our nation.


(As mentioned above, here are other examples of confusing media coverage from Tuesday morning)

The Supreme Court on Tuesday said it would consider challenges to the Environmental Protection Agency’s permitting requirements for power plants and other facilities that emit large amounts of greenhouse gases, throwing the Obama administration’s regulations into a state of uncertainty. (emphasis is ours)

  • Wall Street Journal (available by subscription only)

The hearings, set for next year, could allow the Court to scale back the Obama Administration’s climate regulations at a time when the chance of passing legislation to limit carbon emissions—long the preferred route of the White House and most environmental groups—seems virtually nil. (emphasis is ours)

At issue is whether the federal Environmental Protection Agency can tighten emission standards for stationary greenhouse gas sources, such as power plants, in what the government says is an effort to stem the effects of global warming. (emphasis is ours)
