Climate 411

New Climate-Economic Thinking

By Gernot Wagner and Martin Weitzman

Each ton of carbon dioxide emitted into the atmosphere today causes about $40 worth of damages. Or so says standard economic thinking.

A lot goes into calculating that number. You might call it the mother of all benefit-cost analyses. It’s bean-counting on a global scale, extending out decades and centuries. And it’s a process that requires assumptions every step along the way.

The resulting $40 figure should be taken for what it is: the central case presented by the U.S. Government Interagency Working Group on Social Cost of Carbon when using its preferred 3% discount rate for all future climate damages. But it is by no means the full story.

Choose a different discount rate, get a different number. Yale economist Bill Nordhaus uses a discount rate of slightly above 4%. His resulting price is closer to $20 per ton of carbon dioxide. The Stern Review on the Economics of Climate Change uses 1.4%. The resulting price per ton is over $80.
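
To see just how much work the discount rate does, here is a minimal sketch in Python. The flat $1-per-year damage stream is a made-up placeholder, not the Working Group’s damage model; only the three discount rates come from the paragraph above.

```python
# Illustrative only: how the discount rate moves the present value of a
# fixed stream of future climate damages. The flat $1-per-year damage
# path is a placeholder, not the Interagency Working Group's model.

def present_value(damages, rate):
    """Discount a stream of annual damages back to today."""
    return sum(d / (1 + rate) ** t for t, d in enumerate(damages, start=1))

damages = [1.0] * 300  # assume $1 of damages per year for 300 years

for label, rate in [("Stern", 0.014), ("Working Group", 0.03), ("Nordhaus-like", 0.0425)]:
    pv = present_value(damages, rate)
    print(f"{label:>13} ({rate:.2%}): present value = ${pv:.1f}")
```

Even with identical damages, the low rate produces a present value roughly three times the high rate’s, which accounts for most of the spread between the Stern and Nordhaus carbon prices above.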

And the discount rate is not the only assumption that makes this kind of a difference. In Climate Shock, we present the latest thinking on why and how we should worry about the right price for each ton of carbon dioxide, and other greenhouse gases, emitted into the atmosphere. There are so many uncertainties at every step—from economic projections to emissions, from emissions to concentrations, from concentrations to temperatures, and back to economics in the form of climate damages—that pointing to one single, final number is false precision, misleading, or worse.
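
A hedged sketch of that chain: inject uncertainty at each step and watch the damage estimate fan out. Every distribution and the quadratic damage function below are assumptions we made up for illustration, not calibrated pieces of any integrated assessment model.

```python
import math
import random
import statistics

random.seed(42)

def damage_draw():
    """One pass through emissions -> concentrations -> temperatures ->
    damages, with made-up uncertainty injected at each step."""
    emissions = random.gauss(1.0, 0.15)                # economic/emissions uncertainty
    concentration = 280 * (1.8 + 0.7 * emissions)      # ppm; toy carbon-cycle step
    sensitivity = random.lognormvariate(math.log(3.0), 0.4)  # deg C per CO2 doubling
    warming = sensitivity * math.log(concentration / 280, 2)
    return 0.5 * warming ** 2                          # damages, % of GDP; toy function

draws = [damage_draw() for _ in range(100_000)]
print(f"median damages:  {statistics.median(draws):.1f}% of GDP")
print(f"95th percentile: {statistics.quantiles(draws, n=20)[-1]:.1f}% of GDP")
```

The gap between the median and the tail, not the median itself, is what any single point estimate hides.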

Of course, that does not mean that we shouldn’t attempt to make this calculation in the first place. The alternative to calculating the cost of carbon is to use a big fat zero in government benefit-cost calculations. That’s clearly wrong.

Most everything we know about what goes into calculating the $40 figure leads us to believe that $40 is the lower bound for sensible policy action. Most everything we know that is left out would push the number higher still, perhaps much higher.

As just one example, zero in on the link between carbon concentrations in the atmosphere and eventual temperature outcomes. We know that increasing concentrations will not decrease global temperatures. Thank you, high school chemistry and physics. The lower bound for the temperature impact when carbon concentrations in the atmosphere double can be cut off at zero. In fact, we are pretty sure it can be cut off at 1°C or above. Global average temperatures have already warmed by over 0.8°C, and we haven’t even doubled carbon concentrations from preindustrial levels. Moreover, the temperature increases in this calculation should happen ‘eventually’—over decades and centuries. Not now.

What’s even more worrying is the upper tail of that temperature distribution. There’s no similarly definitive cut-off for the worst-case scenario. In fact, our own calculations (based on an International Energy Agency (IEA) scenario that greenhouse gas concentrations will end up around 700 parts per million) suggest a greater-than-10% chance of eventual global average warming of 6°C or above.
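
The flavor of that calculation can be reproduced in a few lines, assuming eventual warming scales with the logarithm of concentrations and that climate sensitivity follows a lognormal distribution. The parameters below are our illustrative stand-ins, not the book’s exact calibration.

```python
import math

def p_warming_exceeds(threshold_c, ppm, mu=math.log(3.0), sigma=0.35):
    """P(eventual warming > threshold) when warming = S * log2(ppm / 280)
    and climate sensitivity S is lognormal(mu, sigma). The lognormal
    parameters are illustrative stand-ins, not the book's calibration."""
    doublings = math.log(ppm / 280.0, 2)       # CO2 doublings vs. preindustrial
    s_needed = threshold_c / doublings         # sensitivity needed to hit threshold
    z = (math.log(s_needed) - mu) / sigma
    return 0.5 * math.erfc(z / math.sqrt(2))   # standard normal survival function

print(f"P(warming > 6 degrees C at 700 ppm) ~ {p_warming_exceeds(6.0, 700):.0%}")
```

With these stand-in parameters the answer lands just above the greater-than-10% figure cited above; the deeper point is how quickly that tail probability grows as concentrations climb.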

[Table: median eventual warming (median Δ°C) and the chance of eventual warming exceeding 6°C, at various greenhouse gas concentrations]

Focus on the bottom row in this table. If you do, you are already ahead of others, most of whom focus on averages, here depicted as “median Δ°C” (eventual changes in global average surface temperatures). The median is what we would expect to exceed half the time, given particular greenhouse gas concentrations in the atmosphere. And it’s bad enough.

But what really puts the “shock” into Climate Shock is the rapid increase in probabilities of eventual temperatures exceeding 6°C, the bottom row. While average temperatures go up steadily with rising concentrations, the chance of true extremes rises rapidly.

That 6°C is an Earth-as-we-know-it-altering temperature increase. Think of it as a planetary fever. Normal body temperatures hover around 37°C. Anything above 38°C and you have a fever. Anything above 40°C is life-threatening.

Global average warming of 3°C wouldn’t be unprecedented for the planet as a whole, in all of its geological history. For human society, it would be. And that’s where we are heading at the moment—on average, already assuming some ‘new policies’ to come into play that aren’t currently on the books.

It’s the high-probability averages rather than low-probability extremes that drive the original $40 figure. Our table links greenhouse gas concentrations to worryingly high probability estimates for temperatures eventually exceeding 6°C, an outcome that clearly would be catastrophic for human society as we know it.

Instead of focusing on averages, then, climate change ought to be treated as a risk management problem. Some greenhouse gas concentration thresholds should simply not be crossed. The risks are too high.

This kind of focus on temperature extremes is far from accepted wisdom. We argue it ought to be.

Gernot Wagner and Martin L. Weitzman are co-authors of Climate Shock (Princeton University Press, 2015). This post was originally published by The Institute for New Economic Thinking.


Experts Agree: We Can Preserve Electric Reliability While Protecting Public Health Under the Clean Power Plan

Last June, the Environmental Protection Agency (EPA) proposed the first-ever national carbon pollution standards for existing power plants. Fossil fuel-fired power plants account for almost 40% of U.S. carbon dioxide emissions, making them the largest source of greenhouse gas emissions in the nation and one of the single largest categories of greenhouse gas sources in the world.

Under the Clean Power Plan, these emissions will decline to 30% below 2005 levels by 2030 – accompanied by a significant decline in other harmful pollutants from the power sector, such as sulfur dioxide and oxides of nitrogen. The power sector is already halfway to this target, with emissions 15% below 2005 levels.
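
A quick arithmetic check on what “halfway” means, with 2005 power-sector emissions normalized to 1.0 (the percentages come from this post; the normalization is just for illustration):

```python
# "Already halfway" arithmetic, normalizing 2005 power-sector CO2 to 1.0.
base_2005 = 1.0
current = base_2005 * (1 - 0.15)       # emissions already 15% below 2005
target_2030 = base_2005 * (1 - 0.30)   # Clean Power Plan goal for 2030

remaining_vs_2005 = (current - target_2030) / base_2005
remaining_vs_today = (current - target_2030) / current
print(f"further cut needed: {remaining_vs_2005:.0%} of the 2005 baseline")
print(f"...which is {remaining_vs_today:.1%} measured from today's level")
```

The remaining 15 percentage points are measured against the 2005 baseline; from today’s lower starting point, that works out to roughly a 17.6% further cut.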

The EPA has carefully designed the Clean Power Plan to provide extensive flexibility so that states and power companies can continue to deliver a steady flow of electricity while deploying cost-effective measures to reduce carbon pollution over the next fifteen years.

The Clean Power Plan:

  • Allows states and power companies to determine the optimal timing of emission reductions over a ten-year averaging period starting in 2020 (see the sketch after this list);
  • Allows states to decide how to most cost-effectively reduce carbon pollution, including through market-based programs and clean energy policies that have been successfully used around the country; and
  • Allows states to cooperate with one another in complying with the long-term reduction goals.
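
To make the first bullet concrete, here is a minimal sketch of how a ten-year averaging period works. The emission rates and the interim goal are hypothetical, not drawn from any actual state plan.

```python
# Hypothetical state emission rates (lbs CO2/MWh) for 2020-2029. Early
# years run above the interim goal, later years below; compliance turns
# on the ten-year average, not on any single year.
rates = [1400, 1380, 1350, 1320, 1280, 1230, 1180, 1130, 1080, 1030]
interim_goal = 1250  # hypothetical average goal for 2020-2029

average = sum(rates) / len(rates)
verdict = "meets" if average <= interim_goal else "misses"
print(f"ten-year average: {average:.0f} lbs CO2/MWh ({verdict} the {interim_goal} goal)")
```

Early years can run above the goal so long as later years bring the ten-year average down, which is exactly the timing flexibility the first bullet describes.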

In addition, the Clean Power Plan preserves the ability of grid operators to deploy long-standing tools and processes that have been successfully used in the past to keep the electric grid functioning reliably during periods of significant change. EDF has released a white paper identifying these well-established tools and practices, and describing how they will continue to ensure a reliable grid under the Clean Power Plan.

Grid operators are well-equipped to ensure reliability as we transition to a cleaner and more efficient power sector, just as they have under all previous Clean Air Act regulations. EPA’s proposed Clean Power Plan is eminently achievable, reliable, and cost-effective – and integral to our climate security, human health and prosperity.

Ample tools and practices exist to ensure a clean and reliable grid

Grid operators have long-standing tools and practices available to ensure that our nation’s grid continues to provide power reliably. These include well-established planning principles that have motivated large amounts of new generation year in, year out. Since 2000, roughly 30 gigawatts of new generation have been added per year, largely consisting of low- or zero-emitting resources such as wind turbines and natural gas combined cycle power plants. Over the next two years, the solar industry alone expects to add another 20 gigawatts of capacity. In addition, reliability is ensured through tools and practices including:

  • Transmission Upgrades: Because upgraded transmission infrastructure can help move generation more easily, transmission upgrades can enhance reliability without needing to add new generation.
  • Long-Term Forecasting: Grid planners and reliability regulators forecast the needs of the electric grid years in advance. By determining how much transmission and generation will be needed, they can identify and resolve any long-term reliability issue quickly and effectively.
  • Reliability Must-Run (“RMR”) Contracts: Short-term contracts that, in the case of sudden and unexpected retirements or plant losses, require a unit to be kept operational until reliability can be ensured through longer-term tools.
  • Operating Procedures: Manuals and standard practices exist to ensure that, in the case of particular reliability scenarios, grid operators know the best way to respond.

These tools are already in use throughout the country, and have proven extremely effective in maintaining reliability over the last few decades – even as the power sector has begun a rapid transition towards cleaner sources of electricity, and has implemented important public health protections under the Clean Air Act. In the Mid-Atlantic region, for example, roughly 12,500 MW of coal-fired power plant capacity retired from 2010 to 2014 for economic reasons. Employing these well-established tools and practices, the region added a large quantity of new resources without any loss of reliability.

Clean energy resources and reliability

In complying with the Clean Power Plan, states and power companies will be able to draw on reliable, low-cost clean energy resources like demand response, renewable energy, and energy efficiency. Energy efficiency costs roughly a third as much as the next cheapest alternative and is primed for enormous growth. Resources like demand response help prevent blackouts, as they did during the 2013 polar vortex. And renewable energy continues to grow, with states such as Maine, California, and Iowa already using it to meet close to one quarter of their entire demand.

No reliability crisis has resulted from implementing clean air standards

Claims that we can’t have clean air and a reliable power grid are as old as the Clean Air Act itself — and have never proven accurate. As far back as the 1970s, a power company issued an ad claiming the lights would go out as a result of the Clean Air Act. In recent years, some power companies that oppose public health protections under the Clean Air Act have made similar claims that the Mercury and Air Toxics Standards and Cross-State Air Pollution Rule will harm electric reliability.

These assertions have consistently been discredited: in the 45-year history of the Clean Air Act, no emission standard has ever caused the lights to go out. This is a testament both to the rigorous process and analyses EPA relies on to develop Clean Air Act standards, as well as the effective tools that grid operators and other authorities use to manage reliability on a short-term and long-term basis.

Numerous states, power companies, and reliability experts have indicated that the Clean Power Plan is achievable

A diverse collection of energy experts and power company officials have recently commented on the feasibility of achieving the Clean Power Plan’s emission reduction goals, describing their experience reducing carbon emissions cost-effectively and explaining how reliability can be maintained while making progress on those reductions.

Written Testimony of Kathleen Barrón, Senior Vice President, Exelon Corporation, Before the Federal Energy Regulatory Commission: Technical Conference on EPA’s Clean Power Plan (Feb. 19, 2015):

“Exelon strongly supports EPA’s goal of reducing carbon emissions from the electric power sector. As EPA notes in the Clean Power Plan, the current level of carbon emissions is environmentally unsustainable, and action must be taken now in order to prevent significant, irreversible environmental damage and major economic loss. By providing regulatory certainty, well-designed carbon reduction rules will be a driving force to modernize our aging electric system so that our customers will continue to have a safe and reliable electric system to support our Nation’s economic growth.”

Written Testimony of Susan F. Tierney, Ph.D., Analysis Group, Before the House Comm. on Energy and Commerce: Hearing to Examine EPA’s Proposed 111(d) Rule for Existing Power Plants (Apr. 14, 2015):

“The Clean Power Plan provides states a wide range of compliance options and operational discretion that can prevent reliability issues while also reducing carbon pollution and compliance costs. Experience has shown that such approaches allow for seamless, reliable implementation of emissions-reduction targets. By contrast, many stakeholders’ concerns about the Clean Power Plan presume inflexible implementation, are based on worst-case scenarios, and assume that policy makers, regulators, and market participants will stand on the sidelines until it is too late to act. There is no historical basis for these assumptions.”

Joshua Epel, Chairman, Colorado Public Utilities Commission, Before the Federal Energy Regulatory Commission: Western Regional Technical Conference on EPA’s Clean Power Plan (Feb. 25, 2015):

“In Colorado we have charted our own course to decarbonize our electric system. . . . Now when the Clean Power Plan is finalized I believe that Colorado as a state will come up with an approach which will meet the revised goals . . . . I’m very pleased with some of the steps we have taken with just approved unprecedented amounts of utility scale solar . . . . We are doing a lot with wind, we are doing a lot with innovat[ive] approaches actually passed by the legislature. . . . So we think there’s a lot of innovative tools for Colorado to use.”

Flexibility in the Clean Power Plan

EPA’s Clean Power Plan wholly preserves the ability of grid operators, power companies, and other institutions to deploy the well-established tools and practices that ensure the reliable operation of the power grid.

The Plan provides state-wide goals for emission reductions, while affording states ample flexibility in how those goals are met. States are not limited to any particular pathway, and can deploy a variety of existing and new policies to meet the state-wide greenhouse gas reduction goals, including flexible market-based tools. This built-in flexibility leaves grid operators free to use long-standing, tested measures to ensure reliability.

Although the Clean Power Plan represents an important step forward for our country, it builds on a nation-wide trend toward a cleaner and more efficient power sector that is already under way. As noted above, carbon emissions from the power sector are already 15% lower than in 2005 – reflecting a sharp decline in coal-fired power generation, as well as a significant increase in natural gas generation and renewables and rising investment in energy efficiency.

Since 2005, many fossil fuel-fired power plants have also installed modern pollution controls in response to state and federal clean air standards adopted to protect public health from harmful particulates, ozone-forming pollution, and toxic air pollutants such as mercury and arsenic.

The robust system of reliability safeguards described above has responded deftly to these developments, ensuring a consistent and reliable supply of affordable power while helping reduce harmful air pollution. There is every reason to believe that the Clean Power Plan, with its extended implementation timeframe and numerous compliance flexibilities, will similarly achieve important reductions in air pollution without compromising electric reliability.

For more information please read our white paper: Protective Carbon Pollution Standards and Electric Reliability


Let There Be No Doubt: We Can Cut Truck Emissions & Fuel Use Today

(This post originally appeared on our EDF+Business blog)

The can-do spirit of American automotive engineers has been on full display over the past few weeks, as truck manufacturers unveil innovation after innovation to boost the efficiency of heavy trucks that move companies’ freight cross-country.

It is crystal clear that we possess, today, the know-how to dramatically cut fossil fuel consumption and greenhouse gas emissions from heavy trucks. Moreover, we can do this while saving consumers hundreds of dollars annually and giving trucking companies the high-quality, affordable equipment they require.

Some of the recently announced advances include:

All of these fuel-saving solutions are available today thanks to the acumen of engineers at these leading manufacturers. The first round of well-designed federal fuel efficiency and greenhouse gas standards is also driving innovations like these to the market.

Even so, the strides we are making today should only be the beginning.

Daimler’s Super Truck Doubles Efficiency

The team at Daimler Trucks North America provided the best example yet of our future potential with its entry in the Department of Energy Super Truck program. DTNA announced its team has “achieved 115 percent freight efficiency improvement, surpassing the Department of Energy program’s goal of 50 percent improvement.” Its truck recently registered 12.2 mpg – a leap above the 6 mpg typical of pre-2014 trucks.

Improvements were made across the platform, including electrified auxiliaries, controlled power steering and air systems, active aerodynamics, a long-haul hybrid system, and trailer solar panels. Engine efficiency advancements were particularly noteworthy, given the permanence of such solutions. The Detroit Diesel engine reported 50.2 percent brake thermal efficiency, combined with further improvements from engine downspeeding and the use of a waste-heat recovery system.
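
To put the jump from 6 to 12.2 mpg in concrete terms, here is a rough sketch. The annual mileage and the diesel CO2 factor are our assumptions, not Daimler’s figures.

```python
# Rough annual savings from 6 mpg -> 12.2 mpg for one long-haul tractor.
# The duty cycle and emissions factor are assumptions for illustration.
ANNUAL_MILES = 120_000   # assumed long-haul mileage per year
CO2_PER_GALLON = 10.2    # kg CO2 per gallon of diesel (approximate)

gallons_old = ANNUAL_MILES / 6.0
gallons_new = ANNUAL_MILES / 12.2
saved = gallons_old - gallons_new

print(f"fuel saved per truck: {saved:,.0f} gallons/year")
print(f"CO2 avoided: {saved * CO2_PER_GALLON / 1000:,.0f} metric tons/year")
```

On those assumptions, a single truck saves on the order of 10,000 gallons of diesel and roughly 100 metric tons of CO2 per year.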

Daimler’s fantastic results demonstrate that, when given a goal anchored in science, economics, and innovation, our engineers can deliver. Daimler should now lead the way in driving these solutions to national and global scale.

Setting the Bar Higher on Fuel Efficiency and Emissions

The time has come to give our engineers a new goal.

EDF is calling on the Environmental Protection Agency and Department of Transportation to set new fuel efficiency and greenhouse gas standards for heavy trucks that cut fuel consumption by 40 percent by 2025, compared to 2010. This equates to an average of 10.7 mpg for new tractor-trailer trucks.
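
The 10.7 mpg figure follows from the 40 percent target by simple inversion, since fuel consumed per mile is the reciprocal of miles per gallon. The roughly 6.4 mpg 2010 baseline below is our inferred assumption, chosen to be consistent with the numbers in this paragraph.

```python
# A 40% cut in fuel consumed per mile raises mpg by a factor of 1 / (1 - 0.40).
# The ~6.4 mpg 2010 baseline is an assumption consistent with the 10.7 mpg
# average cited above.
baseline_mpg = 6.4
cut = 0.40
new_mpg = baseline_mpg / (1 - cut)
print(f"implied 2025 average: {new_mpg:.1f} mpg")  # -> 10.7 mpg
```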

President Obama has called for new standards. They are expected to be announced in late spring, and were sent to the White House Office of Management and Budget for review this past week.

The first-generation standards have created a strong, industry-supported foundation on which the coming standards can be built. These standards push improvements in all aspects of trucks through complementary engine and vehicle standards. In fact, Daimler – a leading manufacturer of heavy trucks with the engineering prowess to set the high bar of 12.2 mpg for the Super Truck program – has recognized these standards as “very good examples of regulations that work well.”

We Have The Technology

Let there be no doubt that if we set a bold goal for 2025 we will meet it:

Setting a bold goal will help us take these technologies from the test track to the highway over the next decade, helping companies reduce both their costs and carbon risks, while delivering benefits for communities’ air quality and the climate.


Carbon Pollution Standards that Begin by 2020: Vital for Climate Security, Human Health

The U.S. Environmental Protection Agency (EPA) is hard at work right now on the Clean Power Plan – the first-ever national carbon pollution standards for power plants.

Among the many important aspects of this historic plan, we believe this: it is critical that the final carbon pollution standards for the power sector include protective, well-designed requirements beginning in 2020.

Power plants account for almost 40 percent of U.S. carbon dioxide emissions, making them the largest source of greenhouse gas emissions in the nation and one of the largest sources of greenhouse gases in the world.

The Clean Power Plan will be finalized this summer. When fully implemented, it is expected to reduce greenhouse gas emissions from the power sector to 30 percent below 2005 levels. That makes these eminently achievable and cost-effective standards integral to climate security, human health and prosperity.

The Clean Power Plan will phase in over a 15-year period, with interim standards commencing in 2020, and final standards taking effect in 2030 – and there is strong reason to believe that the interim standards covering the period 2020 to 2029 should be strengthened in the final rule.

Interim standards can help the U.S. secure near-term low-cost opportunities to reduce greenhouse gas emissions, while generating the market signals necessary to achieve the deeper reductions required in the years ahead. They also can deliver important public health benefits for our families by providing healthier and longer lives for millions of Americans. And EPA has designed the interim standards in a manner that provides considerable flexibility to states and power companies to comply while deploying their own unique solutions.

Carbon Pollution Limits that Begin by 2020 are Essential for Driving Near-term Actions to Reduce Dangerous Emissions and to Advance Climate Security

As proposed, the Clean Power Plan’s interim standards could deliver cumulative emissions reductions of more than 5 billion tons of carbon dioxide. That approaches the total annual carbon dioxide emissions for the entire United States. Protective interim standards that require states and power companies to take near-term action to reduce carbon pollution are essential to secure these climate benefits.

Interim standards are essential for mobilizing the full range of near-term cost-effective opportunities to cut pollution, as they are the only way to ensure that investments in activities that reduce carbon pollution are fully recognized and properly rewarded. This is true whether the investments are new renewable generation, customer-friendly demand-side energy efficiency programs, or other low-carbon solutions.

As the cost of clean energy decreases and the heavy burden of carbon pollution increases, a near-term limit on carbon emissions helps ensure these vital solutions are deployed without delay.

Interim standards can also help drive sustained investments in one especially important area – energy efficiency. Investments in energy efficiency can lead to direct financial benefits for customers – families and businesses alike – in the form of lower electric bills.

Twenty-three states are already implementing mandatory efficiency savings targets. These efforts have been overwhelmingly successful, regularly delivering two dollars of savings to customers for every one dollar invested – and in some cases up to five dollars for every one dollar invested.

Even in those states that have been implementing these programs for a while, there is little reason to believe that they have come anywhere close to exhausting the available potential. For example, analysis by McKinsey & Company found that implementing only those efficiency measures that pay for themselves would reduce the country’s total end-use energy consumption by 23 percent by 2020 relative to a business-as-usual scenario.

Analysis by the National Academy of Sciences found that the building sector could reduce energy consumption by 25 to 30 percent between 2030 and 2035, at a cost of just 2.7 cents per kilowatt hour saved. In addition, they found that cost-effective measures could reduce industrial demand 14 to 22 percent by 2020.

For all these reasons, electricity bills are actually expected to decrease as a result of efficiency investments power companies and states make to comply with the Clean Power Plan.

Near-Term Reductions are Essential to Ensure Healthier, Longer Lives for Millions of Americans

The interim emission standards are expected to drive significant near-term public health benefits across America.

In 2020 the proposed standards are expected to prevent:

  • Up to 4,300 premature deaths
  • Up to 100,000 asthma attacks in children
  • Up to 2,100 heart attacks
  • Up to 1,500 hospital admissions
  • Up to 290,000 missed school and work days

Even greater benefits are anticipated in later years.

That’s because power plants are major sources of emissions for a range of pollutants that contribute to ground-level ozone, better known as smog, and dangerous particulate pollution, better known as soot. Power plants are also a major source of emissions for pollutants that have neurotoxic or carcinogenic (cancer-causing) effects.

Power plants account for about 70 percent of U.S. total sulfur dioxide emissions and 46 percent of mercury emissions, and are important sources of nitrogen oxides. Steps taken to reduce carbon pollution under the Clean Power Plan will have the co-benefit of reducing emissions of these and other harmful air pollutants.

EPA estimates that these human health benefits outweigh the costs of compliance by a factor of seven to one.

Each year they are in effect, these important safeguards provide healthier and longer lives for Americans.

Protective Interim Standards are Flexible and Achievable

The goals of the Clean Power Plan are eminently achievable, as they are based on proven and cost-effective methods for reducing carbon pollution that many states and power companies are already demonstrating.

In addition, the Clean Power Plan provides an extraordinarily flexible structure in which states are able to craft their own path forward for reducing carbon pollution, so long as they meet the 10-year average interim target over the period 2020-2029 and then achieve the final reduction target in 2030. This flexibility provides states with the opportunity to harness their own unique opportunities and solutions in light of their own policy preferences.

When evaluating the feasibility of the standards, it is important to consider how quickly the nation’s grid is already decarbonizing. Emissions of carbon pollution from the power sector fell 15 percent from 2005 to 2014. As proposed, the Clean Power Plan only requires them to fall another 15 percent by 2030. Analysis by EIA suggests that the U.S. could cost-effectively reduce greenhouse gas emissions from the power sector about four times faster.

Here’s more evidence that the grid is decarbonizing at a considerably faster rate than what is required by EPA – in the five-year period from 2007 to 2012, the Northeastern states reduced their carbon dioxide emissions from large power plants by 37 to 42 percent relative to 2005 levels. The reductions were due to a wide range of factors, including the adoption of the Regional Greenhouse Gas Initiative, shifting natural gas prices, and efficiency investments. That demonstrates the dynamic flexibility and adaptability of which the grid is capable.

This is all happening in the context of a continuously evolving and decarbonizing electric system. Since 2000, the U.S. has installed roughly 30 gigawatts of new generation capacity per year, the vast majority of which was natural gas and renewables. According to EIA, more than 20 gigawatts of utility scale renewables, natural gas, and nuclear generation are already scheduled to come online in 2015, almost half of which is wind.

Meanwhile, we continue to build new infrastructure – which can help unlock even greater opportunities.

For example, according to the Department of Energy, during the last several years more than 2,300 circuit miles of new transmission additions were constructed annually. According to FERC, there are almost 10,000 miles of proposed new transmission projects in various stages of development that have a “high probability of completion” by January 2017.

Protective interim standards will align our nation’s major investments in new infrastructure with climate security – providing lasting protections and smart investments.

Interim Standards Can Help Promote Investments that Drive Even Deeper Reductions in the Years Ahead 

The cost of zero-carbon generation is rapidly falling. Wind and solar are cheaper than coal, and even natural gas, in a growing number of markets.

Renewable prices are expected to continue their meteoric decline. The price for photovoltaic modules has fallen 80 percent since 2007, and wind prices have fallen 64 percent since 2009.

As a result, the solar industry is expecting to build another 20 gigawatts of new generation over the next two years alone. That’s roughly equivalent to the generation of 13 mid-sized coal plants. (The average capacity factor for a new utility-scale solar array is around 20 percent, while the average monthly capacity factor for the coal fleet was 61 percent in 2014.)
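
The 13-plant comparison can be checked with the capacity factors given in the parenthetical; the 500 MW size for a “mid-sized” coal plant is our assumption.

```python
# Check the "13 mid-sized coal plants" equivalence using the capacity
# factors cited above. The 500 MW plant size is an assumption.
solar_gw = 20        # expected new solar capacity over two years
solar_cf = 0.20      # capacity factor, new utility-scale solar
coal_cf = 0.61       # average monthly capacity factor, coal fleet (2014)
coal_plant_mw = 500  # assumed size of a mid-sized coal plant

avg_output_gw = solar_gw * solar_cf      # average solar output
coal_equiv_gw = avg_output_gw / coal_cf  # coal capacity with equal output
plants = coal_equiv_gw * 1000 / coal_plant_mw
print(f"equivalent coal plants: {plants:.0f}")  # -> about 13
```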

While EPA’s building blocks assume only modest growth in renewable generation over the next 15 years, recent shifts in price dynamics suggest that the actual market opportunity could be considerable. For this opportunity to materialize, however, power companies and investors need a clear signal about the value of reducing carbon pollution from the power sector.

Providing the clear investment signal beginning in 2020 can shape the broader range of infrastructure investments expected in the coming years, and ensure that they are consistent with the low carbon future we will need if we are to stave off the worst impacts of climate change.

That broader range of infrastructure investments includes the vast miles of electric transmission and natural gas pipelines that are expected to be built in the coming years, as well as investment decisions in today’s generation fleet. More than 30 percent of coal plants are 50 or more years old, and approximately one in four lacks controls for sulfur dioxide or nitrogen oxides.

In total, utilities appear poised to invest up to $2 trillion in new generation, transmission, and distribution infrastructure between 2010 and 2030 in order to modernize aging generating facilities and grid systems. Any delay in establishing carbon pollution standards for the power sector increases the uncertainty and increases the risk that investments could become stranded in the future.

All of this suggests that well-designed interim standards are both achievable and essential. If anything, the standards should be strengthened given the urgency of the climate challenge, the scale of change we have seen in the power sector to date, and the significant public health and economic benefits the standards can provide.

We have an opportunity as a nation to take advantage of the fact that the economics of power generation are rapidly changing. The best way for both companies and states to position themselves for a competitive advantage in the future is to think long-term and to get on the leading edge of these emerging trends. Otherwise, there is a risk of reinvesting in assets that will be left behind by a changing market, leaving shareholders and ratepayers on the hook.

The Clean Power Plan presents a real opportunity. Let’s all work together to strengthen the program, and work to deliver a vibrant low-carbon economy for the United States.


On El Niño, snowballs and real climate science


Just as we thought science was finally taking root, here comes another article claiming that the rise in global temperatures has nearly stopped over the last 15 years. We heard it most recently from the Wall Street Journal.

Never mind that it’s been 30 years since a month was below the 20th century global average surface temperature. Or that climate change is evidenced by clearly visible sea ice and glacial melt. Skeptics support their argument by pointing out, time and time again, how little the Earth has warmed since 1998.

Indeed, the “nearly-stopped warming” may at face value appear to be supported by convincing scientific data. But don’t be fooled: 1998 was an exceptionally warm year thanks to a very intense El Niño, a naturally-occurring phenomenon involving unusually warm water in the Eastern Pacific Ocean.

The change in temperature from 1998 to today, therefore, is not at all a good representation of the long-term trend. It makes the nearly-stopped warming argument no more scientific than a snowball would be in Washington in February.

Selective statistics don’t make a trend

Think of it as if you were to use the holiday season as a benchmark for measuring body weight.

If I looked at the weight change I had between Thanksgiving and December 31, a time of year when I usually enjoy lots of good food, the picture would look very different than if my weight monitoring began the week before Thanksgiving. That’s because a Thanksgiving start date would be a higher-than-normal weight day, an anomaly.

And, yet, this is exactly what proponents of the nearly-stopped-warming theory are doing.

While it’s true that the rate of temperature change has decreased since 2001, they cherry-pick a recent 15-year period, 1998 to 2012, whose starting year is already way above average. Of course, these quasi-scientists aren’t transparent about their strategy, so a non-expert would have to dig into the data to realize they are being tricked.
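
The start-date trick is easy to demonstrate. The series below is a stylized stand-in, a steady 0.02°C-per-year warming trend plus a single El Niño-like spike in 1998; it is not real temperature data, but it shows how much a fitted trend depends on the chosen start year.

```python
# Stylized demonstration of start-year cherry-picking: a steady
# 0.02 C/yr trend plus a one-year El Nino-like spike in 1998.
years = list(range(1990, 2013))
temps = [0.02 * (y - 1990) for y in years]
temps[years.index(1998)] += 0.2  # the anomalous spike

def ols_slope(xs, ys):
    """Ordinary least-squares slope of ys regressed on xs."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    den = sum((x - mx) ** 2 for x in xs)
    return num / den

for start in (1990, 1998):
    i = years.index(start)
    print(f"trend starting {start}: {ols_slope(years[i:], temps[i:]):.4f} C/yr")
```

One warm start year erases a quarter of the apparent trend even though the underlying warming rate never changed; a few cool years near the end of the window would shrink it further.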

El Niño always a wild card

El Niño, meanwhile, was just doing what niños tend to do: It threw us for a loop.

The one that persisted for 10 consecutive months in 1997-98 was the most intense ever recorded, making 1998 the hottest year up until that point. (Three years have since broken that record: 2005, 2010 and 2014.)

Scientists have a number of technical and statistical methods for delineating natural from human influences on the temperature record, and apply these tools depending on the research questions they’re trying to answer.

But the overall global record is not adjusted, so if you don’t know which years were affected by natural events such as volcanic eruptions, it can look noisy and confusing.

This is why we need to look at long-term trends to get the real answers.

This post originally appeared on our EDF Voices blog.


A Little-Known Federal Rule Brings Invisible Pollution Into Focus

Legal fellow Jess Portmess also contributed to this post.

Unlike an oil spill, most greenhouse gas emissions are invisible to the naked eye. Though we can’t see them, this pollution represents a daily threat to our environment and communities, and it is important to understand the extent of this pollution and where it comes from.

This is why in 2010 the Environmental Protection Agency (EPA) finalized a rule requiring facilities in the oil and gas industry to report yearly emissions from their operations.

The rule is part of a larger greenhouse gas measurement, reporting, and disclosure program called for by Congress and signed into law by President George W. Bush. By coincidence, the rule is known as Subpart W.

The emissions data required by the rule helps communities near oil and natural gas development better understand pollution sources, and gives companies better ways to identify opportunities to reduce emissions.

As these policies have gotten stronger under the Obama administration, industry has continued to fight them in federal court.
