Market Forces

A growing call for environmental integrity

The recent introduction of bipartisan carbon fee legislation demonstrates an important pattern taking hold as policymakers focus on climate change solutions. The Energy Innovation and Carbon Dividend Act, like the MARKET CHOICE Act introduced earlier this year by Republican Rep. Curbelo, recognizes that any carbon fee aimed at meeting the challenge of climate change must be designed with environmental performance in mind.

The new legislation marks the first time in a decade that lawmakers from both sides of the aisle have come together to put forth serious climate policy. And like the MARKET CHOICE Act, it uses a fee to reduce pollution across the economy and includes “environmental integrity mechanisms” (EIMs) — provisions that tie a carbon fee to clear, measurable pollution reduction goals and keep us on track to meet those goals. EIMs are still a relatively new concept on the climate policy scene, but leading thinkers have begun to pay them significantly more attention, and with good reason: they are emerging as a critical component of any serious carbon fee proposal.

A carbon fee – which sets a price per unit of pollution – prompts the economy to respond by providing powerful incentives to reduce that pollution, but it cannot guarantee the environmental result. While energy and economic modeling tools can provide critical insight into possible or likely outcomes, they cannot provide certainty over the magnitude of the impact. That’s why it is critical to include EIMs designed to provide greater assurances that a fee will deliver on its pollution reduction potential.
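
To make the point concrete, here is a minimal sketch (in Python, with assumed, illustrative numbers) of why a fee alone pins down the price of pollution but not the quantity: firms abate until their marginal abatement cost equals the fee, so realized abatement hinges on a cost curve that is uncertain ex ante.

```python
# Toy illustration of why a fee alone cannot guarantee a quantity of
# pollution reduced: firms abate until marginal abatement cost (MAC)
# equals the fee, so realized abatement depends on an uncertain MAC
# curve. The linear MAC form and all numbers are assumptions.

def abatement(fee, mac_slope):
    """Abatement q chosen where MAC(q) = mac_slope * q equals the fee."""
    return fee / mac_slope

fee = 40.0  # $ per ton of pollution
for slope in (0.5, 1.0, 2.0):  # $/ton per ton abated; uncertain ex ante
    print(f"MAC slope {slope}: abatement = {abatement(fee, slope):.0f} tons")
```

The same fee yields very different abatement depending on the cost curve the economy turns out to have, which is exactly the gap an EIM is designed to close.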

Momentum is building for urgent action on climate change. Last week, hundreds of protestors flooded into the Capitol calling for lawmakers to act. The recent Intergovernmental Panel on Climate Change report on the climate impacts of 1.5 degrees of warming and the U.S. National Climate Assessment paint a stark picture: the effects of warming are already here and there is no room for more delay if we are to avert disastrous impacts on human health and our economy. As this demand for action has gotten louder, so has the call for solutions that guarantee the results we need.

There is growing recognition that we need serious solutions to these pressing problems – and that means performance-based policy designed to ensure pollution reductions occur at the pace and scale the science demands. EIMs play the role of an insurance mechanism – they may never be triggered if a fee performs as expected, but they provide critical safeguards in case it does not. As Rep. Ted Deutch, the author of the new bill, recognized, a price on pollution can harness the power of the market and provide a flexible, cost-effective means of achieving results. But Rep. Deutch and the bill’s co-sponsors also realized that it is no longer acceptable to simply set a price and walk away, hoping the fee does the job. We also need limits on pollution, and effective mechanisms to ensure we meet our critical emissions reduction goals.

Indeed, the most straightforward way to cut pollution is to place enforceable declining limits on pollution, guaranteeing the environmental outcome, while giving businesses flexibility to determine the best way to meet it. We already have proof that those kinds of policies can help meet environmental goals faster and more cheaply than expected while growing the economy.

Regardless of the approach we take, the cornerstones of good policy design are the same: clear and measurable emission reduction goals, effective provisions to ensure they are met, and flexibility in how to meet them coupled with strong incentives to do it cheaply and efficiently. In the context of a carbon fee, that means including an EIM.

Ultimately, achieving the dramatic, transformational change needed to reach a 100% clean energy economy and the pollution reductions that science demands (net-zero emissions as soon as possible) will require a portfolio of policy approaches, as others have pointed out. This means not only a limit and a price on pollution, but also investing in innovation and development of promising emerging clean energy technologies. It also means putting in place other programs that accelerate deployment of clean transportation infrastructure and promote electrification of cars and buildings. And it means encouraging states and cities to continue to lead and take action to cut pollution, pushing beyond federal requirements.

The seeds of future progress in Congress are being planted and demand is growing for durable and effective solutions that ensure environmental goals will be met. The key metric for any climate policy is environmental performance – lawmakers on both sides of the aisle are demonstrating they recognize this fundamental principle.


And the Nobel Prize goes to… Climate Economics

How newer research is building off Nordhaus’ past contributions

Äntligen! (Swedish—my native tongue—for “Finally!”) Last week, the Royal Swedish Academy of Sciences awarded the Nobel Prize in Economic Sciences to William Nordhaus for his pioneering work on “integrated assessment modeling” (IAM) – a framework which has made it possible to analyze the interplay between the economy and climate change, and the consequences of climate policies. And while the recognition of Nordhaus’ achievements is an encouraging sign that mainstream economics is starting to recognize the important contributions of environmental economics to the field, it’s critical that economists continue to build on and strengthen the research Nordhaus initiated.

Nordhaus began his research – in what was to become the now very active and expanding field of climate economics – as early as the 1970s. His fundamental contribution came in the early 1990s when he introduced his Dynamic Integrated model of Climate and the Economy (DICE), which became the foundational framework for the IAMs used today by the Intergovernmental Panel on Climate Change (IPCC), as well as by an Interagency Working Group to develop estimates of the Social Cost of Greenhouse Gas Emissions during the Obama Administration.

The novelty of DICE was its integration of findings across disparate disciplines, including physics, chemistry, and economics, to model how economic activity generates carbon emissions, raising atmospheric carbon concentrations and, in turn, global average temperatures. Furthermore, his model linked this increase in average temperature to economic damages. This integrated framework laid out the principles for estimating the damaging impacts of greenhouse gas (GHG) emissions on human welfare, and could therefore be used to calculate the social cost of greenhouse gas emissions and to study the consequences of climate policy interventions such as carbon pricing.
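
For readers who think in code, here is a deliberately stripped-down sketch of the causal chain DICE formalizes. Every parameter value below is an illustrative placeholder, not Nordhaus’ calibration; the point is only the structure: output drives emissions, emissions accumulate, warming follows concentrations, and damages feed back into output.

```python
import math

# Toy integrated assessment loop in the spirit of DICE.
# All parameter values are illustrative placeholders.
def run_toy_iam(years=100, gdp0=80.0, growth=0.02, carbon_intensity=0.1,
                airborne_fraction=0.5, climate_sensitivity=3.0,
                damage_coef=0.002):
    conc = 850.0           # atmospheric carbon stock, GtC (assumed)
    preindustrial = 590.0  # preindustrial stock, GtC
    gdp = gdp0             # gross world product, trillion $
    for t in range(years):
        emissions = carbon_intensity * gdp            # GtC per year
        conc += airborne_fraction * emissions         # part stays airborne
        # Warming scales with the log of the concentration ratio
        temp = climate_sensitivity * math.log2(conc / preindustrial)
        damages = damage_coef * temp ** 2             # quadratic damages
        gdp *= (1 + growth) * (1 - damages)           # damages hit output
        yield t, emissions, temp, damages * gdp

for t, e, temp, dmg in run_toy_iam(years=50):
    if t % 10 == 0:
        print(f"year {t}: {e:.1f} GtC, {temp:.2f} °C, ${dmg:.1f}T damages")
```

Real IAMs add far richer carbon-cycle, energy-system, and abatement-cost detail, but the feedback loop above is the conceptual breakthrough the prize recognizes.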

In awarding him the Nobel Prize, The Royal Swedish Academy of Sciences recognized Nordhaus’ research as a methodological breakthrough and critical step forward – but one which does “not provide final answers.” While DICE, an acronym which nods to the perilous game we’re playing with the planet, laid the groundwork for the development of robust estimates of the social cost of GHGs by the Interagency Working Group (which experts agree reflect a lower bound), his research has also served to highlight how newer and ongoing research can further strengthen these estimates.

Enhancements that can further strengthen integrated assessment modeling include:

  • Incorporating more of the many non-market health and environmental impacts still omitted from IAMs. This means constructing more detailed damage functions (the assumed relationship between climatic changes and economic damages) grounded in empirical studies of climate impacts using real-world data, and taking into account that the value of environmental assets relative to other goods and services may increase as they suffer a larger share of the damages from climate change.
  • Strengthening how inter- and intra-generational equity is taken into account.
    • The highly influential Stern Review, commissioned by the UK government in 2005, argued persuasively that Nordhaus put too little weight (through his choice of parameter values related to the discount rate) on the welfare of future generations, which resulted in lower estimates of economic damages. The debate it spurred has led to recommendations that governments instead use declining discount rates when evaluating public projects and policies with long-term impacts. (A toy numerical illustration of how much the discount rate matters follows this list.)
    • Climate change will impact different regions of the world very differently, with poorer regions generally hit harder than richer parts of the world. How well economists represent the spatial distribution of damages across regions, and the functional form and parameter values they choose for weighting differences in such damages, significantly impact estimates of the social cost of greenhouse gas emissions.
  • Strengthening the way IAMs deal with risk and uncertainty – an inherently crucial element in any analysis of climate change – and the representation of so-called “tipping points” beyond which damages accelerate or become irreversible. This more recent research shows that such model enhancements also significantly increase estimates of the social cost of greenhouse gases, and underscores the vital importance of drastically reducing GHG emissions to insure against high-temperature catastrophic climate risks.
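
As promised above, here is a minimal sketch of why the discounting debate matters so much. The two rates are only loosely inspired by the Stern (low) and Nordhaus (higher) positions; the damage figure and horizon are arbitrary assumptions.

```python
# Present value of distant climate damages under two discount rates.
# Rates, damage figure, and horizon are illustrative assumptions only.
def present_value(damage, rate, years):
    return damage / (1 + rate) ** years

damage = 1e12  # $1 trillion of damages, 100 years from now (assumed)
for label, rate in [("low (Stern-style)", 0.014),
                    ("higher (Nordhaus-style)", 0.0425)]:
    pv = present_value(damage, rate, 100)
    print(f"{label} rate {rate:.2%}: ${pv / 1e9:,.0f} billion today")
# The low rate values the same future damage roughly 16 times higher,
# which is why discounting assumptions dominate social cost estimates.
```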

Nordhaus shares the Nobel Prize with Paul Romer, who was awarded separately for integrating technological innovation into long-run macroeconomic analysis and for his analyses of how markets develop new technologies. This is a very appropriate choice considering the importance of technological change for addressing climate change, and it gives this year’s prize the common theme of how to achieve both sustained and sustainable economic growth.

It is extremely timely that the Nobel Prize honoring Nordhaus’ work on the critical role of carbon pricing and Romer’s work on the importance of technological innovation for long-run welfare was announced the same day the IPCC released its special report on the impacts of global warming of 1.5 °C. That report shows the urgency of addressing climate change, and how both carbon pricing and technological innovation and diffusion have important roles to play.

What California’s history of groundwater depletion can teach us about successful collective action

California’s landscape will transform in a changing climate. While extended drought and recent wildfire seasons have sparked conversations about acute impacts today, the prospect of changes to come is no less worrying, particularly for water management.

These changes will make water resources less reliable when they are needed most, rendering water storage an even more important feature of the state’s water system.

One promising option for new storage makes use of groundwater aquifers, which enable water users to smooth water consumption across time – saving in wet times and extracting during drought. However, when extraction exceeds recharge over the long term, “overdraft” occurs. Falling water tables increase pumping costs, reduce stored water available for future use, and entail a host of other collateral impacts. Historically, California’s basins have experienced substantial overdraft.
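
A few lines of bookkeeping make the dynamic concrete. The numbers below are invented purely for illustration: storage lets users smooth across wet and dry years, but extraction that exceeds recharge over the long run draws the stock down, which is the overdraft condition.

```python
# Minimal aquifer bookkeeping sketch: storage smooths wet and dry years,
# but persistent extraction above recharge draws the stock down
# ("overdraft"). All numbers are illustrative.

stock = 1000.0  # stored water, thousand acre-feet (TAF), assumed
years = [("wet", 120.0, 80.0), ("wet", 110.0, 90.0),
         ("dry", 40.0, 130.0), ("dry", 30.0, 140.0)]
for label, recharge, extraction in years:
    stock += recharge - extraction
    print(f"{label}: stock = {stock:.0f} TAF")
# Net change here is -140 TAF: extraction exceeding recharge over the
# long run is exactly the overdraft condition described above.
```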

Falling water tables reflect inadequate institutional rules

One cause of the drawdown is California’s history of open-access management. Any landowner overlying an aquifer can pump water, encouraging a race to extract. Enclosing the groundwater commons and thereby constraining the total amount of pumping from each aquifer is critical for achieving efficient use and providing the volume and reliability of water storage that California will need in the future. However, despite evidence of substantial long-run economic gain from addressing the problem, only a few groups of users in California have successfully adopted pumping regulations that enclose the groundwater commons.

SGMA addresses overdraft—but pumpers must agree to terms

California’s Sustainable Groundwater Management Act (SGMA) of 2014 aims to solve this challenge by requiring stakeholders in overdrafted basins to form Groundwater Sustainability Agencies (GSAs) and create plans for sustainable management. However, past negotiations have been contentious, and old disagreements over how best to allocate the right to pump linger. The map presented below illustrates how fragmentation in historical Groundwater Management Plans (GMPs) tracks with current fragmentation in GSAs under SGMA. Such persistent fragmentation suggests fundamental bargaining difficulties remain.

Spatial boundaries of self-selected management units within basins under SGMA (GSAs) mirror those of previous management plans (GMPs). Persistent fragmentation may signal that the adoption of SGMA has not made the fundamental bargaining difficulties facing basin users disappear.

New research, co-authored with Eric Edwards (NC State) and Gary Libecap (UC, Santa Barbara) and published in the Journal of Environmental Economics and Management, provides broad insights into where breakdowns occur and which factors determine whether collective action to constrain pumping is successful. From it, we’ve gleaned four suggestions for easing SGMA implementation.

Understanding the costs of contracting to restrict access

To understand why resource users often fail to adopt new institutional rules, it’s important to consider the individual economic incentives of the various pumpers. Even when they broadly agree that groundwater extraction is too high, collective action often stalls when users disagree about how to limit it. When some pumpers stand to lose economically from restricting water use, they will fight change, creating obstacles to addressing over-extraction. When arranging side payments or other institutional concessions is difficult, these obstacles increase the economic costs of negotiating agreement, termed “contracting costs.”

To better understand the sources of these costs in the context of groundwater, we compare basins that have adopted effective institutions in the past with otherwise similar basins where institutions are fragmented or missing. Even when controlling for the level of benefits, we found that failures of collective action are linked to the size of the basin and its user group, as well as variability in water use type and the spatial distribution of recharge. When pumpers vary in their water valuation and placement over the aquifer, the high costs of negotiating agreement inhibit successful adoption of management institutions, and overdraft persists. Indeed, in many of California’s successfully managed basins, consensus did not emerge until much farmland was urbanized, resulting in a homogenization of user demand on the resource.

Four key takeaways to ease agreement

In the face of such difficult public choices, how can pumpers and regulators come to agreement? Four main recommendations result from our research:

  • Define and allocate rights in a way that compensates users who face large losses from cutbacks in pumping. Tradable pumping rights can help overcome opposition. Pumpers can sell unused rights and are often made better off. The option to sell also incentivizes efficient water use.
  • Facilitate communication to reduce costs of monitoring and negotiations. The Department of Water Resources has already initiated a program to provide professional facilitation services to GSAs.
  • Promote and accept tailored management. Stakeholders and regulators should remain open to approaches that reduce contracting costs by addressing issues without defining allocations or attempting to adopt the most restrictive rules uniformly throughout the basin. For example, pumpers have successfully adopted spatially restricted management rules to address overdraft that leads to localized problems; others have adopted well-spacing restrictions that reduce well interference without limiting withdrawals.
  • Encourage exchange of other water sources. Imported, non-native surface water may lower contracting costs because it can save users from large, costly cutbacks. Pumpers have written contracts to share imported water in order to avoid bargaining over a smaller total pie; where such water is available, exchange pools (such as those described here) can help to limit the costs of adjustment.

SGMA is a large-scale public experiment in collective action. To avoid the failures of previous attempts to manage groundwater, stakeholders crafting strategies for compliance and regulators assessing them should keep in mind the difficult economic bargaining problem pumpers face. Hopes for effective, efficient, and sustainable water management in California depend on it.


New analyses agree carbon pricing is a powerful solution

This post is co-authored with Steve Koller

To tackle dangerous climate change at the pace and scale the science demands, we must take advantage of every cost-effective opportunity to cut pollution now. Several recent analyses from leading experts on the impacts of carbon pricing demonstrate once again why flexible, market-based policy is the most effective and efficient tool we have to address dangerous climate change.

These studies reaffirm that penalizing pollution and requiring companies to pay for their contribution to climate change can help the United States achieve needed reductions while generating substantial revenue. What’s more, none of these studies even account for the enormous benefits of averting climate change impacts.

While these studies examine carbon taxes (which place a price on pollution and allow the economy to respond), approaches that establish overall declining pollution limits and allow the market to determine the price can achieve similar pollution reductions and economic outcomes. But since uncertainty about market factors and technological trends prevents even the most robust economic modeling from providing guarantees, it is crucial that any carbon tax policy be linked to clear, concrete pollution reduction goals and include transparent provisions to help ensure those goals are met. A policy where the price is derived from overall enforceable pollution limits already includes those assurances.

The analyses by the Stanford Energy Modeling Forum (EMF 32, comprising 11 leading modeling teams), Columbia University’s Center on Global Energy Policy (CGEP), and the U.S. Energy Information Administration (EIA) examine a range of scenarios with price paths from $15 to $50 per ton and annual price escalators from 1 to 5 percent, along with various ways of distributing the revenue. In addition, Resources for the Future and Columbia modeled the carbon tax bill recently introduced in the House by Representative Curbelo and co-sponsors, which includes a starting price of $24 per ton, rising 2 percent annually.

Let’s take a look at four key takeaways across analyses:

1. National policy that puts a price on carbon could significantly reduce climate pollution

In all scenarios that examine an economy-wide price, pollution reductions are consistent with meeting or exceeding the U.S. Paris Agreement commitment of cutting emissions 26 to 28 percent below 2005 levels by 2025. On our current path, we will almost certainly fall short of meeting those goals, according to recent analysis from the Rhodium Group.
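
For scale, the commitment translates into absolute targets roughly as follows; the 2005 baseline below is a rounded assumption rather than the official inventory figure.

```python
# What the Paris commitment means in absolute terms. The 2005 baseline
# is an assumed round figure, not an official inventory value.
BASELINE_2005 = 6_600.0  # Mt CO2e, assumed U.S. 2005 emissions

for cut in (0.26, 0.28):
    target = BASELINE_2005 * (1 - cut)
    print(f"{cut:.0%} cut -> target of ~{target:,.0f} Mt CO2e by 2025")
```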

However, the analyses also show that to achieve deeper reductions aligned with long-term, science-based targets (for example, net-zero emissions by mid-century), we will likely need pricing paths at the more ambitious end of the spectrum—as well as companion policies to help the most difficult sectors decarbonize.

Of course, a key advantage of pricing pollution is that it will spur innovation—encouraging new technologies and approaches to slashing emissions that today’s models cannot foresee. These advances could allow us to meet our goals at lower costs than anticipated—exactly what’s happened with the well-designed, market-based U.S. acid rain program.

The EMF 32 results also underscore that the starting price matters for reductions in the short term, while the rate of increase over time is important for bending the emissions curve down in the long term. For example, in the first decade, the $50 plus 1 percent price achieves roughly 40 percent more cumulative emissions reductions than the $25 plus 5 percent scenario. By 2038, however, cumulative reductions under the $25 plus 5 percent path exceed those under the $50 plus 1 percent path, and cumulative emissions through 2050 are similar. This dynamic is important since cutting pollution now is good for the climate, but we also need to sustain the pollution decline over the long term. Ultimately, the total cumulative climate pollution in the atmosphere is what matters.
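
The crossover timing is easy to sanity-check from the price paths alone. A minimal sketch (the 2020 start year is our assumption, not taken from EMF 32):

```python
# Compare two carbon price paths: $50 rising 1%/yr vs. $25 rising 5%/yr.
# The start year is an illustrative assumption.
def price(p0, escalator, year, start_year=2020):
    return p0 * (1 + escalator) ** (year - start_year)

for year in range(2020, 2051, 5):
    slow = price(50, 0.01, year)  # high start, slow growth
    fast = price(25, 0.05, year)  # low start, fast growth
    print(f"{year}: ${slow:6.2f} vs ${fast:6.2f}")
# The $25 + 5% path overtakes the $50 + 1% path after about 18 years
# (around 2038 on this timeline), echoing the crossover in cumulative
# reductions described above.
```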

2. Carbon pricing has extremely minor impacts on the economy—without accounting for the economic benefits of avoided climate change

Both EMF 32 and CGEP’s results suggest that GDP would continue to grow at historical or near-historical rates across scenarios—and could be net positive, depending on how revenue is used. Additionally, despite misleading rhetoric from opponents of climate action, a carbon price would have an extremely small net effect on employment. A recent analysis from Resources for the Future suggests that the net employment impact of a $40 tax would be less than half of one percent, and possibly lower. And while many other studies confirm that net impacts on employment are likely to be small, they note that even mainstream modeling efforts tend to overestimate the impacts by a factor of 2.5 or more. Meanwhile, national climate policy would mean investing in the clean energy revolution: the economic engine of the future.

None of these analyses even consider the economic benefits associated with slashing climate pollution. Citibank estimates that climate change will cause tens of trillions of dollars in damages if left unchecked. These analyses also do not account for the additional associated benefits such as improvements in air quality. Taken together, these benefits make an overwhelming economic case for reducing pollution as soon as possible.

3. The lion’s share of reductions occurs in the power sector, underscoring the importance of companion policies in other sectors

All analyses of an economy-wide price find that the vast majority of reductions occur in the power sector, driven primarily by declines in coal consumption. In the analyses examining $50 per ton scenarios, Columbia shows that approximately 80 percent of economy-wide emissions reductions occur in the power sector with a significant shift towards renewable energy, and EMF 32 results predict that coal demand reaches near-zero by 2030. This is consistent with modeling analysis conducted by the United States Mid-Century Strategy for Deep Decarbonization in 2016.

Some other sectors, notably transportation, tend to be less responsive to carbon pricing in the models—at least in the short term. Both Columbia and EMF 32 find that transportation sector emissions only drop a few percentage points relative to 2005 levels by 2030 even in the higher pricing scenarios. These results underscore the importance of policies that put a firm limit on pollution across the economy as well as companion policies that can help address specific barriers to change in sectors that will be more difficult to decarbonize.

4. How revenue is used matters

Carbon pricing has the potential to raise significant revenue—for example, just under a trillion dollars over the first five years with a $40 price rising at 5 percent. How revenue is used plays a significant role in overall economic impacts, as well as in the distribution of those impacts across regions and populations.
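
The order of magnitude is easy to verify with back-of-envelope arithmetic. The covered-emissions figure below is our assumption for illustration, and the calculation ignores the way emissions (and hence the tax base) would shrink in response to the price:

```python
# Rough revenue arithmetic: price x covered emissions, summed over 5 years.
# Covered emissions are an assumed, static figure; a real tax base would
# shrink as emissions fall, pulling the total down toward $1 trillion.
covered_emissions = 5.0e9  # tons CO2 per year (assumed)
price = 40.0               # $/ton, rising 5% per year
total = 0.0
for year in range(5):
    total += covered_emissions * price
    price *= 1.05
print(f"~${total / 1e12:.2f} trillion over the first 5 years")  # ~$1.1T
```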

For example, CGEP’s analysis finds that certain revenue recycling approaches—including the use of revenues to reduce payroll taxes or the national debt—result in larger long-run economic growth than scenarios without a carbon price. EMF results find that using revenues to reduce capital income taxes generally achieves the highest GDP growth of the scenarios considered, but these gains are disproportionately captured by the wealthy.

Alternatively, revenue can be directed to not only avoid this sort of inequitable distribution of benefits, but also protect low-income families and disadvantaged communities who already bear a disproportionate share of the burden of climate change and air pollution, and are more sensitive to changes in energy costs. For example, Columbia’s analysis shows that approaches putting even a small portion of revenue back into the pockets of American households can compensate those in the lowest income quintile for potential increases in energy costs. The Center on Budget and Policy Priorities has also demonstrated how carbon pricing can be designed to fully offset impacts of the policy on the most vulnerable households and provide considerable support for others while leaving significant revenue to spare.

While assumptions and model structures may differ, bottom line findings all point in the same direction: well-designed, national carbon pricing policy can spark deep reductions in climate pollution alongside economic growth, while spurring technological innovation and protecting vulnerable populations.

The “price” itself is only one part of effective climate policy. We need firm declining limits on pollution to accompany a price and ensure environmental outcomes, as well as a portfolio of approaches working together to ensure that investment and innovation are happening where it matters. A pricing program can be a catalyst for driving additional climate solutions at the federal, state, and local level, while other policies can share the burden and tackle the problem from multiple angles. This model has already proven itself in California, where the state has hit pollution reduction targets even earlier and at lower cost than anticipated.

To be successful, we need bipartisan leadership and a serious discussion about meaningful solutions. The United States can and must address the challenge by working together in the best interest of all Americans to put in place ambitious, effective, and fair climate policy.


How China is cleaning up its air pollution faster than the post-Industrial UK

Beijing has seen some of the lowest air pollution levels in recent history this past winter, just as China’s Ministry of Environmental Protection (MEP) – since strengthened and renamed the Ministry of Ecology and Environment (MEE) – has put the final touches on a new, three-year plan to improve air quality. But while the trend is positive, air pollution levels in China are still dire: the MEP calculates an annual average PM2.5 concentration of 43 µg/m3 for China’s cities in 2017, more than 4 times the level of 10 µg/m3 recommended by the WHO. Official measurements for Beijing even showed the capital’s air quality at 58 µg/m3.

Still, China is cleaning up its air faster than the United Kingdom did after its Industrial Revolution. Despite this early success, however, China could spark even more efficient improvements by adopting market-based incentives.

Let’s take a look at how both countries fared immediately after each of their industrial booms.

Figure notes: The figure shows annual average concentrations of total suspended particles (TSP), a coarse and now outdated measure of air pollution. The black line shows the average for China, while the grey line shows London. Data sources: TSP concentrations for China through 2003 are based on the China Energy Databook 9.0, which draws on data provided by the State Environmental Protection Administration. From 2004 on, TSP concentrations for China are based on author-collected air pollution index (API) data from the MEP datacenter. I imputed PM10 concentrations based on information on the main pollutant on a given day and the assumption that an API reading below 51 reflects PM10 (see Stoerk 2016 for details on the procedure), then converted the PM10 concentrations into TSP using a conversion factor of 2, following Matus et al. 2012. TSP concentrations for London come from Fouquet 2011, who generously shared his dataset.
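
In code, the two-step imputation reads roughly as follows. The API-to-PM10 mapping here is a simplified placeholder (the actual piecewise procedure is documented in Stoerk 2016); only the factor-of-2 TSP step follows Matus et al. 2012 directly.

```python
# Sketch of the imputation pipeline described in the figure notes.
def api_to_pm10(api):
    """Placeholder mapping: in the lowest bracket of the old Chinese
    API (readings below 51), the index is assumed to reflect PM10
    roughly one-to-one. The real procedure is piecewise-linear."""
    return float(api)

def pm10_to_tsp(pm10, factor=2.0):
    """Impute TSP from PM10 with a conversion factor of 2
    (following Matus et al. 2012)."""
    return pm10 * factor

print(pm10_to_tsp(api_to_pm10(45)))  # -> 90.0 (µg/m3 TSP)
```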


Air quality in London is far from perfect, but it’s also come a long way from the days when people died in the “Great Smog.” The graphic above brings together the earliest known air pollution data from China, from 1980 to 2012, and from the UK from the Industrial Revolution until 2008. Air pollution levels in the main Chinese cities at the beginning of the 1980s were almost exactly at the level of London at the height of the Industrial Revolution in 1890 (a shocking outlier is Hohhot, the capital of Inner Mongolia, which reached a concentration of Total Suspended Particles of 1,501 µg/m3 in 1987, possibly the highest level of urban air pollution in recorded history).

The difference is in the speed of improvements: air pollution in China has been decreasing along a trajectory similar to London’s 90 years earlier, but at twice the pace. While extreme air pollution levels in China’s recent history are typical for an industrializing economy, its pace in cleaning up the pollution is fast by historical standards.

China started to seriously control air pollution from 2006 to 2010 by limiting emissions for each province. Relying on satellite data, my research shows that this first attempt was ultimately successful in reducing nationwide SO2 emissions by over 10 percent relative to 2005. Studying compliance over time, however, suggests that reductions in air pollution only happened after the Chinese government created the MEP in 2008. After its creation, among the many changes in environmental policy, the MEP started to gather reliable SO2 emissions data from continuous emissions monitoring systems (CEMS) at the prefecture level and increased the number of enforcement officials by 17 percent (a task that EDF China actively supported).

This early success notwithstanding, China could do better by implementing well-designed market-based solutions, policies that align with the country’s ambition to combine economic prosperity and environmental protection. Or, in the words of President Xi, to combine ‘green mountains and gold mountains’.

For example, a well-designed cap-and-trade program at the province level could have decreased the cost of air pollution abatement from 2006 to 2010 by 25% according to my research. The anticipated launch of a sectoral emissions trading system to limit a portion of China’s greenhouse gas emissions suggests that the Chinese government is looking to embrace lessons learned in air pollution control and wishes to build on its own pilot market-based pollution control programs to bring its environmental policy into the 21st century.

EDF is playing a key role in helping this endeavor through both hands-on policy work and research. The timing is serendipitous: China is at a crossroads in environmental policy, evidence-based policymaking is welcome, and data quality has improved in recent years. Given the right set of policies, countries can control air pollution, and improvements in air quality typically go hand in hand with economic prosperity.

Both China and London have remaining challenges. Despite dramatic improvements, Londoners, like the Chinese, still live with significant air pollution. A recent report on London’s air pollution found the city is not close to meeting WHO standards. Meeting them will be a challenge, in part because of the complexity of the causes (road transport accounts for over half of local contributions). So just as London must keep battling to improve air quality, Beijing will need to do likewise – but at least now each can learn from the other.


Study: Renewables played crucial role in U.S. CO2 reductions

This blog was co-authored with Jonathan Camuzeaux, Adrian Muller, Marius Schneider and Gernot Wagner.

After a nearly 20-year upward trend, U.S. CO2 emissions from energy took a sharp and unexpected turn downwards in 2007. By 2013, the country’s annual CO2 emissions had decreased by 11% – a decline not witnessed since the 1979 oil crisis.

Experts have generally attributed this decrease to the economic recession, and to a huge surge in cheap natural gas displacing coal in the U.S. energy mix. But those same experts mostly overlooked another key factor: the parallel rise in renewable energy production from sources like wind and solar, which expanded substantially over the same 2007-2013 timeframe.

Between 2007 and 2013, wind-generated electricity grew almost five-fold to 168 TWh, and utility-scale solar grew from 0.6 TWh to 8.7 TWh. During the same period, bioenergy production grew 39 percent to 4,800 trillion BTUs.

Given these increases, how much did renewables contribute to the emissions reductions in the United States? In a paper published this month in the journal Energy Policy, we use a method called decomposition analysis to answer just that.

Unpacking the Factors

Decomposition analysis is an established method that enables us to separate different factors influencing total CO2 emissions and to identify the contribution of each to the observed decrease. The factors considered here are total energy demand, the share of gas in the fossil fuel mix (capturing the switch from coal and petroleum to gas), and the shares of renewables and nuclear energy in total energy production.
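
For the curious, here is a minimal sketch of the additive logarithmic mean Divisia index (LMDI) flavor of decomposition, a workhorse in this literature. The two-factor structure and all numbers are invented for illustration; the paper's actual decomposition tracks more factors (demand, gas share, renewables, nuclear).

```python
from math import log

def logmean(a, b):
    """Logarithmic mean, the weight used in LMDI decomposition."""
    return a if a == b else (a - b) / (log(a) - log(b))

def lmdi(factors_0, factors_1):
    """Additive LMDI: split the change in emissions C = product of
    factors into one contribution per factor (contributions sum to
    the total change)."""
    c0 = c1 = 1.0
    for v in factors_0.values():
        c0 *= v
    for v in factors_1.values():
        c1 *= v
    weight = logmean(c1, c0)
    return {k: weight * log(factors_1[k] / factors_0[k]) for k in factors_0}

# Invented two-factor example: emissions = energy demand x emissions
# intensity of the energy mix (Mt CO2 per EJ).
base = {"demand_EJ": 100.0, "intensity": 60.0}  # -> 6000 Mt
end  = {"demand_EJ":  97.0, "intensity": 55.0}  # -> 5335 Mt
print(lmdi(base, end))  # two contributions summing to the -665 Mt change
```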

Introducing a new approach for separately quantifying the contributions from renewables, we find that renewables played a crucial role in driving U.S. energy CO2 emissions down between 2007 and 2013 – something which has previously largely gone unrecognized.

According to our index decomposition analysis, of the total 640 million metric ton (Mt) decrease (11%) during that period, two-thirds resulted from changes in the composition of the U.S. energy mix, with the remaining third due to a reduction in primary energy demand. Of that, renewables contributed roughly 200 Mt of reductions, about a third of the total drop in energy CO2 emissions. That’s about the same as the contribution of the coal- and petroleum-to-gas switch (215 Mt). Increases in nuclear generation, by contrast, contributed a relatively minor 35 Mt.

While the significant role of renewables in reducing CO2 emissions does not diminish the contribution of the switch to natural gas, it is important to note that the climate benefits of switching from coal and petroleum to gas are undermined by the presence of methane leakage along the natural gas supply chain, the extent of which is likely underestimated in national greenhouse gas (GHG) emissions inventories.

Methane, of course, is a powerful greenhouse gas. Methane leakage from increased natural gas use could have wiped out up to 30% of the short-term, CO2-equivalent GHG benefit of switching from coal and petroleum to natural gas calculated in this paper. For the natural gas industry to truly sustain the claim that it has made a positive contribution to reducing the country’s carbon footprint, the methane emissions associated with natural gas must be substantially reduced.
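
A stylized calculation shows the mechanism. Every number below is an assumption chosen only to make the units work (the GWP value is the IPCC AR5 20-year figure for methane); the paper's own estimates rest on actual leakage data:

```python
# Stylized, per-MWh illustration of how methane leakage erodes the
# CO2-equivalent benefit of a coal-to-gas switch. All inputs assumed.
CO2_COAL = 1.0     # t CO2 per MWh from coal (assumed)
CO2_GAS = 0.5      # t CO2 per MWh from gas (assumed)
GAS_BURNED = 0.15  # t of natural gas burned per MWh (assumed)
GWP20_CH4 = 86     # 20-yr global warming potential of methane (IPCC AR5)

def net_benefit(leak_rate):
    """CO2e saved per MWh after counting leaked methane upstream."""
    leaked_ch4 = GAS_BURNED * leak_rate
    return CO2_COAL - (CO2_GAS + leaked_ch4 * GWP20_CH4)

for leak in (0.0, 0.01, 0.02, 0.03):
    print(f"leak {leak:.0%}: net benefit {net_benefit(leak):.2f} t CO2e/MWh")
# A leakage rate of just a few percent erases a large share of the
# 0.5 t CO2e/MWh benefit in this toy setup.
```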

These results show that past incentives to support the expansion of renewable energy have been successful in reducing the country’s emissions, and that decreasing costs for renewable energy offer some hope for continued progress despite the current administration’s refusal to address climate change.

Such progress, however, will never be sufficient without ambitious climate and clean energy policies – whether at the federal or the state level – that can drive further emission reductions.
