Market Forces

How can the U.S. gas pipeline system support a path to net-zero GHG emissions by 2050?

An economist’s guide to filling in the research gaps.

Natural gas currently accounts for more than a third of U.S. energy-related CO2 emissions, but efforts to decarbonize the economy – in particular by replacing gas with electricity in a wide variety of critical applications – imply decreasing future gas demand and falling CO2 emissions from the industrial and building sectors as well as the power sector.

Resolving the economic and regulatory challenges that follow from this will require filling in crucial knowledge gaps about the U.S. gas transportation system – and how that market could be designed to support the energy transition.

An energy system already in transition

Transitioning the U.S. to a clean energy system is a critical step toward the long-term goal of reaching net-zero greenhouse gas emissions by 2050. The U.S. power system has already taken steps in the right direction. More electricity is coming from variable renewable energy sources (VREs) like solar and wind, while coal plants are being retired.

But even when we factor in options like energy storage, demand response and the build-out of electric transmission capacity, gas-fired generators will likely continue to play a role in the coming decades by providing peak and ramping capacity at times when electricity production from wind and solar is low or electricity demand is high.

This, in turn, means that the country’s vast network of interstate gas pipelines has its own role to play in the U.S. energy transition.

The problem is that the pipeline transportation market was built to support predictable, relatively constant demand (e.g., from industry and buildings). It is not currently designed to accommodate demand from gas-fired power plants, which can fluctuate significantly by the hour – or even more frequently. Nor is the pipeline system designed to be compatible with other low-carbon fuel options or to be phased down as electrification increases.

More economics research needed

To resolve this disconnect, we need a much better understanding of how the pipeline market works – and how it could work. Compared to U.S. power markets, the interstate gas pipeline transportation market is characterized by opaque operations and practices and has received little attention from economists. This has limited the economic analysis available to support decision-making by policy makers and stakeholders looking to address the problem.

More research and analysis are needed to inform how the design, regulation and operation of the U.S. gas transportation market can be improved, and how stranded-asset risk and the associated distributional impacts can be managed.

To stimulate and facilitate new research in this area, which is important to the U.S. energy transition, I recently published an introductory guide to the U.S. gas pipeline transportation market for researchers and energy market analysts. It outlines the main market features and regulations needed to understand the U.S. gas transportation market.

The objective is to facilitate further research that will help answer questions like:

  • Who is, or should be, shouldering the costs of gas transportation infrastructure and bearing the risk of some of these assets becoming stranded in a low-carbon-energy future? How should such long-term stranded asset risk be managed in the face of electrification and decarbonization?
  • What changes are needed in the gas transportation markets to provide more flexible gas delivery services to gas-fired generators that provide valuable balancing in the power markets?
  • What role can hydrogen play in U.S. decarbonization efforts? How could a potential hydrogen market be created, and which parts of the gas pipeline network would it be beneficial to make compatible with hydrogen transportation, given potential centers of hydrogen supply and demand?

By publishing this paper, we hope to inspire PhD students, researchers, consultancies and market analysts to conduct analyses on this topic crucially important to the U.S. energy transition. Such new research would ideally generate policy-relevant conclusions on how to reform the U.S. gas pipeline transportation market – and then be communicated to energy market regulators and policy makers to support decision-making that will facilitate the U.S. transition to net-zero greenhouse gas emissions by 2050.

Posted in Economics, Energy Transition

How can economics contribute to decarbonizing power markets?

Electricity system operators balance supply and demand precisely at every moment of every day through market design grounded in economic principles. As the share of variable renewable resources like wind and solar on our electricity system increases, system operators, policy makers and energy market regulators face new questions about how to design the rules governing our electricity markets to support decarbonization of our energy system.

Christopher Holt, PhD student in agricultural and resource economics at the University of Maryland, recently published an EDF Economics Discussion Paper, in which he reviews these new questions in wholesale electricity market design and identifies a number of areas where economic research can help inform decision-makers to facilitate decarbonization.

Chris wrote this paper during a summer pre-doc fellowship at EDF, and Kristina Mohlin, who hosted him during the fellowship, recently chatted with him about the paper and his experience as a pre-doc fellow at EDF.

Kristina: What was your starting point for this pre-doc fellowship?

Chris: State and local leaders have been setting ambitious decarbonization targets. More recently, President Biden has pledged to make US electricity production free of carbon by 2035. My starting point was to try and understand how electricity market practitioners are working to change and refine the sophisticated set of rules governing wholesale markets, so that these targets can be met.

During my time at EDF, I spoke with industry representatives, policy makers, external economists, and other stakeholders at policy meetings, conferences, through video chats, and over countless cups of coffee. I also learned a lot by chatting with the highly talented folks internal to EDF. These conversations alerted me to many gaps in the applied economics literature, which I then described in the paper.

Kristina: One defining characteristic of electricity markets is that consumers do not respond to wholesale price fluctuations in real time. How can markets be designed to enable and encourage price-responsive demand?

Chris: California’s Demand Response Auction Mechanism (DRAM) is a promising example of how market design can reward innovation that encourages response to prices at the individual user level. This mechanism, still in its pilot form, allows companies to bring together demand across a group of electricity consumers, e.g. by coordinating power drawn from their appliances. The aggregator can then curtail demand when electricity is scarce at a minimal loss to consumers, who may be compensated for their agreement to participate. Wholesale market prices are kept low by way of the curtailed demand, benefitting all consumers of electricity (not just the participating ones). This is a “win-win”: lower prices for consumers and a profitable return for the aggregator. Importantly, this arrangement would not exist without the wholesale market design.

While the incentives are powerful, getting the design right is not easy. Projects like DRAM have a long way to go before they are approved for permanent integration into market operations—which is exactly why additional research is needed.

Kristina: Another key defining characteristic of electricity markets is that electricity is not storable. How will utility-scale storage affect market operations?

Chris: Yes, storage is not yet available at large scale, but this seems likely to change in the near future. The Federal Energy Regulatory Commission’s landmark Order 841 is intended to facilitate the participation of storage resources in energy markets. Some firms have already begun to complement variable renewable generation assets with large-scale battery technologies, and industry forecasts suggest major cost reductions for batteries in the near future. When storage technologies are deployed at scale, short-run market operations will require a new set of rules, which must be guided by economic research.

Kristina: Could you explain to our readers what this has to do with decarbonization?

Chris: Both price-responsive demand and storage are closely tied to decarbonization because they allow consumers to buy electricity when it is cheap and clean rather than when it is expensive and carbon-intensive (this is most apparent when there is a price on carbon). Electricity from wind and solar is essentially free once the investment costs have been incurred and the plants have been built. Currently, when electricity is scarce, carbon-intensive peaker plants are needed to maintain reliability. These peaker plants, which are also relatively expensive to run, are increasingly needed to complement the variability of renewables – e.g., when the sun goes down in California or the wind stops blowing in Texas. Unlocking price-responsive demand and introducing storage capacity will reduce the need to rely on peakers.

Kristina: How will long-run investments be affected by increased participation of electric storage and price-responsive demand?

Chris: Changes to demand-side price response and the storability of electricity have crucial implications for how firms plan to invest in new generation assets and retire old plants. If consumers are able to pre-empt the high prices associated with peaker plants, why invest in peaker plants at all? Storage may bring benefits in helping to reduce emissions, but will firms be incentivized to invest in it? Regulators in New York, for example, have set considerable storage capacity targets. Experts suggest many ways to reach such targets and to ensure that more storage capacity indeed translates to decarbonization (carbon pricing is central among them).

The difficult task of guiding efficient long-run investment is further complicated when an electricity system spans political jurisdictions with differing policies. The simple fundamentals of electricity market economics are of value here, reminding us that proper pricing is often the key to efficiency—pricing that reflects resource scarcity, the value of quickly dispatchable resources and demand response, and the harm imposed by carbon pollution. Through my EDF pre-doc fellowship, I found that we need new research to connect these classic fundamentals to the new challenges associated with scaling up renewables.

Kristina: Finally, what would you like to tell other PhD students who have the opportunity to apply for an EDF pre-doc fellowship about your experience at EDF?

Chris: The pre-doc fellowship is a great way to focus on the questions you might want to address in your dissertation. My job market paper was inspired in large part by my time at EDF. Having access to the network of experts that the fellowship offered was an ideal way to become more familiar with certain areas and overcome the steep learning curve associated with my field. EDF also values its alumni – I have continued to keep in touch with folks I met through the fellowship and attend EDF workshops. Overall, I would highly recommend the fellowship!

Posted in Uncategorized

How Climate Economics supports the Paris Agreement temperature targets

New research building on Nobel Prize winner Nordhaus’ past contributions shows reaching UN climate targets is a good investment for the planet

Two years ago, William Nordhaus was awarded the Nobel Prize in Economic Sciences for his pioneering work on “integrated assessment modeling” (IAM) and his Dynamic Integrated model of Climate and the Economy (DICE) – a framework designed to analyze the interplay between the economy and climate change, and used to assess economically optimal CO2 emission pathways and the social cost of carbon (SCC). Now a new paper published in Nature Climate Change demonstrates that a 1.5 to 2 degree target in line with the UN Paris Agreement is economically optimal when the DICE model is updated to reflect newer research and the latest expert assessments.

As I described in a blog about Nordhaus’ Nobel Prize two years ago, there were several ways new research could strengthen the results from Nordhaus’ DICE model and other IAMs. In this new paper, Martin C. Hänsel and co-authors (including Daniel Johansson, Christian Azar and EDF Senior Contributing Economist Thomas Sterner) made a number of such modifications to the baseline assumptions to update the results coming out of Nordhaus’ DICE model.

Two of their key updates relate to the economic assumptions and inputs of the model:

  • Updating the damage function (the assumed relationship between climatic changes and economic damages) to reflect a recent meta-analysis of climate damage estimates; and
  • Updating how equity between present and future generations is taken into account in DICE by revising the parameters determining the social discount rate. The choice of discount rate has a large impact on the results coming out of IAMs, since it determines the weight given to the climate damages affecting future generations. This has spurred a long-standing debate, especially since the value of at least one of the parameters typically discussed is based on value judgments. Hänsel et al. therefore chose to update the values of the parameters determining the social discount rate according to a recent survey of expert opinions.
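The parameters at issue are those of the Ramsey rule, the standard formula in this literature linking the social discount rate to ethical and empirical inputs. It is shown here in its simple deterministic form as background for the debate, not as a result of the paper:

```latex
% Ramsey rule for the social discount rate (simple deterministic form)
% r    : social discount rate
% \rho : pure rate of time preference (a value judgment)
% \eta : elasticity of marginal utility of consumption
%        (aversion to consumption inequality across generations)
% g    : growth rate of per-capita consumption
r = \rho + \eta g
```

A lower pure rate of time preference ρ gives more weight to damages borne by future generations and therefore raises the estimated SCC.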

The authors also made a number of additional updates to reflect new research in climate science and thereby improve the assumptions determining the relationship between greenhouse gas (GHG) emissions and temperature change (which include assumptions with respect to the global carbon cycle and the energy balance model translating radiative forcing to temperature impacts).

The authors also considered the impact of:

  • NETs (negative emissions technologies), such as afforestation, Biomass Energy with Carbon Capture and Storage (BECCS) and direct air capture. By providing the additional option of negative emissions after 2050, NETs further reduce the optimal equilibrium temperature, but they also lead to a lower SCC in 2020, since the availability of NETs makes it optimal to postpone some emission reductions. However, it is important to note that the potential magnitude and timeline of available NETs are debated and, for some strategies, still to be demonstrated.
  • Emission pathways with higher abatement of non-CO2 GHG emissions (which are not determined inside the DICE model), which make even lower equilibrium temperatures attainable. This illustrates the value of also addressing short-term climate forcers such as methane emissions.

Both these latter updates contribute to a reduction in the economically optimal equilibrium temperature in DICE (i.e., the long run global average temperature which would provide the theoretically optimal balance between the social cost of climate damages and the costs of emission reductions).

The combined results of all these updates to the baseline assumptions in DICE – reflecting recent findings in the climate economics and climate science literature – are:

  • The SCC in 2020 is twice as high when all the other updates are included but Nordhaus’ baseline assumptions for the social discount rate are left unchanged. This is well in line with the strong consensus that SCCs at the levels produced with the baseline assumptions in DICE ($39 per tonne) significantly underestimate the true social cost of carbon dioxide emissions.
  • Optimal climate policy according to this updated DICE model keeps equilibrium temperature below 2 °C in 2100 in three quarters of all model runs.
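For readers less familiar with the concept, the SCC can be written schematically as the discounted stream of marginal damages caused by an extra tonne of CO2 emitted in year t (notation illustrative, not taken from the paper):

```latex
% Schematic definition of the social cost of carbon in year t
% D_s : monetized climate damages in year s
% E_t : CO2 emissions in year t
% r   : social discount rate
SCC_t = \sum_{s \ge t} \frac{1}{(1+r)^{s-t}} \, \frac{\partial D_s}{\partial E_t}
```

Written this way, it is clear why the discount rate updates matter so much: damages occurring decades from now are scaled down by the factor 1/(1+r)^{s-t}.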

Despite these key updates to the DICE framework, there are still—as the authors also point out—additional enhancements that can be made to improve this type of climate economic analysis, which weighs the costs and benefits of climate action. Such enhancements include consideration of risk and uncertainty and the representation of so-called “tipping points” as well as taking into account that the value of environmental assets relative to other goods and services may increase as they suffer a larger share of the costly damages from climate change.

Overall, these new findings show that the temperature targets in the Paris Agreement (where countries committed to limiting the global temperature rise to well below 2 °C and to actively pursue a 1.5 °C target) are also supported by climate economic analysis, and that reaching the UN climate targets is a good investment for the planet.

Posted in Uncategorized

How renewables, natural gas and flat demand led to a drop in CO2 emissions from the US power sector

New state-by-state research shows significant reductions across the country from 2005-2015

Decarbonizing the power sector in the United States will be critical to achieving the goal of a 100% clean economy by 2050 – especially since reaching “net-zero” greenhouse gas emissions across the economy means that other energy-using sectors such as buildings and transport will increasingly need to be electrified, switching away from direct fossil fuel use and relying on low-carbon electricity instead. Demand for electricity is therefore very likely to grow in the future – which makes it critical that its CO2 emissions sharply decrease through the accelerated deployment of low carbon technologies, such as wind and solar power, in the decades ahead.

US power sector CO2 emissions, 1990-2015

For now, US power sector CO2 emissions appear to have turned a corner. While CO2 emissions from the U.S. power sector increased between 1990 and 2005, they peaked shortly thereafter, and then decreased to the point that by 2015, they had fallen by 20% (or 480 million metric tonnes CO2) compared to 2005.

In recently published research, my co-authors and I wanted to understand the drivers behind the drastic fall in the country’s – and individual states’ – power sector CO2 emissions, and in particular the role that low-carbon technologies such as wind and solar power have already played in reducing US power sector CO2 emissions. Our analysis, published in Environmental Research Letters, used an approach called index decomposition analysis and found that natural gas substituting for coal and petroleum, coupled with large increases in renewable energy generation – primarily wind – were responsible for 60% and 30%, respectively, of the decline in CO2 emissions from the US power sector between 2005 and 2015.

Renewable growth in red states

Most of the emissions reductions driven by renewable energy growth came from Texas and states in the Midwest — Iowa, Kansas, Illinois and Oklahoma. While many of these states are not necessarily known for supporting aggressive climate policies, the combination of federal tax credits, state energy policies, decreasing costs of renewables and windy conditions appears to have provided powerful support for renewable energy deployment.

Texas, in particular, is an interesting case. In 2005, it was the country’s leading emitter of power sector CO2 emissions. But by 2015, its gross reductions from wind energy totaled 27 million metric tons – more than 5% of the total net US reduction in power sector CO2 emissions since 2005 (i.e., a sixth of the total US reduction attributed to renewables). The state achieved its final renewable portfolio standard (RPS) target in 2008 – seven years ahead of its 2015 goal. In addition to the reduced cost of turbine technologies, federal tax credits and favorable wind conditions also likely played a role in wind’s growth.

Wind generation in Texas, Iowa, Kansas, Illinois and Oklahoma together contributed half of the renewables-related emission reductions (70 Mt, or 3 percentage points of the 20% reduction in US power sector CO2 emissions since 2005).

Over the same period, many states that had relied heavily on coal – like Pennsylvania, Georgia, Alabama and Florida – reduced emissions by substituting natural gas for coal in electricity generation. While that prompted a decline in CO2 emissions, it is important to note that although natural gas emits less CO2 than coal and petroleum when producing electricity, it is still a source of CO2 emissions and can only take us so far in decarbonizing the power sector. In addition, methane leakage across the supply chain remains a significant issue – and is not accounted for in this analysis, meaning the overall net greenhouse gas benefit from this natural gas expansion was potentially significantly lower.

Need for new policy

While there are positive signs in the power sector – the cost of renewables continues to decline and a growing number of states are taking crucial action to cut CO2 emissions – these trends, as well as the specific factors identified in this analysis, cannot be relied upon to achieve the deep emissions reductions needed in the decades ahead.

U.S. power sector CO2 emissions are projected to remain relatively flat over the next decade and rise slowly after that, absent new policies. This is particularly significant given that much of the decarbonization of other sectors, such as buildings and transportation, will need to rely heavily on electrification.

Ultimately, new policy interventions are necessary – including strong limits on climate pollution not only in the power sector but across the entire economy – to drive reductions at the pace and scale needed for the US to be 100% clean no later than 2050.

Posted in emissions, Markets 101

And the Nobel Prize goes to… Climate Economics

How newer research is building off Nordhaus’ past contributions

Äntligen! (Swedish—my native tongue—for “Finally!”) Last week, the Royal Swedish Academy of Sciences awarded the Nobel Prize in Economic Sciences to William Nordhaus for his pioneering work on “integrated assessment modeling” (IAM) – a framework which has made it possible to analyze the interplay between the economy and climate change, and the consequences of climate policies. And while the recognition of Nordhaus’ achievements is an encouraging sign that mainstream economics is starting to recognize the important contributions of environmental economics to the field, it’s critical that economists continue to build on and strengthen the research Nordhaus initiated.

Nordhaus started his research – in what was to become the now very active and expanding field of climate economics – as early as the 1970s. His fundamental contribution came in the early 1990s when he introduced his Dynamic Integrated model of Climate and the Economy (DICE), which became the foundational framework for the IAMs used today by the Intergovernmental Panel on Climate Change (IPCC) as well as by an Interagency Working Group to develop estimates of the Social Cost of Greenhouse Gas Emissions during the Obama Administration.

The novelty of DICE was the integration of findings across disparate disciplines including physics, chemistry, and economics to model the link between economic activity and carbon emissions, leading to higher atmospheric carbon concentration and related higher global average temperatures. Furthermore, his model linked this increase in average temperature to economic damages. This integrated framework laid out the principles for estimating the damaging impacts of greenhouse gas emissions (GHGs) on human welfare, and could therefore be used to calculate the social cost of greenhouse gas emissions and to study the consequences of climate policy interventions such as carbon pricing.
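To give a concrete sense of the framework, recent DICE versions specify aggregate damages with a simple quadratic form, shown here schematically (the exact functional form and calibration vary across model vintages):

```latex
% Schematic DICE-style damage function
% D(T) : climate damages as a fraction of gross world output
% T    : global mean temperature rise above pre-industrial (°C)
% \psi : calibrated damage coefficient (varies across DICE versions)
D(T) = \psi T^2
```

The quadratic form implies that damages grow more than proportionally with warming, which is one reason the choice of damage function is itself an active research area.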

In awarding him the Nobel Prize, The Royal Swedish Academy of Sciences recognized Nordhaus’ research as a methodological breakthrough and critical step forward – but one which does “not provide final answers.” While DICE, an acronym which nods to the perilous game we’re playing with the planet, laid the groundwork for the development of robust estimates of the social cost of GHGs by the Interagency Working Group (which experts agree reflect a lower bound), his research has also served to highlight how newer and ongoing research can further strengthen these estimates.

Such enhancements, which would further strengthen integrated assessment modeling, include:

  • Incorporating more of the many non-market health and environmental impacts still omitted from IAMs, by constructing more detailed damage functions (the assumed relationship between climatic changes and economic damages) grounded in empirical studies of climate impacts using real-world data, and by taking into account that the value of environmental assets relative to other goods and services may increase as they suffer a larger share of the damages from climate change.
  • Strengthening how inter- and intra-generational equity is taken into account.
    • The highly influential Stern Review, commissioned by the UK government in 2005, argued persuasively that Nordhaus put too little weight (through his choice of parameter values related to the discount rate) on the welfare of future generations, which resulted in lower estimates of economic damages. It spurred an academic debate leading to recommendations that governments instead use declining discount rates when evaluating public projects and policies with long-term impacts.
    • Climate change will impact different regions of the world very differently, with poorer regions generally hit worse than richer parts of the world. How well economists represent the spatial distribution of damages across regions of the world and the functional form and parameter values they choose for weighting differences in such damages significantly impact estimates of the social cost of greenhouse gas emissions.
  • Strengthening the way IAMs deal with risk and uncertainty – an inherently crucial element in any analysis of climate change – and the representation of so-called “tipping points” beyond which damages accelerate or become irreversible. This more recent research shows that such model enhancements also significantly increase estimates of the social cost of greenhouse gases, and underscores the vital importance of drastically reducing GHG emissions to insure against high-temperature catastrophic climate risks.

Nordhaus shares the Nobel Prize with Paul Romer, who was separately awarded for integrating technological innovations into long-run macroeconomic analysis and for his analyses of how markets develop new technologies. This is a very appropriate choice considering the importance of technological change for addressing climate change, and it gives this year’s prize the common theme of how to achieve both sustained and sustainable economic growth.

It is extremely timely that the Nobel Prize in Economics – recognizing Nordhaus’ work highlighting the critical role of carbon pricing and Romer’s work on the importance of technological innovation for long-run welfare – was announced on the same day the IPCC released its special report on the impacts of global warming of 1.5 °C, which shows the urgency of addressing climate change and how carbon pricing, technological innovation and diffusion all have important roles to play.


Posted in Uncategorized

Study: Renewables played crucial role in U.S. CO2 reductions

This blog was co-authored with Jonathan Camuzeaux, Adrian Muller, Marius Schneider and Gernot Wagner.

After a nearly 20-year upward trend, U.S. CO2 emissions from energy took a sharp and unexpected turn downwards in 2007. By 2013, the country’s annual CO2 emissions had decreased by 11% – a decline not witnessed since the 1979 oil crisis.

Experts have generally attributed this decrease to the economic recession, and to a huge surge in cheap natural gas displacing coal in the U.S. energy mix. But those same experts mostly overlooked another key factor: the parallel rise in renewable energy production from sources like wind and solar, which expanded substantially over the same 2007-2013 timeframe.

Between 2007 and 2013, wind-generated electricity grew almost five-fold to 168 TWh, and utility-scale solar grew from 0.6 TWh to 8.7 TWh. During the same period, bioenergy production grew 39 percent to 4,800 trillion BTUs.

Given these increases, how much did renewables contribute to the emissions reductions in the United States? In a paper published this month in the journal Energy Policy, we use a method called decomposition analysis to answer just that.

Unpacking the Factors

Decomposition analysis is an established method that enables us to separate the different factors influencing total CO2 emissions and identify the contribution of each to the observed decrease. The factors considered here are total energy demand, the share of gas in the fossil fuel mix (capturing the switch from coal and petroleum to gas), and the shares of renewables and nuclear energy in total energy production.
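As a stylized illustration of the method – a minimal two-factor sketch with made-up numbers, not the three-factor decomposition or the data used in our paper – the additive log-mean Divisia index (LMDI) splits an emissions change into contributions that sum exactly to the total:

```python
# Stylized two-factor LMDI decomposition of a change in CO2 emissions.
# Identity: E = D * I  (emissions = energy demand x carbon intensity).
# All numbers are made up for illustration.
from math import log

def log_mean(a: float, b: float) -> float:
    """Logarithmic mean -- the weighting at the heart of LMDI."""
    return (a - b) / (log(a) - log(b)) if a != b else a

# Hypothetical start (0) and end (1) values
D0, D1 = 100.0, 95.0    # primary energy demand (EJ)
I0, I1 = 6.4, 5.6       # carbon intensity (Mt CO2 per EJ)

E0, E1 = D0 * I0, D1 * I1
w = log_mean(E1, E0)

# Additive LMDI-I contributions: by construction they sum to E1 - E0
delta_demand = w * log(D1 / D0)
delta_intensity = w * log(I1 / I0)

print(f"total change:   {E1 - E0:+.1f} Mt")
print(f"from demand:    {delta_demand:+.1f} Mt")
print(f"from intensity: {delta_intensity:+.1f} Mt")
```

The exact additivity is what makes LMDI convenient for attributing an observed emissions decline to individual drivers; the analysis in the paper applies the same logic with more factors.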

Introducing a new approach for separately quantifying the contributions from renewables, we find that renewables played a crucial role in driving U.S. energy CO2 emissions down between 2007 and 2013 – something which has previously largely gone unrecognized.

According to our index decomposition analysis, of the total 640 million metric ton (Mt) decrease (11%) during that period, two-thirds resulted from changes in the composition of the U.S. energy mix (with the remaining third due to a reduction in primary energy demand). Of that, renewables contributed roughly 200 Mt of reductions – about a third of the total drop in energy CO2 emissions. That is about the same as the contribution of the coal- and petroleum-to-gas switch (215 Mt). Increases in nuclear generation, by contrast, contributed a relatively minor 35 Mt.

While the significant role of renewables in reducing CO2 emissions does not diminish the contribution of the switch to natural gas, it is important to note that the climate benefits of switching from coal and petroleum to gas are undermined by the presence of methane leakage along the natural gas supply chain, the extent of which is likely underestimated in national greenhouse gas (GHG) emissions inventories.

Methane, of course, is a powerful greenhouse gas. Methane leakage from increased natural gas use could have wiped out up to 30% of the short-term GHG benefit (on a CO2-equivalent basis) of switching from coal and petroleum to natural gas, as calculated in this paper. For the natural gas industry to truly sustain the claim that it has made a positive contribution to reducing the country’s carbon footprint, the methane emissions associated with natural gas must be substantially reduced.
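The mechanics can be seen in a back-of-the-envelope calculation. Every number below other than the 215 Mt figure quoted above (throughput, leakage rate, rounded GWP) is hypothetical and chosen only to illustrate the arithmetic, not to reproduce the paper’s 30% estimate:

```python
# Back-of-the-envelope: how supply-chain methane leakage erodes the
# CO2 benefit of a coal-to-gas switch. Throughput, leakage rate and the
# rounded GWP below are hypothetical illustration values.

gwp20_ch4 = 85.0          # approx. 20-year global warming potential of CH4
co2_saved = 215.0         # Mt CO2 avoided by the switch (figure from the text)
gas_throughput_ch4 = 50.0 # hypothetical methane throughput behind the switch (Mt)
leakage_rate = 0.02       # hypothetical 2% supply-chain leakage

leaked_ch4 = gas_throughput_ch4 * leakage_rate  # Mt CH4 leaked
leak_co2e = leaked_ch4 * gwp20_ch4              # Mt CO2e on a 20-yr basis
offset_share = leak_co2e / co2_saved

print(f"leaked CH4: {leaked_ch4:.1f} Mt -> {leak_co2e:.0f} Mt CO2e (20-yr basis)")
print(f"share of the CO2 benefit offset: {offset_share:.0%}")
```

Even a small leakage rate translates into a large CO2-equivalent penalty because of methane’s high short-term warming potential, which is why leakage estimates matter so much for the net benefit of the gas switch.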

These results show that past incentives to support the expansion of renewable energy have been successful in reducing the country’s emissions, and that the decreasing cost of renewable energy offers some hope for continued progress despite the current administration’s refusal to address climate change.

Such progress, however, will never be sufficient without ambitious climate and clean energy policies – whether at the federal or the state level – that can drive further emission reductions.

Posted in emissions, Energy efficiency