Market Forces

And the Nobel Prize goes to… Climate Economics

How newer research is building off Nordhaus’ past contributions

Äntligen! (Swedish—my native tongue—for “Finally!”) Last week, the Royal Swedish Academy of Sciences awarded the Nobel Prize in Economic Sciences to William Nordhaus for his pioneering work on “integrated assessment modeling” (IAM) – a framework which has made it possible to analyze the interplay between the economy and climate change, and the consequences of climate policies. And while the recognition of Nordhaus’ achievements is an encouraging sign that mainstream economics is starting to acknowledge the important contributions of environmental economics to the field, it’s critical that economists continue to build on and strengthen the research Nordhaus initiated.

Nordhaus began his research in what was to become the now very active and expanding field of climate economics as early as the 1970s. His fundamental contribution came in the early 1990s, when he introduced his Dynamic Integrated model of Climate and the Economy (DICE), which became the foundational framework for the IAMs used today by the Intergovernmental Panel on Climate Change (IPCC), as well as by the Interagency Working Group that developed estimates of the Social Cost of Greenhouse Gas Emissions during the Obama Administration.

The novelty of DICE was its integration of findings from disparate disciplines, including physics, chemistry, and economics, to model how economic activity generates carbon emissions, raising atmospheric carbon concentrations and, with them, global average temperatures. His model then linked this increase in average temperature to economic damages. This integrated framework laid out the principles for estimating the damaging impacts of greenhouse gas (GHG) emissions on human welfare, and could therefore be used to calculate the social cost of greenhouse gas emissions and to study the consequences of climate policy interventions such as carbon pricing.
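
To see the structure concretely, here is a minimal, deliberately toy sketch of the causal chain an IAM formalizes: output drives emissions, emissions raise concentrations, concentrations raise temperature, and temperature feeds back as damages to output. This is not DICE itself, and every parameter value below (emissions intensity, airborne fraction, climate sensitivity, the quadratic damage coefficient) is an illustrative assumption.

```python
import math

# Toy integrated-assessment chain (NOT the DICE model; all parameter values
# below are illustrative assumptions chosen only to show the structure).
def toy_climate_economy(gross_output,              # world output, $ trillions
                        emissions_intensity=0.3,   # GtCO2 emitted per $trillion of output (assumed)
                        airborne_fraction=0.5,     # share of emissions staying in the atmosphere (assumed)
                        gtco2_per_ppm=7.8,         # rough conversion: GtCO2 per ppm of CO2
                        climate_sensitivity=3.0,   # deg C of warming per doubling of CO2 (assumed)
                        preindustrial_ppm=280.0,
                        current_ppm=410.0,
                        damage_coeff=0.0023):      # quadratic damage coefficient (assumed)
    """One pass through output -> emissions -> concentration -> temperature -> damages."""
    emissions = emissions_intensity * gross_output
    concentration = current_ppm + airborne_fraction * emissions / gtco2_per_ppm
    warming = climate_sensitivity * math.log2(concentration / preindustrial_ppm)
    damage_share = damage_coeff * warming ** 2          # damages as a share of gross output
    net_output = gross_output * (1 - damage_share)
    return warming, damage_share, net_output

warming, damage_share, net_output = toy_climate_economy(gross_output=100.0)
print(f"warming ≈ {warming:.2f} °C, damages ≈ {damage_share:.1%} of output, net output ≈ {net_output:.1f}")
```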

In awarding him the Nobel Prize, the Royal Swedish Academy of Sciences recognized Nordhaus’ research as a methodological breakthrough and a critical step forward – but one which does “not provide final answers." While DICE, an acronym that nods to the perilous game we’re playing with the planet, laid the groundwork for the Interagency Working Group’s robust estimates of the social cost of GHGs (which experts agree reflect a lower bound), it has also highlighted how newer and ongoing research can further strengthen these estimates.

Such enhancements include:

  • Incorporating more of the many non-market health and environmental impacts still omitted from IAMs. This means constructing more detailed damage functions (the assumed relationship between climatic changes and economic damages) that are grounded in empirical studies of climate impacts using real-world data, and taking into account that the value of environmental assets relative to other goods and services may rise as those assets suffer a larger share of the damages from climate change.
  • Strengthening how inter- and intra-generational equity is taken into account.
    • The highly influential Stern Review, commissioned by the UK government in 2005, argued persuasively that Nordhaus put too little weight (through his choice of parameter values related to the discount rate) on the welfare of future generations, which resulted in lower estimates of economic damages. The review spurred an academic debate that has led to recommendations that governments instead use declining discount rates when evaluating public projects and policies with long-term impacts (a back-of-the-envelope illustration of how much the discount rate matters appears after this list).
    • Climate change will impact different regions of the world very differently, with poorer regions generally hit harder than richer parts of the world. How well economists represent the spatial distribution of damages across regions, and the functional form and parameter values they choose for weighting differences in those damages, significantly affect estimates of the social cost of greenhouse gas emissions.
  • Strengthening the way IAMs deal with risk and uncertainty – an inherently crucial element in any analysis of climate change – and the representation of so-called “tipping points” beyond which damages accelerate or become irreversible. This more recent research shows that such model enhancements also significantly increase estimates of the social cost of greenhouse gases, and it underscores the vital importance of drastically reducing GHG emissions to insure against high-temperature catastrophic climate risks.
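
The discount-rate point in particular comes down to simple arithmetic. The sketch below uses purely illustrative numbers to show how the present value of $1 trillion of damages incurred a century from now shrinks as the (constant) discount rate rises; a declining-rate schedule would land between the low and high cases.

```python
# Present value of $1 trillion of climate damages incurred 100 years from now,
# under different constant discount rates (illustrative numbers only).
damages_in_100_years = 1e12
for rate in (0.015, 0.03, 0.045):   # roughly Stern-like to Nordhaus-like rates
    present_value = damages_in_100_years / (1 + rate) ** 100
    print(f"discount rate {rate:.1%}: present value ≈ ${present_value / 1e9:,.0f} billion")
```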

Nordhaus shares the Nobel Prize with Paul Romer, who was recognized for integrating technological innovation into long-run macroeconomic analysis and for his work on how markets develop new technologies. This is a fitting pairing given the importance of technological change for addressing climate change, and it gives this year’s prize a common theme: how to achieve economic growth that is both sustained and sustainable.

The timing could hardly be better: the Nobel Prize honoring Nordhaus’ work on the critical role of carbon pricing and Romer’s work on the importance of technological innovation for long-run welfare was announced the same day the IPCC released its special report on the impacts of global warming of 1.5 °C, which underscores the urgency of addressing climate change and shows that both carbon pricing and technological innovation and diffusion have important roles to play.

 

 


What California’s history of groundwater depletion can teach us about successful collective action

California’s landscape will transform in a changing climate. While extended drought and recent wildfire seasons have sparked conversations about acute impacts today, the prospect of changes to come is no less worrying, particularly for water management.

These changes will make water resources less reliable when they are needed most, rendering water storage an even more important feature of the state’s water system.

One promising option for new storage makes use of groundwater aquifers, which enable water users to smooth water consumption across time – saving in wet times and extracting during drought. However, when extraction exceeds recharge over the long term, “overdraft” occurs. Falling water tables increase pumping costs, reduce stored water available for future use, and entail a host of other collateral impacts. Historically, California’s basins have experienced substantial overdraft.

Falling water tables reflect inadequate institutional rules

One cause of the drawdown is California’s history of open-access management. Any landowner overlying an aquifer can pump water, encouraging a race to extract. Enclosing the groundwater commons and thereby constraining the total amount of pumping from each aquifer is critical for achieving efficient use and providing the volume and reliability of water storage that California will need in the future. However, despite evidence of substantial long-run economic gain from addressing the problem, only a few groups of users in California have successfully adopted pumping regulations that enclose the groundwater commons.
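
The “race to extract” logic can be made concrete with a stylized two-pumper example. This is only a sketch with made-up numbers (it is not a model from the research discussed below): each acre-foot pumped is worth the same to both users, but the per-unit pumping cost rises with total extraction as the water table falls, so open-access extraction settles at a higher level, and a lower joint benefit, than collective management would choose.

```python
# Two identical pumpers sharing one aquifer. Each acre-foot is worth P, and the
# per-unit pumping cost rises with total extraction (a falling water table).
# All numbers are illustrative assumptions.
P, c = 120.0, 1.0

def net_benefit(own, other):
    return own * (P - c * (own + other))

# Open access: each pumper best-responds to the other until extraction settles.
x = [30.0, 30.0]
for _ in range(200):
    for i in range(2):
        x[i] = max(0.0, (P - c * x[1 - i]) / (2 * c))

# Collective management: choose a common pumping level that maximizes joint benefit.
joint, q_best = max((2 * net_benefit(q / 10, q / 10), q / 10) for q in range(0, 1200))

print(f"open access: {x[0]:.0f} acre-feet each, joint benefit ${2 * net_benefit(x[0], x[1]):,.0f}")
print(f"managed:     {q_best:.0f} acre-feet each, joint benefit ${joint:,.0f}")
```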

SGMA addresses overdraft—but pumpers must agree to terms

California’s Sustainable Groundwater Management Act (SGMA) of 2014 aims to solve this challenge by requiring stakeholders in overdrafted basins to form Groundwater Sustainability Agencies (GSAs) and create plans for sustainable management. However, past negotiations have been contentious, and old disagreements over how best to allocate the right to pump linger. The map below illustrates how fragmentation in historical Groundwater Management Plans tracks with current fragmentation in GSAs under SGMA. Such persistent fragmentation suggests fundamental bargaining difficulties remain.

Spatial boundaries of self-selected management units within basins under SGMA (GSAs) mirror those of previous management plans (GMPs). Persistent fragmentation may signal that adoption of SGMA doesn’t mean the fundamental bargaining difficulties facing the basin users have disappeared.

New research, co-authored with Eric Edwards (NC State) and Gary Libecap (UC, Santa Barbara) and published in the Journal of Environmental Economics and Management, provides broad insights into where breakdowns occur and which factors determine whether collective action to constrain pumping is successful. From it, we’ve gleaned four suggestions for easing SGMA implementation.

Understanding the costs of contracting to restrict access

To understand why resource users often fail to adopt new institutional rules, it’s important to consider the individual economic incentives of the various pumpers. Even when they broadly agree that groundwater extraction is too high, collective action often stalls when users disagree about how to limit it. When some pumpers stand to lose economically from restricting water use, they will fight change, creating obstacles to addressing over-extraction. When arranging side payments or other institutional concessions is difficult, these obstacles increase the economic costs of negotiating agreement, termed “contracting costs.”

To better understand the sources of these costs in the context of groundwater, we compared basins that have adopted effective institutions in the past with otherwise similar basins where institutions are fragmented or missing. Even when controlling for the level of benefits, we found that failures of collective action are linked to the size of the basin and its user group, as well as to variability in water use type and in the spatial distribution of recharge. When pumpers differ in how they value water and in where they sit over the aquifer, the high costs of negotiating agreement inhibit successful adoption of management institutions, and overdraft persists. Indeed, in many of California’s successfully managed basins, consensus did not emerge until much of the farmland had been urbanized, homogenizing user demand on the resource.

Four key takeaways to ease agreement

In the face of such difficult public choices, how can pumpers and regulators come to agreement? Four main recommendations result from our research:

  • Define and allocate rights in a way that compensates users who face large losses from cutbacks in pumping. Tradable pumping rights can help overcome opposition. Pumpers can sell unused rights and are oftentimes made better off. The option to sell also incentivizes efficient water use.
  • Facilitate communication to reduce costs of monitoring and negotiations. The Department of Water Resources has already initiated a program to provide professional facilitation services to GSAs.
  • Promote and accept tailored management. Stakeholders and regulators should remain open to approaches that reduce contracting costs by addressing issues without defining allocations or attempting to adopt the most restrictive rules uniformly throughout the basin. For example, pumpers have successfully adopted spatially restricted management rules to address overdraft that leads to localized problems; others have adopted well-spacing restrictions that reduce well interference without limiting withdrawals.
  • Encourage exchange of other water sources. Imported, non-native surface water may lower contracting costs because it can save users from large, costly cutbacks. Pumpers have written contracts to share imported water in order to avoid bargaining over a smaller total pie; where such water is available, exchange pools (such as those described here) can help to limit the costs of adjustment.

SGMA is a large-scale public experiment in collective action. To avoid the failures of previous attempts to manage groundwater, stakeholders crafting strategies for compliance and regulators assessing them should keep in mind the difficult economic bargaining problem pumpers face. Hopes for effective, efficient, and sustainable water management in California depend on it.


New analyses agree carbon pricing is a powerful solution

This post is co-authored with Steve Koller

To tackle dangerous climate change at the pace and scale the science demands, we must take advantage of every cost-effective opportunity to cut pollution now. Several recent analyses from leading experts on the impacts of carbon pricing demonstrate once again why flexible, market-based policy is the most effective and efficient tool we have to address dangerous climate change.

These studies reaffirm that penalizing pollution and requiring companies to pay for their contribution to climate change can help the United States achieve needed reductions while generating substantial revenue. What’s more, none of these studies even account for the enormous benefits of averting climate change impacts.

While these studies examine carbon taxes (which place a price on pollution and allow the economy to respond), approaches that establish overall declining pollution limits and allow the market to determine the price can achieve similar pollution reductions and economic outcomes. But since uncertainty about market factors and technological trends prevents even the most robust economic modeling from providing guarantees, it is crucial that any carbon tax policy be linked to clear, concrete pollution reduction goals and include transparent provisions to help ensure those goals are met. A policy where the price is derived from overall enforceable pollution limits already includes those assurances.

The analyses by the Stanford Energy Modeling Forum (EMF 32, comprising 11 leading modeling teams), Columbia University’s Center on Global Energy Policy (CGEP), and the U.S. Energy Information Administration (EIA) examine a range of scenarios with price paths from $15 to $50 per ton and annual price escalators from 1 to 5 percent, along with various ways of distributing the revenue. In addition, Resources for the Future and Columbia modeled the carbon tax bill recently introduced in the House by Representative Curbelo and co-sponsors, which starts at $24 per ton and rises 2 percent annually.

Let’s take a look at four key takeaways across analyses:

1. National policy that puts a price on carbon could significantly reduce climate pollution

In all scenarios that examine an economy-wide price, pollution reductions are consistent with meeting or exceeding the U.S. Paris Agreement commitment of cutting emissions 26 to 28 percent below 2005 levels by 2025. On our current path, we will almost certainly fall short of those goals, according to recent analysis from the Rhodium Group.

However, the analyses also show that to achieve deeper reductions aligned with long-term, science-based targets (for example, net-zero emissions by mid-century), we will likely need pricing paths at the more ambitious end of the spectrum—as well as companion policies to help the most difficult sectors decarbonize.

Of course, a key advantage of pricing pollution is that it will spur innovation—encouraging new technologies and approaches to slashing emissions that today’s models cannot foresee. These advances could allow us to meet our goals at lower costs than anticipated—exactly what’s happened with the well-designed, market-based U.S. acid rain program.

The EMF 32 results also underscore that the starting price matters for reductions in the short term, while the rate of increase over time is what bends the emissions curve down in the long term. For example, in the first decade, the $50 plus 1 percent path achieves roughly 40 percent more cumulative emissions reductions than the $25 plus 5 percent scenario. By 2038, however, cumulative reductions under the $25 plus 5 percent path exceed those under the $50 plus 1 percent path, and cumulative emissions through 2050 are similar. This dynamic is important because cutting pollution now is good for the climate, but we also need to sustain the decline over the long term. Ultimately, the total cumulative climate pollution in the atmosphere is what matters.
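
The arithmetic behind that comparison is easy to reproduce for the price paths themselves. The sketch below only compares the two tax trajectories (the emissions responses are what the EMF 32 teams actually model), but it shows why the low-start, fast-rising path eventually catches up:

```python
# Two carbon-price trajectories: high start with a slow escalator versus
# low start with a fast escalator. Prices only; the emissions responses are
# what the models estimate.
for year in range(0, 31, 5):
    high_start = 50 * 1.01 ** year   # $50/ton rising 1 percent per year
    low_start = 25 * 1.05 ** year    # $25/ton rising 5 percent per year
    print(f"year {year:2d}:  ${high_start:6.2f}  vs  ${low_start:6.2f}")
# The low-start path overtakes the high-start path after roughly 18 years,
# which is why its cumulative reductions eventually catch up.
```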

2. Carbon pricing has extremely minor impacts on the economy—without accounting for the economic benefits of avoided climate change

Both the EMF 32 and CGEP results suggest that GDP would continue to grow at or near historical rates across scenarios—and the effect could be net positive, depending on how revenue is used. Additionally, despite misleading rhetoric from opponents of climate action, a carbon price would have an extremely small net effect on employment. A recent analysis from Resources for the Future suggests that the net employment impact of a $40 tax would be less than half of one percent, and possibly even lower. And while many other studies confirm that net impacts on employment are likely to be small, they note that even mainstream modeling efforts tend to overestimate those impacts by a factor of 2.5 or more. Meanwhile, national climate policy would mean investing in the clean energy revolution: the economic engine of the future.

None of these analyses even consider the economic benefits associated with slashing climate pollution. Citibank estimates that climate change will cause tens of trillions of dollars in damages if left unchecked. These analyses also do not account for the additional associated benefits such as improvements in air quality. Taken together, these benefits make an overwhelming economic case for reducing pollution as soon as possible.

3. The lion’s share of reductions occurs in the power sector, underscoring the importance of companion policies in other sectors

All analyses of an economy-wide price find that the vast majority of reductions occur in the power sector, driven primarily by declines in coal consumption. In the analyses examining $50 per ton scenarios, Columbia shows that approximately 80 percent of economy-wide emissions reductions occur in the power sector with a significant shift towards renewable energy, and EMF 32 results predict that coal demand reaches near-zero by 2030. This is consistent with modeling analysis conducted by the United States Mid-Century Strategy for Deep Decarbonization in 2016.

Some other sectors, notably transportation, tend to be less responsive to carbon pricing in the models—at least in the short term. Both Columbia and EMF 32 find that transportation sector emissions only drop a few percentage points relative to 2005 levels by 2030 even in the higher pricing scenarios. These results underscore the importance of policies that put a firm limit on pollution across the economy as well as companion policies that can help address specific barriers to change in sectors that will be more difficult to decarbonize.

4. How revenue is used matters

Carbon pricing has the potential to raise significant revenue—for example, just under a trillion dollars over the first 5 years with a $40 price, rising at 5 percent. How revenue is used plays a significant role in overall economic impacts as well as the distribution of those impacts across regions and populations.
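
That revenue figure is easy to sanity-check with rough arithmetic. The sketch below assumes roughly 5 billion tons of covered CO2 per year, in the neighborhood of recent U.S. energy-related emissions, and ignores the emissions decline the tax itself would induce, which is why it lands a bit above the "just under a trillion" figure:

```python
# Rough sanity check on carbon-tax revenue: $40/ton rising 5 percent per year,
# applied to an assumed ~5 billion tons of covered CO2 annually. Ignores the
# emissions response to the tax, so it overstates actual revenue somewhat.
covered_tons = 5.0e9
price = 40.0
revenue = 0.0
for _ in range(5):
    revenue += price * covered_tons
    price *= 1.05
print(f"first five years ≈ ${revenue / 1e12:.2f} trillion")
```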

For example, CGEP’s analysis finds that certain revenue recycling approaches—including using revenues to reduce payroll taxes or the national debt—result in larger long-run economic growth than scenarios without a carbon price. The EMF results find that using revenues to reduce capital income taxes generally achieves the highest GDP growth of the scenarios considered, but these gains are disproportionately captured by the wealthy.

Alternatively, revenue can be directed not only to avoid this sort of inequitable distribution of benefits, but also to protect low-income families and disadvantaged communities, who already bear a disproportionate share of the burden of climate change and air pollution and are more sensitive to changes in energy costs. For example, Columbia’s analysis shows that approaches putting even a small portion of revenue back into the pockets of American households can compensate those in the lowest income quintile for potential increases in energy costs. The Center on Budget and Policy Priorities has also demonstrated how carbon pricing can be designed to fully offset impacts of the policy on the most vulnerable households and provide considerable support for others, while leaving significant revenue to spare.

While assumptions and model structures may differ, bottom line findings all point in the same direction: well-designed, national carbon pricing policy can spark deep reductions in climate pollution alongside economic growth, while spurring technological innovation and protecting vulnerable populations.

The “price” itself is only one part of effective climate policy. We need firm declining limits on pollution to accompany a price and ensure environmental outcomes, as well as a portfolio of approaches working together to ensure that investment and innovation are happening where it matters. A pricing program can be a catalyst for driving additional climate solutions at the federal, state, and local level, while other policies can share the burden and tackle the problem from multiple angles. This model has already proven itself in California, where the state has hit pollution reduction targets even earlier and at lower cost than anticipated.

To be successful, we need bipartisan leadership and a serious discussion about meaningful solutions. The United States can and must address the challenge by working together in the best interest of all Americans to put in place ambitious, effective, and fair climate policy.


Linking in a world of significant policy uncertainty

This guest blog was co-authored with Thomas Sterner

And then there were three. As of January 1st, 2018, Ontario has joined California and Québec, linking their respective carbon markets. In a post-Paris world of bottom-up climate policy, linking of climate policy matters. It provides a concrete step forward on the Paris Declaration on Carbon Pricing in the Americas. It shows that, while the U.S. federal government is dismantling much-needed climate protections, states, together with Canadian provinces, are moving forward. Linking, if done right, can be a powerful enabler of greater ambition. It also raises important questions.

To be clear, there are real advantages to linking carbon markets: Linking of climate policies is a political affirmation of joint goals and a signal to others to move toward concerted carbon policies. It also shows the real advantages of market-based environmental policies. Bigger markets also afford greater cost-savings opportunities.

The textbook illustration of such savings is instructive. Take two jurisdictions, the high-cost abatement area (“H”) and the low-cost abatement area (“L”), with vastly different marginal costs (MC) of abatement. The total costs of abatement, the respective shaded areas in this graph, will be vastly different, too:

[Figure: unlinked markets, showing each jurisdiction’s marginal abatement cost curve and the shaded area representing its total abatement cost]

Now consider the idealized linked market. Total abatement (ΣX) will remain the same. The difference? Prices equilibrate across markets, with P_L now equal to P_H, lowering the total cost of achieving the same tons of carbon dioxide-equivalent (CO2e) abated:

[Figure: linked market, showing the single equilibrium allowance price across both jurisdictions and the lower combined abatement cost]

Abatement costs clearly matter. The lower the costs of achieving the same goal, the better. All else equal, the two jurisdictions can now afford to abate more at the same cost.
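
A minimal numerical version of the textbook picture makes the savings concrete. The linear marginal-cost curves and all of the numbers below are assumptions chosen purely for illustration:

```python
# Two jurisdictions with linear marginal abatement cost curves MC = slope * x.
# Each must abate 100 tons on its own; linking reallocates abatement until
# marginal costs (and hence allowance prices) equalize. Illustrative numbers.
slope_H, slope_L = 4.0, 1.0        # $/ton per ton abated (H = high-cost, L = low-cost)
target_each = 100.0                # tons each jurisdiction must abate before linking

def total_cost(slope, x):
    return 0.5 * slope * x ** 2    # area under a linear MC curve

autarky = total_cost(slope_H, target_each) + total_cost(slope_L, target_each)

# Linked market: same total abatement, split so that slope_H * x_H = slope_L * x_L.
total_abatement = 2 * target_each
x_H = total_abatement * slope_L / (slope_H + slope_L)
x_L = total_abatement - x_H
linked = total_cost(slope_H, x_H) + total_cost(slope_L, x_L)

print(f"common price after linking: ${slope_H * x_H:.0f}/ton")
print(f"total abatement cost: ${autarky:,.0f} unlinked vs ${linked:,.0f} linked")
```

With these particular numbers, the linked market delivers the same 200 tons of abatement for $16,000 instead of $25,000, which is the cost-savings argument in a nutshell.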

Will all else indeed be equal?

It is clear the world needs to do a lot more to stabilize greenhouse-gas concentrations. That means quickly bringing net emissions of carbon dioxide into the atmosphere down to zero.

There, too, the simple textbook case can be instructive. Linking implies sending money from country H to country L to pay for the cheaper abatement. This raises important questions of baselines, accounting, and transparency. Moreover, lower abatement costs are not the only objective of climate policies. Direct support for the deployment of new, cleaner technologies often tops the list. Given the political economy of reducing greenhouse-gas emissions in the first place, there are many competing domestic objectives and indeed real tradeoffs that need attention.

The big question, then, is what linkage does to the overall level of policy ambition. Lower costs imply the potential for more ambitious policies. That is clearly good, but the devil is in the details. It is important to ensure that coordination and collaboration among different jurisdictions really do raise the level of ambition, as the Paris Declaration pledges.

It is also clear that climate policy overall ought to have a balance of bottom-up and top-down policies. Linkage is one potentially important element in that equation. The ultimate measure, however, is tons of greenhouse gases abated from the atmosphere.


Moody’s Challenge: Prepare for Climate Change or Risk Credit Rating Downgrades

This post was co-authored by Aurora Barone

In the face of havoc wrought by recent storms and hurricanes, Moody’s Investors Service, Inc. has declared that state and local bond issuers must account for climate change or face downgrades. It is the first of the three major credit rating agencies to incorporate climate change risks into its ratings assessments, a move that may incentivize policymakers to make smarter, long-term investments in resilience efforts like stormwater systems or flood management programs.

Bond rating agencies like Moody’s help investors determine the risk of companies and governments defaulting on repayments. Revenue, debt levels, and financial management are all common measures of creditworthiness.

States at high risk–mainly on the coast–including Texas, Florida, Georgia and Mississippi, will have to account for how they are preparing for the adverse effects of climate change, including the effects of storms and floods, which are predicted to become more frequent and intense as temperatures climb.

In its report to clients, Moody’s outlined the parameters it will use to assess the “exposure and overall susceptibility of U.S. states to the physical effects of climate change.” These include an area’s economic, institutional, and fiscal strength, as well as its susceptibility to event risk – all of which influence a borrower’s ability to repay debt. Coastal risks, like rising sea levels and flooding, and an increase in the frequency of extreme weather events, like tornadoes, wildfires, and storms, are just a few of the indicators that will be incorporated into the rating.

This wasn’t always the case. Take New Jersey’s Ocean County, for example. In 2012, Hurricane Sandy devastated Seaside Heights, destroying local businesses and oceanfront properties. Yet, last summer, Ocean County sold $31 million in bonds maturing over 20 years – bonds which received a perfect triple-A rating from both Moody’s and S&P Global Ratings. In 2016, major bond companies issued triple-A ratings on long-term bonds for Hilton Head and Virginia Beach, despite the U.S. Navy’s warnings that the latter faced severe threats from climate change. A recent World Bank study calculated the future urban losses that many coastal cities may face because of climate change; Miami, New York, New Orleans, and Boston ranked highest in overall risk. In March 2016, Moody’s and S&P gave top ratings to $150 million in Boston bonds maturing over 20 years, evidently not accounting for any associated climate risks.

In Moody’s new effort to incorporate the risk of climate change into its ratings, it is trying to account for “immediate and observable impacts on an issuer’s infrastructure, economy and revenue base, and environment” as well as economic challenges that may result, such as “smaller crop yields, infrastructure damage, higher energy demands, and escalated recovery costs”.

The hope: facing the threat of a rating downgrade and more expensive debt, local governments will move to implement major adaptation and resilience projects as a way to entice investors – and, of course, to plan for the effects of climate change.

 


Why rolling back common-sense rules puts taxpayers on the hook for future disasters

This post was co-authored with Beia Spiller

Since 2000, major flood and hurricane disasters have cost the nation $499.5 billion – more than double what floods cost us from 1980 to 1999 – and that figure doesn’t even reflect damages from Hurricanes Harvey, Irma, or Maria. You’d think that at a time when our nation faces greater threats from extreme weather, reducing the economic and social costs of flood disasters would be a top priority.

Instead, President Trump rescinded a requirement that federal agencies take future flood risks into greater consideration for federal projects in or affecting floodplains, setting us up for future fiscal disaster. Given the number and size of this year’s hurricanes, and the devastation they have wrought on millions of Americans, it’s clear that we aren’t doing enough to reduce the costs of these disasters. Yet President Trump’s action, if uncorrected, will increase those costs. Instead of rolling back common-sense rules meant to protect taxpayers, Congress and the administration should be ensuring that our federal investments can better withstand the impacts of flooding.

Where and how we build helps us better cope with disasters and saves money

The two best ways to minimize flood damage losses are: building outside of floodplains and building structures capable of coping with flooding. Federal agencies should be held accountable for implementing these proven best practices.

According to the Department of Homeland Security (DHS), benefits of implementing stronger building codes for natural hazards include savings from lowered insurance rates, increased property values, and reduced losses during floods. When building codes offer enhanced protection against the threats of flood-related disasters, communities recover faster and reduce the fiscal pressure on governments responding to damages.

Furthermore, designing for resiliency can be cost-effective. According to one study evaluating the effectiveness of flood building codes, constructing new buildings to withstand floods by increasing their elevation usually costs less than 1% of the total building cost for each foot they are raised. And, given the risks of flooding over time, these investments were found to pay for themselves in as little as one or two years in areas with the highest risk of flooding. It’s noteworthy that buildings constructed after Hurricane Andrew, under the more rigorous codes adopted in its wake, withstood Hurricane Irma.
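
A rough payback calculation shows why. Every number below is an assumption chosen for illustration rather than a figure from the study cited above:

```python
# Back-of-the-envelope payback on elevating a new building in a high-risk floodplain.
# Illustrative assumptions only.
building_cost = 300_000            # total construction cost, $
feet_raised = 2
extra_cost = feet_raised * 0.01 * building_cost   # ~1% of building cost per foot raised

avoided_losses_per_year = 4_000    # assumed expected annual flood damages avoided, $
print(f"extra cost ${extra_cost:,.0f}, payback ≈ {extra_cost / avoided_losses_per_year:.1f} years")
```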

In light of this, it is ironic that the most hurricane-prone state in the country may retreat from its renowned building code system: Florida Governor Rick Scott has signed into law changes to the state’s system, which had been adopted after Hurricane Andrew. The changes include reducing inspections and the frequency of code updates, and allowing further code revisions to pass with fewer votes from the state’s Building Commission. The latter is seen by many as an opportunity for the Commission, which is dominated by contractors and construction firms, to further weaken codes that have been regarded as some of the best in the country.

From 1978 to 2016, FEMA paid out more than $59 billion (in 2016 dollars) for losses associated with significant floods, with 76% of those payments occurring after 2004. Importantly, the average paid loss has increased almost 2.5-fold since 1978, even after accounting for inflation. Weakening building codes and standards will only lead to more frequent and more costly FEMA payouts, with taxpayers footing the bill. In the long run, these actions are at odds with an administration that preaches fiscal conservatism.

Investing now to save in the future

Instead of taking such unnecessary risks, cities and states should adopt more stringent risk-informed building codes and zoning, so we can start building now for a more reliable, sustainable and resilient infrastructure. Similarly, the administration should enhance flood resilience standards for federal investments, including those made as part of disaster recovery, to reduce the costs of flooding today and in the future. Doing so will improve long-term protection of human health and welfare. If we build smarter now, communities, taxpayers and nature will reap rewards in the future.
