Market Forces

What California’s history of groundwater depletion can teach us about successful collective action

California’s landscape will transform in a changing climate. While extended drought and recent wildfire seasons have sparked conversations about acute impacts today, the prospect of changes to come is no less worrying, and it poses serious challenges for water management.

These changes will make water resources less reliable when they are needed most, rendering water storage an even more important feature of the state’s water system.

One promising option for new storage makes use of groundwater aquifers, which enable water users to smooth water consumption across time – saving in wet times and extracting during drought. However, when extraction exceeds recharge over the long term, “overdraft” occurs. Falling water tables increase pumping costs, reduce stored water available for future use, and entail a host of other collateral impacts. Historically, California’s basins have experienced substantial overdraft.
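The save-in-wet-times, draw-in-droughts logic, and what goes wrong under overdraft, can be sketched with a toy water balance. All numbers below are illustrative, not calibrated to any real California basin:

```python
# Toy aquifer balance: storage smooths use across wet and dry years,
# but sustained pumping above average recharge draws the stock down.

def simulate(storage, recharge, pumping):
    """Return end-of-year storage levels given annual recharge and pumping."""
    levels = []
    for r, p in zip(recharge, pumping):
        storage = max(storage + r - p, 0.0)  # the stock cannot go negative
        levels.append(storage)
    return levels

wet_dry = [120, 80, 120, 80, 120, 80]   # recharge, thousand acre-feet/year

# Balanced use: pumping equals average recharge, so storage is stable.
balanced = simulate(1000, wet_dry, [100] * 6)

# Overdraft: pumping persistently exceeds recharge, so storage declines.
overdraft = simulate(1000, wet_dry, [130] * 6)

print(balanced[-1])   # 1000.0: the stock returns to its starting level
print(overdraft[-1])  # 820.0: a 30/year average deficit compounds over time
```

The declining stock in the second run is the signature of overdraft: cheaper pumping today traded against higher lift costs and less drought insurance tomorrow.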

Falling water tables reflect inadequate institutional rules

One cause of the drawdown is California’s history of open-access management. Any landowner overlying an aquifer can pump water, encouraging a race to extract. Enclosing the groundwater commons and thereby constraining the total amount of pumping from each aquifer is critical for achieving efficient use and providing the volume and reliability of water storage that California will need in the future. However, despite evidence of substantial long-run economic gain from addressing the problem, only a few groups of users in California have successfully adopted pumping regulations that enclose the groundwater commons.

SGMA addresses overdraft—but pumpers must agree to terms

California’s Sustainable Groundwater Management Act (SGMA) of 2014 aims to solve this challenge by requiring stakeholders in overdrafted basins to form Groundwater Sustainability Agencies (GSAs) and create plans for sustainable management. However, past negotiations have been contentious, and old disagreements over how best to allocate the right to pump linger. The map below illustrates how fragmentation in historical Groundwater Management Plans (GMPs) tracks current fragmentation in GSAs under SGMA. Such persistent fragmentation suggests fundamental bargaining difficulties remain.

Spatial boundaries of self-selected management units within basins under SGMA (GSAs) mirror those of previous management plans (GMPs). Persistent fragmentation may signal that the fundamental bargaining difficulties facing basin users have not disappeared with SGMA’s adoption.

New research, co-authored with Eric Edwards (NC State) and Gary Libecap (UC Santa Barbara) and published in the Journal of Environmental Economics and Management, provides broad insights into where breakdowns occur and which factors determine whether collective action to constrain pumping succeeds. From it, we’ve gleaned four suggestions for easing SGMA implementation.

Understanding the costs of contracting to restrict access

To understand why resource users often fail to adopt new management rules, it’s important to consider the individual economic incentives of the various pumpers. Even when they broadly agree that groundwater extraction is too high, collective action often stalls when users disagree about how to limit it. When some pumpers stand to lose economically from restricting water use, they will fight change, creating obstacles to addressing over-extraction. When arranging side payments or other institutional concessions is difficult, these obstacles increase the economic costs of negotiating agreement, termed “contracting costs.”

To better understand the sources of these costs in the context of groundwater, we compare basins that have adopted effective institutions in the past with otherwise similar basins where institutions are fragmented or missing. Even when controlling for the level of benefits, we found that failures of collective action are linked to the size of the basin and its user group, as well as variability in water use type and the spatial distribution of recharge. When pumpers vary in their water valuation and placement over the aquifer, the high costs of negotiating agreement inhibit successful adoption of management institutions, and overdraft persists. Indeed, in many of California’s successfully managed basins, consensus did not emerge until much farmland was urbanized, resulting in a homogenization of user demand on the resource.

Four key takeaways to ease agreement

In the face of such difficult public choices, how can pumpers and regulators come to agreement? Four main recommendations result from our research:

  • Define and allocate rights in a way that compensates users who face large losses from cutbacks in pumping. Tradable pumping rights can help overcome opposition. Pumpers can sell unused rights and are often made better off. The option to sell also incentivizes efficient water use.
  • Facilitate communication to reduce costs of monitoring and negotiations. The Department of Water Resources has already initiated a program to provide professional facilitation services to GSAs.
  • Promote and accept tailored management. Stakeholders and regulators should remain open to approaches that reduce contracting costs by addressing issues without defining allocations or attempting to adopt the most restrictive rules uniformly throughout the basin. For example, pumpers have successfully adopted spatially restricted management rules to address overdraft that leads to localized problems; others have adopted well-spacing restrictions that reduce well interference without limiting withdrawals.
  • Encourage exchange of other water sources. Imported, non-native surface water may lower contracting costs because it can save users from large, costly cutbacks. Pumpers have written contracts to share imported water in order to avoid bargaining over a smaller total pie; where such water is available, exchange pools (such as those described here) can help to limit the costs of adjustment.

SGMA is a large-scale public experiment in collective action. To avoid the failures of previous attempts to manage groundwater, stakeholders crafting strategies for compliance and regulators assessing them should keep in mind the difficult economic bargaining problem pumpers face. Hopes for effective, efficient, and sustainable water management in California depend on it.

Also posted in California / Comments are closed

New analyses agree carbon pricing is a powerful solution

This post is co-authored with Steve Koller

To tackle dangerous climate change at the pace and scale the science demands, we must take advantage of every cost-effective opportunity to cut pollution now. Several recent analyses from leading experts on the impacts of carbon pricing demonstrate once again why flexible, market-based policy is the most effective and efficient tool we have.

These studies reaffirm that penalizing pollution and requiring companies to pay for their contribution to climate change can help the United States achieve needed reductions while generating substantial revenue. What’s more, none of these studies even account for the enormous benefits of averting climate change impacts.

While these studies examine carbon taxes (which place a price on pollution and allow the economy to respond), approaches that establish overall declining pollution limits and allow the market to determine the price can achieve similar pollution reductions and economic outcomes. But since uncertainty about market factors and technological trends prevents even the most robust economic modeling from providing guarantees, it is crucial that any carbon tax policy be linked to clear, concrete pollution reduction goals and include transparent provisions to help ensure those goals are met. A policy in which the price is derived from overall enforceable pollution limits already includes those assurances.

The analyses by the Stanford Energy Modeling Forum (EMF 32, comprising 11 leading modeling teams), Columbia University’s Center on Global Energy Policy (CGEP) and the U.S. Energy Information Administration (EIA) examine a range of scenarios with price paths from $15 to $50 per ton and annual price escalators from 1 to 5 percent, along with various ways of distributing the revenue. In addition, Resources for the Future and Columbia modeled the carbon tax bill recently introduced in the House by Representative Curbelo and co-sponsors, which sets a starting price of $24 per ton, rising 2 percent annually.

Let’s take a look at four key takeaways across analyses:

1. National policy that puts a price on carbon could significantly reduce climate pollution

In all scenarios that examine an economy-wide price, pollution reductions are consistent with meeting or exceeding the U.S. Paris Agreement commitment of cutting emissions 26 to 28 percent below 2005 levels by 2025. On our current path, we will almost certainly fall short of those goals, according to recent analysis from the Rhodium Group.

However, the analyses also show that to achieve deeper reductions aligned with long-term, science-based targets (for example, net-zero emissions by mid-century), we will likely need pricing paths at the more ambitious end of the spectrum—as well as companion policies to help the most difficult sectors decarbonize.

Of course, a key advantage of pricing pollution is that it will spur innovation—encouraging new technologies and approaches to slashing emissions that today’s models cannot foresee. These advances could allow us to meet our goals at lower costs than anticipated—exactly what’s happened with the well-designed, market-based U.S. acid rain program.

The EMF 32 results also underscore that the starting price matters for reductions in the short term, while the rate of increase over time is what bends the emissions curve down in the long term. For example, in the first decade, the $50 plus 1 percent path achieves roughly 40 percent more cumulative emissions reductions than the $25 plus 5 percent scenario. However, by 2038 cumulative reductions under the $25 plus 5 percent path exceed those under the $50 plus 1 percent path, and cumulative emissions through 2050 are similar. This dynamic is important because cutting pollution now is good for the climate, but we also need to sustain the decline over the long term. Ultimately, what matters is total cumulative climate pollution in the atmosphere.
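The crossover in those price paths is simple compounding. A quick check (prices only; the emissions results in EMF 32 depend on much more than the tax trajectory) shows roughly when the steeper escalator overtakes the higher starting price:

```python
# Find the first year a $25/ton price rising 5%/yr exceeds
# a $50/ton price rising 1%/yr. Pure price-path arithmetic.

low_start, fast_growth = 25.0, 0.05
high_start, slow_growth = 50.0, 0.01

year = 0
while low_start * (1 + fast_growth) ** year <= high_start * (1 + slow_growth) ** year:
    year += 1

print(year)  # 18: the steeper escalator wins out within about two decades
```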

2. Carbon pricing has extremely minor impacts on the economy—without accounting for the economic benefits of avoided climate change

Both EMF 32 and CGEP’s results suggest that GDP would continue to grow at historical or near-historical rates across scenarios—and could be net positive, depending on how revenue is used. Additionally, despite misleading rhetoric from opponents of climate action, a carbon price would have an extremely small net effect on employment. A recent analysis from Resources for the Future suggests that the net impact of a $40 tax would be less than half of one percent. And while many other studies confirm that net impacts on employment are likely to be small, they note that even mainstream modeling efforts tend to overestimate those impacts by a factor of 2.5 or more. Meanwhile, national climate policy would mean investing in the clean energy revolution: the economic engine of the future.

None of these analyses even consider the economic benefits associated with slashing climate pollution. Citibank estimates that climate change will cause tens of trillions of dollars in damages if left unchecked. These analyses also do not account for the additional associated benefits such as improvements in air quality. Taken together, these benefits make an overwhelming economic case for reducing pollution as soon as possible.

3. The lion’s share of reductions occur in the power sector, underscoring the importance of companion policies in other sectors

All analyses of an economy-wide price find that the vast majority of reductions occur in the power sector, driven primarily by declines in coal consumption. In the analyses examining $50 per ton scenarios, Columbia shows that approximately 80 percent of economy-wide emissions reductions occur in the power sector, with a significant shift toward renewable energy, and EMF 32 results predict that coal demand reaches near zero by 2030. This is consistent with modeling conducted for the United States’ 2016 Mid-Century Strategy for Deep Decarbonization.

Some other sectors, notably transportation, tend to be less responsive to carbon pricing in the models—at least in the short term. Both Columbia and EMF 32 find that transportation sector emissions only drop a few percentage points relative to 2005 levels by 2030 even in the higher pricing scenarios. These results underscore the importance of policies that put a firm limit on pollution across the economy as well as companion policies that can help address specific barriers to change in sectors that will be more difficult to decarbonize.

4. How revenue is used matters

Carbon pricing has the potential to raise significant revenue—for example, just under a trillion dollars over the first 5 years with a $40 price, rising at 5 percent. How revenue is used plays a significant role in overall economic impacts as well as the distribution of those impacts across regions and populations.
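That revenue figure is straightforward to sanity-check. The sketch below hypothetically assumes covered emissions start near 4.5 billion tons CO2e and decline 3 percent a year as the price bites; the studies’ own emissions paths differ, so treat this as order-of-magnitude only:

```python
# Rough check of the revenue figure above: a $40/ton price rising 5%/yr,
# applied to covered emissions assumed (hypothetically) to start at
# 4.5 billion tons CO2e and decline 3%/yr in response to the price.

price, emissions = 40.0, 4.5e9   # $/ton; tons CO2e covered in year one
total = 0.0
for _ in range(5):               # first five years
    total += price * emissions
    price *= 1.05                # 5% annual escalator
    emissions *= 0.97            # assumed demand response

print(round(total / 1e9))  # 934 (billion dollars): just under a trillion
```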

For example, CGEP’s analysis finds that certain revenue recycling approaches—including the use of revenues to reduce payroll taxes or the national debt—result in larger long-run economic growth than scenarios without a carbon price. EMF results find that using revenues to reduce capital income taxes generally achieves the highest GDP growth of the scenarios they considered, but these gains are disproportionately captured by the wealthy.

Alternatively, revenue can be directed to not only avoid this sort of inequitable distribution of benefits, but also protect low-income families and disadvantaged communities who already bear a disproportionate share of the burden of climate change and air pollution, and are more sensitive to changes in energy costs. For example, Columbia’s analysis shows that approaches putting even a small portion of revenue back into the pockets of American households can compensate those in the lowest income quintile for potential increases in energy costs. The Center on Budget and Policy Priorities has also demonstrated how carbon pricing can be designed to fully offset impacts of the policy on the most vulnerable households and provide considerable support for others while leaving significant revenue to spare.

While assumptions and model structures may differ, bottom line findings all point in the same direction: well-designed, national carbon pricing policy can spark deep reductions in climate pollution alongside economic growth, while spurring technological innovation and protecting vulnerable populations.

The “price” itself is only one part of effective climate policy. We need firm declining limits on pollution to accompany a price and ensure environmental outcomes, as well as a portfolio of approaches working together to ensure that investment and innovation are happening where it matters. A pricing program can be a catalyst for driving additional climate solutions at the federal, state, and local level, while other policies can share the burden and tackle the problem from multiple angles. This model has already proven itself in California, where the state has hit pollution reduction targets even earlier and at lower cost than anticipated.

To be successful, we need bipartisan leadership and a serious discussion about meaningful solutions. The United States can and must address the challenge by working together in the best interest of all Americans to put in place ambitious, effective, and fair climate policy.


Linking in a world of significant policy uncertainty

This guest blog was co-authored with Thomas Sterner

And then there were three. As of January 1st, 2018, Ontario has joined California and Québec, linking their respective carbon markets. In a post-Paris world of bottom-up climate policy, linking of climate policy matters. It provides a concrete step forward on the Paris Declaration on Carbon Pricing in the Americas. It shows that, while the U.S. federal government is dismantling much-needed climate protections, states, together with Canadian provinces, are moving forward. Linking, if done right, can be a powerful enabler of greater ambition. It also raises important questions.

To be clear, there are real advantages to linking carbon markets: Linking of climate policies is a political affirmation of joint goals and a signal to others to move toward concerted carbon policies. It also showcases the advantages of market-based environmental policies. And bigger markets afford greater cost-savings opportunities.

The textbook illustration of such savings is instructive. Take two jurisdictions, the high-cost abatement area (“H”) and the low-cost abatement area (“L”), with vastly different marginal costs (MC) of abatement. The total costs of abatement (the areas under the respective MC curves) will be vastly different, too.

 

Now consider the idealized linked market. Total abatement (ΣX) will remain the same. The difference? Prices equilibrate across the two jurisdictions, with the price in L rising to meet the falling price in H, lowering the total cost of achieving the same tons of carbon dioxide-equivalent (CO2e) abated.

 

Abatement costs clearly matter. The lower the costs of achieving the same goal, the better. All else equal, the two jurisdictions can now afford to abate more at the same cost.
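Under the usual textbook assumption of linear marginal abatement costs, the savings from linking can be computed directly. The slopes and abatement target below are arbitrary illustrations, not estimates for any real jurisdictions:

```python
# Textbook linking sketch: two jurisdictions with linear marginal abatement
# costs MC_i(x) = c_i * x must jointly abate X tons. Splitting the burden
# equally is costly; letting trade equalize marginal costs is cheaper.

c_high, c_low, X = 4.0, 1.0, 100.0  # MC slopes ($/ton per ton); total tons

def cost(c, x):
    return 0.5 * c * x * x  # area under a linear MC curve up to x

# Equal quotas, no trading:
unlinked = cost(c_high, X / 2) + cost(c_low, X / 2)

# Linked: equalize MC, so c_high * x_h = c_low * x_l with x_h + x_l = X
x_h = X * c_low / (c_high + c_low)
x_l = X - x_h
linked = cost(c_high, x_h) + cost(c_low, x_l)

print(unlinked, linked)        # 6250.0 4000.0: same abatement, lower cost
print(c_high * x_h, c_low * x_l)  # 80.0 80.0: one common price
```

The linked outcome shifts abatement toward the low-cost jurisdiction until marginal costs (and hence prices) equalize, which is the source of the H-to-L payments discussed later in the post.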

Will all else indeed be equal?

It is clear the world needs to do a lot more to stabilize greenhouse-gas concentrations. That means quickly getting net emissions of carbon dioxide going into the atmosphere to zero.

There, too, the simple textbook case can be instructive. Linking implies sending money from country H to country L to pay for the cheaper abatement. This raises important questions of baselines, accounting, and transparency. Moreover, lower abatement costs are not the only objective of climate policies. Direct support for the deployment of new, cleaner technologies often tops the list. Given the political economy of reducing greenhouse-gas emissions in the first place, there are many competing domestic objectives and indeed real tradeoffs that need attention.

The big question then is what linkage does to the overall level of policy ambition. Lower costs imply the potential for more ambitious policies. That is clearly good but the devil is in the details. It is important to assure that coordination and collaboration among different jurisdictions really do raise the level of ambition, as the Paris Declaration pledges.

It is also clear that climate policy overall ought to strike a balance between bottom-up and top-down approaches. Linkage is one potentially important element in that equation. The ultimate measure, however, is tons of greenhouse gases kept out of the atmosphere.


Moody’s Challenge: Prepare for Climate Change or Risk Credit Rating Downgrades

This post was co-authored by Aurora Barone

In the face of havoc wrought by recent storms and hurricanes, Moody’s Investors Service, Inc. has declared that state and local bond issuers must account for climate change or face downgrades. It is the first of the three major credit rating agencies to incorporate climate change risks into its ratings assessments, a move that may incentivize policymakers to make smarter, long-term investments in resilience efforts like stormwater systems or flood management programs.

Bond rating agencies like Moody’s help investors determine the risk of companies and governments defaulting on repayments. Revenue, debt levels, and financial management are all common measures of creditworthiness.

States at high risk–mainly on the coast–including Texas, Florida, Georgia and Mississippi, will have to account for how they are preparing for the adverse effects of climate change, including the effects of storms and floods, which are predicted to become more frequent and intense as temperatures climb.

In its report to clients, Moody’s outlined the parameters it will use to assess the “exposure and overall susceptibility of U.S. states to the physical effects of climate change.” These parameters include an area’s economic, institutional, and fiscal strengths, as well as its susceptibility to event risk—all of which influence a borrower’s ability to repay debt. Coastal risks, like rising sea levels and flooding, and an increase in the frequency of extreme weather events, like tornadoes, wildfires, and storms, are just a few of the indicators that will be incorporated into the rating.

This wasn’t always the case. Take New Jersey’s Ocean County, for example. In 2012, Hurricane Sandy devastated Seaside Heights, destroying local businesses and oceanfront properties. Yet last summer, Ocean County sold $31 million in bonds maturing over 20 years – bonds that received a perfect triple-A rating from both Moody’s and S&P Global Ratings. In 2016, major rating agencies issued triple-A ratings for long-term bonds from Hilton Head and Virginia Beach, despite the U.S. Navy’s warnings that the latter faced severe threats from climate change. A recent World Bank study calculated the future urban losses that many coastal cities may face because of climate change; Miami, New York, New Orleans, and Boston ranked highest in overall risk. In March 2016, Moody’s and S&P gave top ratings to $150 million in Boston bonds maturing over 20 years, evidently without accounting for any associated climate risks.

In Moody’s new effort to incorporate the risk of climate change into its ratings, it is trying to account for “immediate and observable impacts on an issuer’s infrastructure, economy and revenue base, and environment” as well as economic challenges that may result, such as “smaller crop yields, infrastructure damage, higher energy demands, and escalated recovery costs.”

The hope: facing the threat of a rating downgrade and more expensive debt, local governments will move to implement major adaptation and resilience projects as a way to entice investors and, of course, to plan for the effects of climate change.

 


Why rolling back common-sense rules puts taxpayers on the hook for future disasters

This post was co-authored with Beia Spiller

Since 2000, major flood and hurricane disasters have cost the nation $499.5 billion – more than double what floods cost us from 1980 to 1999 – and that figure doesn’t even reflect damages from Hurricanes Harvey, Irma, or Maria. You’d think that at a time when our nation faces greater threats from extreme weather, reducing the economic and social costs of flood disasters would be a top priority.

Instead, President Trump rescinded a requirement that federal agencies take future flood risks into greater consideration for federal projects in or affecting floodplains, setting us up for future fiscal disaster. Given the number and size of this year’s hurricanes, and the devastation they have wrought to millions of Americans, it’s clear that we aren’t doing enough to reduce the costs of these disasters. Yet, President Trump’s action, if uncorrected, will increase the costs to our country. Instead of rolling back common-sense rules meant to protect taxpayers, Congress and the administration should be ensuring that our federal investments can better withstand the impacts of flooding.

Where and how we build helps us better cope with disasters and saves money

The two best ways to minimize flood damage losses are: building outside of floodplains and building structures capable of coping with flooding. Federal agencies should be held accountable for implementing these proven best practices.

According to the Department of Homeland Security (DHS), benefits of implementing stronger building codes for natural hazards include savings from lowered insurance rates, increased property values, and reduced losses during floods. When building codes offer enhanced protection against the threats of flood-related disasters, communities recover faster and reduce the fiscal pressure on governments responding to damages.

Furthermore, designing for resiliency can be cost-effective. According to one study evaluating the effectiveness of flood building codes, constructing new buildings to withstand floods by increasing their elevation usually costs less than 1% of the total building cost for each foot they are raised. And, given the risks of flooding over time, these investments were found to pay for themselves in as little as one or two years in the areas with the highest risk of flooding. It’s noteworthy that buildings constructed after Hurricane Andrew, under the more rigorous codes, withstood Irma.
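A back-of-envelope version of that payback logic, using the rough 1-percent-per-foot figure above with otherwise hypothetical inputs:

```python
# Payback sketch for elevating a new building: elevation is assumed to cost
# ~1% of construction cost per foot raised, and savings come from avoided
# expected flood losses. All inputs below are hypothetical illustrations.

def payback_years(building_cost, feet, annual_flood_prob, loss_if_flooded):
    """Years for expected avoided losses to repay the elevation cost."""
    elevation_cost = 0.01 * building_cost * feet
    expected_annual_savings = annual_flood_prob * loss_if_flooded
    return elevation_cost / expected_annual_savings

# A $300k structure raised 2 feet in a very high-risk zone (20%/yr chance
# of a flood that would otherwise cause $30k of damage):
print(payback_years(300_000, 2, 0.20, 30_000))  # 1.0: repaid in about a year
```

In lower-risk zones the payback stretches out accordingly, which is why risk-informed codes target the most flood-prone areas first.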

In light of this, it is ironic that the most hurricane-prone state in the country is retreating from its renowned building code system: Florida Governor Rick Scott signed into law changes to the state’s system, which had been adopted after Hurricane Andrew. The changes include reducing inspections and the frequency of code updates, and allowing fewer votes from the state’s Building Commission to make further code revisions. The latter is seen by many as an opportunity for the Commission, which is dominated by contractors and construction firms, to further weaken codes that have been regarded as some of the best in the country.

From 1978 to 2016, FEMA paid out more than $59 billion (in 2016 dollars) for losses associated with significant floods, with 76% of those payments occurring after 2004. Importantly, the average paid loss has increased by almost 2.5 times since 1978 (even after accounting for inflation). These moves toward weaker building codes and standards will only ensure more frequent and more costly FEMA payouts, with taxpayers footing the bill. In the long run, these actions are at odds with an administration preaching fiscal conservatism.

Investing now to save in the future

Instead of taking such unnecessary risks, cities and states should adopt more stringent risk-informed building codes and zoning, so we can start building now for a more reliable, sustainable and resilient infrastructure. Similarly, the administration should enhance flood resilience standards for federal investments, including those made as part of disaster recovery, to reduce the costs of flooding today and in the future. Doing so will improve long-term protection of human health and welfare. If we build smarter now, communities, taxpayers and nature will reap rewards in the future.


What’s behind President Trump’s mystery math?

This post originally appeared on EDF’s Climate 411

By this time, your eyes may have glazed over from reading the myriad fact checks and rebuttals of President Trump’s speech announcing the United States’ withdrawal from the Paris climate agreement. There were so many dizzying falsehoods in his comments that it is nearly impossible to find any truth in the rhetorical fog.

Of all the falsehoods, President Trump’s insistence that compliance with the Paris accord would cost Americans millions of lost jobs and trillions in lowered Gross Domestic Product was particularly brazen, deceptive, and absurd. These statements are part of a disturbing pattern, the latest in a calculated campaign to deceive the public about the economics of reducing climate pollution.

Based on a study funded by industry trade groups

Let’s be clear: the National Economic Research Associates (NERA) study underpinning these misleading claims was paid for by the U.S. Chamber of Commerce and the American Council for Capital Formation (ACCF) – two lobbying organizations backed by fossil fuel industry funding that have a history of commissioning exaggerated cost estimates of climate change solutions. When you pay for bad assumptions, you ensure exaggerated and unrealistic results.

In the past five years alone, NERA has released a number of dubious studies funded by fossil fuel interests about a range of environmental safeguards that protect the public from dangerous pollution like mercury, smog, and particulate matter – all of which cause serious health impacts, especially in the elderly, children, and the most vulnerable. NERA’s work has been debunked over and over. Experts from MIT and NYU said NERA’s cost estimates from a 2014 study on EPA’s ozone standards were “fraudulent” and calculated in “an insane way.” NERA’s 2015 estimates of the impacts of the Clean Power Plan, which are frequently quoted by President Trump’s EPA Administrator Scott Pruitt and others, have also been rebutted due to unrealistic and pessimistic assumptions.

The study does not account for the enormous costs of climate pollution

In his speech about the Paris agreement, President Trump crossed a line that made even NERA so uncomfortable that it released a statement emphasizing that its results were mischaracterized and that the study “was not a cost-benefit analysis of the Paris agreement, nor does it purport to be one.”

The most important point embedded in this statement is that the study does not account for the enormous benefits of reducing the carbon pollution causing climate change. Climate change causes devastating impacts including extreme weather events like flooding and deadly storms, the spread of disease, sea level rise, increased food insecurity, and other disasters. These impacts can cost businesses, families, governments and taxpayers hundreds of billions of dollars through rising health care costs, destruction of property, increased food prices, and more. The costs of this pollution are massive, and communities all around the U.S. are already feeling the impacts – yet the President and his Administration continue to disregard this reality as well as basic scientific and economic facts.

Cherry-picking an impractical and imaginary pathway to emission reductions

The statistics the President cited were picked from a specific scenario in the study that outlined an impractical and imaginary pathway to meeting our 2025 targets, one designed to be needlessly expensive, as experts at the World Resources Institute and the Natural Resources Defense Council have noted. The study’s “core” scenario assumes sector-by-sector emission reduction targets (which do not exist as part of the Paris accord) that require the most aggressive mitigation from the sectors where it is most expensive. This includes an almost 40 percent reduction in industrial sector emissions – a disproportionate level not envisioned in any current policy proposal – which results in heavily exaggerated costs.

An expert at the independent think tank Resources for the Future, Marc Hafstead, pointed out:

The NERA study grossly overstates the changes in output and jobs in heavy industry.

Yale economist Kenneth Gillingham said of these numbers:

It’s not something you can cite in a presidential speech with a straight face … It’s being used as a talking point taken out of context.

The NERA analysis also includes a scenario that illustrates what experts have known for decades – that a smarter and more cost-effective route to achieving deep emission reductions is a flexible, economy-wide program that prices carbon and allows the market to take advantage of the most cost-effective reductions across sectors. Even NERA’s analysis shows that this type of program would result in significantly lower costs than their “core” scenario. Not surprisingly, that analysis is buried in the depths of the report, and has been entirely ignored by the Chamber of Commerce and ACCF as well as President Trump.

Study ignores potential innovation and declining costs of low carbon energy

Finally, the NERA study assumes that businesses would not innovate to keep costs down in the face of new regulations—employing pessimistic assumptions that ignore the transformational changes already moving us toward lower carbon energy. Those assumptions rely on overly conservative projections for renewable energy costs, which have been rapidly declining. They also underestimate the potential for reductions from low-cost efficiency improvements, and assume only minimal technological improvements in the coming years.

In reality, clean energy is outpacing previous forecasts and clean energy jobs are booming. There are more jobs in solar energy than in oil and natural gas extraction in the U.S. right now, and more jobs in wind than in coal mining.

The truth is that the clean energy revolution is the economic engine of the future. President Trump’s announcement that he will withdraw the U.S. from the Paris accord cedes leadership and enormous investment opportunities to Europe, China, and the rest of the world. His faulty math will not change these facts.
