Energy Exchange

EDF, Wyoming Outdoor Council Team To Protect Wyoming Air From Oil And Gas Development


EDF and Wyoming Outdoor Council are teaming up to protect air and water quality from oil and gas development in the Cowboy State. One of the first efforts in this partnership involves strengthening air quality regulation for the oil and gas industry in Pinedale, WY, where persistent ozone pollution threatens the health of local residents. EDF’s Natural Gas Media Director, Lauren Whittenberg, recently sat down with Bruce Pendery, Wyoming Outdoor Council’s lead attorney on air quality issues, and Jon Goldstein, EDF’s Senior Energy Policy Manager, to learn more about this partnership.

Lauren Whittenberg: Can you tell me about the pollution problems in Pinedale?

Bruce Pendery: Well, first and foremost, this pollution is a public health issue. Monitoring of air quality in the Upper Green River Basin in western Wyoming near Pinedale started to show dangerous levels of ground level ozone pollution in 2006. Ground level ozone (also known as smog) is created by a complicated interaction between two different forms of air pollution, oxides of nitrogen and volatile organic compounds. Oil and gas development in the Pinedale area is the main source of both. Since the problem was identified, the Wyoming Outdoor Council has been heavily engaged with regulators, local citizens and industry to seek a way to reduce this harmful pollution to protect the local citizens and gas field workers.

LW: What problems does ozone pollution cause?

Jon Goldstein: Ozone is a toxic air pollutant widely known to cause a host of respiratory problems. Exposure to ozone pollution, even in low concentrations, can cause serious health problems, including permanent damage to the lungs. To address some of these concerns, EPA introduced rules – for the first time – that established federal emission standards for natural gas well sites, as well as tightened existing standards for other aspects of gas processing and distribution. EPA’s clean air measures are important to reduce air pollution from the oil and gas sector. It’s also interesting to note that EPA – in part – based these federal standards on state level rules that have been in place in Wyoming for several years. However, a big opportunity exists to further strengthen federal and state regulations and reduce air pollution for communities dealing with poor air quality.

LW: What is the plan to address this harmful pollution?

BP: On January 10, the Wyoming Department of Environmental Quality (DEQ) announced its plan to address air pollution issues in the Pinedale area’s Upper Green River Basin. This plan is based on recommendations the department received from the Upper Green River Basin Air Quality Citizens Advisory Task Force, a broad group of local citizens, elected officials, oil and gas industry and environmental representatives brought together by the department. I served on this task force and helped formulate the ten consensus recommendations we provided to the DEQ.

LW: What were the recommendations?

BP: These are very practical, common sense efforts to reduce emissions from oil and gas operations. Things like monitoring, investigating and plugging leaks from faulty oil and gas production equipment, reducing emissions from produced water tanks and ponds, and developing legal efforts to better regulate existing sources of pollution.

LW: You mentioned that these recommendations were “consensus.” What does that mean?

JG: That is what is so encouraging about this effort. Each of the ten recommendations has the buy in of every member of the task force – a very broad group of local citizens and elected officials as well as industry and environmental groups like Wyoming Outdoor Council. These practical recommendations followed nine months of deliberations by the task force and six lengthy meetings.

That such a broad group could reach consensus on ten methods to improve local air pollution is a testament to their dedication. This hard work will be well worth it when these ideas are made a regulatory reality, and air quality issues in the region begin to improve.

LW: What’s next?

JG: This action plan is a key first step; the DEQ has offered an outline that, if implemented quickly and completely, will help put us on the path toward cleaner, healthier air. But now is a crucial time in this process. It is now up to the DEQ to make these ideas a reality and implement them through regulatory processes as quickly as possible.

And we aren’t stopping with these ten items. We have advocated for additional efforts to improve air quality, including better measures to monitor maintenance activities such as liquids unloading, extending the state’s strong Presumptive Best Available Control Technology (P-BACT) requirements throughout the ozone nonattainment area, and ensuring that existing and grandfathered emissions sources are controlled.

A lot is at stake. Inaction or inadequate action will not improve air quality or protect the health of local residents.

LW: How will Wyoming Outdoor Council and EDF keep this momentum going?

BP: We will remain involved in this process to ensure that the DEQ follows through as quickly as possible. We plan to be very active in the formal regulatory development and adoption processes that will kick off in the coming months. And we hope that all Wyoming citizens will stay involved in this effort. Wyoming has a strong history of leadership in regulating air emissions from the oil and gas sector. Our plan is to defend this hard-earned reputation and protect people and our air quality in the process.

LW: What other efforts are on tap in Wyoming?

JG: Because of both the strong regulatory tradition that Bruce mentioned, and Wyoming’s status as one of the largest sources of domestic oil and gas resources, Wyoming is one of our target states for EDF’s natural gas work. We are working on a number of opportunities to raise the bar on air and water quality regulations and also improve drilling protections on federal lands. This includes adoption of strong new federal rules around the venting and flaring of natural gas. You will hear more about these efforts in coming months, but we are very happy to have a partner as well respected and experienced as Wyoming Outdoor Council to help us make them a reality.

LW: Thank you both.


Posted in General / Comments are closed

More evidence emerges that California’s Low Carbon Fuel Standard is a winning strategy and oil industry cost estimates are full of holes

California drivers and policy makers should be breathing an extra sigh of relief this week with the release of a new study by the California Electric Transportation Coalition (CalETC). The study, an evaluation of electricity use within the state’s Low Carbon Fuel Standard (LCFS), clearly shows that electrification benefits are on the horizon and oil industry funded analyses have yet again over-dramatized the difficulty of meeting one of the state’s landmark environmental laws.

In the study, CalETC shows that using electric passenger vehicles (both battery electric and plug-in hybrid vehicles), and electric off-road equipment (forklifts and trains), has the potential to generate a significant amount of creditable greenhouse gas reductions in the LCFS.

According to CalETC, three electrification solutions can cut up to 4 million tons of greenhouse gases per year by the year 2020, a significant portion of the total reductions required under the law. What’s more, since electricity as a fuel source costs one to two dollars per equivalent gallon less than gas and diesel, once the vehicles are on the roads and rails, the LCFS can actually save drivers a significant amount of money at the pump.
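To give a rough sense of scale for that fuel-cost claim, here is a back-of-the-envelope sketch. All the driving and efficiency numbers below are assumptions for illustration, not figures from the CalETC study; only the one-to-two-dollar savings range comes from the text above.

```python
# Illustrative annual fuel-cost savings for one electric vehicle.
MILES_PER_YEAR = 12_000        # assumed annual mileage
MPG_EQUIVALENT = 30            # assumed gasoline-equivalent efficiency
SAVINGS_PER_GGE = (1.0, 2.0)   # electricity costs $1-$2 less per
                               # gasoline-gallon-equivalent (per CalETC)

gallons = MILES_PER_YEAR / MPG_EQUIVALENT            # 400 gge per year
low, high = (gallons * s for s in SAVINGS_PER_GGE)   # $400-$800 per year
```

Under these assumptions a single driver saves several hundred dollars a year, which is why the study's per-gallon figure translates into "significant" savings at the pump once EVs are widely deployed.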

Prior industry reports on the LCFS like the one funded by the Western States Petroleum Association have lamented that compliance with the LCFS isn’t possible without oil companies going out of business or charging consumers significantly more at the pump. However, a plain reading of oil company cost analyses shows they purposely avoid consideration of the benefits of widespread deployment of alternative electric vehicles (EVs) in their research.

Not the first, probably not the last

Of course, this isn’t the first time industry cost estimates of environmental regulations, and specifically the LCFS, have emerged as highly suspect. For example, in September 2012, the non-partisan business group Environmental Entrepreneurs (E2) published a report showing how well positioned the US biofuel industry is to meet demand under the California standard – a direct counterpoint to recent oil industry estimates that say biofuels simply aren’t available.

In that E2 report, researchers found that 1.6 to 2.6 billion gallons of advanced biofuel will likely be produced in 2015, with increasing volumes thereafter, meaning LCFS compliance can be achieved solely through blending low carbon biofuels in the short, medium, and potentially long term. This blending will allow for compliance over and above what the electrification opportunities provide.

Similarly, for natural gas vehicles, the industry modeling of compliance scenarios assumes natural gas technologies won’t be sufficiently ready for widespread consumer use to be counted as a legitimate LCFS compliance opportunity. However, consistently low natural gas prices along with recent investments and R&D from companies like Chesapeake Energy Corp., Clean Energy, General Electric, Whirlpool and 3M have all been aimed at increasing the availability of natural gas as a fuel for passenger vehicles and heavy duty trucks.

In yet another analysis of LCFS compliance, it was found that “significant inaccuracies and faulty assumptions” led to the results of oil industry funded studies.

A first of its kind strategy whose time has come

California’s first-of-its-kind LCFS strategy for cutting climate change pollution from transportation fuel is designed to work alongside the state’s landmark cap-and-trade regulation between now and the year 2020, facilitating the transition of California’s transportation sector towards one which is lower carbon and is powered from an array of resources.

As Elisabeth Brinton, head of the Sacramento Municipal Utility District’s retail business, so aptly puts it, the California LCFS is “a great idea whose time has come.”

For more information about entities that support the California LCFS, read here.


A Tale Of Two IPOs

This morning two energy initial public offerings (IPOs) made their debut.  One of them was green and one of them was brown.  Unfortunately, the mainstream media missed the boat by characterizing the brown company as successful and the green one as a miss. We don’t see it that way.

The brown company is PBF Energy, a Blackstone-backed rollup of three refiners that were divested by Valero and Sunoco.  The company, like many refiners, is having its day in the sun as refining margins are currently wide due to technical market issues relating to the relative prices of Brent and WTI crudes.  The bottom line, however, is that demand for gasoline and diesel is unlikely to grow as CAFE fuel economy standards continue to tighten.

The second company, SolarCity, has been posting over 100% annual growth in solar installations since 2009.  Additionally, the company has been a leader in residential energy efficiency and EV charging stations, and has even begun to roll out a residential energy storage solution.

Unfortunately, SolarCity’s business model requires some complex accounting that ultimately hurt their valuation.  The vast majority of their solar photovoltaic (PV) installations are executed as leases or similar structures to take advantage of various tax incentives.  This reduces revenue as accountants measure it, and makes the business look unprofitable on paper.  As an example, imagine a solar company can construct a solar system for $16k and sell it for $20k, with $3k of overhead.  An outright sale would result in $20k of revenue, $4k of gross profit and $1k of net income.  Do that enough times and you have a pretty good business.

As a lease, however, they only recognize revenue as it is received through annual lease payments, which might be around $1500.  Assuming the $3k of overhead remains, then the company would post a loss of $1500 in year one.  Economically, this might be the same or better business, but through the eyes of an accountant, this is a harder pill to swallow as the profits must be realized over the long term of the lease.
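The two recognition patterns can be made concrete with a minimal sketch using the hypothetical numbers above (the 20-year lease term is my assumption for illustration; real lease accounting also involves depreciation and tax equity, which this deliberately ignores):

```python
# Hypothetical figures from the example above: a system that costs
# $16k to build, would sell outright for $20k, with $3k of overhead
# and an assumed $1,500 annual lease payment over an assumed 20 years.
BUILD_COST = 16_000
SALE_PRICE = 20_000
OVERHEAD = 3_000
LEASE_PAYMENT = 1_500
LEASE_YEARS = 20

def year_one_sale():
    """Outright sale: all revenue and cost recognized up front."""
    revenue = SALE_PRICE
    gross_profit = revenue - BUILD_COST    # 20k - 16k = 4k
    net_income = gross_profit - OVERHEAD   # 4k - 3k = 1k profit
    return revenue, net_income

def year_one_lease():
    """Lease: only the first year's payment counts as revenue.
    (Construction cost is capitalized and recovered over the lease
    term; it is omitted here to mirror the simplified comparison.)"""
    revenue = LEASE_PAYMENT
    net_income = revenue - OVERHEAD        # 1.5k - 3k = 1.5k loss
    return revenue, net_income

def lease_lifetime_revenue():
    """Over the full term the lease payments exceed the sale price."""
    return LEASE_PAYMENT * LEASE_YEARS     # 30k vs. a 20k sale
```

The same installation shows a $1k profit in year one as a sale and a $1.5k loss in year one as a lease, even though the lease collects more cash over its lifetime, which is exactly the pill the accountants find hard to swallow.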

SolarCity is a new concept for the public market: it is essentially the first high-growth cleantech company that relies on an equipment leasing model. Despite projected revenue growth, the solar IPO struggled to generate demand due to this complex accounting and priced well below the expected range. On the other hand, PBF priced at the middle of its range, and sold more shares than originally expected. Longer term, however, my money would be on the company with the meteoric growth rate.  So far today, the market seems to agree.  SolarCity is up 48% from its pricing while PBF Energy has gained less than 1%.


A Red Flag On Disclosure Of Hydraulic Fracturing Chemicals

It’s not often that a new regulatory idea becomes so popular that one or more states per month climb on the bandwagon. But that is precisely what has happened with the push to disclose which chemicals are pumped into the ground to stimulate oil and natural gas production during the process known as hydraulic fracturing, or “fracking.”

A year ago, only three states (Arkansas, Montana and Wyoming) required oil and gas producers to tell the public what chemicals they were using. Two other states (Colorado and Texas) were actively developing such rules. Today, just twelve months later, statutes or regulations mandating “frack” chemical disclosure are on the books in no fewer than 18 states, and proposals are pending or under consideration in several others.

FracFocus, an online registry that compiles information on hydraulic fracturing chemicals for states where disclosure is voluntary as well as those where it is required, has been up and running for just 20 months, but already it houses approximately 800,000 records that include ingredients data. As of December 5, 2012, this data represented 33,606 wells. The amount of information on the site continues to grow rapidly.

It is impressive that so much information has been made available in such a short time. Still, people have begun to wonder whether the disclosure rules are accomplishing what was intended. The question is important because rules that aren’t working need to be changed. A good regulatory system is based on a process of continual improvement, not a naive idea that the rulebook can be written in a way that will never need changing.

Unfortunately, judging from early press reports, there are quite a few bugs in the system. To be fair, the reporting requirements are quite new and still being implemented — and analysis of the data has barely begun. But problems are emerging. The issue receiving the most media attention is the sheer number of trade secret claims.

Also posted in Natural Gas, Texas / Read 5 Responses

California Cap-and-Trade Auction Success

The results of California’s first ever auction for greenhouse gas (GHG) emissions allowances are public, marking the start of a new era for stimulating innovative solutions to combat climate change. Coincidentally, earlier today new atmospheric data was released by NOAA showing that 2012 is on pace to be the warmest year on record, eclipsing the mark set only two years ago.

By establishing a hard cap on emissions and creating a carbon price through a trading mechanism, California’s comprehensive GHG program complements, and is fine-tuned based on experiences from the world’s other climate change cap-and-trade mitigation programs. For example, lessons learned from the world’s largest cap and trade program in the European Union have shown that emissions of GHGs can actually decrease while the economy grows. Similarly, as shown by the Analysis Group’s report of the cap-and-trade program in the Northeastern United States, in addition to creating a strong signal for innovation, money generated through an auction can be invested in ways to cut GHGs even further.

Based on today’s results, California’s program is performing according to the expectations of economic experts and policy makers. The market price ($10.09) for credits that can be used in 2013 was slightly above the floor price of $10. Also, there were more bids for 2013 credits than credits sold, with 97% of allowances going to covered entities. Put simply, regulated businesses are taking this market seriously and believe they can cut greenhouse gas emissions even more cheaply than anticipated. This is a very good thing for California.
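The clearing mechanics behind those results can be illustrated with a simplified uniform-price auction. California's actual auction design, administered by the Air Resources Board, has considerably more rules than this sketch shows, and the bid quantities below are invented; only the 23-million-allowance supply, the $10 floor, and the $10.09 settlement come from the results above.

```python
def clear_auction(bids, supply, floor_price):
    """Simplified uniform-price auction: accept the highest bids until
    supply runs out; every winner pays the price of the last accepted
    bid, never less than the floor. `bids` is a list of
    (price, quantity) tuples."""
    accepted = 0
    clearing_price = floor_price
    for price, qty in sorted(bids, reverse=True):  # highest price first
        if price < floor_price or accepted >= supply:
            break
        take = min(qty, supply - accepted)
        accepted += take
        clearing_price = max(price, floor_price)
    return clearing_price, accepted

# Oversubscribed example: demand exceeds the 23M-allowance supply,
# so the clearing price settles just above the $10 floor.
bids = [(12.00, 5_000_000), (10.50, 10_000_000),
        (10.09, 15_000_000), (9.50, 8_000_000)]
price, sold = clear_auction(bids, supply=23_000_000, floor_price=10.00)
# price == 10.09, sold == 23_000_000
```

When total demand at or above the floor exceeds supply, as it did here, the marginal accepted bid sets the price for everyone, which is why a settlement above the floor signals genuine demand for allowances.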

At the same time as the California carbon auction sold 23 million allowances for use starting in 2013, the market also sold 5.5 million allowances for use in 2015 and beyond. This is a clear signal that investors see this as a lasting program, and provides important assurance that the more than $9 billion of clean tech investment made in California since 2006 has strong backing.

A California carbon price opens the door for cleaner energy and clean air, as we finally have an “official” cost of pollution. We are marching more resolutely than ever into an economically and environmentally sustainable future.


Hurricane Sandy: A Lesson In Risk Planning For The Power Industry

Living in New York City through a week of Sandy and her aftermath was a reminder of just how critical electricity is to our lives.

Electricity is the difference between feeling safe in well-lit buildings and streets, or vulnerable in the dark. Between food kept well-preserved in refrigerators and water pumping through pipes, or dinner spoiling and taps gone dry. Between communications and productivity, or isolation and economic losses — which are now forecasted, from Sandy alone, to reach $50 billion.

For some, electric power is literally life or death: heat on a cold night, access to vital medical services.

(Credit: Master Sgt. Mark Olsen/U.S. Air Force)

The responsibility for providing these essential services rests on utilities. And the gravity of that responsibility – along with a reliance on long-lived and costly assets – has led to a culture of caution. One that has given the power industry pause in moving away from the tried and true methods it has used to generate and deliver power for the past 100 years.

But what the increasingly intense storms rolling across the country reveal is that – sometimes – what seems the cautious path is in fact the most risky.

With an estimated 9.5 million homes and businesses having lost power thanks to Sandy, the utilities faring best at restoring their customers to warmth and safety are those that have begun modernizing their grids with advanced information technologies, and using those “smart grids” to build resilience and reliance on community-based energy resources. I spoke with Bloomberg Businessweek earlier this week to discuss our outdated grid and the crucial need for modernization.

We’re already seeing proof these investments can reduce recovery time, keep crews and customers safer, and save lots of money. Thanks in part to federal stimulus grants, a number of utilities are embedding sensors, communications and controls across their networks. On the power lines, this automation has helped prevent cascading disasters like the one in 2003, when a single Ohio tree falling on a power line knocked out power to 55 million people. Automated systems can detect a fault, cordon it off and reroute power flow around it.
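That detect, isolate, and reroute sequence (often called FLISR, for fault location, isolation, and service restoration) can be sketched in miniature. The segment names and the single-tie-switch topology below are invented for illustration; real distribution automation works on far richer network models.

```python
def isolate_and_reroute(segments, faulted, tie_switch_available):
    """Toy fault-isolation sketch: open the switches around the
    faulted segment, then (if possible) close a tie switch to
    back-feed the healthy segments downstream of the fault.
    `segments` is an ordered list of segment ids along one feeder."""
    i = segments.index(faulted)
    upstream = segments[:i]        # still fed from the substation
    downstream = segments[i + 1:]  # dark until back-fed
    rerouted = downstream if tie_switch_available else []
    return {
        "isolated": [faulted],
        "energized": upstream + rerouted,
        "dark": [] if tie_switch_available else downstream,
    }

# A fault on segment s2 of a four-segment feeder with a tie switch:
# only s2 stays dark; s3 and s4 are back-fed from the neighboring feeder.
result = isolate_and_reroute(["s1", "s2", "s3", "s4"], "s2", True)
# result["isolated"] == ["s2"], result["dark"] == []
```

The point of the sketch is the containment property: instead of the whole feeder (or, in a cascade, the whole region) going dark, the outage is confined to the faulted segment while automation restores everyone else.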

Digital “smart” meters, capable of two-way communications, have also proved their worth: providing utilities real-time, granular visibility into their networks, without resorting to (often failing) phones or trucks dispatched on wild goose chases.  Programmed to send a “last gasp” signal when they lose power, those meters have enabled rapid diagnostics – pinpointing exactly which homes or blocks were out, where the break had occurred – and expedited repairs.

Baltimore Gas and Electric, for instance, has installed about 10 percent of its planned 1.3 million smart meters. Linked to a “smart command center” borrowed from sister utility ComEd of Illinois (with whom EDF has been working on developing a set of performance metrics for its grid investments), the meters are telling them when their power restoration efforts have been successful or when further troubleshooting is needed. Without smart meters, they’d have to phone customers to ask if the power is back on. In storm conditions, according to Jeannette Mills, BG&E’s VP of Customer Operations, two-thirds of those calls go unanswered, which means they have to dispatch crews block by block across the region. This time, they’ve been able to ping the meters, asking “are you on?” Mills reports “a much higher rate of success getting through to smart meters than we do reaching customers by phone” enabling far more efficient dispatch of crews.
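A minimal sketch shows how "last gasp" signals can be turned into the kind of repair priority list described above. The data shapes, segment names, and ranking rule here are assumptions for illustration, not BG&E's or any vendor's actual system.

```python
from collections import Counter

def prioritize_outages(last_gasp_events, meters_per_segment):
    """Count dark meters per feeder segment from (meter_id, segment_id)
    'last gasp' events, then rank segments so crews go first to the
    segment with the most customers out. The boolean flags segments
    where every meter is dark, suggesting a fault upstream of the
    whole segment rather than individual service drops."""
    dark = Counter(segment for _, segment in last_gasp_events)
    ranking = sorted(dark.items(), key=lambda kv: kv[1], reverse=True)
    return [
        (seg, count, count == meters_per_segment.get(seg, 0))
        for seg, count in ranking
    ]

events = [("m1", "feeder-A"), ("m2", "feeder-A"), ("m3", "feeder-A"),
          ("m7", "feeder-B")]
sizes = {"feeder-A": 3, "feeder-B": 40}
# feeder-A: 3 of 3 meters dark (likely a segment-level fault);
# feeder-B: 1 of 40 (likely a single service drop).
priorities = prioritize_outages(events, sizes)
```

The same meter data answers the restoration question in reverse: after a repair, any segment whose meters respond to an "are you on?" ping can be marked restored without a phone call or a truck roll.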

Utilities with smart grids have also kept customers better informed. A Pennsylvania Power and Light customer described to Smart Grid News how the real time tracking enabled by smart meters allowed him not only “to check on repair status for my own home (with crew on site info and estimated time to repair) … but also remotely online check the status of our two rental houses without having to physically drive to each to check them out.”

One of the first utilities to demonstrate a smart grid’s resilience was Alabama Power, which was slammed in April 2011 by 30 tornadoes across 70 miles with winds up to 190 mph. The twisters left 400,000 without power and thousands of poles, wires and substations damaged or destroyed. But by using its 1.4 million smart meters to locate the outages and prioritize repairs, the utility restored all of its customers within a week. Thanks to remote meter reading, it also drives 4 million fewer miles each year.

The security benefits of a smarter, more resilient grid have caught the attention of the U.S. military. It has begun installing smart grid technologies on bases so they can function as “microgrids”: decoupling from the commercial grid in the case of a natural or manmade disaster and maintaining vital homeland security operations. The bases will also become reliability resources themselves, capable of supplying power to the grid, or reducing demand, at times when the grid is stressed.

Most importantly, these smart grids will enable the military to meet its aggressive goals for shifting to low-carbon, domestic energy resources, particularly renewable energy on or near bases. Secretary of the Navy Ray Mabus has set a goal for the service to get half its power from renewable resources by 2015. A smart grid will be absolutely critical to enabling the integration of millions of smaller, regional resources, and for managing the on-again, off-again character of the wind and sun.

The Secretary’s leadership reflects his recognition of the greatest risks that come from sticking to our tried and true ways of making and delivering power:  the national security threats posed by climate change. These include the threats we’ve seen this last week, again, from rising seas and extreme weather, as well as the casualties incurred by troops having to protect vulnerable fuel supplies, and the acceleration of instability and conflict warned of in a 2010 DOD report. When it comes to power, the greatest risks will come from failing to be bold.

Posted in General / Read 7 Responses