Climate 411

Hansen was right: Marking an anniversary by misleading the public

Dr. James Hansen testifying before Congress in 1988

The thirtieth anniversary of former NASA scientist Jim Hansen’s landmark 1988 testimony to Congress on the urgent need to address climate change prompted numerous articles marking the occasion by demonstrating that his predictions have proven to be accurate.

Inevitably, some writers seized the opportunity to revive long-debunked arguments in an attempt to cast doubt and confusion on the threat.

Perhaps the most misleading – and certainly the highest profile – was a June 21st op-ed in the Wall Street Journal written by Pat Michaels and Ryan Maue. Michaels is director of the Center for the Study of Science at the Cato Institute, a think tank financially linked to the fossil fuel industry. He has also previously misled Congress, presenting a doctored graph of Hansen’s projections during public testimony before the House Small Business Committee.

Four decades of climate model projections have fared well

Their latest op-ed implies that U.S. climate policy is based on Hansen’s 1988 forecasts, and that we must therefore “reconsider environmental policy” based on an evaluation of “how well his forecasts have done.”

In reality, climate policy is based on hundreds of years of collective research and an overwhelming amount of observational evidence gathered from all over the world.

Climate model development began as early as the 1950s, and projections from 1973 to 2013 (including Hansen’s 1988 paper) have been compared to observed temperatures by multiple institutions. All showed reasonably accurate surface temperature increases between 1970 and 2016, Hansen’s 1988 study included.

The largest uncertainties come not from lack of understanding of the climate system, but from unknown future human decisions. For example, if Hansen’s 1988 study had included the greenhouse gas emissions reductions that followed the Montreal Protocol – which took effect in 1989 and phased out ozone-depleting chemicals such as chlorofluorocarbons (CFCs) – the results from his “most likely” scenario would have matched projections by today’s more sophisticated models. Considering the limited data and computing power available in 1988, this is incredibly impressive.

Predicting exactly what emissions path we’ll take is therefore a policy question, not a science question. Climate scientists work to understand how the climate will respond to a range of future emissions scenarios; even if the science is perfect, a model’s results are expected to match observations only if the particular emissions pathway it assumes actually comes to fruition.

However, even without the Montreal Protocol adjustment, Hansen’s “most likely” scenario predicted nearly 1 degree Celsius of warming by 2016 with respect to a 1964-1983 average, and observations from the standard datasets by NASA and by Cowtan and Way both show this amount of warming. Michaels and Maue’s piece, on the other hand, misleads readers by inaccurately claiming that Hansen’s lowest projection was most accurate; a quick look at the data shows that this is not so.
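For readers curious how such comparisons are made, here is a minimal sketch of the baseline-anomaly calculation, using a made-up linear series rather than actual NASA or Cowtan and Way data (the assumed 0.018 °C/year trend is hypothetical, for illustration only):

```python
import numpy as np

# Hypothetical annual global mean temperatures (deg C), 1964-2016.
# Illustrative only -- not real NASA GISTEMP or Cowtan & Way data.
years = np.arange(1964, 2017)
temps = 14.0 + 0.018 * (years - 1964)  # assumed linear trend

# Warming "with respect to a 1964-1983 average" means subtracting
# the mean over that baseline period from the year of interest.
baseline = temps[(years >= 1964) & (years <= 1983)].mean()
anomaly_2016 = temps[years == 2016][0] - baseline
print(f"2016 anomaly vs. 1964-1983 baseline: {anomaly_2016:.2f} deg C")
```

With a real dataset one would substitute observed annual means for the synthetic series; the arithmetic is identical.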

The article goes a step further, inaccurately claiming that “Models devised by the United Nations Intergovernmental Panel on Climate Change have, on average, predicted about twice as much warming as has been observed since global satellite temperature monitoring began 40 years ago.” First, the IPCC does not devise models itself; it collates, synthesizes, and standardizes model results from dozens of independent climate models worldwide. But the main problem with this claim is that it rests on comparisons of model results with satellite data that has since been found to contain major calibration errors that led temperatures to be underestimated. Correcting for those errors reveals that the models are very much in line with what we observe.

Another major flaw in the piece is its claim that Hansen’s and the “IPCC’s” models “don’t consider more-precise measures of how aerosol emissions counter warming caused by greenhouse gases. Several newer climate models account for this trend and routinely project about half the warming predicted by U.N. models, placing their numbers much closer to observed temperatures. The most recent of these was published in April by Nic Lewis and Judith Curry in the Journal of Climate, a reliably mainstream journal.” In fact, sophisticated climate models have long accounted for the effects of aerosols, both directly and via cloud modifications. Lewis and Curry estimated a lower-than-average climate sensitivity not because of aerosols but because they selected a very low ocean heat uptake rate – a controversial choice among climate scientists; accounting for the latest ocean heat content data would have raised their climate sensitivity value to be on par with other model estimates. Further, their study used a version of a temperature dataset that didn’t include adjustments for the lack of coverage in the Arctic.

Temperature IS rising…

The authors of the opinion piece write that “Global surface temperature has not increased significantly since 2000, discounting the larger-than-usual El Niño of 2015-16.”

This is a tired canard that has been fully debunked elsewhere. The argument rests on flawed, cherry-picked data and ignores the latest scientific understanding. First, when the flawed underlying satellite data was corrected, it showed warming since 1998 that was 140 percent faster than previously reported and consistent with other datasets. Second, the data is cherry-picked to fit the authors’ argument; it is plainly unscientific to discount the El Niño of 2015-16 but not the common La Niñas that masked some of the warming, nor the El Niño of 1997-98 that makes the warming thereafter appear to “slow down.” Third, El Niño was found to play a very minor role in the record-shattering global temperatures of 2015. While it played a relatively larger role in 2016, it is certainly not the cause of a century-long warming trend; it merely amplifies warming when it occurs – just as La Niñas mask warming when they occur.

Overall, five ground-based temperature datasets and two satellite datasets all from different scientific groups show rapid warming over the past 30 years that continues into the 21st century. The 2010s have been warmer than the 2000s, the 2000s were warmer than the 1990s, the 1990s were warmer than the 1980s, and the 1980s were warmer than the 1970s. And temperature changes are hardly the only indicator of a changing climate.

So the article’s central question – “Why should people world-wide pay drastic costs to cut emissions when the global temperature is acting as if those cuts have already been made?” – is specious. Global temperature is not acting as if those cuts have been made. And basic physics known since the 1800s shows that global temperature will continue on this path unless we cut emissions of greenhouse gases drastically.

In the U.S., we have also observed considerable warming. Yet Michaels and Maue further tried to discredit Hansen by asserting, without any evidence or source, that “No such spike has been measured” in the greater-than-average late-’80s and ’90s warming that Hansen’s 1988 paper suggested for the southeast U.S. and Midwest. First, several states in these regions have seen higher-than-average temperature rise, including Florida, Michigan, Minnesota, and Wisconsin. Second, reading Hansen’s actual paper reveals caveated language: there will be regional variations, and there is a “tendency” for the southeast and central U.S. to be warmer than average. Hansen also fully acknowledged that major improvements were needed in our understanding of the climate system and our ability to predict change, especially the urgent need for more global measurements. In 1988, for example, we did not yet understand important modes of variability that have governed temperature change in these regions.

and the planet IS reacting

The excess warmth has touched every continent and every ocean.

We’ve observed considerable melting of land ice, something that Hansen highlighted in testimony during a 2007 case on auto emissions. However, the opinion piece didn’t accurately depict his sentiments, paraphrasing his words as “most of Greenland’s ice would soon melt, raising sea levels 23 feet over the course of 100 years.”

Rather, Hansen was referring to ice melt in both Greenland and Antarctica. He stated that “it is nearly certain that West Antarctica and/or Greenland would disintegrate at some point if global warming approaches 3°C,” and he caveated his sea level rise estimate as “his opinion” – flagging it as opinion rather than a scientifically robust finding. The authors cite “a Nature study that found only modest ice loss after 6,000 years of much warmer temperatures than human activity could ever sustain.” But the same study acknowledges a major rise in sea level during that period – which, if it did not come from Greenland, came from Antarctica.

As for climate-related extreme events, which have been on the rise over the past 30 years: the opinion article claims that hurricanes have not gotten stronger, but observational evidence shows they have. The article claims that tornadoes have not gotten stronger, but that was never a mainstream prediction – and observations show that tornadoes are clustering together, causing more severe outbreaks.

Michaels and Maue finally conclude that the list of what has been predicted and didn’t happen “is long and tedious.” I’d like to see that list, because the sampling they provided is filled with inaccuracies and easily refuted.

Jim Hansen’s 1988 testimony was a landmark moment. No matter how the opponents of climate action try to sow doubt and confusion, the judgment of history is clear: Hansen was right.


The path forward for net-zero emissions climate policy

By Nat Keohane and Susanne Brooks

This post originally appeared in The Hill

Climate change is a defining threat of our generation. But the way forward has never been clearer. Electric power generation is being transformed by the rapid deployment of wind, solar and utility-scale storage. Technological innovation is reshaping transportation and industry. New means of capturing and storing carbon are on the horizon.

Even so, the challenge is monumental. To have a reasonable chance of avoiding the worst effects of climate change, the world must achieve “net-zero emissions” — taking as much carbon out of the atmosphere as we put into it — in this century. Here in the United States, we are currently emitting carbon pollution at seven times the rate that we are soaking it up. We must take advantage of every cost-effective opportunity to cut climate pollution now, while investing in the innovations that will put us on course for net-zero emissions as soon as possible.
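To make the “seven times” figure concrete, here is a back-of-envelope sketch (our arithmetic, not from the original post) of what it implies for net zero if sinks are held fixed:

```python
# If the U.S. emits 7 units of carbon for every 1 unit its sinks absorb,
# then with sinks held fixed, net zero requires cutting gross emissions
# from 7 units to 1 -- roughly an 86 percent reduction. Growing the
# sinks (e.g., via carbon capture) would shrink the required cut.
emissions, sinks = 7.0, 1.0
net_flux = emissions - sinks             # net addition to the atmosphere
required_cut = 1 - sinks / emissions     # fraction of emissions to eliminate
print(f"required emissions cut: {required_cut:.0%}")
```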

Economic and technological trends alone won’t do the trick. Waiting to act only deepens the challenge and increases the cost and pace of reductions needed. To unleash the full potential of breakthrough clean energy technologies, we need well-designed policies that accelerate the low-carbon transition rather than hinder it.

Encouragingly, action is already underway: cities, states, and businesses are forging ahead to enact policies and undertake initiatives to reduce pollution, building on momentum from the plummeting costs of clean energy technologies. Those efforts are crucial. But the world won’t solve climate change without American leadership at all levels. To cut climate pollution at the scale and pace that science tells us is necessary requires national action.

A limit on pollution and a cost to polluters

Climate policies must lock in pollution reductions, grow the economy, and protect vulnerable populations. Policies should establish enforceable limits on climate pollution while holding polluters accountable for their share of the costs. To do all that, while providing communities everywhere with access to clean, efficient, affordable energy, we need to harness the power of markets to drive investment, create jobs, spur innovation, and deliver the transformative change needed to build the clean energy economy.

We know such policies work, because we’ve tried them before. Flexible policies that set firm, declining limits on pollution and let businesses find the best ways to respond have helped meet environmental goals faster and more cheaply than expected, all while growing the economy – by penalizing pollution and rewarding new and better ways to cut emissions.

Performance-based policy and environmental integrity

The fundamental test of any climate policy is simple: Will it cut pollution at the pace and scale that the science demands?

The most straightforward way to cut carbon is to put a clear limit on pollution that guarantees the environmental outcome, while giving businesses flexibility to determine the best way to meet that limit. Ten U.S. states already have successful programs in place that take exactly this approach, known as cap and trade, and several others are moving in that direction.

Another approach, a carbon tax, also charges companies a cost for polluting. But making companies pay for their pollution doesn’t guarantee how much pollution they will cut. So for a tax to be effective, it must include an “environmental integrity mechanism” (EIM) that ties the tax to performance — and adjusts it, as necessary, to keep us on track to meet our environmental goals.
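As an illustration of how an EIM might work in practice, here is a minimal sketch (our own construction; the op-ed does not specify a formula) in which the tax rate ratchets upward whenever observed emissions overshoot the target trajectory:

```python
def adjust_tax(rate: float, observed: float, target: float,
               ratchet: float = 0.10) -> float:
    """Return the next period's carbon tax rate.

    If observed emissions exceed the target, raise the rate by the
    `ratchet` fraction; otherwise leave it unchanged. The 10 percent
    ratchet is a hypothetical parameter, not a proposed figure.
    """
    if observed > target:
        return rate * (1 + ratchet)
    return rate

# Starting at a hypothetical $40/ton with emissions above target,
# the rate rises by 10 percent for the next period.
next_rate = adjust_tax(40.0, observed=5_200, target=5_000)
```

Real proposals vary in the details – some adjust rates annually against a cumulative emissions budget – but the principle of tying the price to measured performance is the same.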

Regardless of the approach we take, the cornerstones of good policy design are the same: clear and measurable emission reduction goals, effective provisions to ensure they are met, and flexibility in how to meet them coupled with strong incentives to do it cheaply and efficiently. Given the importance of reaching net zero as soon as possible, good policy should also cast a wide net on sources and sinks of emissions — including by rewarding the innovators that pull carbon from the sky, whether by new technologies or natural sinks like forests and soils.

Environmental integrity also means preserving the ability of states and cities to take action — and, when necessary, to push further and faster than the federal government. It means protecting the Environmental Protection Agency’s statutory authority to protect the public from climate pollution and other dangerous sources of air pollution. The landmark protections established under the Clean Air Act over its more than 40-year history have saved hundreds of thousands of lives and protected the health of our children and the most vulnerable.

Here too, the key metric is environmental performance. The safeguards provided by EPA’s existing authority to limit climate pollution are particularly vital in the context of climate policies — such as a carbon tax without enforceable pollution limits — that lack provisions to ensure they will achieve their intended goals.

It’s time for America to lead again. Leadership means policies that cut pollution in line with science-based goals — backed up by provisions that guarantee the goals are met. Doing this in a way that is fair, and at lowest cost, will ensure the shared prosperity, growth, and security that are the promise of a safer climate.

Susanne Brooks is director of U.S. Climate Policy & Analysis.

Nat Keohane is senior vice president for Climate at Environmental Defense Fund.


Cherry blossoms: Predicting peak bloom in a warming world with weirder weather

USDA photo by Scott Bauer

Every March, Washington D.C. anxiously anticipates the arrival of the city’s world-famous cherry blossoms.

Millions of people flood the National Mall each year to observe the “peak bloom” – defined by the National Park Service as the day when 70 percent of the Yoshino cherry blossoms surrounding the Tidal Basin have opened.

Fluctuating weather patterns render predictions of peak bloom notoriously fickle. Experts consider it impossible to accurately estimate the cherry blossoms’ vibrant debut more than 10 days in advance.

This year has been no exception – with three changes to the 2018 peak bloom date prediction since March 1st.

While bloom forecasting is a historically temperamental exercise, climate change is now further complicating matters.

As global average surface temperatures continue to rise, D.C. has felt the heat. Weather station measurements from the city have recorded a 1.6 degree Celsius per century increase in regional temperature – double the global average warming rate. The warmer winters associated with these increasing temperatures may help explain why between 1921 and 2016 peak bloom dates have shifted earlier by about five days.
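The roughly five-day shift can be estimated with an ordinary least-squares trend on the record of peak bloom dates. A minimal sketch, using a synthetic noise-free series in place of the actual National Park Service record:

```python
import numpy as np

# Synthetic stand-in for the 1921-2016 peak bloom record: day-of-year
# values drifting linearly 5 days earlier over the period. Real data
# would be noisy; the fitting procedure is the same.
years = np.arange(1921, 2017)
bloom_doy = 95.0 - 5.0 * (years - 1921) / (2016 - 1921)

slope, intercept = np.polyfit(years, bloom_doy, 1)   # days per year
total_shift = slope * (years[-1] - years[0])         # days over the record
print(f"trend: {slope * 10:.2f} days/decade, total shift: {total_shift:.1f} days")
```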

A warming regional climate may influence seasonal trends, but blooms are still heavily affected by short-term changes in the weather. While 2018 peak bloom was originally projected to occur between March 17th and 20th – early in the season due to the city’s exceptionally warm February – a major snowstorm and cold temperatures persisting through March delayed the arrival until April 5th.

It may initially seem that heavy snowstorms and colder temperatures are inconsistent with climate change. However, there is a growing body of evidence that shows how changes in atmospheric circulation patterns associated with rapid warming in the Arctic may actually be linked to these dramatic cold snaps in the mid-latitudes. Increased moisture in the atmosphere from a warming world also allows for heavier precipitation events, including snowfall.

These opposing consequences of climate change – hotter temperatures with intermittent cold snaps – make the bloom schedule of D.C.’s cherry blossoms even more complex. But one thing is clear: predictions will certainly not get any easier.


The Winter Olympics on hostile terrain: How climate change is harming winter sports

The 2018 Winter Olympics have drawn to a close, and four years will pass before the world’s next opportunity to celebrate the Winter Games.

During that time, emerging athletes and innovations in training methods will inevitably change the face of the sports. But another more malevolent force of change is brewing – one that has begun to shift the landscape of the Games into hostile terrain.

As climate change continues to progress, adverse weather conditions threaten our beloved winter sports as we know them.

Familiar locations no longer suitable for outdoor sports

Researchers from the University of Waterloo recently determined that shifting weather conditions due to human-induced climate change will render 13 of the previous 19 hosts of the Winter Olympics too warm for outdoor sports by the end of the century.

Even recent host cities have faced new challenges in our changing climate. The 2014 Winter Olympics in Sochi, Russia, for example, experienced peak temperatures of 61 degrees Fahrenheit, inducing poor snow conditions that led to various delays and injuries throughout the weeks of competition.

Winter sport athletes have also begun to find their trusted off-season training locations unrecognizable. Glaciers that once provided ideal conditions for outdoor summer training have been slashed by trails of melt water and are rapidly disintegrating. U.S. athletes who previously looked to the Rocky Mountains to support their off-season practice must now travel across the globe to regions such as Switzerland, further exacerbating global warming as increased international travel pumps greenhouse gases into our atmosphere.

Accessibility diminishes for potential athletes

In the years of practice before an athlete secures sponsorships or funding from national Olympic committees, training and associated travel costs must be self-supported. The necessity of cross-continental travel thus not only makes the effects of our changing climate tangible, but also confines the talent pools from which Olympic athletes may emerge to socioeconomic groups that can afford international travel.

The U.S. National Hockey League (NHL) has voiced similar concerns about athletes’ future training access. While the development of indoor rinks has allowed hockey to be played globally, the sport has traditionally relied on backyard rinks and ponds to provide players with their first introduction to skating. These more accessible venues are becoming progressively more limited as global temperatures continue to rise.

Informal backyard matches are not the only events threatened by climate change, as historic outdoor hockey events including the NHL Winter Classic, Heritage Classic, and Stadium Series may also be lost to warming conditions.

Widespread economic implications

We can shift these winter sports indoors or to higher latitudes in order to extend their lifetimes, but what happens to the regions left behind?

In the U.S. alone, snow-based recreation generates $67 billion per year and supports over 900,000 jobs. In a single year with poor snow conditions, more than $1 billion in revenue and 17,350 jobs can be lost.

Such threats are not looming in the distant future – changes are already taking shape.

As precipitation begins to fall as rain rather than snow throughout winter months, U.S. ski resorts are forced to spend more than 50 percent of their annual energy budgets on artificial snowmaking.

Canada’s average 4.5 degree Fahrenheit temperature rise between 1951 and 2005 has been accompanied by a 20 percent decrease in the country’s outdoor hockey season.

Future impacts are only expected to worsen, with the U.S. ski season projected to be cut in half by 2050.

Athletes and athletic associations are recognizing the impacts of climate change

Many competitors and athletic associations have already acknowledged the undeniable role of climate change in threatening the livelihood of these winter sports:

  • The National Ski Areas Association adopted their Climate Challenge program, aiming to help reduce greenhouse gas emissions and costs of energy use for participating ski areas.
  • Preceding the 2014 Winter Games, 75 Olympic medalists in skiing and snowboarding wrote a letter to then-President Barack Obama calling for a firmer stance on climate change mitigation and clean energy development.
  • The NHL used their 2014 sustainability report to voice their “vested interest” in climate change, and a year later participated in the Paris climate conference discussions.
  • A group of athletes and companies has come together to create Protect Our Winters, which educates the public and advocates for policies that mitigate the effects of climate change.

The threat of human-induced climate change recognized by these leaders applies to more than just winter events. Summer sports, such as golf and baseball, are also feeling the strain of our warming world.

In the spirit of the Olympic Games, we must unite as global citizens to join in our most important race – the race to defend the future of our planet.


Natural disasters are no longer purely natural

You may have heard the alarming news that weather and climate disasters in the U.S. killed 362 people in 2017 and caused a record $306 billion in damages.

But also alarming is the fact that many news outlets are still referring to these events as “natural disasters.”

Southeast Texas after Hurricane Harvey – a not-purely-natural disaster. Photo: U.S. Department of Defense

With recent advances in science, researchers have found that human-caused climate change plays a major role in making certain events occur and/or making them worse. That means that many “natural disasters” are no longer purely “natural.”

Here is a look at some not-so-natural disasters:

  • Hurricane Harvey 2017: human-caused climate change made record rainfall over Houston around three times more likely and 15 percent more intense
  • European Extreme Heat 2017: human-caused climate change made intensity and frequency of such extreme heat at least 10 times as likely in Portugal and Spain
  • Australian Extreme Heat 2017: maximum summer temperatures like those seen during 2016-2017 are now at least 10 times more likely with human-caused climate change
  • Louisiana Downpours 2016: human-caused climate change made events like this 40 percent more likely and increased rainfall intensity by around 10 percent
  • European Rainstorms 2016: human-caused climate change made probability of three-day extreme rainfall this season at least 40 percent more likely in France
  • UK Storm Desmond 2015: human-caused climate change made extreme regional rainfall roughly 60 percent more likely
  • Argentinian Heat Wave 2013/2014: human-caused climate change made the event around five times more likely

By employing the term “natural disasters,” news outlets and others inadvertently imply that all of these events are just unfortunate incidents – rather than consequences of our actions.

This seemingly innocuous phrase supports the idea that dangerous weather is out of our control.

But we do have some control over their frequency and intensity – and that control comes through our emissions of heat-trapping gases.

We need to act on climate, and we need to do it now. Pointing out that we worsen and may even cause these weather disasters may help convince people to do what needs to be done.


A look back at 2017: The year in weather disasters – and the connection to climate change

Port Arthur, Texas after Hurricane Harvey. Photo: SC-HART

From hurricanes to heat waves, 2017 produced countless headlines concerning extreme weather and the devastation left in its wake.

We tend to think of extreme weather as an unpredictable, external source of destruction. When faced with catastrophes, we don’t always recognize the role we play in intensifying their impacts.

But as human-induced climate change continues to progress, extreme weather is becoming more frequent and dangerous. Without immediate greenhouse gas mitigation efforts, last year’s unprecedented disasters may soon become the norm.

Here’s a look back at the worst weather of 2017 and how these events may have been affected by climate change (and scroll down to see a timeline of the year’s worst weather).


  1. Massive flooding drowns California – Intense rains in January provided a much needed respite from California’s longstanding drought, but quickly tipped from satiating to inundating. Within the first 11 days of the year, California received 25 percent of the state’s average annual rainfall. Flooding and mudslides forced more than 200,000 people to evacuate their homes and caused an estimated $1.5 billion in property and infrastructure damages.

    The rapid shift from drought to flooding may be a marker of climate change. As temperatures warm, precipitation falling as rain rather than snow and expedited snow melt lead to the earlier filling of reservoirs. Such a shift increases the likelihood of both summer droughts and winter flooding, with the latter intensified by a warming atmosphere that holds more moisture and deposits greater precipitation in heavy rainfall events.

  2. Heat wave sizzles in Australia – High heat persisting overnight in the New South Wales and Southern Queensland regions of Australia induced a series of devastating heat waves throughout January and February. Following a record-setting December in which Sydney reached its highest-ever overnight minimum temperature for that month, the city experienced its hottest January night since weather records began in the mid-1800s.

    Analysis has shown that these extreme summer temperatures are 10 times as likely due to the influence of climate change. With rising global temperatures, heat waves are expected to become more intense, frequent, and longer lasting. Australia was just one of many regions to experience these developing changes in 2017.

  3. Extreme heat melts the North Pole – Recent history of escalating temperatures in the Arctic could not dull the shock when temperatures near the North Pole reached more than 50 degrees Fahrenheit above regional averages this winter. The heat wave associated with this spike was dramatic not only in intensity but also in frequency – heat this extreme usually occurs about once each decade, yet this event was the third recorded in just over a month.

    There exists an essential feedback between sea ice melt and Arctic warming – the more we warm, the more ice melts, lowering the region’s reflectivity of sunlight and increasing warming intensity. While these processes are usually gradual, weather variability can kick dramatic warming events into high gear. The winter heat waves experienced in the Arctic provide examples of such a combination, which may occur every few years should we reach a 2 degree Celsius global temperature rise.


  4. Drought brings risk of famine to Somalia – At a time when a staggering 6.2 million people – half of Somalia’s population – required urgent humanitarian aid, the World Health Organization released an official warning that Somalia was on the verge of famine. That would mark Somalia’s third famine in 25 years, the most recent of which led to the deaths of 260,000 people.

    After years of scarce rainfall, the nation continues to face widespread food insecurity, reduced access to clean water, and increased risk for drought-related illness. Analysis of both observational and modeling data suggests that only a small increase in the nation’s dry extremes can be attributed to climate change. However, as dry regions become progressively drier in a warming climate, similar national disasters may become increasingly common.


  5. Extreme heat blisters the Southwestern United States – In June, an intense heat wave blazed across the Southwestern U.S. and left record high temperatures in its trail. Daily records included 127 degrees Fahrenheit in Death Valley. All-time records were reached in Las Vegas, Nevada and Needles, California at 117 degrees Fahrenheit and 125 degrees Fahrenheit, respectively. High heat triggered public health concerns and led to power outages in the California Central Valley, the buckling of highways in West Sacramento, and the cancelation of 50 flights out of Phoenix Sky Harbor Airport for American Airlines alone.

    While high temperatures are typical of the low-humidity pre-monsoon season in the Southwest, the unprecedented magnitude of these numbers and the shift towards an earlier extreme heat season may be a signal of the changing climate.

Greenland's wildfires, as seen from space. Photo: NASA


  6. Once-icy Greenland engulfed in flames – In historically icy Greenland, wildfires have typically been of minimal concern. As a result, when the largest wildfire in the country’s history broke out at the end of July, there existed virtually no framework to assess the event’s health and infrastructure risk.

    As global temperatures rise and Greenland’s ice melts, the once-barren landscape can fill with vegetation, increasing the likelihood of forest fire outbreaks. Climate change simultaneously lengthens and intensifies drought in the region while increasing the likelihood of thunderstorms (a major catalyst of wildfires). Wildfires in turn intensify regional warming, as their soot deposits black carbon on the pristine snow cover, reducing the region’s reflectivity and accelerating ice sheet melt.

  7. “Lucifer” plagues Europe – Europe’s most sustained extreme heat event since the deadly 2003 heat wave (in which climate change was responsible for half of the 1,050 recorded deaths) brought temperatures so reminiscent of the Inferno that locals named the event “Lucifer.” As temperatures throughout the region surpassed 104 degrees Fahrenheit, two deaths were recorded and a 15 percent increase in hospital emergency admissions was observed in Italy. The heat wave also caused pollution levels to soar and spurred wildfires throughout Portugal, just a few months after fires in Pedrógão Grande killed 60 and injured more than 250.

    Research on previous extreme heat in Europe has shown that climate change has made the maximum summer temperatures observed in regions such as Spain 500 times more likely than in the pre-industrial era. As global temperatures continue to rise, extreme heat will only become more common.

  4. South Asia inundated by widespread floods – More than 41 million people were affected by massive floods and landslides that rippled through Bangladesh, India, and Nepal, killing more than 1,300 people and displacing 600,000. Two simultaneous pressures – the push for urbanization and the neglect of sustainable drainage systems – render the region highly vulnerable to these disasters.

    The link between the South Asian monsoon season and climate change is complex, depending on a variety of entwined weather systems and intricate regional topography. More study is needed to predict the influence of a changing climate on this monsoon system, so that the region can prepare for impacts and increase communities’ resilience.

Puerto Rico after Hurricane Irma. Photo: U.S. Customs and Border Protection


  5. Atlantic hurricane season leaves devastation in its wake – Deadly storms Harvey, Irma, Maria, and Ophelia dominated the news from August through October, killing more than 150 people and causing more than $300 billion in damages in the United States alone.

    Because the atmosphere holds roughly seven percent more moisture for each one degree Celsius of warming, individual tropical storms can now deposit more rainfall. Recent studies estimate that climate change made Harvey’s extreme rainfall three times more likely and 15 percent more intense; an estimated 27 trillion gallons of rain fell over Texas and Louisiana from Hurricane Harvey alone, setting the record for the highest tropical cyclone rainfall in the continental U.S. Sea level rise of 10 to 12 inches in cities such as Miami dramatically increased the destruction caused by Hurricane Irma’s storm surges, which reached as high as 10 feet. Warming waters that drive hurricane development and strength ushered in Hurricane Maria – Puerto Rico’s strongest storm in 85 years – and Hurricane Ophelia, which set records as the farthest-east major hurricane observed in the Atlantic and the worst storm in recorded history to make landfall in Ireland.
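    The “seven percent more moisture per degree” figure follows from the Clausius-Clapeyron relation. A minimal sketch of that scaling, using the standard August-Roche-Magnus approximation for saturation vapor pressure (the function names are illustrative, and the constants are textbook values, not figures from this article):

    ```python
    import math

    def saturation_vapor_pressure_hpa(temp_c: float) -> float:
        """Saturation vapor pressure in hPa (August-Roche-Magnus approximation)."""
        return 6.1094 * math.exp(17.625 * temp_c / (temp_c + 243.04))

    def moisture_capacity_increase_pct(base_temp_c: float, warming_c: float) -> float:
        """Percent rise in the air's water-holding capacity after warming_c degrees."""
        before = saturation_vapor_pressure_hpa(base_temp_c)
        after = saturation_vapor_pressure_hpa(base_temp_c + warming_c)
        return 100.0 * (after - before) / before

    # One degree of warming from a 20 C baseline yields roughly a 6-7 percent
    # increase in water-holding capacity, consistent with the figure above.
    print(round(moisture_capacity_increase_pct(20.0, 1.0), 1))
    ```

    The exact percentage varies slightly with the baseline temperature, which is why the relation is usually quoted as “about seven percent per degree.”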


  6. Western United States’ forests set ablaze – Wildfires devastated Northern California this October, with more than 245,000 acres burned and 14,000 homes destroyed. Insured losses in the region amounted to more than $3 billion, but the danger does not end when the fires are extinguished. The remaining ash and debris (including hazardous waste, electronic waste, and heavy metal contamination) can be spread by wind and rain, posing even further health concerns to those nearby. The increased temperatures and decreased water availability associated with climate change increase the risk of wildfires. Due to recent temperature and dryness extremes in California, even engine heat from parked cars has been cited as the source of major fires.

    The fire season has also begun to lengthen as spring and summer temperatures rise and snowmelt begins earlier. California wildfires ignited once again in December outside of Los Angeles, creating even more destruction than those in the north. Covering more than 425 square miles and displacing more than 100,000 people, the Thomas fire ranks as the second largest fire in the state’s history. While the dryness and high temperatures that triggered the fire’s outbreak are associated with La Niña’s current presence in the region, climate change exacerbates both conditions, magnifying the losses experienced by California residents.

The direct influence of climate change on many of these events suggests that more devastating catastrophes lie ahead. But the future is not written in stone.

If we recognize the intensification of these extreme weather events, the power to decrease greenhouse gas emissions worldwide and prevent increasingly hostile weather remains in our hands.
