28 September 2013

Five Points on the IPCC Report (Wonky, Long)


Here are two reactions to the IPCC:
"The good news is our understanding of the climate system and our impact on it has improved immensely. The bad news is that the more we know, the more precarious the future looks. There's a clear message to Governments here, and the window for action is narrowing fast. If the last IPCC report was a wake up call, this one is a screaming siren." (source)
and
"Consider the case closed on global warming." (source)
Both of these quotes come from reactions to the February 2007 release of the IPCC's Fourth Assessment Report (and specifically the Summary for Policy Makers of its Working Group I) -- there are many more such reactions here.

The release of the IPCC's Fifth Assessment Report (again, technically the SPM of its WGI, available here in PDF) should give anyone following the climate issue a deep sense of deja vu, if not a full-on case of Groundhog Day syndrome. We have seen this all before.

At some risk of contributing to the deja vu, below I suggest 5 important points to take from the IPCC report released yesterday.

1. The core scientific understandings remain unchanged

The IPCC deserves much praise for bringing to the attention of the public and policy makers the fact that humans influence the climate system and that influence presents some risks. This message represents continuity with past reports. As George Monbiot explains:
There are no radical departures in this report from the previous assessment, published in 2007
Some have asserted that the IPCC's attention to a carbon budget (a trillion tons) represents something new. It is not. I discussed it in a 2006 book review (PDF) and a 2009 paper (PDF), and it certainly wasn't original to me. What is interesting is that the IPCC's WGI is taking some steps into the territory of IPCC WGIII and toward a discussion of policy options.

Of course, in public debates some will emphasize the scarier end of the spectrum of uncertainty and others will emphasize the more benign end. The reality is that human-caused climate change is not about certainties, but about risks and ignorance, a point well characterized by the late Steve Schneider more than a decade ago (source in PDF):
I readily confess a lingering frustration: uncertainties so infuse the issue of climate change that it is still impossible to rule out either mild or catastrophic outcomes, let alone provide confident probabilities for all the claims and counterclaims made about environmental problems.
The IPCC report already is being spun silly. Underneath the spin is an important core message.

2. The IPCC itself is still engaged in PR spin and messaging

The IPCC AR4 got into some trouble for its efforts to spin science beyond what it could bear. The 2035 glacier issue got the most attention, but for me it was the egregious treatment of work I have been involved in related to disasters that really caused me to question the integrity of the organization. Unfortunately, vestiges of such tactics appear to persist.

For instance, in the AR4 released in 2007, the IPCC was willing to highlight the significance of 6 years' worth of global temperature data in supporting its conclusions:
Six additional years of observations since the TAR (Chapter 3) show that temperatures are continuing to warm near the surface of the planet. The annual global mean temperature for every year since the TAR has been among the 10 warmest years since the beginning of the instrumental record.
Now with global temperatures in an extended hiatus, the IPCC has reversed course and told us that such short-term periods are actually irrelevant to its arguments:
Due to natural variability, trends based on short records are very sensitive to the beginning and end dates and do not in general reflect long-term climate trends. As one example, the rate of warming over the past 15 years (1998–2012; 0.05 [–0.05 to +0.15] °C per decade), which begins with a strong El Niño, is smaller than the rate calculated since 1951 (1951–2012; 0.12 [0.08 to 0.14] °C per decade)5. {2.4}
Back in 2006 my father and I warned against presenting too tidy a case based on an expectation of sustained surface temperature warming. (And he is on record in the scientific literature making such a case more than 10 years ago -- see this paper in PDF -- yet another subject on which he has been proven right after first being attacked, but more on that another time!).

The IPCC may be absolutely correct in its assertion that the so-called "hiatus" is of little relevance to its arguments -- no doubt we'll be hearing more about this in coming weeks and months. However, the IPCC's past willingness to spin the data first one way, and now another, does not build confidence in its ability to play things straight.

Speaking for the IPCC, Princeton's Michael Oppenheimer explains that the full IPCC report, to be released next week, will deal comprehensively with the "hiatus":
[T]he [IPCC] scientists looked at [the hiatus] very carefully. There's an extensive discussion of it in the detailed background documents [that] will be made public on Monday.

There's been a slowdown. It hasn't stopped. The warming rate has been slower over the last 15 years than the long-term trend.

And that is believed to be because the climate is quite variable. If you look at the long-term record, there are bumps upwards, there are bumps downwards, and there are plateaus like this one. After every bump downward or every plateau, the climate change then accelerates again. Now, we can't be sure that is going to happen, but it's a good bet.

The best possible -- the best -- the leading explanation of this is that heat tends to hide in the ocean sometimes. But when heat hides in the ocean, it later comes out and reappears in the atmosphere, and then warming resumes faster than before. We don't know this for certain. We will find out over the next few years.
UPDATE 9/29: Pete Spotts reports a somewhat different account than Oppenheimer's on how the IPCC handled the "hiatus."

At the University of Nottingham, Reiner Grundmann has created word clouds of the 2007 IPCC SPM and its successor released yesterday. They show how fundamentally the IPCC has altered its emphasis.

2007 IPCC AR4 WGI SPM word cloud (source) -- made with WordItOut

2013 IPCC AR5 WGI SPM word cloud (source) -- made with WordItOut

Last week, I noted that the 2007 IPCC WGI had expressed 10 statements (in the entire report) with 95% or greater confidence. Only one of these appeared in the SPM. In the 2013 IPCC WGI SPM there are 18 such statements expressed at a 95% or greater likelihood. Whether this reflects a change in approach or a change in knowledge remains to be seen, but it is a significant difference in presentation.

Here is my bottom line: I trust the science being reported by the IPCC. I still do not trust the IPCC to faithfully report that science without trying to spin a message.

3. We will not be able to clearly distinguish that human influence from natural variability for decades

The IPCC SPM explains:
Internal variability will continue to be a major influence on climate, particularly in the near-term and at the regional scale.
This means that there are exceedingly few variables in which human-caused climate change can be detected and attributed over the coming "near-term" (how long is that?). Apparently we can now add global surface temperatures to that list (at least over 15 years; it is not clear what the IPCC thinks is long enough -- maybe we'll learn the answer in the full report).

In theory, such a conclusion should put to rest the incessant "wiggle watching" of various climate variables that goes on with respect to multi-year, yearly and even shorter time periods. Of course, in practice wiggle watching is a great pastime of the climate warriors on either side. The IPCC's (thus far) unwillingness to address head-on the issues associated with the "hiatus" almost guarantees that debates over the "wiggles" will continue.

The bottom line lesson to take from the IPCC is that such "wiggle watching" will be largely irrelevant to its core findings (discussed above under #1).

4. Actions to mitigate climate change through reductions in carbon dioxide (and other greenhouse gases) will not have a detectable effect on climate until after mid-century.

The IPCC explains:
By the mid-21st century the magnitudes of the projected changes are substantially affected by the choice of emissions scenario.
This long time-scale for a detectable impact of mitigation can also be seen in Table SPM.2, where the uncertainty ranges for the temperature increase projected for 2065 under the highest and lowest emissions scenarios overlap. For sea level rise the ranges are just about identical. So, under the assumption that the models can accurately predict future temperature change, even very aggressive emissions reductions -- of a sort now thought unrealistic -- would not produce effects we could conclusively see in global average temperature by 2065. (Ironically, this also means that the only way we could actually verify century-long climate models against data would be to not engage in mitigation!)
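For readers who want to check the overlap claim for themselves, here is a minimal sketch (in Python) of the comparison; the ranges used are approximate, illustrative values of the kind reported in Table SPM.2 for the lowest and highest emissions scenarios, not the exact table entries.

```python
# Illustrative only: test whether two projected-warming uncertainty ranges
# intersect. The numbers below are approximate stand-ins for the 2046-2065
# "likely" ranges (deg C relative to 1986-2005) under the lowest and highest
# emissions scenarios; substitute the exact values from Table SPM.2 as needed.

def ranges_overlap(lo1, hi1, lo2, hi2):
    """Return True if the intervals [lo1, hi1] and [lo2, hi2] intersect."""
    return max(lo1, lo2) <= min(hi1, hi2)

low_scenario = (0.4, 1.6)    # approximate range, lowest-emissions scenario
high_scenario = (1.4, 2.6)   # approximate range, highest-emissions scenario

print(ranges_overlap(*low_scenario, *high_scenario))  # True -> the ranges overlap
```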

This conclusion is not new (I highlighted it in 2006 testimony before the US Senate, here in PDF). The IPCC should put to rest silly claims that action on emissions, even very aggressive actions, can have a meaningful effect on short-term weather and climate. Here is a prominent example of such a claim from Al Gore eight days ago:
Three years ago, Congress failed to put a price on carbon and, in doing so, allowed global warming pollution to continue unabated. We have seen the disturbing consequences that the climate crisis has to offer—from a drought that covered 60% of our nation to Superstorm Sandy which wreaked havoc and cost the taxpayers billions, from wildfires spreading across large areas of the American West to severe flooding in cities all across our country—we have seen what happens when we fail to act. 
The asymmetry in the costs and benefits of climate policy, as climate policy, is one of the fundamental motivations behind a Hartwellian approach to climate.

5. There is not a strong scientific basis for claiming a discernible effect of human-caused climate change on hurricanes, floods, tornadoes or drought.

This is a familiar conclusion to readers of this blog, so I won't belabor it (more to come soon on this). Here is what the IPCC SPM says about each looking out to mid-century:
  • Hurricanes (tropical cyclones):  "Low confidence" in both a "human contribution to observed changes" and "likelihood of future changes"
  • Floods: No comments in the SPM
  • Tornadoes: No comments in the SPM
  • Drought:  "Low confidence" in both a "human contribution to observed changes" and "likelihood of future changes"
The conclusions with respect to hurricanes and drought both represent a walking back from more aggressive conclusions reported in 2007, and should not be a surprise to readers here, as that is what the literature says. Kudos to the IPCC for getting this right.

25 September 2013

The Flood Next Time

UPDATE: CU/CIRES/NOAA/WWA have just released their "preliminary assessment of Front Range floods." You can find it here in PDF.

I have an op-ed published today in the Boulder Daily Camera, titled "The Flood Next Time." The intended audience is the Boulder community, and here is an excerpt:
One of the first steps that Boulder should take in the near term is a rigorous evaluation of how we did in the flood. What actions did we take in recent decades that worked? Where can we do better?

The city has seen an enormous amount of development since 1969, and the floods of 2013 tell us where the water in a flood actually goes. Infrastructure -- including especially in our mountain communities and our city storm sewer systems -- needs a hard look. Improving that infrastructure will require investments that won't be cheap, and which will need to be evaluated against competing, worthwhile priorities.
Comments welcomed.

Also, there is an all-star panel discussion of experts today at CIRES organized by the Western Water Assessment looking at the floods across the Front Range. Details here. It will be livestreamed and will be worth your time.

24 September 2013

A Flood Foretold

Back in 1975 Gilbert White and Eugene Haas co-authored a book titled "Assessment of Research on Natural Hazards" (MIT Press) which became a classic of the field. In it, they describe three future disasters, in Miami, San Francisco and Boulder. I have scanned in their 6-page description of the "hypothetical 1975 Boulder flood." It is here in PDF.

23 September 2013

What was the IPCC AR4 Most Certain About?

With attention focused on the release later this week of the Working Group I report of the IPCC's Fifth Assessment, I thought that it would be worthwhile to present the findings of Working Group I from the Fourth Assessment (AR4) which were expressed with the greatest certainty, that is, with 95% confidence or greater.

In a 2011 paper published in Climatic Change (here in PDF), Rachel Jonassen and I looked at all of the 2,744 findings presented by the AR4 accompanied by associated likelihood terminology (shown above). Of those total findings, 573 came from Working Group I. Of those 573, a total of 17 non-unique findings were presented with a confidence level of greater than 95%. Note that the Summary for Policy Makers introduced a likelihood term not included in the guidance -- "unequivocal" -- as follows:
  • Warming of the climate system is unequivocal, as is now evident from observations of increases in global average air and ocean temperatures, widespread melting of snow and ice, and rising global average sea level (see Figure SPM.3). {3.2, 4.2, 5.5}
Including the "unequivocal" finding there appears to be about 10 unique findings that were expressed with a 95% or greater likelihood.

Here is the full list:

From the Summary for Policy Makers
  • It is virtually certain that over most land areas, there will be warmer and fewer cold days and nights, warmer and more frequent hot days and nights. [Tables 3.7, 3.8, 9.4; Sections 3.8, 5.5, 9.7, 11.2–11.9][WG1]
  • The observed widespread warming of the atmosphere and ocean, together with ice mass loss, support the conclusion that it is extremely unlikely that global climate change of the past 50 years can be explained without external forcing, and very likely that it is not due to known natural causes alone. [4.8, 5.2, 9.4, 9.5, 9.7][WG1]
From the Technical Summary
  • It is virtually certain that over most land areas, there will be warmer and fewer cold days and nights, warmer and more frequent hot days and nights. [TS/3.8, 5.5, 9.7, 11.2–11.9; Tables 3.7, 3.8, 9.4][WG1]
  • It is virtually certain that anthropogenic aerosols produce a net negative radiative forcing (cooling influence) with a greater magnitude in the NH than in the SH. [TS/2.9, 9.2][WG1]
  • From new estimates of the combined anthropogenic forcing due to greenhouse gases, aerosols and land surface changes, it is extremely likely that human activities have exerted a substantial net warming influence on climate since 1750. [TS/2.9][WG1]
  • It is extremely unlikely (less than 5%) that the global pattern of warming during the past half century can be explained without external forcing, and very unlikely that it is due to known natural external causes alone. The warming occurred in both the ocean and the atmosphere and took place at a time when natural external forcing factors would likely have produced cooling. [TS/2.9, 3.2, 5.2, 9.4, 9.5, 9.7][WG1]
From Chapter 2
  • The combined anthropogenic RF is estimated to be +1.6 [–1.0, +0.8] W m–2, indicating that, since 1750, it is extremely likely that humans have exerted a substantial warming influence on climate. This RF estimate is likely to be at least five times greater than that due to solar irradiance changes. For the period 1950 to 2005, it is exceptionally unlikely that the combined natural RF (solar irradiance plus volcanic aerosol) has had a warming influence comparable to that of the combined anthropogenic RF. It is extremely likely that the total anthropogenic RF is larger than +0.6 W m–2. It remains extremely likely that the combined anthropogenic RF is both positive and substantial (best estimate: +1.6 W m–2). [2.ES, 2.9.2, 9.2.1.1-9.2.1.2][WG1]
  • From the current knowledge of individual forcing mechanisms presented here it remains extremely likely that the combined anthropogenic RF is both positive and substantial (best estimate: +1.6 W m–2). [2.9.2][WG1]
  • Over particularly the 1950 to 2005 period, the combined natural forcing has been either negative or slightly positive (less than approximately 0.2 W m–2), reaffirming and extending the conclusions in the TAR. Therefore, it is exceptionally unlikely that natural RFs could have contributed a positive RF of comparable magnitude to the combined anthropogenic RF term over the period 1950 to 2005 [Figure 2.23][WG1]
From Chapter 3
  • The global land warming trend discussed is very unlikely to be influenced significantly by increasing urbanisation [3.2.2.2][WG1]
From Chapter 6
  • It is virtually certain that millennial-scale changes in atmospheric CO2 associated with individual antarctic warm events were less than 25 ppm during the last glacial period. This suggests that the associated changes in North Atlantic Deep Water formation and in the large-scale deposition of wind-borne iron in the Southern Ocean had limited impact on CO2. [6.ES][WG1]
From Chapter 9
  • The combined anthropogenic RF is estimated to be +1.6 [–1.0, +0.8] W m–2, indicating that, since 1750, it is extremely likely that humans have exerted a substantial warming influence on climate. This RF estimate is likely to be at least five times greater than that due to solar irradiance changes. For the period 1950 to 2005, it is exceptionally unlikely that the combined natural RF (solar irradiance plus volcanic aerosol) has had a warming influence comparable to that of the combined anthropogenic RF. It is extremely likely that the total anthropogenic RF is larger than +0.6 W m–2. It remains extremely likely that the combined anthropogenic RF is both positive and substantial (best estimate: +1.6 W m–2). [2.ES, 2.9.2, 9.2.1.1-9.2.1.2][WG1]
  • Extremely likely: Warming during the past half century cannot be explained without external radiative forcing. Anthropogenic change has been detected in surface temperature with very high significance levels (less than 1% error probability). This conclusion is strengthened by detection of anthropogenic change in the upper ocean with high significance level [9.4.1.2, 9.4.1.4, 9.5.1.1, 9.3.3.2, 9.7][WG1]
  • It is extremely unlikely (less than 5%) that recent global warming is due to internal variability alone such as might arise from El Niño (Section 9.4.1). The widespread nature of the warming (Figures 3.9 and 9.6) reduces the possibility that the warming could have resulted from internal variability. No known mode of internal variability leads to such widespread, near universal warming as has been observed in the past few decades. [9.7][WG1]
  • For the period 1950 to 2005, it is exceptionally unlikely that the combined natural RF (solar irradiance plus volcanic aerosol) has had a warming influence comparable to that of the combined anthropogenic RF. It is extremely likely that the total anthropogenic RF is larger than +0.6 W m–2. [2.ES, 2.9.2, 9.2.1.1-9.2.1.2][WG1]
  • Many observed changes in surface and free atmospheric temperature, ocean temperature and sea ice extent, and some large-scale changes in the atmospheric circulation over the 20th century are distinct from internal variability and consistent with the expected response to anthropogenic forcing. The simultaneous increase in energy content of all the major components of the climate system as well as the magnitude and pattern of warming within and across the different components supports the conclusion that the cause of the warming is extremely unlikely (less than 5%) to be the result of internal processes alone.
Details on our paper:

Jonassen, R. and R. Pielke, Jr., 2011. Improving conveyance of uncertainties in the findings of the IPCC, Climatic Change, 108:745-753, http://dx.doi.org/10.1007/s10584-011-0185-7.
Abstract: Authors of the Intergovernmental Panel on Climate Change (IPCC) Fourth Assessment Report (AR4) received guidance on reporting understanding, certainty and/or confidence in findings using a common language, to better communicate with decision makers. However, a review of the IPCC conducted by the InterAcademy Council (2010) found that “the guidance was not consistently followed in AR4, leading to unnecessary errors . . . the guidance was often applied to statements that are so vague they cannot be falsified. In these cases the impression was often left, quite incorrectly, that a substantive finding was being presented.” Our comprehensive and quantitative analysis of findings and associated uncertainty in the AR4 supports the IAC findings and suggests opportunities for improvement in future assessments.

22 September 2013

Revisiting the "Consistent With" Canard

Over the past few days I've been engaged in a lively debate with a colleague over whether it is meaningful to proclaim that the extreme rainfall observed in Colorado several weeks ago is "consistent with" predictions of more intense rainfall associated with human-caused climate change.

Long-time readers will know that I believe the use of the phrase "consistent with" in this context is a canard and devoid of substance. Here is an example using exactly such a construction related to the Colorado floods.

The analogy I'd suggest is a 52-card deck stacked with an extra ace. After being dealt a blackjack (i.e., an ace and a face card) it would indeed be appropriate to proclaim that "such a hand is consistent with expectations for hands dealt from this stacked deck." Of course, being dealt a 5 and a 2 would also be "consistent with" the stacked deck. For that matter, being dealt a blackjack would also be "consistent with" hands coming from an unstacked deck.

Motivated by this discussion, I downloaded precipitation records from NOAA for Boulder (here in .dat), which covers a period from May 1, 1897 to August 31, 2013.

In the first half of that time period, covering 58 years or so, Boulder experienced 24 days with measured rainfall of 2 inches or more. In the second half of that period (also 58 years or so) Boulder experienced 20 days with rainfall of 2 inches or more.

How about really extreme, say 4 inches or more?  There is only one data point in the record at that level, July 31, 1919. Now, there is a second.
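For anyone who wants to repeat this kind of count, here is a minimal sketch of the calculation in Python. It assumes the NOAA data have already been tidied into a file with "date" and "precip" (inches) columns; the filename and column names are placeholders, and the .dat file linked above would need its own parsing.

```python
import pandas as pd

# Placeholder filename and columns; the raw NOAA .dat file needs its own parsing.
df = pd.read_csv("boulder_daily_precip.csv", parse_dates=["date"])
df = df.sort_values("date")

# Split the record into two roughly equal halves and count exceedance days in each.
midpoint = len(df) // 2
first_half, second_half = df.iloc[:midpoint], df.iloc[midpoint:]

threshold = 2.0  # inches of rain in one day
print("First half: ", (first_half["precip"] >= threshold).sum())
print("Second half:", (second_half["precip"] >= threshold).sum())
```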

So what is this data "consistent with"? Pretty much anything, and that is the point.

Advice: just don't use "consistent with" when stretching to link particular extremes with human-caused climate change. It is pure spin.

Further, don't try associating a single event with human-caused climate change -- whether it is an extreme snowstorm in Washington, DC (leading an enthusiastic Senator to build an igloo and name it "Al Gore's New Home," below) or rainfall in Colorado. If you want to detect changes in climate you have to look at long-term records. In Boulder at least, the long-term records do not indicate much change at all in the incidence of extreme rainfall exceeding 2 inches in one day.

19 September 2013

How Fantasy Becomes Fact

UPDATE 9/21: Kudos to Bryan Walsh of Time who has updated and corrected his article.  That is the good news. The bad news is that a Google search of "boulder 1000-year flood" gives 3,190,000 results. 

UPDATE 9/20: Aslak Grinsted has a nice post up which runs rainfall and flood return periods for Boulder, showing similar information to that reported in this post.

The flooding in Colorado has wreaked tremendous devastation, by one estimate the costs will total about $2 billion. Some people remain unaccounted for and it will be many months if not longer before the Front Range recovers. If you'd like to contribute to the recovery, please see this page with resources.

As is often the case in the aftermath of extreme events and disasters, people look for some way to put them into a bigger perspective. With respect to floods, a common way of establishing this perspective is through the N-year flood, which is defined as a flood with 1/N probability of occurring in any given year. So the 100-year flood, used in floodplain regulations, is a flood with an expected 1% chance of occurring in any given year.

Earlier this week, I presented some of my objections to the utility and meaningfulness of the concept of the N-year flood. In this post I show how the concept of the N-year flood can be used to turn fantasy into fact.

In an article titled "The Science Behind Colorado's Thousand-Year Flood," Time magazine explains:
Parts of Boulder are experiencing a 1-in-1,000 year flood. That doesn’t literally mean that the kind of rainfall seen over the past week only occurs once in a millennium. Rather, it means that a flood of this magnitude only has a 0.1% chance of happening in a given year.
Time is a fixture of the mainstream media and what is written there is widely read and repeated.

A big problem with Time's article is that Boulder did not actually experience a "1,000-year flood." In fact, according to an analysis presented by fellow CU faculty member John Pitlick yesterday, using standard hydrological methods, Boulder experienced between a 25- and 50-year flood. (I am focusing here on Boulder; I have not seen similar analyses for other Colorado streamflows, though they are sure to come.) Pitlick further noted that the flood waters did not reach the 50-year flood marker on the Gilbert White memorial (seen at the top of this post).

How is it that the "1000-year flood" has come to characterize the flood in Boulder? Let's take a quick look.

The Time article points us to an article on the floods at Climate Central, a non-profit group focused on reporting all things climate change. That article made the following claim:
The Boulder, Colo., area is reeling after being inundated by record rainfall, with more than half a year’s worth of rain falling over the past three days. During those three days, 24-hour rainfall totals of between 8 and 10 inches across much of the Boulder area were enough to qualify this storm as a 1 in 1,000 year event, meaning that it has a 0.1 percent chance of occurring in a given year.
So right away we see an error. Climate Central was discussing rainfall which Time mistakenly converted into floods. They are not the same thing. As John Pitlick explained yesterday, return periods for rainfall and flooding for the same event can vary by several orders of magnitude.

There is a further problem. The Climate Central article points us to NOAA's point precipitation frequency analyses, where the 1000-year storm claim originated.

What readers are not told is that in 2007 NOAA was considering discontinuing its presentation of 500-year and 1000-year precipitation return periods, because of their massive uncertainties. At the time NOAA solicited feedback from the expert community and received 122 responses, which you can see here in PDF (ultimately deciding to keep them, despite their problems).

One commenter to NOAA explained:
I pretty much feel 1000 year estimates are in the realm of fantasy.
Fantasy or not, anyone can look up online flood frequency analyses of questionable value and plug them into a story about climate change with no discussion of uncertainties. Those numbers can then be transformed by a larger and more widely read media outlet from referring to rainfall to referring to a flood. That information can then be shared, tweeted, and repeated widely as a "fact" by the mainstream media, and presto ... the 1000-year flood is created.

17 September 2013

Global Temperature Trends and the IPCC

As the excitement builds over the release of the forthcoming IPCC report (snore), debate is underway on how to interpret previous IPCC predictions for the evolution of global surface temperature trends. The debate has been super-charged by a recent article in The Daily Mail by David Rose, leading the usual suspects to say the usual things. Such debates involve exegeses of generally inscrutable IPCC statements filtered through the imperfect process of media (social and mainstream) reporting, colored by agendas.

In this post I pass on the exegeses and have a look at the actual numbers to address several questions and raise a few of my own.

How have the IPCC's out-of-sample predictions for the evolution of global average surface temperature fared against observations?

I first addressed this question in a running series of blog posts at Prometheus, back in the day. That exercise resulted in a correspondence published in Nature Climate Change in 2008 (here in PDF). Here is a quick update of that analysis.

The graph at the top of this post updates Figure 1a from Pielke (2008) through 2012. I show only the NASA GISS observational dataset (with data from the KNMI Climate Explorer), as it is the "warmest" of the four datasets.

The data shows clearly that the observations are running cooler than the out-of-sample predictions of the IPCC from each of its past 4 reports.

How much cooler?

If we simply compare the observed rate of increase with the projected rates over 1990 to 2012, the answers with respect to each previous IPCC report are:

  • 47% = NASA GISS linear trend slope as percentage of IPCC 1990
  • 91% = NASA GISS linear trend slope as percentage of IPCC 1995
  • 80% = NASA GISS linear trend slope as percentage of IPCC 2001
  • 80% = NASA GISS linear trend slope as percentage of IPCC 2007

With different rates of increase, the absolute difference in observations vs. projections will be a function of the period being looked at. Note that a comparison with the other 3 surface datasets, not shown here but which appear in Pielke (2008), would lead to larger discrepancies.
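For those who want to reproduce this sort of comparison, here is a rough sketch of the arithmetic (this is not the code behind Pielke 2008); the observed series is a synthetic placeholder and the projected rate is a hypothetical value, so both would be replaced with the real GISS anomalies and the actual IPCC projections.

```python
import numpy as np

# Placeholder series: a 0.15 deg C/decade ramp standing in for the GISS annual
# anomalies over 1990-2012 (real data would come from a file or the KNMI
# Climate Explorer).
years = np.arange(1990, 2013)
obs = 0.20 + 0.015 * (years - 1990)

obs_slope = np.polyfit(years, obs, 1)[0] * 10   # observed linear trend, deg C per decade
projected_slope = 0.30                          # hypothetical projected rate, deg C per decade

print(f"Observed trend as % of projection: {100 * obs_slope / projected_slope:.0f}%")
# prints 50% for this placeholder series
```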

What quantitative conclusions does this exercise lead to?

1. The observations of global average surface warming are about half that predicted in the first IPCC report from 1990. Over the past 25 years, projections of rates of future surface temperature increase have clearly come down dramatically.
2. Subsequent IPCC reports reduced their projections, but global average temperature observations are still running lower than that projected in 1995, 2001 and 2007.

Are the lower observed temperatures significantly different than the projections?

Fortunately, there is a just-published peer-reviewed paper in Nature Climate Change which takes up this question, and concludes:
Recent observed global warming is significantly less than that simulated by climate models.
This won't be a surprise to anyone who has followed the ongoing, high-quality discussions of the subject by bloggers, such as Lucia Liljegren.

Does the inconsistency between observations and models have much significance for climate policy?

Not really. The fact that some enthusiasts have over-egged the climate prediction pudding does not take away from the core understandings of climate science, namely that humans influence the climate system, via greenhouse gas emissions and other means, and that such influences carry with them some risks.

Of course, the over-egging has set the stage for the discrepancy to be of political significance, as the credibility of the IPCC and its champions is what is really under siege by its critics. Had the IPCC more faithfully represented uncertainties, and had its public representatives been less strident and less arrogant, then the fact that we cannot actually predict the short-term evolution of the climate system would have been expected rather than treated as some sort of scientific failure on the part of the IPCC.

What should scientists do?

Back when I paid more attention to such things, I offered the following (tongue-in-cheek) suggestion, presented without explanation, to those desperately wanting to prove that models were in fact consistent with the observations.
More seriously, rather than engaging in proxy wars over media reporting and the short-term PR spin associated with it -- which may in fact just make things worse -- it would be in the long-term interests of the climate science community to take a step back and consider the role of their spokespeople (official or otherwise) in aiding and abetting the skeptics, deniers and other nefarious evil-doers.

A difficult question for the climate science community is, how is it that this broad community of researchers -- full of bright and thoughtful people -- allowed intolerant activists who make false claims to certainty to become the public face of the field? 

It is a question with continuing relevance.

15 September 2013

Against the 100-Year Flood

Image above courtesy @The_CUI and @KaiCasey.

Boulder and the Colorado Front Range have experienced devastating floods over the past week. We came out OK, though many friends and neighbors did not. My son's elementary school will be closed for a long while. (Thanks for the many emails of support and concern from friends, colleagues and readers.) If you'd like to help, please see this page with resources.

Boulder has long been recognized as being at risk to major flooding. The local paper, the Daily Camera reported in 2008:
"Boulder is the No. 1 flood-risk community in Colorado," said Cristina Martinez, a city civil engineer. "That's the message we want to get out there."
In 1975 the classic "first assessment" of natural hazards by Gilbert White and colleagues identified Boulder as one of the nation's top major disasters waiting to happen. (White was a long-time University of Colorado geography professor who lived a remarkable life. He died in 2006 aged 94.) The memorial at the top of this post in the Boulder floodplain, demarcating floods of different levels, was raised in White's honor in 2011.

After many decades of relatively frequent flooding in the early parts of the 20th century, Boulder had been on a lucky streak which, until this week, had lasted over forty years:
Serious floods have affected downtown Boulder in 1894, 1896, 1906, 1909, 1916, 1921, 1938, and 1969 with the worst being those of May 31-June 2, 1894 and May 7, 1969. The flood of 1969 was the result of four days of almost continuous rainfall (11.27” measured in Morrison and 9.34” at the Boulder Hydroelectric Plant three miles up Boulder Canyon from town).
This lucky streak led to concerns, such as these expressed in 2008:
Eric Lessard, an engineering project manager with the city's utilities department, said it's hard not to get complacent, because it's been so long since the 1894 flood that inundated the city.

"That's one of the biggest problems we have -- we've been really, really fortunate in Boulder. We haven't had any major floods in many, many years. It starts to give people a false sense of confidence"
Despite the long lucky streak, in recent decades Boulder, and the Colorado Front Range, have devoted considerable resources to flood mitigation efforts. It will be interesting in the months and years to come to assess the effectiveness of those efforts. Many lessons will no doubt be learned about what might have been done better, but I will be surprised if the many years of planning, investment and structural mitigation did not dramatically reduce the possible impacts of the recent floods.

In the aftermath of this week's Boulder flood, some observers are already trying to outdo each other with bigger and bigger claims of the so-called N-year flood. As might be expected, the biggest claims (a 1,000-year event holds the record so far!) are made by those who seek to link the Colorado disaster to human-caused climate change in a simplistic way (those interested in this topic can have a look at the second fallacy covered in the paper below). There has been better reporting too, such as this from NBC.

Below I provide an excerpt from a 1999 paper of mine titled "Nine Fallacies of Floods" (a title suggested by Mickey Glantz) which takes issue with the common usage of the concept of the so-called "100-year flood." The first "fallacy" in that paper is that "flood frequencies are well understood."

Not only is the assumption that flood frequencies are well-understood a fallacy, but the entire notion of the N-year flood is predicated upon a view of stationarity in the statistics of climate that has come into question in the flood research community (which is related to, but also independent of, research on human-caused climate change). See this paper for a discussion of the ongoing debate.

Here is the full citation:
Pielke, Jr. R. A. 1999. Nine fallacies of floods. Climatic Change 42:413-438.
If anything, in the years since I wrote this paper, my doubts about the utility of the "100-year flood" concept have only grown stronger -- it is a great example of an oft-repeated scientific-sounding term that is in many important respects utterly wrong or misleading. In fact, it is not even wrong; perhaps wrongheaded is a better descriptor. However, being both wrong and wrongheaded does qualify the term as useful in the ongoing climate wars. So I guess it is here to stay.

Here is the excerpt:

Fallacy #1: Flood Frequencies are Well Understood

Flood experts use the terms ‘stage’ and ‘discharge’ to refer to the size of a flood (Belt, 1975). A flood stage is the depth of a river at some point and is a function of the amount of water, but also the capacity of a river channel and floodplain and other factors. Hence, upstream and downstream levees and different uses of floodplain land can alter a flood’s stage. A flood discharge refers to the volume of water passing a particular point over a period of time. For example, in 1993 St. Louis experienced ‘the highest stage we’ve ever had, but not the biggest volume’.
We’ve had bigger flows, but the stage was different because the water could flow from bluff to bluff. Now we have communities in the floodplain. Every time you do something on a floodplain, you change the flood relationship. Every time a farmer plants a field or a town puts in a levee, it affects upstream flooding. That’s why you can’t really compare flooding at different times in history (G. R. Dryhouse quoted in Corrigan, 1993).
According to the World Meteorological Organization’s International Glossary of Hydrology, ‘flood frequency’ is defined as ‘the number of times a flood above a given discharge or stage is likely to occur over a given number of years’ (WMO, 1993). In the United States, flood frequencies are central to the operations of the National Flood Insurance Program, which uses the term ‘base flood’ to note ‘that in any given year there is a one percent chance that a flood of that magnitude could be equalled or exceeded’ (FIFMTF, 1992, p. 9-7). The ‘base flood’ is more commonly known as ‘the 100-year flood’ and is ‘probably the most misunderstood floodplain management term’ (FIFMTF, 1992, p. 9-7).

A determination of the probability of inundation for various elevations within a community is based on analysis of peak flows at a point on a particular river or stream. However, ‘there is no procedure or set of procedures that can be adopted which, when rigidly applied to the available data, will accurately define the flood potential of any given watershed’ (USWRC, 1981, p. 1). For many reasons, including limitations on the data record and potential change in climate, ‘risk and uncertainty are inherent in any flood frequency analysis’ (USWRC, 1981, p. 2). Nevertheless, quantification of risk is a fundamental element of flood insurance as well as many aspects of flood-related decision making.

In order to quantify flood risk, in the early 1970s the National Flood Insurance Program adopted the 100-year flood standard (FIFMTF, 1992, p. 8-2). The standard was adopted in order to standardize comparison of areas of risk between communities. Since that time the concept of the N-year flood has become a common fixture in policy, media, and public discussions of floods. Unfortunately, ‘the general public almost universally does not properly understand the meaning of the term’ (FIFMTF, 1992, p. 9-7). Misconceptions about the meaning of the term create obstacles to proper understanding of the flood problem and, consequently, the development of effective responses.

The 100-year standard refers to a flood that has a one percent chance of being exceeded in any given year. It does not refer to a flood that occurs ‘once every 100 years’. In fact, for a home in a 100-year flood zone there is a greater than 26% chance that it will see at least one 100-year flood over a period of 30 years (and, similarly, about a 63% chance over 100 years). The general formula for the cumulative probability C of at least one flood of annual probability P over a period of N years is C = 1 − (1 − P)^N (P is assumed to be constant and events are independent from year to year). By choosing values for P and C one can also solve for the number of years N over which that cumulative probability is reached.

The concept and terminology of the ‘100-year floodplain’ was formally adopted by the federal government as a standard for all public agencies in 1977 under Executive Order 11988. In 1982 FEMA reviewed the policy and found that it was being used in the agencies and, lacking a better alternative, concluded that the policy should be retained (FIFMTF, 1992, p. 8-3). However, despite the FEMA review, use of the concept of the 100-year flood is encumbered by a number of logical and practical difficulties (cf. Lord, 1994).

First, there is general confusion among users of the term about what it means. Some use the term to refer to a flood that occurs every 100 years, as did the Midwestern mayor who stated that ‘after the 1965 flood, they told us this wouldn’t happen again for another 100 years’ (IFMRC, 1994, p. 59). Public confusion is widespread: A farmer suffering through Midwest flooding for the second time in three years complained that ‘Two years ago was supposed to be a 100-year flood, and they’re saying this is a 75-year flood, What kind of sense does that make? You’d think they’d get it right’ (Peterson, 1995).

Second, the ‘100-year flood’ is only one of many possible probabilistic measures of an area’s flood risk. For instance, in the part of the floodplain that is demarcated as the ‘100-year floodplain’ it is only the outer edge of that area that is estimated to have an annual probability of flooding of 0.01, yet confusion exists (Myers, 1994). Areas closer to the river have higher probabilities of flooding, e.g., there are areas of a floodplain with a 2% annual chance of flooding (50-year floodplain), 10% annual chance (10-year floodplain), 50% annual chance (2-year floodplain) etc., and similarly, areas farther from the river have lower probabilities of flooding. The ‘100-year floodplain’ is arbitrarily chosen for regulatory reasons and does not reflect anything fundamentally intrinsic to the floodplain.

Third, the ‘100-year floodplain’ is determined based on past flood records and is thus subject to considerable errors with respect to the probabilities of future floods. According to Burkham (1978) errors in determination of the ‘100-year flood’ may be off by as much as 50% of flood depth. Depending on the slope of the flood plain, this could translate into a significant error in terms of distance from the river channel. A FEMA press release notes that ‘in some cases there is a difference of only inches between the 10- and the 100-year flood levels’ (FEMA, 1996). Further, researchers are beginning to realize an ‘upper limit’ on what can be known about flood frequencies due to the lack of available trend data (Bobée and Rasmussen, 1995).

Fourth, the 100-year floodplain is not a natural feature, but rather is defined by scientists and engineers based on the historical record. Consequently, while the ‘100-year floodplain’ is dynamic and subject to redefinition based on new flood events that add to the historical record, the regulatory definition is much more difficult to change. For instance, following two years of major flooding on the Salt River in Phoenix, Arizona, the previously estimated 100-year flood was reduced to a 50-year flood (FIFMTF, 1992, p. 9-7). What happens to the structures in redefined areas? Any changes in climate patterns, especially precipitation, will also modify the expected probabilities of inundation. For example, some areas of the upper Midwest have documented a trend of increasing precipitation this century (Changnon and Kunkel, 1995; Bhowmik et al., 1994). Furthermore, human changes to the river environment, e.g., levees and land use changes, can also alter the hydraulics of floods. Finally, the extensive use of the term ‘100-year flood’ focuses attention on that aspect of flooding, sometimes to the neglect of the area beyond the 100-year flood plain (Myers, 1994).

What can be done? Given the pervasive use of the concept of the ‘100-year flood’ in flood insurance and regulatory decision-making it seems that adoption of an alternative concept is unlikely. Nevertheless, there are a number of steps that can be taken by those who use the concept when dealing with policy makers and the public. First, we need to be more precise with language. The FIFMTF (1992) recommends the phrase ‘one percent annual chance flood’ as a preferred alternative to ‘100-year flood’, ‘base flood’, or ‘one percent flood’. [NOTE: USGS has since made such a change in terminology.] Another alternative is ‘national base flood standard’ which removes reference to probability (Thomas, 1996, personal communication). Second, when communicating with the public and the media, flood experts could take care to convert annual exceedances into annual probabilities. And third, policy documents could rely less on the ‘100-year flood’ to illustrate examples and propose policies, and at the very least explicitly discuss floods of different magnitudes.
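As a quick illustration of the cumulative-probability formula in Fallacy #1 above, here is a minimal sketch that computes the chance of at least one exceedance over a given horizon, and the horizon needed to reach a given cumulative probability.

```python
import math

def prob_at_least_one(P, N):
    """Cumulative probability of one or more exceedances of an annual-probability-P flood in N years."""
    return 1 - (1 - P) ** N

def years_to_reach(P, C):
    """Number of years over which the cumulative probability first reaches C."""
    return math.ceil(math.log(1 - C) / math.log(1 - P))

print(prob_at_least_one(0.01, 30))   # ~0.26: >26% chance over a 30-year mortgage
print(prob_at_least_one(0.01, 100))  # ~0.63 over 100 years
print(years_to_reach(0.01, 0.5))     # 69 years to reach even odds
```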

09 September 2013

Updated Major Hurricane Drought Figure

2,878 days. The US major hurricane drought continues. This length of time is almost exactly twice as long as the previous longest "drought" since 1915 (which was 1975-1979).
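As a rough check of the arithmetic, here is a small sketch; it assumes the count starts at Hurricane Wilma's US landfall on 24 October 2005, the last major hurricane landfall before this streak.

```python
from datetime import date

# Days between Hurricane Wilma's US landfall and the date of this post.
days = (date(2013, 9, 9) - date(2005, 10, 24)).days
print(days)  # 2877; counting the landfall day itself gives 2,878
```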

Of course, it could end this month or later, but it will end sometime. Whenever it does, it will have been a remarkable stretch.

03 September 2013

1974 Ehrlich and Holdren Senate Testimony

Motivated by my recent reading of The Bet, by Yale University's Paul Sabin, I tracked down the complete 1974 Senate testimony of Paul Ehrlich and John Holdren in hearings on the "Domestic Supply Information Act" held by the Committee on Commerce and Committee on Government Operations (Serial No. 93-107).

The testimony provides an eye-opening look into the depths of Malthusian froth of the time and also provides a great case study for thinking about the role of experts in policy making. I plan on using the testimony in future courses, so I am making it available here as a PDF.

Below are some excerpts and a starter set of discussion questions. The whole testimony is worth reading in full. First, Ehrlich:
Ehrlich: I suspect you're aware, that the increased price of petroleum which is certainly related to the near depletion of petroleum resources-they're going to be gone by the end of the century . . .

I think that what is not realized, and it's going to be one of the hardest things to be accepted by the Americans in general, is that the onset of the age of scarcity essentially demolishes current models of economists. We are going to move to a no-growth [economy]. Now, whether we do it intelligently through the Government by planning as rapidly as possible, or whether we move there automatically-by the way, when I look at some of the figures these days, I think we're moving there much more rapidly than people realize--we're going to get there, obviously. And I think we'd do a lot better if we had some planning for the dislocations that will inevitably occur. . .

If bad weather continues in the Midwest this year, and if the monsoon should fail this year in India, as it might, then I think you're going to see the age of scarcity and many of the changes I'm talking about coming on next winter. I mean that's when we're really going to start getting into it. If we are "fortunate" for a few years, and have nothing but good weather, then it'll come on, you know, 5 or 10 years down the pike. But of course during that time populations will have increased. . . .

I think that the thing you can say with absolute assurance is, considering the magnitude of the changes, if we have 20 years-which I wouldn't put a nickel on-but if we have 20 years, we're already 10 years too late in starting to do something about it. We're not going to change the political and economic structure of the United States overnight. And for that reason, I think that any feeling of urgency that you can generate--one of the big problems is how do you generate a feeling of urgency . . .
Class discussion questions:
  • How did Ehrlich's warnings pan out?
  • Should policy makers have acted on his advice? Why? Why not? (and what would it have meant to "act"?)
  • What makes experts today more believable to policy makers than Ehrlich was then (both contemporaneously and with the advantage of hindsight)?
  • Should experts at the time have helped campaign to increase a sense of "urgency"?
Now some excerpts from Holdren's testimony:
Holdren: The main point here is that, although there may be defects in any specific detailed model, the general conclusion is far more robust than any specific model. At the same time, one has to make a certain disclaimer, and that is that neither analysis nor computer models are adequate to the task of predicting exactly what disaster will follow from a continuation of present trends and exactly when such a disaster will take place.

Now, this problem puts those of us who tend to view with alarm in a somewhat curious position. We're calling upon society to make major changes, but we cannot prove exactly what will happen and exactly when, in the absence of those kinds of changes. This particular point is often used against us by people who are optimistic and believe that one way or another, technology will let us muddle through. I think a useful way to think about this particular dilemma is in terms of the burden of proof; that is, we should ask: Are we worse off if we believe the pessimists and they are wrong, or are we worse off if we believe the optimists and they are wrong?

I think the conclusion is clear. . .

In addition to the reliance on technological panaceas per se, there is an enormous reliance on the part of optimists of various kinds on the price mechanism, on economic forces, to somehow bail us out of the kinds of difficulties that we're in for. This too has been one of the major criticisms of "Limits to Growth," that somehow they didn't adequately incorporate what the price mechanism would do for us in extricating us from this morass of problems.

In this context I think its very important to understand what economics is. Economics is the study of how to allocate resources that are fundamentally scarce in the most efficient way. It doesn't always even do that. But ideally, the idea behind economics is to allocate scarce resources. It does not make scarce resources less scarce. . .

This tendency is perhaps the most dangerous one we face, that somehow people want to wait until the evidence is absolutely overwhelming, that we're in for a catastrophe, before they take action. What worries me is that by the time the evidence is absolutely overwhelming, a good deal of the damage may in fact be irreversible. It's the same tendency toward oversimplification which leads people to think that one set of technological solutions will bail us out. As much as we need technology, we need a good many other things. And as you've already suggested this morning, one of them is social and institutional changes . . .
Class discussion questions:
  • How do you evaluate Holdren's view of technology?
  • How do you view Holdren's view of economics?
  • How much evidence should policy makers have before committing to a particular course of action?
  • In what ways are experts who call for social and institutional changes in society different than Holdren/Ehrlich in 1974? 

Overall, what advice should experts take from these cases for thinking about how their testimony in 2013 might be viewed from the perspective of 2053?