The floods that struck Colorado earlier this month are a reminder of the growing discussion of extreme weather events. That discussion has been going on in the media for some time, starting perhaps with the European heat wave of 2003 and continuing to this day: the Russian heat wave of 2010, the floods in Pakistan in 2012, and Sandy, the enormous storm that hit the Atlantic coast of the U.S. in 2012.
We hear in the media that the rainstorms that hit Colorado this year are the worst since 1893. Are such extreme weather events becoming more common? If so, to what extent can they be attributed to climate change?
As Richard Smith, Director of the Statistical and Applied Mathematical Sciences Institute (SAMSI), pointed out in a recent talk, these are not simple questions to answer, but statistical analysis is making progress on ways to address them.
While Smith points out that there is empirical evidence that extreme events are becoming more frequent, there is no universal agreement that this is due to climate change or anthropogenic contributions. And quantifying how frequent extreme events may become in the future, regardless of their causes, remains an active area of research.
Smith offers some possible approaches to answering these questions, combining extreme value theory with hierarchical models. Details on these techniques may be found in his talk.
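To give a flavor of what extreme value theory looks like in practice, here is a minimal sketch of a standard first step: fitting a Generalized Extreme Value (GEV) distribution to a series of annual maxima and reading off a return level. The data below are simulated for illustration; they are not from Smith's talk, and this simple stationary fit is far less sophisticated than the hierarchical models he describes.

```python
# A minimal extreme-value-theory sketch: fit a GEV distribution to
# synthetic annual-maximum rainfall and estimate return levels.
# The data are simulated; this is an illustration, not Smith's method.
import numpy as np
from scipy.stats import genextreme

rng = np.random.default_rng(0)

# Simulate 60 years of annual maximum daily rainfall (mm): each year is
# the maximum of 365 draws from a right-skewed gamma distribution.
annual_max = np.array(
    [rng.gamma(shape=2.0, scale=10.0, size=365).max() for _ in range(60)]
)

# Fit the three GEV parameters (shape c, location, scale) by maximum likelihood.
c, loc, scale = genextreme.fit(annual_max)

# The T-year return level is the value exceeded with probability 1/T in any
# year, i.e. the (1 - 1/T) quantile of the fitted annual-maximum distribution.
level_10 = genextreme.isf(1 / 10, c, loc, scale)
level_100 = genextreme.isf(1 / 100, c, loc, scale)

print(f"10-year return level:  {level_10:.1f} mm")
print(f"100-year return level: {level_100:.1f} mm")
```

A more realistic analysis of whether extremes are becoming more frequent would let the GEV parameters vary with time or with climate covariates, which is where the hierarchical modeling Smith discusses comes in.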
This talk was one of several in a minisymposium on Inference in Climate Studies. Audio recordings, synchronized with the slides, are available for listening/viewing.
Together these talks show some of the research in the mathematical and statistical sciences on climate studies. Some of this work is also part of a larger field of research, known to mathematicians as uncertainty quantification, whose goal is to better quantify errors in large-scale computational models.