A winning proposal for the Innovative Research Program, 2009:
How much should we trust decadal climate projections at regional scales?
Investigators: David Noone (CIRES/CU-ATOC), Gilbert Compo (CIRES/CDC-NOAA/PSD),
Matthew Newman (CIRES/CDC-NOAA/PSD)
Objectives
We propose to use two new CIRES-developed datasets to assess the ability to predict decadal climate
variability at regional scales by evaluating the temporal characteristics of organized patterns of variability.
We aim to establish to what degree dominant patterns of variability are subject to change over the 20th
century, and use this information to gauge the reliability with which they can be predicted in the 21st
century. We propose an analysis based on an algorithm developed at CIRES-NOAA (e.g., Newman
2007), and applied to both an ice core database (Schneider and Noone 2007) and a recently completed
atmospheric reanalysis (Compo et al. 2009). This work strengthens collaboration between CIRES
scientists at NOAA and CU by focusing complementary research interests on a common scientific
outcome.
Background and importance
With the emergence of consensus on global mean temperature change, both the scientific community and
non-scientific stakeholders have turned their attention to decadal prediction of climate. While some
aspects of climate change are clearly predictable, such as warming associated with increasing greenhouse
gases, the ability to predict more detailed aspects remains unclear. In particular, regional changes in
atmospheric circulation that often control local precipitation patterns and year-to-year temperature
variations appear less predictable. Yet predictions of regional climate change, and knowledge of the
confidence one can place on those predictions, are needed for effective decision making and for developing
adaptation strategies in spite of uncertainty and risk.
The question of predictability emerged in weather forecasting in the 1960s when Ed Lorenz pointed out
fundamental issues linked to error growth in non-linear systems. Beyond about 10 days, the weather is
unpredictable! A similar result emerged for seasonal prediction of ENSO, with the limits on the order of
months. The proposed work widens the scope further and asks: to what degree is the regional climate
associated with organized patterns of variability unpredictable? While the patterns of variability have both
forced and unforced contributions, we suspect that there are aspects of the unforced behavior that limit the
ability to predict forced response. To this end one might ask, what is the time-scale within which
regional-scale climate projections that depend on knowing how these patterns change can be treated as
reliable by decision makers?
To date, measuring the ability to predict regional climate has been limited by the lack of sufficiently long
and comprehensive observations to characterize decadal scale variations in atmospheric circulation
anomalies. Attempts have been made to assess the quality of decadal projections by noting where
ensembles of climate model projections agree and taking model consensus as a measure of predictability.
This approach lacks the satisfying rigor of an observationally based estimate. As such there is a need to
use both better decadal-scale datasets and more appropriate statistical methods.
Research Plan
While atmospheric reanalysis datasets exist and capture atmospheric variability in recent decades, their
short records make them of limited use for understanding decadal variability. The recently
completed historical reanalysis by Compo and others in CIRES-CDC/NOAA/PSD is poised to facilitate
analysis of climate variability on decadal scales. This dataset spans the period from 1891 to 2003, and
extends the NCEP/NCAR reanalysis, which covers 1948 onwards. Questions remain, however,
about the reliability of this data early in the 20th century, especially in oceanic regions and the Southern
Hemisphere, where there are few observations prior to the mid-20th century. As such, there is a need to
validate both the mean circulation patterns and the ability of this new dataset to accurately capture the
dynamics of the organized patterns of variability that we suspect are so important for regional climate.
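One way such a validation could proceed is sketched below: extract the leading pattern of variability from each dataset with an EOF (SVD) decomposition and measure their agreement with a pattern congruence coefficient. This is an illustrative sketch only; the arrays stand in for the real reanalysis and observational fields, which are not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(1)

def leading_eof(anoms):
    """Leading spatial EOF of a (time, space) anomaly array via SVD."""
    _, _, vt = np.linalg.svd(anoms - anoms.mean(axis=0), full_matrices=False)
    return vt[0]

# Toy stand-ins for two datasets (e.g., reanalysis pressure anomalies and
# an independent observational estimate) sharing one common mode.
n_time, n_space = 300, 50
common_mode = rng.normal(size=n_space)
amplitude = rng.normal(size=(n_time, 1))
reanalysis = amplitude * common_mode + 0.3 * rng.normal(size=(n_time, n_space))
observations = amplitude * common_mode + 0.3 * rng.normal(size=(n_time, n_space))

e1, e2 = leading_eof(reanalysis), leading_eof(observations)

# Congruence coefficient: values near 1 mean the same spatial pattern
# (the sign of an EOF is arbitrary, hence the absolute value).
congruence = abs(e1 @ e2) / (np.linalg.norm(e1) * np.linalg.norm(e2))
print(f"pattern congruence: {congruence:.2f}")
```

In practice the comparison would be made at the ice core sites only, and over overlapping time windows, but the congruence diagnostic is the same.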
The dataset assembled by Schneider and Noone (2007) is composed of annually dated ice core records
spanning roughly 1870 to 1990; it captures aspects of the known circulation patterns and offers a purely
observational perspective. We suspect that these data not only capture signatures of the patterns of
variability, but also reflect the same fundamental dynamics as the reanalysis of Compo et al. (2009).
Newman (2007) used an eigenfunction analysis of global sea surface temperature data to evaluate the
dynamical behavior captured by the observational data. He showed that there are both oscillatory and
decaying modes, and suggested that these are linked to some of the modes known to operate in the ocean
(specifically ENSO and the Pacific Decadal Oscillation). We propose a similar approach, to search for the
eigenmodes of the atmospheric circulation that are captured in both the reanalysis and ice core network.
The temporal statistics of the eigenmodes, and specifically the decay rates (implied by the negative real
parts of the eigenvalues), speak to the time scale over which the patterns can serve as a basis for reliable prediction.
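The eigenmode calculation can be sketched as follows. This is a minimal illustration of the linear-inverse-modeling idea underlying Newman (2007), not the project's actual code: a linear operator L is estimated from lagged covariances of a multivariate time series, and decay time scales are read off the real parts of its eigenvalues. The synthetic data and the chosen operator are placeholders.

```python
import numpy as np

rng = np.random.default_rng(0)
n_time, n_modes, lag = 2000, 4, 1

# Toy damped-linear dynamics (placeholder for the real climate operator):
# a damped oscillatory pair plus two purely decaying modes.
true_L = np.array([[-0.10,  0.05,  0.0,   0.0],
                   [-0.05, -0.10,  0.0,   0.0],
                   [ 0.0,   0.0,  -0.30,  0.0],
                   [ 0.0,   0.0,   0.0,  -0.60]])

# Synthetic time series (stand-in for, e.g., leading principal components).
x = np.zeros((n_time, n_modes))
for t in range(1, n_time):
    x[t] = x[t - 1] + true_L @ x[t - 1] + rng.normal(scale=0.1, size=n_modes)

# Lag-covariance estimates at lag 0 and at the chosen lag.
x0, xtau = x[:-lag], x[lag:]
C0 = x0.T @ x0 / len(x0)
Ctau = xtau.T @ x0 / len(x0)

# Green's function G(tau) = C_tau C0^{-1}; its eigenvalue logarithms
# give the eigenvalues of the estimated linear operator L.
G = Ctau @ np.linalg.inv(C0)
g_vals, _ = np.linalg.eig(G)
vals = np.log(g_vals.astype(complex)) / lag

# Decay (e-folding) times come from the negative real parts of the
# eigenvalues; imaginary parts indicate oscillation periods.
decay_times = -1.0 / vals.real
print("eigenvalues:", np.round(vals, 3))
print("e-folding times (time steps):", np.round(decay_times, 1))
```

The longest e-folding time among the recovered modes is the natural candidate for the limit beyond which a pattern-based prediction loses skill.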
What makes this innovative
The use of eigenmode analysis of an empirically derived linearization of climate system dynamics builds
on a line of work pioneered at CIRES-CDC/NOAA/PSD. This approach has begun to show
utility in seasonal prediction but has not yet been applied to the task of decadal climate prediction.
The research proposed offers a critical test of the ability of such an approach to be used for decadal
prediction. If successful, the findings will have immediate relevance for understanding predicted
regional climate change from 2010 to 2050, and will offer insight into the reliability of regional climate
projection. The ability of the ice core network to capture a significant part of the true variability in
atmospheric circulation allows the task of direct assimilation of paleo-proxy data to be considered. This
moves beyond more traditional uses of proxy climate data that focus on simple statistics, which are
known to be limited because they lack dynamical and thermodynamic constraints. While there has
been discussion of the need to perform an assimilation of proxy data, there has been no significant
demonstration of the capability. This study provides this critical test of the utility of proxies by
establishing that the dynamics of the circulation are captured reliably by the proxy records.
Expected outcome and impact
The recently completed historical reanalysis by Compo and others in NOAA PSD is set to facilitate
analysis of climate variability on decadal scales. While some data from the Southern Ocean were included in
the assimilation and offer a valuable constraint, the lack of comprehensive observational constraints raises
questions as to the dataset's reliability. The proposed study offers an independent check on not only variations in the mean state
but the degree to which the new dataset reliably captures the variations that are linked to organized
variability. We expect to determine the regions where the gridded data are most reliable, thereby
building community confidence in the dataset's use.
References
Gregory, S., and D. Noone, 2008: Variability in the teleconnection between the El Niño-Southern
Oscillation and West Antarctic climate. Journal of Geophysical Research, 113, D17110,
doi:10.1029/2007JD009107.
Schneider, D. P., and D. C. Noone, 2007: Spatial covariance of water isotopes in ice cores during 20th
Century climate change. Journal of Geophysical Research, 112, D18105,
doi:10.1029/2007JD008652.
Compo, G. P., et al., 2009: The Twentieth Century Reanalysis Project. Bull. Amer. Met. Soc., in
preparation.
Newman, M., 2007: Interannual to decadal predictability of tropical and North Pacific sea surface
temperatures. J. Climate, 20, 2333-2356.