Search Results
You are looking at 1-10 of 48 items for Author or Editor: Hans von Storch
Abstract
The technique of “inflating” in downscaling, which makes the downscaled climate variable have the right variance, is based on the assumption that all local variability can be traced back to large-scale variability. In practical situations this assumption is not valid, and inflation is an inappropriate technique. Instead, additive, randomized approaches should be adopted.
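As a rough illustration of the distinction, here is a minimal numpy sketch, with all data and coefficients invented, contrasting the two ways of restoring the local variance in a regression-based downscaling: inflation rescales the regression estimate, while the additive, randomized approach adds independent noise carrying the unexplained variance.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative setup: local variable y is only partly explained by the
# large-scale predictor x (correlation < 1), as in real downscaling.
n = 5000
x = rng.standard_normal(n)                        # large-scale predictor
y = 0.6 * x + 0.8 * rng.standard_normal(n)        # local variable, var(y) ~ 1

# Regression estimate: correct mean, but too little variance.
beta = np.cov(x, y)[0, 1] / np.var(x, ddof=1)
y_reg = beta * x

# "Inflation": rescale so the variance matches -- implicitly assumes all
# local variability is of large-scale origin.
y_infl = y_reg * np.std(y) / np.std(y_reg)

# Randomization: add independent noise carrying the unexplained variance.
noise_var = np.var(y) - np.var(y_reg)
y_rand = y_reg + rng.normal(0.0, np.sqrt(noise_var), n)

for name, z in [("inflated", y_infl), ("randomized", y_rand)]:
    print(f"{name}: var = {np.var(z):.2f}, corr with y = {np.corrcoef(z, y)[0, 1]:.2f}")
```

Both series recover the observed variance, but the inflated one remains a deterministic function of the large-scale predictor, which is precisely the assumption criticized above.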
Abstract
The derivation of local-scale information from integrations of coarse-resolution general circulation models (GCMs) with the help of statistical models fitted to present observations is generally referred to as statistical downscaling. In this paper a relatively simple analog method is described and applied for downscaling purposes. According to this method, the large-scale circulation simulated by a GCM is associated with the local variables observed simultaneously with the most similar large-scale circulation pattern in a pool of historical observations. The similarity of the large-scale circulation patterns is defined in terms of their coordinates in the space spanned by the leading observed empirical orthogonal functions.
The method can be checked by replicating the evolution of the local variables in an independent period. Its performance for monthly and daily winter rainfall in the Iberian Peninsula is compared to more complicated techniques, each belonging to one of the broad families of existing statistical downscaling techniques: a method based on canonical correlation analysis, as representative of linear methods; a method based on classification and regression trees, as representative of a weather generator based on classification methods; and a neural network, as an example of deterministic nonlinear methods.
It is found in these applications that the analog method performs in general as well as the more complicated methods, and it can be applied to both normally and nonnormally distributed local variables. Furthermore, it produces the right level of variability of the local variable and preserves the spatial covariance between local variables. On the other hand, linear multivariate methods offer a clearer physical interpretation, which more strongly supports their validity in an altered climate. Classification and neural network methods are generally more complicated and do not directly offer a physical interpretation.
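The core of the analog method, as described in the first paragraph, can be sketched in a few lines of numpy; the dimensions, the pool of historical observations, and the GCM field below are all invented stand-ins.

```python
import numpy as np

rng = np.random.default_rng(1)

# Invented dimensions: a historical pool of n_hist large-scale circulation
# maps (flattened to n_grid points), each paired with a local observation.
n_hist, n_grid, n_eof = 1000, 400, 5
pool_maps = rng.standard_normal((n_hist, n_grid))
pool_local = rng.standard_normal(n_hist)          # e.g. station rainfall

# Leading EOFs of the observed pool (via SVD of the anomaly matrix).
mean = pool_maps.mean(axis=0)
_, _, vt = np.linalg.svd(pool_maps - mean, full_matrices=False)
eofs = vt[:n_eof]                                 # (n_eof, n_grid)
pool_pcs = (pool_maps - mean) @ eofs.T            # pool coordinates in EOF space

def analog_downscale(gcm_map):
    """Return the local observation paired with the most similar
    historical circulation, similarity measured in EOF space."""
    pcs = (gcm_map - mean) @ eofs.T
    i = np.argmin(np.sum((pool_pcs - pcs) ** 2, axis=1))
    return pool_local[i]

# Apply to a (here synthetic) GCM-simulated circulation map.
print(analog_downscale(rng.standard_normal(n_grid)))
```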
Abstract
This study explores the possibility of reconstructing the weather of Southeast Asia for the past few decades using an atmospheric regional climate model, the Climate version of the Lokal-Modell (CLM). For this purpose, global National Centers for Environmental Prediction–National Center for Atmospheric Research (NCEP–NCAR) reanalysis data were dynamically downscaled to 50-km and, in a double-nesting approach, to 18-km grid distance. To prevent the regional model from deviating significantly from the reanalyses with respect to large-scale circulation and large-scale weather phenomena, a spectral nudging technique was used.
The performance of this technique in dealing with Southeast Asian typhoons is examined here by considering an ensemble of simulations of one typhoon case. This analysis is new insofar as it deals with simulations done in climate mode (so that any skill in reproducing the typhoon is not related to details of the initial conditions), is done in ensemble mode (the same development is described by several simulations), and is done with a spectral nudging constraint (so that the observed large-scale state is enforced in the model domain). This case indicates that tropical storms that are coarsely described by the reanalyses are correctly identified and tracked; considerably deeper core pressures and higher wind speeds are simulated compared to the driving reanalyses. When the regional atmospheric model is run without spectral nudging, significant intraensemble variability occurs, and additional, nonobserved typhoons form. Thus, the insufficiency of lateral boundary conditions alone for determining the details of the dynamic developments in the interior becomes very clear: the same lateral boundary conditions are consistent with different developments in the interior. Several sensitivity experiments were performed concerning varied grid distances, different initial starting dates of the simulations, and changed spectral nudging parameters.
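The idea behind spectral nudging can be conveyed by a one-dimensional toy: only the low-wavenumber (large scale) components of the regional state are relaxed toward the driving reanalysis, while the small scales evolve freely. The sketch below assumes a periodic domain and illustrative parameters; the actual CLM scheme is two-dimensional and more elaborate.

```python
import numpy as np

def spectral_nudge(state, driving, n_keep=3, alpha=0.1):
    """One nudging step on a 1-D periodic field: relax only the
    wavenumbers <= n_keep toward the driving (reanalysis) field,
    leaving the regional model's small scales untouched."""
    f_state = np.fft.rfft(state)
    f_drive = np.fft.rfft(driving)
    k = np.arange(f_state.size)
    large = k <= n_keep
    f_state[large] += alpha * (f_drive[large] - f_state[large])
    return np.fft.irfft(f_state, n=state.size)

# Example: the regional state has drifted at large scales and carries its
# own small scales; nudging pulls only the large scales back.
x = np.linspace(0, 2 * np.pi, 128, endpoint=False)
driving = np.sin(x)                               # large-scale "reanalysis"
state = 0.5 * np.sin(x) + 0.3 * np.sin(20 * x)    # drifted + small scales
nudged = spectral_nudge(state, driving)
print(np.round(np.abs(np.fft.rfft(nudged))[[1, 20]], 2))  # k=1 moved, k=20 kept
```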
Abstract
The question of whether the large-scale low-frequency sea surface temperature (SST) variability in the North Pacific can be interpreted as a response to large-scale wind anomalies is studied with an ocean general circulation model coupled to an advective model for the air temperature. Forced with observed monthly mean winds, the model is successful in reproducing the main space and time characteristics of the large-scale low-frequency SST variability. In winter, the simulated and observed SSTs are also highly correlated.
The dominant process in producing wintertime SST tendencies is the anomalous turbulent heat exchange with the atmosphere, which is parameterized by the bulk aerodynamic formula and takes into account the simulated air temperature, the simulated SST, and the observed winds. The oceanic response to turbulent momentum fluxes is much smaller. The horizontal scale of the simulated air temperature is induced by advective transports with the observed winds and transferred to the ocean by anomalous turbulent latent and sensible heat fluxes. The ocean response lags the atmospheric forcing by about one month and persists over a much longer time than the atmospheric anomalies, particularly in winter.
Part of the observed low-frequency SST variance can be explained by teleconnections. A wind field that is directly related to the tropical El Niño–Southern Oscillation (ENSO) phenomenon produces SST anomalies with an ENSO-related variance of more than 50%, instead of the 10% to 30% observed.
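The bulk aerodynamic formula referenced above has a standard textbook form; the sketch below uses typical, assumed constants rather than the paper's values.

```python
# Bulk aerodynamic formula for the turbulent sensible heat flux.
# Constants are typical textbook values, not those of the paper.
RHO_AIR = 1.25   # air density, kg m-3
CP_AIR = 1004.0  # specific heat of air, J kg-1 K-1
C_H = 1.3e-3     # bulk transfer coefficient (assumed)

def sensible_heat_flux(wind_speed, t_air, sst):
    """Upward-positive flux (W m-2) from ocean to atmosphere:
    Q_H = rho * c_p * C_H * |U| * (SST - T_air)."""
    return RHO_AIR * CP_AIR * C_H * wind_speed * (sst - t_air)

# Example: a 10 m s-1 wind over water 2 K warmer than the air.
print(f"{sensible_heat_flux(10.0, 278.0, 280.0):.0f} W m-2")
```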
Abstract
One objective of general circulation models is to simulate, e.g., a “January” that is not distinguishable from observed Januaries. A strategy to verify an individual simulated state is proposed. Its main elements are data compression by means of EOFs, performance of a multivariate parametric test, and a subsequent univariate analysis.
The suggested technique is applied to four January simulations performed with the Hamburg University GCM. The meteorological parameters treated are the zonally averaged January mean of the geopotential itself and of the intensity of transient and stationary eddies of geopotential height at 300, 500, and 850 mb. The comparison is based on daily observations from 15 Januaries (1967–81).
It turns out that the midlatitudinal meridional gradient of geopotential height is significantly overestimated at all levels. The intensity of the transient eddies is significantly overestimated at 850 mb at practically all latitudes and at 300 and 500 mb at midlatitudes.
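The proposed strategy (EOF compression followed by a multivariate parametric test) can be sketched as follows; the Hotelling-type test statistic and all data are assumptions for illustration and may differ from the paper's exact formulation.

```python
import numpy as np
from scipy.stats import f as f_dist

rng = np.random.default_rng(2)

# Invented stand-ins: 15 observed January mean states (as in the paper's
# 1967-81 sample) on a grid, plus one simulated state to be verified.
n_obs, n_grid, n_eof = 15, 200, 3
observed = rng.standard_normal((n_obs, n_grid))
simulated = rng.standard_normal(n_grid)

# Step 1: data compression -- project everything onto the leading EOFs
# of the observed sample.
mean = observed.mean(axis=0)
_, _, vt = np.linalg.svd(observed - mean, full_matrices=False)
eofs = vt[:n_eof]
obs_pcs = (observed - mean) @ eofs.T
sim_pcs = (simulated - mean) @ eofs.T

# Step 2: multivariate parametric test -- a generic Hotelling-type test of
# whether the simulated state is consistent with the observed sample.
S = np.cov(obs_pcs, rowvar=False)
d2 = sim_pcs @ np.linalg.solve(S, sim_pcs)
t2 = d2 * n_obs / (n_obs + 1)
f_stat = t2 * (n_obs - n_eof) / (n_eof * (n_obs - 1))
p_value = f_dist.sf(f_stat, n_eof, n_obs - n_eof)
print(f"p = {p_value:.3f}")
```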
Abstract
A stochastic specification for monthly mean wintertime eddy heat transport conditional upon the monthly mean circulation is proposed. The approach is based on an analog technique. The nearest neighbor for the monthly mean streamfunction (at 850 and 300 hPa) is searched for in a library composed of monthly data from a 1268-yr control simulation with a coupled ocean–atmosphere model. To reduce the degrees of freedom, a limited area (the North Atlantic sector) is used for the analog specification. The monthly means of the northward transient eddy flux of temperature (at 750 hPa) are simulated as a function of these analogs.
The stochastic model is applied to 300 years of a paleosimulation (last interglacial maximum, around 125 kyr BP). The level of variability of the eddy heat flux is reproduced by the analog estimator, as is the link between monthly mean circulation and synoptic-scale variability. The changed boundary conditions (solar radiation and CO2 level) cause the Eemian variability to be significantly reduced compared to the control simulation. Although analogs are not very good predictors of heat fluxes for individual months, they turn out to be excellent predictors of the distribution (or at least the variance) of heat fluxes in an anomalous climate.
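The specification step amounts to a nearest-neighbor lookup in the control-run library; a sketch with invented stand-ins follows (a KD-tree is one convenient, assumed implementation choice).

```python
import numpy as np
from scipy.spatial import cKDTree

rng = np.random.default_rng(3)

# Invented stand-ins for the control-run library: monthly mean streamfunction
# states over a limited (North Atlantic) sector, reduced to a few coordinates,
# each paired with the simultaneous eddy heat flux.
n_library, n_coords = 3800, 6          # roughly 1268 winters of monthly data
lib_circulation = rng.standard_normal((n_library, n_coords))
lib_heat_flux = rng.standard_normal(n_library)

tree = cKDTree(lib_circulation)        # fast nearest-neighbor lookup

def specify_heat_flux(circulation):
    """Analog specification: assign the heat flux observed with the
    most similar library circulation state."""
    _, i = tree.query(circulation)
    return lib_heat_flux[i]

# Applied month by month to a new (e.g. paleo) simulation; as the abstract
# notes, the estimator is good for distributions, not individual months.
paleo_months = rng.standard_normal((300 * 3, n_coords))
fluxes = np.array([specify_heat_flux(m) for m in paleo_months])
print(f"specified flux variance: {fluxes.var():.2f}")
```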
Abstract
The class of “regime-dependent autoregressive” time series models (RAMs) is introduced. These nonlinear models describe variations of the moments of nonstationary time series by allowing parameter values to change with the state of an ancillary controlling time series and possibly an index series. The index series is used to indicate deterministic seasonal and regimal changes with time. Fitting and diagnostic procedures are described in the paper.
RAMs are fitted to a 102-year seasonal mean tropical Pacific sea surface temperature index time series. The models are controlled by a seasonal index series and one of two ancillary time series: seasonal mean Adelaide sea level pressure and Indian monsoon rainfall, which have previously been identified as possible precursors of the extremes of the Southern Oscillation (SO).
Analysis of the fitted models gives clear evidence for the seasonal variation of the statistical characteristics of the SO. There is strong evidence that the annual cycle of the SO index depends upon the state of the SO as represented by the ancillary time series. There is weaker evidence suggesting that its autocorrelation structure is also state dependent.
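A toy instance of a regime-dependent AR model, with an invented two-regime, seasonal-mean form (the RAM class in the paper is more general):

```python
import numpy as np

rng = np.random.default_rng(4)

def simulate_ram(n, ancillary, season, params):
    """Simulate a toy regime-dependent AR(1): the AR coefficient and
    noise level switch with the regime of an ancillary series, and a
    seasonal index shifts the mean. Illustrative form only."""
    y = np.zeros(n)
    for t in range(1, n):
        regime = int(ancillary[t] > 0)           # two regimes: low / high
        phi, sigma = params[regime]
        seasonal_mean = 0.5 * np.sin(2 * np.pi * season[t] / 4)
        y[t] = (seasonal_mean + phi * (y[t - 1] - seasonal_mean)
                + sigma * rng.standard_normal())
    return y

n = 408                                          # 102 years x 4 seasons
season = np.arange(n) % 4
ancillary = rng.standard_normal(n)               # e.g. Adelaide SLP anomaly
params = {0: (0.3, 1.0), 1: (0.8, 0.6)}          # (phi, sigma) per regime
sst_index = simulate_ram(n, ancillary, season, params)
print(sst_index[:4].round(2))
```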
This paper addresses the views regarding the certainty and uncertainty of climate science knowledge held by contemporary climate scientists. More precisely, it addresses the extension of this knowledge into the social and political realms, as per the definition of postnormal science. The data for the analysis are drawn from a survey questionnaire mailed to 1000 scientists in Germany, the United States, and Canada (response rate of approximately 40%) and from a series of in-depth interviews with leading scientists in each country. The international nature of the sample allows for cross-cultural comparisons.
With respect to the scientific discourse, similar assessments of the current state of knowledge are held by the respondents of each country. Almost all scientists agreed that the skill of contemporary models is limited, though minor differences were notable: scientists from the United States were less convinced of the skill of the models than their German counterparts and, as would be expected under such circumstances, North American scientists perceived the need for societal and political responses to be less urgent. An international consensus was, however, apparent regarding the utility of the knowledge to date: climate science has provided enough knowledge that the initiation of abatement measures is warranted. Consensus also existed regarding the current inability to explicitly specify the detrimental effects that might result from climate change. This incompatibility between the state of knowledge and the calls for action suggests that, to some degree at least, scientific advice is a product of both scientific knowledge and normative judgment, suggesting a socioscientific construction of the climate change issue.
Abstract
Past variations of water levels at Cuxhaven, Germany (German Bight), are examined, and a scenario for future changes due to expected global warming is derived.
The observational record of Cuxhaven water levels features a linear upward trend in the annual mean water level of about 30 cm (100 yr)−1, overlaid by irregular variations due to synoptic disturbances. These irregular storm-related variations are shown to have remained mostly stationary from the beginning of observations until today.
A scenario for future conditions is derived by means of a two-step downscaling approach. First, a “time slice experiment” is used to obtain a regionally disaggregated scenario for the time mean circulation for the time of expected doubling of atmospheric CO2 concentrations. Then, an empirical downscaling model is derived, which relates intramonthly percentiles of storm-related water-level variations at Cuxhaven to variations in the monthly mean air pressure field over Europe and the northern North Atlantic.
Past variations of storm-related intramonthly percentiles are well reproduced by the downscaling model, so that the statistical model may be credited with skill. The combined time slice–statistical model “predicts,” for the expected time of doubled atmospheric CO2 concentrations in the decade around 2035, an insignificant rise of the 50%, 80%, and 90% percentiles of storm-related water-level variations at Cuxhaven of less than 10 cm, which is well within the range of natural interdecadal variability. These numbers have to be added to the rise in mean sea level due to thermal expansion and other slow processes.
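The second, empirical step can be sketched as a linear regression of the intramonthly percentiles on the leading principal components of the monthly mean pressure field; all data and dimensions below are invented stand-ins.

```python
import numpy as np

rng = np.random.default_rng(5)

# Invented stand-ins: monthly mean SLP maps over Europe/North Atlantic and,
# for each month, intramonthly percentiles of storm-related water level.
n_months, n_grid, n_pcs = 600, 300, 4
slp = rng.standard_normal((n_months, n_grid))
percentiles = rng.standard_normal((n_months, 3))   # 50%, 80%, 90% levels

# Compress the pressure field to its leading principal components ...
mean = slp.mean(axis=0)
_, _, vt = np.linalg.svd(slp - mean, full_matrices=False)
pcs = (slp - mean) @ vt[:n_pcs].T

# ... and fit a linear downscaling model: percentiles ~ PCs @ B.
B, *_ = np.linalg.lstsq(pcs, percentiles, rcond=None)

def downscale(slp_map):
    """Estimate the intramonthly water-level percentiles from one
    monthly mean SLP map (e.g. from a time-slice experiment)."""
    return ((slp_map - mean) @ vt[:n_pcs].T) @ B

print(downscale(rng.standard_normal(n_grid)).round(2))
```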
Abstract
Air pressure readings and their variations are commonly used to make inferences about storm activity. More precisely, it is assumed that the variation of annual and seasonal statistics of several pressure-based proxies describes changes in the past storm climate qualitatively, an assumption that has yet to be proven.
A systematic evaluation of the informational content of five pressure-based proxies for storm activity, based on single-station observations of air pressure, is presented. The number of deep lows, lower percentiles of pressure, the frequency of absolute pressure tendencies above certain thresholds, and mean values and high percentiles of absolute pressure tendencies are examined. Such an evaluation needs long and homogeneous records of wind speed, which are not available from observations. Consequently, the proxies are examined using datasets of ground-level wind speeds and air pressure from the NCEP-driven and spectrally nudged regional model REMO. The proxies are gauged against the 95th and 99th percentile time series of ground-level wind speeds to quantify the relation between pressure-based proxies and storminess. These analyses rely on bootstrap and binomial hypothesis testing. The analyses of single-station-based proxies indicate that the proxies are generally linearly linked to storm activity and that absolute pressure tendencies have the highest informational content. It is further investigated whether the proxies have the potential for describing storminess over larger areas, also with regard to surface conditions. It is found that absolute pressure tendencies have improved informational value when describing storm activity over larger areas, while low pressure readings do not.
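One such proxy evaluation can be sketched as follows: compute an annual pressure-tendency proxy, gauge it against the annual 99th percentile of wind speed, and bootstrap a confidence interval for their correlation. The data below are synthetic stand-ins; the actual study uses REMO fields and also a binomial test.

```python
import numpy as np

rng = np.random.default_rng(6)

# Invented stand-ins: some years of 3-hourly station pressure and wind.
n_years, per_year = 50, 8 * 365
pressure = 1013 + 0.1 * rng.standard_normal((n_years, per_year)).cumsum(axis=1)
wind = np.abs(rng.standard_normal((n_years, per_year))) * 8

# One pressure-based proxy: the annual 99th percentile of absolute
# pressure tendencies; gauged against the annual 99th percentile of wind.
proxy = np.percentile(np.abs(np.diff(pressure, axis=1)), 99, axis=1)
storminess = np.percentile(wind, 99, axis=1)

# Bootstrap confidence interval for the proxy/storminess correlation.
corrs = []
for _ in range(2000):
    i = rng.integers(0, n_years, n_years)    # resample years with replacement
    corrs.append(np.corrcoef(proxy[i], storminess[i])[0, 1])
lo, hi = np.percentile(corrs, [2.5, 97.5])
print(f"correlation 95% CI: [{lo:.2f}, {hi:.2f}]")
```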