Search Results

You are looking at 1 - 10 of 48 items for

  • Author or Editor: Hans von Storch
Hans von Storch

Abstract

The technique of “inflating” in downscaling, which makes the downscaled climate variable have the right variance, is based on the assumption that all local variability can be traced back to large-scale variability. For practical situations this assumption is not valid, and inflation is an inappropriate technique. Instead, additive, randomized approaches should be adopted.
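The contrast between the two approaches can be sketched with synthetic data. This is a minimal illustration, not the paper's implementation; the regression setup and all numbers are assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 10_000
x = rng.normal(size=n)                         # large-scale predictor
y = 0.6 * x + rng.normal(scale=0.8, size=n)    # local variable

# Regression-based downscaling underestimates the local variance.
b = np.cov(x, y)[0, 1] / np.var(x)
yhat = b * x

# "Inflation": rescale so that the variance is right. This assumes all
# local variability is traceable to the large scale -- the criticized step.
y_infl = yhat * (np.std(y) / np.std(yhat))

# Randomized additive alternative: carry the unexplained variance as noise.
noise_var = np.var(y) - np.var(yhat)
y_rand = yhat + rng.normal(scale=np.sqrt(noise_var), size=n)
```

Both variants restore the variance, but the inflated series remains perfectly correlated with the predictor, overstating the large-scale control of the local variable.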

Full access
Hans von Storch
and
Erich Roeckner

Abstract

One objective of general circulation models is to simulate, e.g., a “January” that is not distinguishable from observed Januaries. A strategy for verifying an individual simulated state is proposed. Its main elements are data compression by means of EOFs, a multivariate parametric test, and a subsequent univariate analysis.

The suggested technique is applied to four January simulations performed with the Hamburg University GCM. The meteorological parameters treated are the zonally averaged January mean of the geopotential itself and of the intensity of transient and stationary eddies of geopotential height at 300, 500, and 850 mb. The comparison is based on daily observations from 15 Januaries (1967–81).

It turns out that the midlatitudinal meridional gradient of geopotential height is significantly overestimated at all levels. The intensity of the transient eddies is significantly overestimated at 850 mb at practically all latitudes and at 300 and 500 mb at midlatitudes.
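The verification strategy (EOF compression followed by a multivariate test) can be sketched as follows. The data are synthetic, and the dimensions, test statistic, and variable names are illustrative assumptions, not the authors' code:

```python
import numpy as np

rng = np.random.default_rng(1)
n_obs, dim, k = 15, 20, 3

obs = rng.normal(size=(n_obs, dim))             # 15 observed "Januaries"
sim = obs.mean(0) + 0.3 * rng.normal(size=dim)  # one simulated state

# Data compression: EOFs of the observed anomalies.
anom = obs - obs.mean(0)
_, s, vt = np.linalg.svd(anom, full_matrices=False)
eofs = vt[:k]                                   # leading k EOFs

# Multivariate parametric test in the compressed space: Mahalanobis-type
# distance of the simulated state in EOF coordinates.
coef_obs = anom @ eofs.T
coef_sim = (sim - obs.mean(0)) @ eofs.T
d2 = np.sum(coef_sim**2 / coef_obs.var(0, ddof=1))
```

In practice the statistic would be compared against a reference distribution (e.g., Hotelling's T²) to decide whether the simulated state is distinguishable, followed by the univariate analysis of individual components.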

Full access
Frauke Feser
and
Hans von Storch

Abstract

A two-dimensional discrete spatial filter was developed. It serves as a means to classify meteorological fields on a limited-area grid according to their spatial dimensions by filtering certain wavenumber ranges, thereby performing an isotropic spatial-scale separation of the atmospheric fields. A general algorithm was developed that allows the construction of a filter closely approximating a specific isotropic response function. The filter is simple in construction and easy to apply while giving reasonable results, and the method allows considerable flexibility in choosing this specific response. In this way, low-, band-, and high-pass filters are obtained. Examples show an effective scale separation of atmospheric fields on limited-area grids that can be used for process studies, model evaluation, or comparisons.
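One way to realize such an isotropic scale separation is sketched below spectrally rather than as the paper's discrete convolution filter; the grid size, cutoff wavenumbers, and cosine taper are assumptions:

```python
import numpy as np

rng = np.random.default_rng(2)
ny, nx = 64, 64
field = rng.normal(size=(ny, nx))            # toy limited-area field

# Radial wavenumber magnitude on the grid.
ky = np.fft.fftfreq(ny)[:, None]
kx = np.fft.fftfreq(nx)[None, :]
k = np.sqrt(kx**2 + ky**2)

# Isotropic low-pass response: unity below k_cut, zero above k_stop,
# with a smooth cosine roll-off in between to limit ringing.
k_cut, k_stop = 0.08, 0.15
resp = np.clip((k_stop - k) / (k_stop - k_cut), 0.0, 1.0)
resp = 0.5 - 0.5 * np.cos(np.pi * resp)      # smooth taper

low = np.fft.ifft2(np.fft.fft2(field) * resp).real
high = field - low                           # complementary high-pass
```

A band-pass field follows as the difference of two low-pass fields with different cutoffs.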

Full access
Frauke Feser
and
Hans von Storch

Abstract

This study explores the possibility of reconstructing the weather of Southeast Asia for the last decades using an atmospheric regional climate model, the Climate version of the Lokal-Modell (CLM). For this purpose, global National Centers for Environmental Prediction–National Center for Atmospheric Research (NCEP–NCAR) reanalysis data were dynamically downscaled to 50 km and, in a double-nesting approach, to 18-km grid distance. To prevent the regional model from deviating significantly from the reanalyses with respect to large-scale circulation and large-scale weather phenomena, a spectral nudging technique was used.

The performance of this technique in dealing with Southeast Asian typhoons is now examined by considering an ensemble of simulations of one typhoon case. This analysis is new insofar as it deals with simulations done in the climate mode (so that any skill in reproducing the typhoon is not related to details of initial conditions), is done in ensemble mode (the same development is described by several simulations), and is done with a spectral nudging constraint (so that the observed large-scale state is enforced in the model domain). This case indicates that tropical storms that are coarsely described by the reanalyses are correctly identified and tracked; considerably deeper core pressure and higher wind speeds are simulated compared to the driving reanalyses. When the regional atmospheric model is run without spectral nudging, significant intraensemble variability occurs, and additional, nonobserved typhoons form. Thus it becomes very clear that lateral boundary conditions alone are insufficient to determine the details of the dynamic developments in the interior: the same lateral boundary conditions are consistent with different developments in the interior. Several sensitivity experiments were performed concerning varied grid distances, different initial starting dates of the simulations, and changed spectral nudging parameters.
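The idea of spectral nudging — relaxing only the large scales of the regional model toward the driving fields while leaving the small scales free — can be sketched in one dimension. The toy fields, wavenumber range, and relaxation coefficient are all assumptions:

```python
import numpy as np

rng = np.random.default_rng(3)
n = 128
model = rng.normal(size=n).cumsum()    # toy regional-model field
driver = rng.normal(size=n).cumsum()   # toy driving reanalysis field

def spectral_nudge(model, driver, n_large=4, alpha=0.5):
    """Relax only the largest scales of `model` toward `driver`."""
    fm, fd = np.fft.rfft(model), np.fft.rfft(driver)
    # Nudge wavenumbers 0..n_large; leave all smaller scales untouched.
    fm[:n_large + 1] += alpha * (fd[:n_large + 1] - fm[:n_large + 1])
    return np.fft.irfft(fm, n=len(model))

nudged = spectral_nudge(model, driver)
```

In a real 2D application the same relaxation is applied to the low zonal and meridional wavenumbers of selected prognostic fields, often only in the upper troposphere.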

Full access
Hans von Storch
and
Gerhard Hannoschöck

Abstract

Statistical properties of estimated nonisotropic principal vectors [empirical orthogonal functions (EOFs)] are reviewed and discussed. The standard eigenvalue estimator is nonnormally distributed and biased: the largest eigenvalue becomes overestimated, the smallest ones underestimated. Generally, the variance of the eigenvalue estimate is large. The standard eigenvalue estimator may be used to define an unbiased estimator, which, however, exhibits an increased variance. If a fixed set of EOFs is used, the EOF coefficients are not stochastically independent. The variances of the low-indexed coefficients become considerably overestimated by the respective estimated eigenvalues, those of the high-indexed coefficients underestimated. Even if the ratio of degrees of freedom to sample size is one-half or less, these disadvantages are still present, as is demonstrated by an example.
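The eigenvalue bias is easy to reproduce by Monte Carlo; the prescribed spectrum, sample size, and replication count below are arbitrary choices for illustration:

```python
import numpy as np

rng = np.random.default_rng(4)
dim, n, reps = 10, 20, 400
true_eig = np.linspace(2.0, 0.2, dim)   # prescribed covariance spectrum

est = np.empty((reps, dim))
for r in range(reps):
    # Sample with diagonal covariance whose eigenvalues are true_eig.
    x = rng.normal(size=(n, dim)) * np.sqrt(true_eig)
    c = np.cov(x, rowvar=False)
    est[r] = np.sort(np.linalg.eigvalsh(c))[::-1]

mean_est = est.mean(0)   # leading eigenvalue inflated, trailing deflated
```

With only 20 samples in 10 dimensions, the average leading sample eigenvalue lands well above its true value while the trailing one falls below it, just as the abstract states.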

Full access
Ute Luksch
and
Hans von Storch

Abstract

A stochastic specification for monthly mean wintertime eddy heat transport conditional upon the monthly mean circulation is proposed. The approach is based on an analog technique. The nearest neighbor for the monthly mean streamfunction (at 850 and 300 hPa) is searched for in a library composed of monthly data from a 1268-yr control simulation with a coupled ocean–atmosphere model. To reduce the degrees of freedom, a limited area (the North Atlantic sector) is used for the analog specification. The monthly means of the northward transient eddy flux of temperature (at 750 hPa) are simulated as a function of these analogs.

The stochastic model is applied to 300 years of a paleosimulation (last interglacial maximum, around 125 kyr BP). The level of variability of the eddy heat flux is reproduced by the analog estimator, as is the link between monthly mean circulation and synoptic-scale variability. The changed boundary conditions (solar radiation and CO2 level) cause the Eemian variability to be significantly reduced compared to the control simulation. Although analogs are not a very good predictor of heat fluxes for individual months, they turn out to be excellent predictors of the distribution (or at least the variance) of heat fluxes in an anomalous climate.
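The analog specification itself reduces to a nearest-neighbor lookup in the library. The sketch below uses synthetic circulation states and fluxes; all sizes and names are assumptions:

```python
import numpy as np

rng = np.random.default_rng(5)
n_lib, dim = 500, 6

# Library: monthly mean circulation states and the eddy heat flux
# occurring simultaneously with each state (synthesized here).
lib_circ = rng.normal(size=(n_lib, dim))
lib_flux = lib_circ @ rng.normal(size=dim) + 0.5 * rng.normal(size=n_lib)

def analog_flux(circ):
    """Specify the flux of the nearest library circulation state."""
    i = np.argmin(np.sum((lib_circ - circ) ** 2, axis=1))
    return lib_flux[i]
```

In practice the distance would be computed in a reduced space (e.g., over the limited North Atlantic sector) rather than on raw full-grid fields.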

Full access
Francis W. Zwiers
and
Hans von Storch

Abstract

Recurrence analysis was introduced to infer the degree of separation between a “control” and an “anomaly” ensemble of, say, seasonal means simulated in general circulation model (GCM) experiments. The concept of recurrence analysis is described as a particular application of a statistical technique called multiple discriminant analysis (MDA). Using MDA, univariate recurrence is easily generalized to multicomponent problems. Algorithms that can be used to estimate the level of recurrence and tests that can be used to assess the confidence in a priori specified levels of recurrence are presented.

Several of the techniques are used to reanalyze a series of El Niño sensitivity experiments conducted with the Canadian Climate Centre GCM. The simulated El Niño responses in DJF mean 500 mb height are all estimated to be more than 94% recurrent in the tropics and between 90% and 95% recurrent in the Northern Hemisphere between 20° and 60°N latitude.

Discrimination rules that can be used to classify individual realizations of climate as members of the control or “experimental” ensemble are obtained as a by-product of the multiple recurrence analysis. We show that it is possible to make reasonable inferences about the state of the eastern Pacific sea surface temperature by classifying observed DJF 500 mb height fields with discrimination rules derived from the GCM experiments.
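A two-group discriminant of this kind can be sketched as follows. The ensembles are synthetic, and the simple midpoint classification rule and the imposed shift are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(6)
n, dim = 40, 3
control = rng.normal(size=(n, dim))
anomaly = rng.normal(size=(n, dim)) + 1.5   # shifted "experimental" ensemble

# Fisher/multiple discriminant direction separating the two groups.
sw = np.cov(control, rowvar=False) + np.cov(anomaly, rowvar=False)
w = np.linalg.solve(sw, anomaly.mean(0) - control.mean(0))

# Discrimination rule: classify a state by the side of the midpoint
# between the projected group means; the fraction correctly separated
# plays the role of an estimated recurrence level.
thresh = 0.5 * ((control @ w).mean() + (anomaly @ w).mean())
recurrence = ((anomaly @ w > thresh).mean()
              + (control @ w <= thresh).mean()) / 2
```

Strongly separated ensembles drive the estimate toward 1 (fully recurrent); completely overlapping ensembles drive it toward 1/2.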

Full access
Eduardo Zorita
and
Hans von Storch

Abstract

The derivation of local-scale information from integrations of coarse-resolution general circulation models (GCMs) with the help of statistical models fitted to present observations is generally referred to as statistical downscaling. In this paper a relatively simple analog method is described and applied for downscaling purposes. According to this method the large-scale circulation simulated by a GCM is associated with the local variables observed simultaneously with the most similar large-scale circulation pattern in a pool of historical observations. The similarity of the large-scale circulation patterns is defined in terms of their coordinates in the space spanned by the leading observed empirical orthogonal functions.

The method can be checked by replicating the evolution of the local variables in an independent period. Its performance for monthly and daily winter rainfall in the Iberian Peninsula is compared to more complicated techniques, each belonging to one of the broad families of existing statistical downscaling techniques: a method based on canonical correlation analysis, as representative of linear methods; a method based on classification and regression trees, as representative of a weather generator based on classification methods; and a neural network, as an example of deterministic nonlinear methods.

It is found in these applications that the analog method generally performs as well as the more complicated methods, and it can be applied to both normally and nonnormally distributed local variables. Furthermore, it produces the right level of variability of the local variable and preserves the spatial covariance between local variables. On the other hand, linear multivariate methods offer a clearer physical interpretation, which more strongly supports their validity in an altered climate. Classification methods and neural networks are generally more complicated and do not directly offer a physical interpretation.
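A minimal sketch of the analog downscaling step, with similarity measured in the space of the leading EOFs as described above; the historical pool, EOF truncation, and local predictand are synthetic assumptions:

```python
import numpy as np

rng = np.random.default_rng(7)
n_hist, dim, k = 300, 50, 5

# Historical pool: large-scale circulation fields and the local
# variable observed simultaneously with each (synthesized here).
hist_circ = rng.normal(size=(n_hist, dim))
hist_local = rng.normal(size=n_hist)

# Similarity is defined by coordinates in the leading-EOF space.
anom = hist_circ - hist_circ.mean(0)
_, _, vt = np.linalg.svd(anom, full_matrices=False)
hist_coef = anom @ vt[:k].T

def downscale(gcm_field):
    """Return the local value observed with the most similar pattern."""
    coef = (gcm_field - hist_circ.mean(0)) @ vt[:k].T
    i = np.argmin(np.sum((hist_coef - coef) ** 2, axis=1))
    return hist_local[i]
```

Because the method resamples observed values, the downscaled series automatically has a realistic marginal distribution, which is why it handles nonnormally distributed local variables.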

Full access
Hans von Storch
and
Hinrich Reichardt

Abstract

Past variations of water levels at Cuxhaven, Germany (German Bight), are examined, and a scenario for future changes due to expected global warming is derived.

The observational record of Cuxhaven water levels features a linear upward trend in the annual mean water level of about 30 cm 100 yr−1 overlaid by irregular variations due to synoptic disturbances. These irregular storm-related variations are shown to have remained mostly stationary since the beginning of observations until today.

A scenario for future conditions is derived by means of a two-step downscaling approach. First, a “time slice experiment” is used to obtain a regionally disaggregated scenario for the time mean circulation for the time of expected doubling of atmospheric CO2 concentrations. Then, an empirical downscaling model is derived, which relates intramonthly percentiles of storm-related water-level variations at Cuxhaven to variations in the monthly mean air pressure field over Europe and the northern North Atlantic.

Past variations of storm-related intramonthly percentiles are well reproduced by the downscaling model, so that the statistical model may be credited with skill. The combined time slice–statistical model “predicts,” for the expected time of doubled atmospheric CO2 concentrations in the decade around 2035, an insignificant rise of the 50th, 80th, and 90th percentiles of storm-related water-level variations in Cuxhaven of less than 10 cm, which is well within the range of natural interdecadal variability. These numbers have to be added to the rise in mean sea level due to thermal expansion and other slow processes.
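The statistical step — relating intramonthly percentiles of storm-related variations to a monthly mean pressure predictor — can be sketched with synthetic data. A single pressure index and a linear regression stand in for the full pressure-field model; all numbers are assumptions:

```python
import numpy as np

rng = np.random.default_rng(8)
months, days = 120, 30

# Toy monthly mean pressure index and daily water-level anomalies whose
# spread increases with the index (storm-related variations).
press = rng.normal(size=months)
daily = rng.normal(size=(months, days)) * np.exp(0.3 * press)[:, None]

# Predictand: intramonthly 90th percentile of the daily variations.
p90 = np.percentile(daily, 90, axis=1)

# Empirical downscaling model: regress the percentile on the predictor.
slope, intercept = np.polyfit(press, p90, 1)
p90_hat = intercept + slope * press
```

In the paper the predictor is the monthly mean air pressure field over Europe and the northern North Atlantic rather than a single index.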

Full access
Francis W. Zwiers
and
Hans von Storch

Abstract

The comparison of means derived from samples of noisy data is a standard part of climatology. When the data are not serially correlated, the appropriate statistical tool for this task is usually the conventional Student's t test. However, in climatological applications the data are frequently serially correlated, with the result that the t test in its standard form is not applicable. The usual solution to this problem is to scale the t statistic by a factor that depends upon the equivalent sample size n_e.

It is shown, by means of simulations, that the revised t test is often conservative (the actual significance level is smaller than the specified significance level) when the equivalent sample size is known. However, in most practical cases the equivalent sample size is not known. Then the test becomes liberal (the actual significance level is greater than the specified significance level). This systematic error becomes small when the true equivalent sample size is large (greater than approximately 30).

The difficulties inherent in difference of means tests when there is serial dependence are reexamined. Guidelines for the application of the “usual” t test are provided and two alternative tests are proposed that substantially improve upon the “usual” t test when samples are small.
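The scaled test discussed above can be sketched as follows, with the equivalent sample size estimated from the lag-1 autocorrelation under an AR(1) assumption; the series and parameters are synthetic:

```python
import numpy as np

rng = np.random.default_rng(9)
n, rho = 100, 0.6

def ar1(n, rho):
    """Generate an AR(1) series (serially correlated 'climate' data)."""
    x = np.empty(n)
    x[0] = rng.normal()
    for t in range(1, n):
        x[t] = rho * x[t - 1] + rng.normal() * np.sqrt(1 - rho**2)
    return x

x, y = ar1(n, rho), ar1(n, rho)

def eq_sample_size(z):
    """Equivalent sample size n_e from the lag-1 autocorrelation."""
    r1 = np.corrcoef(z[:-1], z[1:])[0, 1]
    return len(z) * (1 - r1) / (1 + r1)

# Difference-of-means t statistic with n replaced by estimated n_e.
ne_x, ne_y = eq_sample_size(x), eq_sample_size(y)
t = (x.mean() - y.mean()) / np.sqrt(x.var(ddof=1) / ne_x
                                    + y.var(ddof=1) / ne_y)
```

The abstract's caveat applies exactly here: because n_e is itself estimated from the data, the resulting test tends to be liberal for short series.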

Full access