Search Results

Showing 1–10 of 66 items for Author or Editor: Isztar Zawadzki

Dominik Jacques and Isztar Zawadzki

Abstract

In radar data assimilation, statistically optimal analyses are sought by minimizing a cost function in which the variances and covariances of background and observation errors are correctly represented. Radar observations are distinctive in that they are often available at a spatial resolution comparable to that of the background estimates. Because of computational constraints and lack of information, it is impossible to represent the correlation of errors perfectly. In this study, the authors characterize the impact of such misrepresentations in an idealized framework where the spatial correlations of background and observation errors are each described by a homogeneous and isotropic exponential decay. Analyses obtained with a perfect representation of correlations are compared to others obtained by neglecting correlations altogether. These two sets of analyses are examined from theoretical and experimental perspectives. The authors show that if the spatial correlations of background and observation errors are similar, then neglecting the correlation of errors has a small impact on the quality of analyses. They suggest that the sampling noise, related to the precision with which analysis errors may be estimated, could be used as a criterion for determining when the correlations of errors may be neglected. Neglecting correlations altogether also yields better analyses than representing correlations for only one term in the cost function or through the use of data thinning. These results suggest that the computational costs of data assimilation could be reduced by neglecting the correlations of errors in areas where dense radar observations are available.
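For context, the cost function at issue can be written in the standard variational form (generic notation, not necessarily this paper's own; B and R are the background and observation error covariance matrices, H the observation operator):

$$
J(\mathbf{x}) = \tfrac{1}{2}\,(\mathbf{x}-\mathbf{x}_b)^{\mathsf{T}}\mathbf{B}^{-1}(\mathbf{x}-\mathbf{x}_b)
+ \tfrac{1}{2}\,(\mathbf{y}-\mathbf{H}\mathbf{x})^{\mathsf{T}}\mathbf{R}^{-1}(\mathbf{y}-\mathbf{H}\mathbf{x}).
$$

In the idealized framework above, the off-diagonal elements of B and R decay as ρ(d) = exp(−d/L) with separation distance d; neglecting the correlation of errors amounts to making B and R diagonal.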

Full access
Aitor Atencia and Isztar Zawadzki

Abstract

Lagrangian extrapolation of recent radar observations is a widely used deterministic nowcasting technique in operational and research centers. However, this technique does not account for errors due to changes in precipitation motion and to growth and decay, which limits forecasting skill. In this work, these uncertainties are introduced into the Lagrangian forecasts to generate an ensemble of realistic future realizations. The technique retains the well-known predictable large scales (low pass) and introduces stochastic noise in the small scales (high pass). Observed predictable properties of the small scales are incorporated into the generation of the stochastic noise. These properties yield realistic ensembles across different meteorological situations while narrowing the spread among members. Finally, spatial and temporal statistical properties of the resulting ensembles are verified to determine whether the technique introduces enough uncertainty while preserving the properties of the original field.
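As a rough illustration of the scale-separation idea, here is a minimal Python sketch (not the authors' implementation; the cutoff scale, spectral shaping, and normalization are assumptions made for the example): the field is split with a Fourier low-pass/high-pass mask, and the small scales are replaced by noise shaped to the field's own small-scale spectrum.

```python
import numpy as np

def perturb_small_scales(field, cutoff_wavelength_px, noise_std=1.0, rng=None):
    """Keep the predictable large scales of `field` and inject stochastic
    noise at the small scales (illustrative sketch, not the paper's code)."""
    rng = np.random.default_rng() if rng is None else rng
    ny, nx = field.shape
    ky = np.fft.fftfreq(ny)[:, None]
    kx = np.fft.fftfreq(nx)[None, :]
    k = np.sqrt(kx**2 + ky**2)                 # radial wavenumber (cycles/px)
    lowpass = k <= 1.0 / cutoff_wavelength_px  # large-scale (predictable) mask

    f_hat = np.fft.fft2(field)
    large = np.fft.ifft2(np.where(lowpass, f_hat, 0)).real

    # White noise shaped by the field's small-scale amplitude spectrum, so
    # the perturbation inherits a realistic spatial correlation structure.
    noise_hat = np.fft.fft2(rng.standard_normal((ny, nx)))
    shaped = np.fft.ifft2(np.where(lowpass, 0, noise_hat * np.abs(f_hat))).real
    shaped *= noise_std / (shaped.std() + 1e-12)
    return large + shaped
```

Calling this repeatedly with different random seeds on a Lagrangian extrapolation would give one crude ensemble member per call.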

Full access
Aitor Atencia and Isztar Zawadzki

Abstract

Nowcasting is the short-range forecast obtained from the latest observed state. Currently, heuristic techniques, such as Lagrangian extrapolation, are the most commonly used for rainfall forecasting. However, the Lagrangian extrapolation technique does not account for changes in the motion field or for growth and decay of precipitation. These errors are difficult to model analytically and are normally represented by stochastic processes. According to chaos theory, similar states, also called analogs, evolve in a similar way, up to an error related to the predictability of the situation. Consequently, finding such states in a historical dataset provides a way of forecasting that implicitly includes physical processes such as growth and decay.

The difficulty of this approach lies in finding the analogs. In this study, recent radar observations are compared with a 15-yr radar dataset. Similar states within the dataset are selected according to their spatial rainfall patterns, temporal storm evolution, and synoptic patterns to generate ensembles. This ensemble of analog states is verified against observations for four different events. In addition, it is compared with the previously mentioned Lagrangian stochastic ensemble by means of different scores. This comparison shows the weaknesses and strengths of each technique and could provide critical information for a future hybrid analog–stochastic nowcasting technique.
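A schematic of the analog search, under the simplifying assumption that similarity is measured by a root-mean-square difference of rainfall patterns alone (the paper additionally matches temporal storm evolution and synoptic patterns before forming the ensemble):

```python
import numpy as np

def find_analogs(current_pattern, archive, n_members=20):
    """Rank historical radar fields by similarity to the current pattern
    and return the indices of the closest ones as an analog ensemble.

    current_pattern : (ny, nx) array of rainfall rates
    archive         : (n_times, ny, nx) historical radar dataset
    """
    # Root-mean-square difference between the current field and every
    # archived field, used here as a simple similarity metric.
    rmsd = np.sqrt(((archive - current_pattern) ** 2).mean(axis=(1, 2)))
    best = np.argsort(rmsd)[:n_members]
    return best, rmsd[best]
```

The forecast for each member is then simply the observed evolution that followed the matched historical state.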

Full access
Aitor Atencia and Isztar Zawadzki

Abstract

Intrinsic predictability is defined as the uncertainty in a forecast due to small errors in the initial conditions. In fact, not only the amplitude but also the structure of these initial errors plays a key role in the evolution of the forecast. Several methodologies have been developed to create an ensemble of forecasts from a feasible set of initial conditions, such as bred vectors or singular vectors. However, these methodologies consider only the fastest growth direction globally, which is represented by the Lyapunov vector.

In this paper, the simple Lorenz 63 model is used to compare bred vectors, random perturbations, and normal modes against analogs. The concept of analogs relies on ergodic theory to select compatible states for a given initial condition. These analogs have a complex structure in the phase space of the Lorenz attractor that is compatible with the properties of the nonlinear chaotic system.

It is shown that the initial average error growth rate of the analogs is similar to that obtained with bred vectors or normal modes (fastest growth), but the methods do not share other properties or statistics, such as the spread of these growth rates. An in-depth study of different properties of the analogs and of the previously existing perturbation methodologies is carried out to shed light on the consequences that the choice of perturbations has for forecasting.
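For readers unfamiliar with the setup, a compact Python sketch of the Lorenz 63 system and a breeding cycle follows (a generic illustration of the bred-vector method, not the paper's code; the step size, cycle length, and rescaling amplitude are arbitrary choices):

```python
import numpy as np

def lorenz63(state, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    """Right-hand side of the Lorenz (1963) system."""
    x, y, z = state
    return np.array([sigma * (y - x), x * (rho - z) - y, x * y - beta * z])

def step(state, dt=0.01):
    """One fourth-order Runge-Kutta step."""
    k1 = lorenz63(state)
    k2 = lorenz63(state + 0.5 * dt * k1)
    k3 = lorenz63(state + 0.5 * dt * k2)
    k4 = lorenz63(state + dt * k3)
    return state + dt * (k1 + 2 * k2 + 2 * k3 + k4) / 6.0

def bred_vector(state, amplitude=1e-3, n_cycles=50, steps_per_cycle=8, rng=None):
    """Breeding cycle: advance control and perturbed trajectories together,
    rescale their difference to a fixed amplitude after each cycle."""
    rng = np.random.default_rng() if rng is None else rng
    pert = rng.standard_normal(3)
    pert *= amplitude / np.linalg.norm(pert)
    for _ in range(n_cycles):
        ctrl, p = state, state + pert
        for _ in range(steps_per_cycle):
            ctrl, p = step(ctrl), step(p)
        pert = (p - ctrl) * amplitude / np.linalg.norm(p - ctrl)
        state = ctrl
    return pert / np.linalg.norm(pert)  # unit bred vector
```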

Full access
Dominik Jacques and Isztar Zawadzki

Abstract

In data assimilation, analyses are generally obtained by combining a “background,” taken from a previously initiated model forecast, with observations from different instruments. For optimal analyses, the error covariance of all information sources must be properly represented. In the case of radar data assimilation, such representation is of particular importance since measurements are often available at spatial resolutions comparable to that of the model grid. Unfortunately, misrepresenting the covariance of radar errors is unavoidable, as their true structure is unknown. This two-part study investigates the impacts of misrepresenting the covariance of errors when dense observations, such as radar data, are available. Experiments are performed in an idealized context. In Part I, analyses were obtained by using artificially simulated background and observation estimates. For the second part, presented here, background estimates from a convection-resolving model were used. As before, analyses were generated with the same input data but with different misrepresentations of the errors. The impacts of these misrepresentations can be quantified by comparing the two sets of analyses. It was found that the correlations of both the background and observation errors had to be represented to improve the quality of analyses. Of course, the concept of “errors” depends on how the “truth” is defined. When the truth was considered an unknown constant, as opposed to an unknown random variable, background errors were found to be biased. Correcting these biases was found to significantly improve the quality of analyses.
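For reference, the analysis minimizing such a cost function can be written in the standard linear-estimation form (generic notation, not the paper's):

$$
\mathbf{x}_a = \mathbf{x}_b + \mathbf{B}\mathbf{H}^{\mathsf{T}}\left(\mathbf{H}\mathbf{B}\mathbf{H}^{\mathsf{T}} + \mathbf{R}\right)^{-1}(\mathbf{y}-\mathbf{H}\mathbf{x}_b).
$$

Misrepresenting B or R changes this gain and hence the weight given to dense observations, while a systematic background bias adds an error component that no choice of gain can remove, which is consistent with the finding that correcting the bias improves the analyses.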

Full access
Urs Germann and Isztar Zawadzki

Abstract

The lifetime of precipitation patterns in Eulerian and Lagrangian space derived from continental-scale radar images is used as a measure of predictability. A three-step procedure is proposed. First, the motion field of precipitation is determined by variational radar echo tracking. Second, radar reflectivity is advected by means of a modified semi-Lagrangian advection scheme assuming stationary motion. Third, the Eulerian and Lagrangian persistence forecasts are compared to observations to calculate the lifetime and other measures of predictability. The procedure is repeated with images that have been decomposed according to scales to describe the scale-dependence of predictability.
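A minimal Python sketch of step 2, the backward (semi-Lagrangian) advection, assuming the motion field is already known and stationary (this one-step departure-point calculation is a simplification made for the example; the paper's modified scheme treats the trajectory computation more carefully):

```python
import numpy as np
from scipy.ndimage import map_coordinates

def lagrangian_persistence(field, u, v, lead_time):
    """Advect a radar field along a stationary motion field (u, v), in
    pixels per time unit, by tracing backward trajectories.
    Illustrative only: departure points are computed in a single step
    from the velocity at the arrival pixel, whereas a full
    semi-Lagrangian scheme iterates this calculation."""
    ny, nx = field.shape
    jj, ii = np.meshgrid(np.arange(ny), np.arange(nx), indexing="ij")
    y_dep = jj - v * lead_time  # where did the parcel now at (j, i)
    x_dep = ii - u * lead_time  # come from?
    return map_coordinates(field, [y_dep, x_dep], order=1,
                           mode="constant", cval=0.0)
```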

The analysis has a threefold application: (i) to determine the scale-dependence of predictability, (ii) to set a standard against which the skill of quantitative precipitation forecasting by numerical modeling can be evaluated, and (iii) to extend nowcasting by optimal extrapolation of radar precipitation patterns. The methodology can be applied to other field variables, such as brightness temperatures from weather satellite imagery.

Full access
Enrico Torlaschi and Isztar Zawadzki

Abstract

A format for optimal post-detection integration is discussed. The measurement cells in the integration scheme have equal down-range and cross-range resolution to conserve more of the variability of the precipitation field. Every measurement cell combines partially dependent data in both range and time to achieve an adequate number of independent samples without losing resolution. Thus, the standard deviation of the average signal intensity level over a cell (σ_j) is reduced to a more desirable value. The radial and tangential extents of the cells change as a function of range and are determined by σ_j. Such data processing is optimized with respect to biases related to reflectivity gradients, spatial resolution, and density of information.
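The statistics behind this reduction are standard (a generic result, not the paper's derivation): averaging M independent samples of an exponentially distributed echo power P reduces the relative standard deviation of the estimate as

$$
\frac{\sigma_{\bar{P}}}{\bar{P}} = \frac{1}{\sqrt{M}},
$$

which is why each cell accumulates partially dependent samples in range and time until the effective number of independent samples brings σ_j down to the target value.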

Full access
Urs Germann and Isztar Zawadzki

Abstract

Eulerian and Lagrangian persistence of precipitation patterns derived from continental-scale radar composite images are used as a measure of predictability and for nowcasting [the McGill algorithm for precipitation nowcasting by Lagrangian extrapolation (MAPLE)]. A previous paper introduced the method and focused on the lifetime of patterns of rainfall rates and the scale dependence of predictability. This paper shows how the method of persistence of radar precipitation patterns can be extended to produce probabilistic forecasts. For many applications, probabilistic information is at least as important as the expected point value. Four techniques are presented and compared. One is entirely new and makes use of the intrinsic relationship between scale and predictability. The results with this technique suggest potential use for downscaling of numerical model output. For the 143 h of precipitation analyzed so far, the lead times of the Lagrangian techniques exceeded those of the Eulerian techniques by roughly a factor of 2. Three of the four techniques involve a scale parameter. The slope of the relationship between optimum scale and lead time is about 1 and 2 km min⁻¹ for the Lagrangian and Eulerian techniques, respectively. The skill scores obtained for the four techniques can be used as a measure of predictability in terms of probabilistic rainfall rates. The progress of other probabilistic forecasting methods, such as expert systems or numerical models, can be evaluated against the standard set by simple persistence.
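One generic way to turn a deterministic extrapolation into exceedance probabilities at a given scale is neighborhood averaging; the sketch below is a stand-in illustrating the role of the scale parameter, not a reproduction of any of the four techniques:

```python
import numpy as np
from scipy.ndimage import uniform_filter

def exceedance_probability(forecast, threshold, scale_px):
    """Probability of exceeding a rain-rate threshold, estimated as the
    fraction of pixels above it within a square neighborhood of side
    `scale_px`. Per the paper, the optimum scale grows with lead time
    (a slope of about 1-2 km per minute)."""
    exceed = (forecast >= threshold).astype(float)
    return uniform_filter(exceed, size=scale_px, mode="nearest")
```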

Full access
Wanda Szyrmer and Isztar Zawadzki

Abstract

As a first step toward the retrieval of snow microphysics from two vertically pointing radars operating at X band and W band, a theoretical model of snow microphysics is formulated in which the number of unknown parameters is reduced to the snow particle density and to two bulk quantities controlling the particle size distribution. This reduction of parameters is achieved by normalizing not only the size distribution but also the snow particle mass in the mass–size relationship, as well as by using a relationship between snow density and snow terminal fall velocity. However, no single snow microphysical model could describe the observed variability in the radar measurements. The uncertainty in the developed deterministic relations that map the microphysical parameters to the observables is shown to be mainly associated with the assumed dependence of particle velocity on mass and with the particle size distribution (PSD) representation. Hence, various mass–velocity relationships, together with different generic functional forms of the PSD reported in the literature, are described in this paper and then used in the retrieval. The derived relations provide a reasonable range of the uncertainty associated with the microphysics when used for the actual retrieval of snow properties from observations in Part IV. The uncertainty in the backscattering computations for an individual particle, performed using Mie theory and assuming a spherical shape with nonuniform density, is not taken into account in this study.
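A hedged illustration of the generic functional forms involved (the actual coefficients and PSD forms are drawn from the literature the paper surveys and vary between studies): the mass–size and mass–velocity relationships are commonly written as power laws, and the PSD is normalized with respect to a characteristic size,

$$
m(D) = \alpha D^{\beta}, \qquad v = a\,m^{b}, \qquad N(D) = N_0^{*}\,F\!\left(\frac{D}{D_m}\right),
$$

where D is the particle size, D_m a characteristic size, and N_0* a normalized intercept parameter.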

Full access
Wanda Szyrmer and Isztar Zawadzki

Abstract

Based on the theory developed in Part III, this paper introduces a new method to retrieve snow microphysics from ground-based, collocated X- and W-band vertically pointing Doppler radars. To take into account the variety of microphysical relations observed in natural precipitating snow, and to quantify the uncertainty in the retrieval results caused by this variety, the retrieval is formulated using an ensemble-based method. The ensemble is determined by the spread of uncertainties in the microphysical descriptions applied to map the same radar observables to the retrieved quantities.

The model descriptors use diverse assumptions concerning the functional forms of the particle size distribution and the mass–velocity relations, all taken from previous observational studies. The mean of each ensemble is taken as the best estimate of the retrieval, while its spread, defined by the standard deviation, characterizes its uncertainty. The main retrieved products are the characteristic size, the snow mass content, and the density parameter, as well as the vertical air motion. The four observables used in the retrieval are the differences in reflectivity and in Doppler velocity at the two wavelengths, together with the equivalent reflectivity factor and the Doppler velocity at X band. Solutions that are not consistent with all four observables, after taking into account their estimated measurement errors, are eliminated from the ensembles. The application of the retrieval algorithm to real data yields a snow microphysical description that agrees with the snow characteristics seen in the vertical profile of the observed Doppler spectrum.
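A minimal sketch of the consistency filtering described above, with hypothetical array shapes and a simple box test for agreement within measurement error (the paper's actual acceptance criteria may differ):

```python
import numpy as np

def filter_ensemble(sim_obs, retrievals, observed, obs_error):
    """Keep only ensemble members whose simulated observables match the
    four measured radar observables within estimated measurement error,
    then report the ensemble mean and spread of the retrieved quantities.

    sim_obs    : (n_members, 4) simulated observables per member, e.g.
                 dual-wavelength reflectivity difference, Doppler velocity
                 difference, X-band reflectivity, X-band Doppler velocity
    retrievals : (n_members, k) retrieved quantities per member, e.g.
                 characteristic size, snow mass content, density parameter
    observed   : (4,) measured observables
    obs_error  : (4,) estimated measurement errors
    """
    consistent = np.all(np.abs(sim_obs - observed) <= obs_error, axis=1)
    if not consistent.any():
        raise ValueError("no member is consistent with the observations")
    kept = retrievals[consistent]
    return kept.mean(axis=0), kept.std(axis=0)  # best estimate, uncertainty
```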

Full access