Search Results

You are looking at 1 - 10 of 13 items for

  • Author or Editor: Thomas Auligné
  • All content
Thomas Auligné

Abstract

In Part I of this two-part paper, the multivariate minimum residual (MMR) scheme was introduced to retrieve profiles of cloud fraction from satellite infrared radiances and to identify clear observations. In this second part, the scheme is validated with real observations from the Atmospheric Infrared Sounder (AIRS) instrument. The new method is compared with the cloud detection scheme of McNally and Watts, which is operational at the European Centre for Medium-Range Weather Forecasts (ECMWF). Cloud-top pressures derived from the two algorithms are comparable, with some differences at the edges of synoptic cloud systems. The population of channels considered clear is less contaminated with residual cloud for the MMR scheme. Further procedures, based on the formulation of variational quality control, can be applied during the variational analysis to reduce the weight of observations that have a high chance of being contaminated by cloud. Finally, the MMR scheme can be used as a preprocessing step to improve the assimilation of cloud-affected infrared radiances.
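A common way to realize the variational quality control weighting mentioned above is a Gaussian-plus-flat mixture model for observation errors. The sketch below is a standard-textbook illustration, not the paper's implementation; the parameter names and default values are assumptions.

```python
import numpy as np

def varqc_weight(departure, sigma_o, gross_error_prob=0.05, flat_width=5.0):
    """Weight in [0, 1] applied to an observation's contribution.

    Assumes the classic Gaussian-plus-flat mixture model for observation
    errors: the weight is the posterior probability that the observation is
    NOT grossly contaminated (e.g., by residual cloud), given its normalized
    departure from the background. Parameter names and defaults are
    illustrative assumptions, not values from the paper.
    """
    z = departure / sigma_o                         # normalized departure
    jo = 0.5 * z**2                                 # Gaussian observation cost
    a = gross_error_prob
    # gamma balances the flat (gross error) pdf against the Gaussian pdf
    gamma = a / (1.0 - a) * np.sqrt(2.0 * np.pi) / (2.0 * flat_width)
    return np.exp(-jo) / (gamma + np.exp(-jo))      # -> 0 for large departures
```

Observations with large normalized departures, such as those likely contaminated by residual cloud, then receive weights near zero and contribute little to the analysis.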

Full access
Thomas Auligné

Abstract

A new method is presented for cloud detection and the retrieval of three-dimensional cloud fraction from satellite infrared radiances. This method, called multivariate minimum residual (MMR), is inspired by the minimum residual technique of Eyre and Menzel and is especially suitable for exploiting the large number of channels available from hyperspectral infrared sounders. Its accuracy is studied in a theoretical framework in which the observations and the numerical model are assumed to be perfect. Of particular interest is the amount of independent information about the cloud that can be extracted as a function of the number of channels used. The technical implementation of the method is also briefly discussed. The MMR scheme is validated with the Atmospheric Infrared Sounder (AIRS) instrument using simulated observations. The new method is compared with the cloud-detection scheme of McNally and Watts, which is operational at the European Centre for Medium-Range Weather Forecasts (ECMWF) and considered the state of the art in cloud detection for hyperspectral infrared sounders.
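In spirit, a minimum residual retrieval finds the cloud-fraction profile whose mixture of clear-sky and single-level overcast radiances best reproduces the observations across many channels. The sketch below is a minimal illustration under a simple linear mixing assumption; it is not the MMR implementation described in the paper.

```python
import numpy as np
from scipy.optimize import lsq_linear

def retrieve_cloud_fractions(y_obs, y_clear, y_overcast):
    """Least-squares cloud-fraction retrieval across many channels.

    y_obs:      (n_channels,) observed radiances
    y_clear:    (n_channels,) simulated clear-sky radiances
    y_overcast: (n_channels, n_levels) simulated radiances for an opaque
                cloud placed at each model level

    Assumes the observed radiance is a linear mixture,
        y_obs ~ y_clear + sum_k c_k * (y_overcast[:, k] - y_clear),
    and solves for cloud fractions c_k constrained to [0, 1].
    """
    A = y_overcast - y_clear[:, None]  # channel sensitivity to cloud per level
    b = y_obs - y_clear                # residual to be explained by cloud
    return lsq_linear(A, b, bounds=(0.0, 1.0)).x
```

A scene can then be flagged as clear when the retrieved fractions are close to zero at all levels.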

Full access
Benjamin Ménétrier and Thomas Auligné

Abstract

The control variable transform (CVT) is a keystone of variational data assimilation. In publications using such a technique, the background term of the transformed cost function is defined as a canonical inner product of the transformed control variable with itself. However, it is shown in this paper that this practical definition of the cost function is not correct if the CVT uses a square root of the background error covariance matrix that is not square. Fortunately, it is then shown that there is a manifold of the control space for which this flaw has no impact, and that most minimizers used in practice work precisely in this manifold. It is also shown that both the correct and the practical transformed cost functions have the same minimum. This explains more rigorously why the CVT works in practice. The case of a singular background error covariance matrix is finally detailed, showing that the practical cost function still reaches the best linear unbiased estimate (BLUE).
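For reference, the CVT writes the analysis increment as \(\delta x = \mathbf{U} v\) with \(\mathbf{B} = \mathbf{U}\mathbf{U}^{\mathrm{T}}\), and the practical transformed cost function discussed above takes the standard form below. This is a textbook reconstruction for context, not an equation quoted from the paper; \(d\) denotes the innovation and \(\mathbf{H}\) the linearized observation operator.

```latex
% Standard CVT form (textbook notation, not quoted from the paper);
% increment \delta x = U v, with B = U U^T and innovation d:
J(v) = \underbrace{\tfrac{1}{2}\, v^{\mathrm{T}} v}_{\text{practical background term}}
     + \tfrac{1}{2}\, \bigl(\mathbf{H}\mathbf{U}\, v - d\bigr)^{\mathrm{T}}
       \mathbf{R}^{-1} \bigl(\mathbf{H}\mathbf{U}\, v - d\bigr)
```

When \(\mathbf{U}\) is not square, \(\tfrac{1}{2} v^{\mathrm{T}} v\) need not equal \(\tfrac{1}{2}\,\delta x^{\mathrm{T}} \mathbf{B}^{-1}\,\delta x\) for every \(v\); it does on the manifold mentioned in the abstract, which is where practical minimizers operate.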

Full access
Yann Michel and Thomas Auligné

Abstract

The structure of the analysis increments in a variational data assimilation scheme is strongly driven by the formulation of the background error covariance matrix, especially in data-sparse areas such as the Antarctic region. The gridpoint background error modeling in this study makes use of regression-based balance operators between variables, empirical orthogonal function decomposition to define the vertical correlations, gridpoint variances, and efficient high-order recursive filters to impose horizontal correlations. A distinctive feature is that the regression operators and the recursive filters have been made spatially inhomogeneous. The computation of the background error statistics is performed with the Weather Research and Forecasting (WRF) Model from a set of forecast differences. The mesoscale limited-area domains of interest cover Antarctica. Inhomogeneities of background errors are shown to be related to the particular orography and physics of the area. Differences appear particularly pronounced between ocean and land boundary layers.
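The recursive filters mentioned above approximate Gaussian-like horizontal correlations with cheap sequential sweeps along each grid direction. Below is a minimal first-order, one-dimensional sketch assuming a constant smoothing coefficient `alpha`; the paper's filters are high order and spatially inhomogeneous, which is not reproduced here.

```python
import numpy as np

def recursive_filter_1d(field, alpha, n_passes=4):
    """First-order forward-backward recursive filter along one dimension.

    Each pass applies
        forward:  q[i] = alpha * q[i-1] + (1 - alpha) * p[i]
        backward: s[i] = alpha * s[i+1] + (1 - alpha) * q[i]
    Repeated passes approximate Gaussian smoothing whose length scale is
    controlled by alpha in (0, 1). A spatially varying alpha (not shown)
    would yield the inhomogeneous correlations described in the abstract.
    """
    s = np.asarray(field, dtype=float).copy()
    for _ in range(n_passes):
        for i in range(1, s.size):            # forward sweep
            s[i] = alpha * s[i - 1] + (1.0 - alpha) * s[i]
        for i in range(s.size - 2, -1, -1):   # backward sweep
            s[i] = alpha * s[i + 1] + (1.0 - alpha) * s[i]
    return s
```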

Full access
Benjamin Ménétrier and Thomas Auligné

Abstract

Localization and hybridization are two methods used in ensemble data assimilation to improve the accuracy of sample covariances. It is shown in this paper that it is beneficial to consider them jointly in the framework of linear filtering of sample covariances. Following previous work on localization, an objective method is provided to optimize both localization and hybridization coefficients simultaneously. Theoretical and experimental evidence shows that if optimal weights are used, localized-hybridized sample covariances are always more accurate than their localized-only counterparts, regardless of the static covariance matrix specified for the hybridization. Experimental results obtained using a 1000-member ensemble as a reference show that the method developed in this paper can efficiently provide localization and hybridization coefficients consistent with the variable, vertical level, and ensemble size. Spatially heterogeneous optimization is shown to improve the accuracy of the filtered covariances, and consideration of both vertical and horizontal covariances is shown to affect the hybridization coefficients.
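Schematically, a localized-hybridized covariance blends a Schur-localized sample covariance with a static matrix. The sketch below illustrates that form with assumed scalar weights `beta_e` and `beta_s`; the paper's contribution, the objective joint optimization of these coefficients, is not reproduced here.

```python
import numpy as np

def localized_hybrid_covariance(ens_perts, L, B_static, beta_e, beta_s):
    """Blend a Schur-localized sample covariance with a static covariance.

    ens_perts: (n_members, n_grid) ensemble perturbations (mean removed)
    L:         (n_grid, n_grid) localization matrix, applied elementwise
    B_static:  (n_grid, n_grid) static background error covariance
    beta_e, beta_s: hybridization weights (scalars here for simplicity;
                    the paper optimizes such coefficients objectively)
    """
    n_members = ens_perts.shape[0]
    P_ens = ens_perts.T @ ens_perts / (n_members - 1)  # sample covariance
    return beta_e * (L * P_ens) + beta_s * B_static    # Schur product, then blend
```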

Full access
Hongli Wang, Thomas Auligné, and Hugh Morrison

Abstract

The study of the evolution characteristics of initial perturbations is an important subject in four-dimensional variational data assimilation (4DVAR) and mesoscale predictability research. This paper evaluates the impact of microphysical scheme complexity on the propagation of initial condition perturbations for warm-season convection over the central United States. The Weather Research and Forecasting (WRF) Model, in conjunction with four configurations of the Morrison microphysics parameterization of varying complexity, was used to simulate convective cases on grids nested to 5-km horizontal grid spacing. Results indicate that, on average, the four configurations show similar perturbation evolution in amplitude and spatial pattern during the first 2 h. After that, the simplified schemes introduce significant errors in amplitude and spatial pattern. The simplest (liquid only) and most complex schemes show almost the same growth rate of initial perturbations, with different amplitudes, during the 6-h forecast, suggesting that the simplest scheme does not reduce the nonlinearity present in the most complex scheme. The evolution of vertical velocity and total condensate is more nonlinear than that of horizontal wind, temperature, and humidity, which suggests that observations of cloud variables and vertical velocity should have a shorter assimilation time window (less than 1 h) than horizontal wind, temperature, and humidity observations. The simplified liquid-only microphysics scheme can be an acceptable substitute for the more complex one for short time windows (less than 1 h).
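The amplitude and nonlinearity diagnostics discussed above can be illustrated with simple norms of paired perturbed forecasts. The sketch below is a generic illustration; the twin positive/negative perturbation setup and the array-based fields are assumptions, not details taken from the paper.

```python
import numpy as np

def perturbation_amplitude(fc_pert, fc_ctrl):
    """RMS amplitude of the evolved perturbation at one forecast time."""
    return np.sqrt(np.mean((fc_pert - fc_ctrl) ** 2))

def nonlinearity_index(fc_plus, fc_minus, fc_ctrl):
    """Departure from linear perturbation evolution.

    For twin runs started from +delta and -delta initial perturbations,
    purely linear dynamics would give fc_plus + fc_minus = 2 * fc_ctrl;
    the size of the residual relative to the spread measures nonlinearity.
    """
    residual = fc_plus + fc_minus - 2.0 * fc_ctrl
    spread = fc_plus - fc_minus
    return np.linalg.norm(residual) / max(np.linalg.norm(spread), 1e-12)
```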

Full access
Yann Michel, Thomas Auligné, and Thibaut Montmerle

Abstract

Convective-scale models used in numerical weather prediction (NWP) now include detailed, realistic parameterizations of cloud and precipitation processes. Yet they still lack advanced data assimilation schemes able to use observations efficiently to initialize hydrometeor fields. This challenging task may benefit from a better understanding of the statistical structure of background errors in precipitating areas for both traditional and hydrometeor variables, which is the goal of this study. A special binning has been devised to compute separate background error covariance matrices for precipitating and nonprecipitating areas. This binning is based on two-dimensional geographical masks defined by the vertically averaged rain content of the background error perturbations. The sample for computing the covariances is taken from an ensemble of short-range forecasts run at 3-km resolution for the prediction of two specific cases of convective storms over the United States. The covariance matrices and associated diagnostics are built on the control variable transform formulation typical of variational data assimilation. The comparison especially highlights the strong coupling of specific humidity, cloud, and rain content with divergence. Shorter horizontal correlations are obtained in precipitating areas. Vertical correlations mostly reflect the vertical extension of cloud due to convective processes. The statistics for hydrometeor variables show physically meaningful autocovariances and statistical couplings with other variables. Issues in the assimilation of radar reflectivity, or more generally of observations linked to cloud and rain content, with this kind of background error matrix formulation are then briefly discussed.
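The binning described above can be sketched as a two-dimensional mask built from the vertically averaged rain content, with sample covariances then accumulated separately for each bin. The threshold value and array layouts below are illustrative assumptions:

```python
import numpy as np

def rain_masks(rain_content, rain_threshold=1e-5):
    """Split grid columns into precipitating / nonprecipitating bins.

    rain_content: (n_levels, n_points) rain content of the background
                  perturbations; the threshold value is an assumption.
    """
    column_rain = rain_content.mean(axis=0)   # vertical average per column
    precip = column_rain >= rain_threshold
    return precip, ~precip

def binned_covariance(perts, mask):
    """Sample covariance restricted to one geographical bin.

    perts: (n_members, n_vars, n_points) ensemble perturbations.
    """
    flat = perts[:, :, mask].reshape(perts.shape[0], -1)
    flat = flat - flat.mean(axis=0)
    return flat.T @ flat / (flat.shape[0] - 1)
```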

Full access
Thomas Nehrkorn, Bryan Woods, Thomas Auligné, and Ross N. Hoffman

Abstract

Alignment errors [i.e., cases where coherent structures (“features”) of clouds or precipitation in the background have position errors] can lead to large and non-Gaussian background errors. Assimilation of cloud-affected radiances using additive increments derived by variational and/or ensemble methods can be problematic in these situations. To address this problem, the Feature Calibration and Alignment technique (FCA) is used here for correcting position errors by displacing background fields. A set of two-dimensional displacement vectors is applied to forecast fields to improve the alignment of features in the forecast and observations. These displacement vectors are obtained by a nonlinear minimization of a cost function that measures the misfit to observations, along with a number of additional constraints (e.g., smoothness and nondivergence of the displacement vectors) to prevent unphysical solutions. The method was applied in an idealized case using Weather Research and Forecasting (WRF) Model forecast fields for Hurricane Katrina. Application of the displacement vectors to the three-dimensional WRF fields resulted in improved predicted hurricane positions in subsequent forecasts. When applied to a set of high-resolution forecasts of deep moist convection over the central United States, displacements are able to efficiently characterize part of the ensemble spread. To test its application as an analysis preprocessor, FCA was applied to a real-data case of cloud-affected radiances of one of the Atmospheric Infrared Sounder (AIRS) channels. The displaced background resulted in an improved fit to the AIRS observations in all cloud-sensitive channels.
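The displacement retrieval described above amounts to warping the background with a smooth two-dimensional displacement field chosen to reduce the misfit to observations. The sketch below is a heavily simplified illustration (full-resolution displacement grid, quadratic smoothness penalty, generic scipy optimizer); it is not the FCA code used in the paper.

```python
import numpy as np
from scipy.ndimage import map_coordinates
from scipy.optimize import minimize

def warp(field, dx, dy):
    """Displace a 2D field by (dx, dy) using bilinear interpolation."""
    ny, nx = field.shape
    jj, ii = np.meshgrid(np.arange(ny), np.arange(nx), indexing="ij")
    return map_coordinates(field, [jj - dy, ii - dx], order=1, mode="nearest")

def fca_cost(d_flat, background, obs, smooth_weight=1.0):
    """Misfit to observations plus a smoothness penalty on displacements."""
    dx, dy = d_flat.reshape(2, *background.shape)
    misfit = np.sum((warp(background, dx, dy) - obs) ** 2)
    smooth = 0.0
    for comp in (dx, dy):                # penalize rough displacement fields
        gy, gx = np.gradient(comp)
        smooth += np.sum(gy**2 + gx**2)
    return misfit + smooth_weight * smooth

# Hypothetical usage on small 2D arrays `bg` and `obs`:
#   d0 = np.zeros(2 * bg.size)
#   result = minimize(fca_cost, d0, args=(bg, obs), method="L-BFGS-B")
#   dx, dy = result.x.reshape(2, *bg.shape)
```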

Full access
Thomas Nehrkorn, Bryan K. Woods, Ross N. Hoffman, and Thomas Auligné

Abstract

The Feature Calibration and Alignment technique (FCA) has been developed to characterize errors that a human would ascribe to a change in the position or intensity of a coherent feature, such as a hurricane. Here the feature alignment part of FCA is implemented in the Weather Research and Forecasting Data Assimilation system (WRFDA) to correct position errors in background fields and tested in simulation for the case of Hurricane Katrina (2005). The displacement vectors determined by feature alignment can be used to explain part of the background error and make the residual background errors smaller and more Gaussian. Here a set of 2D displacement vectors to improve the alignment of features in the forecast and observations is determined by solving the usual variational data assimilation problem, simultaneously minimizing the misfit to observations and a constraint on the displacements. This latter constraint is currently implemented by hijacking the usual background term for the midlevel u- and v-wind components. The full model fields are then aligned using a procedure that minimizes dynamical imbalances by displacing only conserved or quasi-conserved quantities. Simulation experiments show the effectiveness of these procedures in correcting gross position errors and improving short-term forecasts. Compared to earlier experiments, even this initial implementation of feature alignment produces improved short-term forecasts. Adding the calculation of displacements to WRFDA advances the key contribution of FCA toward mainstream implementation, since all observations with a corresponding observation operator may be used, and the existing methodology for estimating background error covariances may be reused to refine the displacement error covariances.
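In the variational form described above, the displacement field plays the role of the control variable. A schematic cost function, written in standard notation for illustration rather than quoted from the paper, is:

```latex
% Schematic displacement cost function (illustrative notation, not quoted
% from the paper); d is the 2D displacement control vector and B_d its
% assumed error covariance, borrowed here from the midlevel wind term:
J(d) = \tfrac{1}{2}\, d^{\mathrm{T}} \mathbf{B}_d^{-1}\, d
     + \tfrac{1}{2}\, \bigl(y - H[x_b(d)]\bigr)^{\mathrm{T}}
       \mathbf{R}^{-1} \bigl(y - H[x_b(d)]\bigr)
```

where \(x_b(d)\) denotes the background fields aligned by the displacement \(d\), and the covariance \(\mathbf{B}_d\) corresponds to the constraint that the abstract notes is borrowed from the midlevel u- and v-wind background term.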

Full access
Thomas Auligné, Benjamin Ménétrier, Andrew C. Lorenc, and Mark Buehner

Abstract

Hybrid variational–ensemble data assimilation (hybrid DA) is widely used in research and operational systems, and it is considered the current state of the art for the initialization of numerical weather prediction models. However, hybrid DA requires a separate ensemble DA to estimate the uncertainty in the deterministic variational DA, which can be suboptimal both technically and scientifically. A new framework called the ensemble–variational integrated localized (EVIL) data assimilation addresses this inconvenience by updating the ensemble analyses using information from the variational deterministic system. The goal of EVIL is to encompass and generalize existing ensemble Kalman filter methods in a variational framework. Particular attention is devoted to the affordability and efficiency of the algorithm in preparation for operational applications.

Full access