Search Results

You are looking at 1–10 of 14 items for:

  • Author or Editor: Milija Zupanski
  • All content
Milija Zupanski

Abstract

A preconditioning method suitable for use in four-dimensional variational (4DVAR) data assimilation is proposed. The method is a generalization of the preconditioning previously developed by the author, now designed to include direct observations as well as different forms of the cost function. The original approach was based on an estimate of the ratio between the expected decrease of the cost function and the gradient norm, derived from an approximate Taylor series expansion of the cost function. The generalized method employs only basic linear functional analysis, while preserving the efficiency of the original method.

The preconditioning is tested in a realistic 4DVAR assimilation environment: the data are direct observations operationally used at the National Centers for Environmental Prediction (formerly the National Meteorological Center), the forecast model is a full-physics regional eta model, and the adjoint model includes all physics, except radiation. The results of five 4DVAR data assimilation experiments, using a memoryless quasi-Newton minimization algorithm, show a significant benefit of the new preconditioning. On average, the minimization algorithm converges in about 20–25 iterations. In particular, after only 10 iterations, about 95% of the cost function decrease was achieved in all five cases. Especially encouraging is the fact that these results are obtained with physical processes present in the adjoint model.
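
The idea behind this kind of preconditioning can be sketched on a toy linear-Gaussian problem. The following is only an illustration with synthetic matrices, not the paper's method or its eta-model setting: a control-variable transform v = B^{-1/2}(x - x_b) replaces the cost function Hessian B^{-1} + H^T R^{-1} H with the much better conditioned I + (B^{1/2})^T H^T R^{-1} H B^{1/2}, so even plain gradient descent converges quickly.

```python
import numpy as np

rng = np.random.default_rng(0)
n, m = 8, 5

# Synthetic, deliberately ill-conditioned background covariance B
# (eigenvalues spanning six decades); H is a generic linear observation operator.
Q, _ = np.linalg.qr(rng.standard_normal((n, n)))
B = Q @ np.diag(np.logspace(-6, 0, n)) @ Q.T
H = rng.standard_normal((m, n))
R = np.eye(m)                         # unit observation-error covariance

xb = rng.standard_normal(n)                   # background state
y = H @ xb + 0.1 * rng.standard_normal(m)     # synthetic observations

Bhalf = np.linalg.cholesky(B)         # B = Bhalf @ Bhalf.T

# Hessian of the quadratic cost in the original space vs. in the
# preconditioned control space v = Bhalf^{-1} (x - xb)
Hess_x = np.linalg.inv(B) + H.T @ H
Hess_v = np.eye(n) + Bhalf.T @ H.T @ H @ Bhalf
cond_x = np.linalg.cond(Hess_x)
cond_v = np.linalg.cond(Hess_v)

# Plain gradient descent on J(v) = 0.5|v|^2 + 0.5|H(xb + Bhalf v) - y|^2;
# fast convergence is possible only because Hess_v is well conditioned.
step = 1.0 / np.linalg.eigvalsh(Hess_v)[-1]
v = np.zeros(n)
for _ in range(300):
    d = H @ (xb + Bhalf @ v) - y
    v -= step * (v + Bhalf.T @ (H.T @ d))

xa = xb + Bhalf @ v                   # analysis
# Closed-form minimizer of the linear/Gaussian problem, for comparison
K = B @ H.T @ np.linalg.inv(H @ B @ H.T + R)
xa_exact = xb + K @ (y - H @ xb)
print(cond_x, cond_v)                 # cond_v is orders of magnitude smaller
```

In this sketch the preconditioned condition number stays below roughly 1 + ||H||^2 regardless of how ill conditioned B is, which is the mechanism that makes a few minimization iterations sufficient.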

Full access
Milija Zupanski

Abstract

The sensitivity of the solution of an optimization problem with respect to a general parameter perturbation (e.g., the sensitivity in calculus of variations) is addressed. First, a total variation of the optimal solution is obtained as a by-product of an iterative minimization. Then a general relation between the sensitivity and the total variation is used to approximate the sensitivity in calculus of variations. The concept of total variation is itself very useful for tracing changes of the cost function (a forecast aspect) back to their sources in the initial conditions. For specific choices of the cost function, the total variation may be used to find the sources of a forecast error, the effect of a particular parameterization routine on the optimal solution, and so on. The proposed method for calculating the sensitivity in calculus of variations is approximate, but computationally more efficient than existing methods. An additional benefit is that realistic calculations, using a sophisticated forecast model with physics and real data, become feasible.

As an example, the method is applied to find the source, in the initial conditions, of a 24-h forecast error. For the gradient calculations, an adjoint model with partial physics (horizontal diffusion, large-scale precipitation, and cumulus convection) is employed. The forecast model is NMC's full-physics regional eta model. The results show the benefit of multiple iterations and the applicability of the method in realistic meteorological situations.
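
The adjoint calculation underlying such gradient computations can be sketched with a toy linear model (a hypothetical stand-in; the real calculation uses the eta model's adjoint). For a forecast-error cost J = 0.5 |x_N - x_ver|^2 with dynamics x_{k+1} = M x_k, one forward integration plus one reverse (adjoint) integration yields the full gradient with respect to the initial conditions:

```python
import numpy as np

rng = np.random.default_rng(1)
n, N = 4, 10

# A stable linear "forecast model" x_{k+1} = M x_k (toy stand-in for a real model)
M = 0.95 * np.linalg.qr(rng.standard_normal((n, n)))[0]

x0 = rng.standard_normal(n)       # initial conditions
x_ver = rng.standard_normal(n)    # verifying analysis at forecast time

def forecast(x):
    for _ in range(N):
        x = M @ x
    return x

def cost(x):
    e = forecast(x) - x_ver
    return 0.5 * e @ e            # forecast-error measure J(x0)

# Adjoint (reverse-time) integration: grad J = (M^T)^N (x_N - x_ver)
lam = forecast(x0) - x_ver
for _ in range(N):
    lam = M.T @ lam
grad_adj = lam

# Finite-difference check of the first gradient component
eps = 1e-6
e0 = np.zeros(n)
e0[0] = eps
fd = (cost(x0 + e0) - cost(x0 - e0)) / (2 * eps)
print(abs(fd - grad_adj[0]))      # agrees to rounding error
```

The point of the adjoint route is cost: the full gradient comes from a single backward sweep, instead of one perturbed forecast per control-vector component as in finite differencing.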

Full access
Milija Zupanski

Abstract

A new ensemble-based data assimilation method, named the maximum likelihood ensemble filter (MLEF), is presented. The analysis solution maximizes the likelihood of the posterior probability distribution, obtained by minimization of a cost function that depends on a general nonlinear observation operator. The MLEF belongs to the class of deterministic ensemble filters, since no perturbed observations are employed. As in variational and ensemble data assimilation methods, the cost function is derived using a Gaussian probability density function framework. Like other ensemble data assimilation algorithms, the MLEF produces an estimate of the analysis uncertainty (e.g., the analysis error covariance). In addition to the common use of ensembles in calculating the forecast error covariance, the ensembles in the MLEF are exploited to efficiently calculate the Hessian preconditioning and the gradient of the cost function. Because of the superior Hessian preconditioning, two to three iterative minimization steps are typically sufficient. The MLEF method is well suited for use with highly nonlinear observation operators, at a small additional computational cost for the minimization. The consistent treatment of nonlinear observation operators through optimization is an advantage of the MLEF over other ensemble data assimilation algorithms, and its cost is comparable to that of existing ensemble Kalman filter algorithms. The method is directly applicable to most complex forecast models and observation operators. In this paper, the MLEF method is applied to data assimilation with the one-dimensional Korteweg–de Vries–Burgers equation. The tested observation operator is quadratic, in order to make the assimilation problem more challenging. The results illustrate the stability of the MLEF performance, as well as the benefit of the cost function minimization, with improvement noted in terms of the rms error and the analysis error covariance. The statistics of innovation vectors (observation minus forecast) also indicate stable performance of the MLEF algorithm. Additional experiments suggest an amplified benefit of targeted observations in ensemble data assimilation.
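
A minimal sketch of the ensemble-space minimization described above, under simplifying assumptions (a toy quadratic observation operator, random synthetic perturbations, and a plain safeguarded Gauss–Newton iteration standing in for the actual MLEF algorithm): the control variable is the vector w of ensemble weights, and the ensemble-space Hessian approximation I + Zh^T R^{-1} Zh is small and well conditioned, which is why few iterations suffice.

```python
import numpy as np

rng = np.random.default_rng(2)
n, ne = 6, 4                      # state dimension, ensemble size

xt = rng.standard_normal(n)               # synthetic truth
xf = xt + 0.3 * rng.standard_normal(n)    # forecast (background) mean
Z = 0.3 * rng.standard_normal((n, ne))    # ensemble perturbations, P_f = Z Z^T

def h(x):
    return x ** 2                 # quadratic (nonlinear) observation operator

Rinv = np.eye(n) / 0.01           # observation-error covariance R = 0.01 I
y = h(xt) + 0.1 * rng.standard_normal(n)

def J(w):
    """Cost in ensemble space, with x = xf + Z w."""
    r = h(xf + Z @ w) - y
    return 0.5 * (w @ w) + 0.5 * (r @ Rinv @ r)

# Gauss-Newton iteration in the low-dimensional ensemble space; the Hessian
# approximation I + Zh^T Rinv Zh is only ne x ne and well conditioned.
w = np.zeros(ne)
for _ in range(5):
    x = xf + Z @ w
    Zh = np.diag(2.0 * x) @ Z             # observed perturbations: Jacobian of h times Z
    grad = w + Zh.T @ Rinv @ (h(x) - y)
    dw = -np.linalg.solve(np.eye(ne) + Zh.T @ Rinv @ Zh, grad)
    t = 1.0                               # backtracking safeguard on the step
    while t > 1e-6 and J(w + t * dw) >= J(w):
        t *= 0.5
    if J(w + t * dw) >= J(w):
        break
    w = w + t * dw

xa = xf + Z @ w                   # analysis: maximum likelihood within span of Z
print(J(np.zeros(ne)), J(w))      # the cost decreases
```

Because the search is confined to the span of the ensemble perturbations, the minimization handles the nonlinear operator directly, with no linearization of h required outside each Gauss–Newton step.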

Full access
Milija Zupanski

Abstract

Four-dimensional variational data assimilation is applied to a regional forecast model as part of the development of a new data assimilation system at the National Meteorological Center (NMC). The assimilation employs an operational version of NMC's new regional forecast model, defined in eta vertical coordinates; the data used are operationally produced optimal interpolation (OI) analyses (using the first guess from NMC's global spectral model), available every 3 h. Humidity and parameterized processes are not included in the adjoint model integration. The calculation of gradients by the adjoint model is therefore approximate, since the forecast model is used in its full-physics operational form. All experiments cover a 12-h assimilation period with a subsequent 48-h forecast. Three different types of assimilation experiments are performed:

  1. adjustment of initial conditions only (standard “adjoint” approach),
  2. adjustment of a correction to the model equations only (variational continuous assimilation), and
  3. simultaneous or sequential adjustment of both initial conditions and the correction term.

Results are significantly better when the correction term is included in the assimilation. It is shown, for a single case, that the new technique [experiment 3] is able to produce a forecast better than the current conventional OI assimilation. It is very important to note that these results are obtained with an approximate gradient, calculated from a simplified adjoint model. Thus, it may be possible to perform operational four-dimensional variational data assimilation with realistic forecast models even before more complex adjoint models are developed. Our results also suggest that the large computational cost of assimilation may be reduced by using only a few iterations of the minimization algorithm. This fast convergence is encouraging from the perspective of operational use.
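
The benefit of adjusting a model-error correction term together with the initial conditions can be illustrated on a scalar toy model. This is only a sketch of the idea: the model, the constant correction term c, and the closed-form least-squares fit (standing in for the iterative variational minimization) are all illustrative assumptions.

```python
import numpy as np

a, f_true, x0_true, N = 0.9, 0.5, 2.0, 12   # true dynamics: x_{k+1} = a x_k + f_true

# Synthetic "observations" of the true trajectory at every step in the window
rng = np.random.default_rng(3)
xs = [x0_true]
for _ in range(N):
    xs.append(a * xs[-1] + f_true)
y = np.array(xs) + 0.01 * rng.standard_normal(N + 1)

# The assimilating model omits the forcing: x_{k+1} = a x_k + c, where the
# constant correction c is adjusted together with the initial condition x0.
# The trajectory is linear in (x0, c): x_k = a^k x0 + c (1 - a^k) / (1 - a).
k = np.arange(N + 1)
G = np.column_stack([a ** k, (1 - a ** k) / (1 - a)])

# Simultaneous fit of (x0, c) to all observations in the window
(x0_hat, c_hat), *_ = np.linalg.lstsq(G, y, rcond=None)
print(x0_hat, c_hat)   # close to the true values (2.0, 0.5)
```

Adjusting the initial condition alone cannot absorb the missing forcing in this toy setting; the correction term lets the deficient model track the observed trajectory over the whole window.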

Full access
Dusanka Zupanski and Milija Zupanski

Abstract

A methodology for model error estimation is proposed and examined in this study. It provides estimates of the dynamical model state, the bias, and the empirical parameters by combining three approaches: 1) ensemble data assimilation, 2) state augmentation, and 3) parameter and model bias estimation. Uncertainties of these estimates are also determined, in terms of the analysis and forecast error covariances, employing the same methodology.

The model error estimation approach is evaluated in application to the Korteweg–de Vries–Burgers (KdVB) numerical model within the framework of the maximum likelihood ensemble filter (MLEF). Experimental results indicate improved filter performance due to model error estimation. The innovation statistics also indicate that the estimated uncertainties are reliable. On the other hand, neglecting model errors, either in the form of an incorrect model parameter or a model bias, has detrimental effects on data assimilation, in some cases resulting in filter divergence.

Although the method is examined in a simplified model framework, the results are encouraging. It remains to be seen how the methodology performs in applications to more complex models.
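
The state-augmentation idea can be sketched with a toy scalar model and a generic stochastic ensemble Kalman filter (an assumption for illustration; the paper uses the MLEF). The model state x is augmented with a bias estimate b that evolves by persistence, and b is corrected through its sample cross covariance with the observed variable:

```python
import numpy as np

rng = np.random.default_rng(4)
a, b_true = 0.9, 0.5      # true model: x_{k+1} = a x_k + b_true
r = 0.1                   # observation-error standard deviation
ne, ncycles = 40, 100     # ensemble size, assimilation cycles

x_true = 0.0
xe = rng.standard_normal(ne)        # ensemble of model states
be = 0.5 * rng.standard_normal(ne)  # ensemble of bias estimates (prior mean 0)

for _ in range(ncycles):
    # Truth evolves with the bias; an observation of x is taken each cycle
    x_true = a * x_true + b_true
    y = x_true + r * rng.standard_normal()

    # Forecast of the augmented state (x, b): b evolves by persistence
    xe = a * xe + be

    # EnKF analysis observing x only; b is corrected through the
    # sample cross covariance between b and x
    dx = xe - xe.mean()
    db = be - be.mean()
    pxx = dx @ dx / (ne - 1)
    pbx = db @ dx / (ne - 1)
    denom = pxx + r ** 2
    innov = y + r * rng.standard_normal(ne) - xe   # perturbed observations
    xe = xe + (pxx / denom) * innov
    be = be + (pbx / denom) * innov

    # Mild inflation keeps the bias ensemble from collapsing prematurely
    xe = xe.mean() + 1.02 * (xe - xe.mean())
    be = be.mean() + 1.02 * (be - be.mean())

print(be.mean())          # converges toward b_true = 0.5
```

The spread of the be ensemble at the end is a rough uncertainty estimate for the bias, in the spirit of the analysis error covariances discussed in the abstract.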

Full access
Man Zhang, Milija Zupanski, Min-Jeong Kim, and John A. Knaff

Abstract

A regional hybrid variational–ensemble data assimilation system (HVEDAS), the maximum likelihood ensemble filter (MLEF), is applied to the 2011 version of the NOAA operational Hurricane Weather Research and Forecasting (HWRF) model to evaluate the impact of direct assimilation of cloud-affected Advanced Microwave Sounding Unit-A (AMSU-A) radiances in tropical cyclone (TC) core areas. The forward components of both the gridpoint statistical interpolation (GSI) analysis system and the Community Radiative Transfer Model (CRTM) are utilized to process and simulate satellite radiances. The central strategies that allow the use of cloud-affected radiances are (i) augmenting the control variables to include clouds and (ii) adding model cloud representations to the observation forward models to simulate the microwave radiances. The cloudy AMSU-A radiance assimilation in Hurricane Danielle's (2010) core area has produced encouraging results relative to the operational cloud-cleared radiance preprocessing procedures used in this study. Through the use of the HVEDAS, ensemble covariance statistics for a pseudo-AMSU-A observation in Danielle's core area show physically meaningful error covariances and statistical couplings with hydrometeor variables (i.e., the total-column condensate in Ferrier microphysics). The cloudy radiance assimilation in the TC core region (i.e., the ASR experiment) consistently reduced the root-mean-square errors of the background departures, and also generally improved the forecasts of Danielle's intensity as well as the quantitative cloud analysis and prediction. The results also indicate that an entropy-based information content quantification provides a useful metric for evaluating the utility of satellite observations in hybrid data assimilation.

Full access
Milija Zupanski, Dusanka Zupanski, David F. Parrish, Eric Rogers, and Geoffrey DiMego

Abstract

Four-dimensional variational (4DVAR) data assimilation experiments for the East Coast winter storm of 25 January 2000 (the “blizzard of 2000”) were performed. This storm received wide attention in the United States because it was one of the major failures of the operational forecast system: all operational models of the U.S. National Weather Service (NWS) failed to produce heavy precipitation over the Carolina–New Jersey corridor, especially during the early stage of the storm's development. The analysis cycle considered in this study is 0000–1200 UTC 24 January. This period was chosen because the forecast from 1200 UTC 24 January provided the most damaging guidance for the forecasters at the National Weather Service offices and elsewhere.

In the first set of experiments, the assimilation and forecast results of the 4DVAR and the operational three-dimensional variational (3DVAR) data assimilation methods are compared. The most striking difference is in the accumulated precipitation amounts. The 4DVAR experiment produced almost perfect 24-h accumulated precipitation during the first 24 h of the forecast (after data assimilation), with accurate heavy precipitation over North and South Carolina, whereas the operational 3DVAR-based forecast badly underforecast the precipitation. The reason for the difference is traced back to the initial conditions: the 4DVAR data assimilation was able to create strong surface convergence and an excess of precipitable water over Georgia. This initial convection was strengthened by a low-level jet over the next 6–12 h, finally resulting in deep convection throughout the troposphere.

In the second set of experiments, the impact of model error adjustment and precipitation assimilation is examined by comparing the forecasts initiated from various 4DVAR experiments. The results strongly indicate the need for the model error adjustment in the 4DVAR algorithm, as well as the clear benefit of assimilation of the hourly accumulated precipitation.

Full access
Dusanka Zupanski, Milija Zupanski, Eric Rogers, David F. Parrish, and Geoffrey J. DiMego

Abstract

The National Centers for Environmental Prediction fine-resolution four-dimensional variational (4DVAR) data assimilation system is used to study the Great Plains tornado outbreak of 3 May 1999. It was found that the 4DVAR method was able to capture very well the important precursors for the tornadic activity, such as upper- and low-level jet streaks, wind shear, humidity field, surface CAPE, and so on. It was also demonstrated that, in this particular synoptic case, characterized by fast-changing mesoscale systems, the model error adjustment played a substantial role. The experimental results suggest that the common practice of neglecting the model error in data assimilation systems may not be justified in synoptic situations similar to this one.

Full access
Milija Zupanski, Dusanka Zupanski, Tomislava Vukicevic, Kenneth Eis, and Thomas Vonder Haar

Abstract

A new four-dimensional variational data assimilation (4DVAR) system has been developed at the Cooperative Institute for Research in the Atmosphere (CIRA)/Colorado State University (CSU), also called the Regional Atmospheric Modeling Data Assimilation System (RAMDAS). In its present form, the 4DVAR system employs the CSU Regional Atmospheric Modeling System (RAMS) nonhydrostatic primitive equation model. The observation operator, adopted from the Weather Research and Forecasting (WRF) three-dimensional variational data assimilation (3DVAR) algorithm, is used to access the observations. In addition to the adjustment of initial conditions, RAMDAS includes the adjustment of model error (bias) and lateral boundary conditions through an augmented control variable definition. The control variable is also defined in terms of the velocity potential and streamfunction instead of the horizontal winds. RAMDAS is developed after the National Centers for Environmental Prediction (NCEP) Eta 4DVAR system, but with added improvements addressing its use in a research environment.

Preliminary results with RAMDAS are presented, focusing on the minimization performance and the impact of vertical correlations in error covariance modeling. A three-dimensional formulation of the background error correlation is introduced and evaluated. The Hessian preconditioning is revisited, and an alternate algebraic formulation is presented. The results indicate a robust minimization performance.

Full access
Dusanka Zupanski, Sara Q. Zhang, Milija Zupanski, Arthur Y. Hou, and Samson H. Cheung

Abstract

In the near future, the Global Precipitation Measurement (GPM) mission will provide precipitation observations with unprecedented accuracy and spatial/temporal coverage of the globe. For hydrological applications, the satellite observations need to be downscaled to the required finer-resolution precipitation fields. This paper explores a dynamic downscaling method using ensemble data assimilation techniques and cloud-resolving models. A prototype ensemble data assimilation system using the Weather Research and Forecasting Model (WRF) has been developed. A high-resolution regional WRF with multiple nesting grids is used to provide the first guess and ensemble forecasts. An ensemble assimilation algorithm based on the maximum likelihood ensemble filter (MLEF) is used to perform the analysis. The forward observation operators from NOAA–NCEP's gridpoint statistical interpolation (GSI) are incorporated for use of the NOAA–NCEP operational datastream, including conventional data and clear-sky satellite observations. Precipitation observation operators are developed with a combination of the cloud-resolving physics from the NASA Goddard cumulus ensemble (GCE) model and the radiative transfer schemes from the NASA Satellite Data Simulation Unit (SDSU). The prototype system is used as a test bed to optimally combine observations and model information to produce a dynamically downscaled precipitation analysis. A case study of Tropical Storm Erin (2007) is presented to investigate the ability of the prototype WRF Ensemble Data Assimilation System (WRF-EDAS) to ingest information from in situ and satellite observations, including precipitation-affected radiances. The results show that the analyses and forecasts produced by the WRF-EDAS system are comparable to or better than those obtained with the WRF-GSI analysis scheme using the same set of observations. An experiment was also performed to examine how the analyses and short-term forecasts of microphysical variables and dynamical fields are influenced by the assimilation of precipitation-affected radiances. The results highlight critical issues to be addressed in the next stage of development, such as model-predicted hydrometeor control variables and the associated background error covariance, bias estimation and correction in radiance space, and the observation error statistics. While further work is needed to optimize the performance of WRF-EDAS, this study establishes the viability of developing a cloud-scale ensemble data assimilation system that has the potential to provide a useful vehicle for downscaling satellite precipitation information to finer scales suitable for hydrological applications.

Full access