Search Results

You are looking at 1 - 10 of 98 items for

  • Author or Editor: Xuguang Wang
Xuguang Wang

Abstract

A hybrid ensemble transform Kalman filter (ETKF)–three-dimensional variational data assimilation (3DVAR) system developed for the Weather Research and Forecasting Model (WRF) was studied for the forecasts of the tracks of two major hurricanes, Ike and Gustav, in 2008 over the Gulf of Mexico. The impacts of the flow-dependent ensemble covariance generated by the ETKF were revealed by comparing the forecasts, analyses, and analysis increments generated by the hybrid data assimilation method with those generated by the 3DVAR that used the static background covariance. The root-mean-square errors of the track forecasts by the hybrid data assimilation (DA) method were smaller than those by the 3DVAR for both Ike and Gustav. Experiments showed that such improvements were due to the use of the flow-dependent covariance provided by the ETKF ensemble in the hybrid DA system. Detailed diagnostics further revealed that the increments produced by the hybrid and the 3DVAR were different for both the analyses of the hurricane itself and its environment. In particular, it was found that the hybrid, using the flow-dependent covariance that gave hurricane-specific error covariance estimates, was able to systematically adjust the position of the hurricane during the assimilation, whereas the 3DVAR was not. The study served as a pilot study to explore and understand the potential of the hybrid method for hurricane data assimilation and forecasts. Caution is needed in extrapolating these results to operational forecast settings.
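
For context on how the flow-dependent information enters the analysis, the hybrid background-error covariance is commonly written as a weighted blend of the static and localized ensemble covariances. The sketch below uses generic notation (weight β, localization matrix C) rather than quantities taken from this abstract:

```latex
% Schematic hybrid background-error covariance (generic notation, not from the abstract):
%   B_s   static 3DVAR covariance;  P^e  ETKF ensemble covariance
%   C     localization matrix applied via the Schur (element-wise) product
%   beta  weight on the ensemble contribution, 0 <= beta <= 1
\mathbf{B}_{\mathrm{hybrid}} \;=\; (1-\beta)\,\mathbf{B}_{s} \;+\; \beta\,\bigl(\mathbf{P}^{e}\circ\mathbf{C}\bigr)
```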

Full access
Xuguang Wang

Abstract

Gridpoint statistical interpolation (GSI), a three-dimensional variational data assimilation method (3DVAR), has been widely used in operations and research in numerical weather prediction. The operational GSI uses a static background error covariance, which does not reflect flow-dependent error statistics. Incorporating ensemble covariance in GSI provides a natural way to estimate the background error covariance in a flow-dependent manner. Unlike other 3DVAR-based hybrid data assimilation systems, which are preconditioned on the square root of the background error covariance, the commonly used GSI minimization is preconditioned on the full background error covariance matrix. A mathematical derivation is therefore provided to demonstrate how to incorporate the flow-dependent ensemble covariance in the GSI variational minimization.
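
In variational hybrids of this kind, the ensemble covariance is commonly introduced through extended control variables rather than by forming the covariance matrix explicitly. A schematic cost function of that form is sketched below; the weights β₁ and β₂, the localization matrix A, and the member count K are generic notation rather than values from the paper:

```latex
% Schematic hybrid cost function with extended control variables (generic notation):
%   x'_1   increment associated with the static covariance B
%   a_k    extended control variable for member k; A encodes the localization
%   x^e_k  k-th ensemble perturbation; "o" denotes the Schur product
J(\mathbf{x}'_1,\mathbf{a})
  = \frac{\beta_1}{2}\,\mathbf{x}'^{\mathrm T}_1\mathbf{B}^{-1}\mathbf{x}'_1
  + \frac{\beta_2}{2}\,\mathbf{a}^{\mathrm T}\mathbf{A}^{-1}\mathbf{a}
  + \frac{1}{2}\bigl(\mathbf{y}^{o\prime}-\mathbf{H}\mathbf{x}'\bigr)^{\mathrm T}
      \mathbf{R}^{-1}\bigl(\mathbf{y}^{o\prime}-\mathbf{H}\mathbf{x}'\bigr),
\qquad
\mathbf{x}' = \mathbf{x}'_1 + \sum_{k=1}^{K}\mathbf{a}_k\circ\mathbf{x}^{e}_{k}
```

The derivation referred to in the abstract concerns how such an ensemble term is handled when the minimization is preconditioned on the full background error covariance rather than on its square root.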

Full access
Yongming Wang
and
Xuguang Wang

Abstract

A convective-scale static background-error covariance (BEC) matrix is further developed to include the capability of direct reflectivity assimilation and is evaluated within the GSI-based three-dimensional variational (3DVar) and hybrid ensemble–variational (EnVar) methods. Specific developments are summarized as follows: 1) control variables (CVs) are extended to include reflectivity, vertical velocity, and all hydrometeor types, and various horizontal momentum and moisture CV options are included; 2) cross correlations between all CVs are established; 3) a storm-intensity-dependent binning method is adopted to calculate static error matrices separately for clear air and for storms of varying intensities, and the resultant static BEC matrices are applied simultaneously at appropriate locations guided by the observed reflectivity; and 4) the EnVar is extended to adaptively incorporate static BECs based on the quality of the ensemble covariances. Evaluation and examination of the new static BECs are first performed on the 8 May 2003 Oklahoma City supercell. Detailed diagnostics and 3DVar examinations suggest that zonal/meridional winds and pseudo–relative humidity be selected as the horizontal momentum and moisture CVs, respectively, for direct reflectivity assimilation; inclusion of cross correlations favors spinup and maintenance of the analyzed storms; and application of binning improves the characteristics and persistence of the simulated storm. Relative to an experiment using the full ensemble BECs (Exp-PureEnVar), incorporating static BECs in the hybrid EnVar reduces the spinup time and better analyzes the reflectivity distributions when the background ensemble suffers from sampling errors. Compared to both pure 3DVar and Exp-PureEnVar, the hybrid EnVar better predicts reflectivity distributions and better maintains a strong mesocyclone. Further examination with the 20 May 2013 Oklahoma supercells confirms these results and additionally demonstrates the effectiveness of the adaptive hybridization.
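
To make the intensity-dependent binning idea concrete, the toy sketch below tags each horizontal grid point with the bin whose precomputed static error statistics would supply the background term there, guided by observed composite reflectivity. It is illustrative only, not the papers' code; the dBZ thresholds and bin names are assumptions.

```python
import numpy as np

# Illustrative sketch (not the paper's code): storm-intensity-dependent binning, in
# which the static background-error statistics applied at each horizontal location
# are drawn from the bin matching the observed composite reflectivity there.
BIN_EDGES = np.array([5.0, 35.0, 50.0])   # assumed dBZ bin boundaries
BIN_NAMES = np.array(["clear_air", "weak", "moderate", "intense"], dtype=object)

def bin_labels_from_reflectivity(obs_composite_dbz):
    """Map a 2D field of observed composite reflectivity (dBZ) to static-BEC bin labels."""
    return BIN_NAMES[np.digitize(obs_composite_dbz, BIN_EDGES)]

# Toy example: each grid point is tagged with the bin whose precomputed static error
# matrix would be applied at that location.
print(bin_labels_from_reflectivity(np.array([[0.0, 20.0], [40.0, 55.0]])))
# -> [['clear_air' 'weak']
#     ['moderate' 'intense']]
```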

Full access
Yongming Wang
and
Xuguang Wang

Abstract

A GSI-based EnVar data assimilation system is extended to directly assimilate radar reflectivity to initialize convective-scale forecasts. When hydrometeor mixing ratios are used as state variables (method mixing ratio), large differences between the cost function gradients with respect to the small hydrometeor mixing ratios and those with respect to wind prevent efficient convergence. Using logarithmic mixing ratios as state variables (method logarithm) fixes this problem but generates spuriously large hydrometeor increments, partly because of the transform to and from logarithmic space. The tangent linear of the reflectivity operator further contributes to spuriously small and large hydrometeor increments in method mixing ratio and method logarithm, respectively. A new method is proposed in which reflectivity itself is added as a state variable (method dBZ). Because it requires neither the tangent linear nor the adjoint of the nonlinear reflectivity operator, the new method avoids the aforementioned problems.
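
To illustrate the gradient problem described above, consider a rain-only column with a power-law reflectivity relation; the exponent 1.75 is a common assumption for rain, not a value quoted in the abstract:

```latex
Z_{\mathrm{dBZ}} = 10\log_{10} Z_e,\qquad Z_e \propto (\rho\, q_r)^{1.75}
\;\;\Longrightarrow\;\;
\frac{\partial Z_{\mathrm{dBZ}}}{\partial q_r} \;=\; \frac{17.5}{\ln 10}\,\frac{1}{q_r}
```

The sensitivity grows without bound as q_r approaches zero, which is why gradients with respect to small mixing ratios dwarf those with respect to wind in method mixing ratio; conversely, with ln q_r as the state variable, modest increments in logarithmic space are amplified exponentially by the back-transform, consistent with the spuriously large increments noted for method logarithm.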

The newly proposed method is examined on the analysis and prediction of the 8 May 2003 Oklahoma City tornadic supercell storm. Both the probabilistic forecast of strong low-level vorticity and the maintenance of strong updraft and vorticity in method dBZ are more consistent with reality than in method logarithm and method mixing ratio. Detailed diagnostics suggest that a more realistic cold pool, due to the better-analyzed hydrometeors in method dBZ than in the other methods, leads to constructive interaction between the surface gust front and the updraft aloft associated with the midlevel mesocyclone. Similar low-level vorticity forecasts and storm maintenance are produced by the WSM6 and Thompson microphysics schemes in method dBZ. The Thompson scheme matches the observed reflectivity distribution better at all lead times but shows a larger southeastward track bias than the WSM6 scheme.

Full access
Yongming Wang
and
Xuguang Wang

Abstract

Explicit forecasts of tornado-like vortices (TLVs) require subkilometer grid spacing because of their small size. Most previous TLV prediction studies started from initial conditions (ICs) interpolated from kilometer grid spacing rather than from ICs produced at subkilometer grid spacing. The tornadoes embedded in the 8 May 2003 Oklahoma City tornadic supercell are used to understand the impact of IC resolution on TLV predictions. Two ICs, at 500-m and 2-km grid spacing, are produced through an efficient dual-resolution (DR) EnVar and a single-coarse-resolution (SCR) EnVar, respectively, each ingesting a 2-km ensemble. Both experiments launch 1-h forecasts at 500-m grid spacing. Diagnostics of the data assimilation (DA) cycling reveal that DR produces stronger and broader rear-flank cold pools, more intense downdrafts and updrafts with finer scales, and more hydrometeors at high altitudes through the accumulated differences between the two DA algorithms. The differences in DR relative to SCR include integration from higher-resolution analyses, updating of higher-resolution backgrounds, and propagation of ensemble perturbations along a higher-resolution model trajectory. Predictions of storm morphology and cold pools are more realistic in DR than in SCR. The DR-TLV tracks match the observed tornado tracks better than the SCR-TLV tracks in the timing of intensity variation and in duration. Additional experiments suggest that 1) the analyzed kinematic variables strongly influence the timing of intensity variation by affecting both the low-level rear-flank outflow and the midlevel updraft; 2) the potential temperature analysis by DR extends the second track's duration, consistent with enhanced low-level stretching, a delayed broadening of the large-scale downdraft, and (or) an increased near-surface baroclinic vorticity supply; and 3) the hydrometeor analyses have little impact on the TLV predictions.
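
A schematic way to view the DR/SCR distinction, under generic notation that is assumed here rather than taken from the abstract, is that DR builds and applies the increment on the 500-m grid using interpolated 2-km ensemble perturbations, whereas SCR builds it entirely on the 2-km grid and interpolates the result; the static contribution is omitted for brevity:

```latex
% Schematic only (generic notation): superscripts f and c denote the fine (500-m)
% and coarse (2-km) grids; S interpolates coarse fields to the fine grid;
% a_k are extended control variables and x^{e,c}_k the coarse ensemble perturbations.
\mathrm{DR:}\;\; \delta\mathbf{x}^{f} = \sum_{k=1}^{K}\mathbf{a}^{f}_{k}\circ S\!\left(\mathbf{x}^{e,c}_{k}\right)
\qquad\qquad
\mathrm{SCR:}\;\; \delta\mathbf{x}^{f} = S\!\left(\sum_{k=1}^{K}\mathbf{a}^{c}_{k}\circ\mathbf{x}^{e,c}_{k}\right)
```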

Free access
Aaron Johnson
and
Xuguang Wang

Abstract

A series of convection-allowing 36-h ensemble forecasts during the 2018 spring season is used to better understand the impacts of ensemble configuration and of blending different sources of initial condition (IC) perturbations. Ten- and forty-member ensemble configurations are initialized either with multiscale IC perturbations generated as a product of convective-scale data assimilation (MULTI) or with the MULTI IC perturbations blended with IC perturbations downscaled from coarser-resolution ensembles (BLEND). The forecast performance for both precipitation and nonprecipitation variables is consistently improved by the larger ensemble size. The benefit of the larger ensemble is largely, but not entirely, due to compensating for underdispersion in the fixed-physics ensemble configuration. A consistent improvement in precipitation forecast skill results from blending in the 10-member ensemble configuration, corresponding to a reduction in the ensemble calibration error (i.e., the reliability component of the Brier score). In the 40-member ensemble configuration, the advantage of blending is limited to the ∼18–22-h lead times at all precipitation thresholds and the ∼35–36-h lead times at the lowest threshold, both corresponding to an improved resolution component of the Brier score. The advantage of blending in the 40-member ensemble during the diurnal convection maximum at ∼18–22-h lead times arises primarily in cases with relatively weak synoptic-scale forcing, while the advantages at lead times beyond ∼30 h are most prominent in cases with relatively strong synoptic-scale forcing. The impacts of blending and ensemble configuration on forecasts of nonprecipitation variables are generally consistent with their impacts on the precipitation forecasts.
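
For reference, the reliability and resolution components cited above are those of the standard decomposition of the Brier score over K forecast-probability bins:

```latex
\mathrm{BS} = \frac{1}{N}\sum_{i=1}^{N}\left(p_i - o_i\right)^{2}
            = \underbrace{\frac{1}{N}\sum_{k=1}^{K} n_k\left(p_k-\bar{o}_k\right)^{2}}_{\text{reliability}}
            \;-\;\underbrace{\frac{1}{N}\sum_{k=1}^{K} n_k\left(\bar{o}_k-\bar{o}\right)^{2}}_{\text{resolution}}
            \;+\;\underbrace{\bar{o}\left(1-\bar{o}\right)}_{\text{uncertainty}}
```

Here p_i are the forecast probabilities, o_i the binary outcomes, n_k the number of forecasts in bin k with representative probability p_k, ō_k the observed frequency in that bin, and ō the sample climatology; lower reliability and higher resolution both reduce the Brier score, which is the sense in which the blending results above are interpreted.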

Restricted access
Yue Yang
and
Xuguang Wang

Abstract

The Gridpoint Statistical Interpolation (GSI)-based four- and three-dimensional ensemble–variational (4DEnVar and 3DEnVar) methods are compared as a smoother and a filter, respectively, for rapidly changing storms using a convective-scale direct radar reflectivity data assimilation (DA) framework. Two sets of experiments with varying DA window lengths (WLs; 20, 40, 100, and 160 min) and radar observation intervals (RIs; 20 and 5 min) are conducted for the 5–6 May 2019 case. The RI determines the temporal resolution of the ensemble perturbations for the smoother and the DA interval for the filter spanning the WL. For the experiments with a 20-min RI, evaluations suggest that the filter and the smoother perform comparably with a 20-min WL; however, extending the WL results in the filter outperforming the smoother. Diagnostics reveal that the degradation of the smoother is attributable to the increased degree of nonlinearity and to the issue of time-independent localization as the WL extends. Evaluations of the experiments with different RIs under the same WL indicate that the filter's advantage over the smoother diminishes for most forecast hours at thresholds of 30 dBZ and above when the RI is shortened. Diagnostics show that more frequent interruptions of the model introduce imbalance for the filter, while the increased temporal resolution of the ensemble perturbations enhances the degree of nonlinearity for the smoother. The impact of model imbalance on the filter outweighs that of the enhanced nonlinearity on the smoother as the RI decreases.
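
To make the smoother/filter distinction concrete, the sketch below contrasts the two cycling strategies in schematic form. It is illustrative only: advance and analyze are placeholder stand-ins for the forecast model and the EnVar solver, not GSI components, and the 160-min/20-min defaults merely echo one WL/RI combination from the abstract.

```python
# Schematic only: one DA window of length WL with radar observations every RI minutes.
def advance(state, minutes):                 # placeholder for a model integration
    return state + [f"forecast {minutes} min"]

def analyze(state, pert_times):              # placeholder for an EnVar analysis
    return state + [f"analysis with perturbations at t={pert_times}"]

def filter_3denvar(state, WL=160, RI=20):
    """Filter: an analysis at every observation time, with short forecasts in between;
    each analysis uses covariances valid at that single time."""
    for t in range(RI, WL + 1, RI):
        state = advance(state, RI)
        state = analyze(state, [t])
    return state

def smoother_4denvar(state, WL=160, RI=20):
    """Smoother: the model runs once across the window, then a single analysis uses
    ensemble perturbations valid at every observation time to estimate 4D covariances."""
    state = advance(state, WL)
    return analyze(state, list(range(RI, WL + 1, RI)))

print(filter_3denvar(["background"]))
print(smoother_4denvar(["background"]))
```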

Significance Statement

The background uncertainties of rapidly changing storms are subject to fast error growth and high degrees of nonlinearity during the data assimilation (DA) period. Two variants of the ensemble-based DA method can account for such temporal evolution. The smoother uses background ensemble perturbations from multiple observation times over an assimilation period to estimate the propagation of the statistics. The filter instead recalculates the statistics at each observation time over the same period. Previous comparisons of the smoother and the filter were mostly performed with simple models; unknowns remain for convection-allowing forecasts, which involve additional complexities. This study compares the filter and the smoother for convective-scale analysis and prediction in a real-data case and finds that the outcome of the comparison varies with the assimilation period and the observation interval.

Restricted access
Xu Lu
and
Xuguang Wang

Abstract

Assimilating inner-core observations collected from recent field campaigns such as the Tropical Cyclone Intensity (TCI) experiment and the Intensity Forecasting Experiment (IFEX), together with enhanced atmospheric motion vectors (AMVs), produces realistic three-dimensional (3D) analyses using the newly developed GSI-based, continuously cycled, dual-resolution hybrid ensemble–variational data assimilation (DA) system for the Hurricane Weather Research and Forecasting (HWRF) Model for Hurricane Patricia (2015). However, a more persistent spindown of the surface wind maximum is found in the intensity forecasts initialized from the realistic analyses produced by the DA system but not in those initialized from the unrealistic initial conditions produced through vortex modification. Diagnostics in this study reveal that the spindown issue is likely attributable to deficient HWRF Model physics that are unable to maintain the realistic 3D structures from the DA analysis. The horizontal diffusion is too strong to maintain the realistically observed vertical oscillation of radial wind near the eyewall region. The vertical diffusion profile cannot produce a secondary circulation strong enough to connect to the realistically elevated upper-level outflow produced in the DA analysis. Further investigations with different model physics parameterizations demonstrate that the spindown can be alleviated by modifying the model physics parameterizations. In particular, a modified turbulent mixing parameterization scheme, together with reduced horizontal diffusion, is found to significantly alleviate the spindown issue and to improve the intensity forecast. Additional experiments show that the peak simulated intensity and the rapid intensification rate can be further improved by increasing the model resolution, but model resolution is not as important as model physics in alleviating the spindown.

Full access
Yue Yang
and
Xuguang Wang

Abstract

The sensitivity of convection-allowing forecasts over the continental United States to the radar reflectivity data assimilation (DA) frequency is explored within the Gridpoint Statistical Interpolation (GSI)-based ensemble–variational (EnVar) system. Experiments with reflectivity DA intervals of 60, 20, and 5 min (RAIN60, RAIN20, and RAIN5, respectively) are conducted using 10 diverse cases. Quantitative verification indicates that the degree of sensitivity depends on the storm features during the radar DA period. Five developing storms show high sensitivity, whereas five mature or decaying storms do not. The 20-min interval is the most reliable, given its best overall performance compared to the 5- and 60-min intervals. Diagnostics suggest that the differences in the analyzed cold pools (ACPs) among RAIN60, RAIN20, and RAIN5 vary with the storm features during the radar DA period, and such ACP differences result in different forecast skill. In the case where RAIN20 outperforms RAIN60 and the case where RAIN5 outperforms RAIN20, assimilating reflectivity at a higher frequency commonly produces enhanced and more widespread ACPs, promoting broader storms that match reality better than those from a lower frequency. In the case where RAIN5 performs worse than RAIN20, the model imbalance of RAIN5 overwhelms the information gained from frequent assimilation, producing overestimated and spuriously fast-moving ACPs. In the cases where little sensitivity to the reflectivity DA frequency is found, similar ACPs are produced.

Free access
Xuguang Wang
and
Ting Lei

Abstract

A four-dimensional (4D) ensemble–variational data assimilation (DA) system (4DEnsVar) was developed, building upon the infrastructure of the gridpoint statistical interpolation (GSI)-based hybrid DA system. 4DEnsVar used ensemble perturbations valid at multiple times throughout the DA window to estimate 4D error covariances during the variational minimization, avoiding the tangent linear and adjoint of the forecast model. The formulation of its implementation in GSI was described. The performance of the system was investigated by evaluating the global forecasts and hurricane track forecasts produced by the NCEP Global Forecast System (GFS) during a 5-week summer period while assimilating operational conventional and satellite data. The newly developed system was used to address several questions regarding 4DEnsVar. 4DEnsVar in general improved upon its 3D counterpart, 3DEnsVar. At short lead times, the improvement over the Northern Hemisphere extratropics was similar to that over the Southern Hemisphere extratropics; at longer lead times, 4DEnsVar showed more improvement in the Southern Hemisphere than in the Northern Hemisphere. 4DEnsVar showed less impact over the tropics. The track forecasts of 16 tropical cyclones initialized by 4DEnsVar were more accurate than those initialized by 3DEnsVar beyond 1-day forecast lead times. The analyses generated by 4DEnsVar were more balanced than those generated by 3DEnsVar. Case studies showed that increments from 4DEnsVar using more frequent ensemble perturbations approximated the increments from direct, nonlinear model propagation better than those using less frequent ensemble perturbations. Consistently, the performance of 4DEnsVar, including both forecast accuracy and the balance of the analyses, was in general degraded when less frequent ensemble perturbations were used. The tangent linear normal mode constraint had a positive impact on the global forecasts but a negative impact on the TC track forecasts.
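
The estimation of 4D covariances from time-distributed ensemble perturbations, without a tangent-linear or adjoint model, corresponds schematically to an increment of the following form; the static part δx₁, the member count K, and the time-level indexing are generic notational assumptions:

```latex
% Schematic 4D ensemble-variational increment at time level t within the DA window:
%   delta x_1   time-invariant increment from the static covariance
%   a_k         extended control variables (shared across time levels)
%   x^e_{k,t}   k-th ensemble perturbation valid at time t; "o" is the Schur product
\delta\mathbf{x}_{t} = \delta\mathbf{x}_{1} + \sum_{k=1}^{K}\mathbf{a}_{k}\circ\mathbf{x}^{e}_{k,t},
\qquad t = t_{1},\dots,t_{L}
```

Because the time dependence enters only through the perturbations valid at each time level, using fewer time levels coarsens the approximation of the nonlinearly propagated increment, consistent with the degradation reported above when less frequent ensemble perturbations were used.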

Full access