Search Results
You are looking at 1–10 of 89 items for
- Author or Editor: Xuguang Wang
- Refine by Access: All Content
Abstract
Gridpoint statistical interpolation (GSI), a three-dimensional variational (3DVAR) data assimilation method, has been widely used in operations and research in numerical weather prediction. The operational GSI uses a static background error covariance, which does not reflect flow-dependent error statistics. Incorporating ensemble covariance in GSI provides a natural way to estimate the background error covariance in a flow-dependent manner. Unlike other 3DVAR-based hybrid data assimilation systems that are preconditioned on the square root of the background error covariance, the commonly used GSI minimization is preconditioned upon the full background error covariance matrix. A mathematical derivation is therefore provided to demonstrate how to incorporate the flow-dependent ensemble covariance in the GSI variational minimization.
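For context, the extended-control-variable form of a hybrid cost function referenced above can be written schematically as follows. The notation (weights β1 and β2, localization matrix A, ensemble perturbations x_k^e) follows common convention in the hybrid EnVar literature and is illustrative, not a reproduction of the paper's exact derivation.

```latex
% Schematic hybrid cost function with extended control variables (illustrative notation)
J(\mathbf{x}'_1,\boldsymbol{\alpha})
  = \frac{\beta_1}{2}\,(\mathbf{x}'_1)^{\mathrm{T}}\mathbf{B}^{-1}\mathbf{x}'_1
  + \frac{\beta_2}{2}\,\boldsymbol{\alpha}^{\mathrm{T}}\mathbf{A}^{-1}\boldsymbol{\alpha}
  + \frac{1}{2}\,(\mathbf{d}-\mathbf{H}\mathbf{x}')^{\mathrm{T}}\mathbf{R}^{-1}(\mathbf{d}-\mathbf{H}\mathbf{x}'),
\qquad
\mathbf{x}' = \mathbf{x}'_1 + \sum_{k=1}^{K}\boldsymbol{\alpha}_k\circ\mathbf{x}^{e}_k .
```

Here x'_1 is the increment associated with the static covariance B, d is the innovation, the extended control variables α_k (with covariance A imposing spatial localization) weight the K ensemble perturbations x_k^e, and β1 and β2 (often constrained so that 1/β1 + 1/β2 = 1) set the relative contributions of the static and ensemble covariances.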
Abstract
A hybrid ensemble transform Kalman filter (ETKF)–three-dimensional variational data assimilation (3DVAR) system developed for the Weather Research and Forecasting (WRF) Model was studied for forecasts of the tracks of two major hurricanes, Ike and Gustav, in 2008 over the Gulf of Mexico. The impacts of the flow-dependent ensemble covariance generated by the ETKF were revealed by comparing the forecasts, analyses, and analysis increments generated by the hybrid data assimilation (DA) method with those generated by the 3DVAR that used the static background covariance. The root-mean-square errors of the track forecasts by the hybrid DA method were smaller than those by the 3DVAR for both Ike and Gustav. Experiments showed that such improvements were due to the use of the flow-dependent covariance provided by the ETKF ensemble in the hybrid DA system. Detailed diagnostics further revealed that the increments produced by the hybrid and the 3DVAR differed for both the analysis of the hurricane itself and that of its environment. In particular, the hybrid, using flow-dependent covariance that provided hurricane-specific error covariance estimates, was able to systematically adjust the position of the hurricane during the assimilation, whereas the 3DVAR was not. This pilot study explored the potential of the hybrid method for hurricane data assimilation and forecasting; caution is needed when extrapolating the results to operational forecast settings.
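As background on the ETKF component, the generic ensemble transform update is sketched below. This is the standard textbook expression in illustrative notation, not the specific ETKF implementation used in the study.

```latex
% Generic ETKF perturbation update (illustrative notation)
\mathbf{X}^{a} = \mathbf{X}^{f}\,\mathbf{T},
\qquad
\mathbf{T} = \mathbf{C}\,(\boldsymbol{\Gamma}+\mathbf{I})^{-1/2}\,\mathbf{C}^{\mathrm{T}},
```

where the columns of X^f and X^a hold the forecast and analysis perturbations, and C and Γ contain the eigenvectors and eigenvalues of (m-1)^{-1}(HX^f)^T R^{-1}(HX^f) for an m-member ensemble. The transformed perturbations supply the flow-dependent covariance used in the hybrid update.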
Abstract
Explicit forecasts of a tornado-like vortex (TLV) require subkilometer grid spacing because of its small size. Most previous TLV prediction studies started from initial conditions (ICs) interpolated from kilometer-scale grid spacing rather than ICs produced at subkilometer grid spacing. The tornadoes embedded in the 8 May 2003 Oklahoma City tornadic supercell are used to understand the impact of IC resolution on TLV predictions. Two ICs, at 500-m and 2-km grid spacings, are produced through an efficient dual-resolution (DR) EnVar and a single-coarse-resolution (SCR) EnVar, respectively, both ingesting a 2-km ensemble. Both experiments launch 1-h forecasts at 500-m grid spacing. Diagnostics of the data assimilation (DA) cycling reveal that DR produces stronger and broader rear-flank cold pools, more intense downdrafts and updrafts with finer scales, and more hydrometeors at high altitudes through the accumulated differences between the two DA algorithms. Relative to SCR, DR differs in integrating from higher-resolution analyses, updating higher-resolution backgrounds, and propagating ensemble perturbations along a higher-resolution model trajectory. Predictions of storm morphology and cold pools are more realistic in DR than in SCR. The DR-TLV tracks match the observed tornado tracks better than the SCR-TLV tracks in the timing of intensity variation and in duration. Additional experiments suggest that 1) the analyzed kinematic variables strongly influence the timing of intensity variation by affecting both the low-level rear-flank outflow and the midlevel updraft; 2) the potential temperature analysis by DR extends the second track’s duration, consistent with enhanced low-level stretching, a delayed broadening of the large-scale downdraft, and/or an increased near-surface baroclinic vorticity supply; and 3) the hydrometeor analyses have little impact on TLV predictions.
Abstract
A GSI-based EnVar data assimilation system is extended to directly assimilate radar reflectivity to initialize convective-scale forecasts. When hydrometeor mixing ratios are used as state variables (method mixing ratio), large differences between the cost function gradients with respect to the small hydrometeor mixing ratios and those with respect to wind prevent efficient convergence. Using logarithmic mixing ratios as state variables (method logarithm) fixes this problem but generates spuriously large hydrometeor increments, partly because of the transform to and from logarithmic space. The tangent linear of the reflectivity operator further contributes to spuriously small and spuriously large hydrometeor increments in method mixing ratio and method logarithm, respectively. A new method is proposed in which reflectivity is added directly as a state variable (method dBZ). Because it requires neither the tangent linear nor the adjoint of the nonlinear reflectivity operator, the new method avoids the aforementioned problems.
The newly proposed method is examined for the analysis and prediction of the 8 May 2003 Oklahoma City tornadic supercell storm. Both the probabilistic forecast of strong low-level vorticity and the maintenance of the strong updraft and vorticity in method dBZ are more consistent with reality than in method logarithm and method mixing ratio. Detailed diagnostics suggest that a more realistic cold pool, due to the better-analyzed hydrometeors in method dBZ relative to the other methods, leads to constructive interaction between the surface gust front and the updraft aloft associated with the midlevel mesocyclone. Similar low-level vorticity forecasts and storm maintenance are produced by the WSM6 and Thompson microphysics schemes in method dBZ. The Thompson scheme matches the observed reflectivity distribution better at all lead times, but shows a larger southeastward track bias than the WSM6 scheme.
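To illustrate why the gradient behaves so differently for small mixing ratios, the toy calculation below evaluates a commonly used rain-only reflectivity relation, Ze = 3.63e9 (ρ q_r)^1.75 with dBZ = 10 log10(Ze), and its derivative with respect to the rainwater mixing ratio. The relation, air density, and sample values are assumptions for illustration, not necessarily the operator used in the study.

```python
import numpy as np

# Assumed rain-only reflectivity relation (illustrative):
#   Ze = 3.63e9 * (rho * qr)**1.75  [mm^6 m^-3],  dBZ = 10 * log10(Ze)
RHO = 1.0  # air density, kg m^-3 (illustrative)

def dbz_from_qr(qr):
    """Reflectivity (dBZ) from rainwater mixing ratio qr (kg/kg)."""
    ze = 3.63e9 * (RHO * qr) ** 1.75
    return 10.0 * np.log10(ze)

def ddbz_dqr(qr):
    """Analytic tangent linear: d(dBZ)/dqr = 10 * 1.75 / (qr * ln 10)."""
    return 10.0 * 1.75 / (qr * np.log(10.0))

for qr in (1e-8, 1e-5, 1e-3):  # kg/kg
    print(f"qr={qr:8.1e}  dBZ={dbz_from_qr(qr):8.2f}  d(dBZ)/dqr={ddbz_dqr(qr):10.3e}")
```

For qr = 1e-8 kg/kg the derivative is roughly five orders of magnitude larger than for qr = 1e-3 kg/kg, which mirrors the convergence problem described above: gradients with respect to near-zero mixing ratios dwarf those with respect to wind, whereas carrying reflectivity itself as a state variable removes the need for this tangent linear altogether.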
Abstract
A convective-scale static background-error covariance (BEC) matrix is further developed to include the capability of direct reflectivity assimilation and is evaluated within the GSI-based three-dimensional variational (3DVar) and hybrid ensemble–variational (EnVar) methods. The specific developments are summarized as follows: 1) Control variables (CVs) are extended to include reflectivity, vertical velocity, and all hydrometeor types, and various horizontal momentum and moisture CV options are included. 2) Cross correlations between all CVs are established. 3) A storm-intensity-dependent binning method is adopted to separately calculate static error matrices for clear air and for storms of varying intensities. The resultant static BEC matrices are applied simultaneously at appropriate locations, guided by the observed reflectivity. 4) The EnVar is extended to adaptively incorporate static BECs based on the quality of the ensemble covariances. Evaluation and examination of the new static BECs are first performed on the 8 May 2003 Oklahoma City supercell. Detailed diagnostics and 3DVar examinations suggest that zonal/meridional winds and pseudo–relative humidity be selected as the horizontal momentum and moisture CVs, respectively, for direct reflectivity assimilation; that inclusion of cross correlations favors spinup and maintenance of the analyzed storms; and that application of binning improves the characteristics and persistence of the simulated storm. Relative to an experiment using the full ensemble BECs (Exp-PureEnVar), incorporating static BECs in the hybrid EnVar reduces spinup time and better analyzes reflectivity distributions when the background ensemble suffers from sampling errors. Compared to both pure 3DVar and Exp-PureEnVar, the hybrid EnVar better predicts reflectivity distributions and better maintains a strong mesocyclone. Further examination with the 20 May 2013 Oklahoma supercells confirms these results and additionally demonstrates the effectiveness of the adaptive hybridization.
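The intensity-dependent binning idea in development 3 can be pictured with the minimal sketch below, which classifies grid columns by observed composite reflectivity and looks up a separate, precomputed static error standard deviation per bin. The thresholds, number of bins, and standard deviations are hypothetical illustration values, not the paper's configuration.

```python
import numpy as np

# Hypothetical reflectivity thresholds (dBZ) separating clear air, weak, and strong storms
THRESHOLDS_DBZ = [15.0, 40.0]
# Hypothetical per-bin static background-error standard deviations for one control variable
SIGMA_PER_BIN = np.array([0.3, 1.0, 2.5])

def binned_static_sigma(obs_composite_dbz):
    """Pick a static error std dev per column from the bin its observed reflectivity falls in."""
    bins = np.digitize(obs_composite_dbz, THRESHOLDS_DBZ)  # 0 = clear air, 1 = weak, 2 = strong
    return SIGMA_PER_BIN[bins]

obs_dbz = np.array([5.0, 25.0, 55.0, 10.0])
print(binned_static_sigma(obs_dbz))  # -> [0.3 1.  2.5 0.3]
```

In an actual application the per-bin statistics would be full covariance blocks rather than scalar standard deviations, applied at the locations indicated by the observed reflectivity as described above.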
Abstract
Valid-time-shifting (VTS) ensembles, either in the form of full ensemble members (VTSM) or ensemble perturbations (VTSP), were investigated as inexpensive means to increase ensemble size in the NCEP Global Forecast System (GFS) hybrid four-dimensional ensemble–variational (4DEnVar) data assimilation system. VTSM is designed to sample timing and/or phase errors, while VTSP can eliminate spurious covariances through temporal smoothing. When applying a shifting time interval (τ = 1, 2, or 3 h), VTSM and VTSP triple the baseline background ensemble size from 80 (ENS80) to 240 (ENS240) in the EnVar variational update, while increasing the overall cost by only 23%–27%, depending on the selected τ. Experiments during a 10-week summer period show that the best-performing VTSP configuration, with τ = 2 h, improves global temperature and wind forecasts out to 5 days relative to ENS80. This could be attributed to the improved background ensemble distribution, improved ensemble correlation accuracy, and increased effective rank of the populated background ensemble. VTSM generally degrades global forecasts in the troposphere. Improved global forecasts above 100 hPa by VTSM may benefit from the increased spread, which alleviates the underdispersiveness of the original background ensemble at those levels. Both VTSM and VTSP improve tropical cyclone track forecasts over ENS80. Although VTSM and VTSP are much less expensive than directly running a 240-member background ensemble, the best-performing VTSP with τ = 1 h performs comparably to or only slightly worse than ENS240, owing to the improved ensemble covariances. The best-performing VTSM with τ = 3 h even shows more accurate track forecasts than ENS240, likely because it better samples timing and/or phase errors for cases with small ensemble track spread.
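The mechanics of VTSM and VTSP described above can be sketched as follows: members valid at the shifted times are either used directly (VTSM) or have their perturbations recentered on the central-time ensemble mean (VTSP). The array shapes and the recentering detail are assumptions for illustration, not the operational implementation.

```python
import numpy as np

def build_vts_ensemble(ens_minus, ens_center, ens_plus, mode="VTSP"):
    """Triple a background ensemble using valid-time-shifted fields (illustrative sketch).

    ens_* : arrays of shape (n_members, n_state) valid at t - tau, t, and t + tau.
    VTSM stacks the shifted members as-is; VTSP recenters shifted perturbations
    on the central-time ensemble mean (an assumed, illustrative choice).
    """
    if mode == "VTSM":
        return np.concatenate([ens_minus, ens_center, ens_plus], axis=0)
    center_mean = ens_center.mean(axis=0)
    recentered = [
        member - ens.mean(axis=0) + center_mean       # perturbation about its own mean,
        for ens in (ens_minus, ens_center, ens_plus)  # re-added to the central-time mean
        for member in ens
    ]
    return np.asarray(recentered)

rng = np.random.default_rng(0)
shifted = [rng.normal(size=(80, 10)) for _ in range(3)]  # toy 80-member ensembles
print(build_vts_ensemble(*shifted).shape)  # -> (240, 10)
```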
Abstract
This study investigates the impacts of different methods of generating ensemble initial condition (IC) perturbations on convection-permitting ensemble forecast performance in the context of simultaneous physics diversity among the ensemble members. A total of 10 convectively active cases are selected for a systematic comparison of different methods of generating IC perturbations in 10-member convection-permitting ensembles, both with and without physics diversity. These IC perturbation methods include simple downscaling of coarse perturbations from a global model (LARGE), perturbations generated with ensemble data assimilation directly on the multiscale domain (MULTI), and, as a control, perturbations generated with each method but with small scales filtered out. MULTI was found to be significantly more skillful than LARGE at early lead times in all ensemble physics configurations, with the advantage of MULTI gradually decreasing with increasing forecast lead time. The advantage of MULTI relative to LARGE was reduced, but not eliminated, by the presence of physics diversity because of the extra ensemble spread that the physics diversity provided. The advantage of MULTI relative to LARGE was also reduced by filtering the IC perturbations to a commonly resolved spatial scale in both ensembles, which highlights the importance of flow-dependent small-scale (<~10 m) IC perturbations in the ensemble design. The importance of physics diversity relative to the IC perturbation method depended on the spatial scale of interest, the forecast lead time, and the meteorological characteristics of the forecast case, including the strength of synoptic-scale forcing, the role of cold pool interactions, and the occurrence of convective initiation or dissipation.
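The filtered control ensembles mentioned above rely on removing IC perturbation energy at scales below a chosen cutoff. A minimal spectral low-pass of a one-dimensional perturbation field is sketched below; the grid spacing, cutoff wavelength, and 1D setting are assumed illustration values, not the filter configuration used in the study.

```python
import numpy as np

def lowpass_perturbation(pert, dx_km=3.0, cutoff_km=100.0):
    """Remove scales shorter than cutoff_km from a 1D perturbation (illustrative sketch)."""
    spec = np.fft.rfft(pert)
    wavenumber = np.fft.rfftfreq(pert.size, d=dx_km)  # cycles per km
    spec[wavenumber > 1.0 / cutoff_km] = 0.0          # zero out the short wavelengths
    return np.fft.irfft(spec, n=pert.size)

rng = np.random.default_rng(1)
pert = rng.normal(size=512)            # synthetic IC perturbation on a 3-km grid
filtered = lowpass_perturbation(pert)
print(pert.std(), filtered.std())      # small-scale variance has been removed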
Abstract
Short-term spinup of strong storms is a known difficulty for the operational Hurricane Weather Research and Forecasting (HWRF) Model after assimilating high-resolution inner-core observations. Our previous study associated this short-term intensity prediction issue with the incompatibility between the HWRF Model and the data assimilation (DA) analysis. While improving the physics and resolution of the model was found to be helpful, this study focuses on further improving the intensity predictions through the four-dimensional incremental analysis update (4DIAU). In the traditional 4DIAU, increments are predetermined by subtracting background forecasts from analyses. Such predetermined increments implicitly assume linear evolution during the update, an assumption that is hardly valid for rapidly evolving hurricanes. To confirm this hypothesis, a corresponding four-dimensional analysis nudging (4DAN) method, which uses online increments, is first compared with the 4DIAU in an oscillation model. Then, variants of 4DIAU are proposed to improve its application to nonlinear systems. Next, 4DIAU, 4DAN, and their proposed improvements are implemented in the HWRF 4DEnVar DA system and are investigated for Hurricane Patricia (2015). Results from both the oscillation model and the HWRF Model show that 1) the predetermined increments in 4DIAU can be detrimental when there are discrepancies between the updated and background forecasts during a nonlinear evolution; 2) 4DAN can improve the incremental update relative to 4DIAU, but its improvements are limited by overfiltering; 3) relocating the initial background before the incremental update improves the corresponding traditional methods; and 4) the feature-relative 4DIAU method improves the incremental update the most and produces the best track and intensity predictions for Patricia among all experiments.
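The contrast between a predetermined-increment 4DIAU and online nudging can be reproduced in a toy oscillation model like the sketch below. The model, step sizes, weights, and initial states are hypothetical and only illustrate the two update mechanisms, not the HWRF implementation or the paper's oscillation experiments.

```python
import numpy as np

# Toy oscillation model dx/dt = i*omega*x, integrated with forward Euler (assumed values).
OMEGA, DT, NSTEPS = 1.0, 0.01, 200

def step(x):
    return x + DT * 1j * OMEGA * x

def run_4diau(x_bg, increment):
    """Traditional 4DIAU: a predetermined increment is added in equal fractions each step."""
    x = x_bg
    for _ in range(NSTEPS):
        x = step(x) + increment / NSTEPS
    return x

def run_4dan(x_bg, analysis_traj, weight=1.0 / NSTEPS):
    """4DAN-like update: nudge toward an analysis trajectory that evolves online."""
    x = x_bg
    for xa in analysis_traj:
        x = step(x) + weight * (xa - x)
    return x

x_b, x_a0 = 1.0 + 0j, np.exp(0.3j)         # background and analysis differ in phase
analysis_traj, xa = [], x_a0
for _ in range(NSTEPS):                    # analysis trajectory evolved with the same model
    xa = step(xa)
    analysis_traj.append(xa)

err_iau = abs(run_4diau(x_b, x_a0 - x_b) - analysis_traj[-1])
err_4dan = abs(run_4dan(x_b, analysis_traj) - analysis_traj[-1])
print(f"end-of-window error: 4DIAU={err_iau:.3f}  4DAN={err_4dan:.3f}")
```

In this linear toy both mechanisms draw the state toward the analysis trajectory; the abstract's point is that once the evolution is strongly nonlinear, as for a rapidly intensifying hurricane, the predetermined 4DIAU increment becomes inconsistent with the updated trajectory, which the online 4DAN increment and the feature-relative variant are designed to mitigate.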
Abstract
A multiresolution ensemble (MR-ENS) method is developed to resolve a wider range of scales of the background error covariance (BEC) in the hybrid four-dimensional ensemble–variational (4DEnVar) framework while limiting computational cost. MR-ENS is implemented in the NCEP Global Forecast System (GFS) gridpoint statistical interpolation (GSI) hybrid 4DEnVar. MR-ENS generates the analysis increment by incorporating a high-resolution static BEC and flow-dependent ensemble BECs from both high- and low-resolution ensembles. MR-ENS is compared with three 4DEnVar update approaches: 1) the single-resolution (SR)-Low approach, where the analysis increment is generated from the ensemble BEC and the static BEC at the same low resolution; 2) the dual-resolution (DR) approach, where the analysis increment is generated using the high-resolution static BEC and the low-resolution ensemble BEC; and 3) the SR-High approach, which is the same as 1) except that all covariances are at high resolution. Experiments show that MR-ENS improves global and tropical cyclone track forecasts compared to SR-Low and DR. Inclusion of the high-resolution ensemble leads to increased background ensemble spread, a better fit of the background to observations, increased effective ranks, more accurate ensemble error correlations, and increased power of the analysis increment at small scales. The majority of the improvement of MR-ENS relative to SR-Low is due to the partial use of the high-resolution background ensemble. Compared to SR-High, MR-ENS decreases the overall cost by about 40% and shows comparable global and tropical cyclone track forecast performance. Diagnostics show that, particularly in the tropics, MR-ENS improves the analysis increment over a wide range of scales and increases the effective rank of the ensemble BEC to a degree comparable to SR-High.
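A schematic of the multiresolution increment described above, written in generic extended-control-variable notation; this is illustrative and not the paper's exact formulation or weighting.

```latex
% Schematic MR-ENS analysis increment (illustrative notation)
\delta\mathbf{x} \;=\;
\mathbf{x}'_1
\;+\; \sum_{k=1}^{K_{\mathrm{hi}}} \boldsymbol{\alpha}^{\mathrm{hi}}_k \circ \mathbf{x}^{e,\mathrm{hi}}_k
\;+\; \sum_{k=1}^{K_{\mathrm{lo}}} \boldsymbol{\alpha}^{\mathrm{lo}}_k \circ
      \bigl(\mathcal{I}\,\mathbf{x}^{e,\mathrm{lo}}_k\bigr),
```

where x'_1 carries the contribution from the high-resolution static BEC, the two sums carry the flow-dependent contributions from the high- and low-resolution background ensembles, and I denotes interpolation of the low-resolution perturbations to the analysis grid; the relative weights of the three terms (not shown) play the role of the hybrid weighting coefficients.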
Abstract
Diverse observations, including High Definition Sounding System (HDSS) dropsonde observations from the Tropical Cyclone Intensity (TCI) program; Tail Doppler Radar (TDR), Stepped Frequency Microwave Radiometer (SFMR), and flight-level observations from the Intensity Forecasting Experiment (IFEX) program; and atmospheric motion vectors (AMVs) from the Cooperative Institute for Meteorological Satellite Studies (CIMSS), simultaneously depicted the three-dimensional (3D) structure of Hurricane Patricia (2015). Experiments are conducted to understand the relative impacts of each of these observation types on Patricia’s analysis and prediction, using the Gridpoint Statistical Interpolation (GSI)-based ensemble–variational data assimilation system for the Hurricane Weather Research and Forecasting (HWRF) Model. In comparing the impacts of assimilating each dataset individually, results suggest that 1) the assimilation of 3D observations produces a better TC structure analysis than the assimilation of two-dimensional (2D) observations; 2) analyses produced by assimilating observations from platforms that sample only momentum fields lead to less improved forecasts, with either short-lived impacts or slower intensity spinup, than analyses produced by assimilating observations from platforms that sample both momentum and thermal fields; and 3) the structure forecast tends to benefit more from the assimilation of inner-core observations than the corresponding intensity forecast does, which implies that better verification metrics are needed for future TC forecast evaluation.