Browse

You are looking at 101 - 110 of 3,074 items for:

  • Weather and Forecasting
Joseph James, Chen Ling, Alyssa Bates, Gregory J. Stumpf, Kim Klockow-McClain, Pat Hyland, Jim LaDue, Kodi L. Berry, and Kevin Manross

Abstract

This project tested software capabilities and operational implications related to interoffice collaboration during NWS severe weather warning operations within a proposed paradigm, Forecasting a Continuum of Environmental Threats (FACETs). The current NWS policy of each Weather Forecast Office (WFO) issuing warnings for an exclusive area of responsibility may result in inconsistent messaging. In contrast, the FACETs paradigm, with object-based, moving probabilistic and deterministic hazard information, could provide seamless information across NWS County Warning Areas (CWAs). An experiment was conducted that allowed NWS forecasters to test new software incorporating FACETs-based hazard information, along with potential concepts of operation intended to improve messaging consistency between adjacent WFOs. Experiment scenarios consisted of a variety of storm and office border interactions, fictional events requiring nowcasts, and directives that mimicked differing inter-WFO warning philosophies. Surveys and semi-structured interviews were conducted to gauge forecasters’ confidence and workload and to discuss potential solutions for interoffice collaboration and software issues. We found that forecasters adapted quickly to the new software and concepts and were comfortable collaborating with their neighboring WFO during warning operations. Although forecasters felt the software’s collaboration tools enabled them to communicate in a timely manner, the added collaboration increased their workload compared to current warning operations.

Restricted access
Bu-Yo Kim, Miloslav Belorid, and Joo Wan Cha

Abstract

Accurate visibility prediction is imperative for human and environmental health. However, existing numerical models for visibility prediction are characterized by low prediction accuracy and high computational cost. Thus, in this study, we predicted visibility using tree-based machine learning algorithms and numerical weather prediction data from the Local Data Assimilation and Prediction System (LDAPS) of the Korea Meteorological Administration. We then evaluated the accuracy of visibility prediction for Seoul, South Korea, through a comparative analysis using observed visibility from the automated synoptic observing system. The visibility predicted by the machine learning algorithm was compared with the visibility predicted by LDAPS. The LDAPS data employed to construct the visibility prediction model were divided into learning, validation, and test sets. The optimal machine learning algorithm for visibility prediction was determined using the learning and validation sets. In this study, the extreme gradient boosting (XGB) algorithm showed the highest accuracy for visibility prediction. Comparative results using the test sets revealed lower prediction error and a higher correlation coefficient for visibility predicted by the XGB algorithm (bias: −0.62 km, MAE: 2.04 km, RMSE: 2.94 km, and R: 0.88) than for that predicted by LDAPS (bias: −0.32 km, MAE: 4.66 km, RMSE: 6.48 km, and R: 0.40). Moreover, the mean equitable threat score (ETS) also indicated higher prediction accuracy for visibility predicted by the XGB algorithm (ETS: 0.5–0.6 across visibility ranges) than for that predicted by LDAPS (ETS: 0.1–0.2).
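
To illustrate the kind of workflow described above, the short Python sketch below fits a gradient-boosted tree regressor and computes the bias, MAE, RMSE, and correlation metrics quoted in the abstract. The predictors and data are synthetic stand-ins, not the LDAPS/ASOS match-ups used in the study; the xgboost and scikit-learn calls are standard library usage, not code from the paper.

```python
# Minimal sketch: tree-based visibility prediction and the evaluation metrics from the abstract.
# Predictor names and the synthetic data are illustrative only, not the study's inputs.
import numpy as np
from xgboost import XGBRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 5000
rh = rng.uniform(30, 100, n)      # stand-ins for LDAPS predictors (relative humidity, etc.)
wind = rng.uniform(0, 10, n)
t2m = rng.uniform(260, 305, n)
X = np.column_stack([rh, wind, t2m])
# Synthetic "observed" visibility (km): lower at high humidity, capped at 20 km, with noise.
y = np.clip(20 - 0.18 * (rh - 30) + 0.3 * wind + rng.normal(0, 1.5, n), 0.1, 20)

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=0)
model = XGBRegressor(n_estimators=300, max_depth=6, learning_rate=0.05)
model.fit(X_train, y_train)
pred = model.predict(X_test)

bias = np.mean(pred - y_test)                    # metrics quoted in the abstract
mae = np.mean(np.abs(pred - y_test))
rmse = np.sqrt(np.mean((pred - y_test) ** 2))
r = np.corrcoef(pred, y_test)[0, 1]
print(f"bias={bias:+.2f} km  MAE={mae:.2f} km  RMSE={rmse:.2f} km  R={r:.2f}")
```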

Open access
Lanxi Min, Qilong Min, and Yuyi Du

Abstract

Weather forecasting over complex terrain with diverse land cover is challenging. Utilizing the high-resolution observations from the New York State Mesonet (NYSM), we are able to evaluate the surface processes of the Weather Research and Forecasting (WRF) Model in a detailed, scale-dependent manner. In this study, possible impacts of land–atmosphere interaction on surface meteorology and boundary layer cloud development are investigated with different model resolutions, land surface models (LSMs), and planetary boundary layer (PBL) physical parameterizations. The High-Resolution Rapid Refresh, version 3 (HRRR), forecasting model is used as a reference for the sensitivity evaluation. Results show that over complex terrain, the high-resolution simulations (1 km × 60 vertical levels) generally perform better than the low-resolution simulations (3 km × 50 levels) in both surface meteorology and cloud fields. LSMs play a more important role in surface meteorology than PBL schemes. The Noah-MP land surface model exhibits warmer and drier daytime biases than the Rapid Update Cycle (RUC) LSM, which better predicts the Bowen ratio. The PBL schemes affect the convective strength in the boundary layer: the Shin–Hong (SH) scale-aware scheme tends to produce the strongest convection in the PBL, while the ACM2 PBL scheme rarely resolves convection even at 1-km resolution. By considering the radiation effect of subgrid-scale (SGS) clouds, the Mellor–Yamada–Nakanishi–Niino eddy diffusivity mass flux (MYNN-EDMF) scheme predicts the highest cloud coverage and the lowest surface solar radiation bias. The treatment of SGS clouds in MYNN-EDMF not only significantly reduces the shortwave radiation bias but also affects the convective behavior through the land surface–cloud–radiation interaction.
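
A minimal sketch of how such a sensitivity matrix (resolution × LSM × PBL scheme) might be scored against NYSM 2-m temperature observations is given below. The configuration labels echo the abstract, but the data, scoring function, and values are synthetic placeholders rather than the study’s evaluation code.

```python
# Minimal sketch: score a resolution x LSM x PBL sensitivity matrix against 2-m temperature obs.
# All numbers here are synthetic placeholders; only the configuration labels follow the abstract.
from itertools import product
import numpy as np

resolutions = ["1km_60lev", "3km_50lev"]
lsms = ["NoahMP", "RUC"]
pbl_schemes = ["SH", "ACM2", "MYNN-EDMF"]

def daytime_bias_and_rmse(model_t2m, obs_t2m):
    """Mean bias and RMSE of modeled vs observed 2-m temperature (K)."""
    diff = np.asarray(model_t2m) - np.asarray(obs_t2m)
    return diff.mean(), np.sqrt((diff ** 2).mean())

rng = np.random.default_rng(0)
obs = 290.0 + rng.normal(0, 2.0, 500)                 # stand-in NYSM daytime T2m sample
for res, lsm, pbl in product(resolutions, lsms, pbl_schemes):
    model = obs + rng.normal(0.5, 1.5, obs.size)      # stand-in WRF output for this member
    bias, rmse = daytime_bias_and_rmse(model, obs)
    print(f"{res} / {lsm} / {pbl}: bias {bias:+.2f} K, RMSE {rmse:.2f} K")
```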

Restricted access
Wei Ye, Ying Li, and Da-Lin Zhang

Abstract

In this study, the development of an extreme precipitation event along the southeastern margin of the Tibetan Plateau (TP), associated with the approach of Tropical Cyclone (TC) Rashmi (2008) from the Bay of Bengal, is examined using a global reanalysis and all available observations. Results show the importance of an anomalous southerly flow, resulting from the merging of Rashmi into a meridionally deep trough at the western periphery of a subtropical high, in steering the storm and transporting tropical warm–moist air, thereby supplying the moisture necessary for precipitation production over the TP. A mesoscale data analysis reveals that (i) the Rashmi vortex maintained its TC identity during its northward movement in the warm sector with weak-gradient flows; (ii) the extreme precipitation event occurred under potentially stable conditions; (iii) topographic uplifting of the southerly warm–moist air, enhanced by the approaching vortex with some degree of slantwise instability, led to the development of heavy to extreme precipitation along the southeastern margin of the TP; and (iv) the uplifting of the intense vortex flows carrying ample moisture over steep topography was most influential in generating the record-breaking daily snowfall of 98 mm (in water depth) and the daily precipitation of 87 mm with rain–snow–rain changeovers at two high-elevation stations, respectively. The extreme precipitation and phase changeovers could be traced in an unusual upper-air sounding showing a deep saturated layer from the surface to the upper troposphere, with a moist-adiabatic upper 100-hPa layer and a melting layer in the bottom 100 hPa. The results have important implications for the forecasting of TC-related heavy precipitation over high mountains.

Significance Statement

This study attempts to gain insight into the multiscale dynamical processes leading to the development of an extreme precipitation event over the southeastern margin of the Tibetan Plateau as a Bay of Bengal tropical cyclone (TC) approached. Results show (i) the importance of an anomalous southerly flow with a wide zonal span in steering the relatively large TC and transporting the necessary moisture into the region, and (ii) the subsequent uplifting of the warm and moist TC vortex by steep topography, which produced the extreme precipitation event under potentially stable conditions, especially the record-breaking daily snowfall of 98 mm (in water depth). The results have important implications for the forecasting of TC-related heavy precipitation over the Tibetan Plateau and other high mountainous regions.
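
The "potentially stable conditions" noted above refer to equivalent potential temperature (theta-e) increasing with height. The sketch below illustrates that diagnosis with a simplified theta-e formula and a synthetic sounding; it is an assumption-laden illustration, not the diagnostic actually used in the paper.

```python
# Hypothetical sketch: diagnose potential stability from a sounding via theta-e.
import numpy as np

def theta_e_simple(p_hpa, t_k, r_kgkg):
    """First-order equivalent potential temperature (K): theta * exp(Lv*r / (cp*T)).
    A simplified form; operational diagnostics typically use a more exact formulation."""
    theta = t_k * (1000.0 / p_hpa) ** 0.2854
    return theta * np.exp(2.501e6 * r_kgkg / (1004.0 * t_k))

# Synthetic near-saturated sounding (pressure decreasing upward); not observed data.
p = np.array([600.0, 500.0, 400.0, 300.0])        # hPa
t = np.array([270.0, 262.0, 251.0, 238.0])        # K
r = np.array([4.0e-3, 3.0e-3, 2.0e-3, 1.0e-3])    # kg/kg water vapor mixing ratio

theta_e = theta_e_simple(p, t, r)
# Potentially stable where theta-e increases with height (i.e., decreases with pressure).
potentially_stable = bool(np.all(np.diff(theta_e) > 0))
```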

Restricted access
Tseganeh Z. Gichamo and Clara S. Draper

Abstract

Within the National Weather Service’s Unified Forecast System (UFS), snow depth and snow cover observations are assimilated once daily using a rule-based method designed to correct for gross errors. While this approach improved the forecasts over its predecessors, it is now quite outdated and likely results in a suboptimal analysis. We have therefore implemented and evaluated a snow data assimilation scheme based on the two-dimensional optimal interpolation (OI) method, which accounts for model and observation errors and their spatial correlations as a function of the distances between the observations and the model grid cells. The performance of the OI was evaluated by assimilating daily snow depth observations from the Global Historical Climatology Network (GHCN) and Interactive Multisensor Snow and Ice Mapping System (IMS) snow cover data into the UFS from October 2019 to March 2020. Compared to the control analysis, which is very similar to the method currently in operational use, the OI improves the forecast snow depth and snow cover. For instance, the unbiased snow depth root-mean-square error (ubRMSE) was reduced by 45 mm and the snow cover hit rate increased by 4%. This leads to modest improvements in globally averaged near-surface temperature (an average reduction of 0.23 K in temperature bias), with significant local improvements in some regions (much of Asia and the central United States). The reduction in near-surface temperature error was primarily caused by the improved snow cover fraction from the data assimilation. Based on these results, the OI DA is currently being transitioned into operational use for the UFS.

Significance Statement

Weather and climate forecasting systems rely on accurate modeling of the evolution of atmospheric, oceanic, and land processes. In addition, model forecasts are substantially improved by the continuous incorporation of observations into the models through a process called data assimilation. In this work, we upgraded the snow data assimilation used in the U.S. National Weather Service (NWS) global weather prediction system. Compared to the method currently in operational use, the new snow data assimilation improves both the forecast snow quantities and near-surface air temperatures over snowy regions. Based on the positive results obtained in the experiments presented here, the new snow data assimilation method is being implemented in the NWS operational forecast system.
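
For readers unfamiliar with optimal interpolation, the NumPy sketch below shows a generic 2D OI update in which background and observation error covariances depend on the distances between stations and grid cells. The error variances, exponential correlation length scale, and toy inputs are illustrative assumptions, not the UFS implementation.

```python
# Hypothetical sketch of a 2D optimal-interpolation (OI) snow depth update.
import numpy as np

def oi_update(xb, grid_xy, yo, hofx, obs_xy, sigma_b=30.0, sigma_o=40.0, L=55.0):
    """xb: background snow depth at model grid cells (mm).
    grid_xy, obs_xy: (n, 2) and (p, 2) horizontal coordinates in km.
    yo: observed snow depths (mm); hofx: background interpolated to the obs sites.
    Error standard deviations and the exponential length scale L are illustrative."""
    def corr(a, b):
        # Distance-dependent background error correlation (assumed exponential).
        d = np.linalg.norm(a[:, None, :] - b[None, :, :], axis=-1)
        return np.exp(-d / L)

    B_Ht = sigma_b**2 * corr(grid_xy, obs_xy)   # background error cov., grid cells vs obs sites
    H_B_Ht = sigma_b**2 * corr(obs_xy, obs_xy)  # background error cov. among obs sites
    R = sigma_o**2 * np.eye(len(yo))            # obs errors assumed uncorrelated
    K = B_Ht @ np.linalg.inv(H_B_Ht + R)        # OI gain
    return xb + K @ (yo - hofx)                 # analysis = background + weighted innovations

# Tiny example: three GHCN-like stations updating a five-cell background field.
rng = np.random.default_rng(0)
grid_xy = rng.uniform(0, 200, size=(5, 2))
obs_xy = rng.uniform(0, 200, size=(3, 2))
xb = np.full(5, 120.0)                          # background snow depth (mm)
hofx = np.full(3, 120.0)                        # background interpolated to the stations
yo = np.array([150.0, 90.0, 130.0])             # station snow depths (mm)
xa = oi_update(xb, grid_xy, yo, hofx, obs_xy)
```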

Restricted access
M. M. Nageswararao, Yuejian Zhu, Vijay Tallapragada, and Meng-Shih Chen

Abstract

The skillful prediction of monthly-scale rainfall over small regions such as Taiwan remains a challenge for the meteorological community. Taiwan is a subtropical island in Asia that regularly experiences rainfall extremes, leading to landslides and flash floods in and near the mountains and flooding over low-lying plains, particularly during the summer monsoon season [June–September (JJAS)]. In September 2020, NOAA/NCEP implemented the Global Ensemble Forecast System, version 12 (GEFSv12), to support stakeholders with subseasonal forecasts and hydrological applications. In the present study, the performance of GEFSv12 for monthly rainfall and associated extreme rainfall (ER) events over Taiwan during JJAS is evaluated against CMORPH. GEFSv12 shows a marginal improvement over GEFS-SubX in depicting the East Asian summer monsoon index (EASMI). The raw GEFSv12 rainfall products were calibrated with a quantile–quantile (QQ) mapping technique to further improve prediction skill. The results reveal that the spatial patterns of the climatological features (mean, interannual variability, and coefficient of variation) of summer monsoon monthly rainfall over Taiwan from QQ-GEFSv12 are much closer to CMORPH than those from Raw-GEFSv12. Raw-GEFSv12 has a large wet bias and overforecasts wet days, whereas QQ-GEFSv12 is close to the observations. The prediction skill (correlation coefficient and index of agreement) of GEFSv12 in depicting summer monsoon monthly rainfall is significantly high (>0.5) over most of Taiwan, particularly during the peak monsoon months of September and August, followed by June and July. The calibration significantly reduces the overestimation (underestimation) of wet (ER) events in both the ensemble-mean and probabilistic ensemble forecasts. The prediction of extreme rainfall events (>50 mm day−1) also improves significantly.
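
The quantile–quantile (QQ) mapping calibration can be illustrated with a short sketch that maps each raw forecast value to the observed climatological value at the same empirical quantile. The synthetic gamma-distributed samples stand in for CMORPH and Raw-GEFSv12 rainfall; the study’s actual fitting details may differ.

```python
# Hypothetical sketch of quantile-quantile (QQ) mapping for forecast calibration.
import numpy as np

def fit_qq_map(fcst_train, obs_train, n_quantiles=99):
    """Build a mapping from forecast quantiles to observed quantiles."""
    q = np.linspace(0.01, 0.99, n_quantiles)
    return np.quantile(fcst_train, q), np.quantile(obs_train, q)

def apply_qq_map(fcst, fcst_q, obs_q):
    """Replace each raw forecast with the observed value at the same empirical quantile."""
    return np.interp(fcst, fcst_q, obs_q)

# Example with synthetic data: a wet-biased raw forecast pulled toward the observed distribution.
rng = np.random.default_rng(0)
obs_train = rng.gamma(shape=2.0, scale=40.0, size=5000)    # stand-in for CMORPH monthly rain (mm)
fcst_train = rng.gamma(shape=2.0, scale=60.0, size=5000)   # stand-in for wet-biased Raw-GEFSv12
fcst_q, obs_q = fit_qq_map(fcst_train, obs_train)
calibrated = apply_qq_map(fcst_train[:5], fcst_q, obs_q)
```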

Restricted access
Chung-Chieh Wang, Hung-Chi Kuo, Yu-Han Chen, Shin-Hau Chen, and Kazuhisa Tsuboki

Abstract

Typhoon Morakot struck Taiwan during 7–9 August 2009 and became the deadliest tropical cyclone (TC) to strike Taiwan in five decades by producing up to 2635 mm of rain in 48 h, breaking the world record. The extreme rainfall of Morakot resulted from the strong interaction among several favorable factors that occurred simultaneously. These factors, from large scale to small scale, include the following: 1) weak environmental steering flow linked to the evolution of the monsoon gyre and consequently slow TC motion; 2) a strong moisture surge due to low-level southwesterly flow; 3) asymmetric rainfall and latent heating near southern Taiwan that further reduced the TC’s forward motion as its center began moving away from Taiwan; 4) enhanced rainfall due to steep topography; 5) an atypical structure with a weak inner core, enhancing its susceptibility to the latent heating effect; and 6) cell merger and back building inside the rainbands associated with the interaction between the low-level jet and convective updrafts. From a forecasting standpoint, present-day convection-permitting or cloud-resolving regional models are capable of short-range predictions of the Morakot event starting from 6 August. At longer ranges beyond 3 days, larger uncertainty exists in the track forecast and an ensemble approach is necessary. Due to the large computational demand at the required high resolution, the time-lagged strategy is shown to be a feasible option for producing useful information on rainfall probabilities for the event.
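
As a generic illustration of the time-lagged strategy mentioned at the end of the abstract, the sketch below pools forecasts initialized at successive earlier times but valid over the same accumulation window and converts them into an exceedance probability. The member count, threshold, and synthetic values are assumptions, not results from the study.

```python
# Hypothetical sketch: rainfall exceedance probability from a time-lagged ensemble.
import numpy as np

def lagged_ensemble_probability(member_rain, threshold_mm):
    """member_rain: array (n_members, ny, nx) of accumulated rainfall forecasts, where each
    member is the same model started from a different (earlier) initial time but valid over
    the same period. Returns P(rain > threshold) at each grid point."""
    return np.mean(member_rain > threshold_mm, axis=0)

# Example: five lagged runs on a small grid, probability of exceeding 1000 mm.
rng = np.random.default_rng(1)
members = rng.gamma(shape=2.0, scale=400.0, size=(5, 4, 4))   # synthetic accumulations (mm)
prob = lagged_ensemble_probability(members, threshold_mm=1000.0)
```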

Open access
Chen Zhao, Tim Li, and Mingyu Bi

Abstract

The Advanced Research version of the Weather Research and Forecasting (WRF-ARW) Model is used to investigate the influence of an easterly wave (EW) on the genesis of Typhoon Hagupit (2008) in the western North Pacific. Observational analysis indicates that the precursor disturbance of Hagupit was an EW in the western North Pacific that can be detected at least 7 days prior to typhoon genesis. In the control experiment, the genesis of the typhoon is well captured. A sensitivity experiment is conducted by filtering out the synoptic-scale (3–8-day) signals associated with the EW; removing the EW eliminates the typhoon genesis. Two mechanisms are proposed for the effect of the EW on the genesis of Hagupit. First, the background cyclonic vorticity of the EW could induce small-scale cyclonic vorticity centers to merge and develop into a system-scale vortex. Second, the EW provides a favorable in situ environment for the rapid development of the typhoon disturbance through a positive moisture–convection feedback.
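
The sensitivity experiment hinges on removing the 3–8-day synoptic-scale signal associated with the EW. A minimal sketch of that kind of band-pass extraction on a single-point time series, using a Butterworth filter, is shown below; the filter choice, order, and synthetic series are illustrative and not the paper’s actual filtering procedure.

```python
# Hypothetical sketch: isolate and remove the 3-8-day (synoptic-scale) signal from a time series.
import numpy as np
from scipy.signal import butter, filtfilt

def bandpass_3to8day(series, samples_per_day=1, order=4):
    """Return the 3-8-day band-pass component of a regularly sampled time series."""
    nyquist = 0.5 * samples_per_day                 # cycles per day
    low, high = 1.0 / 8.0, 1.0 / 3.0                # band edges in cycles per day
    b, a = butter(order, [low / nyquist, high / nyquist], btype="band")
    return filtfilt(b, a, series)

# Remove the easterly-wave band from, e.g., a low-level vorticity time series at one point.
days = np.arange(60)
vort = (np.sin(2 * np.pi * days / 5.0)              # synthetic 5-day easterly-wave signal
        + 0.3 * np.sin(2 * np.pi * days / 30.0)     # slower background variability
        + 0.1 * np.random.default_rng(2).standard_normal(days.size))
ew_component = bandpass_3to8day(vort)
vort_no_ew = vort - ew_component                    # analogue of the "EW removed" state
```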

Restricted access
Mark DeMaria, James L. Franklin, Rachel Zelinsky, David A. Zelinsky, Matthew J. Onderlinde, John A. Knaff, Stephanie N. Stevenson, John Kaplan, Kate D. Musgrave, Galina Chirokova, and Charles R. Sampson

Abstract

The National Hurricane Center (NHC) uses a variety of guidance models for its operational tropical cyclone track, intensity, and wind structure forecasts, and as baselines for the evaluation of forecast skill. A set of the simpler models, collectively known as the NHC guidance suite, is maintained by NHC. The models comprising the guidance suite are briefly described and evaluated, with details provided for those that have not been documented previously. Decay-SHIFOR (D-SHIFOR) is a modified version of the Statistical Hurricane Intensity Forecast (SHIFOR) model that includes decay over land; this modification improves the SHIFOR forecasts through about 96 h. T-CLIPER, a climatology and persistence model that predicts track and intensity using a trajectory approach, has error characteristics similar to those of CLIPER and D-SHIFOR but can be run to any forecast length. The Trajectory and Beta model (TAB), another trajectory track model, applies a gridpoint spatial filter to smooth winds from the National Centers for Environmental Prediction (NCEP) Global Forecast System (GFS). TAB errors were 10%–15% lower than those of the Beta and Advection model (BAM), the model it replaced in 2017. Optimizing TAB’s vertical weights shows that the environmental flow in the lower troposphere provides a better match to observed tropical cyclone motion than that in the upper troposphere, and that the optimal steering layer is shallower for higher-latitude and weaker tropical cyclones. The advantages and disadvantages of the D-SHIFOR, T-CLIPER, and TAB models relative to their earlier counterparts are discussed.

Significance Statement

This paper provides a comprehensive summary and evaluation of a set of simpler forecast models used as guidance for NHC’s operational tropical cyclone forecasts, and as baselines for the evaluation of forecast skill; these include newer techniques that extend forecasts to 7 days and beyond.
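
For context on what a trajectory track model does, the sketch below advects a TC position with a vertically weighted mean steering wind, in the spirit of the TAB description above. The constant steering field, the chosen weights, and the omission of the beta-drift and spatial-filter components are simplifying assumptions, not the operational TAB formulation.

```python
# Hypothetical sketch: advect a TC position with vertically weighted steering winds.
import numpy as np

def trajectory_track(lat0, lon0, steering_uv, weights, dt_hours=6.0, n_steps=20):
    """steering_uv: function (lat, lon, level_index, t) -> (u, v) in m/s per pressure level.
    weights: per-level weights (summing to 1) emphasizing, e.g., the lower troposphere.
    Returns the forecast track as (lat, lon) pairs every dt_hours out to n_steps*dt_hours."""
    deg_per_m_lat = 1.0 / 111_000.0
    lat, lon, track = lat0, lon0, [(lat0, lon0)]
    for step in range(n_steps):
        t = step * dt_hours
        u = sum(w * steering_uv(lat, lon, k, t)[0] for k, w in enumerate(weights))
        v = sum(w * steering_uv(lat, lon, k, t)[1] for k, w in enumerate(weights))
        lat += v * dt_hours * 3600.0 * deg_per_m_lat
        lon += u * dt_hours * 3600.0 * deg_per_m_lat / np.cos(np.radians(lat))
        track.append((lat, lon))
    return track

# Toy example: uniform 5 m/s westward, 2 m/s northward steering at every level.
track = trajectory_track(20.0, 140.0, lambda la, lo, k, t: (-5.0, 2.0), weights=[0.5, 0.3, 0.2])
```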

Restricted access
Clarice N. Satrio, Kristin M. Calhoun, P. Adrian Campbell, Rebecca Steeves, and Travis M. Smith

Abstract

While storm identification and tracking algorithms are used both operationally and in research, no single standard technique exists to objectively assess the performance of such algorithms. A comparative skill score is therefore developed herein that consists of four parameters: three that quantify storm attributes (size consistency, linearity of tracks, and mean track duration) and a fourth that correlates performance with an optimal postevent reanalysis. The skill score is the cumulative sum of the four parameters, each normalized from zero to one among the compared algorithms, such that a maximum skill score of four can be obtained. The skill score is intended to favor algorithms that are efficient at severe storm detection; that is, high-scoring algorithms should detect storms that have a higher current or future severe threat while minimizing detection of weaker, short-lived storms with low severe potential. The skill score is shown to be capable of successfully ranking a large number of algorithms, both among varying settings within the same base algorithm and between distinct base algorithms. Through a comparison with manually created user datasets, high-scoring algorithms are verified to match well with hand analyses, demonstrating appropriate calibration of the skill score parameters.

Significance Statement

With the growing number of options for storm identification and tracking techniques, an objective approach is needed to quantify the performance of different techniques. This study introduces a comparative skill score that assesses size consistency, linearity of tracks, mean track duration, and correlation to an optimal postevent reanalysis to rank diverse algorithms. The paper demonstrates the capability of the skill score to highlight algorithms that are efficient at detecting storms with higher severe potential, as well as those that closely resemble human-identified storms, through a comparison with manually created user datasets. This methodology will be useful for improving systems that rely on such algorithms, for both operational and research purposes focused on severe storm detection.
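
A minimal sketch of the comparative scoring idea follows: each of the four parameters is normalized from zero to one across the compared algorithms and the normalized values are summed, so the best possible score is four. The parameter values and the assumption that larger raw values are better are placeholders; the actual parameter definitions are given in the paper.

```python
# Hypothetical sketch: comparative skill score summed from four normalized parameters.
import numpy as np

def comparative_skill_scores(params):
    """params: array (n_algorithms, 4) of raw parameter values, oriented so larger is better
    (e.g., size consistency, track linearity, mean track duration, correlation to reanalysis).
    Each column is min-max normalized across the algorithms and the columns are summed,
    giving a score between 0 and 4 for every algorithm."""
    p = np.asarray(params, dtype=float)
    span = p.max(axis=0) - p.min(axis=0)
    span[span == 0] = 1.0                       # avoid division by zero for tied parameters
    normalized = (p - p.min(axis=0)) / span
    return normalized.sum(axis=1)

# Example: three algorithm configurations compared on the four parameters.
scores = comparative_skill_scores([[0.8, 0.6, 45.0, 0.7],
                                   [0.9, 0.5, 30.0, 0.8],
                                   [0.6, 0.7, 60.0, 0.6]])
```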

Restricted access