Abstract
Conventional statistical postprocessing techniques offer limited ability to improve the skill of probabilistic guidance for heavy precipitation. This paper introduces two artificial neural network (ANN)-based, geographically aware, and computationally efficient postprocessing schemes, namely, the ANN-multiclass (ANN-Mclass) and the ANN–censored, shifted gamma distribution (ANN-CSGD). Both schemes are implemented to postprocess Global Ensemble Forecast System (GEFS) forecasts to produce probabilistic quantitative precipitation forecasts (PQPFs) over the contiguous United States (CONUS) using a short (60-day), rolling training window. The performance of these schemes is assessed through a set of hindcast experiments, wherein postprocessed 24-h PQPFs from the two ANN schemes were compared against those produced using the benchmark quantile mapping algorithm for lead times ranging from 1 to 8 days. Outcomes of the hindcast experiments show that the ANN schemes overall outperform both the benchmark and the raw forecast over the CONUS in predicting probability of precipitation over a range of thresholds. The relative performance varies among geographic regions, with the two ANN schemes broadly improving upon quantile mapping over the central, southern, and southeastern regions, and slightly underperforming along the Pacific coast, where the skill of the raw forecasts is highest. Between the two schemes, the hybrid ANN-CSGD outperforms at higher rainfall thresholds (i.e., >50 mm day⁻¹), though this outperformance comes at a slight expense of sharpness and spatial specificity. Collectively, these results confirm the ability of the ANN algorithms to produce skillful PQPFs with a limited training window and point to the strength of the hybrid scheme for calibrating PQPFs for rare-to-extreme rainfall events.
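For readers unfamiliar with the censored, shifted gamma distribution that underlies the ANN-CSGD scheme, the short sketch below (Python, with purely illustrative parameter values; not the authors' code) shows how a PQPF exceedance probability can be read off a CSGD once its shape, scale, and shift parameters are known.

```python
# Minimal sketch: exceedance probabilities from a censored, shifted gamma
# distribution (CSGD). Parameter values are illustrative, not fitted.
from scipy.stats import gamma

shape, scale, shift = 0.8, 12.0, -1.5   # hypothetical CSGD parameters (shift < 0)

def csgd_exceedance(threshold_mm, shape, scale, shift):
    """P(precip > threshold) for a gamma distribution shifted by `shift`
    and left-censored at zero (the censoring only adds mass at zero)."""
    return 1.0 - gamma.cdf(threshold_mm - shift, a=shape, scale=scale)

pop = csgd_exceedance(0.254, shape, scale, shift)   # probability of precipitation
p50 = csgd_exceedance(50.0, shape, scale, shift)    # P(> 50 mm in 24 h)
print(f"PoP = {pop:.3f}, P(>50 mm) = {p50:.3f}")
```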
Abstract
This paper analyzed observations from the Great Plains Irrigation Experiment (GRAINEX) to better understand land–atmosphere (L–A) interactions and planetary boundary layer (PBL) evolution. The study focuses on a day when the largest forcing on the boundary layer originated from the land surface/land use. To examine these impacts, we also applied the Weather Research and Forecasting (WRF) Model. Results from the observations show that air temperature, wind speed, and PBL height (PBLH) were lower, while dewpoint temperature and latent heat flux were higher, over irrigated areas than over nonirrigated areas. Findings suggest that entrainment-layer drying and differences in energy partitioning over irrigated and nonirrigated areas played an important role in PBL evolution. In the final hours of the day, the PBL collapsed faster over nonirrigated areas than over irrigated areas. The WRF Model simulations agree with these observations. They also show that the extent of irrigation [expressed as irrigation fraction (IF)] in an area affects the L–A response. Under ∼60% IF, the latent heat flux and mixing ratio reach their highest values while temperature and PBLH are at their lowest, and sensible heat flux is near its lowest value. Results are reversed for ∼2% IF. It is concluded that irrigation notably impacts L–A interactions and PBL evolution.
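As a point of reference for the energy partitioning contrasted above, the following sketch (Python; hypothetical flux values, not GRAINEX observations) computes the Bowen ratio and evaporative fraction commonly used to compare irrigated and nonirrigated surfaces.

```python
# Illustrative surface-energy-partitioning metrics (hypothetical fluxes,
# not GRAINEX data): Bowen ratio = H/LE, evaporative fraction = LE/(H + LE).
def bowen_ratio(H, LE):
    return H / LE

def evaporative_fraction(H, LE):
    return LE / (H + LE)

# Hypothetical midday sensible (H) and latent (LE) heat fluxes in W m-2
fluxes = {"irrigated": (120.0, 380.0), "nonirrigated": (280.0, 180.0)}
for surface, (H, LE) in fluxes.items():
    print(f"{surface:13s} Bowen ratio = {bowen_ratio(H, LE):.2f}, "
          f"evaporative fraction = {evaporative_fraction(H, LE):.2f}")
```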
Abstract
In the propagation from meteorological to hydrological drought, there are time-lag and step-abrupt effects, quantified in terms of propagation time and threshold, which play an important role in hydrological drought early warning. However, the seasonal drought propagation time and threshold, their dynamics, and the corresponding driving mechanisms remain unknown in a changing environment. To this end, the standardized precipitation index (SPI) and standardized runoff index (SRI) were used to characterize meteorological and hydrological droughts, respectively, and to determine the optimal propagation time. Then, a seasonal drought propagation framework based on a Bayesian network was proposed for calculating the drought propagation threshold with SPI. Finally, the seasonal dynamics and a preliminary attribution of the propagation characteristics were investigated based on a random forest model and correlation analysis. The results show that 1) a relatively short propagation time (less than 9 months) and a large propagation threshold (from −3.18 to −1.19) can be observed in the Toxkan River basin (subbasin II), especially for spring, indicating low drought resistance; 2) drought propagation time shows an extended trend in most seasons, while the drought propagation threshold displays an increasing trend in autumn and winter in the Aksu River basin (subbasins I–II), with the opposite characteristics in the Hotan and Yarkant River basins (subbasins III–V); and 3) the impacts of precipitation, temperature, potential evapotranspiration, and soil moisture on drought propagation dynamics are inconsistent across subbasins and seasons, noting that reservoirs serve as a buffer that regulates the propagation from meteorological to hydrological droughts. The findings of this study can provide scientific guidelines for watershed hydrological drought early warning and risk management.
Significance Statement
The aim of this study is to better understand how the delayed and step-abrupt effects of propagation from meteorological to hydrological drought can be characterized through propagation time and threshold. These response indicators reflect a catchment's resistance to meteorological and hydrological droughts and can help water resources management agencies mitigate hydrological droughts through measures such as storing water, augmenting supply, and reducing consumption. The findings of this study can provide scientific guidelines for watershed hydrological drought early warning and risk management.
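As a rough illustration of the propagation framework described above, the sketch below (Python, synthetic data; not the authors' code) computes an SPI series by mapping gamma-fitted precipitation accumulations onto a standard normal and then picks the accumulation window most correlated with a runoff index, one common way of defining the meteorological-to-hydrological propagation time.

```python
# Minimal sketch (synthetic data, not the authors' code): SPI from monthly
# precipitation, and the SPI accumulation window most correlated with a
# runoff index as a simple estimate of the propagation time.
import numpy as np
from scipy.stats import gamma, norm

def spi(series, window):
    """Standardized index: fit a gamma distribution to rolling sums and map
    their CDF onto a standard normal."""
    acc = np.convolve(series, np.ones(window), mode="valid")
    a, loc, scale = gamma.fit(acc, floc=0)
    cdf = np.clip(gamma.cdf(acc, a, loc=loc, scale=scale), 1e-6, 1 - 1e-6)
    return norm.ppf(cdf)

rng = np.random.default_rng(0)
monthly_p = rng.gamma(2.0, 30.0, size=240)                 # hypothetical precip (mm)
runoff = np.convolve(monthly_p, np.ones(3) / 3, "valid")   # hypothetical runoff proxy
sri1 = spi(runoff, 1)                                      # standardized runoff index

best = max(
    range(1, 13),
    key=lambda w: np.corrcoef(spi(monthly_p, w)[-200:], sri1[-200:])[0, 1],
)
print(f"optimal propagation time ~ {best} months (synthetic example)")
```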
Abstract
Atmospheric rivers (ARs) are defined as corridors of enhanced integrated water vapor transport (IVT) and produce large fractions of annual precipitation in regions with complex terrain along the western coastlines of midlatitude continents (e.g., 30%–50% along the U.S. West Coast in California). This study investigates this relationship among landfalling ARs, IVT, and watershed mean areal precipitation (MAP) for a 38-yr period over California. On average, the daily average IVT magnitude at different coastal locations explains ∼34% of the variance in annual watershed MAP across 140 Hydrologic Unit Code 8 (HUC-8) watersheds with large spatial variability across California. Further investigation of the IVT magnitude and direction at coastal locations illustrated that accounting for water vapor transport direction increases the explained variance in annual MAP to an average of 45%, with highest values (∼65%) occurring in watersheds over Northern and coastal California. Similar investigation of the lower-tropospheric water vapor flux vector at 850 and 925 hPa revealed further increases in the explained variance in annual MAP to an average of >50%. The results of this study 1) emphasize the importance of both IVT direction and water vapor flux altitude to watershed MAP, 2) align well with previous studies for select locations that highlight the importance of upslope (i.e., lower tropospheric) water vapor flux during landfalling ARs and precipitation, and 3) motivate the development of AR-related and watershed-centric forecast tools that incorporate IVT direction and water vapor flux altitude parameters in addition to IVT magnitude.
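For context, the sketch below (Python, synthetic values; not the study's code) shows how an IVT magnitude can be computed from a vertical profile of moisture and wind, and how "explained variance" follows from a simple regression of annual MAP on an IVT predictor.

```python
# Minimal sketch: integrated water vapor transport (IVT) from a profile, and
# explained variance (R^2) of annual MAP regressed on an IVT predictor.
# All numbers below are synthetic and purely illustrative.
import numpy as np

g = 9.81  # m s-2

def ivt_components(q, u, v, p):
    """Zonal/meridional IVT (kg m-1 s-1) from specific humidity q (kg/kg),
    winds u, v (m/s), and pressure p (Pa), ordered surface to top."""
    dp = -np.diff(p)                                          # layer thickness (Pa)
    qm, um, vm = (0.5 * (x[1:] + x[:-1]) for x in (q, u, v))  # layer means
    return np.sum(qm * um * dp) / g, np.sum(qm * vm * dp) / g

p = np.array([100000.0, 85000.0, 70000.0])   # three-level toy profile
q = np.array([0.010, 0.007, 0.004])
u = np.array([5.0, 10.0, 15.0])
v = np.array([15.0, 20.0, 25.0])
ivt_u, ivt_v = ivt_components(q, u, v, p)
print(f"IVT magnitude = {np.hypot(ivt_u, ivt_v):.0f} kg m-1 s-1")

# Explained variance of annual MAP given a coastal IVT predictor
rng = np.random.default_rng(1)
ivt_annual = rng.normal(250.0, 60.0, size=38)                 # kg m-1 s-1
map_annual = 3.0 * ivt_annual + rng.normal(0.0, 150.0, 38)    # mm
r = np.corrcoef(ivt_annual, map_annual)[0, 1]
print(f"explained variance R^2 = {r**2:.2f}")
```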
Abstract
Accurate and reliable forecasts of quickflow, including interflow and overland flow, are essential for predicting rainfall–runoff events that can wash off recently applied agricultural nutrients. In this study, we examined whether a gridded version of the Sacramento Soil Moisture Accounting model with Heat Transfer (SAC-HT) could simulate and forecast quickflow in two agricultural watersheds in east-central Pennsylvania. Specifically, we used the Hydrology Laboratory–Research Distributed Hydrologic Model (HL-RDHM) software, which incorporates SAC-HT, to conduct a 15-yr (2003–17) simulation of quickflow in the 420-km2 Mahantango Creek watershed and in WE-38, a 7.3-km2 headwater interior basin. We directly calibrated HL-RDHM using hydrologic observations at the Mahantango Creek outlet, while all grid cells within Mahantango Creek, including WE-38, were calibrated indirectly using scalar multipliers derived from the basin outlet calibration. Using the calibrated model, we then assessed the quality of short-range (24–72 h) deterministic forecasts of daily quickflow in both watersheds over a 2-yr period (July 2017–October 2019). At the basin outlet, HL-RDHM quickflow simulations showed low biases (PBIAS = 10.5%) and strong agreement (KGE″ = 0.81) with observations. At the headwater scale, HL-RDHM overestimated quickflow (PBIAS = 69.0%) to a greater degree, but quickflow simulations remained satisfactory (KGE″ = 0.65). When applied to quickflow forecasting, HL-RDHM produced skillful forecasts (>90% of Peirce and Gerrity skill scores above 0.5) at all lead times and significantly outperformed persistence forecasts, although skill gains in Mahantango Creek were slightly lower. Accordingly, short-range quickflow forecasts by HL-RDHM show promise for informing operational decision-making in agriculture.
Significance Statement
Daily runoff forecasts can alert farmers to rainfall–runoff events that have the potential to wash off recently applied fertilizers and manures. To gauge whether daily runoff forecasts are accurate and reliable, we used runoff monitoring data from a large agricultural watershed and one of its headwater tributaries to evaluate the quality of short-term runoff forecasts (1–3 days ahead) that were generated by a National Weather Service watershed model. Results showed that the accuracy and reliability of daily runoff forecasts generally improved in both watersheds as lead times increased from 1 to 3 days. Study findings highlight the potential for National Weather Service models to provide useful short-term runoff forecasts that can inform operational decision-making in agriculture.
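Two of the evaluation metrics quoted above are easy to state explicitly. The sketch below (Python, synthetic series) implements percent bias and, for brevity, the original Kling–Gupta efficiency of Gupta et al. (2009); the study itself reports a modified variant (KGE″).

```python
# Minimal sketch of two evaluation metrics (synthetic data, not study results).
import numpy as np

def pbias(sim, obs):
    """Percent bias; with this sign convention, positive means overestimation."""
    return 100.0 * np.sum(sim - obs) / np.sum(obs)

def kge(sim, obs):
    """Original Kling-Gupta efficiency (Gupta et al. 2009)."""
    r = np.corrcoef(sim, obs)[0, 1]
    alpha = np.std(sim) / np.std(obs)     # variability ratio
    beta = np.mean(sim) / np.mean(obs)    # bias ratio
    return 1.0 - np.sqrt((r - 1.0) ** 2 + (alpha - 1.0) ** 2 + (beta - 1.0) ** 2)

# Synthetic daily quickflow (mm) just to exercise the functions
rng = np.random.default_rng(2)
obs = rng.gamma(1.2, 2.0, size=365)
sim = obs * 1.1 + rng.normal(0.0, 0.5, size=365)
print(f"PBIAS = {pbias(sim, obs):.1f}%, KGE = {kge(sim, obs):.2f}")
```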
Abstract
The accurate prediction of surface soil moisture (SM) is crucial for understanding hydrological processes. Deep learning (DL) models such as the long short-term memory model (LSTM) provide a powerful method and have been widely used in SM prediction. However, few studies achieve notably high success rates, largely because such models lack prior knowledge in forms such as causality. Here we present a new causality-structure-based LSTM model (CLSTM) that can learn temporal interdependency and causality information for hydrometeorological applications. We applied and compared the LSTM and CLSTM methods for forecasting SM across 64 FLUXNET sites globally. The results showed that CLSTM dramatically increased predictive performance compared with LSTM. The Nash–Sutcliffe efficiency (NSE) indicated that more than 67% of sites showed an improvement in SM simulation larger than 10%. Notably, CLSTM had much better generalization ability and could adapt to extreme soil conditions, such as the SM response to drought and precipitation events. By incorporating causal relations, CLSTM increased predictive ability across different lead times compared to LSTM. We also highlight the critical role of physical information, in the form of a causality structure, in improving drought prediction. At the same time, CLSTM has the potential to improve predictions of other hydrometeorological variables.
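To make the reported skill numbers concrete, the sketch below (Python, synthetic soil-moisture series; not the authors' code) computes the Nash–Sutcliffe efficiency and one plausible way of expressing a per-site improvement of one model over another; the exact improvement definition used in the study is an assumption here.

```python
# Minimal sketch: NSE and a relative-improvement criterion (synthetic data).
import numpy as np

def nse(sim, obs):
    obs = np.asarray(obs, dtype=float)
    sim = np.asarray(sim, dtype=float)
    return 1.0 - np.sum((sim - obs) ** 2) / np.sum((obs - obs.mean()) ** 2)

def relative_improvement(nse_new, nse_old):
    """Fractional NSE gain of the new model over the baseline (assumed metric)."""
    return (nse_new - nse_old) / abs(nse_old)

# Synthetic soil-moisture series just to exercise the functions
rng = np.random.default_rng(3)
obs = 0.25 + 0.05 * np.sin(np.linspace(0, 6 * np.pi, 300)) + rng.normal(0, 0.01, 300)
sim_lstm = obs + rng.normal(0, 0.02, 300)
sim_clstm = obs + rng.normal(0, 0.01, 300)
gain = relative_improvement(nse(sim_clstm, obs), nse(sim_lstm, obs))
print(f"NSE improvement = {100 * gain:.1f}% (synthetic example)")
```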
Abstract
Global warming and anthropogenic activities have imposed noticeable impacts on rainfall pattern changes at both spatial and temporal scales in recent decades. Systematic diagnosis of rainfall pattern changes is urgently needed at spatiotemporal scales for a deeper understanding of how climate change produces variations in rainfall patterns. The objective of this study was to identify rainfall pattern changes systematically under climate change at a subcontinental scale along a rainfall gradient ranging from 1800 to 200 mm yr⁻¹ by analyzing centennial rainfall data covering 230 sites from 1910 to 2017 in the Northern Territory of Australia. Rainfall pattern changes were characterized by considering aspects of trends and periodicity of annual rainfall, abrupt changes, rainfall distribution, and extreme rainfall events. Our results illustrated that rainfall patterns in northern Australia have changed significantly compared with the early period of the twentieth century. Specifically, 1) a significant increasing trend in annual precipitation associated with greater variation in recent decades was observed over the entire study area, 2) temporal variations represented a mean rainfall periodicity of 27 years over wet to dry regions, 3) an abrupt change of annual rainfall amount occurred consistently in both humid and arid regions during the 1966–75 period, and 4) partitioned long-term time series of rainfall demonstrated a wetter rainfall distribution trend across coastal to inland areas that was associated with more frequent extreme rainfall events in recent decades. The findings of this study could facilitate further studies on the mechanisms of climate change that influence rainfall pattern changes.
Significance Statement
Characterizing long-term rainfall pattern changes under different rainfall conditions is important for understanding the impacts of climate change. We diagnosed centennial rainfall pattern changes across wet to dry regions in northern Australia and found that rainfall patterns have changed noticeably in recent decades. The entire region shows a consistent increasing trend in annual rainfall, along with greater variability. Meanwhile, the main shift in rainfall pattern occurred during 1966–75. Although annual rainfall shows a wetter, increasing trend, the greater frequency of extreme rainfall events should also be considered when assessing the impacts of climate change. The findings support further study to understand long-term rainfall pattern changes under climate change.
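The abstract does not name the statistical tests used, so purely as a generic illustration, the sketch below (Python, synthetic annual totals) applies a Mann–Kendall test, one common way to flag a significant trend in annual rainfall.

```python
# Generic illustration (not necessarily the study's method): Mann-Kendall
# trend test on a synthetic series of annual rainfall totals.
import numpy as np
from scipy.stats import norm

def mann_kendall(x):
    x = np.asarray(x, dtype=float)
    n = len(x)
    s = sum(np.sign(x[j] - x[i]) for i in range(n - 1) for j in range(i + 1, n))
    var_s = n * (n - 1) * (2 * n + 5) / 18.0          # no tie correction
    z = (s - np.sign(s)) / np.sqrt(var_s) if s != 0 else 0.0
    p = 2.0 * (1.0 - norm.cdf(abs(z)))
    return z, p

rng = np.random.default_rng(4)
annual_rain = 800 + 1.5 * np.arange(108) + rng.normal(0, 120, 108)  # synthetic (mm)
z, p = mann_kendall(annual_rain)
print(f"Mann-Kendall z = {z:.2f}, p = {p:.3f} (synthetic series)")
```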
Abstract
Seasonal forecasting of climatological variables is important for water- and climate-related decision-making. Dynamical models provide seasonal forecasts up to one year in advance, but direct outputs from these models need to be bias-corrected prior to application by end users. Here, five bias-correction methods are applied to precipitation hindcasts from ECMWF's fifth-generation seasonal forecast system (SEAS5). We apply each method in two distinct ways: first to the ensemble mean (scheme 1) and second to the individual ensemble members before deriving an ensemble mean (scheme 2). The performance of the bias-correction methods in both schemes is assessed relative to the simple average of the raw ensemble members as a benchmark. Results show that, in general, bias correction of individual ensemble members before deriving an ensemble mean (scheme 2) is most skillful for more frequent precipitation values, while bias correction of the ensemble mean (scheme 1) performs better for extreme high and low precipitation values. Irrespective of application scheme, all bias-correction methods improved precipitation hindcasts compared to the benchmark for lead times up to 6 months, with the best performance obtained at a one-month lead time in winter.
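The two application schemes can be made concrete with a simple empirical quantile mapping standing in for any of the five methods tested; the sketch below (Python, synthetic data; not the authors' code) corrects either the ensemble mean directly (scheme 1) or each member before averaging (scheme 2).

```python
# Minimal sketch of the two application schemes, using empirical quantile
# mapping as a stand-in bias-correction method (synthetic data).
import numpy as np

def quantile_map(fcst, train_fcst, train_obs):
    """Map a forecast value onto the observed climatology via empirical quantiles."""
    q = np.searchsorted(np.sort(train_fcst), fcst) / len(train_fcst)
    return np.quantile(train_obs, np.clip(q, 0.0, 1.0))

rng = np.random.default_rng(5)
train_obs = rng.gamma(2.0, 40.0, size=300)              # hypothetical monthly precip (mm)
train_fcst = 0.7 * train_obs + rng.normal(0, 10, 300)   # biased hindcasts
members = rng.normal(60.0, 15.0, size=25)               # one forecast, 25 members

# Scheme 1: correct the ensemble mean directly
scheme1 = quantile_map(members.mean(), train_fcst, train_obs)
# Scheme 2: correct each member, then average
scheme2 = np.mean([quantile_map(m, train_fcst, train_obs) for m in members])
print(f"scheme 1: {scheme1:.1f} mm, scheme 2: {scheme2:.1f} mm")
```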
Abstract
To understand and manage water systems under a changing climate and meet an increasing demand for water, a quantitative understanding of precipitation is most important in coastal regions. The capabilities of the Integrated Multi-satellitE Retrievals for Global Precipitation Measurement (IMERG) V06B product for precipitation quantification are examined over three coastal regions of the United States: the West Coast, the Gulf of Mexico, and the East Coast, all of which are characterized by different topographies and precipitation climatologies. A novel uncertainty analysis of IMERG is proposed that considers environmental and physical parameters such as elevation and distance to the coastline. The IMERG performance is traced back to its components, i.e., passive microwave (PMW), infrared (IR), and morphing-based estimates. The analysis is performed using high-resolution, high-quality Ground Validation Multi-Radar/Multi-Sensor (GV-MRMS) rainfall estimates as ground reference at the native resolution of IMERG of 30 min and 0.1°. IMERG Final (IM-F) quantification performance heavily depends on the respective contribution of PMW, IR, and morph components. IM-F and its components overestimate the contribution of light rainfall (<1 mm h⁻¹) and underestimate the contribution of high rainfall rates (>10 mm h⁻¹) to the total rainfall volume. Strong regional dependencies are highlighted, especially over the West Coast, where the proximity of complex terrain to the coastline challenges precipitation estimates. Other major drivers are the distance from the coastline, elevation, and precipitation types, especially over the land and coast surface types, that highlight the impact of precipitation regimes.
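The light- and heavy-rain volume contributions discussed above amount to binning rain rates and summing within each bin; the sketch below (Python, synthetic rates; not IMERG or GV-MRMS data) shows the calculation.

```python
# Minimal sketch (synthetic rain rates): fraction of total rainfall volume
# contributed by light (<1 mm/h) and heavy (>10 mm/h) rates.
import numpy as np

def volume_fraction(rates, lo=None, hi=None):
    rates = np.asarray(rates, dtype=float)
    mask = np.ones_like(rates, dtype=bool)
    if lo is not None:
        mask &= rates > lo
    if hi is not None:
        mask &= rates < hi
    return rates[mask].sum() / rates.sum()

rng = np.random.default_rng(6)
rain = rng.gamma(0.4, 2.5, size=10_000)   # synthetic 30-min rain rates (mm/h)
print(f"light (<1 mm/h): {volume_fraction(rain, hi=1.0):.2%} of volume")
print(f"heavy (>10 mm/h): {volume_fraction(rain, lo=10.0):.2%} of volume")
```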
Abstract
WRF-lake, a one-dimensional (1D) lake model widely used for coupling with the Weather Research and Forecasting (WRF) system and for modeling lake–atmosphere interactions, does not consider the heat exchange caused by inflow–outflow, which is an important characteristic of large reservoirs and can affect the energy budget and reservoir–atmosphere interactions. We evaluated the WRF-lake model by applying it to a large dimictic reservoir, Miyun Reservoir, in northern China. The results show that the WRF-lake model, though it ignores inflow–outflow, yields good surface water temperature simulations when reasonably parameterized. The Minlake model, which represents reservoir physics more completely, was used to test the effects of inflow–outflow, including the heat carried by inflow–outflow water exchange and water level change, on the 1D model's performance. The effect of the heat carried by inflow–outflow occurs mainly in summer and is negatively correlated with hydraulic residence time and positively correlated with the temperature difference between inflow and outflow. For a reservoir with a hydraulic residence time of 3 years and a summer inflow–outflow temperature difference of about 10°C, the heat carried by inflow–outflow is far less than the heat exchange through the surface (<2%) and therefore has little influence on the total energy balance. The effect of water level change is mainly on the latent and sensible heat fluxes per unit area rather than on outgoing longwave radiation. Although it influences temperatures in deep layers, water level change does not have a significant impact on surface temperature.
Significance Statement
The purpose of this study is to evaluate the applicability of WRF-lake, an important submodule of the Weather Research and Forecasting (WRF) system, to a large dimictic reservoir. This is important because WRF-lake does not consider the effects of inflow–outflow and water level change, which are important characteristics of large reservoirs and can affect the heat budget and reservoir–atmosphere interactions. The applicability of WRF-lake to large reservoirs with frequent inflow–outflow and water level change is of wide interest but has not been addressed in previous studies. Our research explored the applicability of WRF-lake to a large dimictic reservoir and quantitatively assessed the effects of inflow–outflow and water level change.
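A back-of-the-envelope calculation illustrates why the inflow–outflow heat term can be small relative to surface exchange when the residence time is long. The sketch below (Python) uses hypothetical reservoir numbers, not Miyun Reservoir data, chosen only to match the scales quoted above (3-yr residence time, ~10°C inflow–outflow temperature difference).

```python
# Back-of-the-envelope sketch (hypothetical reservoir geometry, not Miyun data):
# heat advected by inflow-outflow per unit surface area vs. an illustrative
# summer surface heat-exchange magnitude.
RHO, CP = 1000.0, 4186.0            # water density (kg/m3), specific heat (J/kg/K)
SECONDS_PER_YEAR = 3.15e7

volume = 3.0e9                      # hypothetical storage (m3)
area = 1.5e8                        # hypothetical surface area (m2)
residence_time_yr = 3.0             # hydraulic residence time (yr)
dT_inflow_outflow = 10.0            # summer inflow-outflow temperature difference (K)
surface_flux = 400.0                # illustrative surface heat-exchange magnitude (W/m2)

discharge = volume / (residence_time_yr * SECONDS_PER_YEAR)       # m3/s
advective_flux = RHO * CP * discharge * dT_inflow_outflow / area  # W/m2
print(f"advective heat flux ~ {advective_flux:.1f} W/m2 "
      f"({100 * advective_flux / surface_flux:.1f}% of {surface_flux:.0f} W/m2)")
```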