Abstract
The occurrence of water deficit is intensified in lowland soils, and generating information on its risk of occurrence is essential to avoid seed yield losses. The objective of this study was to determine the probability of water deficit in soybean cultivated in lowlands of the Vacacaí and Piratini River basins in the southern portion of Rio Grande do Sul, Brazil, as a function of sowing date. Soybean development was simulated for three sets of cultivars of relative maturity groups (RMG) delimited by 5.9–6.8, 6.9–7.3, and 7.4–8.0, with sowing dates at 10-day intervals over the period from 21 September to 31 December. Daily meteorological data were obtained from the Pelotas meteorological station for 1971–2017 and from the Santa Maria meteorological station for 1968–2017. Water deficit (mm) in the subperiods and over the whole soybean development cycle was obtained from the calculation of evapotranspiration and a daily sequential crop water balance. The water deficit data were subjected to probability distribution analysis, in which fits of the exponential, gamma, lognormal, normal, and Weibull probability density functions (pdfs) were tested using chi-square and Kolmogorov–Smirnov goodness-of-fit tests at a 10% significance level. The water deficit is lower in the Pelotas region than in Santa Maria. Sowings performed from 11 November onward in Santa Maria and from 1 November onward in Pelotas present the lowest risk of water deficit throughout the soybean cycle. The risk of water deficit in the beginning of flowering–beginning of seed (R1–R5) subperiod decreases when soybean is sown from the beginning of November onward.
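As an editorial illustration of the distribution-screening step, the sketch below fits the five candidate pdfs to a hypothetical array of water deficit totals with SciPy and applies the Kolmogorov–Smirnov test at the 10% level; the data values and the maximum-likelihood fitting choice are assumptions, not the authors' exact procedure (the chi-square test mentioned in the abstract is omitted here).

```python
# Fit candidate pdfs to water deficit totals and screen each fit with a
# Kolmogorov-Smirnov test at the 10% significance level. Data are illustrative.
import numpy as np
from scipy import stats

deficits = np.array([12.0, 35.5, 8.2, 60.1, 27.4, 44.9, 18.3, 72.6])  # mm, hypothetical

candidates = {
    "exponential": stats.expon,
    "gamma": stats.gamma,
    "lognormal": stats.lognorm,
    "normal": stats.norm,
    "weibull": stats.weibull_min,
}

for name, dist in candidates.items():
    params = dist.fit(deficits)                       # maximum-likelihood fit
    d_stat, p_value = stats.kstest(deficits, dist.name, args=params)
    verdict = "accepted" if p_value > 0.10 else "rejected"
    print(f"{name:11s} D={d_stat:.3f} p={p_value:.3f} -> {verdict}")
```

Once a pdf is accepted, the probability of exceeding a given deficit for a sowing date follows directly as one minus the fitted cumulative distribution at that value.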
Abstract
Sustainable management of biodiversity requires a thorough understanding of local climate and weather, particularly in areas where ecosystems have been degraded and where life is highly adapted to or dependent on narrow ecological niches. Furthermore, the society, economy, and culture of urban agglomerations are directly affected by the quality and quantity of services provided by adjacent ecosystems, which makes knowledge of the regional characteristics and impact of climate variability crucial. Here, we present precipitation data from six meteorological stations spread across several orographic zones of the eastern Andes in the surroundings of Bogotá, Colombia’s biggest urban agglomeration. The rainfall time series are analyzed statistically, examined for cyclicity in relation to ENSO, and correlated with the multivariate El Niño–Southern Oscillation index (MEI). The results reveal no conclusive ENSO-related cycles but show that the data of most stations are marked by annual or semiannual cyclicity. There is no straightforward correlation between MEI and monthly precipitation values, and neither filtered nor lagged values show any conclusive and significant correlation. Stations within the same orographic zone do not necessarily yield comparable statistical results. The temporal and spatial properties of precipitation appear to result from micro- and mesoscale topoclimates rather than from ENSO variability.
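To make the two analyses concrete, the sketch below computes lagged Pearson correlations between a synthetic MEI series and synthetic monthly precipitation, plus a periodogram to look for 12- and 6-month cycles; the random inputs and the 0–12-month lag range are illustrative assumptions, not the study's configuration.

```python
# Lagged correlation of monthly precipitation with MEI, and a periodogram
# to check for annual (12-month) or semiannual (6-month) cyclicity.
import numpy as np
import pandas as pd
from scipy import signal, stats

rng = np.random.default_rng(0)
n = 240  # 20 years of monthly values, synthetic
precip = pd.Series(rng.gamma(2.0, 40.0, n))
mei = pd.Series(rng.normal(0.0, 1.0, n))

# Does MEI lead precipitation by 0-12 months?
for lag in range(13):
    r, p = stats.pearsonr(mei[: n - lag], precip[lag:])
    print(f"lag {lag:2d} months: r={r:+.2f} (p={p:.3f})")

# Periodogram of the demeaned series; frequencies are in cycles per month.
freqs, power = signal.periodogram(precip - precip.mean())
periods = 1.0 / freqs[1:]  # skip the zero frequency
print(f"dominant period: {periods[np.argmax(power[1:])]:.1f} months")
```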
Abstract
The Colorado River basin (CRB) supplies water to approximately 40 million people and is essential to hydropower generation, agriculture, and industry. In this study, a monthly water balance model is used to compute hydroclimatic water balance components (i.e., potential evapotranspiration, actual evapotranspiration, and runoff) for the period 1901–2014 across the entire CRB. The time series of monthly runoff is aggregated to compute water-year runoff and then used to identify drought periods in the basin. For the 1901–2014 period, eight basinwide drought periods were identified. The driest drought period spanned years 1901–04, whereas the longest drought period occurred during 1943–56. The eight droughts were primarily driven by winter precipitation deficits rather than warm temperature anomalies. In addition, an analysis of prehistoric drought for the CRB—computed using tree-ring-based reconstructions of the Palmer drought severity index—indicates that during some past centuries drought frequency was higher than during the twentieth century and that some centuries experienced droughts that were much longer than those during the twentieth century. More frequent or longer droughts than those that occurred during the twentieth century, combined with continued warming associated with climate change, may lead to substantial future water deficits in the CRB.
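A minimal sketch of the water-year aggregation and drought-flagging logic follows, assuming monthly runoff in a pandas Series and a simple "consecutive below-median water years" rule; the study's actual drought definition may differ.

```python
# Aggregate monthly runoff to water years (Oct-Sep) and flag runs of
# consecutive below-median years as candidate drought periods.
import numpy as np
import pandas as pd

rng = np.random.default_rng(1)
idx = pd.date_range("1900-10-01", "2014-09-30", freq="MS")
runoff = pd.Series(rng.gamma(2.0, 5.0, len(idx)), index=idx)  # mm/month, synthetic

# A water year is labeled by the calendar year in which it ends.
water_year = runoff.index.year + (runoff.index.month >= 10).astype(int)
annual = runoff.groupby(water_year).sum()

below = annual < annual.median()
run_id = (below != below.shift(fill_value=False)).cumsum()  # contiguous runs
for _, run in annual[below].groupby(run_id[below]):
    if len(run) >= 2:  # assumed minimum drought length
        print(f"drought: {run.index[0]}-{run.index[-1]} ({len(run)} yr)")
```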
Abstract
Irrigation can modify local weather and regional climate by repartitioning water among the surface, soil, and atmosphere, with the potential to drastically change the terrestrial energy budget in agricultural areas. This study uses local observations, satellite remote sensing, and numerical modeling to 1) explore whether irrigation has historically impacted summer maximum temperatures in the Columbia Plateau, 2) characterize the current extent of irrigation impacts on soil moisture (SM) and land surface temperature (LST), and 3) better understand the downstream extent of irrigation’s influence on near-surface temperature, humidity, and boundary layer development. Analysis of historical daily maximum temperature (TMAX) observations showed that the three Global Historical Climatology Network (GHCN) sites downwind of Columbia Basin Project (CBP) irrigation experienced statistically significant cooling of the mean summer TMAX by 0.8°–1.6°C in the post-CBP period (1968–98) as compared with the pre-CBP-expansion period (1908–38), opposite the background climate signal. Remote sensing observations from more recent years show wetter soil (~18%–25%) and cooler land surface temperatures over the irrigated areas. Simulations using NASA’s Land Information System (LIS) coupled to the Weather Research and Forecasting (WRF) Model support the historical analysis, confirming that under the most common summer wind flow regime, irrigation cooling can extend as far downwind as the locations of these stations. Taken together, these results suggest that irrigation expansion may have contributed to a reduction in summertime temperatures and heat extremes within and downwind of the CBP area, supporting a regional impact of irrigation across the study area.
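A sketch of the pre/post historical comparison, assuming one summer (JJA) mean TMAX value per year in a data frame; Welch's t-test stands in here for whatever significance test the study actually applied.

```python
# Compare mean summer TMAX between the pre-CBP (1908-38) and post-CBP
# (1968-98) periods at one station, screened with Welch's t-test.
import numpy as np
import pandas as pd
from scipy import stats

rng = np.random.default_rng(2)
years = np.arange(1908, 1999)
tmax = pd.DataFrame({
    "year": years,
    "summer_tmax": rng.normal(31.0, 1.2, len(years)),  # deg C, synthetic
})

pre = tmax.loc[tmax.year.between(1908, 1938), "summer_tmax"]
post = tmax.loc[tmax.year.between(1968, 1998), "summer_tmax"]

t_stat, p_value = stats.ttest_ind(pre, post, equal_var=False)
print(f"pre-CBP mean {pre.mean():.2f} C, post-CBP mean {post.mean():.2f} C, "
      f"difference {post.mean() - pre.mean():+.2f} C (p={p_value:.3f})")
```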
Abstract
Woody plant cover, the area of the vertical projection of woody plants (trees, shrubs, and bushes), plays an important role in the structure and function of savanna ecosystems and is needed by the savanna modeling community. Recent problems facing savanna ecosystems, such as woody plant encroachment and subsequent habitat fragmentation, further underscore the relevance of regional-scale and even larger-scale woody plant cover mapping. The mixture of woody plants and herbaceous vegetation in savanna landscapes makes woody plant cover well suited to a fractional representation. This study develops a simple and reliable approach for fractional woody plant cover mapping in savanna ecosystems, tested in the savanna of central Texas, which features a wide gradation in woody plant density. A multiple linear regression model was calibrated between orthophoto-based fractional woody plant cover and metrics derived from time series MODIS products of surface reflectance (MOD09A1) and fraction of photosynthetically active radiation (MOD15A2H). By applying this model, woody plant cover was extrapolated to the Texas savanna at the MODIS scale (500 m). Validation suggests a mean absolute error of 0.098 and an R-squared value of 0.60. This study demonstrates a potential approach for woody plant cover mapping in other savanna ecosystems of the world and highlights the utility of time series MODIS products in estimating savanna woody plant cover.
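The calibration-validation workflow might look like the following scikit-learn sketch, with random placeholder features standing in for the MODIS reflectance and FPAR metrics; the 70/30 split and clipping of predictions to [0, 1] are assumptions for illustration.

```python
# Calibrate a multiple linear regression of fractional woody cover on
# MODIS-derived metrics, then validate with MAE and R^2.
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.metrics import mean_absolute_error, r2_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(3)
n = 500  # calibration pixels, synthetic
X = rng.random((n, 4))  # placeholders for reflectance/FPAR time series metrics
y = np.clip(0.3 * X[:, 0] + 0.5 * X[:, 1] + rng.normal(0, 0.08, n), 0, 1)

X_cal, X_val, y_cal, y_val = train_test_split(X, y, test_size=0.3, random_state=0)
model = LinearRegression().fit(X_cal, y_cal)
pred = np.clip(model.predict(X_val), 0, 1)  # fractional cover stays in [0, 1]

print(f"MAE = {mean_absolute_error(y_val, pred):.3f}")
print(f"R^2 = {r2_score(y_val, pred):.2f}")
```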
Abstract
Clouds can modify terrestrial productivity by reducing total surface radiation and increasing diffuse radiation, which may be more evenly distributed through plant canopies and increase ecosystem carbon uptake (the “diffuse fertilization effect”). Previous work at ecosystem-level observational towers demonstrated that diffuse photosynthetically active radiation (PAR; 400–700 nm) increases with cloud optical thickness (COT) until a COT of approximately 10, defined here as the “low-COT regime.” To identify whether the low-COT regime also influences carbon uptake on broader spatial and longer temporal scales, we use global, monthly data to investigate the influence of COT on carbon uptake in three land-cover types: shrublands, forests, and croplands. While there are limitations in global gross primary production (GPP) products, global COT data derived from the Moderate Resolution Imaging Spectroradiometer (MODIS) reveal that during the growing season tropical and subtropical regions experience a monthly low-COT regime more frequently (>20% of the time) than other regions of the globe. Contrary to ecosystem-level studies, comparisons of monthly COT with monthly satellite-derived solar-induced chlorophyll fluorescence and modeled GPP indicate that, although carbon uptake generally increases with COT under the low-COT regime, the correlations between COT and carbon uptake are insignificant (p > 0.05) in shrublands, forests, and croplands at regional scales. When scaled globally, vegetated regions under the low-COT regime account for only 4.9% of global mean annual GPP, suggesting that clouds and their diffuse fertilization effect become less significant drivers of terrestrial carbon uptake at broader spatial and temporal scales.
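A sketch of the regional screening, assuming monthly COT and GPP arrays for one region: the COT ≤ 10 cutoff follows the abstract's low-COT regime, while the synthetic data and the choice of a Pearson correlation are assumptions.

```python
# Correlate monthly COT with monthly GPP only where the low-COT regime
# (COT <= 10) holds, and check significance at p = 0.05.
import numpy as np
from scipy import stats

rng = np.random.default_rng(4)
cot = rng.gamma(2.0, 4.0, 120)                     # monthly COT, synthetic
gpp = 2.0 + 0.05 * cot + rng.normal(0, 0.8, 120)   # g C m-2 day-1, synthetic

low = cot <= 10.0                                  # "low-COT regime" mask
r, p = stats.pearsonr(cot[low], gpp[low])
print(f"low-COT months: {low.sum()} of {low.size}")
print(f"r = {r:.2f}, {'significant' if p <= 0.05 else 'insignificant'} (p = {p:.3f})")
```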
Abstract
North Alabama is among the most tornado-prone regions in the United States and comprises more spatially variable terrain and land cover than the frequently studied North American Great Plains region. Because of the high tornado frequency observed across north Alabama, there is a need to understand how land surface roughness heterogeneity influences tornadogenesis, particularly for weak-intensity tornadoes. This study investigates whether horizontal gradients in land surface roughness exist surrounding locations of tornadogenesis for weak (EF0–EF1) tornadoes. Such horizontal gradients could generate positive values of the vertical component of the 3D vorticity vector near the surface, which may aid the tornadogenesis process. In this study, surface roughness was estimated using parameterizations from the Noah land surface model with inputs from MODIS 500-m and Landsat 30-m data. Spatial variations in the parameterized roughness lengths were assessed using GIS-based grid and quadrant pattern analyses to quantify the observed variation of land surface features surrounding tornadogenesis locations across spatial scales. This analysis determined that statistically significant horizontal gradients in surface roughness exist surrounding tornadogenesis locations.
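A sketch of a quadrant-style analysis, assuming a grid of parameterized roughness lengths centered on a tornadogenesis point; the window size, cell size, and summary statistics are illustrative, not the study's exact GIS procedure.

```python
# Compare mean roughness length in the four quadrants around a genesis point
# and estimate the horizontal roughness gradient magnitude across the window.
import numpy as np

rng = np.random.default_rng(5)
z0 = rng.lognormal(mean=-2.5, sigma=0.8, size=(33, 33))  # m, e.g., 500-m cells

c = z0.shape[0] // 2  # genesis point at the grid center
quadrants = {
    "NE": z0[:c, c + 1:], "NW": z0[:c, :c],
    "SE": z0[c + 1:, c + 1:], "SW": z0[c + 1:, :c],
}
for name, q in quadrants.items():
    print(f"{name}: mean z0 = {q.mean():.3f} m")

gy, gx = np.gradient(z0)  # finite-difference gradients along rows/columns
print(f"mean |grad z0| = {np.hypot(gx, gy).mean():.4f} m per cell")
```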
Abstract
Landslide event inventories are a vital resource for landslide susceptibility and forecasting applications. However, landslide inventories can vary in accuracy, availability, and timeliness as a result of varying detection methods, reporting, and data availability. This study presents an approach that uses publicly available satellite data and open-source software to automate a landslide detection process, the Sudden Landslide Identification Product (SLIP). SLIP utilizes optical data from the Landsat-8 Operational Land Imager sensor, elevation data from the Shuttle Radar Topography Mission, and precipitation data from the Global Precipitation Measurement mission to create a reproducible and spatially customizable landslide identification product. The SLIP software applies change-detection algorithms to identify areas of new bare-earth exposure that may be landslide events. The study also presents a precipitation monitoring tool that runs alongside SLIP, the Detecting Real-Time Increased Precipitation (DRIP) model, which helps to identify the timing of potential landslide events detected by SLIP. Used together, SLIP and DRIP improve landslide detection by reducing the accuracy, availability, and timeliness problems that are prevalent in current landslide detection methods. A case study and validation exercise in Nepal were performed for images acquired between 2014 and 2015. Preliminary validation results suggest 56% model accuracy, with errors of commission often resulting from newly cleared agricultural areas. These results suggest that SLIP is an important first attempt at an automated framework for medium-resolution regional landslide detection, although it requires refinement before being fully realized as an operational tool.
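A SLIP-style decision rule might look like the sketch below: a new bare-earth exposure is flagged where red reflectance rises sharply against a pre-event composite, post-event NDVI is low, and terrain slope is steep. The thresholds are illustrative assumptions, not the published SLIP parameters.

```python
# Flag candidate landslide pixels from a relative red-band increase, a low
# post-event NDVI, and a steep SRTM-derived slope. All rasters are synthetic.
import numpy as np

rng = np.random.default_rng(6)
shape = (200, 200)
red_before = rng.uniform(0.05, 0.15, shape)  # pre-event red composite
red_after = np.clip(red_before + rng.normal(0.0, 0.05, shape), 0.01, 1.0)
ndvi_after = rng.uniform(0.1, 0.8, shape)
slope_deg = rng.uniform(0.0, 45.0, shape)    # from SRTM elevation

red_change = (red_after - red_before) / red_before  # relative red increase
candidate = (red_change > 0.4) & (ndvi_after < 0.4) & (slope_deg > 20.0)
print(f"flagged pixels: {candidate.sum()} ({100 * candidate.mean():.2f}%)")
```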
Abstract
Trends and transitions in the growing-season normalized difference vegetation index (NDVI) from the Moderate Resolution Imaging Spectroradiometer (MODIS) satellite sensor at 250-m resolution were analyzed for the period from 2000 to 2018 to understand recent patterns of vegetation change in ecosystems of the Arctic National Wildlife Refuge (ANWR) in Alaska. Statistical analysis of changes in the NDVI time series was conducted using the breaks for additive seasonal and trend (BFAST) method. This structural change analysis indicated that NDVI breakpoints and negative 18-yr trends in vegetation greenness since 2000 could be explained in large part by the impacts of severe wildfires. At least one NDVI breakpoint was detected in around 20% of the MODIS pixels within both the Porcupine River and Coleen River basins of the study area. The vast majority of vegetation cover in the ANWR Brooks Range and coastal plain ecoregions showed no (positive or negative) growing-season NDVI trend since 2000. Results suggested that most negative NDVI anomalies in the 18-yr MODIS record have been associated with early spring thawing and elevated levels of surface moisture in low-elevation drainages of the northern ANWR ecoregions.
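BFAST itself is an R method; as a greatly simplified analogue, the sketch below scans a synthetic growing-season NDVI series for the single break that minimizes the combined residual error of two linear trend segments, which conveys the flavor of the breakpoint detection without reproducing BFAST's seasonal decomposition.

```python
# Single-breakpoint scan: choose the split that minimizes the summed squared
# residuals of two least-squares linear trend segments. Data are synthetic.
import numpy as np

rng = np.random.default_rng(7)
t = np.arange(19)                        # 19 growing seasons, 2000-2018
ndvi = 0.65 + 0.002 * t + rng.normal(0, 0.01, t.size)
ndvi[10:] -= 0.15                        # e.g., a severe-fire greenness drop

def segment_sse(x, y):
    slope, intercept = np.polyfit(x, y, 1)  # least-squares linear trend
    return np.sum((y - (slope * x + intercept)) ** 2)

best = min(
    (segment_sse(t[:k], ndvi[:k]) + segment_sse(t[k:], ndvi[k:]), k)
    for k in range(3, t.size - 2)        # keep >= 3 points per segment
)
print(f"break after season index {best[1]} (year {2000 + best[1]})")
```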
Abstract
The Iowa Atmospheric Observatory was established to better understand the unique microclimate characteristics of a wind farm. The facility consists of a pair of 120-m towers identically instrumented to observe basic landscape–atmosphere interactions in a highly managed agricultural landscape. The towers, one within and one outside of a utility-scale low-density-array wind farm, are equipped to measure vertical profiles of temperature, wind, moisture, and pressure and can host specialized sensors for a wide range of environmental conditions. Tower measurements during the 2016 growing season demonstrate the ability to distinguish microclimate differences created by single or multiple turbines from natural conditions over homogeneous agricultural fields. Microclimate differences between the two towers are reported as contrasts in normalized wind speed, normalized turbulence intensity, potential temperature, and water vapor mixing ratio. Differences are analyzed according to conditions of no wind farm influence (i.e., no wake) versus wind farm influence (i.e., waked flow) with distance downwind from a single wind turbine or a large group of turbines. Differences are also determined for more specific atmospheric conditions according to thermal stratification. Results demonstrate agreement with most, but not all, currently available numerical flow-field simulations of large wind farm arrays and of individual turbines. In particular, the well-documented higher nighttime surface temperature in wind farms is examined in vertical profiles that confirm this effect to be a “suppression of cooling” rather than a warming process. A summary is provided of how the wind farm boundary layer differs from the natural boundary layer derived from concurrent measurements over the summer of 2016.
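A sketch of the paired-tower contrasts, assuming 10-min wind statistics from the reference and waked towers in a data frame; the stability flag, column layout, and all values are hypothetical stand-ins for the tower measurements.

```python
# Contrast normalized wind speed and turbulence intensity between the waked
# and reference towers, stratified by a simple stability flag.
import numpy as np
import pandas as pd

rng = np.random.default_rng(8)
n = 1000  # 10-min averaging periods, synthetic
obs = pd.DataFrame({
    "u_ref": rng.weibull(2.0, n) * 7.0,    # reference-tower wind speed, m/s
    "sigma_ref": rng.uniform(0.3, 1.2, n),
    "u_wake": rng.weibull(2.0, n) * 6.3,   # waked-tower wind speed, m/s
    "sigma_wake": rng.uniform(0.4, 1.5, n),
    "stable": rng.random(n) < 0.4,          # nighttime/stable-stratification flag
})

obs["speed_ratio"] = obs.u_wake / obs.u_ref   # normalized wind speed
obs["ti_ref"] = obs.sigma_ref / obs.u_ref     # turbulence intensity
obs["ti_wake"] = obs.sigma_wake / obs.u_wake

print(obs.groupby("stable")[["speed_ratio", "ti_ref", "ti_wake"]].mean())
```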