Water management activities modify water fluxes at the land surface and affect water resources in space and time. Conventional understanding of the role of water management suggests that regulated river flow should be less sensitive to future climate conditions than natural flow in terms of the absolute changes in mean monthly flows. In this study the authors evaluate this assumption by redefining sensitivity as the difference in the emergence of changes in cumulative distribution functions (CDFs) of future regulated and natural flows in response to climate change, each with respect to its respective historical flow conditions. The emergence of changes (shifts in CDFs) in natural and regulated river flow regimes across the western United States was compared using simulations driven by multiple climate models and scenarios. Forty percent of the Hydrologic Unit Code 4 (HUC4) regions over the western United States might perceive such a shift in seasonal regulated flow earlier than they would under natural flow conditions, although the absolute change is smaller than under natural conditions. About 10% of the regulated HUC4 regions see a delay and are therefore less sensitive to climate change. In spring (MAM), the overall sensitivity tends to decrease as the level of river regulation increases, as expected. However, in winter (DJF) and summer (JJA), the sensitivity tends to increase with increasing levels of regulation, albeit with smaller magnitudes of change than under natural conditions. The results could inform integrated assessment studies when designing adaptation strategies in the water–energy–food nexus.
The spatiotemporal patterns of water resources are expected to shift under climate change, especially in snow-dominated regions because of the near-surface warming caused by increases in greenhouse gas emissions (IPCC 2014). Barnett et al. (2005) investigated global snowmelt-controlled river basins and suggested that at least one-sixth of the global population that lives in these regions will experience a serious reduction in dry-season water availability within the next few decades. At the regional scale, Payne et al. (2004) and Hamlet et al. (2010) simulated twenty-first century hydrologic changes in the Columbia River basin and suggested that snow accumulations would be reduced due to rising temperatures; hence, summer flows would be systematically shifted to winter flows. Van Rheenen et al. (2004) assessed future water resources in the Sacramento and San Joaquin River basins and suggested that late spring snowpack reductions in the late twenty-first century would greatly reduce streamflow in winter, spring, and summer, especially in Southern California. Over longer (e.g., millennial) time scales, water resources in some Rocky Mountain headwater basins have also been found to be profoundly nonstationary (Fleming and Sauchyn 2013).
In addition to climate change, water management activities such as irrigation and flow regulation can shift the patterns of water resources. Most of these activities are associated with dams and reservoirs, which are constructed to support a wide range of uses, including flood control, irrigation, navigation, recreation, and hydropower. These reservoirs have greatly reshaped streamflow in many of the world’s rivers. Vörösmarty and Sahagian (2000), for instance, showed that water now takes several months longer to reach the mouths of some of the largest rivers than it did prior to dam construction. Water management activities also change the water storage distribution throughout river basins. Zhou et al. (2016) analyzed reservoir storage in 32 global basins and found that in some heavily dammed basins (e.g., the Yellow River basin), seasonal reservoir storage variations account for nearly 70% of natural storage variations (soil moisture plus snow water equivalent). Water management activities increase the complexity of water resources distributions in space and time and modulate the sensitivity of flow regimes to future climate changes. Sensitivities of managed water resources to climate change are typically evaluated with respect to the ability of reservoir operations to maintain performance targets, that is, flood control, and decreased seasonality and interannual variability for stable water supply, hydropower, recreation, navigation, etc. (Van Rheenen et al. 2004; Christensen et al. 2004; Vano et al. 2010; Georgakakos et al. 2014; AghaKouchak et al. 2015; Nazemi et al. 2017). For example, the Third National Climate Assessment report (Georgakakos et al. 2014) evaluates how water management mitigates more frequent and intense extreme events (droughts, floods). There is therefore a common understanding that reservoir operations mitigate climate change–induced natural flow changes and decrease the flow sensitivity.
However, the report also emphasizes that water management in many U.S. regions will face challenges unseen before to an extent that cannot be addressed by current practices.
Water management stakeholders are investigating how to invest in future activities, which motivates research on the water–energy–food nexus (e.g., Howells et al. 2013; FAO 2014; Adam et al. 2015; Hejazi et al. 2015; Kraucunas et al. 2015). Climate change adaptation strategies are designed under the assumption that there will be continuous investments from energy, water, and food stakeholders. In practice, these same stakeholders must decide whether to invest and when to start investing. The timing of the shift in the projected regulated flow regime with respect to its own history has not been investigated and could inform integrated assessment research when representing regional adaptation strategies.
The tipping point at which a shift in a time series rises above the noise of historical variability can be described as the emergence of significant change in the cumulative distribution functions (CDFs). This concept has mainly been applied to climatic variables such as mean temperature and precipitation (e.g., Battisti and Naylor 2009; Diffenbaugh and Scherer 2011; Mahlstein et al. 2011; Hawkins and Sutton 2012) or extremes (e.g., Maraun 2013; Scherer and Diffenbaugh 2014). Recently, it has been adopted for analyzing shifts in hydrological patterns due to climate change (Kroll et al. 2015). For example, Leng et al. (2016) investigated the emergence of significant changes in spatially distributed natural surface runoff over the conterminous United States (CONUS) across 31 general circulation models (GCMs) and four emission scenarios (Brekke et al. 2014). They found that significant changes in seasonal runoff are projected to occur after the 2040s across most of the country, but that the emergence of new hydrologic regimes—specifically, reduced summer runoff due to enhanced evaporative demands combined with increased winter runoff caused by more liquid precipitation—will occur earlier in much of the western United States. However, the emergence of changes in both natural and regulated flows in the context of climate change has not been addressed.
In this study, our scientific goal is to assess the emergence of change in regulated flow, understand regional differences, and determine its relationship, if any, with the level of water management. We tested two hypotheses to achieve this goal. The first is that water management activities will decrease the sensitivity to climate change (delay the emergence of significant change) relative to how changes would be perceived under natural flow conditions (e.g., Leng et al. 2016). The second is that the level of sensitivity varies with the level of water management. Understanding the combinations of natural characteristics and human operations under which the sensitivity of the emergence of change might be exacerbated or alleviated could help assess the effectiveness of mitigation strategies that use an integrated modeling framework and could help quantify the tipping points.
We leveraged previous research on the emergence of change in runoff over the CONUS by using a large-scale river routing and water management model. We focused on the western United States, a water-management-intensive region associated with large reservoir capacities for flood control (e.g., in the Columbia River basin), water supply (e.g., in the Colorado River basin and California) given the relatively high irrigation demand, and high hydroelectricity generation potential (e.g., in the Columbia River basin). Because we focus on flow rather than spatially distributed runoff, our analyses were performed on simulated streamflow at the Hydrologic Unit Code 4 (HUC4) level to investigate the overall impacts of water management. The locations of the modeled reservoirs and the outlet of each HUC4 region used for the analysis are marked in Fig. 1.
2. Data and method
a. Climate and runoff projection data
Two representative concentration pathway (RCP) scenarios (RCP4.5 and RCP8.5) were selected to force future river runoff modeling. They reflect anthropogenic radiative forcing reaching 4.5 and 8.5 W m−2 by the end of the twenty-first century, respectively (Moss et al. 2010). Three models (CCSM4, GFDL CM3, and INM-CM4.0) from the Coupled Model Intercomparison Project phase 5 (CMIP5; Taylor et al. 2012) were selected to account for uncertainties in projected future precipitation and temperature across different models for each RCP. CCSM4 tends to be middle of the range and has been used in previous relevant analyses (Voisin et al. 2013b; Hejazi et al. 2015). The GFDL CM3 and INM-CM4.0 models were chosen because they generally exhibit warmer/wetter and cooler/drier conditions, respectively. Figure 2 shows that these three models span much of the range of the CMIP5 projections of future temperature and precipitation over our study domain (introduced in section 2d). Our analyses were based on all three GCMs and two RCP scenarios, but only RCP8.5 is presented in the plots. Additional plots with RCP4.5 can be found in the online supplemental material.
The spatially distributed runoff fields over the western United States for these three GCMs and two RCPs were obtained from the Bureau of Reclamation database (Brekke et al. 2014; available at ftp://gdo-dcp.ucllnl.org/pub/dcp/archive/cmip5/hydro/). To produce these runoff fields, the CMIP5 climate projections were first statistically downscaled to 0.125° × 0.125° resolution and bias corrected against the observed climate using a bias correction and spatial disaggregation (BCSD) approach (Wood et al. 2004). This approach ensures that the bias-corrected CMIP5 monthly climatology is the same as the observations over the historical period (1950–99). The BCSD CMIP5 climate forcings, including precipitation and temperature, were then used to drive the Variable Infiltration Capacity (VIC) model (Liang and Lettenmaier 1994) to simulate the projected runoff fields. The VIC model is a spatially distributed, physically based, macroscale hydrological model that has been widely used for hydrologic modeling at regional and global scales (e.g., Wood et al. 2004; Christensen and Lettenmaier 2006; Vano et al. 2015; Zhou et al. 2016). These projected runoff fields were evaluated over the CONUS basins and have been found to have a high level of consistency in terms of the seasonal mean and variability compared with the historical VIC simulations (Wood and Mizukami 2014).
b. Water management model
The Model for Scale Adaptive River Transport (MOSART; Li et al. 2013) is a river routing model and was used to simulate future “natural” river flow. The large-scale water management (WM) model (Voisin et al. 2013a), which is coupled with MOSART, simulates “regulated” river flow through the representation of spatially distributed water extractions based on local water demands and flow regulations at dams. The water demand that cannot be met locally at each grid cell is allocated across reservoirs based on elevation constraints, reservoir capacity, and water availability. For the flow regulation module, each reservoir’s operating rules are calibrated based on historical long-term mean monthly inflow and associated water demand, reservoir characteristics, and reservoir purpose (flood control, irrigation, combination of flood control and irrigation, other). The regulation includes minimum release for environmental flow (Voisin et al. 2013a). Reservoir operations have a monthly pattern that is shifted to represent interannual variability in water availability. Releases are further adjusted at a daily time scale (or shorter) to account for constraints imposed by reservoir characteristics (minimum storage, maximum capacity, minimum flow release). The WM model has been shown to represent seasonal variations in regulated river flows across the United States (Voisin et al. 2013a,b, 2017; Hejazi et al. 2015) and is similar to other large-scale water management applications in terms of handling water withdrawal and dam operations (van Vliet et al. 2012; Biemans et al. 2011; Döll et al. 2012; Hanasaki et al. 2006). Readers are referred to Voisin et al. (2013a) for details about the model structure. Note that the objective of the analysis is to explore regional trends in the emergence of change in regulated flow. We anticipate that the HUC4 basin-specific accuracy might change when using a more complex and operational water management model, which should be used for local decision-making.
The simplified macroscale water management model used here represents the overall seasonal features of river regulation and allows such an analysis to be performed with a consistent model throughout the extended domain.
The MOSART–WM model was forced by the projected runoff fields at 0.125° × 0.125° spatial resolution described above at a daily time step and by gridded water demand from the integrated Global Change Assessment Model (GCAM; Hejazi et al. 2015). GCAM water demands have been calibrated and evaluated with respect to the demands reported by the U.S. Geological Survey (USGS; Kenny et al. 2009). GCAM is a partial equilibrium, integrated assessment model that considers six water demand sectors (irrigation, livestock, municipal, electricity generation, primary energy, and manufacturing) in addition to a number of socioeconomic variables such as population, labor productivity, and technology (Edmonds and Reilly 1985; Edmonds et al. 1997; Kim et al. 2006).
Because our focus in this paper is on how water management changes the sensitivity of water resources to climate change, not to socioeconomic change, which is another large source of uncertainty (Voisin et al. 2016), we chose to fix the water demands at year 2010 levels, the latest initialization year in the current GCAM data system (http://jgcri.github.io/gcam-doc/). This fixed demand approach is consistent with other watershed-scale climate change impact assessment studies of western U.S. water resources (e.g., Christensen et al. 2004; Van Rheenen et al. 2004; Payne et al. 2004; Vano et al. 2010), where water demands reflect prior appropriation water rights. Seasonal reservoir operations were calibrated to historical natural flow and 2010 water demands and remained unchanged over the simulation period in all experiments. No new water management activities (e.g., adaptations to water stress, dam removal, dam retrofit) were added to any of the simulations because doing so was beyond the scope of this study.
The MOSART–WM model was run at a daily time step from 1986 to 2095 after a 5-yr spinup period. The streamflow was extracted from the HUC4 outlets (Fig. 1) and aggregated to monthly time steps for the analysis.
c. Evaluation of the modeling framework
The performance of routed natural (MOSART) and regulated (MOSART–WM) flows was evaluated by comparing each with naturalized and observed flows at the outlets of five major HUC2 regions in the western United States where naturalized flows are available (Table 1). In regions that cover multiple river basins, gauges were selected based on the availability of naturalized flow (e.g., the Sacramento River for California). The naturalized flow was obtained from Maurer and Wood (2002) and Livneh et al. (2013). The observed flow was downloaded from the USGS gauge data website (http://waterdata.usgs.gov/nwis). The monthly naturalized and observed flow time series were averaged over 20 years from 1977 to 1997 in the five regions to generate seasonal water supply curves for the historical period and were compared with MOSART and MOSART–WM simulation results over the same period (Fig. 3, left). Generally speaking, the seasonality of routed flow from MOSART, which ultimately reflects the projected runoff fields, reproduces the seasonality and annual water balance of the naturalized flow reasonably well. The MOSART–WM regulated flow simulations, which account for water regulation and extraction due to human activities, show a decrease in annual flow and a reduction in the seasonal variability of the flow climatology, consistent with the difference between observed regulated and naturalized flows. To quantify the ability of the MOSART–WM model to capture the overall reshaping of seasonal flow patterns due to water management activities, the right panels of Fig. 3 show the percent change from natural to regulated monthly flows (i.e., the flow alteration due to human activities, ΔQ) for both observations and simulations in the five regions where naturalized flow was available, calculated as follows:

ΔQ = (Q_r − Q_n) / Q̄_n × 100%,

where Q_r is the regulated flow, Q_n is the natural flow, and Q̄_n is the mean natural flow. In general, the MOSART–WM model captures the overall seasonality and magnitude of flow alterations due to water withdrawals and dam operations in the study region.
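The alteration metric just described can be sketched in a few lines. The function name (`flow_alteration`) and the synthetic monthly flows below are illustrative assumptions, not the paper's data:

```python
import numpy as np

def flow_alteration(q_reg, q_nat):
    """Percent flow alteration due to water management: the regulated-minus-
    natural difference for each month, normalized by the mean natural flow."""
    q_reg = np.asarray(q_reg, dtype=float)
    q_nat = np.asarray(q_nat, dtype=float)
    return (q_reg - q_nat) / q_nat.mean() * 100.0

# Synthetic example: regulation stores part of the freshet (month 3)
# and releases it in other months.
natural = np.array([10.0, 20.0, 60.0, 40.0, 15.0, 10.0])
regulated = np.array([15.0, 25.0, 40.0, 35.0, 20.0, 15.0])
dq = flow_alteration(regulated, natural)  # negative during the freshet
```

Normalizing by the mean natural flow (rather than each month's own flow) keeps low-flow months from dominating the alteration curve.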
d. Spatial scale of the analysis
Water management redistributes resources in space and time with high spatial heterogeneity (Voisin et al. 2017). Therefore, an analysis at the HUC2 scale could dilute the sensitivity of managed water resources (e.g., regulated flow) to climate change. For the analysis, we use 104 HUC4 hydrologic units over the western United States based on the USGS Watershed Boundary Dataset (https://nhd.usgs.gov/wbd.html; Fig. 1).
We regionalize the interpretation of the HUC4 results by HUC2 river basins to indicate spatial and hydroclimate characteristics. We also interpret the results at the HUC4 scale based on the level of water management. A regulation factor (RF) was used to evaluate the level of water management on the streamflow:
RF = CVn / CVr,

where CVn and CVr are the coefficients of variation (i.e., the ratio of the standard deviation σ to the mean μ: CV = σ/μ) of the monthly natural and regulated flow over the 1986–2015 period, respectively. Out of 107 HUC4 regions across our study domain, 42 had RF values less than or equal to 1, indicating that water management had limited or no impact on reducing the seasonal variations in natural flow; they were therefore excluded from our analysis (gray areas in Fig. 1). These HUC4 watersheds are located in arid areas (e.g., the Great Basin) or in less populated areas without major water management facilities (e.g., headwater subbasins in the Missouri and Arkansas basins).
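As a minimal sketch, RF can be computed directly from the two monthly series; the function name and synthetic flows are our own illustrative assumptions:

```python
import numpy as np

def regulation_factor(q_nat, q_reg):
    """RF = CVn / CVr: ratio of the coefficients of variation (std/mean)
    of monthly natural and regulated flow. RF > 1 means regulation
    dampens the seasonal variability of the natural flow."""
    q_nat = np.asarray(q_nat, dtype=float)
    q_reg = np.asarray(q_reg, dtype=float)
    cv_n = q_nat.std() / q_nat.mean()
    cv_r = q_reg.std() / q_reg.mean()
    return cv_n / cv_r

# Synthetic example: regulation reduces the seasonal amplitude 4-fold
# around the same mean flow, so RF = 4.
months = np.arange(12)
cycle = np.sin(2.0 * np.pi * months / 12.0)
natural = 100.0 + 80.0 * cycle
regulated = 100.0 + 20.0 * cycle
rf = regulation_factor(natural, regulated)
```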
e. Detecting emergence of change
To study the emergence of new hydrologic regimes, we evaluated the emergence of change in regulated flow using the two-sample Kolmogorov–Smirnov (K–S) test, similar to the analysis of Leng et al. (2016) for projected natural runoff over the CONUS.
The two-sample K–S test has been widely applied in climate research as a goodness-of-fit tool that compares two empirical distribution functions (Von Storch and Zwiers 1999). The empirical cumulative distribution functions (eCDFs) of the historical and future time series are calculated as follows:

F_n(x) = (1/n) Σ_{i=1}^{n} I(x_i ≤ x),

where F_n(x) is the eCDF of the time series of variable x, I is the indicator function (equal to 1 when x_i ≤ x and 0 otherwise), and n is the number of elements in the sample.
The distance between the eCDFs of two samples is calculated as follows:

D_{m,n} = sup_x |F_{1,m}(x) − F_{2,n}(x)|,

where m and n are the sample sizes of the two time series, F_{1,m} and F_{2,n} are their eCDFs, and sup is the supremum function. The null hypothesis of the K–S test is that the two samples are drawn from the same distribution; it is rejected at a significance level α if

D_{m,n} > c(α) √[(m + n)/(mn)],

where c(α) is the inverse of the Kolmogorov distribution at α. The smaller the value of D_{m,n}, the more similar the distributions of the two samples. In this study, a significant change was considered detected when the null hypothesis was rejected at α = 0.05.
The K–S test was applied to simulated natural and regulated flows at the HUC4 outlets in the study domain. The emergence of change in the annual/seasonal mean flow was analyzed at an annual time step using a 30-yr base period from 1986 to 2015 and a 30-yr sliding window centered on each testing year from 2016 to 2081 (Fig. 4). For each testing year, the K–S test was performed to determine whether there was a significant change. To ensure that the emergence of significant change represents a continuous, systematic shift in the distribution, we define the change time as the year after which 90% of the K–S tests are rejected (significant change). For time series with no significant change over the testing period, the flow regime is treated as not sensitive to climate change, and the time of emergence is set to 2081 (the maximum) for analysis purposes, because 2081 is the center of the most distant 30-yr moving window (2066–95) used for the test.
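The detection procedure can be sketched as follows, assuming annual (or seasonal-mean) flow values aligned with a year vector. The function name and the synthetic series are our own assumptions, and `scipy.stats.ks_2samp` stands in for the two-sample K–S test:

```python
import numpy as np
from scipy.stats import ks_2samp

def emergence_year(flow, years, base=(1986, 2015),
                   test_years=range(2016, 2082), window=30, alpha=0.05):
    """Change time: first testing year after which at least 90% of the
    K-S tests (fixed base period vs. a 30-yr window centered on each
    testing year) reject the null hypothesis; 2081 if no sustained shift."""
    flow, years = np.asarray(flow, dtype=float), np.asarray(years)
    base_sample = flow[(years >= base[0]) & (years <= base[1])]
    rejected = []
    for ty in test_years:
        win = flow[(years >= ty - window // 2) & (years <= ty + window // 2 - 1)]
        rejected.append(ks_2samp(base_sample, win).pvalue < alpha)
    rejected = np.asarray(rejected)
    for i, ty in enumerate(test_years):
        if rejected[i:].mean() >= 0.9:
            return ty
    return max(test_years)  # treated as not sensitive to climate change

# Synthetic check: a large shift imposed in 2040 emerges before 2040,
# because the centered 30-yr window already samples post-shift years.
years = np.arange(1986, 2096)
rng = np.random.default_rng(42)
flow = rng.normal(0.0, 1.0, years.size) + np.where(years >= 2040, 5.0, 0.0)
change_year = emergence_year(flow, years)
```

The centered window is why a change can be detected before the calendar year of the underlying shift, and why 2081 (the center of the 2066–95 window) is the latest assignable emergence year.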
Figure 4 illustrates our application of the K–S test to natural and regulated flow time series. The year in which the null hypothesis of the K–S test is rejected (i.e., the two CDFs differ) represents the emergence of change with respect to the historical period under natural or regulated conditions. Given that the emergence of change is a perceived change in flow regime with respect to a previous period, the difference in the emergence of change between regulated and natural flow conditions allows us to assess whether regulated flow is more or less sensitive to climate change than its corresponding natural flow. Hereafter, we define sensitivity (to climate change) as the difference in the emergence of change between regulated and natural flows, where a positive (negative) difference indicates a later (earlier) emergence of change in regulated flow with respect to natural flow.
The hypotheses are that regulated flow will be less sensitive to climate change (later emergence of change) than natural flow and that the sensitivity is a function of the level of water management. Given the evaluation of the modeling framework and the flattening effect of water management on flow seasonality (section 2d), the shifts in the CDFs (emergence of change) of regulated flow might occur earlier than under natural conditions, even though conventional understanding holds that the changes in flow magnitude should be smaller. In the results below, we first compare the impact of climate change on monthly natural and regulated flows. This first comparison quantifies the magnitude of the changes being evaluated by the K–S test for both natural and regulated flows. We then analyze the emergence of change in both natural and regulated flows by region and season and then by level of water management.
3. Results
a. Evaluation of the projected mean monthly streamflow changes
We evaluated the changes in mean monthly streamflow for the same five regions shown in Fig. 3. Here we extracted the 30-yr averaged monthly flow from 2036 to 2065 (representing the flow around 2050) at each HUC4 outlet under CCSM4 RCP8.5 conditions and compared it with the flow from 1986 to 2015 (the historical period in the K–S test). The absolute changes under natural and regulated conditions relative to the historical period were compared for each month using box plots of the HUC4 flows (Fig. 5). The results clearly showed that future changes are more prominent under natural conditions than under regulated conditions in almost every month across these regions, indicating that water management reduces the streamflow response to future climate change in terms of the monthly mean (the conventional understanding), but not necessarily the distribution. The same analysis based on the other two GCMs and under RCP4.5 conditions showed similar patterns with slightly different magnitudes of change.
b. Emergence of change on natural and regulated flow
The first hypothesis, that water management reduces the sensitivity in the emergence of change (i.e., delays the emergence of change), needs to be evaluated because the lower variability in the regulated flow climatology caused by water management activities might challenge it. Here we focus on the RCP8.5 scenario because it is the worst-case scenario in terms of greenhouse gas emissions: it corresponds to the highest increase in air temperature and therefore the largest decrease in snowpack, which should affect both natural and regulated flows. The results based on RCP4.5 are reported in the text, and related figures are provided in the supplemental material. The K–S test evaluation was performed on the monthly streamflow at each selected HUC4 outlet at the mean annual and seasonal (MAM, JJA, SON, and DJF) time scales for all three GCMs.
The comparisons between the emergences of change in natural and regulated flows were performed at the HUC4 scale and analyzed by HUC2 region in order to evaluate potential regional variability (Fig. 6). In the upper Colorado, lower Colorado, and California regions, the regulated flows are more sensitive than the natural flows in most seasons, as we hypothesized. Note that those results do not take into consideration changes in water demand, which was held at the 2010 level in this analysis (discussed in detail in section 4). However, in the Arkansas and Texas-Gulf regions, regulated flow shows earlier changes (more sensitive) than natural flow in most seasons, which is the opposite of our hypothesis. In other basins, our hypothesis holds only for certain seasons. For example, in the Missouri region, earlier emergence of change was detected in the regulated MAM flows in all three GCMs, while no obvious alteration was found in the annual mean and other seasonal flows. In the Pacific Northwest region, flow regulation reduces the sensitivity in most seasons except SON. These results indicate that water management activities play complex roles at various spatiotemporal scales in managing the change and emergence of change in natural flow due to climate change. The results also imply that the sensitivity has strong seasonality across the testing areas. The results based on RCP4.5 yield the same pattern (Fig. S1).
Our next analysis focused on seasonal variations of the sensitivity across the entire western United States. Here we calculated the differences in the emergence of change (averaged over the three GCMs) between regulated and natural streamflow at each HUC4 outlet and ranked them using Weibull plotting positions (Fig. 7). The results revealed that sensitivities in the emergence of change differ between the annual and seasonal means. At the annual time scale, the sensitivity (delay in years caused by regulation) is negative in 40% and positive (our hypothesis) in about 10% of the HUC4 regions. This implies an earlier shift of the distribution in regulated flow, albeit associated with a smaller change than in the natural flow. At the seasonal time scale, these two fractions change to 20% and 55% in MAM, 30% and 50% in JJA, 40% and 25% in SON, and 30% and 40% in DJF. This seasonal increase in the number of HUC4 regions with positive sensitivity indicates that operations associated with reservoir storage characteristics can absorb changes in natural flow beyond the point at which change would emerge under natural conditions. This might be because in most of the regions reservoirs can store the seasonal snowmelt freshet and thus alleviate the anomalies induced by seasonal climate changes.
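The ranking underlying such an empirical curve can be sketched with Weibull plotting positions, p = i/(n + 1); the paper does not state its plotting formula, so this choice is an assumption, and the sensitivity values below are invented for illustration:

```python
import numpy as np

def weibull_positions(values):
    """Sort values ascending and pair each with its Weibull plotting
    position p = i / (n + 1), giving an empirical non-exceedance curve."""
    v = np.sort(np.asarray(values, dtype=float))
    p = np.arange(1, v.size + 1) / (v.size + 1.0)
    return v, p

# Invented sensitivities (years of delay, regulated minus natural):
sens = np.array([-12.0, -5.0, -3.0, 0.0, 2.0, 4.0, 7.0, 9.0, 15.0, 20.0])
v, p = weibull_positions(sens)
frac_negative = np.mean(sens < 0)  # fraction of basins with earlier emergence
```

Reading the fraction of basins below zero off such a curve is how statements like "negative in 40% of the HUC4 regions" can be derived.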
The spatial distribution of the seasonal sensitivities (Fig. 8) features basin-scale patterns. The rationale might be that HUC4 regions within the same river basin have similar hydrometeorological features and share the total irrigation demand. A large contrast is observed between the Missouri River basin and the Pacific Northwest region (mostly the Columbia River basin). In the Missouri region, most of the HUC4 regions experience negative sensitivity to climate change in JJA, SON, and DJF, but positive sensitivity in MAM, while the Columbia River basin shows negative sensitivity in SON and positive sensitivity in all other seasons. Other basins also show spatial and seasonal patterns, but generally in the Arkansas, Rio Grande, lower Colorado, Great Basin, and Texas-Gulf regions there is little difference in the emergences of change between regulated and natural flows, indicating that current operations yield a significant change in regulated flow at about the same time as in natural flow. Conversely, the Pacific Northwest, upper Colorado, California, and Missouri regions show negative sensitivities in one or more seasons, implying that a significant change in regulated flow is triggered before the change in natural flow itself emerges. We applied the same analysis for the RCP4.5 scenario (see Fig. S2) and detected similar spatial patterns but smaller impacts over the western United States. This indicates that the RCP scenarios change the magnitude but not the direction of the impacts on the sensitivities. These results suggest that our first hypothesis (see section 3b) is rejected in most of the basins over the western United States, except for some areas in specific seasons. These findings lead to our second hypothesis, that the level of regulation can affect the sensitivity.
c. Regulated flow sensitivity to climate change related to the level of regulation
To test our second hypothesis, we use the RF value introduced in section 2d to represent the level of regulation applied to the streamflow at each HUC4 outlet. The sensitivities were averaged over the three GCMs and ranked for each season with respect to ascending RF values (Fig. 9). A Mann–Kendall test was then applied for each season to determine whether there is a significant trend in the sensitivity (p values noted in Fig. 9). With an increasing level of water management, a significant positive trend was detected in the MAM sensitivities at α = 0.05, implying that regulated flow in spring does not respond significantly until the change in natural flow exceeds the level at which change emerges. In other words, regulated flow is less sensitive to climate change than natural flow in the spring. Moreover, there seems to be a correlation between the level of water management and the level of change in flow for which water management can compensate. However, the JJA and DJF sensitivities have significant negative trends at α = 0.05, featuring positive (negative) sensitivities for low (high) levels of water management. We note that for the top 25% of HUC4 subbasins with the highest level of water management, the DJF and JJA sensitivities are mostly negative and drive the annual sensitivities. Counterintuitively, for very large storage capacities, regulated flow tends to be more sensitive than natural flow. With larger storage capacities, the seasonality and interannual variability are most controlled and flattened out, so that small variations can lead to this counterintuitive result. The representation of complex cascading effects and a more accurate representation of operating rules could affect the level of sensitivity found here. Nevertheless, our results suggest that complex reservoir systems could reach the limits of their operational flexibility to adapt to anticipated changes.
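The trend test can be implemented compactly. This is a standard two-sided Mann–Kendall sketch (ties are ignored in the variance term for brevity); the paper does not document its exact implementation:

```python
import numpy as np
from scipy.stats import norm

def mann_kendall(x):
    """Two-sided Mann-Kendall trend test. Returns the S statistic and the
    p value from the normal approximation (no tie correction)."""
    x = np.asarray(x, dtype=float)
    n = x.size
    # S: number of concordant minus discordant pairs over all i < j.
    s = sum(np.sign(x[j] - x[i]) for i in range(n - 1) for j in range(i + 1, n))
    var_s = n * (n - 1) * (2 * n + 5) / 18.0
    if s > 0:
        z = (s - 1) / np.sqrt(var_s)  # continuity correction
    elif s < 0:
        z = (s + 1) / np.sqrt(var_s)
    else:
        z = 0.0
    return s, 2.0 * (1.0 - norm.cdf(abs(z)))

# A monotonically increasing sensitivity sequence gives a strong trend.
s_stat, p_val = mann_kendall(np.arange(20.0))
```

A trend would be declared significant when the returned p value falls below α = 0.05, matching the threshold used for Fig. 9.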
It remains beyond the scope of this analysis to provide quantitative measures of operational flexibility itself. Nevertheless, linking the level of regulation to the timing of the emergence of change sheds light on the need to assess future changes in the water–energy nexus using an integrated approach and provides guidance for designing regional adaptation strategies to cope with such changes. The results under the RCP4.5 scenario (Fig. S2) show similar trends but lower sensitivities overall.
In this study, we quantified the projected emergence of changes in regulated flow regimes over the western United States in the twenty-first century under historical operations using 2010 water demands. We evaluated those emergences of change relative to those under natural flow conditions. The differences in the emergence of changes reflect the sensitivity of regulated flow to climate change: a positive (negative) sensitivity, that is, a later (earlier) emergence of change under regulated flow, implies that regulated flow is less (more) sensitive to climate change than natural flow (with respect to different changes). The approach has direct implications for understanding when water management practices might need to be adjusted and how changes in natural flow cascade when water resources are managed. It can also inform the assumptions of integrated assessment models about regional water–energy–food adaptation strategies: regions might not invest at the same time, which would affect the evaluation of adaptation strategies. We also acknowledge that the statistical method (K–S test) applied here focuses on shifts in the flow distribution and is not the only approach to detect flow alterations (Fleming and Sauchyn 2013; Kroll et al. 2015). Approaches that focus on other aspects (e.g., peak flow, flow variability) may yield different results.
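The detection logic can be illustrated with a short sketch: a moving window of future flows is compared against the historical sample with a two-sample K–S test, and the emergence year is the first year at which the distributions differ significantly. The flow series, window length, and start year below are synthetic assumptions for illustration only, not the study's data or exact procedure.

```python
import numpy as np
from scipy.stats import ks_2samp

def year_of_emergence(hist, future, start_year, window=30, alpha=0.05):
    """First year at which a moving window of `future` flows differs
    significantly (two-sample K-S test) from the historical sample."""
    for i in range(len(future) - window + 1):
        stat, p = ks_2samp(hist, future[i:i + window])
        if p < alpha:
            return start_year + i + window - 1  # end year of the window
    return None  # no emergence within the record

rng = np.random.default_rng(0)
hist = rng.normal(100.0, 10.0, size=30)            # historical seasonal flows
trend = np.linspace(0.0, 40.0, 80)                 # imposed climate signal
future = rng.normal(100.0, 10.0, size=80) + trend  # future flows with a shift

toe_nat = year_of_emergence(hist, future, start_year=2020)
# Sensitivity = ToE(regulated) - ToE(natural); a positive value means the
# shift emerges later under regulation, i.e., regulated flow is less sensitive.
```

Applying the same function to a regulated-flow series and differencing the two emergence years yields the sensitivity defined above.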
Overall, we found that 40% of the HUC4 subbasins over the western United States have a negative sensitivity at the annual time scale, while only 10% have a positive sensitivity. There are large seasonal variations, with the majority of HUC4 subbasins tending to have a positive sensitivity in MAM. The hypothesis that water management over the western United States has enough flexibility to alleviate changes in natural flow is thus only valid at the seasonal time scale for MAM and is rejected at the annual time scale. Furthermore, we note that HUC4 subbasins with a higher degree of management tend to have negative summer and winter sensitivities, implying that highly managed flow is more sensitive to climate change.
To investigate how flow regulation changes the sensitivity to climate-driven changes in natural flow, we extracted the mean DJF flow at the Columbia River basin and Missouri River basin outlets from the simulation driven by CCSM4 under the RCP8.5 scenario and evaluated the general trend of regulated flow relative to natural flow. In the Columbia River basin (Fig. 10a), natural flow increases in DJF over the twenty-first century because of climate change, while flow regulation tends to release water from reservoirs in DJF to accommodate spring flood storage (Voisin et al. 2013a). With more natural streamflow in DJF and a smaller spring flood (Elsner et al. 2010), reservoirs fill faster but do not fill entirely before April or May. Therefore, the sensitivities are positive because the reservoir storage allows operators to alleviate changes in natural flow.
In the Missouri River basin, the situation is different; sensitivities are negative in DJF but slightly positive in MAM (see Fig. 8). Figure 10b shows that the DJF flow is projected to increase slightly under water management, at a faster rate than under natural conditions, which reflects flood control operations and is balanced by decreasing regulated flow in MAM. The coordination of operations between reservoirs is presently oversimplified in the WM model. Combined with a number of large reservoir projects along the main stem, water management appears to be more sensitive to a change in natural flow; a significant change (early emergence of change) occurs before natural flow reaches a significant change of its own. Here we highlight a source of uncertainty in our results for basins with higher levels of water management (a high ratio of storage to mean annual flow) when coordination between reservoirs is not well represented in the modeling. The sensitivities might be overestimated over complex systems, although the complexity of reservoir operations is itself a source of reduced operational flexibility. Understanding the uncertainty in sensitivities due to uncertainties in the representation of operations and in the flexibility of operations for adaptation is beyond the scope of this analysis because it would require basin-specific studies. This analysis highlights the need to work at the seasonal time scale and the HUC4 spatial scale, but with regional interpretation of the level of water management and the interactions between water management infrastructures, to understand the nonlinear response of managed water resources to changes in natural water availability under climate change.
We have presented results based on the RCP8.5 and RCP4.5 scenarios. Large uncertainties also exist in the emergence of changes in runoff depending on the GCM used (Leng et al. 2016). We selected two other GCMs to complement CCSM4, which bound the projected changes in precipitation and temperature in the CMIP5 archive. Using the emergence of change in natural flow driven by CCSM4 under the RCP8.5 scenario as a baseline, we present the deviations in the emergence of change in natural flow when using different RCPs (RCP4.5 and RCP8.5), different GCMs (GFDL CM3 and INM-CM4.0), and with and without flow regulation (natural and regulated) (Fig. 11). The results suggest that the deviations related to the choice of GCM (>60 years in the Missouri River basin) are larger than the deviations related to emission scenarios (>40 years in the Pacific Northwest) over all regions of the western United States. Deviations related to the representation of water management in the modeling are generally smaller than those related to GCMs and RCP scenarios but remain of the same order of magnitude, which motivates further investigation of the role of water management, that is, its overall impact and also its effect on the timing of the perception of change, in climate adaptation studies.
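The attribution of deviations to each source of uncertainty amounts to simple differencing against the CCSM4/RCP8.5 natural-flow baseline. The sketch below illustrates the bookkeeping; all emergence years are hypothetical placeholders chosen only to reproduce the qualitative ordering (GCM > RCP > water management) reported above, not values from the study.

```python
# Hypothetical emergence years (illustrative values, not from the study)
toe = {
    ("CCSM4", "RCP8.5", "natural"): 2045,   # baseline configuration
    ("CCSM4", "RCP4.5", "natural"): 2070,   # vary the emission scenario
    ("GFDL CM3", "RCP8.5", "natural"): 2032,   # vary the GCM
    ("INM-CM4.0", "RCP8.5", "natural"): 2090,  # vary the GCM
    ("CCSM4", "RCP8.5", "regulated"): 2038,    # add flow regulation
}

baseline_key = ("CCSM4", "RCP8.5", "natural")
baseline = toe[baseline_key]
deviations = {k: v - baseline for k, v in toe.items() if k != baseline_key}

# Spread attributable to each factor, holding the others at baseline values
gcm_spread = max(abs(toe[("GFDL CM3", "RCP8.5", "natural")] - baseline),
                 abs(toe[("INM-CM4.0", "RCP8.5", "natural")] - baseline))
rcp_spread = abs(toe[("CCSM4", "RCP4.5", "natural")] - baseline)
wm_spread = abs(toe[("CCSM4", "RCP8.5", "regulated")] - baseline)
```

With these placeholder values, `gcm_spread > rcp_spread > wm_spread`, mirroring the ranking of uncertainty sources in Fig. 11.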
Structural uncertainties and limitations of the water management module used in this study include the lack of representation of groundwater withdrawals (e.g., in the upper Snake River basin), interbasin transfers (e.g., in the Colorado River basin), and feedbacks into the hydrologic model through changes in evapotranspiration affecting crop water demand and local climate. There are also inherent uncertainties in the evolution of water demand under future conditions (Hejazi et al. 2015; Nazemi and Wheater 2015), which tends to drive long-term changes in supply deficit and regulated flow, while changes in runoff tend to drive higher-frequency changes such as reservoir releases for flood control and environmental regulations (Voisin et al. 2016). In this study, we used a fixed 2010 level of water management in the WM model to simulate future regulated flow. Water demand is projected to increase and could potentially challenge water management more than climate change does (Voisin et al. 2016; Georgakakos et al. 2014). We left projections of water demand out of this analysis because such projections are highly uncertain. It is worth mentioning, however, that they could further challenge the hypothesis that water management can delay the emergence of change, thereby strengthening our conclusion that shifts in regulated flow regimes could occur earlier than those in natural flow regimes. We also acknowledge that the reservoir operations represented in MOSART–WM are relatively simple, focusing on the management of seasonality, and therefore differ from actual operational rules. For more specific estimates of the emergence of changes, one would run operational water resources management models with scenarios for projected inflows and the associated water demands across sectors, operating policies, and constraints.
We have captured the uncertainties in RCPs and GCMs in our analysis while holding fixed the water demand and the hydrology model parameterizations. The water management model is calibrated with respect to mean monthly simulated flow. To further explore uncertainty in our experiment, we adjusted the calibration of the water management model to future (2050) flow conditions (mean flow from 2036 to 2065) while keeping the 2010 level of water demand. Under those adjusted operating rules, the emergence of change in regulated flow tended to occur at a later date than under historical operating rules (Fig. S4). The overall results for the differences in the emergence of change between regulated and natural flow were, however, consistent.
Even though water management tends to mitigate the magnitude of the impact of climate change on natural flow, 40% of the HUC4 subbasins over the western United States could perceive an emergence of change earlier than under natural flow conditions, although the change in regulated flow is smaller than that in natural flow. The sensitivity varies remarkably across seasons, and we find that it is related to the level of water management. Summer and winter regulated flows tend to show increasing sensitivity to climate change as the level of regulation increases, driven by the flattening of the flow regime. This analysis constitutes the first large-scale investigation of the emergence of change in regulated flow and reveals regional diversity as well as a relationship to the level of water management. The findings have implications for water–energy–food nexus research and motivate the need to better understand regional interconnections. They could also inform integrated assessment research by accounting for those regional differences when designing, incorporating, and evaluating water–energy–food nexus adaptation strategies.
The integrated hydrologic simulations were conducted under the Laboratory Directed Research and Development Program at Pacific Northwest National Laboratory, a multiprogram national laboratory operated by Battelle for the U.S. Department of Energy under Contract DE-AC05-76RL01830. The regional analysis and writing of the paper were supported by the U.S. Department of Energy, Office of Science, as part of the Integrated Assessment Research Programs.
Supplemental information related to this paper is available at the Journals Online website: https://doi.org/10.1175/JHM-D-17-0095.s1.
Current affiliation: Environmental Change Institute, University of Oxford, Oxford, United Kingdom.