A drought monitoring system (DMS) can help to detect and characterize drought conditions and reduce adverse drought impacts. The authors evaluate how a DMS for Washington State, based on a land surface model (LSM), would perform. The LSM represents current soil moisture (SM), snow water equivalent (SWE), and runoff over the state. The DMS incorporates the standardized precipitation index (SPI), standardized runoff index (SRI), and soil moisture percentile (SMP) taken from the LSM. Four historical drought events (1976–77, 1987–89, 2000–01, and 2004–05) are reconstructed using DMS indicators of SPI/SRI-3, SPI/SRI-6, SPI/SRI-12, SPI/SRI-24, SPI/SRI-36, and SMP, with monthly updates, in each of the state’s 62 Water Resource Inventory Areas (WRIAs). The authors also compare drought triggers based on DMS indicators with the evolution of drought conditions and management decisions during the four droughts. The results show that the DMS would have detected the onset and recovery of drought conditions, in many cases, up to four months before state declarations.
Droughts can cause significant economic losses that reach all levels of society. Between 1980 and 2005, droughts and heat waves in the United States inflicted an estimated $174 billion (2009 U.S. dollars) in damages (Lott and Ross 2006). Since 1963, 46 federal drought declarations have been made across the United States (see http://www.fema.gov/news/disasters.fema). Despite its water-abundant reputation, the state of Washington has experienced two major statewide droughts in the last decade (2000–01 and 2004–05). Both droughts resulted in large economic losses: about $359 million and $542 million (2009 dollars), respectively (Fontaine and Steinemann 2009).
Future water availability in the state is projected to decline, owing in part to global warming and resultant declines in snowpack (Barnett et al. 2008; Mote et al. 2005, 2008; Elsner et al. 2010). This suggests an increased likelihood of future droughts and a need to shift drought management strategies from reactive to proactive (Wilhite 2000). Proactive drought management systems depend on timely and accurate information about the evolution of drought conditions and water supply outlooks (Hayes et al. 2004). Drought indicators are one element of proactive strategies that can detect and characterize drought conditions. Drought triggers, or specific values of drought indicators, can represent classes of drought severity and be linked to drought responses to reduce impacts (Steinemann 2003).
In addition to the characterization of drought based on hydrologic variables [e.g., precipitation, runoff, soil moisture (SM)], drought can also be characterized by its temporal extent and persistence. The impact of drought depends not only on the indicator but also on the potential uses of water over the time period and region of interest. For example, SM deficiencies with duration as short as one month can result in severe impacts on agricultural production if they occur during times of maximum crop water use. One of the costliest U.S. droughts to date was the 1988 drought, when the federal government spent $6.7 billion on drought relief programs and $4.3 billion (both in 2009 dollars) on farm credit programs (Riebsame et al. 1991). That drought lasted less than six months but was closely aligned with the spring and summer growing season in the most agriculturally productive part of the United States. Lack of streamflow to sustain low flows is another key drought concern, with environmental consequences that are less amenable to economic valuation. Furthermore, in the case of streamflow, the temporal lag between drought onset and its impact can be influenced by reservoir storage. Taken together, all of these factors highlight the importance of drought monitoring systems (DMSs) that can provide information about drought conditions at different time scales and for different water users.
Common indicators of drought include precipitation, streamflow, and SM. However, long-term SM (and to a lesser extent, streamflow for unregulated streams) observations across the United States are scarce. Therefore, SM and runoff datasets produced by the land surface models (LSMs) that make up the North American Land Data Assimilation System (NLDAS; Mitchell et al. 2004) have become a valuable source of information for drought monitoring and prediction (Mo 2008). The LSMs produce nowcasts (model representations of current hydrologic conditions) that simulate the time lag between precipitation (deficiency) and SM and runoff deficiencies. These latter two variables are directly related to the availability of water for agricultural and municipal users. A strength of LSM-based indicators is that they can be aggregated to any geographical area, such as counties, watersheds, or hydroclimate zones, whereas indicators such as the Palmer Drought Severity Index (PDSI; Palmer 1965) are usually calculated at the relatively coarse spatial resolution of the National Oceanic and Atmospheric Administration (NOAA) climate divisions. The use of LSMs is also desirable because real-time estimates can be related to long-term climatologies derived by running the models using retrospective forcings (typically precipitation and temperature) that go back many years, often approaching a century, depending on data availability.
Retrospective LSM simulations are especially valuable for reconstructing and characterizing the severity of multiyear drought events. One increasingly common method of relating current conditions with historical simulations is to express LSM-derived variables, such as SM, in terms of percentiles relative to a retrospective simulation. This approach has been used by Sheffield et al. (2004), Andreadis et al. (2005), Andreadis and Lettenmaier (2006), and Mo (2008). Retrospective LSM simulations have also been used to evaluate trends in drought-related variables. For instance, Andreadis and Lettenmaier (2006) found that trends in model-simulated runoff over the twentieth century compared well with similar studies by Lins and Slack (1999) that were based on observed streamflow. Sheffield et al. (2004) found that LSM-simulated SM performed well as an indicator of vegetative growth. Shukla and Wood (2008) and Mo (2008) showed that model-derived runoff percentiles and a model-derived standardized runoff index (SRI) reflected the seasonal lag in the influence of precipitation and snowmelt on streamflow.
In this study, we evaluate how a DMS for Washington State, based on an LSM, would perform with respect to identification of four major droughts that occurred in Washington State over the last 30 years. Our objectives are to 1) describe DMS indicators and their application, 2) reconstruct four historical droughts using DMS products, and 3) compare DMS drought indicators with drought conditions and management decisions during each of the four drought events.
In the following subsections, we first describe the major hydroclimatological features of Washington State and the methodology adopted to calculate the drought indicators used in our DMS. We then illustrate the drought severity classification method used for this study.
a. Study domain
Our study domain is Washington State. Annual average precipitation over the state varies from less than 25.4 cm (10 inches) to more than 381 cm (150 inches), with high precipitation areas mostly on the western slopes of the Cascade Mountains, and the lowest precipitation in the east central interior of the state. The nature of water supply systems, and hence the characteristics of a DMS, varies across the state with its hydroclimatology. For instance, in the arid and semiarid eastern part of the state, water is managed primarily for agricultural water supply and (in the case of the Columbia River system) hydropower production, whereas in the more humid western part of the state, municipal water supply and hydropower production dominate. In both parts of the state, in-stream flow requirements are a serious consideration, as related especially to protection and enhancement of native salmonids.
During the last century, the state has experienced 24 major drought events (King 1978; EWEC 1988; Hart et al. 2001; Anderson et al. 2005). In the state’s drought contingency plan, developed in 1992, water supply monitoring and forecasting responsibilities were assigned to the Water Supply Advisory Committee (WSAC). The WSAC advises the governor to convene the Executive Water Emergency Committee (EWEC) during drought conditions. The EWEC is responsible for assessing the overall impacts of ongoing droughts and coordinating the state’s response. As defined by the Washington Administrative Code, “drought conditions are water supply conditions where a geographical area or a significant part of a geographical area is receiving, or is projected to receive, less than seventy-five percent of normal water supply as the result of natural conditions and the deficiency causes, or is expected to cause, undue hardship to water users within that area” (see http://apps.leg.wa.gov/WAC/default.aspx?cite=173-166-030). The state’s climatological and hydrological features play a key role in drought declarations, as they relate directly to the drought declaration criterion of being likely to receive “less than seventy-five percent of normal water supply.”
Washington State is divided into 62 Water Resource Inventory Areas (WRIAs) (Fig. 1), which are similar to the U.S. Geological Survey fifth- and sixth-layer Hydrologic Unit Codes. Before starting this study, we met with state and regional water managers, and other stakeholders, to determine ways that an indicator system might be most useful to them, and an appropriate scale for decision making. Based on those discussions, we determined that the WRIAs would be a useful and appropriate unit for the indicator system.
b. Climatology of the state
Figure 2 shows the long-term monthly mean precipitation, snow water equivalent (SWE), SM, and runoff, spatially averaged over the state, where precipitation is gridded from observations following methods outlined in Elsner et al. (2010) and Maurer et al. (2002), and the other variables are output from the Variable Infiltration Capacity (VIC; Liang et al. 1994) hydrology model. As is evident from the plot, maximum precipitation is concentrated in fall and winter months, the wettest of which are November, December, January, and February (NDJF). Because annual precipitation is so heavily dependent on precipitation during these months, in years when there is a substantial accumulated precipitation deficit at the end of this 4-month period, it is unlikely that the deficit can be offset later in the water year. Therefore, NDJF precipitation in any given water year is crucial for drought planning purposes.
In winter [December–February (DJF)], most precipitation in headwater areas of the state’s major streams falls in the form of snow (noting that while November is an important contributor to wet season precipitation, substantial accumulations of snow in the mountainous regions of the state do not usually begin until December). As winter snowpack melts in spring and summer, it provides much of the water year’s runoff, especially for rivers with high-elevation headwaters draining the eastern slope of the Cascades. Snowmelt also contributes to SM in much of the eastern part of the state, and high-elevation areas in the western part of the state, and to runoff over most of the state during the relatively dry spring [March–May (MAM)] and summer [June–August (JJA)] months.
SM is usually high in the low-elevation areas of the state during winter (DJF) because of high precipitation and low evapotranspiration. In the highest-elevation areas, however, where relatively little melt occurs during winter, low SM resulting from end-of-summer dry conditions may persist throughout the winter until it is replenished by snowmelt. Therefore, SM can be a useful and integrative drought indicator for much of the year and over many parts of the state, although it may not be as suitable for winter months in the high-elevation areas. Below-normal SM during spring (MAM) and summer (JJA) typically results from lack of precipitation in winter (DJF) in high-elevation areas. For both high- and low-elevation areas, below-normal SM conditions during spring (MAM) and summer (JJA) usually prevail through the end of the water year.
Runoff generally shows a stronger seasonal cycle than does SM, and follows the cycle of precipitation and SWE with some temporal lag. For snowmelt dominant watersheds (most of the state, aside from some coastal and western interior streams), runoff is high during spring and early summer because of snowmelt and rainfall and declines rapidly through the summer dry season. For this reason, drought indicators based on late winter and spring conditions (observed or forecasted) can be especially useful for drought management.
c. Hydrology model
The physically based, semidistributed VIC model (Liang et al. 1994) was used to derive SM and runoff over the study domain. The VIC model balances both surface energy and water over each grid cell. It represents subgrid variability in soils, topography, and vegetation, which allows representation of the nonlinear dependence of the partitioning of precipitation into infiltration and direct runoff on soil moisture in the upper layer and its spatial heterogeneity. The VIC model partitions the subsurface into three layers. The first layer has a fixed depth of 10 cm and responds quickly to changes in surface conditions and precipitation. The second and third soil-layer depths are the same as in the Land Data Assimilation System (LDAS) retrospective simulations (Maurer et al. 2002). Moisture movement between the first and second, and second and third, soil layers is governed by gravity drainage, with diffusion from the second to the upper layer allowed in unsaturated conditions. Drainage from the second layer to the third is entirely gravity controlled. Base flow is a nonlinear function of the moisture content of the third soil layer (see Liang et al. 1994 for details).
The VIC model has been successfully used in numerous drought studies. Mishra et al. (2010) analyzed historical droughts over the Midwest using the VIC model. Sheffield et al. (2004), Andreadis et al. (2005), and Andreadis and Lettenmaier (2006) applied the model over the continental United States to reconstruct twentieth-century droughts. Sheffield and Wood (2008) reconstructed global droughts over the second half of the twentieth century, and Sheffield et al. (2009) evaluated potential changes in twenty-first-century drought using the model forced by downscaled global climate model scenarios.
d. Retrospective simulations
We performed a reconstruction of drought conditions over Washington State similar to the studies cited above, but at a higher spatial resolution. We ran the VIC model from 1915 to 2006 to produce gridded SM, SWE, and runoff, as well as precipitation (model forcing). The period selected was intended to include the major known droughts of the twentieth century, consistent with the availability of data to produce realistic model simulations. (The number of stations at which precipitation and temperature had been observed prior to 1915 falls off rapidly, and this is therefore the beginning of our period of analysis.) We performed model simulations at a daily time step in water balance mode, meaning that the model’s effective surface temperature is equal to surface air temperature, rather than iterating to close the surface energy balance (Liang et al. 1994). There are 5282 grid cells in the domain. We used a dataset developed by Elsner et al. (2010) that is based on daily precipitation and maximum (Tmax) and minimum (Tmin) temperature data from Cooperative Observer stations, which were gridded using methods outlined in Maurer et al. (2002). Additional model forcings (downward solar and longwave radiation, and humidity) were estimated from the daily air temperature and temperature range following methods outlined in Maurer et al. (2002). Surface wind was taken from the lowest level of the National Centers for Environmental Prediction–National Center for Atmospheric Research (NCEP–NCAR) reanalysis (Kalnay et al. 1996); prior to 1949, average wind values from the reanalysis were used. A total of 196 precipitation and temperature stations within and near the boundaries of the domain were used in the development of the gridded dataset(s) by Elsner et al. (2010). Temperature data were lapsed using a pseudoadiabatic lapse rate based on the difference between the station and grid elevations.
Both precipitation and temperature were then rescaled to match the long-term average of the Parameter-Elevation Regressions on Independent Slopes Model (PRISM) climatology (Daly et al. 1994, 1997) for the period 1971–2000.
e. Model-based drought indicators
Hydrologic model-derived SM and runoff values, in addition to standardized precipitation index (SPI) and SRI, are the drought indicators used in this study. In this section we describe the development of these indicators.
1) Standardized precipitation index
SPI (McKee et al. 1993) is a widely used drought indicator that is calculated directly from precipitation data and allows expression of droughts (and wet periods) in terms of precipitation deficits (Heim 2002).
We used monthly gridded precipitation data to compute SPI for each grid cell. To estimate an n-month SPI (where n was 1, 3, 6, 12, 24, and 36), precipitation was averaged over the n months and a Gamma distribution was fit to the time series. To promote ease of understanding and application to decision making, based on interactions with water managers and stakeholders, we used percentiles of these indicators, rather than a standard normal deviate (McKee et al. 1993; Heim 2002; Mo 2008). Therefore, our SPI values lie between 0 and 1.
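The per-cell calculation described above can be sketched as follows. This is an illustrative sketch, not the study's code: `spi_percentile` and its arguments are hypothetical names, and a full implementation would fit a separate Gamma distribution for each ending calendar month rather than pooling all n-month means as done here.

```python
import numpy as np
from scipy import stats

def spi_percentile(monthly_precip, n):
    """n-month SPI expressed as a percentile in [0, 1].

    monthly_precip: 1-D array of one grid cell's monthly precipitation.
    Returns the percentile of the most recent n-month mean relative to a
    Gamma distribution fit to all n-month means in the record.
    (Sketch only: a full implementation would fit per calendar month.)
    """
    p = np.asarray(monthly_precip, dtype=float)
    # n-month running means of precipitation
    means = np.convolve(p, np.ones(n) / n, mode="valid")
    # fit a Gamma distribution; location fixed at 0 for nonnegative precip
    shape, loc, scale = stats.gamma.fit(means, floc=0)
    # percentile of the current (most recent) n-month mean
    return stats.gamma.cdf(means[-1], shape, loc=loc, scale=scale)
```

Because the fitted cumulative distribution function is used directly, the result already lies between 0 and 1, matching the percentile form of the indicators used here rather than a standard normal deviate.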
2) Standardized runoff index
The SRI (Shukla and Wood 2008; Mo 2008) uses model-derived runoff data (overland plus baseflow in the VIC model) to derive an indicator, based on essentially the same methodology as the SPI (McKee et al. 1993). We first aggregated daily runoff data into monthly values and fit Gamma distributions to the derived runoff climatologies for each grid cell and month as described in Shukla and Wood (2008). Again, we used percentiles for this indicator, rather than a standard normal deviate.
3) Soil moisture percentile
SM can serve as an indicator of different types of drought. The availability of high-quality SM observations is highly limited. However, model-derived SM provides a reasonable alternative for large-scale studies (see, e.g., Maurer et al. 2002; Wood and Lettenmaier 2006). Several past studies (e.g., Sheffield et al. 2004; Andreadis et al. 2005; Andreadis and Lettenmaier 2006) have used model-derived SM in ways that are similar to our approach here.
We used total column SM (sum of the three model layers) averaged by month. The 91 years of monthly values formed the climatology for each grid cell and month. We converted the SM into percentiles using the Weibull probability distribution. The method adopted is essentially the same as was used by Andreadis et al. (2005) and Wood and Lettenmaier (2006).
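A minimal sketch of this conversion, assuming the Weibull plotting position rank/(n + 1); the function and argument names are hypothetical, not from the original study:

```python
def soil_moisture_percentile(current_sm, climatology):
    """Percentile of a monthly total-column SM value relative to that
    grid cell's climatology for the same calendar month (here, the 91
    monthly values for 1915-2006), using the Weibull plotting position
    rank / (n + 1).  Illustrative sketch only.
    """
    n = len(climatology)
    # rank = number of climatological values at or below the current value
    rank = sum(1 for v in climatology if v <= current_sm)
    return rank / (n + 1)
```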
f. Drought severity classes
We used SPI, SRI, and soil moisture percentile (SMP) as drought indicators. As indicated above, our percentile values of SPI, SMP, and SRI lie between 0 and 1. We categorized the individual drought indicators into six drought severity classes based on Steinemann (2003), acknowledging that other thresholds for drought classes could be used as well. Table 1 describes the values of individual drought indicators, based on percentiles. We used a percentile approach because it offers statistical consistency and comparability among indicators over time and space, which is not necessarily offered by other approaches, such as percent of normal.
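As an illustration of how a percentile indicator maps onto the six severity classes, the sketch below uses assumed cutoff values; the actual class boundaries are those of Table 1 (after Steinemann 2003) and would need to be substituted.

```python
# Assumed percentile cutoffs separating the six severity classes
# (class 6 most severe); substitute the actual Table 1 values.
THRESHOLDS = [0.02, 0.05, 0.10, 0.20, 0.30]

def severity_class(percentile):
    """Map an indicator percentile in [0, 1] to a severity class 1-6,
    with lower percentiles mapping to more severe (higher) classes."""
    for i, cutoff in enumerate(THRESHOLDS):
        if percentile < cutoff:
            return 6 - i
    return 1  # at or above the wettest cutoff: no drought
```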
3. Analysis of indicators and droughts
Washington State has experienced numerous drought events over the last century. Figure 3 shows the number of WRIAs with drought severities of class 3 or higher (more severe), as defined in Table 1, in terms of SMP, SPI-12, SPI-24, SPI-36, SRI-12, SRI-24, and SRI-36 from 1925 to 2006.
In Fig. 3, the drought years of the 1930s, 1940s, 1976–77, 1987–89, 2000–01, and 2004–05 stand out as the major drought events. We used DMS indicators to reconstruct four of these drought events: 1976–77, 1987–89, 2000–01, and 2004–05. We also surveyed the literature on drought conditions and responses in Washington applicable to these events, including drought reports (Hart et al. 2001; Anderson et al. 2005), the initial drought action programs prepared by the EWEC in 1978 and 1988, and local newspapers (e.g., the Seattle Times and Yakima Herald).
In the following sections, we compare the reconstructed droughts using DMS indicators for each WRIA. For each individual drought event, we first examine how each drought evolved in terms of DMS indicators. To do so, we calculate the number of WRIAs with a drought severity of class 3 or higher, which corresponds roughly to the threshold used by the Washington State Department of Ecology to declare drought in a given area. We then examine the progression, persistence, and recession of drought, analyzing the severity class of each drought indicator for each month throughout the duration of the four drought events. Then, for a more focused case study, we use the highly drought-vulnerable Yakima River basin, whose irrigated crops, and potential drought losses, represent the highest agricultural economic value in the state.
To provide an assessment of drought onset and recovery that is relevant to state drought decision making, we perform additional analyses using DMS indicators, following Steinemann and Cavalcanti (2006). We define drought onset as the last month of any 3-month period for which an indicator has a continuous drought severity of class 3 or higher. We define drought recovery as the last month of the next 4-month period for which that indicator has a continuous drought severity of class 2 or lower (less severe). The onset of the next drought is then defined as the last month of the next 3-month period for which the indicator again has a continuous drought severity of class 3 or higher. Results are provided for the Yakima River basin in Tables 2b, 3b, 4b, and 5b, where drought onset is indicated by bold italic font, and drought recovery is indicated by bold regular font. The drought onset criterion of three consecutive months with a drought severity of class 3 or higher, and the drought recovery criterion of four consecutive months with a drought severity of class 2 or lower, strives to provide early warning, but guard against premature declarations of drought onset (“false alarms”) or drought recovery (“false assurances”), respectively, as described in Steinemann and Cavalcanti (2006).
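The onset and recovery criteria above can be expressed as a simple scan over a monthly sequence of severity classes. The sketch below uses hypothetical names and is only an illustration of the Steinemann and Cavalcanti (2006) rules as described in the text:

```python
def onset_and_recovery(classes, onset_run=3, recovery_run=4):
    """Return (onset, recovery) as 0-based month indices, or None where
    a criterion was never met.  Onset: last month of the first run of
    `onset_run` consecutive months with severity class >= 3.  Recovery:
    last month of the subsequent run of `recovery_run` consecutive
    months with class <= 2.  Illustrative sketch only.
    """
    onset = recovery = None
    run = 0
    for month, cls in enumerate(classes):
        if onset is None:
            run = run + 1 if cls >= 3 else 0
            if run == onset_run:
                onset = month
                run = 0  # reset the counter for the recovery scan
        elif recovery is None:
            run = run + 1 if cls <= 2 else 0
            if run == recovery_run:
                recovery = month
    return onset, recovery
```

Requiring a sustained run of months before flagging onset or recovery is what guards against the "false alarms" and "false assurances" noted in the text.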
To assess statewide occurrences of drought, we extend this analysis by defining the onset of statewide drought as when 50% or more of the WRIAs (i.e., 31 or more of the 62 WRIAs) have a drought severity of class 3 or higher for three consecutive months, and by defining the recovery from statewide drought as when fewer than 50% of the WRIAs have a drought severity of class 3 or higher for four consecutive months. Results are provided in Tables 2a, 3a, 4a, and 5a, where drought onset and recovery are again indicated by bold italic and bold regular fonts, respectively. Tables 2–5 also indicate the months of the state’s official drought declaration and recovery for each of the four droughts with bold italic and bold regular fonts in the left-hand columns. Finally, to provide a more specific evaluation of these indicators and to offer comparisons with real-time drought declarations, we examine the evolution of the four drought events and the results from the DMS in the following sections.
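The statewide criteria can be sketched analogously by counting, for each month, the WRIAs at class 3 or higher; the names and structure below are illustrative assumptions, not the study's code:

```python
def statewide_onset_and_recovery(wria_classes_by_month, total_wrias=62):
    """wria_classes_by_month: one list of per-WRIA severity classes per
    month.  Statewide onset: last month of the first run of 3 consecutive
    months in which at least half the WRIAs (31 of 62) are at class >= 3;
    recovery: last month of the subsequent run of 4 consecutive months in
    which fewer than half are at class >= 3.  Illustrative sketch only.
    """
    threshold = total_wrias / 2  # 31 of the 62 WRIAs
    onset = recovery = None
    run = 0
    for month, classes in enumerate(wria_classes_by_month):
        in_drought = sum(1 for c in classes if c >= 3) >= threshold
        if onset is None:
            run = run + 1 if in_drought else 0
            if run == 3:
                onset = month
                run = 0  # reset the counter for the recovery scan
        elif recovery is None:
            run = run + 1 if not in_drought else 0
            if run == 4:
                recovery = month
    return onset, recovery
```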
a. 1976–77 drought
Lack of precipitation during the fall [September–November (SON)] of 1976, combined with record low snowpack in the winter of 1977, resulted in a severe drought during water year (WY) 1977. Figure 4 shows the major water balance components (precipitation, temperature, SM, SWE, and runoff), spatially aggregated across the state, as they evolved during WYs 1977 and 1978. Although WY 1976 ended with normal conditions, it was followed by a dry fall in the beginning of WY 1977. Considerably below-normal precipitation began to affect SM by late November. Record low snowpack during the winter further exacerbated these conditions, leading to extremely low SM and a significant drop in runoff in the spring. Furthermore, the spring of 1977 was relatively warm, resulting in an early melt of the abnormally low snowpack. On 1 April 1977, SWE ranged from 30% to 71% of normal (King 1978). Because of the combined effects of the low winter snowpack and warm spring, little snow remained by the end of June, resulting in extremely low summer streamflows in most of the state’s watersheds. The official drought recovery was attained in December 1977.
Table 2a shows the number of WRIAs with a drought severity of class 3 or higher as calculated by the SMP and SPI/SRI-3, SPI/SRI-6, SPI/SRI-12, SPI/SRI-24, and SPI/SRI-36 in 1976 and 1977. In January 1977, SMP, SPI-3, SRI-3, and SPI-6 met this criterion. At the time of the official drought declaration in March 1977, all indicators except SPI/SRI-24 and SPI/SRI-36 were, by our definition, already in statewide drought.
Figure 5 depicts the spatial distribution of drought severity during three months of the drought, highlighting its statewide impact in terms of SPI/SRI-6, SPI/SRI-12, and SMP. In March 1977, virtually the entire state had a drought severity of class 5 or 6 in terms of SPI-6, SPI-12, and SRI-6.
We defined the recovery from statewide drought as the last month of a 4-month period for which an indicator had a drought severity of class 3 or higher in less than 31 of the WRIAs. As shown in Table 2a, SPI-3 met this criterion in November 1977, while SPI-6 met it in December 1977. As noted above, the official drought recovery declaration came in December 1977, but by then, according to the DMS, these two indicators were the only ones that showed any real sign of recovery. The number of recovered indicators increased to five by March 1978. The decision to declare drought recovery, however, also depends on the status of the snowpack (Anderson et al. 2005). As shown in Fig. 6, SWE percentiles during the winter of 1978 were mostly normal or above normal.
Table 2b shows the monthly progression and recession of drought severity in the Yakima River basin during 1976 and 1977. By January 1977, the onset of basinwide drought was indicated by SMP, SPI-3, SRI-3, and SPI-6. By March 1977, when drought was officially declared, SRI-6, SRI-12, SPI-12, and SPI-24 also indicated basinwide drought. By the time the drought was officially declared over in December 1977, the indicators that showed signs of recovery were SPI-3 (October), SPI-6 (November), and SMP, SRI-3, and SPI-36 (December). By March 1978, SRI-6 and SPI-12 had also recovered.
b. 1987–89 drought
The 1987–89 drought resulted primarily from below-normal rainfall and hence snow accumulation during the fall of 1987 and winter of 1988 (EWEC 1988). An extended dry spell that lasted from summer 1987 into winter 1988 was attributable in part to El Niño. Seattle, which normally receives about 20 cm (8 inches) of rain from June to October, had only 4.5 cm (1.8 inches) of rain during that period in 1987. Consequently, a serious water supply situation resulted for the City of Seattle and its customers in late summer and fall of 1987 (Lettenmaier et al. 1990). Anomalously low precipitation, combined with warm temperatures in the following winter of 1988, led to below-normal snow accumulation and early snowmelt in spring 1988, which extended the drought into 1988. On 7 March 1988, the United States Bureau of Reclamation (USBR) announced, “a water supply shortage is likely this summer” (EWEC 1988). Normal precipitation and snow accumulation during fall 1988 and winter 1989 brought an end to the drought.
As shown in Fig. 7a, precipitation and snowfall were below normal statewide during WY 1988. Throughout the fall of 1987 and much of 1988, SM and cumulative runoff were also below normal. During WY 1989 (Fig. 7b) the precipitation and snowfall returned to normal.
WY 1988 began as a dry year with all indicators showing signs of ongoing drought by November (Table 3a). Although it is not known exactly when statewide drought was declared, we do know that by the late fall of 1987, water supply and hydropower production for the City of Seattle were significantly affected (Lettenmaier et al. 1990). By July 1988, SPI-3 and SPI-6 were in recovery; by then, the number of WRIAs with a drought severity of class 3 or higher for these indicators had been less than 31 for four months, an improvement that could be attributed to a spring storm. Nevertheless, none of the other indicators showed signs of recovery, and the statewide drought persisted through the fall of 1988. Although we were unable to obtain information as to how long the official statewide drought declaration remained in effect, indications from the DMS are that SMP; SPI-12; and SRI-3, SRI-6, and SRI-12 showed recovery in January, February, and June 1989, respectively.
Analysis of the spatial extent of the drought (not shown here) depicted that at the time of its onset, the drought was more pronounced in western Washington, especially in the vicinity of Seattle, than in the eastern part of the state. However, as the water year progressed, the drought continued to spread across eastern Washington.
Table 3b indicates the drought severity classes from January 1987 to June 1989 for the Yakima basin. By December 1987, all the DMS indicators had met the criteria for the onset of drought in the basin. There were no signs of recovery until the following year; recovery was finally attained by January 1989 in terms of SPI-3, SRI-3, SRI-6, and SRI-12.
c. 2000–01 drought
WY 2001 began as a normal year. Although the fall of 2000 was drier and cooler than normal, wetter-than-normal weather for the Pacific Northwest was predicted for the winter months. The prediction, however, was not realized, and the dry spell persisted throughout the winter of 2001. Between November 2000 and March 2001, most of the state’s precipitation and snowpack totals were approximately 60% of normal. Considerably below-normal precipitation and snowpack led to a severe statewide drought. A statewide drought emergency was declared on 14 March 2001 (Hart et al. 2001), and the drought declaration remained in effect until December 2001.
Figure 8 shows spatially aggregated water balance components as they evolved during WYs 2001 and 2002. Although statewide precipitation and SWE were normal throughout WY 2000, precipitation dropped below normal at about the beginning of WY 2001. The significantly below-normal SWE worsened conditions and resulted in far below-normal SM and runoff.
Table 4a shows the number of WRIAs with a drought severity of class 3 or higher from January 2000 to June 2002. None of the indicators showed signs of drought until January 2001, when SMP, SPI-3, SPI-6, SPI-12, SRI-3, and SRI-6 met the criteria for the onset of drought. This highlights an important feature of the 2000–01 drought: it developed very quickly, much more so than the other droughts we analyzed. Statewide drought was declared on 14 March 2001 (Hart et al. 2001), by which time SRI-12 and SPI-24 had also met the criteria for the onset of statewide drought.
The statewide drought persisted through December 2001 in terms of most of the indicators except SPI-3, which started showing signs of recovery as early as August 2001. By November 2001, SPI-6 also indicated recovery from statewide drought. Despite the continuation of drought among all other indicators, the drought declaration was lifted by the governor on 31 December 2001. Precipitation since the beginning of WY 2002 was well above normal, contributing to an optimistic outlook for the winter snowpack (Hart et al. 2001). The simulated SWE conditions (not shown here) confirm that the state had well-above-normal snowpack during the winter of 2002. The combination of a good winter snowpack, normal precipitation during the fall of 2001, and indications of recovery by SPI-3 and SPI-6 would have supported the drought management decision.
Table 4b shows how drought severity varied in the Yakima River basin from January 2000 to June 2002 in terms of the DMS indicators. SMP, SPI-3, SPI-6, SPI-12, SRI-3, and SRI-6 indicated the onset of drought by January 2001. By the time statewide drought was officially declared, all of the DMS indicators except SPI-36, SRI-24, and SRI-36 had met the criteria for the onset of drought. For much of the rest of the year, many of the indicators continued to have a drought severity of class 3 or higher. The first indicator to show signs of recovery was SPI-3, in August 2001. None of the indicators except SPI-3 and SPI-6 recovered before the statewide drought was officially declared over in December, suggesting that the drought may indeed have persisted in the Yakima River basin.
d. 2004–05 drought
WY 2005 began with normal to below-normal precipitation in October for all but the north Puget Sound region. The dry spell started in November 2004, and a relatively warm winter in 2005 resulted in low snowpack accumulation and early melt. One major feature of the 2005 drought was that, like the 2000–01 drought, it developed quickly. This rapid change in conditions was attributed to a record low snowpack in February and March 2005, due in part to a mid-January storm (a so-called Pineapple Express) that removed much of the accumulated snowpack (Anderson et al. 2005). Statewide drought was declared in March and recovery from drought was officially declared on 31 December 2005 (Anderson et al. 2005).
Spatially aggregated water balance components (Fig. 9) show that three important factors led to the 2004–05 drought: (i) below-normal precipitation in the fall of 2004, (ii) record low snowpack, and (iii) above-normal temperatures. Higher temperatures, attributed to El Niño conditions, resulted in little snow accumulation during the winter and brought about earlier snowmelt. SM was below normal throughout the year, resulting in adverse impacts on state agriculture.
Table 5a shows the number of WRIAs with a drought severity of class 3 or higher from January 2004 to June 2006 for the various DMS indicators. WY 2004 started out dry, and by June 2004, SMP, SPI-3, SPI-6, SPI-12, SPI-24, SRI-6, SRI-12, SRI-24, and SRI-36 had all met the criteria for statewide drought. Although six of these indicators (SMP, SPI-3, SPI-6, SPI-12, SPI-24, and SRI-3) showed signs of recovery by December 2004, low fall precipitation and winter snowpack caused SMP, SPI-3, SPI-12, and SPI-24 to again fall into drought by the time of the state’s declaration on 10 March 2005 (Anderson et al. 2005). As such, DMS indicators were in line with the official drought declaration. Due largely to spring storms in western Washington, SPI-3 and SPI-6 met the criteria for statewide drought recovery in August and October 2005, respectively. However, none of the other indicators showed signs of recovery by the time the drought was declared over in December 2005. As with the 2000–01 drought, this decision may have been influenced by an above-normal winter snowpack outlook. Simulated SWE conditions (not shown here) confirm that the snowpack during the winter of 2006 was well above normal.
Table 5b shows the progression of drought in the Yakima River basin in terms of the severity classes of the DMS indicators. By March 2005, all of the DMS indicators met the criteria for the onset of drought. Both SPI-3 and SPI-6 indicated drought recovery by December 2005, when statewide drought recovery was officially declared, but none of the other indicators had recovered by then.
4. Summary and conclusions
Washington State has experienced several major droughts, and substantial associated economic losses, over the last three decades. Because the state’s water resources are strongly dependent on winter snow accumulation, they are susceptible to droughts resulting from dry winter conditions, warm winter conditions, or both. In addition, as the climate warms, the sensitivity of the state to drought is likely to increase because of reductions in mean snow accumulation. This vulnerability emphasizes the need for proactive drought management and the potential value of a drought monitoring system. We have described how such a system, which is based on the SPI, SRI, and SMP as indicators of drought, would have performed during four major droughts over the study period.
In this paper, a daily dataset covering the period 1915–2006 was aggregated to monthly data over the state’s major Water Resource Inventory Areas (WRIAs). Simulated SM data were used to estimate monthly SMP, and monthly precipitation and model-generated runoff data were used to estimate SPI and SRI at accumulation periods of 3, 6, 12, 24, and 36 months. We used these indicators to reconstruct four major drought events over the last three decades (1976–77, 1987–89, 2000–01, and 2004–05), based on six drought severity classes, for each of the 62 WRIAs in the state. We also analyzed the progression, persistence, and recession of drought according to these indicators, and we used gridded precipitation, SM, and runoff data to depict the spatial pattern of all four drought events.
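As a rough illustration of the indicator computations summarized above, the sketch below implements the standard gamma-based SPI construction (fit a gamma distribution to n-month precipitation accumulations, then map each value through the fitted CDF to a standard normal quantile) and an empirical soil moisture percentile; SRI follows the same transform applied to runoff. The function names and the Weibull plotting position are assumptions of this sketch, not details drawn from the paper.

```python
import numpy as np
from scipy import stats

def spi(precip, window):
    """SPI-n sketch: accumulate monthly precipitation over `window`
    months, fit a gamma distribution to the accumulations, and map
    each accumulation through the gamma CDF to a standard normal
    quantile. (Operationally, the fit is done separately for each
    calendar month to remove seasonality; that step is omitted here.)"""
    acc = np.convolve(np.asarray(precip, dtype=float),
                      np.ones(window), mode="valid")
    shape, loc, scale = stats.gamma.fit(acc, floc=0)  # location fixed at 0
    cdf = stats.gamma.cdf(acc, shape, loc=loc, scale=scale)
    cdf = np.clip(cdf, 1e-6, 1 - 1e-6)  # keep quantiles finite
    return stats.norm.ppf(cdf)

def soil_moisture_percentile(sm_record, sm_now):
    """SMP sketch: empirical percentile of current soil moisture
    against the historical record for the same month, using a
    Weibull plotting position (an assumption of this sketch)."""
    rank = int(np.sum(np.asarray(sm_record) <= sm_now))
    return 100.0 * rank / (len(sm_record) + 1)
```

By construction the SPI values are approximately standard normal, so a value near -1.5 corresponds to roughly the driest 7% of the historical record for that accumulation period.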
Our main findings are as follows: (i) For statewide drought onset, DMS indicators (primarily SPI-3, SPI-6, SPI-12, SRI-3, SRI-6, and monthly SMP) signaled the onset of drought up to two months before the state’s official declarations of the 1976–77, 2000–01, and 2004–05 droughts. (ii) For statewide drought recovery, DMS indicators (primarily SPI-3 and SPI-6) signaled recovery up to four months before the state’s official declarations for the same three droughts. (iii) For the Yakima River basin, the same onset indicators signaled drought up to two months before the state’s official declarations. (iv) For the Yakima River basin, SPI-3 and SPI-6 signaled drought recovery up to four months before the state’s official declarations.
These results suggest that a DMS can provide a method for early detection of the onset, duration, severity, and recovery from drought, and an approach that would allow for finer-scale resolution of drought declarations. The DMS approach also provides a scientific basis for indicators and triggers that can assist in drought management decisions for Washington State and other regions. There are a number of natural extensions to the approach we have outlined that would make it useful in real-time drought assessment and decision making. For instance, a DMS at the national scale (www.hydro.washington.edu/forecast/monitor), based on the approach described by Wood and Lettenmaier (2006), is currently used as one of the NOAA Climate Prediction Center’s inputs to the U.S. Drought Monitor. A similar system for Washington State (www.hydro.washington.edu/forecast/sarp) is currently made available to state and regional water managers.
We thank Julie Vano, Eric Rosenberg, Ben Livneh, and Vimal Mishra of the University of Washington Land Surface Hydrology Group for their comments on earlier versions of this paper, and Andrew W. Wood for his work on the Washington State Hydrologic Monitoring and Prediction System (www.hydro.washington.edu/forecast/sarp). This research was supported by the National Oceanic and Atmospheric Administration (NOAA) under Grant NA06OAR4310075 and NOAA Cooperative Agreement NA08OAR4320899 and by the U.S. Geological Survey (USGS) under Grant 06HQGR0190 to the University of Washington. This is Contribution 1780 from the Joint Institute for the Study of the Atmosphere and Ocean (JISAO), under NOAA Cooperative Agreement NA17RJ1232.
Corresponding author address: Shraddhanand Shukla, Wilson Ceramic Laboratory, University of Washington, Seattle, WA 98195-2700. Email: email@example.com