Statistically downscaled climate projections from nine global climate models (GCMs) are used to force a snow accumulation and ablation model (SNOW-17) across the central-eastern North American Landscape Conservation Cooperatives (LCCs) to develop high-resolution projections of snowfall, snow depth, and winter severity index (WSI) by the middle and late twenty-first century. Here, projections of a cumulative WSI (CWSI) known to influence autumn–winter waterfowl migration are used to demonstrate the utility of SNOW-17 results. The application of statistically downscaled climate data and a snow model leads to a better representation of lake processes in the Great Lakes basin, topographic effects in the Appalachian Mountains, and spatial patterns of climatological snowfall, compared to the original GCMs. Annual mean snowfall is simulated to decline across the region, particularly in early winter (December–January), leading to a delay in the mean onset of the snow season. Because of a warming-induced acceleration of snowmelt, the percentage loss in snow depth exceeds that of snowfall. Across the Plains and Prairie Potholes LCC and the Upper Midwest and Great Lakes LCC, daily snowfall events are projected to become less common but more intense. The greatest reductions in the number of days per year with a present snowpack are expected close to the historical position of the −5°C isotherm in December–March, around 44°N. The CWSI is projected to decline substantially during December–January, leading to increased likelihood of delays in timing and intensity of autumn–winter waterfowl migrations.
Snow is an important element of the hydrologic cycle, replenishing soil moisture and contributing runoff to river basins through spring melt (Lettenmaier and Gan 1990; Groisman et al. 2001). Snow cover enhances longwave emissivity and induces an insulating effect on soil temperatures (Zhang et al. 2008). Continued increasing temperatures may lead to a diminished snowpack, which would shift snowmelt from spring to winter (Manabe et al. 1981; Mahanama et al. 2012), reduce soil moisture in spring–summer during low-flow season (Mastin et al. 2011), favor drought development (Mishra et al. 2010; Mahanama et al. 2012), and increase fire risk (Westerling et al. 2006). Furthermore, snow influences many facets of society. Spring snowpack is an essential water-supply reservoir for irrigation, hydroelectric power generation, and municipal demands (Norton and Bolsenga 1993; Kunkel et al. 2000; Christensen et al. 2007; Barnett et al. 2005; Vicuna et al. 2008). Snowfall supports multimillion-dollar winter recreational industries and tourism (Scott et al. 2008). Heavy snowstorms can be a natural hazard to life and property, leading to automobile accidents (Eisenberg and Warner 2005), heart attacks (Rogot and Padgett 1976), disrupted air and surface traffic (Robinson 1989; Rasmussen et al. 1993), and large economic costs (Rooney 1967). The survival and distribution of many animals are strongly impacted by the presence and depth of snow (Severinghaus 1947; Edwards 1956; Verme 1968; Krohn et al. 2006; Hoving et al. 2005; Carroll 2007; Notaro et al. 2011).
North American snow cover has declined since the mid-twentieth century, especially in spring (Brown 2000; Lemke et al. 2007). Snow cover duration across the Northern Hemisphere decreased during 1966–2007 across a zone where the seasonal mean air temperatures were within ±5°C (Brown and Mote 2009). In contrast, a diminishing lake ice cover and more favorable circulation patterns have enhanced lake-effect snowfall across the Great Lakes basin (Norton and Bolsenga 1993; Leathers and Ellis 1996; Burnett et al. 2003). If precipitation were unchanged, future warming would favor more rain and less snow. However, given projections of greater cold season precipitation, particularly at mid-to-high latitudes, an increase in snowfall might be expected where temperatures remain below freezing (Kapnick and Delworth 2013). The response in snow cover to enhanced greenhouse warming is therefore complicated by projected increases in mid- to high-latitude precipitation, with the response varying by latitude and elevation (Groisman et al. 1993; Räisänen 2008; Brown and Mote 2009). Climate models generally display a transition zone, around the present-day −10°C isotherm, with expected snowfall increases to the north (high latitudes) and decreases to the south (midlatitudes) (Krasting et al. 2013). Projections of snow depth are more complicated than those of snowfall, as the former is determined not only by snowfall but also by snowmelt-driven runoff and sublimation (Kapnick and Delworth 2013).
Several modeling studies have generated coarse projections of twenty-first-century snowfall. Long-term planning for water management strategies and reservoir design is driven by the potential for greater winter flows and reduced spring snowpack and spring–summer runoff (Adeloye et al. 1999; Draper and Kundell 2007; Mastin et al. 2011). Simulations by the National Center for Atmospheric Research (NCAR) Community Climate System Model suggest that snow cover will decline across all of North America, except northernmost Canada, during the twenty-first century, particularly during springtime, despite projected increases in cold season precipitation (Peacock 2012). Using the Geophysical Fluid Dynamics Laboratory Climate Model version 2.1 (GFDL CM2.1), with approximately 50-km spatial resolution, Kapnick and Delworth (2013) simulated positive trends in twenty-first-century snowfall across the mid-to-high latitudes and negative trends across the mid-to-low latitudes, including reductions across all of North America, except Hudson Bay and northern Quebec. According to an ensemble of coupled atmosphere–ocean global climate models (GCMs) from phase 5 of the Coupled Model Intercomparison Project (CMIP5), annual snowfall is anticipated to decline during the twenty-first century across much of the Northern Hemisphere, except for high-latitude increases (Brutel-Vuilmet et al. 2013; Krasting et al. 2013). The CMIP3 and CMIP5 GCMs share similar deficiencies in their capacity to simulate mean patterns and trends in Northern Hemispheric snow cover, including an underestimation of the recent negative trend in spring snow cover extent (Roesch 2006; Brutel-Vuilmet et al. 2013).
The purpose of this study is to generate high-resolution projections of various winter snow metrics for the mid and late twenty-first century across the central-eastern North American Landscape Conservation Cooperatives (LCCs) using an operational snow model forced by statistically downscaled CMIP3 climate projections. The LCCs are an applied conservation science partnership among states, federal agencies, universities, nongovernmental organizations, and tribes that provides the expertise needed to support landscape-scale conservation planning. These winter projections will benefit state, provincial, and federal natural resource agencies and nongovernmental agencies in generating survival, distribution, population, and carrying capacity projections for wildlife influenced by winter severity. To date, a lack of quality twenty-first-century snow projections has hindered the capacity to estimate the potential effects of climate change on species that are sensitive to snowpack and temperature.
2. Data and methods
a. Statistical downscaling
Through the Wisconsin Initiative on Climate Change Impacts (WICCI), daily statistically downscaled climate projections were developed to guide climate change impact and adaptation studies (WICCI 2011). Notaro et al. (2011) used these downscaled data to force an operational snow model, SNOW-17, and generate high-resolution snow projections for Wisconsin. The current study uses an improved version of the downscaling covering 10 of the LCCs across the United States and southern Canada, to the east of the Rocky Mountains (Tables 1 and 2), at 0.1° × 0.1° resolution. Downscaled variables include daily maximum temperature, minimum temperature, and precipitation for the late twentieth (1961–2000), mid-twenty-first (2046–65), and late twenty-first centuries (2081–2100), based on three CMIP3 emission scenarios (A2, A1B, and B1) and 13 GCMs. The present study focuses on the nine GCMs (see Table 3, including expansions of the nine model names) for which downscaled projections are available for the late twentieth century (20C3M) and each emission scenario, with results shown for the contrasting high-end (A2) and low-end (B1) scenarios; one ensemble member is considered per model. By the year 2100, atmospheric carbon dioxide levels are projected to reach 840 and 550 ppm under the A2 and B1 scenarios, respectively.
The station data used to train the statistical models consist of daily maximum and minimum temperature and precipitation from the National Weather Service (NWS)’s Cooperative Observer Program (1950–2009) and Environment Canada’s Canadian Daily Climate Data (1950–2007), for roughly 4000 stations in the United States and Canada. Objective methods are applied to correct for the hour of observation and to remove stations with large biases in wet day frequency (Daly et al. 2007). Only stations with at least 30 yr of data, and fewer than five missing days per month, are included. The large-scale atmospheric state is obtained from the National Centers for Environmental Prediction (NCEP)–NCAR reanalysis (Kalnay et al. 1996). The downscaling methodology is probabilistic: a given large-scale state does not determine a single, precise value for a location’s temperature or precipitation but rather the parameter values of a parametric probability density function (PDF). These parameters vary daily with the large-scale fields. A normal distribution is used for maximum and minimum temperature, a generalized gamma distribution (Stacy 1962) is used for precipitation amount on wet days, and the probability of measurable precipitation is modeled with logistic regression. A realistic temperature skewness is simulated because, although the noise added to the conditional mean is normally distributed, the PDF parameters themselves vary from day to day with the large-scale predictors. Unlike classic linear regression, the variance is allowed to change with the large-scale fields.
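To make the probabilistic setup concrete, the sketch below draws daily maximum temperatures from a conditional normal PDF whose mean and spread both depend on the large-scale state. The coefficients and predictor choices are purely illustrative, not the fitted WICCI statistical model.

```python
import math
import random

def tmax_pdf_params(predictors, mean_coeffs, sd_coeffs):
    """Conditional normal PDF for daily Tmax: the large-scale predictors
    set both the mean and the (log) standard deviation, so the spread can
    vary with the atmospheric state, unlike classic linear regression."""
    mean = mean_coeffs[0] + sum(c * p for c, p in zip(mean_coeffs[1:], predictors))
    sd = math.exp(sd_coeffs[0] + sum(c * p for c, p in zip(sd_coeffs[1:], predictors)))
    return mean, sd

def sample_tmax(predictors, mean_coeffs, sd_coeffs, rng):
    """Draw one possible small-scale realization of Tmax given the large-scale state."""
    mean, sd = tmax_pdf_params(predictors, mean_coeffs, sd_coeffs)
    return rng.gauss(mean, sd)

# Illustrative predictors: debiased GCM 2-m Tmax (deg C) and column RH (0-1)
mean_coeffs = (1.0, 0.9, 2.0)   # intercept, then one coefficient per predictor
sd_coeffs = (0.0, 0.05, -0.1)
rng = random.Random(0)
draws = [sample_tmax((3.0, 0.6), mean_coeffs, sd_coeffs, rng) for _ in range(20000)]
```

With these toy coefficients the conditional mean is 1.0 + 0.9 × 3.0 + 2.0 × 0.6 = 4.9°C; repeated draws sample the distribution of small-scale states consistent with that large-scale day.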
The reanalysis predictors for daily maximum (minimum) temperature include the 2-m maximum (minimum) temperature itself, the 1000-m wind vector, and column-integrated relative humidity. Here, column-integrated relative humidity measures the degree of saturation of the lower to midtroposphere, computed as the ratio of the sum of specific humidity between sigma levels 0.9 and 0.5 to the sum of the saturation specific humidity between those levels. For the probability of measurable precipitation, the predictors are precipitation rate to the 0.25 power, column-integrated relative humidity, the average wind vector between sigma levels 0.9 and 0.5, and a pseudo-K index. The inclusion of wind as a predictor allows for the representation of lake-effect precipitation. The K index is defined as
K = (T_850 − T_500) + TD_850 − (T_700 − TD_700),
where T is temperature and TD is dewpoint, both in degrees Celsius, with subscripts denoting pressure levels in hPa. Given that this equation contains three temperature values but only two dewpoint values, a projected increase in temperature would almost certainly imply increased instability, whether realistic or not. If relative humidity is constant, which is a reasonable first-order approximation for global warming (Held and Soden 2000), then both the dewpoint temperature and the actual temperature increase by the same amount. Therefore, the pseudo-K index is computed by eliminating the T_850 term; to make the index suitable for high elevations, sigma levels are used in place of the pressure levels. The pseudo-K index is defined as
pseudo-K = TD_850 − T_500 − (T_700 − TD_700),
with each term evaluated at the sigma level corresponding to the indicated pressure level.
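The warming insensitivity of the pseudo-K index can be checked directly: under a uniform increase of temperature and dewpoint (i.e., constant relative humidity), the K index rises one-for-one while the pseudo-K index is unchanged. A small Python check, written in the pressure-level form (the downscaling itself evaluates the terms at sigma levels), with arbitrary sounding values:

```python
def k_index(t850, t700, t500, td850, td700):
    """Classic K index (deg C): three temperature terms, two dewpoint terms."""
    return (t850 - t500) + td850 - (t700 - td700)

def pseudo_k_index(t700, t500, td850, td700):
    """K index with the T850 term eliminated, so a uniform warming at
    constant relative humidity leaves the index unchanged."""
    return -t500 + td850 - (t700 - td700)

base = {"t850": 15.0, "t700": 5.0, "t500": -10.0, "td850": 12.0, "td700": 1.0}
warm = {k: v + 2.0 for k, v in base.items()}  # uniform 2 deg C warming, constant RH
dk = k_index(**warm) - k_index(**base)
dpk = (pseudo_k_index(warm["t700"], warm["t500"], warm["td850"], warm["td700"])
       - pseudo_k_index(base["t700"], base["t500"], base["td850"], base["td700"]))
# dk is +2.0 (spurious instability increase); dpk is 0.0
```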
The predictors for precipitation amount include precipitation itself, the pseudo-K index, available water, and available water flux. The latter two predictors are motivated by the scaling relationships between moisture and precipitation under climate change (O’Gorman and Schneider 2009). Available water is defined as the amount of water that is condensed out when an air parcel lifts vertically from its current pressure to half its pressure. The available water and available water fluxes (computed using the wind field) are averaged vertically between sigma levels 0.9 and 0.5 before serving as predictors.
The above statistical models determine the spatially and temporally varying PDFs of maximum and minimum temperature and precipitation at each station and day in the observed record. To apply the downscaling to climate models, the cumulative distribution function remapping algorithm of Wood et al. (2004) is used to debias the GCM predictors to match the reanalysis. The statistical models are then applied to the debiased model predictors, creating downscaled PDFs for each station and day in the climate models.
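A minimal empirical version of this cumulative distribution function remapping is sketched below. The operational downscaling debiases each predictor against the reanalysis; here toy sorted training samples and function names stand in for the real fields.

```python
import bisect

def ecdf(sorted_sample, x):
    """Empirical CDF: fraction of training values <= x."""
    return bisect.bisect_right(sorted_sample, x) / len(sorted_sample)

def inv_ecdf(sorted_sample, q):
    """Empirical quantile function (inverse CDF)."""
    i = min(len(sorted_sample) - 1, max(0, int(q * len(sorted_sample)) - 1))
    return sorted_sample[i]

def remap(gcm_train, ref_train, x):
    """Debias a GCM predictor value: look up its quantile in the GCM
    training sample, then return the same quantile of the reanalysis
    sample (CDF remapping in the spirit of Wood et al. 2004)."""
    return inv_ecdf(ref_train, ecdf(gcm_train, x))

# Toy training data: the "GCM" runs 2 deg C warmer than the "reanalysis".
ref = [i * 0.01 for i in range(1000)]          # 0.00 .. 9.99
gcm = [i * 0.01 + 2.0 for i in range(1000)]    # 2.00 .. 11.99
debiased = remap(gcm, ref, 7.0)                # about 5.0
```

Applied independently to each predictor, this aligns the GCM's distribution with the reanalysis before the statistical models are evaluated.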
To create a gridded downscaling dataset, the PDF parameters are interpolated to a predefined grid. In the study by Notaro et al. (2011), the PDFs were interpolated to a grid based on distance to surrounding stations. The mean temperature at a grid point g was given by the weighted average
T(g) = Σj w[d(g, sj); D] T(sj) / Σj w[d(g, sj); D], (3)
where the summation is across stations, d(g, sj) is the distance between the grid point g and station sj, w is a weighting function that decreases with distance, and D is a parameter that is fit by leaving each station out in turn and using (3) to predict the station’s temperature from the remaining stations. The value of D that minimizes the error over all stations was used, and a different value of D was found for each variable and month. In the new downscaling, (3) is modified so that the weighting also depends on the difference in the mean climate between the station sj and the grid point g:
wj = w[d(g, sj), ΔT(g, sj); D1, D2], (4)
where ΔT(g, sj) is the mean temperature difference between the grid point and the station. The mean climate is obtained from the Parameter-Elevation Regressions on Independent Slopes Model (PRISM; Daly 2006; Daly et al. 2008) and the Canadian Climate Normals dataset (McKenney et al. 2001). These datasets are interpolated to the downscaling grid and to the stations using bilinear interpolation. Using PRISM and the Canadian Climate Normals dataset adds information because the fitting of the conditional PDF estimates requires stations with extensive records, while the PRISM and Canadian Climate Normals data are intended only to estimate the mean; many more stations are therefore available to constrain the mean climate in those datasets. PRISM also accounts for topography and proximity to coastlines in its interpolation scheme. In (4), the mean climate effect is intentionally of lower order than the distance term, so that the distance term dominates when a station is far removed from the grid point; stations with a similar climate are given extra weight only if they are relatively close to the grid point. The constants D1 and D2 are found by cross validating over stations.
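As an illustration of the weighting idea only: the exponential kernel below is an assumption of this sketch, whereas the study's actual functional form and its constants D1 and D2 are determined by cross validation over stations.

```python
import math

def station_weights(dists_km, clim_diff_c, d1, d2):
    """Normalized station weights that decay with distance and, secondarily,
    with the difference between the station's and grid point's mean climate
    (from PRISM / Canadian Climate Normals). Kernel form is illustrative."""
    raw = [math.exp(-d / d1 - abs(dt) / d2) for d, dt in zip(dists_km, clim_diff_c)]
    total = sum(raw)
    return [w / total for w in raw]

# Two equidistant stations; the one whose mean climate matches the grid
# point (clim_diff = 0 deg C) receives more weight.
w = station_weights([30.0, 30.0], [0.0, 3.0], d1=100.0, d2=5.0)
```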
In regions of complex topography and data-sparse regions of Canada, the downscaling is relatively unconstrained by data, as the applied approach is restricted to stations with long records. Therefore, PRISM and the Canadian Climate Normals dataset are also used to adjust the mean maximum and minimum temperature and precipitation. For temperature, an additive correction is used, which is the difference between the PRISM and/or Canadian Climate Normals dataset and gridded downscaled data. For precipitation, a multiplicative correction is used. This correction is not meant to correct the downscaled mean of the station data. For example, the conditional mean for temperature is determined by ordinary least squares regression, so the mean is exact to numerical precision. Instead, this correction is used to provide additional information in between stations in regions of complex topography and in data-sparse regions, as confirmed by the fact that the corrections are small, except in mountains and parts of Canada.
Spatially and temporally varying PDFs of maximum and minimum temperature and precipitation are now known at each grid point and day. To generate the downscaled data, random numbers are drawn from the PDFs, producing a possible realization of the small-scale state that is consistent with the large-scale fields. The random numbers are correlated in space (and, in the case of temperature, in time) so that the spatial and temporal correlations of the downscaled variables are similar to observations. One of the randomly generated realizations is used to drive the snow model.
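The spatial correlation of the random draws can be illustrated with the simplest two-point case: a pair of standard normal deviates with a prescribed correlation. The full scheme generates correlated fields across the entire grid; this sketch only shows the basic construction.

```python
import math
import random

def correlated_draws(rho, n, rng):
    """Draw n pairs of standard normal deviates with correlation rho,
    the two-point analog of the spatially correlated random numbers used
    to keep neighboring grid cells consistent with one another."""
    pairs = []
    for _ in range(n):
        z1 = rng.gauss(0.0, 1.0)
        z2 = rho * z1 + math.sqrt(1.0 - rho ** 2) * rng.gauss(0.0, 1.0)
        pairs.append((z1, z2))
    return pairs

rng = random.Random(42)
pairs = correlated_draws(0.8, 20000, rng)
```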
Wintertime variance and covariance of temperature and precipitation from observations and downscaled data are used to gauge the fidelity of the downscaling. The downscaling, by design, reproduces the mean climate, so this analysis does not focus on the mean. All variables are averaged across the nine climate models before plotting the results. For the standard deviation in daily maximum temperature (see Figs. S1a,b in the supplementary material), the downscaling reproduces the strong variability from the Great Plains to the Ohio Valley and weak variability near the Great Lakes and coasts. The largest errors are found in Florida, where the downscaled maximum temperature variability is too large. For the standard deviation in daily minimum temperature (Figs. S1c,d), the downscaling reproduces the strong variability in the north and relatively weak variability near the Great Lakes, Atlantic coast, and southwestern Great Plains.
Given that projected snowfall extremes are examined in this study, precipitation extremes in the observations and downscaled data are compared (Fig. S2 in the supplementary material). The downscaled extremes are calculated directly from the PDFs and thus exhibit less sampling variability (spatial noise) than the observations. The downscaling reproduces the spatial patterns and magnitudes of the observed 99.9th and 99.99th percentiles. The largest differences are found across the southeastern United States for the 99.99th percentile, although given the limited sample size, much of the difference likely results from sampling variability.
The correlation between daily wintertime temperature and precipitation is assessed (Fig. S3 in the supplementary material). Unlike the precipitation and temperature variability, no attempt was made to fit the correlation between temperature and precipitation; the correlations in the downscaling arise naturally from covariability in the large-scale predictors used to individually predict temperature and precipitation. For maximum temperature and precipitation, the downscaling reproduces the general pattern of positive correlations in the Northeast and negative correlations in the western Great Plains. The downscaling also shows reduced correlations to the lee of the Appalachians, although the downscaled correlations there, and across the Northeast generally, are too positive. For minimum temperature and precipitation, the downscaling reproduces the general large-scale patterns, although the downscaled correlations are too large across the northwestern Great Plains and around Tennessee.
b. Cumulative winter severity index
Schummer et al. (2010) developed a cumulative winter severity index (CWSI) to explain changes in the relative abundance of dabbling ducks during autumn–winter at midlatitude migration areas of eastern North America. CWSI is the running sum of a daily winter severity index (WSI), defined as
WSI = TEMP + TEMPDAYS + SNOW + SNOWDAYS,
where TEMP is the mean daily temperature (degrees Celsius multiplied by −1), TEMPDAYS is the number of consecutive days with a mean temperature below 0°C, SNOW is the snow depth (converted to inches by dividing by 2.54 cm in.−1), and SNOWDAYS is the number of consecutive days with a snow depth of at least 2.54 cm. CWSI is cumulatively summed each day from 1 September to 31 March. For temperate regions, CWSI remains below zero throughout the warm season, but as the air temperature consistently drops below freezing and a snowpack develops from late autumn to early winter, the index accrues and achieves positive values. CWSI reflects the current and cumulative effects of temperature on ducks’ energy expenditure and of snow and ice cover on the availability of food to meet the ducks’ energy needs. Based on the CWSI threshold of 7.2 for mallards, researchers can identify the likely date when duck abundance at northern latitudes will begin to decrease through southward migration (Schummer et al. 2010).
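These definitions translate directly into code. The sketch below accumulates CWSI over a toy series of ten mild autumn days followed by a cold, snowy spell, using running streak counters for TEMPDAYS and SNOWDAYS.

```python
def cwsi_series(temps_c, snow_depth_cm):
    """Daily cumulative winter severity index per the definitions above:
    TEMP (= -1 x mean daily temperature, deg C), consecutive days below
    0 deg C, snow depth in inches, and consecutive days with depth >= 2.54 cm."""
    cwsi = 0.0
    tempdays = 0
    snowdays = 0
    out = []
    for t, depth in zip(temps_c, snow_depth_cm):
        tempdays = tempdays + 1 if t < 0.0 else 0
        snowdays = snowdays + 1 if depth >= 2.54 else 0
        wsi = (-1.0 * t) + tempdays + (depth / 2.54) + snowdays
        cwsi += wsi
        out.append(cwsi)
    return out

# Ten mild days (5 deg C, no snow), then five cold days (-5 deg C, 7.62 cm snow)
temps = [5.0] * 10 + [-5.0] * 5
depth = [0.0] * 10 + [7.62] * 5
series = cwsi_series(temps, depth)
```

In this toy series, CWSI drops to −50 during the mild spell and then climbs past the 7.2 mallard threshold on the fifth cold day.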
c. Snow model
SNOW-17 is an empirically based, operational snow accumulation and ablation model (Anderson 1973, 2002, 2006; Franz et al. 2008a,b; Raleigh and Lundquist 2012) and a component of the NWS River Forecast System. SNOW-17 treats the key physical processes that regulate snow dynamics in a conceptual manner, requiring only commonly available observations of temperature and precipitation as inputs (Fig. S4 in the supplementary material). In treating a column of snowpack, the model addresses the following principal processes: form of precipitation, snow accumulation, energy exchange at the snow–air interface, heat exchange at the soil–snow interface, heat storage and deficit within the snowpack, and retention and transmission of liquid water through the snow cover (Anderson 2006). Snow cover is modeled as a single bulk layer with a specified water holding capacity. The density of new snow is treated as a function of air temperature, while the density of existing snow responds to compaction, destructive metamorphism, and the component of melt metamorphism due to the presence of liquid water (Anderson 2006). The simple temperature-based SNOW-17 model performs at least as well as more physically based energy balance models (Anderson 1973, 1976; Ohmura 2001; Zappa et al. 2003; Franz et al. 2008a). There is precedent for applying SNOW-17 in climate change studies (Notaro et al. 2011). Prior studies have used either SNOW-17 or the Sacramento Soil Moisture Accounting Model (Burnash et al. 1973) coupled to SNOW-17, forced by climate projection data, to investigate potential changes in California water resources and reservoir operations (Miller et al. 2003; Brekke et al. 2009; Maurer et al. 2010; Hanson et al. 2012).
Hourly air temperature data at 12 weather stations across the United States for 1980–99 are used to compute their mean diurnal cycle for December–February (DJF) and identify the times of maximum and minimum temperature. The station data are extracted from the National Climatic Data Center (NCDC) global and U.S. integrated surface hourly dataset (available at http://www.ncdc.noaa.gov/cdo-web) for Billings, Montana; Havre, Montana; Pueblo, Colorado; and Cheyenne, Wyoming, in the western study region (103°–116°W); for La Crosse, Wisconsin; Dubuque, Iowa; Madison, Wisconsin; and Oklahoma City, Oklahoma, in the central region (87°–103°W); and for Bangor, Maine; Atlanta, Georgia; Greenville, South Carolina; and Teterboro, New Jersey, in the eastern region (67°–87°W). Based on analysis of these stations, the daily minimum and maximum temperatures in DJF typically occur at 1400 UTC (range of 1400–1500 UTC) and 2200 UTC (2000–2200 UTC) in the western region, 1300 UTC (1100–1400 UTC) and 2100 UTC (2000–2100 UTC) in the central region, and 1200 UTC (1000–1200 UTC) and 2000 UTC (1900–2100 UTC) in the eastern region, respectively. This information aids the interpolation of downscaled daily maximum and minimum temperatures to hourly values for input to SNOW-17. Based on hourly present weather (rain or snow occurrence) and air temperature for 1980–99 at 135 stations across the study region, from the NCDC global and U.S. integrated surface hourly dataset, the mean snow–rain temperature threshold is computed at different locations and assigned to the SNOW-17 parameter PXTEMP. Climatologies of snow depth and surface soil temperature for DJF from the North American Regional Reanalysis (NARR; Mesinger et al. 2006) are used to estimate SNOW-17’s antecedent temperature index parameter (TIPM) and the parameter for daily snowmelt at the snow–soil interface (DAYGM), respectively.
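Once PXTEMP is estimated for a location, the form-of-precipitation decision reduces to a threshold test on air temperature. The following is a simplified sketch of that rain–snow partition, not SNOW-17's actual source code:

```python
def partition_precip(precip_mm, temp_c, pxtemp):
    """Split hourly precipitation into snowfall water equivalent and rain
    using a single snow-rain temperature threshold (deg C), analogous to
    SNOW-17's PXTEMP parameter: at or below the threshold, snow; above, rain."""
    snow = [p if t <= pxtemp else 0.0 for p, t in zip(precip_mm, temp_c)]
    rain = [p if t > pxtemp else 0.0 for p, t in zip(precip_mm, temp_c)]
    return snow, rain

# Three hours of precipitation with a locally estimated threshold of 1.0 deg C
snow, rain = partition_precip([2.0, 3.0, 1.0], [-1.5, 0.5, 2.0], pxtemp=1.0)
```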
The SI parameter, which specifies the amount of snow water equivalent above which 100% areal snow cover is present, is determined by applying values used by the Noah land surface model to an International Geosphere–Biosphere Programme land cover map. Detailed descriptions of the SNOW-17 parameters are provided in the supplemental material. Annual snowfall and snow depth climatologies for 1981–2000 are computed for a different set of 196 stations across the study region from NCDC local climatological data publications (available at http://www.ncdc.noaa.gov/IPS/lcd/lcd.html) and Environment Canada’s Canadian Climate Normals (available at http://climate.weather.gc.ca/climate_normals) and are used to tune and evaluate SNOW-17. Elevation is retrieved from the 2′ Gridded Global Relief Data (ETOPO2; National Geophysical Data Center 2013).
d. SNOW-17 simulations and their evaluation
After calibrating SNOW-17 and its parameters across the study region, the model is run for the late twentieth century (1981–2000), mid-twenty-first century (2046–65), and late twenty-first century (2081–2100), according to the A2 and B1 emission scenarios. For each time period, statistically downscaled climate data from nine CMIP3 GCMs are used to force the snow model across the 10 LCC regions of central-eastern North America (Fig. 1). SNOW-17 generally requires hourly climatic inputs. Downscaled daily maximum and minimum temperatures are converted to hourly temperature data using a cubic spline under tension, with the minimum and maximum temperature assigned to 1200–1400 and 2000–2200 UTC, respectively (depending on longitude). This interpolation approach preserves the differing projected changes in maximum versus minimum temperature, recognizing that most GCMs simulate a greater increase in minimum temperature. Sensitivity experiments, in which the timing of the maximum or minimum temperature is shifted ahead or back by one or two hours, demonstrate that the simulated snowfall patterns are robust and not overly sensitive to small adjustments in the timing of the temperature diurnal cycle. Daily precipitation amounts are converted into hourly values by randomly distributing precipitation on wet days into continuous 6-h periods (0000–0600, 0600–1200, 1200–1800, or 1800–2400 UTC), which is the typical duration of a cold-season precipitation event based on studies for the Midwest and northeastern United States (Cox and Armington 1914; Changnon 1969; Knapp et al. 2000; Laird et al. 2009).
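A simplified single-day version of this temporal disaggregation is sketched below. The study uses a cubic spline under tension across consecutive days; here a half-cosine interpolant is substituted, and the anchor hours are the central-region values (minimum at 1300 UTC, maximum at 2100 UTC).

```python
import math
import random

def hourly_temps(tmin, tmax, hour_min=13, hour_max=21):
    """Interpolate one day's Tmin/Tmax (deg C) to 24 hourly values with a
    smooth half-cosine anchored at the observed extreme hours (a stand-in
    for the tensioned cubic spline used across consecutive days)."""
    dur_cool = 24 - (hour_max - hour_min)  # hours from max back down to min
    temps = []
    for h in range(24):
        if hour_min <= h <= hour_max:           # warming branch
            frac = (h - hour_min) / (hour_max - hour_min)
        elif h > hour_max:                      # evening cooling branch
            frac = 1.0 - (h - hour_max) / dur_cool
        else:                                   # early hours, still cooling
            frac = 1.0 - (h + 24 - hour_max) / dur_cool
        temps.append(tmin + (tmax - tmin) * 0.5 * (1.0 - math.cos(math.pi * frac)))
    return temps

def hourly_precip(daily_mm, rng):
    """Assign the daily total uniformly to one randomly chosen continuous
    6-h block (0000-0600, 0600-1200, 1200-1800, or 1800-2400 UTC)."""
    start = rng.choice([0, 6, 12, 18])
    return [daily_mm / 6.0 if start <= h < start + 6 else 0.0 for h in range(24)]

t = hourly_temps(-10.0, 0.0)
p = hourly_precip(12.0, random.Random(1))
```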
SNOW-17 requires regionally specific calibration to produce quality snow simulations. The model applies six major parameters (SCF, MFMAX, MFMIN, UADJ, SI, and ADC) and six minor parameters (MBASE, NMF, DAYGM, PLWHC, PXTEMP, and TIPM). The methodology for determining appropriate parameter values is outlined in the supplemental materials. The model is tuned to best capture climatological patterns of annual snowfall and DJF snow depth, compared to observations at 196 stations (Fig. 1).
Using the resulting parameter values, SNOW-17 is forced by statistically downscaled climate data from nine CMIP3 GCMs for 1981–2000. The simulated climatological patterns of annual snowfall and DJF snow depth (averaged among nine GCMs) are evaluated against the set of 196 stations (Fig. 2). The observed and simulated mean annual snowfall closely match, with a correlation of 0.99 (ranging from 0.95 in April to 0.99 in December–January), root-mean-square difference of 5.9 cm (ranging from 2.8 cm in November to 3.4 cm in December), and a mean absolute difference of 3.8 cm. Because of deficiencies in assigned parameter values, the model produces excessive snowfall in December–January (+8.3% in December) and too little in November and February–April (−14.8% in November). Regarding DJF climatological snow depth, the model and observations exhibit slightly less agreement, with a correlation of 0.90 (range from 0.86 in March to 0.95 in November), root-mean-square difference of 4.4 cm (range from 0.4 cm in November to 6.7 cm in February), and a mean absolute difference of 1.3 cm. The simulated snowpack is too deep in spring, because of insufficient snowmelt. Percent biases range from −7% in November to +37% in March. Overall, SNOW-17 performs well in reproducing the modern-day snow climatology across the vast study region when forced with statistically downscaled climate data, with a distinct improvement over the original GCMs (Fig. S7 in the supplementary material).
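The evaluation statistics quoted above are standard; for reference, they can be computed as follows, where the five station climatologies are hypothetical values, not the study's 196-station dataset.

```python
import math

def evaluation_stats(obs, sim):
    """Pearson correlation, root-mean-square difference, and mean absolute
    difference between observed and simulated station climatologies."""
    n = len(obs)
    mo = sum(obs) / n
    ms = sum(sim) / n
    cov = sum((o - mo) * (s - ms) for o, s in zip(obs, sim)) / n
    vo = sum((o - mo) ** 2 for o in obs) / n
    vs = sum((s - ms) ** 2 for s in sim) / n
    corr = cov / math.sqrt(vo * vs)
    rmsd = math.sqrt(sum((s - o) ** 2 for o, s in zip(obs, sim)) / n)
    mad = sum(abs(s - o) for o, s in zip(obs, sim)) / n
    return corr, rmsd, mad

# Hypothetical annual snowfall climatologies (cm) at five stations
obs = [50.0, 120.0, 200.0, 80.0, 150.0]
sim = [55.0, 115.0, 210.0, 78.0, 160.0]
corr, rmsd, mad = evaluation_stats(obs, sim)
```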
3. Results
a. Downscaled climate projections
At every location in the study region, and according to both emission scenarios, warming is a robust projection for December–March (DJFM) by the mid and late twenty-first century by all nine downscaled GCMs. According to the B1 and A2 scenarios, the area-average increase in surface air temperature by the mid-twenty-first century is +2.0°C (range from +1.3° to +2.6°C) and +2.9°C (range from +2.0° to +4.3°C) and by the late twenty-first century it is +3.1°C (range from +2.1° to +4.2°C) and +5.3°C (range from +4.1° to +6.2°C), respectively (Fig. 3). The scenarios diverge by the late twenty-first century in terms of the magnitude of projected warming. The magnitude of warming increases with latitude, with a local minimum across the Great Lakes basin. The greatest mean projected warming by the late twenty-first century, from the A2 scenario, is found across the Upper Midwest and Great Lakes LCC (+6.0°C) and Plains and Prairie Potholes LCC (+5.7°C) (Tables 1 and 2; Fig. 3). In this study, at each grid cell, the “low-end projection” and “high-end projection” in temperature refer to the least and most warming, respectively, among the nine downscaled GCMs. The A2-based low-end and high-end projections by the late twenty-first century differ dramatically by roughly 3.5°–4°C across Michigan and southern Canada (Figs. 3a,c,d), indicating consistent projections of warming but much uncertainty regarding its magnitude (particularly in minimum temperatures). This uncertainty may be attributed to 1) greater natural variability with increased latitude, 2) variations in the intensity of snow–ice feedbacks among GCMs, and 3) differing representations of the Great Lakes among the original GCMs (e.g., GCMs excluding the lakes may warm more regionally).
There is even greater uncertainty in DJFM projections of precipitation than temperature among GCMs. The area-average projected change in DJFM precipitation by the mid-twenty-first century is +0.09 mm day−1 (range from −0.01 to +0.21 mm day−1), or +5.3%, according to the B1 scenario and +0.13 mm day−1 (range from +0.03 to +0.25 mm day−1), or +7.4%, according to the A2 scenario (Fig. 4); projections for the late twenty-first century are +0.13 mm day−1 (range from −0.01 to +0.31 mm day−1), or +7.4%, for the B1 scenario and +0.20 mm day−1 (range from +0.00 to +0.32 mm day−1), or +11.5%, for the A2 scenario. The model-mean DJFM projections for the late twenty-first century, based on the A2 scenario, range from −0.19 mm day−1 (−10.7%) in the Gulf Coast Prairie LCC to +0.61 mm day−1 (+21.7%) in the North Atlantic LCC, with the largest increases over the northeastern United States (Fig. 4b, Tables 1 and 2). Precipitation projections largely disagree among the GCMs across the Gulf Coastal Plains and Ozarks LCC, ranging from drying of about −1 mm day−1 to moistening of about +1.0 mm day−1 (Figs. 4a,c,d). While a potential decline in precipitation across the southern LCCs would reinforce the effect of warming to reduce snowfall, a robust projection of greater precipitation across the northernmost LCCs should counter the impact of warming on future snowfall.
b. Benefits of downscaling for snow simulations
Given the coarse horizontal resolution of the CMIP3 GCMs (mean grid cell size of 7.4 × 104 km2) and insufficient representation of the Great Lakes (either completely absent or represented by a few grid cells), lake-effect processes are largely neglected; unfortunately, this limitation continues to exist in most CMIP5 GCMs. However, by including wind as a precipitation predictor in the statistical downscaling, lake-effect precipitation events are captured in the downscaling product, and therefore SNOW-17 simulations forced by this downscaled data include lake-effect snowstorms. To illustrate this point, two December days are selected from the models GISS-ER and ECHAM5 (see Table 3 for model information), with cold air outbreaks and associated west-northwesterly low-level winds over the lakes at a time of the year when ice cover is usually limited and therefore lake-effect snow would be expected (Figs. 5a,b). However, in both cases, there is no signature of lake-effect precipitation in the GCM (Figs. 5a,b). SNOW-17, when forced by the statistical downscaling data, produces distinct lake-effect snowfall maxima downwind of the lakes on both days (Figs. 5c,d). The downscaling relies on the strength of GCMs, which is the representation of large-scale circulation patterns, and adds accuracy to the projections by relating the GCM’s large-scale circulation patterns to the expected local climatic response, as established through observations.
Compared to the raw GCMs, the use of statistically downscaled climate data in SNOW-17 results in a more reasonable spatial pattern of annual mean climatological snowfall across the study region. The mean observed snowfall peaks across the northeastern United States, southern Ontario and Quebec, the Great Lakes basin, the Appalachian Mountains, and on the lee side of the Rocky Mountains (Figs. 2 and 6c). The mean snowfall climatology among the raw CMIP3 models crudely captures this spatial pattern but contains significant biases and minimal signature of the Great Lakes or Appalachian Mountains (Fig. 6a). However, when forced by downscaled climate data, SNOW-17 produces minimal biases in annual snowfall and accurately represents the lake-effect snow maxima and elevated snowfall over the Appalachians (Fig. 6b). For the comparisons in Fig. 6 (and Fig. S7), given that the GCMs only saved the liquid equivalent of snowfall, a 10:1 ratio is applied to crudely convert the liquid-equivalent snowfall to simulated snowfall depth. Krasting et al. (2013) likewise applied a 10:1 ratio to the CMIP3 snowfall mass flux output but cautioned that this ratio has significant observed regional variations. The lake-effect zones of the Great Lakes basin are typically characterized by snow-to-liquid-equivalent ratios of 14:1 or 15:1 (Baxter et al. 2005; Kutikoff 2013).
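The fixed-ratio conversion described above amounts to a one-line calculation; the following Python sketch is illustrative only (not part of the study's code), with the 10:1 default and the higher lake-effect ratios noted in the text as alternatives.

```python
# Illustrative sketch: converting GCM liquid-equivalent snowfall to
# snowfall depth with a fixed snow-to-liquid ratio, as in the Fig. 6
# comparison. Function name and values are hypothetical examples.

def snowfall_depth(liquid_equiv_cm, ratio=10.0):
    """Crudely convert liquid-equivalent snowfall (cm) to snow depth (cm).

    A 10:1 ratio is the conventional default, but observed ratios vary
    regionally; lake-effect zones of the Great Lakes basin are closer
    to 14:1 or 15:1 (Baxter et al. 2005).
    """
    return liquid_equiv_cm * ratio

# Example: 1.2 cm of liquid-equivalent snowfall
print(snowfall_depth(1.2))        # 10:1 default -> 12.0 cm
print(snowfall_depth(1.2, 15.0))  # lake-effect ratio -> 18.0 cm
```

The regional variation in observed ratios is the reason the text labels this conversion "crude": a single constant cannot capture both synoptic and lake-effect snow densities.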
c. Snow projections
The model-mean projections, for both emission scenarios and both time periods, indicate a warming-induced decline in annual snowfall across the entire study region (Figs. 7b,e,h,k). The area-average mean projections for annual snowfall, by the mid-twenty-first century, are −17.9 cm (range from −8.4 to −28.6 cm), or −16.6%, according to the B1 scenario and −26.0 cm (range from −19.0 to −37.8 cm), or −24.2%, according to the A2 scenario (Fig. 7); these projections for the late twenty-first century are −28.4 cm (range from −14.6 to −35.9 cm), or −26.4%, for the B1 scenario and −45.0 cm (range from −32.5 to −55.7 cm), or −41.8%, for the A2 scenario. The largest declines, by the late twenty-first century, are projected for the North Atlantic LCC (−58.8 cm, or −30.8%, for B1 versus −92.0 cm, or −48.1%, for A2) and the Upper Midwest and Great Lakes LCC (−53.9 cm, or −24.1%, for B1 versus −89.9 cm, or −40.2%, for A2) (Tables 1 and 2), as precipitation falls more in the form of rain than snow. By the late twenty-first century, the two emission scenarios result in similar patterns of projected snowfall decline, although the high-end projection (defined as the largest decline) for A2 produces about 25% greater snowfall declines across the domain than does B1 (Figs. 7f,l). According to the low-end projection (smallest decline, or a potential increase) for annual snowfall, relatively modest increases are possible across southern Canada, the Plains and Prairie Potholes LCC, and the southern states, as increases in cold-season precipitation offset the warming trend (Figs. 7a,d,g,j).
The mean annual frequency of snowfall days (daily snowfall in excess of 1 cm) is projected to decline everywhere by the mid and late twenty-first century, according to model-mean results from both emission scenarios (Figs. 8a–d). During the late twentieth century, SNOW-17 simulates a mean frequency of snowfall days that ranges from 0.01 day in the Peninsular Florida LCC to 39.08 days in the Upper Midwest and Great Lakes LCC. The largest model-mean declines in snowfall days are projected for the Upper Midwest and Great Lakes LCC (for B1: mean = −8.9 days, or −22.9%, range from −2.6 to −13.8 days; and for A2: mean = −15.6 days, or −39.9%, range from −12.2 to −19.6 days) and the North Atlantic LCC (for B1: mean = −8.2 days, or −28.2%, range from −4.4 to −11.8 days; and for A2: mean = −13.7 days, or −46.7%, range from −8.6 to −18.0 days) (Tables 1 and 2, Figs. 8a–d). According to the B1 scenario, a modest increase in snowfall day frequency by the mid-twenty-first century is identified across northern Wisconsin, Minnesota, and Montana for the low-end projection (defined as the smallest decline, or greatest increase, in snowfall days among the nine GCMs).
Despite large projected declines in annual-mean snowfall and snowfall frequency across the study region, SNOW-17 produces only slight changes in mean snowfall intensity, or total daily snowfall per event (Figs. 8e–h). Most areas are simulated to experience lighter snowfall events, with declines in mean snowfall intensity simulated across 64% of the study region by the mid-twenty-first century with the B1 scenario, and across 79% of the region by the late twenty-first century with the A2 scenario (Figs. 8e–h). However, an increase in snowfall intensity is possible for southern Canada and the Gulf states (Figs. 8e–h). Southern Canada may experience lower annual snowfall totals, fewer snowfall days, and higher snowfall totals per event (Figs. 7 and 8), which suggests fewer, but more intense, snowfall events. Model-mean projections for snowfall totals per event, by the late twenty-first century, are modest and differ notably across models. Projected intensity changes for the Great Plains LCC are −0.26 cm (range from −1.00 to +0.54 cm), or −4.1%, for the B1 scenario and −0.52 cm (range from −1.29 to +0.24 cm), or −8.4%, for the A2 scenario (Tables 1 and 2). For the Gulf Coastal Plains and Ozarks LCC, these projections include +0.10 cm (range from −0.58 to +1.24 cm), or +2.2%, for the B1 scenario and −0.62 cm (range from −1.37 to +0.03 cm), or −14.3%, for the A2 scenario; in this region, snowfall events are rare, making these conclusions uncertain. A distinct climate change signal in snow event intensity is generally difficult to detect, with prior studies rarely tackling this area of uncertainty.
Projected percentage changes in the frequency of snowfall events of different daily magnitudes are computed both across the study region and for individual LCC regions (Fig. 9). For both scenarios and time periods, the projected percentage reduction in the frequency of daily snowfall events is maximized around 15–23 cm, ranging from −21.5% by the mid-twenty-first century under the B1 scenario to −47.6% by the late twenty-first century under the A2 scenario (Fig. 9). Consistent with GCM projections of heavier precipitation events later this century, SNOW-17 simulates more frequent intense snowfall events, particularly by the mid-twenty-first century, under the low-end emission scenario (B1), and across the northern and northwestern portions of the study region. Specifically, daily snowfall events exceeding (below) 70 cm are projected to become more (less) frequent across the study region by the mid-twenty-first century, for the B1 scenario (Fig. 9). Under more extreme warming projections (late twenty-first century or A2), the potential for heavier snowfall events diminishes substantially (Fig. 9). The projected increase in the frequency of heavy daily snow events is primarily confined to the Plains and Prairie Potholes LCC and the Upper Midwest and Great Lakes LCC (Fig. 9), for events exceeding 56 cm for the mid-twenty-first century and B1 scenario or 76 cm for the late twenty-first century and A2 scenario.
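The binned-frequency comparison underlying Fig. 9 can be sketched in a few lines. The function below is an illustrative reconstruction, not the study's code; the event magnitudes and bin edges are invented placeholders chosen so the 15–23 cm bin appears explicitly.

```python
# Sketch of a percentage-change-per-magnitude-bin calculation for
# daily snowfall events. All inputs here are hypothetical examples.
import numpy as np

def pct_change_by_bin(hist_events_cm, future_events_cm, bin_edges_cm):
    """Percentage change in daily snowfall-event counts per magnitude bin."""
    hist, _ = np.histogram(hist_events_cm, bins=bin_edges_cm)
    fut, _ = np.histogram(future_events_cm, bins=bin_edges_cm)
    hist = hist.astype(float)
    # Bins with no historical events yield inf/nan rather than raising
    with np.errstate(divide="ignore", invalid="ignore"):
        return 100.0 * (fut - hist) / hist

# Toy example with bins for light, moderate (15-23 cm), heavy, and
# extreme daily events
edges = [0, 15, 23, 50, 100]
historical = [2, 5, 10, 18, 20, 30, 60]
future = [2, 10, 30, 60, 80]
print(pct_change_by_bin(historical, future, edges))
# roughly: -33% (light), -100% (15-23 cm), 0% (heavy), +100% (extreme)
```

In this toy case the moderate bin shows the largest relative loss while the heaviest bin gains, mirroring the qualitative pattern the text describes for the northern LCCs.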
With larger projected snowfall declines in autumn than in spring, SNOW-17 generally simulates a delayed onset of the snow season that is more pronounced than its earlier termination (Fig. 10). During the cold season (November–April), the study region is projected to warm the most during November–December, with the largest snowfall reductions during December–January (see the late twenty-first century and A2 scenario in Fig. 10). The primary exception is found across the northern Great Plains, particularly South Dakota, Nebraska, Colorado, and Wyoming, where the greatest projected decline in snowfall occurs during springtime (March–April); these regions are characterized by high rain–snow temperature thresholds exceeding 1.5°C (Fig. S5a in the supplementary material).
For all 10 LCC regions, the projected percentage reduction in mean November–April snow depth exceeds that of snowfall, as projected warming also accelerates snowmelt. For example, the North Atlantic LCC is expected to experience a 48.1% reduction in annual snowfall but a 72.1% reduction in November–April snow depth by the late twenty-first century according to the A2 scenario (Tables 1 and 2, Fig. 11). The largest projected reductions in snow depth are found within the Upper Midwest and Great Lakes LCC, with a decline by the late twenty-first century of −7.63 cm (−36.9%) for the B1 scenario versus −11.94 cm (−57.8%) for the A2 scenario. Across the study region, November–April mean snow depth is projected to decline by −1.79 cm (range from −1.12 to −2.67 cm) according to the B1 scenario and −2.56 cm (range from −1.95 to −3.30 cm) according to the A2 scenario, by the mid-twenty-first century; late twenty-first-century projections include declines of −2.72 cm (range from −1.50 to −4.08 cm) for the B1 scenario and −4.11 cm (range from −3.17 to −4.78 cm) for the A2 scenario. Here, the snow line, or southernmost mean extent of snow cover, is defined by a criterion of at least 0.5 cm of mean snow depth during November–April. The snow line across the study region is anticipated to shift northward by 72 km (128 km) by the mid-twenty-first century and by 126 km (229 km) by the late twenty-first century, according to the B1 (A2) scenario. The projected northward migration rate of the snow line is less than half that of the mean DJFM isotherms (for −5° to 5°C), implying a more gradual response of the snowpack to climate change. The low-end projection among the nine GCMs suggests the potential for a slight increase in annual mean snowfall and snow depth across the southern states, which would represent a large percentage increase given the small historical totals there (Figs. 11a,d,g,j).
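The snow-line criterion can be made concrete with a short sketch. The grid values below are hypothetical, and `snow_line_lat` is an illustrative helper rather than the study's code: for each longitude, it finds the southernmost latitude whose November–April mean snow depth reaches the 0.5 cm threshold.

```python
# Hedged sketch of the snow-line criterion: the southernmost latitude,
# per longitude column, with Nov-Apr mean snow depth >= 0.5 cm.
import numpy as np

def snow_line_lat(mean_depth_cm, lats, threshold=0.5):
    """Return the per-column southernmost latitude meeting the threshold.

    mean_depth_cm: 2D array (lat, lon) of Nov-Apr mean snow depth [cm]
    lats: 1D array of latitudes, increasing northward
    """
    snowy = mean_depth_cm >= threshold  # mask of snow-covered cells
    line = np.full(mean_depth_cm.shape[1], np.nan)
    for j in range(mean_depth_cm.shape[1]):
        idx = np.where(snowy[:, j])[0]
        if idx.size:
            line[j] = lats[idx[0]]  # first (southernmost) snowy latitude
    return line

# Toy single-longitude example: depth increases northward, so the
# threshold is first crossed at 42°N
lats = np.array([38.0, 40.0, 42.0, 44.0, 46.0])
depth = np.array([[0.0], [0.2], [0.8], [3.0], [8.0]])
print(snow_line_lat(depth, lats))  # -> [42.]
```

Averaging this per-longitude latitude across the domain, and differencing between periods, yields the kilometer-scale shifts quoted in the text (about 111 km per degree of latitude).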
Projected changes in the annual number of days with an existing snowpack (≥1 cm) are assessed (Fig. 12). The regions likely to experience the largest reductions in days with snowpack are the Upper Midwest and Great Lakes LCC, North Atlantic LCC, and Plains and Prairie Potholes LCC, with respective declines of −27.0 days (−19.7%, range from −16.2 to −42.8 days), −28.5 days (−30.3%, range from −18.3 to −36.7 days), and −26.3 days (−22.2%, range from −17.1 to −47.2 days) for the B1 scenario and −47.6 days (−34.8%, range from −35.7 to −62.0 days), −47.3 days (−50.3%, range from −31.8 to −60.0 days), and −42.7 days (−35.9%, range from −29.5 to −55.9 days) for the A2 scenario (Tables 1 and 2). A zone of maximum reduction in days with snowpack is identified near 44°N, closely following the −5°C isotherm of the late twentieth-century mean climatology (Fig. 12). As this isotherm shifts northward, projected declines in snowfall accelerate. Ratios between the projected snowfall decline by the late twenty-first century and that by the mid-twenty-first century are on the order of 1.4–1.6 to the south of this isotherm and 1.6–2.0 to the north (Fig. 13). The northern LCCs can therefore expect an accelerating negative trend in annual snowfall during this century.
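The snowpack-day metric itself reduces to a simple threshold count over a daily snow-depth series. A minimal sketch, with a hypothetical one-year series for a single grid cell, is:

```python
# Illustrative count of days per year with an existing snowpack,
# defined here (as in the text) by snow depth >= 1 cm. The daily
# series is an invented example, not SNOW-17 output.

def snowpack_days(daily_depth_cm, threshold=1.0):
    """Count days with an existing snowpack (depth >= threshold cm)."""
    return sum(1 for d in daily_depth_cm if d >= threshold)

# Toy year: 90 snow-covered days, 75 trace-snow days below threshold
series = [0.0] * 200 + [5.0] * 90 + [0.5] * 75
print(snowpack_days(series))  # -> 90
```

Applying this count to each model, grid cell, and period, then differencing future from historical, gives the day-count declines reported for the LCC regions above.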
d. Case study application: Projected waterfowl distributions
To illustrate the utility of the SNOW-17-based projections, potential impacts of changing winter conditions on mallard ducks are assessed as a case study. CWSI is computed for the late twentieth, mid-twenty-first, and late twenty-first centuries, according to the A2 and B1 scenarios (Tables 1 and 2, Fig. 14). Analysis focuses on the CWSI = 7.2 isopleth, the threshold at which mallards typically migrate southward and locally decrease in abundance (Schummer et al. 2010), thereby providing an index of relative duck distribution. During Januaries in the late twentieth century, this isopleth was typically located across central Kansas, southern Illinois, and southern Ohio (Fig. 14j). In response to warming and declining snowpack, this isopleth is projected to shift northward, reaching northern Nebraska, northern Illinois, and central Michigan by the late twenty-first century under the A2 scenario. The arrival of this isopleth from Canada into the United States could generally be delayed by roughly one month in late autumn, by the late twenty-first century. The largest reductions in mean DJF CWSI are expected for the Upper Midwest and Great Lakes LCC, from 94.9 during the late twentieth century to either 60.5 or 40.3 for the B1 and A2 scenarios, respectively (Tables 1 and 2). Within the study region, the probability of WSI > 7.2 (likely southward migration) is projected to decline especially during December across the northern LCCs (e.g., Plains and Prairie Potholes) and during January across the central LCCs (e.g., Eastern Tallgrass Prairie and Big Rivers) (Fig. 14). For example, during Decembers in the late twentieth century, there was a 62.5% probability of WSI > 7.2 across the North Atlantic LCC. By the late twenty-first century, this probability is projected to decrease to 40.4% for the B1 scenario or 19.8% for the A2 scenario (Fig. 14f), suggesting a delayed southward migration during early winter.
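The exceedance probabilities quoted here reduce to a simple frequency calculation over the simulated WSI values. The sketch below is illustrative only: the WSI sample is invented, and the study derives WSI from SNOW-17 output (per Schummer et al. 2010) rather than from a short list like this.

```python
# Illustrative estimate of the probability that WSI exceeds the 7.2
# migration threshold (Schummer et al. 2010), given a hypothetical
# sample of simulated December WSI values.

def prob_exceed(wsi_values, threshold=7.2):
    """Fraction of the sample on which WSI exceeds the threshold."""
    n = len(wsi_values)
    if n == 0:
        return float("nan")
    return sum(1 for w in wsi_values if w > threshold) / n

# Toy sample of eight December WSI values: four exceed 7.2
sample = [3.1, 9.4, 12.0, 6.8, 7.5, 2.2, 10.1, 5.0]
print(prob_exceed(sample))  # -> 0.5
```

Comparing such exceedance fractions between historical and future simulations yields probability changes like the 62.5% to 19.8% December decline cited for the North Atlantic LCC.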
4. Discussion and conclusions
High-resolution projections of daily snowfall, snow depth, and winter severity for the mid- and late twenty-first centuries are generated for the central-eastern North American LCCs by forcing SNOW-17 with statistically downscaled climate data from nine CMIP3 GCMs and two emission scenarios. These cold-season projections will aid natural resource agencies in assessing the impacts of changes in snow and winter severity on wildlife. Compared to raw GCM snow output, this approach leads to a better representation of weather extremes, lake-effect processes, and topographic influences, along with smaller biases in snowfall and snow depth.
For both time periods and both emission scenarios, the entire region is expected to experience reduced annual snowfall and a shift toward less snow and more rain, based on model-mean projections. The largest snowfall losses are simulated for the North Atlantic LCC and the Upper Midwest and Great Lakes LCC, with respective declines by the late twenty-first century of −48.1% and −40.2% under the A2 scenario. For much of the region, snowfall is projected to decline particularly during December–January, implying a delayed onset of the snow season, except over the Northern Plains, where the greatest losses are expected in March–April, when snowstorms are more common (Changnon et al. 2006). These results are consistent with studies by Kapnick and Delworth (2013) and Krasting et al. (2013), which project a general decline in Northern Hemispheric snowfall, including across North America, outside of the high latitudes. Considering the range of outcomes from nine GCMs for the B1 scenario, there is potential for greater annual snowfall across southern Canada, north of the historical mean position of the −10°C DJFM isotherm, by both the mid and late twenty-first century, in response to increased precipitation. Similarly, Krasting et al. (2013) concluded that snowfall would increase to the north of this isotherm and decrease to the south. SNOW-17 simulates a trend toward fewer snowfall days, but with modest changes in mean intensity. A larger percentage decline is simulated in mean snow depth than in annual snowfall, because of enhanced snowmelt rates with warming. The mean snow line is simulated to shift northward, ranging from 72 km by the mid-twenty-first century according to the B1 scenario to 229 km by the late twenty-first century according to the A2 scenario. The projected decline in the number of days with an existing snowpack is expected to peak near 44°N, close to the mean −5°C DJFM isotherm.
Dramatic decreases are projected in mean CWSI and the probability of CWSI > 7.2, with the arrival of this critical CWSI isopleth from Canada into the United States becoming delayed by a month by the late twenty-first century, indicative of a delayed southward migration of mallards; such a shift would affect the $3.4 billion migratory bird hunting industry (U.S. Fish and Wildlife Service 2011; Southwick Associates 2012).
Few studies have addressed the impact of future climate change on the frequency of heavy snow events (Gutowski et al. 2008; López-Moreno et al. 2011), with much uncertainty in the future response in mean snow event intensity. Changnon (2007) found that 88% of the economic losses attributed to United States’ snowstorms during 1949–2003 occurred in the nation’s eastern half. Snowstorms in the United States have become less frequent, but larger and more intense (Changnon 2007). As temperatures continue to rise, the atmospheric moisture-holding capacity will increase, favoring more intense precipitation events, potentially even snowstorms (CMAP 2013). The current study provides support for intensified snowstorms during this century. Heavy daily snow events are projected to increase in frequency, particularly across the Plains and Prairie Potholes LCC and the Upper Midwest and Great Lakes LCC and most notably for low-warming scenarios.
Several limitations of the study are identified. SNOW-17 requires only inputs of ground-level temperature and precipitation, although ground-level temperature is not a perfect indicator of precipitation type (Anderson 2006). SNOW-17 requires an extensive set of spatially explicit parameters, which can be challenging to develop, as most prior applications focused on single watersheds. The applied parameter values result in insufficient snowmelt and excessive snow depth in spring. The LCC statistical downscaling is based on output from a small set of nine CMIP3 GCMs, rather than the latest CMIP5 archive. Only one realization of each CMIP3 model is considered; thus, the full range of possible climate outcomes may be insufficiently represented. This is of particular concern for projections out to the mid-twenty-first century, given the importance of internal natural variability on the decadal-to-multidecadal time scale (Deser et al. 2012a,b, 2014); future studies should consider large ensembles to represent this uncertainty. The snowfall projections need to be considered with caution for the Great Lakes basin, given that the statistical downscaling does not account for future changes in lake ice cover, which regulates lake-effect snow. Observations have shown a positive trend in Great Lakes lake-effect snowfall (Norton and Bolsenga 1993; Leathers and Ellis 1996; Burnett et al. 2003), in response to greater lake evaporation, although an observed trend reversal toward less lake-effect snowfall may have recently begun (Bard and Kristovich 2012). To address future changes in lake-effect snow, a combination of dynamical downscaling using regional climate models coupled to interactive lakes (Notaro et al. 2013) and improved statistical downscaling techniques is encouraged.
The research was funded by the Upper Midwest and Great Lakes LCC, Gulf Coastal Plains and Ozarks LCC, Wisconsin Focus on Energy, and the U.S. Environmental Protection Agency’s Great Lakes Restoration Initiative through a contract with the Michigan Department of Natural Resources. CMIP3 data were obtained through the Program for Climate Model Diagnosis and Intercomparison (PCMDI). The authors appreciate the helpful comments from three anonymous reviewers.
Nelson Institute Center for Climatic Research Contribution 1183.
Current affiliation: Department of Biological Sciences, State University of New York at Oswego, Oswego, New York.