The air-freezing index (AFI) is a common metric for determining the freezing severity of the winter season and estimating frost depth for midlatitude regions, which is useful for determining the required depth of shallow foundations. AFI values represent the seasonal magnitude and duration of below-freezing air temperature. Departures of the daily mean temperature above or below 0°C (32°F) are accumulated over each August–July cold season; the seasonal AFI value is defined as the difference between the highest and lowest extrema points on this cumulative curve. Return periods are computed using generalized extreme value distribution analysis. This research replaces the methodology used by the National Oceanic and Atmospheric Administration to calculate AFI return periods for the 1951–80 period and applies the new methodology to the 1981–2010 climate normals period. Seasonal AFI values and return period values were calculated for 5600 stations across the coterminous United States (CONUS), and the results were validated using U.S. Climate Reference Network temperature data. Return period values are typically 14%–18% lower across CONUS during 1981–2010 versus a recomputation of 1951–80 return periods with the new methodology. For the 100-yr (2-yr) return periods, about 59% (83%) of stations show a decrease of more than 10% in the more recent period, whereas 21% (2%) show an increase of more than 10%, indicating a net reduction in winter severity that is consistent with observed climate change.
Recent climate studies have documented an increase in global and regional surface temperatures, with the greatest warming occurring over the last three decades (Solomon et al. 2007). In the United States, this warming has corresponded with an increase in growing season length, defined as the number of days between the last spring frost and the first fall frost (Kunkel et al. 2004). Easterling et al. (2000) showed that the number of subfreezing days decreased by 4 days yr−1 in the United States between 1910 and 1998, while other studies have found that the greatest decrease occurred in the western United States (Easterling 2002; Kunkel et al. 2004). Air temperature has a well-known correlation with soil temperature and soil frost depth (Brown 1964); under ideal conditions, a decrease in air temperature should be associated with a corresponding increase in soil frost depth.
As long-term networks that monitor deep soil temperatures are sparse and recent (Schaefer et al. 2007; Bell et al. 2013), proxy measurements of soil temperature using air temperature are necessary until adequate soil temperature data can be accumulated from existing networks. Early field studies showed that the severity of air freezing has a direct correlation with soil frost depth in areas with bare ground cover (Brown 1964). Steurer (1989) used the 100-yr return of the air-freezing index (AFI), a measure of the magnitude and duration of air temperature below freezing, as a determinant of the maximum soil frost depth. Later work found that this method works best for midlatitude regions that do not experience severe, prolonged winters (Steurer and Crandell 1995).
Research has shown that up to one-third of the U.S. gross domestic product is reliant on accurate weather and climate information (Dutton 2002). Maximum soil frost depth, which can be estimated with AFI, and soil temperature are important factors in construction costs and building foundations. The severity of soil frost is responsible for frost heave, a naturally occurring process that causes soils to produce an outwardly exerting force on a belowground structure (Jones et al. 1982). An accurate estimate of maximum soil frost depth allows for reduced construction costs and proper preparations for future climate conditions. Trenberth et al. (2002) report that the American Home Builders Association saved the American public an estimated $300 million yr−1 by generating new building and foundation standards that were based on the AFI research completed by Steurer and Crandell (1995).
Steurer and Crandell (1995) computed AFI values using a 30-yr serially complete dataset spanning 1951–80. However, CONUS-wide average temperatures from the Climate-at-a-Glance tool (Lawrimore et al. 2007; Vose et al. 2014) show that 19 (2) of the 30 warmest years from 1895 to 2010 occurred in the 1981–2010 (1951–80) period. Even when statistical uncertainty is factored in (Guttorp and Kim 2013), it is clear that 1981–2010 was warmer than 1951–80, requiring an upgraded analysis of seasonal AFI return periods.
2. Data and methodology
NOAA’s National Climatic Data Center (NCDC) is responsible for archiving U.S. and global climate records and for providing climate datasets and products. This study uses the same serially complete dataset of daily maximum and minimum temperatures that was used to calculate NOAA’s 1981–2010 frost-freeze and growing degree-day normals (Arguez 2012). This serially complete dataset was produced using observations from the Global Historical Climatology Network–Daily database (Menne et al. 2012) in a manner consistent with the computation of NOAA’s 1981–2010 temperature normals (Arguez et al. 2012; Arguez and Applequist 2013). A total of 5600 stations (see Fig. 1) across the United States are used, with each daily time series covering the 1951–2010 period. All of these stations are part of the National Weather Service’s Cooperative Observer Program (COOP).
AFI values were calculated from the daily maximum and minimum temperatures for each station in the study. Departures of the daily mean temperature above or below 0°C (the index is derived using Fahrenheit air temperatures) were accumulated and can be plotted on a seasonal time curve (Steurer and Crandell 1995). These daily departures are commonly referred to as freezing degree-days (FDDs). The cumulative seasonal FDD totals were calculated at each station by
FDDcum = Σ_{i=1}^{N} (Tave,i − 32),

where FDDcum is the cumulative total of degree-days during the season, Tave,i is the average of the daily maximum and daily minimum temperature for day i (°F), and N is the number of days in a season (1 August–31 July).
The difference between the highest and lowest extrema points on this seasonal curve is defined as the seasonal AFI value (see Fig. 2). For example, the most extreme AFI value for the Asheville Regional Airport over the 1951–2010 period occurred during the 1976/77 season, with a value of 292 FDDs. This value is the difference between the highest (3034 FDDs) and lowest (2742 FDDs) points on that station’s seasonal curve. The 1 August–31 July definition of the cold season, which is supported by inspecting the annual progression of Tave observations for all CONUS stations, follows NCDC precedent for calculating frost-freeze normals.
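The accumulation and extrema-difference steps described above can be sketched as follows. This is a minimal illustration rather than the operational NCDC code; the function name is ours, and the AFI is implemented here as the largest drop from an earlier peak of the cumulative curve, which yields zero for warm-climate stations whose curves increase monotonically:

```python
import numpy as np

def seasonal_afi(tmax_f, tmin_f):
    """Seasonal AFI (Fahrenheit FDDs) for one 1 Aug-31 Jul season."""
    tave = (np.asarray(tmax_f, float) + np.asarray(tmin_f, float)) / 2.0
    # Cumulative departures of the daily mean temperature from 32 deg F.
    curve = np.cumsum(tave - 32.0)
    # AFI: difference between the highest point of the curve and the
    # lowest point that follows it (the maximum drawdown). A season
    # with no below-freezing days gives 0.
    return float((np.maximum.accumulate(curve) - curve).max())
```

A station whose daily mean temperature never drops below 32°F returns 0 FDDs, consistent with the monotonically increasing cumulative curves described for warm-climate stations.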
Using the generalized extreme value (GEV) probability distribution (Coles 2001), the preferred model for seasonal extremes, return periods were calculated for each station from its seasonal AFI values, separately for 1981–2010 and 1951–80. Return period estimates are only computed if at least 15 of the 29 seasonal AFI values are nonzero; this precludes the computation of return periods for ~10% of stations (indicated by red circles in Fig. 1). The results from the GEV distribution were used to generate maximum AFI estimates for the 1.1-, 1.25-, 2-, 2.5-, 3.3-, 5-, 10-, 20-, 25-, 50-, and 100-yr return periods. A simple χ2 goodness-of-fit test was used to identify inferior fits of the GEV model; in these limited cases, return periods were estimated using an empirical, nonparametric approximation. The return periods were interpolated using inverse distance weighting, with each grid cell computed from the return period values of the 12 closest stations. The interpolated grid is displayed at 30-km spatial resolution. The maximum soil frost depth penetration was estimated for the 1981–2010 period using the 100-yr return AFI values and Brown’s (1964) relationship between air temperature and maximum soil frost depth penetration:
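A GEV-based return-level estimate of the kind described above can be sketched as follows, assuming SciPy's `genextreme` distribution. This illustration omits the χ2 goodness-of-fit screening and the nonparametric fallback used in the study; the T-yr return level is the quantile exceeded with probability 1/T in any given season:

```python
import numpy as np
from scipy.stats import genextreme

def gev_return_levels(seasonal_afi_values, periods=(2, 5, 10, 25, 50, 100)):
    """Fit a GEV to seasonal AFI values and estimate return levels.

    Returns a dict mapping each return period T (years) to the AFI
    value expected to be exceeded once every T seasons on average.
    """
    shape, loc, scale = genextreme.fit(np.asarray(seasonal_afi_values, float))
    # T-yr return level = quantile with non-exceedance prob 1 - 1/T.
    return {t: float(genextreme.ppf(1.0 - 1.0 / t, shape, loc, scale))
            for t in periods}
```

Longer return periods map to higher quantiles, so the estimated levels increase monotonically from the 2-yr to the 100-yr value.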
dfrost = 0.0174(AFI100)^0.67,

where dfrost is the depth of frost for a bare ground surface (m) and AFI100 is the 100-yr return AFI (°C degree-days).
Maximum soil frost depth estimates were calculated for the 1981–2010 period to provide an example of how AFI can be used to estimate the maximum depth of frost penetration for bare ground surfaces (i.e., no vegetation or snow cover).
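As a rough sketch of this estimation step, Brown's (1964) power-law relationship can be applied as below. The coefficients (0.0174, 0.67) and the Fahrenheit-to-Celsius degree-day conversion are inferred from the worked station values quoted later in the text, not taken verbatim from the original source:

```python
def frost_depth_m(afi_fdd_f):
    """Estimate max frost depth (m) for bare ground from an AFI value.

    afi_fdd_f: AFI in Fahrenheit freezing degree-days, converted to
    Celsius degree-days (x 5/9) before applying Brown's power law.
    The coefficients below are inferred from station examples in the
    text, not taken from an authoritative statement of the formula.
    """
    afi_c = afi_fdd_f * 5.0 / 9.0
    return 0.0174 * afi_c ** 0.67
```

For example, an AFI of 4521 Fahrenheit FDDs yields roughly 3.3 m, matching the Big Sandy, Montana, value cited later in the text.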
Time series of CONUS and regional AFI over the full 1951–2010 time span were calculated via simple arithmetic averaging across the 5600 COOP stations. Regions are defined by the nine U.S. climate regions developed by Karl and Koss (1984). Linear regression analysis was performed on the yearly average CONUS AFI values, as well as for individual climate regions.
To confirm our results and provide reassurance about the data used in this study, we compare our Tave dataset of 5600 COOP stations with that of the U.S. Climate Reference Network (USCRN). The USCRN stands as the premier surface observing network in the country and is specifically designed to observe climate (Diamond et al. 2013). Although USCRN stations have a relatively short period of record (the first stations were installed in 2000), Menne et al. (2010) reported that the USCRN stations have been successful in detecting the national climate signal.
The USCRN project offers 114 high-quality stations located across the CONUS. Each station is equipped with three temperature sensors and three precipitation sensors reporting at 5-min resolution, as well as sensors for hourly solar radiation, 1.5-m wind speed, relative humidity, ground surface temperature, and, where feasible, soil moisture and soil temperature. USCRN air temperature observations feature triple redundancy, strong data continuity, and rigorous quality control practices. We compare the seasonal AFI values calculated from the 5600 COOP stations used in this study with those derived from USCRN air temperature observations.
Seasonal AFI values were calculated for all 114 USCRN stations for the 2005–10 seasons. These data were compared with corresponding AFI values from the serially complete dataset by matching each of the 114 USCRN stations with the nearest COOP station used in this study, yielding 114 station pairs across the CONUS. The average separation between paired stations was 26 km; the closest pair was only 40 m apart. Seasonal AFI values from 2005 to 2010 are aggregated by region to account for the short temporal overlap period of five cold seasons, and the coefficient of determination was calculated for these regional samples. Accounting for the effective reduction in degrees of freedom due to serial autocorrelation does not materially affect the correlation results or their interpretation.
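The station-matching step can be sketched with a simple great-circle nearest-neighbor search. This is an illustrative implementation using the haversine formula; the text does not specify the distance calculation actually used:

```python
import numpy as np

def nearest_station(lat, lon, station_lats, station_lons):
    """Return (index, distance_km) of the closest station to (lat, lon)
    by great-circle (haversine) distance."""
    r = 6371.0  # mean Earth radius, km
    lat1, lon1 = np.radians(lat), np.radians(lon)
    lat2, lon2 = np.radians(station_lats), np.radians(station_lons)
    dlat, dlon = lat2 - lat1, lon2 - lon1
    a = (np.sin(dlat / 2) ** 2
         + np.cos(lat1) * np.cos(lat2) * np.sin(dlon / 2) ** 2)
    d = 2.0 * r * np.arcsin(np.sqrt(a))
    return int(np.argmin(d)), float(d.min())
```

Running this for each USCRN site against the full COOP station list produces the kind of paired-station catalog used in the comparison.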
Seasonal AFI values across CONUS have historically ranged between 0 and 5000 FDDs. Zero values are typical for much of Florida, the Gulf Coast, and parts of Arizona, California, and coastal Oregon. The stations in these areas (denoted as red circles in Fig. 1) rarely, if ever, experience days on which the mean temperature does not reach or exceed 32°F. Thus, their cumulative FDD curves tend to increase monotonically. Not surprisingly, the highest seasonal AFI values computed for 1981–2010 are located in the northern part of CONUS, stretching from the Rocky Mountains to New England. In this swath of the country, seasonal AFI values routinely exceed 1000 FDDs [Fig. 3 (top)]. The 100-yr return periods [Fig. 3 (bottom)] exceed 2000 FDDs in parts of the Intermountain West and the northern extents of the Great Plains, the Midwest, and New England, with the largest values (in excess of 3500 FDDs) in northern Minnesota and North Dakota. The greatest 100-yr AFI return value for the 1981–2010 period was for the Hallock station in northwestern Minnesota. The 100-yr return period values in the southeastern coastal plain, southern Texas, the Southwest, and along the Pacific Coast are 250 FDDs or less, highlighting how unusual it is for these areas to observe mean temperatures below 32°F.
Comparing the 1981–2010 return periods with 1951–80 values recalculated using the same new methodology, a decrease in winter severity across much of CONUS becomes apparent. Return period values are typically 14%–18% lower during the 1981–2010 period (Table 1), with a median difference (across CONUS stations for which return periods were calculated) of −66 FDDs for the 2-yr return periods and −175 FDDs for the 100-yr return periods. Over 58% of stations have a 100-yr return period value that was at least 10% higher in 1951–80 than in 1981–2010, whereas about 20% of stations show the opposite. However, care must be taken when comparing 100-yr return periods from consecutive 30-yr periods, as the difference is largely a function of the coldest year in each period. Differences in shorter return periods, such as the 2-yr values, are more reflective of observed climate change. Over 82% of stations have a 2-yr return period value that decreased by more than 10% from 1951–80 to 1981–2010, while the converse occurred for less than 2% of stations.
As stated earlier, return periods are only computed if at least 15 of the 29 seasonal AFI values are nonzero. This precluded the computation of return periods for 615 stations for 1981–2010 and 512 stations for 1951–80. In 104 cases, we were able to compute return periods for 1951–80 but not for 1981–2010; the opposite was true for only one station.
A majority of the CONUS experienced a decrease in AFI across all return periods, punctuated by the changes in the 2-yr return periods as depicted in Fig. 4 (top), which shows a decrease across the vast majority of CONUS save for small increases in northern Nevada, southwestern Oregon, and elsewhere. The most consequential decreases coincide with the areas of CONUS where winters tend to be the most severe [Fig. 3 (top)], namely the continental regions near the Canadian border. The differences in the 100-yr return periods [Fig. 4 (bottom)] include increases in much of the South, Southeast, and Northwest regions, whereas the rest of the country experienced decreases, including reductions of 500–2500 FDDs in much of the Great Plains and the Midwest. The Big Sandy station in north-central Montana experienced the greatest decrease in AFI (maximum soil frost depth, from Brown’s formula) between the two periods, from 4521 FDDs (330 cm) to 2003 FDDs (191 cm), yielding a decrease in AFI (maximum soil frost depth) of 2518 FDDs (139 cm). The Hill City, Idaho, station experienced the greatest increase in AFI (frost depth) between the two periods, from 2299 FDDs (210 cm) to 4242 FDDs (316 cm), yielding an increase in AFI (frost depth) of 1943 FDDs (106 cm).
The 1951–2010 CONUS-averaged seasonal AFI values are shown in Fig. 5. The average value for 1981–2010 (492 FDDs) is significantly different from the 1951–2010 average (597 FDDs) at the 95% confidence level (p = 0.00198). Linear regression analysis showed a significant decreasing trend (p = 0.010; slope = −2.4737 FDDs yr−1). An analysis of the individual climate regions found a decreasing trend in all nine climate regions (see Table 2), with trends significantly different from zero at 95% confidence for the Northeast, northern Rockies and plains, Southwest, and upper Midwest.
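A trend test of this kind can be sketched with SciPy's `linregress`, which returns the OLS slope and a two-sided p-value for the null hypothesis of zero slope. This is illustrative only; the exact regression software used in the study is not stated:

```python
import numpy as np
from scipy.stats import linregress

def afi_trend(years, afi_means):
    """OLS trend in CONUS-mean seasonal AFI (FDDs per year) and the
    two-sided p-value testing slope = 0."""
    fit = linregress(np.asarray(years, float), np.asarray(afi_means, float))
    return fit.slope, fit.pvalue
```

Applied to a 60-season series of CONUS-mean AFI values, a slope near −2.5 FDDs yr−1 with p well below 0.05 would match the significant decreasing trend reported above.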
The AFI values calculated for the USCRN and COOP stations match reasonably well (see Table 3). All regions within the CONUS exhibited a significant positive correlation between AFI values at COOP and USCRN stations; all coefficients of determination r2 exceeded 0.84 except in the Southwest region, which registered ~0.65.
Our results indicate that soil frost depth has significantly decreased across the CONUS since the 1951–80 AFI values were reported by Steurer (1989). Similar to the previous estimates of AFI from the 1951–80 period, the recalculated AFI values will hopefully prove beneficial in reducing U.S. construction costs (Dutton 2002) and assist in predicting possible environmental implications (Kreyling and Henry 2011). The AFI results in this study are consistent with results found in other examinations of Northern Hemisphere winter temperature trends over the past century (Easterling 2002; Kunkel et al. 2004). Other research into Northern Hemisphere climate impacts has found a reduction in mean snow cover (Dye and Tucker 2003) and a decrease in estimated frozen ground (Lemke et al. 2007). Thus, our results add to the indication that winter climate has changed over the last half-century.
Although more complex approaches that draw on multiple meteorological measurements can be used to determine maximum soil frost depth (DeGaetano et al. 2001), relatively good accuracy can be obtained with indices based on air temperature alone (Gel’fan 1989; Steurer and Crandell 1995), which facilitates computation for a larger number of stations. Early field experiments first characterized the relationship between the severity of air freezing and soil freezing (Brown 1964). More in-depth studies have determined that changes in air temperature and snow cover are the two factors most responsible for determining soil temperatures (Zhang et al. 2003), but recent studies have concluded that air temperature exceeds snowpack in determining soil frost depth over larger spatial areas (Zhang et al. 2005). Thus, not incorporating snow depth will likely have some impact on our estimates of the maximum penetration of frost. Using Brown’s formula [Eq. (2)], Fig. 6 displays an example of how 100-yr return AFI values may be used to estimate the maximum depth of frost penetration for bare ground.
Results from the 2-yr return value comparison between 1951–80 and 1981–2010 show that nearly all stations in all regions experienced a decrease in seasonal AFI. These results correspond with other studies that have documented an overall decrease in the number of cold days and nights across the United States (Seneviratne et al. 2012). As AFI is a measure of winter severity, the cooling trend in annual air temperature for the Southeast could possibly be explained by more severe winters, as indicated in our results, although this region generally experiences minimal frost penetration, so even slight changes are highly visible. It should also be noted that decreases in frost depth far exceed increases. Some caution should be used with spatially interpolated results. The best-known limitation of this method is that interpolation assumes the spatial field is homogeneous, when in all likelihood this is not the case. Further research should address this issue and apply methods that account for heterogeneous factors (e.g., a location’s topography, proximity to water, coastal effects) (Daly et al. 1997; Vose et al. 2014).
Besides the previously mentioned use of AFI for accurately determining construction costs, soil frost depth has a number of ecological implications. Plant root growth and photosynthetic response can be altered by frost depth (Noshiro and Sakai 1979; Rigby and Porporato 2008). Soil microbes also have a strong relationship with soil temperature and frost depth. For example, greater microbial activity can occur with warmer winters and result in changes in biogeochemical cycling (Clein and Schimel 1995). Soil microbes are also sensitive to freezing intensity and duration (Elliott and Henry 2009). However, there is still much research to be conducted on understanding changes in soil frost depth and severity of winter on plants and ecosystems, including the establishment and persistence of invasive species (Kreyling and Henry 2011).
Our results suggest that the AFI in the United States has changed significantly since 1951. Return period values are typically 14%–18% lower across CONUS during 1981–2010 versus a recomputation of 1951–80 return periods with the new methodology. For 2-yr return periods, over 80% of stations show a decrease of more than 10% in the more recent period. These results provide a recent, accurate estimate of AFI and map products that will benefit homebuilders and the construction industry with estimating costs. More accurate estimation of changes in air freezing and maximum soil frost depth will also benefit agricultural producers and ecologists in better understanding the responses of ecosystems to climate change.
The USCRN provides highly accurate air temperature measurements, with a triplicate sensor configuration for better precision (Diamond et al. 2013). Each station is designed to fulfill the data requirements necessary for climate science. Air temperature measurements cannot be directly compared between the two networks, as aspirated fans are used at the USCRN stations. Regressions were therefore applied to compare the AFI values calculated from the serially complete dataset used in this study with the USCRN AFI values to determine the year-to-year variation over the 5-yr period. The results herein provide verification that our results are consistent with USCRN’s high-precision measurements in stable and open environments. Menne et al. (2010) found similar results in comparing bias-corrected U.S. Historical Climatology Network air temperatures to USCRN air temperatures. Although the shared period between the two networks is relatively short, the agreement with USCRN data supports the seasonal AFI values produced in this study.
We greatly appreciate Peter Steurer for his assistance and expertise in the subject. We also thank Scott Stephens for providing data. Special thanks to M. Kruk, J. Crouch, and the three anonymous reviewers for providing feedback on this paper. This work was supported by NOAA through the Cooperative Institute for Climate and Satellites–North Carolina under Cooperative Agreement NA09NES4400006.