1. Introduction
Two separate extreme surface air temperature observations in 2006 and 2007 were reported as new all-time records for their respective states. Subsequent evaluation of the observations revealed both to be erroneous, however. The incorrect reporting of these observations as record-setting extremes would have been avoided by the use of a formal, objective method to evaluate such observations. A review of the events also revealed that some of the tables of statewide climate extremes maintained by the National Oceanic and Atmospheric Administration (NOAA) National Climatic Data Center (NCDC) had not been updated in several years and thus did not present the most current information concerning the climate extremes of each state.
These issues were addressed by the National Data Stewardship Team (NDST), an ad hoc group including governmental and academic representatives that was formed in 2004 to address national-level issues regarding the collection and disposition of climate data by NOAA and its climate-service partners. The NDST recommended that NCDC’s tables of statewide climate extremes undergo a thorough review and be updated as necessary. The group also recommended establishing a Statewide Climate Extremes Committee (SCEC) to formally evaluate observations (past or present) that may tie or set a state’s all-time record for a particular climate element and to ensure that any future changes to NCDC’s climate-extremes tables are documented (National Weather Service 2011). Shein et al. (2012) more fully describe the SCEC and announce the release of the revised climate-extremes tables, and the present paper details the method used to revise the tables.
2. Method
NCDC’s existing tables of statewide extremes for all-time maximum temperature, all-time minimum temperature, all-time greatest 24-h precipitation (liquid and melted snow), all-time greatest 24-h snowfall, and all-time greatest snow depth were chosen for review. Other extremes tables are available, but these five extremes tables represent extreme values of the five most commonly observed meteorological variables in the United States, and those values are single observations rather than products of multiple observations (e.g., monthly or annual total) that would require the evaluation of each contributory observation.
The five extremes tables were previously updated between 1998 (24-h precipitation) and 2006 (24-h snowfall and snow depth) and therefore did not include contributions from more recent meteorological observations or from any historical observations that may have been recovered by data-rescue efforts such as NOAA’s Climate Database Modernization Program (Dupigny-Giroux et al. 2007; Ross and Truesdell 2010). In addition, little information could be located that documented the methods used to verify the values included in the existing tables. Thus, the process for revising these tables was twofold. First, the statewide record values in each of the five tables had to be reviewed for validity. Second, the observational records at NCDC were examined to identify any observations that might exceed the valid values listed in the tables of statewide extremes.
In the context of the NCDC tables of statewide climate extremes, the criteria for any observation to be considered an official statewide record are that it must be defensible and it must have a documentable provenance. Thus the initial process of evaluation and review was limited to observations that were included in the existing climate-extremes tables, had been reviewed by the SCEC, or had been quality controlled and archived by NCDC. If the investigation of an observation revealed additional observations that merited examination, these too were evaluated regardless of their source. National Weather Service (NWS) personnel and State Climate offices reviewed the revised tables and in some cases suggested additional observations for investigation.
Although a number of values qualified for record status solely on the basis of their magnitude, an observation had to meet certain basic criteria to be considered official in the context of the NCDC climate-extremes tables. The observation had to be taken by a trained observer or at an automated meteorological station. It had to be obtained from an instrument constructed for the purpose of measuring the meteorological variable being reported (i.e., purpose built). The instrument had to be located and operated to observational standards defined by the federal government (e.g., OFCM 2005) or recommended by the peer-reviewed literature at the time of observation. This includes avoiding locations intended to deliberately sample an extreme microclimate more related to geomorphology (e.g., a cirque, frost hollow, or volcanic crater) than to larger-scale atmospheric conditions. Last, the observation had to be associated with sufficient metadata to reasonably establish its quality (e.g., situational information and/or a record of good observations), and the data and metadata had to be openly and indefinitely accessible for review (e.g., held in an unrestricted archive).
Some existing statewide record climate extremes are plausible and are widely accepted by the climatological community as legitimate, even though they may not have been observed according to the aforementioned criteria, may be estimates, or may lack sufficient metadata to establish accuracy and bias. For example, the record 75.8-in. (1 in. ≈ 2.54 cm) 24-h snowfall at Silver Lake, Colorado, over 14–15 April 1921 is a prorated estimate derived from the 87-in. storm total observed over 27.5 h that is described in Paulhus (1953). That value is acknowledged by the National Climate Extremes Committee (NCEC; http://www.ncdc.noaa.gov/extremes/ncec/) as the official record 24-h snowfall for the United States, although some concerns about the estimate have been raised (National Weather Service 1997). In such cases, the value was retained in the tables as an unofficial record extreme, supplementing the listed official extreme observation.
3. Reviewing the existing tables of statewide extremes
Each state’s all-time maximum temperature, all-time minimum temperature, and greatest 24-h precipitation were obtained from the NCDC tables of statewide climate extremes published on NCDC’s website. Snowfall and snow depth were extracted from the U.S. Snow Climatology database also housed at NCDC (NCDC 2006). Original observation forms, metadata, and official climatological summary reports that related to the existing record values were retrieved from NCDC’s archives. Historical documentation was available for most of the existing records. The existing record values were also referenced against daily weather maps and values from nearby stations. Nearly all existing records were determined to be valid, although several precipitation, snowfall, and snow-depth record values were considered to be unofficial because they had been estimated. In addition, between 2006 and 2009, the SCEC verified six observations as new statewide record climate extremes (see online at http://www.ncdc.noaa.gov/extremes/scec/scec-reports.html). These new records replaced their respective state’s entries in the records tables. Overall, the review resulted in the invalidation of seven of the values from the existing tables, with another six identified as unofficial.
One example of an existing record that lacked evidentiary support was a 118°F temperature observation from Bennett, Colorado, on 11 July 1888. The Colorado Climatological Data report for July 1888 (Fig. 1) showed a maximum monthly temperature of 105°F at Glenwood Springs. Observations in the area on 11 July ranged from the mid-90s to the low 100s in degrees Fahrenheit. Although records identify a U.S. Signal Service voluntary observer operating at Bennett in 1888, Finneran (1965) notes that Bennett’s observer forms only exist for February through April of 1888 and do not resume until February of 1889. The earliest observation form from Bennett in NCDC’s archives is from January of 1893. In July of 1900, the 105°F maximum temperature for July of 1888 was changed to 115°F. In July of 1948, it was further altered to 118°F. Both changes indicate possible transcription errors, although it remains unclear how Bennett became associated with the value. Discussion with the Colorado State Climatologist revealed that the value has long been distrusted by his office (N. Doesken 2010, personal communication), and thus the value was invalidated.
Summary of climate elements for stations throughout Colorado in July 1888 from the “Colorado Weather” Bulletin of the Colorado Meteorological Association (precursor to the monthly Climatological Data publication). The climate summary notes the maximum temperature for the month as 105°F at Glenwood Springs, and not the 118°F purportedly observed at Bennett.
Citation: Journal of Applied Meteorology and Climatology 51, 11; 10.1175/JAMC-D-11-0226.1
Other existing statewide climate extremes either were from sources that were not part of a recognized observation network or were estimated values rather than actual observations. Observations from nonrecognized sources commonly lacked sufficient supporting metadata to adequately establish provenance and accuracy. Because record-setting meteorological conditions often are associated with a notable extreme-weather event, a majority of these unofficial observations could be evaluated by reference to postevent reports produced by the Weather Bureau or NWS (e.g., U.S. Weather Bureau 1944). In a few instances an observation from a nonrecognized source was deemed a valid official record value because it came from a calibrated instrument (e.g., a graduated rain gauge) and reasonable accuracy could be established. In most cases of records from nonrecognized sources, however, the value was either a postevent estimate (e.g., volumetric depth of water in an exposed container such as a drink bottle) or was from an unknown source, and information regarding the instrument (i.e., location, disposition, accuracy, etc.) could not be determined from archived documentation. These values were compared with surrounding observations, the daily weather map, and other sources of information to determine whether the value was reasonably reliable. If the value was plausible and the state’s State Climatologist felt it was acceptable, the value was carried forward as an unofficial record value.
Two such values were the 38.70- and 34.50-in. 24-h precipitation records for Florida and Pennsylvania, respectively. The 38.70 in. of rain that fell at Yankeetown, Florida, during the passage of Hurricane Easy on 5 September 1950 is widely considered to be legitimate but was mathematically estimated from filled 12-oz (1 oz ≈ 29.6 mL) soda bottles that had been found on the sidewalk after the storm (Harns 1952). In a similar way, a report of 34.50 in. of rain having fallen near Smethport, Pennsylvania, on 17 July 1942 is commonly acknowledged as a legitimate record (Jennings 1950) that was duly noted in a postevent storm summary compiled by the U.S. Weather Bureau (1944). The value is a reasonable volumetric depth estimate derived from 37 quarts (1 quart ≈ 0.95 L) of liquid found in a 10-gal (1 gal ≈ 3.79 L) milk can having a mouth diameter of between 8.625 and 9.25 in. (the range of common mouth diameters of 1940s 10-gal milk cans). Doubts exist about the accuracy of both the Yankeetown and Smethport estimates because neither vessel was subsequently tested for accuracy in the collection of rainfall. In addition, the location and disposition of the vessels before and during the events are unknown, as is whether they contained any liquid prior to the rain event. Thus, while these and similar values might be acknowledged as legitimate, they fail to meet the criteria for recognition as official record values in the context of the NOAA tables of statewide climate extremes.
4. Building a comparative dataset
NCDC’s digital data archive was searched for any observations that tied or exceeded a state’s record value for that meteorological element. The top (bottom for minimum temperature) 100 observations from each of NCDC’s operational, quality-controlled daily datasets (DSI-3200, 3206, and 3210) were extracted for review. Hourly precipitation data were also scanned using a 24-h moving window to capture total 24-h amounts that may have spanned two calendar days. The top 100 of these totals also were extracted for each state. The choice of 100 values is arbitrary but was selected to permit a comprehensive yet manageable visual overview of the outer limit of each element’s data distribution. The lists also provided enough values to identify temporal agreement (i.e., many extreme temperatures on the same dates at different stations), systematic errors at particular stations (i.e., a single station dominating the list), and any discontinuities that might divide plausible extremes from obvious keying errors (e.g., an increase from temperatures in the low 100s to temperatures in the 120s with nothing between). In addition, many historical observations remain undigitized. Thus, archived paper and microfiche documents pertaining to several notable historical extreme-weather events (e.g., the great New England blizzard of March of 1888) were manually examined for potential additions to the lists. Although this manual search of the archive was extensive, only a few potential values were identified.
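The hourly precipitation scan described above amounts to a running 24-h sum over the series. A minimal sketch in Python (the function name and plain-list input are illustrative, not the actual DSI-3240 processing):

```python
from collections import deque

def top_24h_totals(hourly_amounts, n=100):
    """Return the n largest running 24-h precipitation totals from a
    series of hourly amounts, so that totals spanning two calendar
    days are captured, along with the index of each window's final hour."""
    window = deque()
    total = 0.0
    totals = []                      # (24-h total, index of final hour)
    for i, amount in enumerate(hourly_amounts):
        window.append(amount)
        total += amount
        if len(window) > 24:         # slide the 24-h window forward
            total -= window.popleft()
        if len(window) == 24:        # only score complete windows
            totals.append((round(total, 2), i))
    return sorted(totals, reverse=True)[:n]
```

A total that falls across midnight (e.g., heavy rain from 2200 to 0400) is captured here but would be split between two calendar days in the daily datasets, which is why the hourly scan was needed.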
In January of 2011 (subsequent to this analysis), the Global Historical Climatology Network (GHCN) replaced the DSI-3200/06/10 series as NCDC’s operational daily dataset (Lawrimore et al. 2010). All of the daily data used in this analysis are now contained in the GHCN-Daily dataset. This analysis was not redone using the GHCN-Daily dataset, however—in part because it contains observations from nonfederal sources that may or may not all adhere to federal installation and observation standards and because all potential records that may come to light from here forward are to be reviewed by the SCEC. The GHCN-Daily was, however, reviewed in 2011 to ensure that no new statewide record extremes had been added since November of 2009 when the data were last extracted.
Because many observations appear in more than one dataset, the data lists for each state and element were merged to eliminate any duplicate observations. Once merged, the lists were further shortened by eliminating any observations that did not tie or exceed the respective state’s valid extreme value for a given element. If the existing extreme was considered to be invalid or unofficial, a subjective baseline of 90% of that existing record value was applied to the list of observations. Each list was then examined to eliminate any clearly erroneous values. Although NCDC employs a rigorous automated quality-control (QC) process (e.g., Reek et al. 1992; Durre et al. 2010; Shein 2008; NCDC 2009) capable of correctly validating approximately 98% of all in situ observations it checks, extreme values can be problematic for an automated QC system because the magnitude of the observation means that comparative data often are unavailable (You and Hubbard 2006). In addition, the transcription of data from historical paper observation forms to digital media results in errors when an observation is illegible or is copied incorrectly. Such errors may cause an extreme observation to be transcribed as a lesser value (e.g., 18 in. of snow transcribed as 1.8 in.) or a nonextreme value to be copied as an extreme (e.g., 20°F entered as 120°F). The limitations of automated QC processes with respect to extreme values and the presence of transcription errors in the digital data together prescribed a largely manual approach to this analysis.
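In outline, the merge-and-screen step might look like the following sketch (the tuple layout and function name are hypothetical; the baseline is the existing record itself when valid and official, or 90% of it otherwise):

```python
def merge_and_screen(candidate_lists, record_value, record_is_valid_official):
    """Merge per-dataset top-100 lists of (station_id, date, value)
    observations, drop duplicates via set union, and keep only values
    at or above the screening baseline. Shown for elements where larger
    is more extreme; minimum temperature is analogous with the
    comparison reversed."""
    merged = set()
    for lst in candidate_lists:
        merged.update(lst)           # set union removes duplicates
    baseline = record_value if record_is_valid_official else 0.9 * record_value
    kept = [obs for obs in merged if obs[2] >= baseline]
    return sorted(kept, key=lambda obs: obs[2], reverse=True)
```

Screening against the full record (rather than 90% of it) is only safe when the record itself has been verified; an invalid or unofficial record could otherwise mask a legitimate lesser extreme.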
A few lists were dominated by observations that were at least an order of magnitude greater than the existing record values. Original observation forms revealed that this was due to observations being recorded to an inappropriately high level of precision by the observer (e.g., recording a snowfall observation to the hundredth rather than to the tenth of an inch). Data-entry operators were instructed to transcribe the numbers on the observer forms exactly as they saw them, ignoring any decimal points (the data are stored as five-digit integers, with the file metadata indicating how many of the rightmost digits represent the fractional component of the observation). An example of the problem that might arise is demonstrated in Fig. 2, in which an observer incorrectly reported 2.25 in. of snow depth instead of 2 in. The transcriber entered 00225 instead of 00002, and, because the observation is to be read as whole inches, 00225 is interpreted as 225 in. Fractions were also recorded and transcribed literally, resulting in, for example, a “5½”-in. snowfall observation being transcribed as 00512, or 51.2 in. (e.g., Fig. 3). The issue of literal transcription was restricted to digitization efforts that took place in the early 1990s as part of the Centennial Cooperative Weather Station Program (Doty 1989, p. 3). Where a magnitude error affected more than 10% of a particular data list, the list compilation process was repeated with an arbitrary upper limit of 2 times the existing record value (e.g., only observations of 40 in. or less were considered given a 20-in. existing snow-depth record).
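The implied-decimal convention, and the 2-times screening cap applied to magnitude-polluted lists, can be illustrated as follows (a sketch; the real file metadata carries the decimal position for each element):

```python
def decode_five_digit(raw, fractional_digits):
    """Interpret a five-digit transcribed field, given how many of the
    rightmost digits the file metadata marks as fractional."""
    return int(raw) / 10 ** fractional_digits

def screen_magnitude_errors(values, record_value, cap_factor=2.0):
    """Drop candidates more than cap_factor times the existing record,
    as was done when over 10% of a list showed precision blowups."""
    return [v for v in values if v <= cap_factor * record_value]
```

Read as whole inches (zero fractional digits), a transcribed `00225` decodes to 225 in.; had the element carried two fractional digits, the same field would decode to the intended 2.25 in.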
Cooperative Weather Observation Form 1009 from Murfreesboro, Tennessee, from December 1937. Both snowfall and snow depth were incorrectly reported to the hundredth of an inch but were transcribed as inches to tenths and whole inches, respectively. Thus, the 2.25-in. snowfall and snow-depth observations on 8 Dec appeared as 22.5 in. of snowfall and a 225-in. snow depth in the digital data.
Cooperative Weather Observation Form 1009 from Wood River Junction, Rhode Island, from January 1942, showing snowfall and snow-depth observations listed in inches and fractions. Early transcription efforts entered only the numerical digits, and thus the 3½-in. snow depth was transcribed as 312 in. and the 5½-in. snowfall became 51.2 in.
Observations that appeared to be meteorologically improbable or impossible could be eliminated as well. Examples included a −64°F minimum temperature in Alabama in September (Fig. 4) and a sudden 134-in. jump in snow depth with no new snow (Fig. 5). In a number of cases, an observer noted a malfunctioning thermometer and yet continued to record erroneous temperature observations (e.g., 125°F in December) rather than record the observations as missing. Snow depth was occasionally recorded by adding the daily snowfall to the existing snow depth rather than taking a daily snow-depth observation. Also, when an observation was missed or the station was inaccessible, precipitation was often reported as a multiday total, but the values were transcribed as a single observation for the date on which the observation was recorded.
Weather Bureau Cooperative Weather Observation form 1009 for Union Springs, Alabama, showing what appears to be a negative 64°F minimum temperature observation on 30 Sep 1946. The observer summary, however, notes a monthly minimum of +58°F, indicating the “−” was not intended, although the value was transcribed as a negative number.
NWS Cooperative Observation form B-91 for Brianhead, Utah. On 9 Mar 2009, snow depth increases by 134 in. The observations from the preceding two days are missing, but the depth drops by 118 in. the next day, despite an additional 11.0 in. of new snow.
5. Reviewing NCDC archive data
After removing erroneous values and observations that did not exceed their respective baseline value, no list of extreme observations contained more than 18 values requiring investigation. In several instances, the valid existing record remained the most extreme observation in NCDC’s digital data. The investigation of any remaining observations began from each list’s most extreme observation and progressed toward the existing record until a given observation could not be eliminated as erroneous. If that value exceeded the stated record extreme value for its respective state and element, it replaced that existing record in the revised extremes tables.
Most observations subject to investigation were quickly eliminated when the stated value did not match the value on the original observation form. As before, such discrepancies were attributable either to a transcription error or to the observer recording an observation to an inappropriate resolution. For example, numerous instances were found in which the data-entry operator inadvertently pressed an adjacent key on the computer number pad, transforming a 20°F observation into 120°F. When the correct value of an erroneously digitized observation could be determined, or when the accuracy of an observation did not correspond to the validity assigned to it by NCDC’s automated QC system, the observation was assigned to the “Datzilla” data-error-reporting and error-correction program at NCDC (Shein 2008) for review and correction.
a. Documentary evidence
Values that matched their respective entry on the observation form and appeared to be recorded correctly were cross-referenced against the Climatological Data (CD) publication for the observation month as well as daily weather maps for the date. The CD publication was normally produced shortly after the end of each month and gave Weather Bureau or NWS personnel an opportunity to perform preliminary QC on the observations. Until the 1960s, these reports opened with a narrative of the weather for the month. The narratives usually included an account of the averages and extremes for the month (especially if a new extreme was set), which normally was sufficient to corroborate or refute a value. Very often, the narratives were more descriptive in summarizing the character of the weather for the month and gave special mention to any severe weather or notable observations (e.g., Fig. 6). Such descriptions, coupled with the daily weather maps (e.g., Fig. 7), placed a value in the context of prevailing weather conditions and established reasonably high confidence in the accuracy of an extreme observation.
Narrative of extreme-weather conditions in West Virginia that appeared in the Weather Bureau’s June 1949 Climatological Data of West Virginia publication. This narrative helped to corroborate an observation of 12.02 in. of precipitation reported at Brushy Run, West Virginia, on 18 Jun 1949.
This Monthly Weather Review map of maximum snow depth in February 1895, coupled with the map of February’s low pressure tracks (not shown), helped to corroborate a 24-in. snow depth at Rayne, Louisiana, on 15 Feb 1895.
Observers also often included descriptive information about an event or observation in the remarks section of the observer form. Record-level extreme values are noteworthy, and remarks often describe the weather conditions associated with an extreme-value observation. For example, a 13.60-in. rainfall observation on 13 September 1982 at Milan, Tennessee, was accompanied by the remark “Roads washed out, R.R. also. Homes & businesses flooded. Crops destroyed. 2 deaths.” Another observer noted, “Coldest in memory of oldest inhabitants. Largest snow in memory” (Cooperative Observer Network observer in Lafayette, Georgia, in January of 1940). Such remarks clearly indicate that a significant meteorological event occurred. Conversely, unless an observer routinely failed to provide remarks, extreme-value observations lacking remarks were viewed with suspicion.
Station metadata were reviewed for any information about the reliability of the observing instrument, the situation of the station (i.e., located to minimize bias from nonclimatological factors), or any notes by a reviewing official concerning the skill or accuracy of the observer. In addition, event reports and assessments by the Weather Bureau, NWS, or other reliable sources document notable extreme meteorological events in great detail (e.g., Smith 1956; Price et al. 1975). These reports often include comprehensive information about the observations, including station metadata. When such reports were available to evaluate an observation, they nearly always provided sufficient information to assess the quality and accuracy of the value being investigated.
Last, supporting information was sought from the Internet (e.g., online newspaper archives or image galleries from local historical societies). Photographs, media articles, or other documentation of a significant weather event that impacted a community are often preserved by communities as part of their history. For example, Bradshaw (2009) quotes an eyewitness to a Gulf Coast blizzard: “The St. Valentine’s Day snow of 1895 lasted for three days and two nights [in Lake Charles, LA]. I [F.V. Gallaugher] was 15 and the snow was up to my knees.” This descriptive measure corresponds well to the 22-in. snow depth reported for Lake Charles from that storm. Photographs from the event in question also corroborate the value (e.g., Fig. 8).
Photograph of snow blanketing the ground at the corner of Pujo and Bilbo Streets in Lake Charles, Louisiana, after the Valentine’s Day blizzard of 1895. Such documentation is invaluable in supporting or refuting an extreme meteorological observation. (Photograph credit: M. Reid, provided through the courtesy of the Calcasieu Parish Public Library.)
b. Neighborhood analysis
On occasion, a value that was supported by all available documentation was inconsistent with nearby observations in space and/or time. In such cases, the value was evaluated in the context of the station’s observational data and corresponding observations from surrounding stations, if any. A circle with a 16-km (10 mi.) radius was centered on the station to identify concurrently operating neighboring stations. Although some remote observers had no neighboring stations within the buffer region, most had comparative observations from between one and six contemporaneous neighbors.
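Identifying contemporaneous neighbors within the 16-km radius reduces to a great-circle distance test. A haversine sketch (the station tuples and function names are illustrative):

```python
from math import radians, sin, cos, asin, sqrt

EARTH_RADIUS_KM = 6371.0

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points, in kilometers."""
    phi1, phi2 = radians(lat1), radians(lat2)
    dphi = radians(lat2 - lat1)
    dlam = radians(lon2 - lon1)
    a = sin(dphi / 2) ** 2 + cos(phi1) * cos(phi2) * sin(dlam / 2) ** 2
    return 2 * EARTH_RADIUS_KM * asin(sqrt(a))

def neighbors_within(station, candidates, radius_km=16.0):
    """Stations from `candidates` within `radius_km` of `station`.
    Each station is (id, lat, lon); concurrent operation is assumed to
    have been checked separately against each period of record."""
    sid, lat, lon = station
    return [c for c in candidates
            if c[0] != sid and haversine_km(lat, lon, c[1], c[2]) <= radius_km]
```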
Visual comparison of a station’s observation series with those of neighboring stations revealed that many of the discrepancies occurring between nearby observations were due either to different observation times or to so-called date shifting. Because daily observations cover 24-h periods rather than calendar dates, observers have long been instructed to assign the daily readings from their instruments to the calendar date on which the instrument was read. In some cases, the reading may take place on the calendar day after the value actually occurred. Most voluntary observers make their observations at either 0700 or 1800 local time, but other times may be selected by the observer. Thus, neighboring stations may differ in their observation times, and the same temperature or precipitation event could be ascribed to two different dates by neighboring observers. For example, a 110°F maximum temperature that occurred at 1600 local time on 11 July would be recorded on 11 July by an evening observer and on 12 July by a morning observer. Both observations are correct, but they must be reconciled prior to comparison.
Date shifting occurs when observers (usually a morning observer) routinely but erroneously assign their observations of maximum temperature and/or precipitation to the calendar day preceding the observation date, under the assumption that the values most likely occurred during the prior day. If a station’s observation time could not be identified, and date shifting could not be visually confirmed, the time series of the station in question was lagged by ±1 day relative to its neighbors, and the lag (−1, 0, or 1 day) that produced the highest correlation coefficient was used for the comparison. When applied, this shift was sufficient to align the observations and to facilitate the validation of the observation in question.
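The ±1-day lag test can be sketched with a plain Pearson correlation over the overlapping daily values (a simplified illustration; the analysis worked with complete observation series and handled missing data):

```python
def pearson(x, y):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    return sxy / (sxx * syy) ** 0.5

def best_lag(station, neighbor, lags=(-1, 0, 1)):
    """Lag (in days) of `station` relative to `neighbor` that maximizes
    the correlation of the overlapping values. A lag of -1 pairs station
    day t with neighbor day t+1, the alignment expected when the
    observer has date shifted."""
    scores = {}
    for lag in lags:
        x, y = (station[lag:], neighbor) if lag >= 0 else (station, neighbor[-lag:])
        m = min(len(x), len(y))
        scores[lag] = pearson(x[:m], y[:m])
    return max(scores, key=scores.get)
```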
6. External review
The results of the investigation process for both existing record extremes and potential replacements were provided to the NWS Climate Services Division and to the American Association of State Climatologists, who disseminated them to local NWS climate focal points and state climatologists, respectively. These external reviewers were given an opportunity to evaluate and challenge the results. Most of the feedback confirmed the values or requested clarification of the review process. A few comments questioned the distinction between “official” and “unofficial” records, but all were in agreement that such a distinction was necessary for defensibility. Twelve values were disputed in this postinvestigation review process, and each of the cases was reevaluated in the context of additional information. In one case (all-time greatest 24-h snowfall for Washington State), an extreme observation that was archived outside of NCDC was identified, and after evaluation it replaced the revised record extreme value. In the remaining cases, information about the reliability of the station was provided by the state climatologists and NWS focal points. That information was sufficient to invalidate the revised record extreme value in question. All 12 challenges were ultimately resolved to the satisfaction of NCDC and the reviewers and are reflected in the revised extremes tables published by NCDC on its website (http://www.ncdc.noaa.gov/extremes/scec).
Two of the revised statewide extremes conflicted with the national climate extreme values reported by the NCEC. The NCEC identifies an observation of 43.00 in. of rain reported in conjunction with Tropical Storm Claudette near Alvin, Texas, in 1979 as the U.S. record for 24-h precipitation. A meteorologist on duty at the local NWS office during the storm received a report of 42.00 in. from a reliable observer with a 10-in. rain gauge west of Alvin (Fig. 9), however. It is possible that the 43.00-in. value was estimated from this amount, as the observer reported that his gauge had overflowed at one point during the event. For this reason, the state tables list 42.00 in. as the record for Texas instead of the 43.00 in. recognized as the national record. The second discrepancy involves a 24-h snowfall observation of 78.0 in. from Mile 47 Camp, Alaska, in February of 1963. The observation exceeds the 75.8-in. national record set at Silver Lake, Colorado, in 1921. Discussion with NCEC members indicates that they were not aware of this value at the time that they last evaluated the national 24-h snowfall record and that the Mile 47 Camp observation probably had not been digitized at that time. These cases are under investigation by representatives of the national and state climate-extremes committees to ensure that the most accurate information is presented on both tables.
A handwritten note from an employee of the NWS Houston Forecast Office dated 20 Aug 1979 describing the precipitation observations of a resident of Alvin during the passage of Tropical Storm Claudette. Although the note lends support for an estimated 24-h total of up to 45 in., a reasonably precise estimate could not be determined. The note does corroborate that at least 42.00 in. was measured in a rain gauge during the 24-h period between 0700 LT 25 Jul and 0700 LT 26 Jul 1979; that value was carried forward into official storm-summary documents, and therefore that value was deemed sufficiently defensible for an official record amount.
Citation: Journal of Applied Meteorology and Climatology 51, 11; 10.1175/JAMC-D-11-0226.1
7. Results
In total, 104 of the 250 record extreme values in the five NCDC tables of statewide climate extremes were revised. This number included seven values observed since 2006 and confirmed by the SCEC. In addition, new tracking was established for all-time maximum temperature, all-time minimum temperature, and all-time greatest 24-h precipitation records in Puerto Rico and the U.S. Virgin Islands. The statewide records of all-time greatest snow depth and all-time greatest 24-h snowfall received 36 and 37 revisions, respectively; many of these changes were due to difficulties inherent to observing and quality controlling snow data (National Weather Service 1997). Seventeen statewide records of all-time greatest 24-h precipitation were updated, and the lists of all-time maximum and all-time minimum temperature records each contained seven revisions.
The quantity of revised values and the substantial effort required to make those revisions highlight the limitations of periodically updating the NCDC tables of statewide climate extremes and demonstrate the importance of a process that allows any statewide record extreme value to be individually challenged, reviewed, and updated as needed. Such a process has been established with the creation of the SCEC. Through the SCEC, any value in the revised tables may be challenged by new observations, new information about the existing record, or newly discovered historical observations. If a record value is found to be invalid or is superseded by a greater observation, the extremes database is immediately updated to reflect the new record, ensuring that the extremes tables remain current.
In addition, the transition from static extremes tables to a dynamic database of extreme values allows additional statewide climate records to be tracked with minimal effort. The SCEC has the latitude to establish records for climate elements that are not traditionally tracked by NCDC. For example, in 2009 the SCEC met to establish a record for the all-time largest hail in Vermont. As a result of this meeting, an initial database of the largest hail observation in each state was established. As resources permit, other climate elements (e.g., wind speed or barometric pressure) may also be added.
Puerto Rico and the U.S. Virgin Islands have been added to the tables of statewide climate extremes, but the U.S.-affiliated Pacific Islands (i.e., Guam, American Samoa, Republic of Palau, Federated States of Micronesia, Republic of the Marshall Islands, and the U.S. Minor Outlying Islands) still await inclusion. Observation stations from these islands have traditionally been combined under the “Pacific Islands” header in the Cooperative Observer Network system. The next step in this effort is to extract each island’s extreme observations and add their individual climate-extremes records to the tables. As resources permit, additional tables of statewide climate extremes (e.g., annual total snowfall or highest temperature by month) will also be revised.
This work and the continued work of the SCEC are critical components in national and global climate-extremes evaluation. Much effort has been devoted to the evaluation of extremes at these larger scales (e.g., Cerveny et al. 2007) because a comprehensive knowledge of extreme environmental conditions is imperative to addressing potential climate impacts on our socioeconomic and environmental systems (Peterson et al. 2008). In this context, verifiable statewide climate extremes may bring a new record to the attention of the NCEC or the World Meteorological Organization (WMO) Commission for Climatology and serve to inform a review of national or global record extreme values. For example, on 23 July 2010, a record-setting hailstone fell near Vivian, South Dakota. The record status was confirmed for the state by the SCEC, and that information was forwarded to the NCEC, which subsequently confirmed the hailstone as the national record for both greatest diameter (203.2 mm) and greatest weight (0.879 kg). The WMO Rapporteur on Climate Extremes also requested the SCEC report to investigate whether the stone may have set an international record (R. S. Cerveny 2010, personal communication).
Acknowledgments
The authors acknowledge the hard work of Michael Changery, Grant Goodge, and Tom Ross of NCDC and Robert Leffler and Andrew Horvitz of NWS in compiling the previous NCDC records tables. We also acknowledge the conscientious efforts of NCDC’s Datzilla administrator, Bryant Korzeniewski, to correct any erroneous data or metadata found by this investigation. The NWS Climate Services Division and the American Association of State Climatologists deserve credit for facilitating the validation of the revised tables, and the invaluable dedication of thousands of volunteer weather observers that made this work possible must also be acknowledged. We also thank the anonymous reviewers whose thoughtful comments helped to improve this manuscript.
REFERENCES
Bradshaw, J., 2009: A history of snow in south Louisiana. Teche Today, December. [Available online at http://www.techetoday.com/view/full_story/8437112/article-A-history-of-snow-in-south-Louisiana.]
Cerveny, R. S., J. Lawrimore, R. Edwards, and C. Landsea, 2007: Extreme weather records: Compilation, adjudication and publication. Bull. Amer. Meteor. Soc., 88, 853–860.
Doty, S., 1989: Centennial Cooperative Weather Station Program. State Climatologist, Vol. 13, No. 2, 8 pp. [Available online at http://www.stateclimate.org/publications/state-climatologist/NOAA-NCY-SCBOOKS-SC77097/00000037.pdf.]
Dupigny-Giroux, L., T. F. Ross, J. D. Elms, R. Truesdell, and S. R. Doty, 2007: NOAA’s Climate Database Modernization Program: Rescuing, archiving, and digitizing history. Bull. Amer. Meteor. Soc., 88, 1015–1017.
Durre, I., M. J. Menne, B. E. Gleason, T. G. Houston, and R. S. Vose, 2010: Comprehensive automated quality assurance of daily surface observations. J. Appl. Meteor. Climatol., 49, 1615–1633.
Finneran, H. T., 1965: Preliminary inventory of operational and miscellaneous meteorological records of the Weather Bureau: (Record Group 27). General Services Administration, National Archives and Records Service, The National Archives, 160 pp.
Harns, J. E., 1952: Storm Study SA 5-8, 3-8 September 1950. U.S. Army Corps of Engineers, Jacksonville District, 201 pp.
Jennings, A. H., 1950: World’s greatest observed point rainfalls. Mon. Wea. Rev., 78, 4–5.
Lawrimore, J., B. Gleason, C. N. Williams Jr., M. J. Menne, and W. E. Angel, 2010: U.S. and global in situ datasets for the analysis of climate variability and change. Preprints, 22nd Conf. on Climate Variability and Change, Atlanta, GA, Amer. Meteor. Soc., J14.1. [Available online at http://ams.confex.com/ams/pdfpapers/159705.pdf.]
National Weather Service, 1997: Evaluation of the reported January 11-12, 1997, Montague, New York, 77-inch, 24-hour lake-effect snowfall. NOAA/NWS/Office of Meteorology Special Rep., 60 pp. [Available online at ftp://ftp.ncdc.noaa.gov/pub/data/cmb/special-reports/ncec/mantague-ny-snowfall-24hour.pdf.]
National Weather Service, 2011: National Weather Service Instruction 10-1004: Climate records. NOAA/NWS NWSI 10-1004, 43 pp. [Available online at http://www.nws.noaa.gov/directives/sym/pd01010004curr.pdf.]
NCDC, 2006: US snow climatology background. [Available online at http://www.ncdc.noaa.gov/ussc/USSCAppController?action=scproject.]
NCDC, 2009: Data documentation for data set 3200 (DSI-3200): Surface land daily cooperative summary of the day. NCDC Publ., 19 pp. [Available online at http://www1.ncdc.noaa.gov/pub/data/documentlibrary/tddoc/td3200.pdf.]
OFCM, 2005: Federal meteorological handbook No. 1—Surface weather observations and reports. Office of the Federal Coordinator for Meteorological Services and Supporting Research, 104 pp. [Available online at http://www.ofcm.gov/fmh-1/pdf/FMH1.pdf.]
Paulhus, J. L., 1953: Record snowfall of April 14-15, 1921, at Silver Lake, Colorado. Mon. Wea. Rev., 81, 38–40.
Peterson, T. C., and Coauthors, 2008: Why weather and climate extremes matter. Weather and Climate Extremes in a Changing Climate, U.S. Climate Change Research Program Synthesis and Assessment Product 3.3, 11–34.
Price, S., P. T. Matsuo, and K. T. S. How, 1975: Study element report: Climatology. Hawaii Water Resources Regional Study, 126 pp. [Available online at http://www.hawaiistateassessment.info/library/Hawaii_Water_Resources_Regional_Study_Element_Reports_1975/HWRRS_Study_Element_Report_Climatology_April-1975.pdf.]
Reek, T., S. R. Doty, and T. W. Owen, 1992: A deterministic approach to the validation of historical daily temperature and precipitation data from the Cooperative Network. Bull. Amer. Meteor. Soc., 73, 753–765.
Ross, T. F., and R. T. Truesdell, 2010: NOAA’s Climate Database Modernization Program (CDMP)—A decade of data rescue and modernization activities. Preprints, Eighth AMS Presidential History Symp., Atlanta, GA, Amer. Meteor. Soc., J7.2. [Available online at https://ams.confex.com/ams/pdfpapers/159403.pdf.]
Shein, K. A., 2008: Interactive quality assurance practices. Preprints, 24th Conf. on Interactive Information Processing Systems (IIPS), New Orleans, LA, Amer. Meteor. Soc., 6A.9. [Available online at https://ams.confex.com/ams/pdfpapers/131217.pdf.]
Shein, K. A., D. P. Todey, F. A. Akyuz, J. R. Angel, T. M. Kearns, and J. L. Zdrojewski, 2012: Revisiting the statewide climate extremes for the United States. Bull. Amer. Meteor. Soc., in press.
Smith, E., 1956: Report of rainstorm damage, 2 (November 1955 and January 1956). Kilauea Sugar Co., Ltd., 36 pp. [Available from Kaua’i Historical Society, P.O. Box 1778, Lihu’e, HI 96766.]
U.S. Weather Bureau, 1944: Storm of July 17–18, 1942 New York–Pennsylvania: Supplement to Daily and Hourly Precipitation. Compiled by USWB Hydrologic Unit, Albany, NY, 40 pp.
You, J., and K. G. Hubbard, 2006: Quality control of weather data during extreme events. J. Atmos. Oceanic Technol., 23, 184–197.