1. Introduction
Impact-based decision support (IDSS) and impact-based warnings (IBWs) are efforts to consider how a forecast will be experienced by end users and to communicate that impact. As Campbell et al. (2018) assert, there is a need to move “from ‘what the weather will be’ to ‘what the weather will do’.” To that end, IBWs attempt to meet a variety of user needs and effectively tell the story of the probable threats severe weather may bring to a location. Much of the research on IDSS centers on IBWs. While it has been asserted that including specific impacts of an event in a warning may lead to more appropriate responses by members of the public who receive the IBW (Casteel 2016, 2018; Weyrich et al. 2018; WMO 2015), others have found either no effect or mixed results (see, e.g., Potter et al. 2018; Ripberger et al. 2015). However, these studies address public responses to IBWs. Studies that focus on perspectives of emergency managers and other professionals have reported on the benefits of IBWs, including, among others, “added awareness of antecedent conditions and cascading hazards” (Potter et al. 2021), a focus on the information (impacts, not amounts) that resonates with the decisions emergency managers make (Kox et al. 2018), providing more insight into what forecasters are thinking (Galluppi et al. 2013), and particular utility for events that occur less frequently (Schmidt et al. 2022). Of course, there are also challenges with IBWs as there are with all forms of risk communication, including how to meet the different needs of target audiences, which impact thresholds are appropriate, and how much information to include (Morss et al. 2016, 2018; Potter et al. 2018, 2021; Ripberger et al. 2015). Additionally, IBWs may be effective in informing user perceptions of severe storm events, but this does not necessarily lead to users taking action in response (Potter et al. 2018).
A relatively new product in the impact-based forecast suite is the winter storm severity index (WSSI), a graphical product designed to meet the need for high-level notice by providing general situational awareness of the severity and range of potential impacts from an impending winter weather event. Although the WSSI is not a warning in the technical sense, it is similar in that it presents anticipated impacts from a forecast storm. Studies on IBWs provide important background for understanding the use and effectiveness of the WSSI, but none of the studies above focus on winter weather. Furthermore, IBWs are typically text-based messages, whereas the WSSI is a graphical representation. Graphics, in this case maps, have been found to be an effective means of presenting hazards because, among other aspects, they indicate who needs to take protective action without requiring familiarity with the language in text products, thus leading to greater personalization of the risk (Bean et al. 2015; Dallo et al. 2020). However, in addition to the challenges noted above for IBWs, all of which also apply to the WSSI, a graphical product faces further challenges related to design choices in presenting impacts and to effective text in legends. For example, two studies found that forecasts that are solely graphical can lead to inaccurate interpretations (Broad et al. 2007; Savelli and Joslyn 2013), and graphical products may not convey information in a way that recipients understand and that motivates them to act (Hogan Carr et al. 2016a). Accordingly, design factors, including the use of color, have been shown to help people “make sense of the information being conveyed” (Hogan Carr et al. 2016b). Additionally, accompanying text information, particularly in map legends, is critical for providing explanation and detail. These elements are important to consider in determining effective ways to communicate winter storm impacts.
There is a paucity of research that addresses users’ needs for and use of winter weather forecasts (Sherman-Morris 2013); much of what exists addresses decisions about school closings (see, e.g., Call 2010; Call and Coleman 2014; Montz et al. 2015), the use of forecasts to manage transportation routes (see Ye et al. 2009; Strong et al. 2010), and the impacts of uncertainty in warnings (see Drobot 2007; Drobot et al. 2008; LeClerc and Joslyn 2015), including second-order uncertainty, or uncertainty about the uncertainty in a forecast (Kox et al. 2015). Nichols and Hoekstra (2011) illustrate the factors that influence school district decision-making in the face of forecasts for severe weather (Fig. 1), and Montz et al. (2015) adopted this model for their work on school closure decisions related to winter storms. Weather information, though central to the weather-related decision-making process, is only part of the process. Different decisions may well result from similar forecasts, given the other factors at play, including varying professional responsibilities and local context. However, the more users trust the forecasts, the more readily those forecasts can inform their decisions, and, as Potter et al. (2021) found, IBWs can provide important situational awareness.
Fig. 1. Model of the factors that influence decision-making (Nichols and Hoekstra 2011).
Winter storms, including snow, ice, blowing snow, and freezing temperatures, present one of the largest weather hazards in the United States, causing tremendous financial damage, disruption to services, and, often, loss of life and property. Along with public audiences, professional users of National Weather Service (NWS) products, including emergency managers, transportation departments, utilities, hospitals, schools, and aviation partners, need timely and accurate information about when and where a winter storm may hit. But forecasting winter storms is a complex process involving a range of critical impacts, which can occur at different severity levels and spatial distributions within a single storm event. Drawing from a mixed-methods social science research study conducted by the Nurture Nature Center in coordination with the Weather Prediction Center (WPC) addressing the utility of the WSSI for forecasting winter weather impacts, this paper focuses on how professional stakeholders understand, interpret, and use this graphical impact-based product for communicating about impending winter weather.
The WSSI
The WSSI has emerged in response to forecaster and professional user needs for easily consumable forecast information that identifies the multiple impacts and relative severity of an impending storm. Initially conceived at the Burlington, Vermont, Weather Forecast Office (WFO), the WSSI was taken under development by the WPC, and the science behind the tool has continued to evolve. The WSSI has been launched for WFOs across the contiguous United States (CONUS). The WSSI is produced using geographic information systems (GIS): the official NWS gridded forecasts from the National Digital Forecast Database (NDFD) are screened for winter weather elements and combined with nonmeteorological, or static, datasets (e.g., climatology, land use, and urban areas) that affect the nature and potential severity of impacts, resulting in a graphical depiction of impacts from winter weather. Specifically, the WSSI is calculated using forecast 6-h snow, ice, and precipitation accumulation, wind speed, temperature, snowfall rate, snow-to-liquid ratio, snow depth and snow water equivalent, urban area and land use designations, and National Centers for Environmental Information (NCEI) annual snowfall climatology. Specific details of the WSSI computation can be found at the WPC WSSI website (https://www.wpc.ncep.noaa.gov/wwd/wssi/wssi.php) in the product description document (WPC 2022).
The overall expected severity of winter weather is classified into five categories: minor, moderate, major, extreme, and winter weather area (where winter weather conditions are expected but not anticipated to impact daily life). The WSSI output consists of graphical image files and GIS files for an overall impact map and maps of six components: snow amount, ice accumulation, snow load, blowing snow, ground blizzards, and flash freeze. The overall impact is a composite of the maximum impact from any of the six components. The component algorithms predict the severity of the winter weather on a scale of 0 to 5, with the scale equating to the potential severity category levels (0 for no winter weather, 1 for winter weather area, 2 for minor, 3 for moderate, 4 for major, and 5 for extreme impacts). The final WSSI value is the maximum from among all the components at each grid point (at 2.5-km NDFD resolution). The WSSI does not account for conditions that occur prior to its creation, as it uses forecast information and thus will not be representative of the full event in an ongoing weather situation. The WSSI articulates impacts over a 72-h forecast window, presented as 24-h forecasts for each of days 1, 2, and 3 and as a combined 72-h forecast for days 1–3, with the resulting forecast severity scaled into the five impact levels (Fig. 2). This scaling is designed to help users quickly and easily identify the forecast level of winter storm impacts.
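To illustrate the compositing step described above, the following minimal sketch shows how per-component severity grids could be combined into an overall map by taking the maximum severity at each grid point. It is an illustrative example only, not the WPC implementation: the function and variable names are assumptions, and real WSSI components are derived from NDFD forecast grids and static datasets rather than the toy arrays used here.

```python
import numpy as np

# Severity levels on the 0-5 scale described above (assumed label mapping).
SEVERITY_LABELS = {
    0: "none",
    1: "winter weather area",
    2: "minor",
    3: "moderate",
    4: "major",
    5: "extreme",
}

def overall_severity(component_grids):
    """Combine per-component severity grids into an overall severity grid.

    component_grids maps a component name (e.g., "snow_amount") to a 2D
    integer array of severity levels (0-5) on a common grid. The overall
    value at each grid point is the maximum severity across components,
    mirroring the compositing rule described for the WSSI.
    """
    stacked = np.stack(list(component_grids.values()), axis=0)
    return stacked.max(axis=0)

# Toy 2x3 grids standing in for two components.
snow_amount = np.array([[2, 3, 4],
                        [0, 1, 2]])
ice_accumulation = np.array([[1, 4, 2],
                             [0, 3, 1]])

overall = overall_severity({"snow_amount": snow_amount,
                            "ice_accumulation": ice_accumulation})
print(overall)                              # [[2 4 4] [0 3 2]]
print(SEVERITY_LABELS[int(overall.max())])  # "major"
```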
Fig. 2. WSSI examples showing (left) overall severity and (right) severity of a component element, snow amount, for a day-1 forecast. On the website, the WSSI is interactive, providing the ability to zoom down to the street level. Note that this is the current representation of the WSSI (as of December 2022), which has been modified on the basis of our iterative research and communication with the WPC. During testing in the study focus groups, an earlier version of the WSSI was shown (see Fig. 3, below, for details on what was included in the focus-group rounds).
Each component of the WSSI presents a different hazard and, in many cases, creates impacts specific to different users, partners, and regions. For instance, transportation-related users need to understand where to anticipate ice accumulation and blowing snow to safely prepare for travel conditions. Snow load will be critical for infrastructure impacts due to the weight of snow and for emergency managers, who may need to prepare for extended power outages or other service disruptions, necessitating planning for emergency shelters. Ground blizzards, which are rare in most regions of the country, combine preexisting snow with very strong winds to create hazardous conditions when they do occur, with significant impacts for transportation and other sectors. Flash freeze creates urgent transportation considerations that are distinct from those of blowing snow and requires a different planned response in situations where temperatures changing rapidly around freezing have impacts during or after precipitation. Ice accumulation can cause widespread tree damage, utility line disruption, and transportation hazards, while snow accumulation can create transportation challenges as well.
The WSSI has two key audiences. First, it is intended to assist NWS operational forecasters in maintaining situational awareness of the possible significance of weather-related impacts and to facilitate collaboration around such impacts across WFOs and national forecast centers. Second, the WSSI is designed to enhance communication to external professional partners, including utilities, transportation, emergency management, water resources, and schools, of the expected severity (potential societal impacts) of winter weather and its spatial distribution. While some feedback was collected early in the development of the WSSI, no user testing was conducted to ensure that the WSSI captured the right mix of storm components or that it appropriately categorized threats. Furthermore, no testing had been done to determine how various users of the WSSI would interact with the product or how information should be presented. Iterative testing through this mixed-methods social science research study was completed to address these gaps.
2. Method
To collect feedback and develop revision recommendations for the WSSI, the study used two rounds of virtual focus groups in six diverse regions of the CONUS, pre- and postsession surveys for each focus group, a third round of testing via an online survey sent to all previous participants, and inclusion in the WPC’s Hydrometeorological Testbed Winter Weather Experiment. Two-hour focus groups were held with professional stakeholders in six WFO areas: Grand Rapids, Michigan; San Joaquin/Hanford, California; Jackson, Mississippi; Boston, Massachusetts; Omaha, Nebraska; and Boulder, Colorado. Round 1 was held virtually in January 2021, and round 2 was held virtually in October 2021. Also included were a virtual focus group for forecasters throughout the CONUS (one in round 1 and one in round 2), a virtual focus group with only industry representatives, including transportation and utilities (round 1 only), and a virtual focus group with personnel from WPC and NWS Headquarters (round 1 only). Participants represented professionals in utilities, emergency management, transportation, municipalities, education, public health, water resources, media and news, police, operations, and more. Participants differed between round 1 and round 2, but all participants were invited back to complete round 3. The focus groups for industry, forecasters, and WPC, along with the round-3 survey results, are not included in the analysis here because of their specialized perspectives.
The focus-group discussions and surveys used products provided by each area’s WFO as part of a scenario about a severe winter storm event relevant to each geographical area. Scenarios started from 2 weeks to several days ahead of the target event date to incorporate the different regional contexts. In round 1, the WSSI was the then-current WSSI developed by WPC. In round 2, the same scenarios were used but with revised, mocked-up WSSI graphics designed after incorporating survey and focus-group feedback from round 1 (see the Boston examples in Fig. 3 with explicit changes and explanations noted). The changes made were assimilated from all focus groups and were not specific to each location. In round 3, the online survey did not use a scenario-based approach but rather showed a further revised WSSI graphic and included specific questions about components and legend details in order to refine product design and wording recommendations. The surveys included both open-ended and multiple-choice questions. Results from only rounds 1 and 2 are reported in this paper. Survey instruments are provided in the online supplemental material.
Fig. 3. Example of the WSSI product used in Boston in (left) round 1 and (right) round 2. Explicit details of changes made between rounds are noted.
For each focus group, participants were recruited through partnering WFOs, which provided contacts with whom the research team connected. During each session, participants completed a presession survey about winter storm experience, challenges, and demographic information. Specifically, the survey asked questions such as “what are the most significant community or social impacts of winter weather events in your area?”; “how do you use and access NWS winter weather information?”; and “if you learn a significant winter event is approaching, what do you typically do with that information?” Then participants were led through a winter storm scenario via a presentation showcasing the WSSI as it is commonly used within the local WFO (i.e., as part of briefing packages or weekly partner emails) to test the current format and delivery. The facilitator asked questions about the types of decisions the users make and how the users have or might engage with the WSSI in their decision-making processes. After the focus-group discussion, participants completed a postsession survey asking for detailed feedback on the design of the products, as well as the ways they would share the information provided in the WSSI. Specific questions included “what is the biggest barrier you face in responding to and/or preparing for winter storm events?” and “what else would be important for us to know about how you gather information about winter weather risks and your intended actions?” Following the focus groups, survey responses were aggregated and analyzed, and focus-group recordings were transcribed and content coded using NVivo software.
Following the analysis of the round-2 survey results and focus groups, further refinements of the product design and recommendations were tested with forecasters in the WPC’s Hydrometeorological Testbed [as part of the 12th Annual Winter Weather Experiment (WWE)] through discussion of the product’s utility for forecasters and a brief survey. The WWE provides collaborative research-to-operations support by bringing together forecasters, researchers, and academics to evaluate and discuss winter weather forecast challenges. A description of the testbed and results is available in Harnos et al. (2022). Then a virtual survey was administered (round 3) to ask previous focus-group participants about specific elements and options related to the legend and components (Fig. 4). Throughout the process, the research team provided interim findings and recommendations and debriefed the WPC team, which led to the implementation of some product revisions, including changes to the legend descriptions, the number of legend categories, the title, and several other graphic design elements.
Fig. 4. Example of the product tested in the round-3 online survey.
3. Results
a. Surveys
Focus-group participants were asked to complete a pre- and postsession survey in each round. Participation rates are noted in Table 1. Across all the sites, the winter weather impact of greatest concern was travel disruption, with 62% of participants mentioning travel in round 1 and 39% mentioning travel in round 2. Power outages were the second-most-mentioned impact (31% in round 1 and 13% in round 2), followed by school (6% in round 1; 9% in round 2) and business (12% in round 1; 7% in round 2) disruptions. Barriers to responding to winter storms included unpredictable weather, the timing of storms, lack of confidence in forecasts, forecast accuracy or uncertainty, and limited resources.
Table 1. Number of participants in the surveys for each round by location.
Focus-group participants varied in their familiarity with the WSSI (Fig. 5). In round 1, 23% of participants were not familiar with the WSSI and 12% used it regularly. In round 2, 22% of participants were not familiar with the WSSI and 22% used it regularly.
Fig. 5. Focus-group participants’ level of familiarity with the WSSI reported in presession surveys for (left) round 1 and (right) round 2. Each location is reported along with a summary of all locations (rightmost column in each graph).
The perceived usefulness of the WSSI product varied by region (Fig. 6): some areas, such as Jackson, saw less use, while others, such as Boulder and Omaha, found high utility in the WSSI. Boston, Hanford, and Omaha all had higher perceived usefulness of the WSSI in round 2 than in round 1. These high levels of usefulness across sites, and the increase in usefulness from round 1 to round 2, support the effectiveness of the design modifications in improving the utility and understandability of the product. Jackson reported lower utility for the product, in part because of skepticism about forecasting winter weather in the area rather than a lack of understanding; participants said they wanted to spend more time with the product and to “ground truth” the forecast in order to assess how useful it might be, and in these comments they expressed a willingness to continue working with the product even if initial confidence was tentative.
Fig. 6. Usefulness of the WSSI reported in postsession surveys across all locations in (left) round 1 and (right) round 2. Extremely or very useful is shown in red, and slightly or not useful is shown in orange.
b. Focus groups
As described earlier, focus-group participants were led through scenarios in which the WSSI was included in briefings or other communications WFOs would typically disseminate with an impending event. Because our focus here is on understanding the utility and interpretation of impact-based winter forecast products, we refer the reader to the report of our full findings for details on design changes and additional recommendations (Nurture Nature Center 2023). The findings related to impact-based products fall under several main concepts: the utility of an impact-based product; confusion interpreting an impact-based product; describing winter weather impacts; timing matters with impacts; consideration of other impacts; changes to the product; and impact variability by location. Quotations included for each main finding were selected to illustrate the range and frequency of responses received from various participants across all sites.
1) Utility of an impact-based product
The WSSI, along with other products, helps professionals understand the situation and disseminate that information as needed. Participants noted that the WSSI is helpful in planning and determining which resources and staffing might be needed where, as well as making decisions relating to closures. In round 1, participants thought the WSSI provided good situational awareness. Several mentioned that it provides a heads-up for planning a response. For instance, one participant said, “knowing that this is the weather impacts specifically, and that we can then make it a layer in our own EOC [emergency operations center] and add site-specific or area-specific information on top of that is really helpful for that full picture.” In round 2, the revised WSSI was also seen as very useful, at least as a starting point, for the professionals in the focus groups, as reflected in comments such as “It’s a really just quick way to aggregate a lot of information and a lot of atmospheric properties into one map and kind of convey risk.” It was further noted that the information helps the professionals make decisions about the scale and locations of the resources that may be needed.
2) Confusion interpreting an impact-based product
Despite the utility of the product expressed by many participants, some components of the maps, such as word choices and a lack of quantitative information, led to confusion among the focus groups. In round 1, the title “WSSI Overall Component” was not readily understood, with one participant asking, “What’s a WSSI?” and others questioning the use of the word component. One participant suggested that the word component be replaced by impact, arguing that “since everything is impact based everywhere else, I’m not sure what component means.” It was also noted that it is not clear that the overall map is a combination of the six components: “So, if I saw this, without any other context I would go OK, is that the heaviest snow, is that heaviest ice, is this blowing snow? What is this telling me?”
Titles of some components, such as “Ice Accumulation” and “Snow Amount,” caused a number of participants to assume that quantities would be linked to the legend categories “because you see the word amount and you’re immediately searching for totals.” Others reflected their need for quantities: “I think when you have a map depicting snow amount but you don’t have estimated inches, that’s going to confuse people as well” and “From my perspective, having snow amount is very important because it does dictate a lot of our impacts to communities.” These and similar comments made across focus groups illustrate a potential misunderstanding among some professionals of the impact-based purpose of the product.
In round 2, there remained an expressed need for quantities: “Why not tell me the amount of snow that we’re going to get, rather than say minor or . . . moderate? Tell me we’re gonna get two to four inches or we’re gonna get one to three inches, not just a color-coded graph.” The lack of actual numbers of accumulation was seen by some as reducing the utility of the product: “I do think that by trying to add or display things like ice accumulation or snow amount in here without those actual numbers, I think it waters down the effectiveness of the product”; and “I still think that for me, being a quantitative person, you know the more detail we can get the better, versus the sort of qualitative threshold breaks.” These sentiments may reflect the need for greater clarity and/or education about the WSSI as an impact-based product or highlight the need for both quantitative and impact-based information to be included in winter weather briefings.
3) Describing winter weather impacts
How the levels of impact are differentiated in the legend categories was another topic of discussion in round 1. Without explanations or definitions of the categories beyond the level names provided (e.g., limited, minor, moderate), participants often had difficulty sorting out the extent of potential impacts. One participant said, “I think they just generally want to know more about what these different colors mean because the descriptions that you have there in the scale are kind of generalized.” It was also noted that “we’re trying to condense all these sort of effects and categorize them into one or one small thing, but there’s a lot happening there,” calling for greater detail in the legend on each map.
Participants had a difficult time understanding the difference among the categories, an example being “that just might be my lack of use of this product, but like . . . I don’t know what minor, moderate, major, extreme, what the breakdown of that is.” Another participant asked, “What’s the difference between limited and minor? It seems like it’s the same to me.” Others questioned why the limited category is even needed. As one professional said, “For me, it’s just too general. It’s just, it’s not specific enough.” There were discussions about the relative nature of the terms used, recognizing that what is minor to one person may be major to another. Without definitions of how the categories are determined, one participant asked, “What is it that makes it minor impacts or moderate impacts? For me to try to understand it and then effectively communicate it, I want more information so I can know what I’m looking at and that’ll help me better explain it to partners.” In general, there was an expressed need for specific information on what leads to an impact being categorized as, for example, major or extreme.
In round 2, there was concern that what is minor, moderate, and major is relative such that a “moderate impact may be more severe depending on where it is” and “it can’t be complete unless it’s layered upon local impact.” The minor category generated a great deal of discussion, with suggestions that it could generate a false sense of security. Specifically, it was noted that while the impacts might not be a direct threat to life and property, it is not clear what the impacts are besides inconvenience. Furthermore, those that might occur could be more than minor, because, for instance, “we know that poorly timed minor conditions can cause a real pile up.” And location makes a difference: “If southern Mississippi receives a fourth inch of snow, I guarantee it’s gonna be a direct threat to life and property.” One participant worried that if she communicated minor to her “higher ups,” they would only consider basic preparedness and not worry about the potential for moderate or major impacts.
The extreme category also generated discussion. One participant said, “I think this would be a terrifying map to look at . . . I got an extreme impact right next door to us, so I would definitely start, well, probably already making calls”; and a professional in the Grand Rapids region said, “The extreme should only be multiday . . . up to a week impact. It should never be used except for the most extreme events.” While the extreme category definitely draws attention, comments indicate that the professionals want more specific definitions of what the impacts for each category are, similar to what came out of the round-1 focus groups. As one put it, “I would expect, OK, it’s going to be really really bad in certain areas or circumstances, but why? It doesn’t say why, what’s causing it.” A number of participants asked for examples of the kinds of impacts one might expect in each category, with some requesting specific transportation impacts or impacts to power, and others suggested providing examples of potential property damage.
Another expressed need related to historical information. As one participant asked, “what about archival data somewhere giving the link to the last three, one to three storms with similar forecasts in the actual impacts from previous storms?” Several stated that such information would allow them to put the forecast impacts into perspective because it would provide something to which they could relate the current event.
4) Timing matters with impacts
Beyond knowing which impacts might occur, it matters when the impacts are going to occur because, as the professionals noted in round 1, impacts will vary at different times, so the nature and timing of their decisions will need to reflect that. Obvious examples are rush hour versus later in the day or overnight and a weekday versus a weekend, pointing to the importance of being able to track the impacts in smaller time increments than the 3-, 2-, and 1-day maps. One participant said, “Because decision makers need to see if it’s going to be over the weekend or if it’s going to be at night. Those kinds of things are important to us.”
It’s all about time of year and perception and what’s going on, but there there’s been a lot of times that when less snow or almost like some freezing drizzle, not an ice storm but some freezing drizzle, will cause much bigger events crash wise than the bigger events themselves.
Timing of impacts, such as what time of day the weather occurred or if it was on a weekend or weekday, remained a concern in round 2 because of how that affects various responders’ operations. There was an appreciation for the new 24-h rolling display in 6-h increments that was shown as a prototype. “Certain decision-making points happen at certain times, so for us I mean 24 hours and at least 12 hours prior. I mean, by the time we’re hours before, we’re already moving into what we need to do with stuff as far as staffing and planning . . . So I think it would be a useful tool.”
I think there’s still issues that we’ve talked about in terms of explaining some of the timing or onset, or closeout that would be either have to be conveyed in some way, because I know that would be a question I would still want to ask or know would come up and want to sort of proactively answer when sharing this kind of information out to folks.
So again 6 hours helps, but having sort of a larger narrative of like what happens even within that six hours, right? Are we waiting for, are the major impacts at the end of that six hours, throughout the six hours, right smack in the middle? Do they happen twice? Do they go up and down and up?
“I would like to know if there’s going to be an ice accumulation prior to the snow, because that can really change things” and “I think you know a specified more time frame of when you could see it [flash freeze] is probably more something that I would like to see kind of elaborated more, you know, on that.”
5) Consideration of other impacts
In all of the focus groups, there were comments about other impacts of winter weather that are important to participants’ decision-making. Two elements mentioned consistently were temperatures and wind. With respect to the former, one professional pointed out that a couple of degrees of temperature change up or down can have a huge impact on decisions that need to be made regarding such important activities as road treatment and snow removal. Wind, a component requested across focus groups, was recognized as also playing into this: “It would be a useful tool and save us a couple of steps if wind speeds, temperatures, and wind chills were also included.” Besides the impacts of wind chill, the impact of wind on visibility was also mentioned as being of great importance to participants’ decisions and actions, and wind direction makes a difference in some regions.
6) Changes to the product
It doesn’t complicate the map for those who just want to visually see the impact, but if you do have folks who may be preparing a more detailed report or those who are really interested in wanting to know more, that option is there. So it’s nice to have it available for those who need to access it, but it doesn’t take away from the overall objective of the document or the map.
Yeah, we definitely need any wind chill data, timing, severity . . . I think it also helps us interpret . . . if it might affect . . . the heavy wet snow or accumulation that’s already there. So yeah, it’s an integral part of how we try and interpret or forecast storms and our response.
The importance of wind chill to decision-making was emphasized by one school official, who noted that it is critical to discussions about delaying or closing school, and by another professional, who said he would use the information to keep people who are in the field safe. Yet not all agreed on the helpfulness of the wind-chill impact map, with one professional questioning what sort of damage wind chill does to property and others saying that they need to see numbers rather than categories of impact. Some thought it would be helpful, “but in combination with at least an approximation of what the temperatures would be.”
7) Impact variability by location
Some of these terms we’re not going to be very familiar with down here like snow load. It makes sense as to what it was after you explained it, but it’s not something that’s like intuitive. And so there’s the very quick and dirty definitions that are there at the top, but even then, it’s still pretty baseless. It, there’s not a whole lot there and so like I need to know what a flash freeze is. What does that look like? What does that feel like? You know how is that going to impact me? I don’t need a paragraph, but I need to know like you know water on surfaces could, could freeze within an hour, that bridges will become suddenly icy, you know something like that. Just I feel a little lost because I’m not used to these snow terms.
Some of the discussion in the focus groups centered on storm characteristics and impacts specific to the location. In the Grand Rapids focus group, the discussion focused on the lake effect and the difficulty of addressing it in the WSSI. As an example, in 2016, a 53-car pileup on an interstate highway caused three deaths, and one professional pointed out that “it happened in a minor impact forecast.” There is concern about how unpredictable lake-effect events can be with respect to which areas they will affect, particularly given problems when motorists “drive from sunshine to a band of lake effect that is two miles wide.” Thus, participants noted that wind speed and direction, as well as the scale of lake-effect events, pose issues for determining impact severity. Another Grand Rapids participant noted he was more concerned about snow than visibility because of their more rural location: “For us we’re more rural and so blowing, drifting snow and visibilities are less of a concern for us because we don’t have those visibility factors like on the freeways. So for us, it’s that heavier snow that can cause more impacts.”
Another regional difference came up in the Boulder and Hanford focus groups, relating to the role of elevation and population. There was some confusion as to the categories used in mountainous areas, where complex terrain may make winter weather more severe but lower population density and greater familiarity with winter weather may reduce overall impacts. On the one hand, a participant asked with respect to the snow amount map, “it is appearing that the moderate impacts are kind of on the western faces of those mountains, tapering off as it gets higher into elevation . . . but is that meaning that the impacts and that level is adjusted based on elevation?” The lower population at that location led to the lowering of the impact category, a factor that was not readily understood. Yet, in another scenario, the WSSI showed major and extreme impacts at high elevations, which was contested by the participants because there are few people and communities there to be affected. Similarly, professionals wanted to have a better idea of the elevation at which the snow and ice are occurring: “This [the ice accumulation WSSI map] plus the elevation data would be helpful because it would give us a better understanding of, you know, the roads and communities where people are used to ice being impacted, or is it getting down further to the areas where it is less common for people to deal with it”; and “I know that you know the topography obviously has an impact here and I’m sure that’s why those things are there, but it doesn’t present, none of that is, you know, this is there’s nothing about elevation, there’s nothing about terrain.” These examples illustrate how needs and impacts vary geographically across the country, as well as the need to help users understand how severity levels are calculated.
4. Discussion
While impact-based forecast products are seen to have utility in an overall package of information for decision-making, variability in interpretations of impacts complicates the communication of winter weather. What constitutes a moderate impact in one area may not in another, a result also emphasized by Meléndez-Landaverde and Sempere-Torres (2023), who noted that different vulnerabilities may require different thresholds for impact categorizations, such as relevant impacts for different types of critical infrastructure (Kiel et al. 2016). While the WSSI does consider climatology in its calculation, community readiness for storms emerged repeatedly as a factor in how users would accept and translate the WPC’s severity levels. Areas that are used to snow and have equipment to handle transportation and power impacts have higher thresholds for impact than those typically not affected by winter weather (e.g., Boston vs Jackson).
There are many elements of winter storms that are critical for understanding impacts and informing decision-making, including timing and frequency. For those with frequent winter storms, impacts are more readily known for certain storm components, but there is a need for information about less-frequent impacts. Furthermore, the timing of the impacts is key for understanding the category of impact level; the first snow or ice event of the season has a greater impact than the third or the fourth. Antecedent conditions, such as soil moisture or previous precipitation, also influence impacts, a factor pointed out by Potter et al. (2021).
A number of focus-group participants in the study wanted to know the actual numbers (how many inches of snow or inches of ice accumulation) behind the impact categories in order to determine for themselves the true level of impact for their area and level of preparedness, again reflecting variations in vulnerability to the impacts. This differs from the findings of Kox et al. (2018), who found that decisions of emergency managers are based on the impacts and not on the amounts themselves. It could be argued that professional users need more time with the WSSI product to understand how the forecast impacts play out in reality, allowing them to calibrate their own understanding of the forecast impacts and perhaps obviating the need for actual numbers.
Given these variable needs for impact-related information, the findings from focus groups across all sites in this study highlight the importance of careful consideration of categories and clarity about the thresholds for those categories, especially for a national product such as the WSSI. Expressed needs for travel- and power-related impact information, a rolling 24-h forecast, and changes to categories and descriptions were acted upon during the course of the research, improving the utility of the product in real time.
Adding elements that provide more detailed information about impacts within each category of the WSSI and offering the option to include important takeaways in a forecaster’s note improved the perceived utility of the product for informing users’ decision-making. However, there remains a strong need for more education about what the WSSI is and what it is not. There was a desire for quantities for components such as snow and ice amount, yet providing that specific type of information is not the intent of the impact-based WSSI. Such information could be included in emergency briefings accompanying the WSSI, and it will be important for forecasters to understand what additional information the WSSI triggers users to seek out. Thus, while the WSSI is seen as an important product for bringing attention to the potential impacts of an impending storm, it is just one part of the overall package of information professional users are looking for to inform their winter weather decision-making.
5. Limitations
There were several limitations to this research that need to be considered in evaluating the results. The team reached out to partners identified by the local WFOs, and not all participated in the focus groups, leaving self-selected groups in each location. While the participants represented a range of professions relevant to the aims of the project, including both the public and private sectors, not all relevant professions were represented in each focus group. Similarly, the study sites were chosen to cover a range of winter weather experiences, but not all winter weather climatologies across the CONUS were included. Additional study sites could lead to further important findings. The research team found the virtual focus groups to be very productive and found that the virtual format made it easier for some professionals to participate. At the same time, in-person sessions might have led to more in-depth discussion, avoiding the distractions that participants may have had while participating in a remote format.
6. Conclusions
As illustrated in the survey and focus-group results from this mixed-methods social science research study, there is perceived utility in a winter weather impact-based product for situational awareness. The WSSI is seen as a simple way to communicate risk to stakeholders as part of a package of products for use in decision-making, helping users understand potential impacts and resource needs. The impact-based nature of the product is still relatively new to many of the professionals who will incorporate the product into their decision-making. As discussed above, although the WSSI does consider climatology in its calculation of categories, users nonetheless requested amounts for some components, in some cases intending to correlate those amounts with their internal processes for threshold decisions and, in essence, to determine their own categorization of the severity of impacts. An impact-based product by design works differently, and users appear to require more experience with the product and clarification about its goals to use it appropriately. Geographical differences, and particularly a region’s experience with winter weather, preparedness, and resources, all affect user interpretations of impact. Furthermore, impacts can differ even in the same area depending on the time of year.
Users who were less familiar with the product identified a need to ground truth the WSSI information. Training users over time, as well as careful explanation of categories, will help professionals as they incorporate severity levels into their operational systems. Understanding the factors that influence perspectives on impact levels, as well as the variable needs for winter weather information across regions, improves forecasters’ abilities to effectively communicate and provide critical information that helps end users to prepare for severe winter weather.
Acknowledgments.
This paper was prepared by the Nurture Nature Center, Inc., under Award NA20OAR4590355 from the Joint Technology Transfer Initiative Program of the National Oceanic and Atmospheric Administration (NOAA), U.S. Department of Commerce. The statements, findings, conclusions, and recommendations are those of the author(s) and do not necessarily reflect the views of NOAA or the U.S. Department of Commerce. The authors thank the seven National Weather Service offices who were partners for this study—Weather Prediction Center: Dr. Joshua Kastman, Dr. Kirstin Harnos, James Nelson, and Dr. Dana M. Tobin (CIRES); Grand Rapids, Michigan, Weather Forecast Office: Daniel Cobb, James Maczko, Walt Felver, and Brandon Hoving; Boulder, Colorado, Weather Forecast Office: Jennifer Stark, Paul Schlatter, and Gregory Hanson; Jackson, Mississippi, Weather Forecast Office: Chad Entremont, Eric Carpenter, and Thomas Winesett; Boston, Massachusetts, Weather Forecast Office: Andy Nash, Rodney Chai, and Andrew Loconto; San Joaquin/Hanford, California, Weather Forecast Office: Kristian Mattarochia and William South; Omaha, Nebraska, Weather Forecast Office: Suzanne Fortin, Brian Smith, Brian Barjenbruch, and Cathy Zapotocny.
Data availability statement.
All data created or used during this study are openly available at Harvard Dataverse (https://dataverse.harvard.edu/dataset.xhtml?persistentId=doi:10.7910/DVN/H0NGXR).
REFERENCES
Bean, H., J. Sutton, B. F. Liu, S. Madden, M. M. Wood, and D. S. Mileti, 2015: The study of mobile public warning messages: A research review and agenda. Rev. Comm., 15, 60–80, https://doi.org/10.1080/15358593.2015.1014402.
Broad, K., A. Leiserowitz, J. Weinkle, and M. Steketee, 2007: Misinterpretations of the “cone of uncertainty” in Florida during the 2004 hurricane season. Bull. Amer. Meteor. Soc., 88, 651–668, https://doi.org/10.1175/BAMS-88-5-651.
Call, D. A., 2010: A survey of county emergency managers’ response to ice storms. J. Homeland Secur. Emerg. Manage., 7, 32, https://doi.org/10.2202/1547-7355.1634.
Call, D. A., and J. S. Coleman, 2014: The decision process behind inclement-weather school closings: A case-study in Maryland, USA. Meteor. Appl., 21, 474–480, https://doi.org/10.1002/met.1359.
Campbell, R., D. Beardsley, and S. Tokar, 2018: Impact-based forecasting and warning: Weather Ready Nations. WMO Bull., 67, 10–13.
Casteel, M., 2016: Communicating increased risk: An empirical investigation of the National Weather Service’s impact-based warnings. Wea. Climate Soc., 8, 219–232, https://doi.org/10.1175/WCAS-D-15-0044.1.
Casteel, M., 2018: An empirical assessment of impact based tornado warnings on shelter in place decisions. Int. J. Disaster Risk Reduct., 30, 25–33, https://doi.org/10.1016/j.ijdrr.2018.01.036.
Dallo, I., M. Stauffacher, and M. Marti, 2020: What defines the success of maps and additional information on a multi-hazard platform? Int. J. Disaster Risk Reduct., 49, 101761, https://doi.org/10.1016/j.ijdrr.2020.101761.
Drobot, S. D., 2007: Evaluation of winter storm warnings: A case study of the Colorado Front Range December 20–21, 2006, winter storm. Natural Hazards Center Quick Response Rep., Vol. 192, 8 pp.
Drobot, S. D., C. Schmidt, and J. Demuth, 2008: The January 5–6, 2008, California winter storm: Assessing information sources, actions, and damages. Natural Hazards Center Quick Response Rep., Vol. 207, 27 pp.
Galluppi, K., J. Losego, and B. Montz, 2013: Evaluation of the effectiveness of the Central Region impact-based warning demonstration conducted by weather for emergency management decision support. NOAA, 28 pp., https://repository.library.noaa.gov/view/noaa/28893.
Harnos, K., J. Correia Jr., B. Albright, S. Trojniak, and J. Nelson, 2022: 12th Annual Winter Weather Experiment: Findings and results. Weather Prediction Center Hydrometeorological Testbed, 54 pp., https://www.wpc.ncep.noaa.gov/hmt/Final_Report_12th_Annual_WWE.pdf.
Hogan Carr, R., B. E. Montz, K. Semmens, K. Maxfield, S. Hoekstra, and E. Goldman, 2016a: Motivating action under uncertain conditions: Enhancing emergency briefings during coastal storms. Wea. Climate Soc., 8, 421–434, https://doi.org/10.1175/WCAS-D-16-0028.1.
Hogan Carr, R., B. E. Montz, K. Maxfield, S. Hoekstra, K. Semmens, and E. Goldman, 2016b: Effectively communicating risk and uncertainty to the public: Assessing the National Weather Service’s flood forecast and warning tools. Bull. Amer. Meteor. Soc., 97, 1649–1665, https://doi.org/10.1175/BAMS-D-14-00248.1.
Kiel, J., P. Petiet, A. Nieuwenhuis, T. Peters, and K. van Ruiten, 2016: A decision support system for the resilience of critical transport infrastructure to extreme weather events. Transp. Res. Proc., 14, 68–77, https://doi.org/10.1016/j.trpro.2016.05.042.
Kox, T., L. Gerhold, and U. Ulbrich, 2015: Perception and use of uncertainty in severe weather warnings by emergency services in Germany. Atmos. Res., 158–159, 292–301, https://doi.org/10.1016/j.atmosres.2014.02.024.
Kox, T., C. Lüder, and L. Gerhold, 2018: Anticipation and response: Emergency services in severe weather situations in Germany. Int. J. Disaster Risk Sci., 9, 116–128, https://doi.org/10.1007/s13753-018-0163-z.
LeClerc, J., and S. Joslyn, 2015: The cry wolf effect and weather‐related decision making. Risk Anal., 35, 385–395, https://doi.org/10.1111/risa.12336.
Meléndez-Landaverde, E. R., and D. Sempere-Torres, 2023: Design and evaluation of a community and impact-based site-specific early warning system (SS-EWS): The SS-EWS framework. J. Flood Risk Manage., e12860, https://doi.org/10.1111/jfr3.12860, in press.
Montz, B. E., K. J. Galluppi, J. L. Losego, and C. F. Smith, 2015: Winter weather decision-making: North Carolina school closures, 2010–2011. Meteor. Appl., 22, 323–333, https://doi.org/10.1002/met.1457.
Morss, R. E., J. L. Demuth, J. K. Lazo, K. Dickinson, H. Lazrus, and B. H. Morrow, 2016: Understanding public hurricane evacuation decisions and responses to forecast and warning messages. Wea. Forecasting, 31, 395–417, https://doi.org/10.1175/WAF-D-15-0066.1.
Morss, R. E., C. L. Cuite, J. L. Demuth, W. K. Hallman, and R. L. Shwom, 2018: Is storm surge scary? The influence of hazard, impact, and fear-based messages and individual differences on responses to hurricane risks in the U.S. Int. J. Disaster Risk Reduct., 30, 44–58, https://doi.org/10.1016/j.ijdrr.2018.01.023.
Nichols, A. C., and S. Hoekstra, 2011: Buses, bars, and breakdowns: Non-weather factors affecting decision-making at K-12 schools and universities during tornado warnings. 39th Conf. on Broadcast Meteorology, Seattle, WA, Amer. Meteor. Soc., http://ams.confex.com/ams/39BROADCAST/flvgateway.cgi/id/18157?recordingid=181.
Nurture Nature Center, 2023: Winter storm severity index: Improving storm readiness through severity and social impact forecasting. Final Rep., 177 pp., https://nurturenaturecenter.org/wp-content/uploads/2023/08/WSSI-Final-Report.pdf.
Potter, S., P. Kreft, P. Milojev, C. Noble, B. Montz, A. Dhellemmes, R. J. Woods, and S. Gauden-Ing, 2018: The influence of impact-based severe weather warnings on risk perceptions and intended protective actions. Int. J. Disaster Risk Reduct., 30, 34–43, https://doi.org/10.1016/j.ijdrr.2018.03.031.
Potter, S., S. Harrison, and P. Kreft, 2021: The benefits and challenges of implementing impact-based severe weather warning systems: Perspectives of weather, flood, and emergency management personnel. Wea. Climate Soc., 13, 303–314, https://doi.org/10.1175/WCAS-D-20-0110.1.
Ripberger, J. T., C. L. Silva, H. C. Jenkins-Smith, and M. James, 2015: The influence of consequence-based messages on public responses to tornado warnings. Bull. Amer. Meteor. Soc., 96, 577–590, https://doi.org/10.1175/BAMS-D-13-00213.1.
Savelli, S., and S. Joslyn, 2013: The advantages of predictive interval forecasts for non-expert users and the impact of visualizations: Advantages of predictive interval forecasts. Appl. Cognit. Psychol., 27, 527–541, https://doi.org/10.1002/acp.2932.
Schmidt, J., N. Tietze, L. Gerhold, and T. Kox, 2022: Requirements for the use of impact-based forecasts and warnings by road maintenance services in Germany. Adv. Sci. Res., 19, 97–103, https://doi.org/10.5194/asr-19-97-2022.
Sherman-Morris, K., 2013: The public response to hazardous weather events: 25 years of research. Geogr. Compass, 7, 669–685, https://doi.org/10.1111/gec3.12076.
Strong, C. K., Z. Ye, and X. Shi, 2010: Safety effects of winter weather: The state of knowledge and remaining challenges. Transp. Rev., 30, 677–699, https://doi.org/10.1080/01441640903414470.
Weather Prediction Center, 2022: Winter Storm Severity Index (WSSI) product description document. NOAA, 7 pp., https://www.wpc.ncep.noaa.gov/wwd/wssi/WSSI_PDD_2022-23.pdf.
Weyrich, P., A. Scolobig, D. N. Bresch, and A. Patt, 2018: Effects of impact-based warnings and behavioral recommendations for extreme weather events. Wea. Climate Soc., 10, 781–796, https://doi.org/10.1175/WCAS-D-18-0038.1.
WMO, 2015: WMO guidelines on multi-hazard impact-based forecast and warning services. WMO Doc. 1150, 34 pp., https://library.wmo.int/doc_num.php?explnum_id=7901.
Ye, Z., X. Shi, C. Strong, and T. Greenfield, 2009: Evaluation of the effects of weather information on winter maintenance costs. Transp. Res. Rec., 2107, 104–110, https://doi.org/10.3141/2107-11.