Abstract

As climate change unfolds, extreme weather events are on the rise worldwide. According to experts, extreme weather risks already outrank those of terrorism and migration in likelihood and impact. But how well does the public understand weather risks and forecast uncertainty and thus grasp the amplified weather risks that climate change poses for the future? In a nationally representative survey (N = 1004; Germany), we tested the public’s weather literacy and awareness of climate change using 62 factual questions. Many respondents misjudged important weather risks (e.g., they were unaware that UV radiation can be higher under patchy cloud cover than on a cloudless day) and struggled to connect weather conditions to their impacts (e.g., they overestimated the distance to a thunderstorm). Most misinterpreted a probabilistic forecast deterministically, yet they strongly underestimated the uncertainty of deterministic forecasts. Respondents with higher weather literacy obtained weather information more often and spent more time outside but were not more educated. Those better informed about climate change were only slightly more weather literate. Overall, the public does not seem well equipped to anticipate weather risks in the here and now and may thus also fail to fully grasp what climate change implies for the future. These deficits in weather literacy highlight the need for impact forecasts that translate what the weather may be into what the weather may do and for transparent communication of uncertainty to the public. Boosting weather literacy may help to improve the public’s understanding of weather and climate change risks, thereby fostering informed decisions and mitigation support.

1. Introduction

Extreme weather requires not only effective responses by institutions but also behavioral adaptations by individuals. Yet people seem to misjudge weather risks even under critical conditions. In the United States, compliance rates for warnings or evacuation orders are often low, at just 40%–60% (Gibbs and Holloway 2013; Nagele and Trainor 2012). The human and economic costs of extreme weather are an even more dramatic indicator that people tend to underestimate weather risks: Between 1995 and 2015, extreme temperatures, primarily heat waves, caused about 164 000 deaths, most of them in Europe (Centre for Research on the Epidemiology of Disasters 2015). In the United States, floods were the second deadliest weather hazard in the 30 years from 1988 to 2017 (e.g., Ashley and Ashley 2008). These statistics do not yet include more subtle and delayed risks, such as the rise in skin cancer incidence due to increased sun exposure and ozone depletion (Diepgen and Mahler 2002; Diffey 2003), or recurrent risks, such as elevated accident rates due to adverse weather conditions every fall and winter (e.g., Qiu and Nixon 2008). Climate change will amplify these risks even further as extreme events intensify and become more frequent (Beniston et al. 2007; Coumou and Rahmstorf 2012; IPCC 2012). Critically, people who misconceive weather risks may not only put themselves in immediate danger but may also be unlikely to grasp the tangible risks that climate change represents for the future.

The main aim of this study was to systematically assess the current state of weather literacy and awareness of evident climate change in a representative sample of the German public (N = 1004). By “weather literacy” we mean the ability to understand basic weather-related risks in order to anticipate and adapt to severe weather conditions—a key complement to other dimensions of citizen literacy, such as risk literacy (Hoffrage et al. 2000; Operskalski and Barbey 2016) and climate literacy (McCaffrey and Buhr 2008; Weber and Stern 2011). We focus on two major dimensions of weather literacy: understanding of weather risks and understanding of forecast uncertainty. These two dimensions are preconditions for anticipating weather risks and thus ultimately for reacting appropriately (e.g., Lazo et al. 2009).

To assess the public’s understanding of weather risks, we asked 53 factual questions dealing with when, where, and how quickly different weather conditions arise (i.e., what the weather may be) and with the awareness of their potential impacts (i.e., what the weather may do; World Meteorological Organization 2015). Factual questions make it possible to quantify people’s actual understanding, as opposed to ambiguous risk ratings or self-assessments, which may be overly confident (Sundblad et al. 2009). The questions were implemented in multiple-choice format or, for numerical estimates, in open-response format (see the methods section).

Three further questions probed respondents’ understanding of forecast uncertainty—that is, the ability to gauge the uncertainty of a deterministic forecast and to interpret a probabilistic forecast. Public forecasts and weather warnings are still mostly deterministic or use ambiguous verbal expressions of uncertainty (Budescu et al. 2014; Kox et al. 2015). Although people may expect forecasts to be uncertain (Lazo et al. 2009; Morss et al. 2008), their expectations may nonetheless be misguided if the degree of uncertainty is not explicitly communicated (Joslyn and Savelli 2010; Zabini et al. 2015). Following recommendations (American Meteorological Society 2008; National Research Council 2006; World Meteorological Organization 2008), professional users (e.g., emergency services) are now increasingly provided with probabilistic forecasts (Fundel et al. 2019). The general public is also likely to experience at least some probabilistic forecasts, such as the precipitation forecasts provided by the most frequently visited websites and “apps.” It is thus possible that the public’s understanding has been primed through increased exposure to precipitation forecasts (Abraham et al. 2015; Gigerenzer et al. 2005). Yet despite concerns that probabilities can be difficult for laypeople to understand (Spiegelhalter et al. 2011), explanations of how they are to be interpreted are often lacking, insufficient, or even inconsistent within and across countries.

Studies suggest that people may interpret anomalies in daily temperatures as evidence for climate change (Broomell et al. 2017), that experience of anomalies in local weather can increase people’s beliefs in climate change (Donner and McDaniels 2013; Howe et al. 2013; Li et al. 2011; Taylor et al. 2014), and that knowledge of its consequences can heighten their concern (Shi et al. 2016; Reser et al. 2014). At the same time, it is unclear whether people who are well informed about climate change are also more weather literate. Without understanding weather risks in the here and now, people may fail to fully grasp the tangible consequences of climate change for the future, continuing to perceive the weather risks posed by climate change as psychologically distant and abstract (McDonald et al. 2015; Weber 2006). To determine whether being well informed about climate change correlates with weather literacy, we tested whether the German public is aware of how climate in Germany has changed so far. In addition, we explored predictors of weather literacy and awareness of climate change. To put the public’s weather literacy into perspective, we also administered our questionnaire to a sample of meteorological experts (N = 144).

The overall aim was to comprehensively test the public’s weather literacy in order to identify potentially consequential misconceptions. This represents a first step toward successfully targeting misconceptions through effective communication and ultimately boosting the perception of weather risks today and in the future.

2. Methods

a. Participants and data collection

1) Representative sample of the public

We tested a representative national sample of 1004 people in Germany; age ranged between 14 and 93 years. Data were collected in January 2017 by a market research company [Gesellschaft für Konsumforschung (GFK)] as part of an omnibus survey. The sample was quota based such that it was representative for sex, age, household size, occupation of the householder, city size, and federal state of Germany. We report sociodemographic characteristics before and after poststratification [i.e., after applying sample weights, using the R package survey (Lumley 2004); see Table S1 in the online supplemental material]. Poststratification weights were used in all regression analyses of interindividual differences described below; all proportions reported in the main text were calculated without poststratification weights. Trained interviewers conducted the computer-assisted interviews in the respondents’ own homes. The interviewer read the questions and response options aloud and entered the responses into the computer. If preferred, respondents could enter the information themselves without the interviewer seeing their responses. The study was approved by the Institutional Review Board of the Max Planck Institute for Human Development. Respondents gave informed consent before beginning the survey and participated unpaid.
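The logic of the poststratification step can be illustrated with a minimal sketch. The study itself used the R package survey; the Python version below only shows the underlying idea, and the cell labels and shares are invented for illustration.

```python
# Minimal illustration of cell-based poststratification: each cell's weight
# is its population share divided by its share in the sample. The study used
# the R package `survey`; the cells and shares here are hypothetical.
def poststrat_weights(sample_counts: dict, population_shares: dict) -> dict:
    """Weight for each cell = population share / sample share."""
    n = sum(sample_counts.values())
    return {
        cell: population_shares[cell] / (sample_counts[cell] / n)
        for cell in sample_counts
    }

# Example: a group making up 52% of the population but only 40% of the
# sample is weighted up; the complementary group is weighted down.
weights = poststrat_weights({"women": 40, "men": 60},
                            {"women": 0.52, "men": 0.48})
```

Analyses of interindividual differences then multiply each respondent's contribution by the weight of his or her cell, so that the weighted sample matches the population margins.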

2) Expert sample and data collection

The survey answers of employees of the German National Weather Service [Deutscher Wetterdienst (DWD)] were used as an expert benchmark. Of 244 DWD experts who completed the survey, we retained the 144 who indicated not only that they had received training in meteorology or related disciplines (physics, geography, etc.), but also that their current tasks involve meteorological issues and questions. Their participation was unpaid and anonymous, and they answered the same questions as the public sample, but online. As meteorologists typically consider varying spatial reference classes, we added a clarification to the false alarms and misses questions, specifying that they referred to the area in which thunder could be heard. We emphasized at the beginning of the study that it was crucial that all questions be answered alone and without looking up any information. At the end of the study, four (of 244) experts indicated having looked up information; their data were excluded from all analyses. When they had completed the survey, respondents received feedback about the number of questions they had answered correctly, and after data collection was completed, they received the full questionnaire with the correct answers.

b. Survey materials and procedure

In total, respondents answered 62 factual questions testing the two dimensions of weather literacy—understanding of weather risks due to specific weather conditions (53 questions) and understanding of forecast uncertainty (3 questions)—as well as the understanding of evident climate change in Germany (6 questions). The questions were selected and constructed in collaboration with specialists, based on established research findings in the relevant fields (e.g., meteorology, climatology, physics, dermatology, chemical engineering, medicine, physiology, forensic medicine), and pretested in the laboratory and online. The questions pertaining to the understanding of weather risks, forecast uncertainty, and climate change were presented in separate blocks. The order of blocks was randomized, as was the order of questions within each block.

All questions (in their English translation), their correct answers, and supporting references are available in the online supplemental material and (in German and English) on the Open Science Framework (https://osf.io/vgsc8/). [Information about the project is also available online (https://www.weatherliteracy.info).]

1) Understanding of weather risks

Weather risks can arise from a broad variety of specific weather conditions. To cover a large number of these heterogeneous risks, we included 53 questions covering the topics of heat, UV radiation, windstorms, thunderstorms, intense rain, and ground frost.

For heat, UV radiation, thunderstorms, and intense rain, respondents were presented with nine statements per topic and identified each statement as correct or incorrect (i.e., multiple-choice format; see Table A1 in  appendix A along with the online supplemental material for the exact wording). For instance, respondents had to mark the following statement about heat as “correct” or “incorrect”: During heat waves, more people die than usual. Most of them are aged 65 years or above. Of the nine statements, eight were factually correct and one was incorrect. To counteract potential response tendencies driven by participant assumptions about the proportion of correct items, the instructions stated that “None, one, several or all answers may be correct.” The order of statements was randomized, as was the order of weather conditions (except for heat and UV radiation, which were always presented in a joint block).

To test whether respondents are able to connect meteorological conditions to their potential impacts, we included numerical estimation questions about severe wind speeds (nine questions), distance to a thunderstorm (one question), ground frost (one question), and rain (six questions). For all these estimation questions (except rain; see below), we elicited numerical estimates using an open-response format. This approach was taken to avoid leaking information about the likely correct values through the range of response options offered (see Schwarz 1999). The estimation questions always preceded the binary-question blocks on the respective weather condition so that the estimates would not be influenced by the multiple-choice questions.

Weather services usually recommend taking cover as soon as an approaching thunderstorm is about 10 km (6 mi) away—a critical distance across which lightning can easily strike. To assess whether people can approximately gauge the distance to an approaching thunderstorm to seek shelter in time, we asked respondents to estimate “How far away is a thunderstorm if there is a 30-second gap between the lightning and the thunder?” (in kilometers; Fig. 1a). Estimates were entered in an open input field [for the same procedure, see Keul et al. (2018, 2009)].

Fig. 1.

Understanding of weather risks: connecting weather conditions to their impacts (cf. Fig. B1 in  appendix B for experts’ understanding). (a) Estimates of the distance to a thunderstorm when there is a 30-s gap between thunder and lightning (correct: 10 km; in light blue, with an error margin of ±20%). (b) Estimates of the air temperature from which a ground frost can occur [correct: 4°C; in light blue, with an error margin from 1°C (34°F) to 7°C (45°F), inclusive]. (c) Distribution of wind speed estimates for different wind forces presented either as descriptions of impacts (e.g., “trees are uprooted”) or as verbal labels (e.g., “storm”). For “violent storm,” only the impact description was tested (see the methods section). The distributions of estimates are shown as kernel density “violins”; the horizontal lines indicate the median. The light blue dot indicates the true value, and the blue vertical range shows an error margin of ±1 Beaufort unit. (d) Interpretations of the meaning of “10 liters of rainfall per square meter” (correct: 10 mm; in light blue).


To prevent accidents on icy roads, drivers should be particularly careful as soon as air temperatures drop to about 4°C (39°F), especially on bridges, after a cold night, or in low-lying areas. Under these conditions, the temperature just above ground can be lower than the forecast air temperature, which is typically measured 2 m above ground. Many drivers may be unaware of the risk at temperatures above 0°C (32°F). We thus included a question that asked respondents to estimate from what air temperature (in degrees Celsius) a ground frost can occur (Fig. 1b). They were informed that the weather forecast normally reports the air temperature that is measured 2 m above the ground, whereas, for a ground frost to occur, the temperature just above the ground falls to 0° or below.

Wind forecasts and warnings in Germany are typically communicated in terms of wind speed in kilometers per hour and by using verbal category labels such as “storm” for winds of different severity. Although this is common practice, it is unclear how well laypeople are able to connect wind speeds to their corresponding labels or potential impacts. We administered two sets of items on windstorms. In one set, the items presented verbal labels for winds of different severity (ranging from gale to hurricane force, corresponding to forces 8–12 on the Beaufort scale), and respondents were asked to state the wind speeds described by each label. In the other set, the items described the corresponding observable impacts (e.g., outdoor furniture blown away, trees uprooted), and respondents were asked to state the wind speeds they would expect to have the respective impacts. Estimates for each question were entered in an open input field (in kilometers per hour; Fig. 1c). The impact questions pertained to the impacts of a gale, severe gale, storm, violent storm, or hurricane-force wind. The label questions pertained to the same wind intensities with one exception: “violent storm” was not presented as a label because the German term (orkanartiger Sturm) combines the two categories “storm” and “hurricane” into literally “hurricane-like storm,” which we suspected would confuse German respondents in terms of its ranking among the other labels. For each set, we presented the questions in a fixed order of increasing wind speeds, as a pretest showed that people could easily order wind conditions based on either labels or impact descriptions despite being uncertain about the magnitudes of the wind speed (see the distinction between mapping and metric knowledge about quantities; Brown and Siegler 1993).

To anticipate the risk of flooding, people need to understand how much precipitation to expect. Forecasts provide this information either as volume [in liters per square meter (L m−2)] or, increasingly, as precipitation height (mm). A potential advantage of the latter format is that it translates the volume per m2 into a statement of how high the water level will be, which may make it easier to imagine the potential impact. Because our main interest was in respondents’ intuitions about orders of magnitude rather than in their ability to calculate one unit from the other, participants did not have to enter an estimate; instead, they indicated whether six response options were “correct” or “incorrect” (see Fig. 1d and the online supplemental material). The response options corresponded to 10 L of rainfall per square meter (where L indicates liters) being interpreted as either 10 mm (correct), 10 cm, or 10 cm³ of precipitation. This question always preceded the block on intense rain so that no cues were given to the respondents (Fig. 1d).

2) Understanding of forecast uncertainty

Three questions probed the second dimension of weather literacy, respondents’ understanding of forecast uncertainty. In principle, it would also be possible to probe people’s understanding of forecast uncertainty across a broad range of weather risks (for people’s varying perception of other deterministic forecasts, see Joslyn and Savelli 2010). However, understanding forecast uncertainty requires a basic conceptual grasp of the fact that forecasts are uncertain even when no uncertainty is stated; and even when uncertainty is stated explicitly as a probability, that probability must still be interpreted correctly for the understanding to be complete. We therefore tested people’s ability to gauge the uncertainty of one deterministic forecast for a frequent but hard-to-predict event (thunderstorm in summer), as well as the ability to interpret one common probabilistic forecast about rain.

For the deterministic forecast, we asked respondents to separately estimate the probability of a false alarm and the probability of a missed event for a summer thunderstorm forecast with 24-h lead time. The two questions were presented in random order. For the probability of a false alarm, respondents imagined 10 separate days in summer when the afternoon weather forecast predicted a thunderstorm in their area the next afternoon. They were asked to estimate on how many afternoons there would actually be a thunderstorm (correct alarms) and on how many there would be no thunderstorm (false alarms).

For the probability of a missed event, respondents imagined 10 separate days in summer on which there was a thunderstorm in their area. They were asked to estimate how many of those thunderstorms were actually predicted one day before (detected events) and how many were not (missed events). In order not to focus respondents’ attention only on the potential errors of a forecast, we asked respondents to enter both the number of correct and the number of incorrect predictions for each of these questions. Both numbers had to sum up to 10 before participants could proceed (for the same procedure, see Joslyn and Savelli 2010).

For the probabilistic forecast, respondents were asked to select the best interpretation of a forecast of “30% chance of rain tomorrow” (Gigerenzer et al. 2005; Murphy et al. 1980): “It will rain on 30 percent of the days for which this forecast is issued” (correct, probabilistic interpretation), “It will rain tomorrow in 30 percent of the area for which this forecast is issued” (incorrect, spatial interpretation), or “It will rain tomorrow for 30 percent of the time” (incorrect, temporal interpretation).

3) Awareness of climate change

Six questions tested respondents’ awareness of climate change in Germany since 1880, which is the “preindustrial” baseline against which the 2°C limit in global temperature rise is measured. The six questions represent standard indicators of climate change that the German National Weather Service describes in its official climate report for the public (Deutscher Wetterdienst 2017): 1) average precipitation per year; 2) average temperature per year; number of days per year with 3) at least 10 L of precipitation per square meter, 4) high temperatures (hot days above 30°C or 86°F), 5) low temperatures (cold days below 0°C or 32°F during daytime); and 6) severity of windstorms. Respondents indicated whether they thought each aspect had declined, remained unchanged, or increased since 1880.

4) Individual differences in weather literacy and climate change awareness

To explore which individual characteristics predict weather literacy and awareness of climate change, we asked respondents to report the number of hours they generally spent outside per week in summer (Demuth et al. 2011) by selecting from eight categories ranging from “0–5 h/week” to “more than 35 h/week,” and how often they consulted any kind of media (e.g., mobile apps, radio, television) for weather information (Lazo et al. 2009; Stewart et al. 2012) by selecting from six categories ranging from “twice or more per day” to “rarely or never.”

In addition, the survey company provided 30 sociodemographic variables by default. From these 30 variables, we removed redundant predictors by identifying all pairwise correlations above 0.5 and then deleting 14 variables that were derived from, or a coarser version of, another variable in the set. Pooled over the retained 18 predictors and respondents, fewer than 1% of the sociodemographic data entries were missing, all stemming from the following three variables: household net income (23% of responses missing), educational level, and family status (both less than 1% of responses missing). To impute missing data, we used a nonparametric method based on a random forest algorithm [using the R package missforest (Stekhoven and Bühlmann 2011) with its default settings].

3. Results

a. Understanding of weather risks

Fifty-three questions probed the first dimension of weather literacy, respondents’ understanding of risks due to heat, UV radiation, windstorms, thunderstorms, intense rain, and ground frost. Overall, respondents from the general public answered a median (Mdn) of 66% of the questions correctly [interquartile range (IQR) = 60%–74%]; for experts, the median was 87% (IQR = 79%–91%). The proportion of correct answers per question ranged from 24% to 96%, with the median question being answered correctly by 71% of respondents (IQR = 52%–81%) (for experts: Mdn = 90%; IQR = 77%–95%; range = 35%–100%). The internal reliability of this set of questions was 0.76 (Revelle’s ω total; McNeish 2018).

1) Awareness of weather risks

Many respondents misjudged several critical weather risks even when asked to simply identify a statement as correct or incorrect (Table 1; for all results, see Table A1 in  appendix A). For instance, 67% of respondents from the public (experts: 50%) falsely regarded heatstroke as a mild condition. Furthermore, 66% (experts: 4%) falsely believed that higher temperatures mean higher UV radiation levels and may thus not protect themselves sufficiently around noon, when UV radiation peaks while temperatures are still rising.

Table 1.

Weather literacy and awareness of climate change in a representative sample of the German population.a


2) Connecting weather conditions to their impact

Respondents especially struggled to connect weather conditions to their impacts (Fig. 1; for experts, see Fig. B1 in  appendix B).

(i) Thunderstorm

When an approaching thunderstorm is about 10 km (6 mi) away, weather services usually recommend that people take cover, as this is a critical distance across which lightning can strike. A simple way to estimate the distance (in kilometers) is to count the seconds between seeing the lightning flash and hearing the thunder and divide that number by 3, because sound travels at about 343 m s−1 (meters per second), that is, roughly 1 km every 3 s. A 30-s gap (where again s indicates seconds) thus indicates that a thunderstorm is about 10 km (6 mi) away. However, only 24% of respondents (experts: 79%) estimated this distance to within ±20% (Fig. 1a). Thirty-four percent (experts: 6%) believed the thunderstorm to be less than 8 km (5 mi) away. Importantly, 42% (experts: 15%) believed it to be more than 12 km (7 mi) away. The most common answer (27%) was 30 km (19 mi; experts: 10%), 3 times the true critical distance. These results are consistent with findings from a smaller sample of the Austrian public (N = 133; Keul et al. 2009), where the modal estimate (55%) for a 3-s gap was 3 km. Both results are compatible with the idea that many respondents falsely believed a 1-s gap to correspond to a distance of 1 km and thereby overestimated by a factor of 3. These respondents are thus likely to overestimate the time they have left to seek shelter.
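The divide-by-3 rule of thumb follows directly from the speed of sound; a minimal sketch:

```python
# Sound travels at about 343 m/s, i.e. roughly 1 km every 3 s, which is why
# dividing the lightning-to-thunder gap (in seconds) by 3 approximates the
# distance to the storm in kilometers.
SPEED_OF_SOUND_M_PER_S = 343

def storm_distance_km(gap_seconds: float) -> float:
    """Approximate distance to a thunderstorm from the flash-to-thunder gap."""
    return gap_seconds * SPEED_OF_SOUND_M_PER_S / 1000

# A 30-s gap puts the storm at about 10.3 km, right at the critical
# take-cover distance; mistaking "1 s = 1 km" would place it at 30 km.
distance = storm_distance_km(30)
```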

(ii) Ground frost

Drivers should expect roads to become icy as soon as air temperatures fall to about 4°C (39°F). Especially on bridges, after a cold night, or in low-lying areas, the temperature at ground level can be lower than the forecast air temperature, which is measured 2 m above ground. Respondents’ estimates of the air temperature from which a ground frost can occur were scored as correct from 1°C (34°F) to 7°C (45°F) inclusive, to acknowledge that ground frost is possible above 0°C yet highly unlikely at 8°C (46°F) or above. Although weather forecasts frequently warn that bridges ice before roads, and although today’s cars have a frost warning indicator that alerts the driver when the temperature drops to within a few degrees of freezing, only 55% of respondents (experts: 90%) answered correctly. Nearly half of the respondents, 44% (experts: 6%), falsely estimated that icy conditions are only possible at air temperatures of 0°C (32°F) or below (Fig. 1b) and may therefore fail to adapt their driving behavior appropriately.

(iii) Wind

To acknowledge the uncertainty of wind forecasts and the fact that impacts always depend on additional circumstances, we scored wind speed estimates as correct if they were within ±1 Beaufort unit of the true value (i.e., within the corresponding wind speed ranges in kilometers per hour, with the ends of the range rounded to the nearest multiple of 5 to acknowledge the typical coarseness of estimates; see Table A1 in  appendix A for all results).
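The scoring rule can be sketched as follows. The km/h boundaries below are taken from the standard Beaufort scale and are our assumption (the study's exact ranges are in Table A1); range ends are rounded to the nearest multiple of 5 as described above.

```python
# Hypothetical sketch of the ±1-Beaufort scoring rule. The km/h boundaries
# follow the standard Beaufort scale (an assumption here, not the study's
# table); range ends are rounded to the nearest multiple of 5.
BEAUFORT_KMH = {  # force: (lower, upper) bound in km/h
    7: (50, 61), 8: (62, 74), 9: (75, 88),
    10: (89, 102), 11: (103, 117),
    12: (118, 133),  # force 12 is open-ended; upper value illustrative
}

def round_to_5(x: float) -> int:
    """Round to the nearest multiple of 5."""
    return int(5 * round(x / 5))

def estimate_is_correct(estimate_kmh: float, true_force: int) -> bool:
    """Score an estimate as correct if it lies within ±1 Beaufort unit."""
    lower = round_to_5(BEAUFORT_KMH[true_force - 1][0])
    upper = round_to_5(BEAUFORT_KMH[true_force + 1][1])
    return lower <= estimate_kmh <= upper

# For a "storm" (Beaufort 10), estimates anywhere from force 9 to force 11
# count as correct, so 95 km/h passes while 130 km/h does not.
storm_check = (estimate_is_correct(95, 10), estimate_is_correct(130, 10))
```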

Respondents tended to overestimate the wind speeds at which serious damage can occur, especially for more severe wind conditions (Fig. 1c; see also Agdas et al. 2012, who found that people overestimate severe wind speeds experienced in a simulator). Overestimating wind speeds related to severe wind conditions and impacts may be problematic for two reasons. On the one hand, it may lead to people taking action before it is necessary [e.g., “shadow evacuating,” where people evacuate unnecessarily, causing traffic problems for those who do need to leave immediately; for this line of argument, see Agdas et al. (2017)]. On the other hand, it implies that people may underestimate the damage that can be caused by lower wind speeds. For example, one in three respondents overestimated the wind speed of a “storm” (Beaufort 10; >90 km h−1 or 56 mi h−1, where the units indicate kilometers or miles per hour, respectively), by one or more Beaufort units based on verbal labels (34%; experts: 13%) or impacts (34%; experts: 19%). A wind speed of 90 km h−1 can uproot trees, but these respondents would expect such impacts only at 105 km h−1 or more, a wind speed that can seriously damage walls and blow off roofs. Although people may take protective action at lower wind speeds, the results indicate that a considerable proportion of people will misjudge the risk of severe wind conditions based on forecast wind speeds or categories alone. Communicating weather conditions alongside their potential impact may thus help to improve the public’s risk perception.

(iv) Intense rain

To help people anticipate the risk of flooding, forecasts can describe precipitation either as volume (L m−2) or as height (mm). Whereas volume statements require people to translate liters per square meter into a water level, precipitation height directly states how high the water level will be if no water is able to drain, evaporate, or seep away. This may make it easier to imagine the potential impact. Yet only 49% (experts: 95%) of respondents correctly selected one of the two correct answers indicating that 10 L of rainfall per square meter is equivalent to 10 mm of rain (Fig. 1d). Instead, 51% (experts: 6%) believed that it translates to a depth of 10 cm—a full 10 times too high. If 10 mm of precipitation were forecast, these respondents may expect just 1 L m−2 of rain instead of 10 L m−2—10 times too little. This may result from mistakes in converting between units, but also from the fact that water accumulation is difficult to imagine. The results align with previous findings demonstrating that people substantially underestimate rain intensity even based on experience in a simulator (Agdas et al. 2017) and may thus fail to respond appropriately.
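The unit conversion behind this question is a one-line calculation; a minimal sketch:

```python
# Why 10 L per m^2 equals 10 mm: 1 m^3 holds 1000 L, so 10 L spread evenly
# over 1 m^2 forms a layer 0.01 m, i.e. 10 mm, deep.
def rain_depth_mm(litres: float, area_m2: float = 1.0) -> float:
    """Water depth (mm) from a rainfall volume in litres over a given area."""
    volume_m3 = litres / 1000.0    # litres -> cubic metres
    depth_m = volume_m3 / area_m2  # depth = volume / area
    return depth_m * 1000.0        # metres -> millimetres

# 10 L per m^2 comes out as 10 mm -- not the 10 cm many respondents assumed.
depth = rain_depth_mm(10)
```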

Overall, the results demonstrate that the public does not seem well equipped to infer weather risks and impacts from forecast weather conditions alone, and may therefore profit from forecasts that communicate impacts rather than weather conditions alone.

b. Understanding of forecast uncertainty

Three questions probed respondents’ estimates of the uncertainty in a deterministic forecast (numbers of false alarms and misses) and their interpretation of a probabilistic forecast. Overall, respondents struggled to understand forecast uncertainty, answering a median of only one of the three questions correctly (33%; IQR = 0%–33%) (for experts: 33%; IQR = 33%–67%) (Fig. 2; for experts, see Fig. B2 in appendix B). The proportion of correct answers per question ranged from 13% to 44%, with the median question being answered correctly by 23% of respondents (for experts: Mdn = 35%; range = 28%–77%).

Fig. 2.

Understanding of forecast uncertainty (cf. Fig. B2 in appendix B for experts’ understanding). (a) Reference class selected for the forecast “There is a 30% chance of rain tomorrow” by percentage of respondents (correct: “days with this forecast”). (b) Estimates of how many local thunderstorm forecasts are not followed by an actual thunderstorm within 24 h (false alarms; correct = 80%) and how many actual thunderstorms are not forecast 24 h in advance (misses; correct = 40%). The heat map visualizes the joint distribution of the two estimates across respondents. The two histograms display the marginal distribution of the respective estimates. The light blue lines below the histograms indicate the region of estimates that are within a range that accounts for spatial and interannual variation in Germany.

Only 23% (experts: 77%) correctly selected the appropriate reference class for a forecast of a “30% chance of rain tomorrow” (correct: “It will rain on 30% of the days for which this forecast is issued”), whereas 47% (experts: 21%) interpreted the forecast in terms of the proportion of the area affected (Fig. 2a). This spatial interpretation implies a deterministic misinterpretation of forecast uncertainty (i.e., “It will definitely rain somewhere, but maybe not exactly where I am”; Joslyn and LeClerc 2013; Joslyn et al. 2009). Although this misinterpretation may not necessarily lead to harmful actions, it may negatively affect people’s trust in forecasts (Ripberger et al. 2014; Simmons and Sutter 2009; LeClerc and Joslyn 2015): If it does not rain anywhere in the area, the forecast constitutes a false alarm within the spatial interpretation, but it is entirely reconcilable with the correct, probabilistic interpretation. A spatial misinterpretation could likely be avoided if weather services routinely communicated the reference class in a transparent and consistent way.
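The correct, frequency-based reading of the forecast can be illustrated with a small simulation (a sketch, not the survey's analysis): across many days on which a "30% chance of rain" forecast is issued, it rains on roughly 30% of them, regardless of how much of the area gets wet on any single day.

```python
import random

# Frequency interpretation of "There is a 30% chance of rain tomorrow":
# the reference class is days with this forecast, not the share of the
# area affected. Simulate many such forecast days.
random.seed(0)

forecast_days = 100_000
rainy_days = sum(random.random() < 0.30 for _ in range(forecast_days))
share = rainy_days / forecast_days  # converges to 0.30 in the long run
```

A dry day is therefore not a false alarm under the probabilistic interpretation; it is simply one of the roughly 70% of such days without rain.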

Deterministic forecasts are not a viable alternative, however; respondents underestimated the uncertainty inherent in a common deterministic forecast about thunderstorms (Fig. 2b). To account for spatial and interannual variation across Germany, we scored estimates as correct if they fell within ±10 percentage points (pp) of the true values provided by DWD. When asked how many (of 10) local thunderstorms were not forecast 24 h in advance (misses), only 44% (experts: 35%) estimated the correct proportion (40% ± 10 pp). Even fewer respondents (13%; experts: 28%) correctly estimated how many local thunderstorm forecasts were not followed by an actual thunderstorm within 24 h (false alarms; 80% ± 10 pp).1 Respondents slightly underestimated the proportion of misses (40% ± 10 pp; Mdn estimate = 30%; IQR = 20%–50%) (for experts: Mdn = 30%; IQR = 20%–43%) but vastly underestimated the proportion of false alarms (80% ± 10 pp; Mdn estimate = 40%; IQR = 20%–50%) (for experts: Mdn = 50%; IQR = 30%–70%). Because thunderstorms are hard to predict, ensuring a low number of misses can currently only be achieved by tolerating many false alarms. Yet only 34% (experts: 60%) correctly expected more false alarms than misses, whereas 45% (experts: 28%) falsely expected the same number for both. Although more experts than members of the public expected more false alarms than misses, they nevertheless also had difficulties estimating the absolute numbers. In fact, 72% of the experts and 87% of the public expected too few false alarms. Both questions revealed a conspicuously pronounced use of 50% responses. According to previous research on how people numerically interpret verbal probability statements, respondents may use “fifty–fifty” as an expression of uncertainty rather than a genuine numerical estimate (Bruine de Bruin et al. 2000). Subtracting the expected proportions of numeric “50%” responses (for methods, see Bruine de Bruin et al. 2000) from the observed proportions revealed that an estimated 14% of respondents may have used “50%” to express uncertainty (“I don’t know”) about the number of false alarms and an estimated 12% may have done so about the number of missed events.2
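In verification terms, the two survey quantities are the false alarm ratio (share of forecasts not followed by the event) and the miss rate (share of events not forecast). A minimal sketch with hypothetical counts, chosen only so that they reproduce the DWD values of 80% and 40%:

```python
# False alarm ratio: FAR = FA / (hits + FA)
# Miss rate:          MR = misses / (hits + misses)
def false_alarm_ratio(hits: int, false_alarms: int) -> float:
    return false_alarms / (hits + false_alarms)

def miss_rate(hits: int, misses: int) -> float:
    return misses / (hits + misses)

# Hypothetical counts (for illustration only): 6 hits, 4 misses,
# 24 false alarms give FAR = 0.8 and MR = 0.4, matching the DWD values.
hits, misses, false_alarms = 6, 4, 24
```

The asymmetry is visible in the counts: keeping misses at 4 out of 10 thunderstorms requires tolerating 24 false alarms for only 6 hits, which is why a forecast system for hard-to-predict events must raise far more false alarms than it misses events.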

Consistent with findings from a local sample in the United States (Washington and Oregon), our results indicate that people generally understand that forecasts are uncertain, yet this appreciation alone is not sufficient for people to estimate the degree of uncertainty (Joslyn and Savelli 2010). To what extent uncertainty is over- or underestimated depends on the forecast [for people’s varying perception of other deterministic forecasts, see Joslyn and Savelli (2010)] and should be further investigated by including analogous questions for other critical weather forecasts. Deterministic forecasts that lack any indication of uncertainty pose a problem not just for laypeople; indeed, even our experts struggled to estimate the true uncertainty of a deterministic forecast. People may expect deterministic forecasts to be more uncertain than they are (Joslyn and Savelli 2010) or form overly confident expectations, as with the forecast tested here. If the true uncertainty is not communicated, people may take unintended risks.

c. Weather literacy and awareness of climate change

Respondents indicated whether six indicators of climate change had increased, decreased, or remained unchanged in Germany since 1880 (Fig. 3; for experts, see Fig. B3 in  appendix B). Overall, respondents answered a median of 50% of the questions about climate change correctly (IQR = 33%–67%) (for experts: 67%; IQR = 50%–83%). The proportion of correct answers per question ranged from 16% to 70%, with the median question being answered correctly by 51% of respondents (IQR = 38%–58%) (for experts: Mdn = 68%; IQR = 38%–94%; range = 17%–97%). The internal reliability of this set of questions was 0.77 (Revelle’s ω total; McNeish 2018).

Fig. 3.

Awareness of climate change (cf. Fig. B3 in appendix B for experts’ awareness). Responses to six indicators of climate change in Germany since 1880 (in terms of the number of days or average value per year) are shown. The light blue lines indicate the correct answer. Only 3% of respondents believed that weather conditions have remained completely unchanged, indicating that the German public is largely aware that climate change has already had observable effects and is not just a problem for the future.

Seventy percent of respondents (experts: 97%) knew that the average temperature has increased. However, 55% (experts: 83%) believed that average precipitation has not changed or that it has decreased; in fact, it has increased. Fifty-six percent of respondents (experts: 62%) believed that the number of days with high precipitation has increased, and 80% (experts: 49%) believed the same for storm intensity. Although no change has been detected for either indicator in Germany (Deutscher Wetterdienst 2017; Feser et al. 2015), days with high precipitation have increased on a continental scale (Fischer and Knutti 2016; Fischer et al. 2013).3 The results suggest that both public and expert beliefs may reflect more recent events and the broader media coverage of extreme events (Weber and Stern 2011) such as storms and flash floods.

But did respondents with better awareness of climate change also have a better understanding of weather risks? To test this relationship, we correlated across respondents the proportion of questions answered correctly about climate change and weather risks.4 Respondents with a better awareness of climate change had only a slightly better understanding of weather risks {Spearman’s ρ = 0.10, with 95% highest posterior density interval (HDI) of [0.04, 0.16]; experts: ρ = 0.35, with 95% HDI of [0.20, 0.50]}.5 The understanding of weather risks and the awareness of climate change thus seem to represent two different kinds of knowledge.
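The analysis idea behind this correlation can be sketched in a few lines: Spearman's ρ is the Pearson correlation of the rank-transformed scores. The sketch below ignores tie handling for simplicity, and the toy data are invented for illustration:

```python
# Spearman's rho as the Pearson correlation of ranks (no tie handling).
def ranks(xs):
    order = sorted(range(len(xs)), key=lambda i: xs[i])
    r = [0.0] * len(xs)
    for rank, i in enumerate(order):
        r[i] = float(rank + 1)
    return r

def spearman(xs, ys):
    rx, ry = ranks(xs), ranks(ys)
    n = len(xs)
    mx, my = sum(rx) / n, sum(ry) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    sx = sum((a - mx) ** 2 for a in rx) ** 0.5
    sy = sum((b - my) ** 2 for b in ry) ** 0.5
    return cov / (sx * sy)

# Toy example: perfectly monotone scores yield rho = 1; the survey's
# observed rho of 0.10 indicates only a weak monotone association.
climate_scores = [0.2, 0.5, 0.6, 0.9]
weather_scores = [0.1, 0.3, 0.4, 0.8]
```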

d. Individual differences in weather literacy and climate change awareness

The goal of our subsequent, exploratory analyses was to find predictors of respondents’ weather literacy and awareness of climate change. As the dependent variable was binary (correct vs incorrect), we ran three logistic regression models—separately for the two dimensions of weather literacy and for awareness of climate change. The regression models predict whether a question was answered correctly based on a set of sociodemographic and other relevant interindividual differences (e.g., how frequently people obtained weather information, or how many hours per week they spent outside). Because each respondent provided multiple responses (one per item), we used hierarchical logistic regression models (also known as “mixed effect” or “mixed level” logistic regression models), which allowed the probability of a correct answer to vary across respondents, respondents’ federal state of residence, and items (see the note to Table S2 in the online supplemental material for detailed descriptions of the predictors, regression models, and results). All models incorporated poststratification weights to ensure that the analyses are representative of the German population (Table S1 in the online supplemental material).
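The poststratification idea mentioned above can be sketched as follows: each respondent is weighted by the population share of their stratum divided by its sample share, so that under-represented groups count more. The strata and shares below are invented for illustration and are not the survey's actual weighting cells:

```python
# Poststratification weights (illustrative strata and shares, not the
# survey's actual cells): weight = population share / sample share.
population_share = {"18-39": 0.32, "40-64": 0.42, "65+": 0.26}  # assumed
sample_share     = {"18-39": 0.40, "40-64": 0.40, "65+": 0.20}  # assumed

weights = {g: population_share[g] / sample_share[g] for g in population_share}
# Under-represented strata (here "65+") receive weights > 1 and
# over-represented strata weights < 1, so weighted analyses match the
# population composition.
```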

The understanding of weather risks was higher among people who obtained weather information more often, used the internet more frequently, or spent more hours outside. That many people now live and work with reduced direct and immediate exposure to weather conditions (e.g., spending time in offices, underground transportation, or gyms; Soga and Gaston 2016) thus seems detrimental to weather literacy. The same three individual differences predicted awareness of climate change, with the exception that awareness was higher not only among respondents who spent much time outside but also among those who spent little time outside (curvilinear effect). On the one hand, these findings could indicate that the understanding of both weather risks and climate change may depend on people’s experience of forecasts and the ensuing weather. On the other hand, for people who rarely spend time outside, the understanding of evident climate change may be based on indirect information (e.g., obtained online) rather than on direct, personal experience. And whereas previous research has found level of education to be linked to self-reported awareness of climate change (Lee et al. 2015), respondents with a higher level of education had no better factual understanding of evident climate change in Germany than those with a lower level of education. Likewise, understanding of forecast uncertainty did not improve with higher levels of education [for varying results on the impact of education on the interpretation of probabilistic forecasts, see Abraham et al. (2015) and Grounds and Joslyn (2018)].

Overall, these exploratory analyses suggest that the understanding of climate change risks may often be based on abstract information rather than on a concrete understanding or even experience of weather risks. Improving the public’s understanding of weather risks by communicating both weather conditions and their impact could thus also help to make the risk that climate change implies for the future more concrete (McDonald et al. 2015).

4. Discussion

Our results indicate deficits in the public understanding of weather risks. Critically, the findings show that people cannot easily infer weather risks from forecasts. For instance, only about half the respondents were aware that ground frost can occur at air temperatures above 0°C (32°F). Moreover, the understanding of weather risks was only weakly related to the awareness of climate change. This suggests that current efforts to inform the public about climate change do not necessarily improve people’s understanding of the tangible weather risks that it implies. Without weather literacy, the current and future risks stemming from climate change remain abstract and psychologically distant and may not impress a need for swift mitigation measures (Broomell et al. 2015; McDonald et al. 2015). The results from Germany may well generalize to other countries where people spend little time outdoors and are thus unlikely to learn about natural risks from personal experience (Soga and Gaston 2016).
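One such inference that the survey tested (see Fig. B1a) is the distance to a thunderstorm given the gap between lightning and thunder. The back-of-the-envelope rule relies only on the speed of sound (roughly 340 m s−1 near the ground, varying with temperature):

```python
# Rule of thumb behind the thunderstorm-distance question: the seconds
# between lightning and thunder, times the speed of sound, give the
# distance to the storm. 340 m/s is an approximation.
SPEED_OF_SOUND_M_S = 340.0

def storm_distance_km(gap_seconds: float) -> float:
    return gap_seconds * SPEED_OF_SOUND_M_S / 1000.0

# A 30-s gap between lightning and thunder puts the storm about 10 km away.
```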

Communicating forecasts of impacts may be an effective strategy to bridge this gap, by translating “what the weather might be” into “what the weather might do” (World Meteorological Organization 2015). Representative surveys of the public’s weather literacy are a necessary first step toward addressing potentially consequential misconceptions through impact communication. Adapting and extending the present set of questions to assess weather literacy in countries worldwide would help to understand systematic misconceptions and the degree to which they reflect the specific local weather risks people face around the globe (Keul et al. 2018; Lee et al. 2015; Stewart et al. 2012; Weber and Stern 2011). Once these misconceptions have been identified, involving the public in the design and empirical evaluation of risk communication formats through large-scale crowdsourcing initiatives could be a new and promising approach to putting weather risks into perspective (see, e.g., the approach by Barrio et al. 2016). Findings from other domains offer an important lesson for impact forecasts: Communicating impacts can change risk perception and behavior in unintended ways because people may ignore probabilities (Pachur et al. 2014), infer them from the severity of the outcomes (Leuker et al. 2018), or respond based on affect (Loewenstein et al. 2001; Slovic et al. 2004). Impact forecasts must therefore be carefully designed and tested (Rakow et al. 2015) in order to avoid unintended consequences such as overreaction or dismissal of risks as overstated (Morss and Hayden 2010).

From a policy perspective, it is cause for concern that education did not show a positive association with weather literacy. Risk literacy investigations like ours render it increasingly obvious that whatever is being taught in schools (e.g., in geography and statistics) either neglects or fails to promote a basic understanding of common natural risks in Germany. Incorporating weather literacy into the relevant school curricula in a realistic and accessible manner (e.g., through real-world examples of weather and climate risks) may be the most sustainable long-term strategy for empowering people to successfully manage critical weather situations. For newly emerging risks, such as flash floods or heat waves, information campaigns tailored to high-risk groups that promote necessary private countermeasures may offer a viable complementary approach.

Media weather reports by national and private weather services provide another high-impact channel: Weather reports on television, on websites, and in apps are frequently searched and consumed across age groups, reaching a considerable proportion of the public (Lazo et al. 2009; Keul and Holzer 2013; Kox and Thieken 2017). The major challenge is to break complex information into transparent, meaningful units that are easy to remember and suited for the medium at hand (Keul and Holzer 2013). Today, people’s use of a range of media sources makes it possible to provide complementary information at different levels of detail (e.g., on television and online). In the same way, interactive online representations could help to reduce information overload, guide people’s attention, and provide interpretations (for the need to test interactive visualizations, see Spiegelhalter et al. 2011). Ideally, meteorologists and social and behavioral scientists should collaborate to implement best practice from risk communication and empirically test the information released on websites or apps in surveys and user studies (Fundel et al. 2019; Keul and Holzer 2013). As direct collaborations are not always feasible at the time of developing communication formats (e.g., due to a lack of time or resources), scientists could offer workshops making the central scientific insights available to media meteorologists and journalists. In turn, social and behavioral scientists could profit from these interactions by discovering new research questions and practical challenges in risk communication.

A second cause for concern is how poorly forecast uncertainty is still understood by the public despite the omnipresence of forecasts in people’s lives, from weather forecasts to medical or financial prognoses. The current widespread communication of solely deterministic forecasts (or forecasts with ambiguous verbal probability information) is not a viable solution; in fact, deterministic forecasts may well lie at the root of this lack of understanding. When uncertainty is not explicitly communicated, laypeople and experts alike can only guess at the true uncertainty underlying a forecast. Communicating numeric uncertainty more widely and transparently to the public is vital to supporting informed decisions (Joslyn and Savelli 2010; Morss et al. 2008).

The understanding of probabilistic weather forecasts could clearly be improved by, for example, transparent and consistent descriptions of the reference class (Gigerenzer et al. 2005; Murphy et al. 1980), which is often not communicated at all. Moreover, communicators lack well-tested representation formats for communicating uncertainty in continuous variables (e.g., wind speeds or the value of stocks) to lay audiences (Fundel et al. 2019; Spiegelhalter et al. 2011). Again, a closer collaboration between social and behavioral scientists, media meteorologists, and journalists would be instrumental for harnessing best practices and identifying practically relevant questions for future research.

Probabilistic weather forecasts—if the reference class is properly communicated—may in fact provide a unique and rich learning environment that boosts people’s risk literacy in general: Laypeople’s frequent experience of weather forecasts and the ensuing weather conditions could foster their understanding of probabilistic forecasts in other domains as well (Joslyn and Savelli 2010; Savelli and Joslyn 2013), in contrast to finance, medicine, or climate change, for example, where learning opportunities are rarer. Boosting weather literacy (Hertwig and Grüne-Yanoff 2017) may not only support informed decisions in a variety of domains in which uncertainty rules, but it may also help people grasp the risks of climate change for the future (Lewandowsky et al. 2014), ultimately fostering public support for climate change mitigation.

Acknowledgments

This work was supported by the Hans Ertel Centre for Weather Research of the German National Weather Service (DWD). We thank M. Göber, F. Böttcher, I. Niedeck, K. Horneffer, C. Koppe, K. Wappler, C. Surber, U. Osterwalder, F. Zack, T. Winterrath, and A. Becker for expert advice on developing and selecting questions; the employees of the German National Weather Service; Stephan Pfahl and three anonymous reviewers for critical feedback; Deborah Ain and Susannah Goss for their helpful suggestions and editing of the paper; and Susannah Goss for translating the questions into English. Author Hertwig conceived the initial idea for the study. Authors Fleischhut and Herzog designed the study, and Hertwig provided critical feedback. Fleischhut analyzed the results, and Herzog provided critical feedback. Fleischhut created the first draft of the paper, Fleischhut and Herzog revised the paper, and Hertwig provided critical feedback. All authors reviewed the paper.

 Data availability statement: All data, the R code for all analyses, the exact wording of questions (in English and German), and an interactive table of results are available on the Open Science Framework (https://osf.io/vgsc8/). The English translation of the questions is also available in the online supplemental material. Information about the project is available online (https://www.weatherliteracy.info).

APPENDIX A

Understanding of Weather Risks: Proportion of Correct Answers for all Questions

Table A1 shows the proportion of correct answers for all questions with regard to the understanding of various categories of weather risks.

Table A1.

Understanding of weather risks: proportion of correct answers for all questions (note that these questions do not follow AMS style for units, etc., because they are translated directly from the survey).a

APPENDIX B

Experts’ Understanding of Weather Risks, Forecast Uncertainty, and Climate Change

Figure B1 shows the experts’ responses to weather-risk-related questions. Figure B2 gives the experts’ responses to questions about forecast uncertainty. Figure B3 shows the experts’ awareness of climate change. See Figs. 1–3 to make the respective comparisons with the general public’s understanding of these areas.

Fig. B1.

Experts’ understanding of weather risks: connecting weather conditions to their impacts (cf. Fig. 1 for the public’s understanding). (a) Estimates of the distance to a thunderstorm when there is a 30-s gap between thunder and lightning (correct: 10 km; in light blue, with an error margin of ±20%). (b) Estimates of the air temperature from which a ground frost can occur [correct: 4°C; in light blue, with an error margin from 1°C (34°F) to 7°C (45°F)]. (c) Distribution of wind speed estimates for different wind forces presented either as descriptions of impacts (e.g., “trees are uprooted”) or as verbal labels (e.g., “storm”). For “violent storm,” only the impact description was tested (see the methods section). The distributions of estimates are shown as kernel density “violins”; the horizontal lines indicate the median. The light blue dot indicates the true value, and the blue vertical range shows an error margin of ±1 Beaufort unit. (d) Interpretations of the meaning of “10 liters of rainfall per square meter” (correct: 10 mm; in light blue).

Fig. B2.

Experts’ understanding of forecast uncertainty (cf. Fig. 2 for the public’s understanding). (a) Reference class selected for the forecast “There is a 30% chance of rain tomorrow” by percentage of respondents (correct: “days with this forecast”). (b) Estimates of how many local thunderstorm forecasts are not followed by an actual thunderstorm within 24 h (false alarms; correct = 80%) and how many actual thunderstorms are not forecast 24 h in advance (misses; correct = 40%). The heat map visualizes the joint distribution of the two estimates across respondents. The two histograms display the marginal distribution of the respective estimates. The blue lines below the histograms indicate the region of estimates that are within a range that accounts for spatial and interannual variation in Germany.

Fig. B3.

Experts’ awareness of climate change (cf. Fig. 3 for the public’s knowledge). Responses to six indicators of climate change in Germany since 1880 (in terms of the number of days or average value per year) are shown. The light blue lines indicate the correct answer.

REFERENCES

Abraham, S., R. Bartlett, M. Standage, A. Black, A. C. Perez, and R. McCloy, 2015: Do location-specific forecasts pose a new challenge for communicating uncertainty? Meteor. Appl., 22, 554–562, https://doi.org/10.1002/met.1487.

Agdas, D., G. D. Webster, and F. J. Masters, 2012: Wind speed perception and risk. PLOS ONE, 7, e49944, https://doi.org/10.1371/journal.pone.0049944.

Agdas, D., F. J. Masters, and G. D. Webster, 2017: Role of rain as perception aid in assessing wind speeds and associated personal risks. Wea. Climate Soc., 9, 227–233, https://doi.org/10.1175/WCAS-D-15-0038.1.

American Meteorological Society, 2008: Enhancing weather information with probability forecasts. Bull. Amer. Meteor. Soc., 89, 1049–1053, https://doi.org/10.1175/1520-0477-89.7.1041.

Ashley, S. T., and W. S. Ashley, 2008: Flood fatalities in the United States. J. Appl. Meteor. Climatol., 47, 805–818, https://doi.org/10.1175/2007JAMC1611.1.

Barrio, P. J., D. G. Goldstein, and J. M. Hofman, 2016: Improving comprehension of numbers in the news. Proc. 2016 CHI Conf. on Human Factors in Computing Systems, San Jose, CA, ACM Special Interest Group on Computer–Human Interaction, 2729–2739, https://doi.org/10.1145/2858036.2858510.

Beniston, M., and Coauthors, 2007: Future extreme events in European climate: An exploration of regional climate model projections. Climatic Change, 81, 71–95, https://doi.org/10.1007/s10584-006-9226-z.

Brasseur, G. P., D. Jacob, and S. Schuck-Zöller, 2017: Klimawandel in Deutschland: Entwicklung, Folgen, Risiken und Perspektiven (Climate Change in Germany: Development, Consequences, Risks and Perspectives). Springer, 348 pp.

Broomell, S. B., D. V. Budescu, and H.-H. Por, 2015: Personal experience with climate change predicts intentions to act. Global Environ. Change, 32, 67–73, https://doi.org/10.1016/j.gloenvcha.2015.03.001.

Broomell, S. B., J.-F. Winkles, and P. B. Kane, 2017: The perception of daily temperatures as evidence of global warming. Wea. Climate Soc., 9, 563–574, https://doi.org/10.1175/WCAS-D-17-0003.1.

Brown, N. R., and R. S. Siegler, 1993: Metrics and mappings: A framework for understanding real-world quantitative estimation. Psychol. Rev., 100, 511–534, https://doi.org/10.1037/0033-295X.100.3.511.

Bruine de Bruin, W., B. Fischhoff, S. G. Millstein, and B. L. Halpern-Felsher, 2000: Verbal and numerical expressions of probability: “It’s a fifty–fifty chance.” Organ. Behav. Hum. Decis. Processes, 81, 115–131, https://doi.org/10.1006/obhd.1999.2868.

Budescu, D. V., H.-H. Por, S. B. Broomell, and M. Smithson, 2014: The interpretation of IPCC probabilistic statements around the world. Nat. Climate Change, 4, 508–512, https://doi.org/10.1038/nclimate2194.

Centre for Research on the Epidemiology of Disasters, 2015: The human cost of weather-related disasters: 1995–2015. CRED and U.N. Office for Disaster Risk Reduction Rep., 30 pp., http://www.unisdr.org/files/46796_cop21weatherdisastersreport2015.pdf.

Coumou, D., and S. Rahmstorf, 2012: A decade of weather extremes. Nat. Climate Change, 2, 491–496, https://doi.org/10.1038/nclimate1452.

Demuth, J. L., J. K. Lazo, and R. E. Morss, 2011: Exploring variations in people’s sources, uses, and perceptions of weather forecasts. Wea. Climate Soc., 3, 177–192, https://doi.org/10.1175/2011WCAS1061.1.

Deutscher Wetterdienst, 2017: National climate report: Climate—Yesterday, today and in future. DWD Rep., 46 pp., https://www.dwd.de/EN/ourservices/nationalclimatereport/download_report_edition-3.pdf.

Diepgen, T. L., and V. Mahler, 2002: The epidemiology of skin cancer. Br. J. Dermatol., 146 (Suppl. 61), 1–6, https://doi.org/10.1046/j.1365-2133.146.s61.2.x.

Diffey, B., 2003: Climate change, ozone depletion and the impact on ultraviolet exposure of human skin. Phys. Med. Biol., 49, R1–R11, https://doi.org/10.1088/0031-9155/49/1/R01.

Donner, S. D., and J. McDaniels, 2013: The influence of national temperature fluctuations on opinions about climate change in the US since 1990. Climatic Change, 118, 537–550, https://doi.org/10.1007/s10584-012-0690-3.

Feser, F., M. Barcikowska, O. Krueger, F. Schenk, R. Weisse, and L. Xia, 2015: Storminess over the North Atlantic and northwestern Europe—A review. Quart. J. Roy. Meteor. Soc., 141, 350–382, https://doi.org/10.1002/qj.2364.

Fischer, E. M., and R. Knutti, 2016: Observed heavy precipitation increase confirms theory and early models. Nat. Climate Change, 6, 986–991, https://doi.org/10.1038/nclimate3110.

Fischer, E. M., U. Beyerle, and R. Knutti, 2013: Robust spatially aggregated projections of climate extremes. Nat. Climate Change, 3, 1033–1038, https://doi.org/10.1038/nclimate2051.

Fundel, V. J., N. Fleischhut, S. M. Herzog, M. Göber, and R. Hagedorn, 2019: Promoting the use of probabilistic weather forecasts through a dialogue between scientists, developers, and end-users. Quart. J. Roy. Meteor. Soc., 145, 210–231, https://doi.org/10.1002/qj.3482.

Gibbs, L., and C. Holloway, 2013: Hurricane Sandy after action: Report and recommendations to Mayor Michael R. Bloomberg. City of New York Rep., 67 pp., https://archive.org/details/695761-sandy-after-action-report/mode/2up.

Gigerenzer, G., R. Hertwig, E. Van Den Broek, B. Fasolo, and K. V. Katsikopoulos, 2005: “A 30% chance of rain tomorrow”: How does the public understand probabilistic weather forecasts? Risk Anal., 25, 623–629, https://doi.org/10.1111/j.1539-6924.2005.00608.x.

Grounds, M. A., and S. L. Joslyn, 2018: Communicating weather forecast uncertainty: Do individual differences matter? J. Exp. Psychol. Appl., 24, 18–33, https://doi.org/10.1037/xap0000165.

Hertwig, R., and T. Grüne-Yanoff, 2017: Nudging and boosting: Steering or empowering good decisions. Perspect. Psychol. Sci., 12, 973–986, https://doi.org/10.1177/1745691617702496.

Hoffrage, U., S. Lindsey, R. Hertwig, and G. Gigerenzer, 2000: Communicating statistical information. Science, 290, 2261–2262, https://doi.org/10.1126/science.290.5500.2261.

Howe, P. D., E. M. Markowitz,
T. M.
Lee
,
C.-Y.
Ko
, and
A.
Leiserowitz
,
2013
:
Global perceptions of local temperature change
.
Nat. Climate Change
,
3
,
352
356
, https://doi.org/10.1038/nclimate1768.
IPCC
,
2012
:
Managing the Risks of Extreme Events and Disasters to Advance Climate Change Adaptation. C. B. Field et al., Eds., Cambridge University Press, 582 pp.
, https://www.ipcc.ch/site/assets/uploads/2018/03/SREX_Full_Report-1.pdf.
Joslyn
,
S.
, and
S.
Savelli
,
2010
:
Communicating forecast uncertainty: Public perception of weather forecast uncertainty
.
Meteor. Appl.
,
17
,
180
195
, https://doi.org/10.1002/met.190.
Joslyn
,
S.
, and
J.
LeClerc
,
2013
:
Decisions with uncertainty: The glass half full
.
Curr. Dir. Psychol. Sci.
,
22
,
308
315
, https://doi.org/10.1177/0963721413481473.
Joslyn
,
S.
,
L.
Nadav-Greenberg
, and
R. M.
Nichols
,
2009
:
Probability of precipitation: Assessment and enhancement of end-user understanding
.
Bull. Amer. Meteor. Soc.
,
90
,
185
194
, https://doi.org/10.1175/2008BAMS2509.1.
Keul
,
A. G.
, and
A. M.
Holzer
,
2013
:
The relevance and legibility of radio/TV weather reports to the Austrian public
.
Atmos. Res.
,
122
,
32
42
, https://doi.org/10.1016/j.atmosres.2012.10.023.
Keul
,
A. G.
,
M. M.
Freller
,
R.
Himmelbauer
,
B.
Holzer
, and
B.
Isak
,
2009
:
Lightning knowledge and folk beliefs in Austria
.
J. Lightning Res.
,
1
,
28
35
, https://doi.org/10.2174/1652803400901010028.
Keul
,
A. G.
, and et al
,
2018
:
Multihazard weather risk perception and preparedness in eight countries
.
Wea. Climate Soc.
,
10
,
501
520
, https://doi.org/10.1175/WCAS-D-16-0064.1.
Kox
,
T.
, and
A. H.
Thieken
,
2017
:
To act or not to act? Factors influencing the general public’s decision about whether to take protective action against severe weather
.
Wea. Climate Soc.
,
9
,
299
315
, https://doi.org/10.1175/WCAS-D-15-0078.1.
Kox
,
T.
,
L.
Gerhold
, and
U.
Ulbrich
,
2015
:
Perception and use of uncertainty in severe weather warnings by emergency services in Germany
.
Atmos. Res.
,
158–159
,
292
301
, https://doi.org/10.1016/j.atmosres.2014.02.024.
Kruschke
,
J.
,
2014
:
Doing Bayesian Data Analysis: A Tutorial with R, JAGS, and Stan
.
Academic Press
,
776
pp.
Lazo
,
J. K.
,
R. E.
Morss
, and
J. L.
Demuth
,
2009
:
300 billion served: Sources, perceptions, uses, and values of weather forecasts
.
Bull. Amer. Meteor. Soc.
,
90
,
785
798
, https://doi.org/10.1175/2008BAMS2604.1.
LeClerc
,
J.
, and
S.
Joslyn
,
2015
:
The cry wolf effect and weather-related decision making
.
Risk Anal.
,
35
,
385
395
, https://doi.org/10.1111/risa.12336.
Lee
,
T. M.
,
E. M.
Markowitz
,
P. D.
Howe
,
C.-Y.
Ko
, and
A. A.
Leiserowitz
,
2015
:
Predictors of public climate change awareness and risk perception around the world
.
Nat. Climate Change
,
5
,
1014
1020
, https://doi.org/10.1038/nclimate2728.
Leuker
,
C.
,
T.
Pachur
,
R.
Hertwig
, and
T. J.
Pleskac
,
2018
:
Exploiting risk–reward structures in decision making under uncertainty
.
Cognition
,
175
,
186
200
, https://doi.org/10.1016/j.cognition.2018.02.019.
Lewandowsky
,
S.
,
J. S.
Risbey
,
M.
Smithson
, and
B. R.
Newell
,
2014
:
Scientific uncertainty and climate change: Part II. Uncertainty and mitigation
.
Climatic Change
,
124
,
39
52
, https://doi.org/10.1007/s10584-014-1083-6.
Li
,
Y.
,
E. J.
Johnson
, and
L.
Zaval
,
2011
:
Local warming: Daily temperature change influences belief in global warming
.
Psychol. Sci.
,
22
,
454
459
, https://doi.org/10.1177/0956797611400913.
Loewenstein
,
G. F.
,
E. U.
Weber
,
C. K.
Hsee
, and
N.
Welch
,
2001
:
Risk as feelings
.
Psychol. Bull.
,
127
,
267
286
, https://doi.org/10.1037/0033-2909.127.2.267.
Lumley
,
T.
,
2004
:
Analysis of complex survey samples
.
J. Stat. Software
,
9
,
1
19
, https://doi.org/10.18637/jss.v009.i08.
McCaffrey
,
M. S.
, and
S. M.
Buhr
,
2008
:
Clarifying climate confusion: Addressing systemic holes, cognitive gaps, and misconceptions through climate literacy
.
Phys. Geogr.
,
29
,
512
528
, https://doi.org/10.2747/0272-3646.29.6.512.
McDonald
,
R. I.
,
H. Y.
Chai
, and
B. R.
Newell
,
2015
:
Personal experience and the ‘psychological distance’ of climate change: An integrative review
.
J. Environ. Psychol.
,
44
,
109
118
, https://doi.org/10.1016/j.jenvp.2015.10.003.
McNeish
,
D.
,
2018
:
Thanks coefficient alpha, we’ll take it from here
.
Psychol. Methods
,
23
,
412
433
, https://doi.org/10.1037/met0000144.
Morss
,
R. E.
, and
M. H.
Hayden
,
2010
:
Storm surge and “certain death”: Interviews with Texas coastal residents following Hurricane Ike
.
Wea. Climate Soc.
,
2
,
174
189
, https://doi.org/10.1175/2010WCAS1041.1.
Morss
,
R. E.
,
J. L.
Demuth
, and
J. K.
Lazo
,
2008
:
Communicating uncertainty in weather forecasts: A survey of the U.S. public
.
Wea. Forecasting
,
23
,
974
991
, https://doi.org/10.1175/2008WAF2007088.1.
Murphy
,
A. H.
,
S.
Lichtenstein
,
B.
Fischhoff
, and
R. L.
Winkler
,
1980
:
Misinterpretations of precipitation probability forecasts
.
Bull. Amer. Meteor. Soc.
,
61
,
695
701
, https://doi.org/10.1175/1520-0477(1980)061<0695:MOPPF>2.0.CO;2.
Nagele
,
D. E.
, and
J. E.
Trainor
,
2012
:
Geographic specificity, tornadoes, and protective action
.
Wea. Climate Soc.
,
4
,
145
155
, https://doi.org/10.1175/WCAS-D-11-00047.1.
National Research Council
,
2006
:
Completing the Forecast: Characterizing and Communicating Uncertainty for Better Decisions Using Weather and Climate Forecasts. National Academies Press, 124 pp
.
Operskalski
,
J. T.
, and
A. K.
Barbey
,
2016
:
Risk literacy in medical decision-making
.
Science
,
352
,
413
414
, https://doi.org/10.1126/science.aaf7966.
Pachur
,
T.
,
R.
Hertwig
, and
R.
Wolkewitz
,
2014
:
The affect gap in risky choice: Affect-rich outcomes attenuate attention to probability information
.
Decision
,
1
,
64
78
, https://doi.org/10.1037/DEC0000006.
Qiu
,
L.
, and
W. A.
Nixon
,
2008
:
Effects of adverse weather on traffic crashes: Systematic review and meta-analysis
.
Transp. Res. Rec.
,
2055
,
139
146
, https://doi.org/10.3141/2055-16.
Rakow
,
T.
,
C. L.
Heard
, and
B. R.
Newell
,
2015
:
Meeting three challenges in risk communication: Phenomena, numbers, and emotions
.
Policy Insights Behav. Brain Sci.
,
2
,
147
156
, https://doi.org/10.1177/2372732215601442.
Reser
,
J. P.
,
G. L.
Bradley
, and
M. C.
Ellul
,
2014
:
Encountering climate change: ‘Seeing’ is more than ‘believing.’
Wiley Interdiscip. Rev.: Climate Change
,
5
,
521
537
, https://doi.org/10.1002/WCC.286.
Ripberger
,
J. T.
,
C. L.
Silva
,
H. C.
Jenkins-Smith
,
D. E.
Carlson
,
M.
James
, and
K. G.
Herron
,
2014
:
False alarms and missed events: The impact and origins of perceived inaccuracy in tornado warning systems
.
Risk Anal.
,
35
,
44
56
, https://doi.org/10.1111/risa.12262.
Savelli
,
S.
, and
S.
Joslyn
,
2013
:
The advantages of predictive interval forecasts for non-expert users and the impact of visualizations
.
Appl. Cognit. Psychol.
,
27
,
527
541
, https://doi.org/10.1002/acp.2932.
Schwarz
,
N.
,
1999
:
Self-reports: How the questions shape the answers
.
Amer. Psychol.
,
54
,
93
105
, https://doi.org/10.1037/0003-066X.54.2.93.
Shi
,
J.
,
V. H. M.
Visschers
,
M.
Siegrist
, and
J.
Arvai
,
2016
:
Knowledge as a driver of public perceptions about climate change reassessed
.
Nat. Climate Change
,
6
,
759
762
, https://doi.org/10.1038/nclimate2997.
Simmons
,
K. M.
, and
D.
Sutter
,
2009
:
False alarms, tornado warnings, and tornado casualties
.
Wea. Climate Soc.
,
1
,
38
53
, https://doi.org/10.1175/2009WCAS1005.1.
Slovic
,
P.
,
M. L.
Finucane
,
E.
Peters
, and
D. G.
MacGregor
,
2004
:
Risk as analysis and risk as feelings: Some thoughts about affect, reason, risk, and rationality
.
Risk Anal.
,
24
,
311
322
, https://doi.org/10.1111/j.0272-4332.2004.00433.x.
Soga
,
M.
, and
K. J.
Gaston
,
2016
:
Extinction of experience: The loss of human–nature interactions
.
Front. Ecol. Environ.
,
14
,
94
101
, https://doi.org/10.1002/fee.1225.
Spiegelhalter
,
D.
,
M.
Pearson
, and
I.
Short
,
2011
:
Visualizing uncertainty about the future
.
Science
,
333
,
1393
1400
, https://doi.org/10.1126/science.1191181.
Stekhoven
,
D. J.
, and
P.
Bühlmann
,
2011
:
MissForest—Non-parametric missing value imputation for mixed-type data
.
Bioinformatics
,
28
,
112
118
, https://doi.org/10.1093/bioinformatics/btr597.
Stewart
,
A. E.
,
J. K.
Lazo
,
R. E.
Morss
, and
J. L.
Demuth
,
2012
:
The relationship of weather salience with the perceptions and uses of weather information in a nationwide sample of the United States
.
Wea. Climate Soc.
,
4
,
172
189
, https://doi.org/10.1175/WCAS-D-11-00033.1.
Sundblad
,
E.-L.
,
A.
Biel
, and
T.
Gärling
,
2009
:
Knowledge and confidence in knowledge about climate change among experts, journalists, politicians, and laypersons
.
Environ. Behav.
,
41
,
281
302
, https://doi.org/10.1177/0013916508314998.
Taylor
,
A.
,
W.
Bruine de Bruin
, and
S.
Dessai
,
2014
:
Climate change beliefs and perceptions of weather-related changes in the United Kingdom
.
Risk Anal.
,
34
,
1995
2004
, https://doi.org/10.1111/risa.12234.
Weber
,
E. U.
,
2006
:
Experience-based and description-based perceptions of long-term risk: Why global warming does not scare us (yet)
.
Climatic Change
,
77
,
103
120
, https://doi.org/10.1007/s10584-006-9060-3.
Weber
,
E. U.
, and
P. C.
Stern
,
2011
:
Public understanding of climate change in the United States
.
Amer. Psychol.
,
66
,
315
328
, https://doi.org/10.1037/a0023253.
World Meteorological Organization
,
2008
:
Guidelines on communicating forecast uncertainty. WMO Rep. WMO/TD-1422, 25 pp.
, https://library.wmo.int/doc_num.php?explnum_id=4687.
World Meteorological Organization
,
2015
:
WMO guidelines on multi-hazard impact-based forecast and warning services. WMO Rep. WMO-1150, 34 pp.
, https://library.wmo.int/doc_num.php?explnum_id=7901.
Zabini
,
F.
,
V.
Grasso
,
R.
Magno
,
F.
Meneguzzo
, and
B.
Gozzini
,
2015
:
Communication and interpretation of regional weather forecasts: A survey of the Italian public
.
Meteor. Appl.
,
22
,
495
504
, https://doi.org/10.1002/met.1480.

Footnotes

1. Note that in meteorology the proportion of false alarms among all issued warnings is called the "probability of false alarm" (POFA), which is the complement of the positive predictive value of the warning. The POFA should not be confused with the "probability of false detection," also known in psychology and medicine as the "false alarm rate" or "false-positive rate": the proportion of false alarms among all "negative" cases (e.g., healthy patients).

2. We estimated the expected proportion of numeric "50%" responses as the mean of the proportions of the two neighboring categories (proportions of "40%" and "60%" responses; see Bruine de Bruin et al. 2000). Using this expected proportion of 50% responses in the analysis (i.e., excluding the proportion of respondents who used 50% to express uncertainty) revealed slightly more underestimation of misses (Mdn estimate: 30%; IQR = 20%–40%) and even more underestimation of false alarms (Mdn estimate: 30%; IQR = 20%–50%) in the corrected sample.
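The correction can be sketched with made-up response proportions (the values below are illustrative assumptions, not the survey's actual distribution):

```python
# Illustrative proportions of respondents answering "40%", "50%", and "60%".
p40, p50_observed, p60 = 0.10, 0.30, 0.12

# Expected proportion of genuinely numeric "50%" answers: the mean of the
# two neighboring categories (Bruine de Bruin et al. 2000).
p50_expected = (p40 + p60) / 2

# The excess over this expectation is attributed to respondents who used
# "50%" merely to express uncertainty ("fifty-fifty"), and is excluded.
p50_dont_know = p50_observed - p50_expected

print(round(p50_expected, 2), round(p50_dont_know, 2))
```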

3. For Germany, there was no increase in the number of days with more than 10 mm of precipitation per year between 1951 and 2016 (Deutscher Wetterdienst 2017). For other indicators of intense precipitation, some change (increases as well as decreases) can be observed in parts of Germany, yet with considerable seasonal, spatial, and temporal variation (Brasseur et al. 2017). For the years before 1951, there is currently no robust observational database that is representative of the whole of Germany.

4. Despite the different numbers of items, the reliability of the two scales was comparable: Revelle's ω total (McNeish 2018) was 0.76 for weather risks and 0.77 for climate change.

5. We report the median of the posterior distribution and the 95% HDI; the HDI indicates the parameter range "for which all values inside the interval have higher credibility than values outside the interval, and the interval contains 95% of the distribution" [for a primer on Bayesian statistics, see Kruschke (2014), p. 302].
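As an illustration of how such an interval can be obtained from posterior samples, here is a minimal sketch with a toy posterior; this is not the analysis code used in the study, and the sample-based routine assumes a unimodal posterior:

```python
import numpy as np

def hdi(samples, mass=0.95):
    """Narrowest interval containing `mass` of the samples (unimodal case)."""
    s = np.sort(np.asarray(samples))
    n_in = int(np.ceil(mass * len(s)))          # samples the interval must cover
    # Width of every candidate interval spanning n_in consecutive sorted samples.
    widths = s[n_in - 1:] - s[:len(s) - n_in + 1]
    i = int(np.argmin(widths))                   # narrowest candidate wins
    return s[i], s[i + n_in - 1]

rng = np.random.default_rng(0)
draws = rng.beta(30, 70, size=10_000)            # toy posterior for a proportion
lo, hi = hdi(draws)
print(float(np.median(draws)), (float(lo), float(hi)))
```

Because the interval is chosen as the narrowest region, every parameter value inside it has higher posterior density than any value outside, matching the definition quoted above.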