1. Introduction
Thunderstorms and related hazards, such as tornadoes, hail, heavy precipitation, and gusts from downbursts, pose a threat to life and safety (Simmons and Sutter 2008, 2011; Dotzek et al. 2009). These phenomena often cause as much annual property loss in the United States as hurricanes, or even more; in the record year 2011, for example, they caused some $47 billion (U.S. dollars) in economic losses, of which $26 billion were insured (Munich Re 2012a,b). They also often cause more fatalities than hurricanes (Diffenbaugh et al. 2008).
Normalization removes the impact that changes in wealth over time have on losses. To this end, past losses are rescaled according to the relative changes in destructible wealth between the year of the loss and today. Since the 1970s, normalized direct economic and insured losses from such events have increased in the United States (Changnon 2001; Neumayer and Barthel 2011; Barthel and Neumayer 2012). In the particular case of hail, studies have found that losses from strong hailstorms and reports of large hail have increased in recent decades (Brooks and Dotzek 2008; Changnon 2009). Correspondingly, the thunderstorm-related risk level has changed over the past four decades.
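As a minimal illustration of this rescaling—assuming a yearly destructible-wealth proxy such as GDP or building stock is available, and using purely hypothetical numbers—normalization amounts to multiplying the historical loss by the ratio of present to historical wealth:

```python
# Minimal sketch of loss normalization, assuming a destructible-wealth proxy
# (e.g., GDP or building stock) is available per year; values are hypothetical.

def normalize_loss(loss, loss_year, reference_year, wealth_proxy):
    """Rescale a historical loss to reference-year wealth levels."""
    factor = wealth_proxy[reference_year] / wealth_proxy[loss_year]
    return loss * factor

# Hypothetical wealth proxy (index units) and an illustrative 1975 loss of $100 million
wealth_proxy = {1975: 1.0, 2009: 4.2}
print(normalize_loss(100e6, 1975, 2009, wealth_proxy))  # -> 420 million in 2009 wealth terms
```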
In the current study, we analyze the 1970–2009 time series of thunderstorm-related normalized losses from sizeable events, that is, events exceeding a large loss threshold. The study area is the contiguous United States east of 109°W, and the losses have been aggregated annually over the period March–September (Fig. 1).
In the annual time series of such losses (Fig. 2), both the short-term variability and the mean loss level are noticeably higher in the latter half of the period than in the earlier half (see also the supplemental material). The new concept presented here is to compare this pattern of change to the time series of severe thunderstorm forcing as inferred from reanalysis data. In this way, we address the question of whether changes in the hazard can be held responsible for changes in normalized losses. An analogous approach has been suggested for the National Oceanic and Atmospheric Administration (NOAA) Storm Prediction Center (SPC) F2–F5 tornado classifications, which suffer from overrating in the past (Verbout et al. 2006): according to Diffenbaugh et al. (2008), comparing SPC's tornado data with large-scale severe thunderstorm forcing derived from reanalysis data would shed more light on the problem. For the loss data, we follow this recommendation and use reanalysis data as another "report proxy."
The paper is organized as follows: the data, related concepts, and thresholds are introduced and explained first, followed by the setup of the analysis method. The final three sections present and discuss the new results of this study, draw conclusions, and give a short outlook.
2. Loss data and concepts
In this study, Munich Re's natural-catastrophe loss database NatCatSERVICE is used. It registers loss events associated with significant damage to persons and property. It is of high quality and comprehensiveness (Kron et al. 2012; Neumayer and Barthel 2011). As insured thunderstorm-related losses are not reported specifically for the individual hazards that constituted the event (hail, tornado, etc.), the best available loss measure aggregates across all perils of an individual severe storm. The same holds for the direct economic loss estimation (Kron et al. 2012). Thus, the investigation starts from the event loss as such, including the damaging impacts of hail, heavy precipitation and resulting flash flooding, lightning, strong winds, and tornadoes in unknown proportions. This fits best the equally holistic measure of severe thunderstorm forcing that is used on the meteorological side (see section 3).
Recent studies have utilized another source of thunderstorm-related loss data, starting from estimates of direct economic property losses from tornadoes reported in NOAA's SPC archive (Simmons and Sutter 2011; Simmons et al. 2011). However, this data source cannot be recommended, for several reasons. The most important is the massive underestimation of direct economic losses aggregated for tornadoes, hail, and wind since 1996: the aggregate amounts to less than half of the documented insured losses. This is a fundamental inconsistency, since economic losses must be substantially greater than the insured portion of losses (see the appendix for more details).
Although the impact of changes in destructible wealth on loss has approximately been removed by normalization, biases arising from exposure change over time might remain: (i) urban sprawl—that is, an ever increasing area covered by destructible wealth—would contribute to a positive trend in the frequency of loss events from rather small convective events, if all other factors were kept constant; and (ii) normalized losses from the whole domain could increase over time solely because of urban development in regions of higher hazard activity in the south, which have gained population through migration from the north. This understanding led Changnon (2001) to the conclusion that the observed increase in normalized insured U.S. thunderstorm losses, event frequencies, and spatial event sizes since the 1970s was due not only to a fluctuation in meteorological conditions but also to demographic shifts to the southern parts of the country since 1950.
These biases can be made negligible by focusing on large loss events indicative of severe weather with multistate impacts. Such large events already hit destructible wealth at the beginning of the analysis period, that is, in the early 1970s, and thus are very unlikely to have been missed. We chose normalized economic losses of at least $250 million per event; these loss events, termed loss250,econ in the following, covered not only many counties but also several states. On average, loss events falling in the size bin from $250 million to just under $300 million affected 6.2 states in the first decade of our analysis period (12 events) and 4.5 states in the last (15 events). With this high threshold and the associated extent of damage, approximate homogeneity of the time series can be achieved; that is, it is highly unlikely that such events have been missed at any time in the analysis period or in any region within the domain. Hence, the biases in event frequencies and from possible population shifts to regions of higher hazard activity are removed. The threshold of $250 million selects 80% of total thunderstorm-related economic losses east of the Rocky Mountains in the period 1970–2009 (March–September), that is, the major proportion.
Regarding the insured proportion of thunderstorm-related losses, we used a threshold of $150 million, reflecting the fact that, on average, insured losses amount to approximately 0.6 times the associated economic losses in excess of $250 million. Events with insured losses in excess of $150 million account for 81% of the total and are termed loss150,ins in the following.
In sum, the selection criteria for losses are the following: thunderstorm-related normalized direct economic (insured) losses exceeding $250 million ($150 million) per event, from the period 1970–2009 (March–September), and from the contiguous United States east of 109°W. A total of 273 economic loss events fulfilling these criteria were identified in the database (BS-based normalization; see Fig. 1).
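The following sketch illustrates how such an event filter could be expressed, assuming a table of normalized loss events with hypothetical column names (date, lon, econ_loss_normalized); it is not the query actually used for the NatCatSERVICE database.

```python
import pandas as pd

# Minimal sketch of the event selection criteria, assuming a DataFrame of
# normalized loss events with hypothetical column names.
def select_events(df, loss_col="econ_loss_normalized", threshold=250e6):
    """Keep thunderstorm events east of 109°W, March-September 1970-2009,
    with normalized losses at or above the threshold."""
    in_period = (df["date"].dt.year.between(1970, 2009)
                 & df["date"].dt.month.between(3, 9))
    in_domain = df["lon"] >= -109.0   # contiguous U.S. east of 109°W
    large = df[loss_col] >= threshold
    return df[in_period & in_domain & large]
```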
3. Meteorological data and concepts
Individual thunderstorms occur on spatial scales of tens of kilometers and time scales of several hours. The development of cumulus clouds into severe convective storms is strongly influenced by the larger-scale "environmental" distributions of temperature, moisture, and wind (Trapp et al. 2007). Today, convection parameters derived from these atmospheric fields are often used to investigate long-term trends in convection over past decades (Doswell 1987; Houze 1993; Brooks et al. 2003; Kunz 2007; Kunz et al. 2009). To this end, the reanalysis project run by the NOAA National Centers for Environmental Prediction (NCEP) and the National Center for Atmospheric Research (NCAR) is utilized in this study (Kalnay et al. 1996). The dataset is not completely homogeneous over time, because of changes in the observing system during the reanalysis period (1948 to the present). These changes include the use of satellite data from the 1970s onward, increasing numbers of observations from aircraft, ocean buoys, and other surface platforms, and a changing number of soundings since the late 1980s. Reanalysis data are a merger of model forecasts and observational data, wherein spatial homogeneity is achieved by interpolating results from observation-rich to observation-poor areas on the basis of model equations. The data are available worldwide (192 × 94 grid points) with a spatial resolution of 1.875° in longitude and 1.915° in latitude, corresponding to a box of approximately 208 km × 174 km at 35°N. The temporal resolution is 6 h. As the area of investigation is the United States east of the Rockies, the NCEP–NCAR reanalysis data are used for the domain depicted in Fig. 3a.
In addition to the mixed-layer convective available potential energy (CAPEml), the vertical deep-layer wind shear (DLS) is also important for the organization of severe thunderstorms (Weisman and Klemp 1984; Klemp 1987). Here, DLS is defined as the magnitude of the shear vector between the horizontal wind vectors at ground level and 6 km above ground level (Brooks et al. 2003). High values of DLS lead to a spatial separation of updrafts and downdrafts within a thunderstorm cell. This separation is the precondition for long-lived and strong convective systems, as it guarantees the continuing inflow of warm and moist air, providing the necessary energy. It has been shown that the combination of the parameters CAPEml and DLS serves as an adequate discriminator in analyzing the potential for strong convective phenomena (Brooks et al. 2003, 2007; Trapp et al. 2007, 2009; Sander et al. 2008; Sander and Dotzek 2010).
The current study defines severe thunderstorm forcing situations by the exceedance of a predefined high threshold of the thunderstorm severity potential (TSP) per grid point. This threshold was set at 3000 J kg−1, roughly corresponding to the 99.99th percentile of the 6-hourly data from the whole domain. Such an exceedance is termed TSP3000 in the following. As can be inferred from a study of the reanalysis-based CAPEml and DLS ranges that have forced significant severe weather in the past, such forcing starts at TSP values of approximately 600 J kg−1 (Fig. 1 in Brooks et al. 2003). Thus, TSP3000 represents a very strong forcing situation.
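As an illustration of how such a quantile-based threshold relates to exceedance counting, the sketch below applies the 99.99th percentile to a synthetic stand-in for the 6-hourly TSP sample; the data and the resulting threshold value are purely illustrative.

```python
import numpy as np

# Minimal sketch: derive the TSP threshold as a high quantile of all 6-hourly
# grid-point values (the array 'tsp' is a synthetic stand-in, not reanalysis data).
rng = np.random.default_rng(0)
tsp = rng.exponential(scale=350.0, size=1_000_000)  # hypothetical 6-hourly TSP sample

threshold = np.percentile(tsp, 99.99)   # in the study this is roughly 3000 J/kg
exceedances = tsp > threshold           # boolean mask marking TSP3000-type situations
print(threshold, exceedances.sum())
```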
4. Model setup
As this study investigates the possible role of the temporally varying meteorological “observable” TSP as an “implicit” driver of the impact time series of normalized losses, a method setup with three successive steps was chosen:
First, TSP observations are selected according to the date of the loss event, and within a neighborhood defined by a rectangle of 7 × 7 grid points centered on the grid point nearest to the loss event location, as displayed in Fig. 3b. In this case, a correlation between the two time series is enabled by a close correspondence in time and space. This is called a forced correlation.
In the next step, TSP observations are selected according to the date of the loss event but without any spatial constraint. This allows for a correlation with the losses that might arise from correspondence in time alone; we call this a semiforced correlation because the spatial constraint has been dropped.
Finally, the two time series are correlated without applying any additional selection criteria to TSP observations (neither in space nor time, except that both have to have occurred within the same March–September season). We call this procedure an unforced correlation.
Regarding these three steps, we applied the following settings: (i) every thunderstorm-related loss event comes with latitude and longitude coordinates, specified as the main focus or spatial center of the damage that occurred during the event. (ii) Investigating only TSP values within a neighborhood of 7 × 7 grid points (step 1) spans a box of about 1900 km on the diagonal. This appeared to be an adequate spatial representation for the 6-h time resolution of the reanalysis data, since even the fastest-moving thunderstorms cannot travel across the box within six hours. If a smaller environment were chosen, some large TSP events could be missed. (iii) The period of outbreak and damage for loss250,econ events regularly spans several consecutive days, with an average of four days. As TSP has to be high already before the thunderstorm hazard materializes, we included the day preceding the event period as well.
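A minimal sketch of the step-1 (forced) TSP selection follows, assuming the reanalysis-derived TSP field is held as a (time, latitude, longitude) array; the array layout and function name are assumptions for illustration, not the study's actual code.

```python
import numpy as np

# Sketch of the step-1 (forced) selection: TSP values within a 7x7 grid-point
# neighborhood around the loss location, during the event days plus the
# preceding day. Array layout and helper names are assumptions.
def forced_tsp_sample(tsp, times, lat_idx, lon_idx, event_days):
    """tsp: array (time, lat, lon) of 6-hourly TSP values;
    times: array of dates, one per 6-hourly time step;
    lat_idx, lon_idx: indices of the grid point nearest the loss location;
    event_days: set of dates covering the event period plus the previous day."""
    half = 3  # 7 x 7 neighborhood -> 3 grid points on each side of the center
    in_time = np.isin(times, list(event_days))
    box = tsp[in_time,
              max(lat_idx - half, 0):lat_idx + half + 1,
              max(lon_idx - half, 0):lon_idx + half + 1]
    return box
```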
Two ways of investigating the exceedances of a threshold have been specified:
Every TSP value exceeding the threshold can be counted as one exceedance. As the temporal resolution of the reanalysis is 6 h, there is a theoretical potential of four exceedances per day for any given grid point. Hence, during a loss event period, the maximum theoretical number of possible TSP threshold exceedances equals 4 × [number of grid points within neighborhood (49)] × [number of event period days plus preceding day]. This approach accounts for frequency only.
In a second approach, all TSP values exceeding the threshold are summed up. In this case, information about the intensity of the thunderstorm forcing is retained, which may also have materialized in the events and associated losses.
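Under the same assumptions as the sketch above, the two exceedance measures, and the theoretical maximum count per event, could be written as follows.

```python
import numpy as np

# Sketch of the two exceedance measures for one loss event, assuming 'box'
# holds the 6-hourly TSP values selected in the forced step (see above).
def exceedance_measures(box, threshold=3000.0):
    above = box > threshold
    count = int(above.sum())             # frequency-only measure
    value_sum = float(box[above].sum())  # intensity-aware measure (sum of exceeding values)
    return count, value_sum

# Theoretical maximum count: 4 six-hourly steps/day x 49 grid points x (event days + 1)
def max_count(event_days):
    return 4 * 49 * (event_days + 1)
```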
5. Results and discussion
a. Distribution properties
For a better understanding of the correlation properties between TSP3000 and loss250,econ, a closer look at the distribution properties of the variables wmax, DLS, TSP, and economic loss is helpful. Figures 4a and 4b show the probability density functions (pdfs) of 6-hourly values of wmax and DLS, derived from histograms of the NCEP–NCAR data over the period 1970–2009 for all grid points shown in Fig. 3a. The two curves reveal an important difference: while the pdf of wmax appears fat tailed because of its power-law shape (displayed as a straight-line fit in the double-logarithmic plot) with an exponent of −0.485, the pdf of DLS (on semilogarithmic scales) falls off quickly at large values. A cross-correlation analysis of the two datasets yields R = −0.32, indicating weakly anticorrelated behavior between wmax and DLS, which suppresses the production of very extreme TSP. As a consequence, the pdf of 6-hourly TSP values (in particular for large values) shown in Fig. 4c cannot be interpreted as the exact convolution of the pdfs from Figs. 4a and 4b. It is, however, clearly of exponential type, as indicated by the straight line in the semilogarithmic plot, and hence shows pronounced short-tail behavior, implying only little variability among extreme TSP values. Together with a clear upper limit at about 4000 J kg−1, this indicates a very narrow variation of subsamples of large TSP events, for example, TSP above a large quantile threshold. This feature will be of substantial importance in the following.
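The exponential (short tail) character of the TSP pdf can be checked with a simple semilogarithmic fit, sketched below on synthetic exponentially distributed data: for a true exponential pdf the log-density is linear, and the fitted slope gives the e-folding scale.

```python
import numpy as np

# Sketch of the semilogarithmic tail diagnostic used for Fig. 4c: if the pdf is
# exponential, log(density) falls on a straight line. Data here are synthetic.
rng = np.random.default_rng(1)
sample = rng.exponential(scale=350.0, size=500_000)   # stand-in for 6-hourly TSP

counts, edges = np.histogram(sample, bins=60, density=True)
centers = 0.5 * (edges[:-1] + edges[1:])
mask = counts > 0
slope, intercept = np.polyfit(centers[mask], np.log(counts[mask]), 1)
print(f"e-folding scale ~ {-1.0 / slope:.0f} (input scale was 350)")
```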
Figure 4d shows the pdf of the normalized loss data for events caused by convective storms with economic damages of at least $50 million (the threshold of $250 million used here is marked off by the dashed vertical line). The shape of the pdf clearly indicates a stretched exponential distribution (red fit) and therefore has fat-tail characteristics, that is, there is a large variability among extreme losses. However, it must be noted that while the pdf of TSP is based on about 6.7 million values, the pdf of loss is based on roughly 600 values. Because of the short-tail characteristics of the atmospheric parameter TSP—in contrast to the fat-tailed loss distribution—the TSP parameter does not exhibit any tendency toward extraordinary outliers.
b. Annual time series of thunderstorm severity potential and loss
As a first step of the time series analyses, Fig. 5 displays the curves of absolute numbers of TSP3000 per method step, that is, for forced, semiforced, and unforced correlations. The coarse pattern of an increase in annual variability in the second half-period 1990–2009 as compared to the first half-period 1970–89 is preserved from forced to semiforced and unforced correlation. Correspondingly, for each time series the standard deviation is lower for the first half-period than for the second (8.5, 11, and 18.2, as against 14.9, 23.1, and 28.1). Additionally, most peaks and relative minima occur at identical temporal positions. These findings are important: the coarse pattern is robust against the successive removal of spatial and temporal coherence constraints between losses and TSP. The similarity between the coarse patterns of the curves can be explained by the observation that disastrous loss250,econ events are associated with many severe thunderstorms covering a large multistate region. As a precondition of these widespread thunderstorms, which at times span multiday event periods, high TSP values prevail and can extend beyond the neighborhood of the event's coordinates as defined in Fig. 3b. Consequently, in the semiforced case TSP3000 tends to display higher numbers than in the forced case, where the neighborhood constraint limits the region from which exceedances can be counted. Even higher TSP3000 numbers tend to accumulate in the unforced case, where the constraints of both spatial and temporal coincidence with loss250,econ events are removed from TSP3000; in this case, days outside the loss event periods can also contribute. The fewer the constraints in space and time, the higher the numbers of TSP3000 tend to be, while the coarse pattern of increasing variability over time is still preserved.
Next, analyses of the correlation between the meteorological time series of TSP3000 and the impact time series of losses are presented. As these will follow the three successive method steps from forced to unforced, the first type of correlation is due to a close correspondence in time and space between loss250,econ and TSP3000 (step 1).
Figure 6a shows the number of threshold exceedances per year, standardized by subtracting from each individual value the overall mean number of annual exceedances and dividing the result by the overall standard deviation. Standardization—that is, scaling the y axis in units of the standard deviation of the mean-centered distributions involved in the diagram—was chosen for the sake of comparability between time series of different variables. Annual TSP3000 numbers are indicated by the orange curve and annual numbers of loss250,econ events by the green curve. Figure 6b displays the standardized annually aggregated TSP3000 values (orange) and the annually aggregated loss250,econ values (green). This type of standardization was chosen because both the annual aggregates and the annual numbers of exceedances represent sums, which transforms the very asymmetric pdfs of TSP and loss (see Fig. 4) into more symmetric, almost Gaussian-like distributions by virtue of the central limit theorem. However, the central limit theorem does not apply to years with no or little activity, in terms of both TSP and loss, leaving the charts in Fig. 6 still somewhat asymmetric because of the lower bounds (TSP3000 represents a larger quantile than loss250,econ and hence produces more zero values, which map to approximately −0.8 on the standardized scale).
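A minimal sketch of this standardization, together with the Pearson correlation used for Table 1, is given below; the two annual series are hypothetical placeholders, not the study's data.

```python
import numpy as np
from scipy.stats import pearsonr

# Sketch of the standardization used in Fig. 6 and of the Pearson correlations
# reported in Table 1; the annual series below are hypothetical placeholders.
def standardize(x):
    x = np.asarray(x, dtype=float)
    return (x - x.mean()) / x.std()

annual_tsp3000 = np.array([0, 3, 1, 7, 0, 12, 5, 9, 2, 15], dtype=float)  # exceedance counts
annual_loss250 = np.array([1, 2, 0, 5, 1, 9, 4, 6, 1, 11], dtype=float)   # loss event counts

r, p = pearsonr(standardize(annual_tsp3000), standardize(annual_loss250))
print(f"R = {r:.2f}, p = {p:.3f}")
```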
There is a correlation between TSP and loss in both graphs, and it is stronger in the latter half of the period under consideration (1970–2009) (see Table 1). Anomalies such as the year 1983, where TSP and loss appear anticorrelated, reflect the level of uncertainty (or noise) remaining in the standardized datasets. Although the characteristics of TSP and loss appear similar in parts of the time series, a closer look at Figs. 6a and 6b could be confusing for some years. Starting with the year 1974, the standardized numbers in Fig. 6a seem to show consistency between loss250,econ and TSP3000. In fact, there was not a single TSP3000 exceedance during any loss event period and its associated 7 × 7 gridpoint neighborhood in this year, whereas we count five major losses in 1974 (Table 2)—implying that these losses were not associated with our definition of severe thunderstorm forcing. Alternatively, this discrepancy might reflect limitations of the spatial and temporal resolution of the reanalysis data. In the same year, however, the aggregated values of loss250,econ and TSP3000 diverge strongly in Fig. 6b. This is caused by the extreme loss of one single event, amounting to more than $8 billion (BS-based normalization). If only a few events occur, but one is associated with an extraordinary loss, the result can be a good correlation in numbers but a rather bad one in aggregated values. In 2003, the number of grid points exceeding the given threshold did not adequately mirror the activity in terms of the annual count of severe loss events (see Fig. 6a), whereas the aggregated values in Fig. 6b appear very consistent. In this case, only six loss events occurred (see Table 2), but most of them were associated with very high damage. A clear signal and great conformity are achieved in 1995 and 1998, which were active years with above-average occurrences (11 and 16 events). The difference in statistical behavior between loss and TSP is grounded foremost in the fact that single losses can reach enormous amounts because of the fat-tail characteristics shown above, while occurring at relatively small annual frequencies; thus, they account for considerable variability in annual aggregated values. Conversely, the tail of the TSP distribution falls off quickly, yielding a much more limited range of values than the losses (see above). Even annual aggregation cannot smooth out this fundamental difference. A strong contributor to the virtually unlimited character of the loss distribution is, of course, the randomness of the quantity of destructible wealth situated along the paths of individual thunderstorm cells.
Pearson coefficients R of correlation between TSP3000 and loss250,econ (loss150,ins). Rows: coefficients and p values (in brackets) are displayed for steps 1 (forced), 2 (semiforced), and 3 (unforced). Coefficients differ according to the wealth proxy used for loss normalization (GDP or BS based). Columns: for each of the indicated (sub) periods—i.e., 1970–89, 1990–2009, and 1970–2009—coefficients and p values are given for annual numbers (num.) and aggregated values (aggr.).
Example years and associated number of events in three classes of BS-based normalized direct economic losses. In the highest class, losses are given in brackets. The most active year (1998) did not produce very extreme single events.
In the following step, the spatially forced relationship is removed (step 2). The two time series are correlated merely under the constraint that TSP3000 registrations fall within the loss250,econ event period, but without any spatial constraint (semiforced relationship). The results of the correlations (not graphically shown) using the same standardized variables as in Fig. 6 are similar but slightly better than in the forced case; see the correlation coefficients in Table 1. This is plausible, as the time series of annual TSP3000 numbers exhibit roughly similar patterns in both cases, with a tendency to reach higher exceedance numbers in the semiforced case (see above; Fig. 5).
In the final step, we drop not only the predefined spatial coherence but also the temporal coincidence prescribed by the event periods (step 3 in section 4), taking into account every TSP3000 occurrence anywhere in the domain of investigation (schematized in Fig. 3a) during the March–September season. Results for this unforced case are shown for loss250,econ and TSP3000 in Figs. 7a and 7b, using standardized variables. The correlations between loss250,econ and TSP3000 (numbers and aggregated values) again remain in the same ranges as in the foregoing method steps (Table 1). As the correlations within these method steps should be viewed as based on a physical causative relation, we conclude that this causative relation is still sufficiently captured by the unforced case. Hence, it is physically sound to use numbers and values of TSP3000 collected from the whole domain and the whole season for the correlation with the seasonal loss parameters.
A one-to-one correlation between the annually aggregated TSP3000 values and loss250,econ values (Fig. 7b) cannot be expected, however. This is caused by the absence of a typical size of loss250,econ values: instead, the fat-tail characteristic of the distribution translates into a wide loss range—from $250 million to more than $8 billion, that is, over more than 1.5 orders of magnitude. Hence, the occurrence of a very large loss in a year with very few loss and TSP events can destroy the correlation of annually aggregated losses and TSP values at that point in the time series, as is the case mainly within the first two decades of the analysis period (see also Table 1). The lack of stability of sums on the loss side is somewhat compensated for in the annual number statistics (Fig. 7a): here, a tendency toward better correlations can be found in the earlier decades (aside from the year 1983), because in this regime of small annual counts, event numbers are more stable than aggregated values of parameters from fat-tailed distributions (Table 1).
In the last two decades (1990–2009) of the analysis period, the increased frequency of loss events also stabilizes the annual loss aggregates compared to the two decades 1970–89, translating into a much better correlation coefficient for annually aggregated values of TSP3000 and loss250,econ (Table 1). The correlation for annual numbers is likewise improved relative to the first two decades (Table 1).
The correlations between the time series of insured loss150,ins and TSP3000 (standardized numbers and aggregated values) are presented in Figs. 7c and 7d. As is to be expected, these correlations are very similar to the results found for loss250,econ (see also Table 1).
In general terms, Fig. 7 again captures the clear increase in variability and peak maxima from the first half period (1970–89) to the second half period (1990–2009) for both TSP3000 and loss250,econ (numbers and aggregated values). Taking the standard deviation as a measure of variability, this is demonstrated in Table 3 (see also the supplemental material). Hence, the meteorological pattern of change found in the annual time series of severe thunderstorm forcing situations (TSP3000) is reflected also in the annual time series of losses (loss250,econ, loss150,ins).
Standard deviations of standardized annual numbers and aggregated values of loss250,econ and TSP3000, as displayed in Figs. 7a and 7b. Referenced periods are indicated by the main columns. Only BS-based normalization is accounted for (corresponding to the green curves in Figs. 7a,b). For additional metrics, see the supplemental material.
c. Correlations of longer-term temporal patterns
The correlations measured at annual resolution are affected by noise from various factors already discussed in statistical terms. In physical terms, such noise can be caused, for instance, by the randomness of the destructible wealth exposed along a severe storm's track or by missing trigger mechanisms needed to transform high forcing into severe weather and loss. However, the noise can be reduced by averaging over multiyear time intervals: the close relationship and similarity between the longer-term temporal patterns of loss250,econ and TSP3000 (standardized numbers and aggregated values) are obvious from the plots of 7-yr running means in Figs. 8a and 8b.
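The smoothing itself is straightforward; a sketch using a centered 7-yr window on a hypothetical standardized annual series could look as follows.

```python
import numpy as np
import pandas as pd

# Sketch of the 7-yr running means used in Fig. 8, applied to a standardized
# annual series; the values here are randomly generated placeholders.
years = np.arange(1970, 2010)
loss_series = pd.Series(np.random.default_rng(2).normal(size=years.size), index=years)

running_mean = loss_series.rolling(window=7, center=True).mean()
print(running_mean.dropna().head())
```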
By means of normalization and a high loss threshold per event, the imprint of increasing destructible wealth on the time series of losses has been made negligible. As the probability of loss (i.e., risk) is a function of the exposed wealth, its vulnerability, and the hazard (Allen et al. 2012), temporally changing properties of hazard and vulnerability are left as drivers of change. Based on the fingerprint-like similarity between the longer-term patterns of variability in Figs. 8a and 8b, we conclude that among the remaining drivers of change, it is predominantly the longer-term temporal variability of the hazard (TSP3000) that drives the longer-term change in losses. The remaining differences between the TSP3000 and loss curves, particularly the slightly steeper increase in the loss parameters from the first half-period to the second, might be explained by two features:
The pattern of change in thunderstorm forcing might come along with higher thunderstorm hazard intensities in the second half-period. For instance, wind damages increase with gust wind speed in a nonlinear, progressive way (Heneka et al. 2006; Leckebusch et al. 2007; Donat et al. 2011). Analogously, residential building losses from hail increase in a progressive way with kinetic hail energy (Hohl et al. 2002). Consequently, there might be a stronger increase in losses, and perhaps also in loss250,econ event numbers from thunderstorms, if TSP3000 occurrences increase in annual number and severe thunderstorms become more intense.
A second explanation might be seen in an increase in the vulnerability of buildings over time, for example, through the rapid buildup of residential homes and commercial facilities in thunderstorm-prone locations, whereby no efforts have been made to foster resilience against thunderstorm hazards because of the lack of specific building codes (Munich Re 2012a). In-depth analysis of changes in vulnerabilities is still an almost untapped field for research.
d. Climate change
The current study does not develop a method setup for attributing the changes in severe thunderstorm forcing and losses over time to either anthropogenic climate change or natural climate variability (or both). Even so, the results of recent scientific studies imply that the observed changes are consistent with the modeled effects of anthropogenic climate change. This holds even if it is plausible that large losses from severe thunderstorm outbreaks also occurred in the 1950s and 1960s (Changnon 2001), because today's climatic regime could be fundamentally different from that of these past decades. Trapp et al. (2007, 2009) found, in a transient climate model experiment starting in 1950, indications of a regime in which increasing specific humidity (as the main contributor to increasing CAPEml over time) increases the annual frequency of severe thunderstorm environments (defined by the product of CAPEml and DLS). Sander (2011) found similar results in her climate-model-based analysis of climate change effects on thunderstorm activity in central Europe. As a precondition of rising CAPEml, monthly observations of near-surface specific humidity during the period 1973–1999 [Hadley Centre/Climatic Research Unit Global Surface Humidity dataset (HadCRUH; Peterson et al. 2011)] show a clear increase in the Northern Hemisphere; in eastern North America, this increase equals 3.6% (±2.7%). This was shown to be in coarse statistical agreement with the results from (anthropogenically forced) GCM runs over this period (Willett et al. 2010). A similar finding has been inferred from satelliteborne microwave sensor data of atmospheric moisture content over the oceans, also statistically corroborated by climate model experiments (Santer et al. 2007). Hence, what this paper finds for the United States fits the concept of increasing humidity that might be brought about by higher sea surface temperatures (SSTs), particularly in the Gulf of Mexico, and associated higher evaporation rates. Consistent with this reasoning, we have found a strong increase in annually (March–September) aggregated CAPEml in the United States east of the Rockies since 1970. This increase translates into a rise in the seasonal aggregate of maximum thunderstorm cell updraft velocities wmax, a measure of the available maximum convective intensity (Fig. 9).
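For reference, the standard parcel-theory estimate links wmax to CAPE as wmax = sqrt(2 × CAPE), neglecting entrainment and water loading; whether the study applies exactly this relation is an assumption here, but the sketch conveys how rising CAPEml maps onto rising maximum updraft velocities.

```python
import numpy as np

# Sketch of the standard parcel-theory estimate of the maximum updraft velocity,
# w_max = sqrt(2 * CAPE), used as a measure of maximum convective intensity.
# Whether the study computes wmax exactly this way is an assumption.
def w_max(cape_ml):
    """cape_ml in J/kg; returns w_max in m/s (entrainment and water loading neglected)."""
    return np.sqrt(2.0 * np.asarray(cape_ml, dtype=float))

print(w_max([1000.0, 2000.0, 3000.0]))  # ~44.7, 63.2, 77.5 m/s
```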
6. Conclusions
The pattern of change observable in the time series of severe thunderstorm forcing situations (TSP3000) over the period 1970–2009 reveals a clear increase in annual variability from the first half-period to the second. For the first time, the current study has demonstrated that this pattern, seen as a meteorological fingerprint, can be identified in the time series of thunderstorm-related normalized economic losses in excess of $250 million (loss250,econ). This result holds for numbers and aggregated values of these parameters collected from the whole domain east of the Rocky Mountains in the period March–September. The same finding applies to insured losses in excess of $150 million (loss150,ins). These results gain additional weight from the fact that during the period 1970–2009, normalized economic losses exceeding $250 million per event account for the major proportion (approximately 80%) of thunderstorm-related losses in the United States east of the Rockies.
Correlation analyses have followed a three-step approach that successively removed the constraints in time and space applied to the selection of thunderstorm-forcing measurements (TSP3000). The successive steps correspond to a move from a forced to an unforced relationship between TSP and losses. A comparison of these steps has revealed that the annual variability in TSP3000 numbers and aggregated values, taken from the whole domain and season (unforced case), sufficiently captures the meteorological signal that constitutes the correlation. Hence, this signal is reflected in the annual loss record (loss250,econ, loss150,ins, numbers, and aggregated values)—like a fingerprint of climatic change. Deviations from this general feature in individual years can be readily explained by the distribution properties of the variables involved. From these findings, we conclude that it is predominantly the change in hazard over time—rather than the change in destructible wealth or vulnerability—that has driven up normalized losses, as reflected in the strong similarity of the longer-term signals in Fig. 8. A secondary role may be left for other drivers, most probably a change in vulnerability; more research is needed to shed additional light on this. The most prominent features, however, are the increase in annual variability and in multiyear averages over time, particularly since the late 1980s.
In conclusion, we assign a high probability to climatic variations as the primary driver of the changes in normalized losses since 1970. Because of the chosen methodology, the current study has not been able to conclusively attribute the variability in severe thunderstorm forcing situations and losses to either natural climate variability or anthropogenic climate change. Even so, it was demonstrated that the findings presented are consistent with the expected effects of anthropogenic climate change. Climate-model-based studies might further investigate this link in the future.
7. Outlook
This study demonstrates that the "fingerprint" of the meteorological signal is reflected in the loss signal. How much of this variability over time is due to natural climate variability versus anthropogenic climate change will be investigated in future work. Furthermore, we will take a closer look at the interannual variability and at the roots of individual anomalies. Additionally, the investigation will be extended to other areas of the world.
Acknowledgments
The study benefited from support of the German Aerospace Center, Institute of Atmospheric Physics, and the Munich Reinsurance Company. The authors thank these institutions, and also NOAA/OAR/ESRL PSD, Boulder, Colorado, for access to and support with the NCEP–NCAR reanalysis data.
The authors are grateful for the work of three anonymous reviewers and also for the editor's work.
APPENDIX
Critical Perspectives on Loss Data
Information on direct economic loss per thunderstorm peril—tornado, hail, and wind—is provided by NOAA's SPC archive and used by recent studies on tornado loss time series (Simmons and Sutter 2011; Simmons et al. 2011). The use of these loss data cannot be recommended, for the following reasons:
Before the year 1996, the SPC archive provides individual losses only in order-of-magnitude intervals—that is, the upper interval bound exceeds the lower by a factor of 10. This classification necessarily leads to enormous uncertainty in annually aggregated figures. For instance, an individual loss falling in the interval between $50 million and $500 million can lie close to the lower interval bound or close to the upper, implying a very large uncertainty range. If the classes were represented by their means, the annual aggregate could deviate strongly from the real sum. Thus, the data from before 1996 cannot be recommended for use.
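A small numerical illustration of this point, with purely hypothetical counts, shows how wide the possible range of an annual aggregate becomes when individual losses are only known to class:

```python
# Sketch of the uncertainty introduced by order-of-magnitude loss classes in the
# pre-1996 SPC archive: a single event in the $50-500 million class contributes
# anywhere within that range, so annual aggregates built from class means can
# deviate strongly from the true sum. Numbers below are illustrative only.
events_in_class = 5                      # hypothetical count of events in one class
lower, upper = 50e6, 500e6               # order-of-magnitude class bounds
class_mean = 0.5 * (lower + upper)

aggregate_from_means = events_in_class * class_mean
aggregate_range = (events_in_class * lower, events_in_class * upper)
print(aggregate_from_means, aggregate_range)   # $1.375 billion vs ($0.25, $2.5) billion
```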
As SPC's archive provides direct loss estimates for tornado, hail, and wind since 1996, the peril-specific loss portions were aggregated on an annual basis for the period 1996–2009. These sums are compared with the best available data on annual insured losses from thunderstorms taken from the NatCatSERVICE database. To ensure comparability, both the agricultural and the flood-related (National Flood Insurance Program) loss components were removed from the NatCatSERVICE data, making these figures approximately match the Property Claims Service reports, which are the property insurance market standard for loss information. In the first half-period 1996–2002, the SPC aggregate of direct economic losses from severe weather amounts to 53% of documented insured thunderstorm losses, and to only 36% in the second half-period 2003–09. These ratios are proof of a massive underestimation by the SPC figures because, in reality, overall economic losses cannot be smaller than documented insured losses.
Additionally, the time series characteristics inherent in the measured insured loss data (a strong increase in variability over time) are completely smoothed out in the SPC data (which show a decrease in variability over time).
REFERENCES
Allen, S. K., and Coauthors, 2012: Summary for policymakers. Managing the Risks of Extreme Events and Disasters to Advance Climate Change Adaptation, C. B. Field et al., Eds., Cambridge University Press, 3–21.
Barthel, F., and Neumayer E. , 2012: A trend analysis of normalized insured damage from natural disasters. Climatic Change, 113, 215–237, doi:10.1007/s10584-011-0331-2.
Brooks, H. E., and Dotzek N. , 2008: The spatial distribution of severe convective storms and an analysis of their secular changes. Climate Extremes and Society, H. F. Diaz and R. J. Murnane, Eds., Cambridge University Press, 35–53.
Brooks, H. E., Lee J. W. , and Craven J. P. , 2003: The spatial distribution of severe thunderstorm and tornado environments from global reanalysis data. Atmos. Res., 67–68, 73–94.
Brooks, H. E., Anderson A. R. , Riemann K. , Ebbers I. , and Flachs H. , 2007: Climatological aspects of convective parameters from the NCAR/NCEP reanalysis. Atmos. Res., 83, 294–305.
Changnon, S. A., 2001: Damaging thunderstorm activity in the United States. Bull. Amer. Meteor. Soc., 82, 597–608.
Changnon, S. A., 2009: Increasing major hail losses in the U.S. Climatic Change, 96, 161–166.
Craven, J. P., Jewell R. E. , and Brooks H. E. , 2002: Comparison between observed convective cloud-base heights and lifting condensation level for two different lifted parcels. Wea. Forecasting, 17, 885–890.
Cummins, J. D., and Mahul O. , 2009: Catastrophe risk financing in developing countries: Principles for public intervention. World Bank, 256 pp.
Dahl, J., 2010: The development of a new lightning-frequency parameterization and its implementation in a weather prediction model. Ph.D. thesis, Ludwig-Maximilians Universität München, 152 pp.
Diffenbaugh, N. S., Trapp R. J. , and Brooks H. , 2008: Does global warming influence tornado activity? Eos, Trans. Amer. Geophys. Union, 89, 553–560.
Donat, M. G., Leckebusch G. C. , Wild S. , and Ulbrich U. , 2011: Future changes in European winter storm losses and extreme wind speeds inferred from GCM and RCM multi-model simulations. Nat. Hazards Earth Syst. Sci., 11, 1351–1370.
Doswell, C. A., III, 1987: The distinction between large-scale and mesoscale contributions to severe convection: A case study example. Wea. Forecasting, 2, 3–16.
Dotzek, N., Groenemeijer P. , Feuerstein B. , and Holzer A. M. , 2009: Overview of ESSL's severe convective storms research using the European Severe Weather Database ESWD. Atmos. Res., 93, 575–586.
Emanuel, K. A., 1994: Atmospheric Convection. Oxford University Press, 592 pp.
Heneka, P., Hofherr T. , Ruck B. , and Kottmeier C. , 2006: Winter storm risk of residential structures—Model development and application to the German state of Baden-Württemberg. Nat. Hazards Earth Syst. Sci., 6, 721–733.
Hohl, R., Schiesser H.-H. , and Aller D. , 2002: Hailfall: The relationship between radar-derived hail kinetic energy and hail damage to buildings. Atmos. Res., 63, 177–207.
Houze, R. A., 1993: Cloud Dynamics. Academic Press, 573 pp.
Kalnay, E., and Coauthors, 1996: The NCEP/NCAR 40-Year Reanalysis Project. Bull. Amer. Meteor. Soc., 77, 437–471.
Klemp, J. B., 1987: Dynamics of tornadic thunderstorms. Annu. Rev. Fluid Mech., 19, 369–402.
Kron, W., Steuer M. , Löw P. , and Wirtz A. , 2012: How to deal properly with a natural catastrophe database—Analysis of flood losses. Nat. Hazards Earth Syst. Sci., 12, 535–550.
Kunz, M., 2007: The skill of convective parameters and indices to predict isolated and severe thunderstorms. Nat. Hazards Earth Syst. Sci., 7, 327–342.
Kunz, M., Sander J. , and Kottmeier C. , 2009: Recent trends of thunderstorm and hailstorm frequency and their relation to atmospheric characteristics in southwest Germany. Int. J. Climatol., 29, 2283–2297.
Leckebusch, G. C., Ulbrich U. , Fröhlich L. , and Pinto J. G. , 2007: Property loss potentials for European midlatitude storms in a changing climate. Geophys. Res. Lett., 34, L05703, doi:10.1029/2006GL027663.
Moncrieff, M., and Miller M. , 1976: The dynamics and simulation of tropical cumulonimbus and squall lines. Quart. J. Roy. Meteor. Soc., 102, 373–394.
Munich Re, 2012a: Natural catastrophes 2011: Analyses, assessments, positions. Topics Geo, Muenchener Rueckversicherungs-Gesellschaft, Munich, Germany, 56 pp.
Munich Re, 2012b: 2011 natural catastrophe year in review. Munich Re, 44 pp. [Available online at http://www.ctnow.com/media/acrobat/2012-01/67158951.pdf.]
Neumayer, E., and Barthel F. , 2011: Normalizing economic loss from natural disasters: A global analysis. Global Environ. Change, 21, 13–24.
Peterson, T. C., Willett K. M. , and Thorne P. W. , 2011: Observed changes in surface atmospheric energy over land. Geophys. Res. Lett., 38, L16707, doi:10.1029/2011GL048442.
Pielke, R. A., Gratz J. , Landsea C. W. , Collins D. , Saunders M. , and Musulin R. , 2008: Normalized hurricane damages in the United States: 1900–2005. Nat. Hazards Rev., 9, 29–42.
Sander, J., 2011: Extremwetterereignisse im Klimawandel: Bewertung der derzeitigen und zukünftigen Gefährdung. Dissertation, DLR-Forschungsbericht DLR-FB–2011-06, 133 pp.
Sander, J., and Dotzek N. , 2010: The impact of climate change on severe convective storms over Europe. Extended Abstracts, 10th EMS Annual Meeting/Eighth European Conf. on Applied Climatology, Zurich, Switzerland, European Meteorological Society, EMS2010-532. [Available online at http://meetingorganizer.copernicus.org/EMS2010/EMS2010-532.pdf.]
Sander, J., Dotzek N. , and Sausen R. , 2008: First results of climate change impacts on severe convective storms in Europe. Preprints, 24th Conf. on Severe Local Storms, Savannah, GA, Amer. Meteor. Soc., P1.1. [Available online at https://ams.confex.com/ams/24SLS/techprogram/paper_142105.htm.]
Santer, B. D., and Coauthors, 2007: Identification of human-induced changes in atmospheric moisture content. Proc. Natl. Acad. Sci. USA, 104, 15 248–15 253.
Simmons, K. M., and Sutter D. , 2008: Tornado warnings, lead times, and tornado casualties: An empirical investigation. Wea. Forecasting, 23, 246–258.
Simmons, K. M., and Sutter D. , 2011: The Economic and Societal Impact of Tornadoes. Amer. Meteor. Soc., 282 pp.
Simmons, K. M., Sutter D. , and Pielke R. Jr., 2011: Blown away: Monetary and human impacts of the 2011 U.S. tornadoes. The Geneva Reports: Risk and insurance research; Extreme events and insurance: 2011 annus horribilis, C. Courbage and W. R. Stahel, Eds., Geneva Association 5, 107–120.
Trapp, R. J., Diffenbaugh N. S. , Brooks H. E. , Baldwin M. E. , Robinson E. D. , and Pal J. S. , 2007: Changes in severe thunderstorm environment frequency during the 21st century caused by anthropogenically enhanced global radiative forcing. Proc. Natl. Acad. Sci. USA, 104, 19 719–19 723.
Trapp, R. J., Diffenbaugh N. S. , and Gluhovsky A. , 2009: Transient response of severe thunderstorm forcing to elevated greenhouse gas concentrations. Geophys. Res. Lett., 36, L01703, doi:10.1029/2008GL036203.
Verbout, S. M., Brooks H. E. , Leslie L. M. , and Schultz D. M. , 2006: Evolution of the U.S. tornado database: 1954–2003. Wea. Forecasting, 21, 86–93.
Weisman, M. L., and Klemp J. B. , 1984: The structure and classification of numerically simulated convective storms in directionally varying wind shears. Mon. Wea. Rev., 112, 2479–2498.
Willett, K. M., Jones P. D. , Thorne P. W. , and Gillett N. P. , 2010: A comparison of large scale changes in surface humidity over land in observations and CMIP3 general circulation models. Environ. Res. Lett., 5, 025210, doi:10.1088/1748-9326/5/2/025210.