Abstract

The scaling of U.S. tornado frequency with Enhanced Fujita (EF) rated intensity is examined for the range EF1–EF3. Previous work has found that tornado frequency decreases exponentially with increasing EF rating and that many regions around the world show the same exponential rate of decrease despite having quite different overall tornado frequencies. This scaling is important because it relates the frequency of the most intense tornadoes to the overall tornado frequency. Here we find that U.S. tornado frequency decreases more sharply with increasing intensity during summer than during other times of the year. One implication of this finding is that, despite their rarity, when tornadoes do occur during the cool season, the relative likelihood of more intense tornadoes is higher than during summer. The environmental driver of this scaling variability is explored through new EF-dependent tornado environmental indices (TEI-EF) that are fitted to each EF class. We find that the sensitivity of TEI-EF to storm relative helicity (SRH) increases with increasing EF class. This increasing sensitivity to SRH means that TEI-EF predicts a slower decrease in frequency with increasing intensity for larger values of SRH (e.g., cool season) and a sharper decrease in tornado frequency in summer when wind shear plays a less dominant role. This explanation is also consistent with the fact that the fraction of supercell tornadoes is smaller during summer.