Weather radar–based object-oriented convective storm tracking is a standard approach for analyzing and nowcasting convective storms. However, most current storm-tracking algorithms provide nowcasts only in a deterministic fashion, with limited ability to estimate the related uncertainties.
This paper proposes a method for probabilistic nowcasting of convective storms that addresses the issue of uncertainty of nowcasts. The approach first utilizes a two-dimensional radar-based storm identification and tracking algorithm in conjunction with the Kalman filtering of noisy measurements of storm centroid with the continuous white noise acceleration model. The resulting smoothed estimates of storm centroid and velocity components and their error covariance values are then applied to nowcast the probability of storm occurrence.
To verify the approach, 20–60-min nowcasts were computed every 5 min using composite weather radar data in Finland, covering approximately 22 000 tracked storms. The verification shows that the algorithm is applicable in both a deterministic and a probabilistic manner. Moreover, the forecast probabilities are consistent with the observed frequencies of the storms, especially for the 20- and 30-min nowcasts. The accuracy of the probabilistic nowcasts was evaluated through the Brier skill score with respect to the deterministic nowcasts and to nowcasts based on observation persistence and sample climatology. The results show that the proposed nowcasting method has improved accuracy over all of the reference forecast types.
Severe convective storms cause economic and ecological losses in many municipalities every year. The ability to predict the movement of such storms up to a few hours ahead would provide valuable information for various end users. Weather radar–based object-oriented convective storm tracking is a well-established approach for nowcasting and analyzing convective weather (e.g., Dixon and Wiener 1993; Johnson et al. 1998). As opposed to conventional grid-based nowcasting algorithms (e.g., Bellon and Austin 1978; Li et al. 1995; Ruzanski et al. 2011), the object-oriented storm-tracking algorithms enable monitoring and nowcasting of individual storm objects. If the tracked storms are further supplemented with different radar-derived attributes, such as hail probabilities and rainfall accumulations, or additional measurements, such as lightning location data, then potentially hazardous storms can be identified and nowcast better (see, e.g., Rossi et al. 2014 and references therein).
Wide employment of radar-based object-oriented storm-tracking algorithms in operational decision support and nowcasting systems is proof of their added value. For example, the operational Warning Decision Support System (WDSS; e.g., Johnson et al. 2000) and its successor WDSS-Integrated Information (WDSS-II; Lakshmanan et al. 2007) track storm objects using the Storm Cell Identification and Tracking (SCIT) algorithm (Johnson et al. 1998) and rank the severity of storms in real time. The operational Canadian Radar Detection Support System uses a SCIT-based configurable severity estimation and nowcasting of tracked storm objects similar to WDSS (Joe et al. 2012). The Thunderstorm Initiation, Tracking, Analysis, and Nowcasting (TITAN) algorithm (Dixon and Wiener 1993) has been used operationally for tracking and nowcasting storms at various national weather services, such as at the Australian Bureau of Meteorology and at the South African Weather Service (Joe et al. 2012). The Thunderstorm Radar Tracking (TRT) system (Hering et al. 2008) is employed in an operational environment in Switzerland (Rotach et al. 2009) and in France (Joe et al. 2012). Sometimes these algorithms are embedded in larger operational multisource forecast systems. For example, the operational systems described by Mueller et al. (2003) and Nisi et al. (2014) use information from radar-based storm objects and various additional real-time information sources, such as numerical weather prediction (NWP) models and satellites.
Storm-tracking algorithms typically provide short-term nowcasts only in a deterministic fashion; that is, the model produces a single forecast with limited guidance on the related uncertainties. However, uncertainty information about event occurrence is crucial for the decision-making processes of many end users, such as operational forecasters or local rescue authorities. This need has led to the development of the probabilistic nowcasting method for storm objects presented in this paper, which explicitly addresses the uncertainty of the forecasts.
The underlying sources of uncertainty in radar-based nowcasting can be categorized in three main classes (Foresti and Seed 2015; Pierce et al. 2012). The first source relates to the uncertainty of the initial conditions of the predicted phenomena. For example, radar measurements themselves are subject to multiple error sources, such as attenuation of the radar signal, randomness of the reflectivity measures, and an increasing measurement volume as a function of range. The second source is the temporal evolution of the phenomena during the forecast time. This is significant especially when nowcasting convective storms, because individual convective cells may grow or decay within only a few tens of minutes. Finally, since most current nowcasting techniques assume stationary velocity fields, the third source of uncertainty relates to the temporal development of the estimated velocity field during the forecast time.
Probabilistic approaches are not new in radar-based nowcasting applications. For grid-based methods, several probabilistic techniques exist. Straightforward approaches include use of time-lagged ensembles (Yeung 2012), where several consecutive forecasts are applied to estimate the uncertainty of the forecast, or spatial sampling in the neighborhood of a deterministic forecast, where the size of the neighborhood increases with time (Andersson and Ivarsson 1991; Schmid et al. 2000; Germann and Zawadzki 2004; Koistinen et al. 2012). Germann and Zawadzki (2004) compared four different methodologies for probabilistic nowcasting: the synoptic-scale Eulerian method, the local Eulerian method, the local Lagrangian method, and a method conditional to Eulerian mean. They found that the local Lagrangian method consistently outperformed the other methods. This approach was later adopted by other authors, for example by Kober et al. (2012) and Scheufele et al. (2014). Other probabilistic methods include stochastic modeling of error sources, such as Bayesian approaches (e.g., Fox and Wikle 2005; Cornford 2004) or spectral methods (Seed 2003; Bowler et al. 2006; Atencia and Zawadzki 2014). Additionally, Pegram and Clothier (2001) introduced the String of Beads algorithm, which is also applied in the method by Berenguer et al. (2011). These methods take into account the uncertainty related to the temporal evolution of the rainfall field, which is known to be one of the major sources of uncertainty in the nowcasting of rainfall (e.g., Germann et al. 2006). To better assist nowcasting in broader time ranges, from a couple of minutes to several hours, some methods are even capable of merging probabilistic forecasts from NWP models and radar-based models (Bowler et al. 2006; Koistinen et al. 2012; Kober et al. 2012; Scheufele et al. 2014).
One of the few studies that explicitly produces probabilistic nowcasts for object-oriented tracked storms was introduced by Dance et al. (2010). They estimate “the strike probability,” that is, the probability that a given point will be influenced by a tracked storm during the forecasting period, using an analytic function and climatological storm speed and direction error parameters. In addition, some methods illustrate the uncertainty related to nowcast storm objects, for example, by increasing the size of predicted storms based on the standard deviation of recent storm velocity vectors (e.g., Rotach et al. 2009). Although this kind of visualization of uncertainty definitely provides helpful guidance to many end users, it does not estimate explicit probabilities of storm occurrence. Moreover, the method by Nisi et al. (2014) produces ensemble nowcasts for the evolution of radar- and satellite-based attributes of storm objects by combining several input sources, such as information from radar, satellite, and NWP models.
In this paper, we introduce a new object-oriented Kalman filter–based tracking and nowcasting algorithm for convective storms. The approach provides two important features in the field of convective storm tracking. First, the algorithm provides a compact way to compute smoothed estimates for various storm parameters, such as storm position and velocity, that can be further used for reliable deterministic nowcasting of the convective storm objects. Second, the estimated state mean and covariance can be used for deriving probabilistic nowcast products for tracked individual convective storms. A similar Kalman filtering approach has been previously applied by Larsen (1995) and Lakshmanan et al. (2003) for estimating parameters of individual tracked storms, but their studies did not attempt to estimate the uncertainties associated with the nowcast of the tracked storms. Therefore, deriving a probabilistic nowcasting product for tracked storms is the prime motivation of this study.
The paper is organized into the following sections. Section 2 discusses the applied radar data and the object-oriented tracking algorithm, which is used for the initial tracking of the storms. Section 3 introduces the Kalman filtering model for deriving smoothed position and velocity estimates for tracked storms and for nowcasting the probability of storm occurrence. Section 4 evaluates the method using an example and a statistical verification with an extensive set of storm data. Section 5 concludes the paper.
2. Data and methods
a. Applied radar data
The radar data used in this study were obtained from the eight Doppler C-band weather radars of the Finnish Meteorological Institute (FMI) covering almost all of Finland. All of the radars produce five Doppler-filtered radar reflectivity factor plan position indicator (PPI) scans with elevation angles from 0.3° to 9° every 5 min, and a complete set of 11 PPI scans with elevations up to 45° every 15 min. Each PPI scan has a range of 250 km from the radar. For a detailed discussion on FMI weather radar operations and data processing, see Saltikoff et al. (2010).
In this study, individual PPI scans were further processed to composite constant-altitude PPI (CAPPI) reflectivity images at 500-m altitude with 5-min temporal and 1-km spatial resolution, which were used for identifying storm cells from radar data and for verifying the proposed nowcasting method. The relatively flat topography of Finland allows for the use of such a low-altitude composite product (Saltikoff et al. 2010). At each grid point, the CAPPI reflectivity was computed by taking a distance-weighted average of radar bins overlapping with the grid point and having a vertical distance less than 250 m from the CAPPI level. For further discussion on the generation of the radar composite maps, the reader is referred to Rossi et al. (2014) and section 2a therein.
The composite data in this study were not corrected for the effects of the vertical profile of reflectivity (VPR). Regular VPR correction is not significant in nontropical convective rain at short distances from radar when the beam is within or below the melting layer (e.g., Zhang and Qi 2010). Moreover, as discussed by Koistinen and Pohjola (2014), the underestimation of the ground-level reflectivity due to the effects of the VPR at large distances from radar is smaller in convection than in average rainfall.
b. Storm-tracking algorithm
Like many other object-oriented storm-tracking algorithms, the algorithm utilized in this study comprises two phases. In the identification phase, convective areas are detected from data as individual cells. In the tracking phase, the cells at consecutive time steps are linked to form storm tracks. The appendix gives a detailed description of the tracking part of the algorithm. The tracking algorithm used in this study has been previously applied in studies by Rossi et al. (2013, 2014).
While some storm-tracking algorithms (e.g., Dixon and Wiener 1993; Johnson et al. 1998) define cells as three-dimensional objects from volumetric radar data, the algorithm in this paper identifies storms from low-level radar images as two-dimensional segments. The identification of the cells was done by applying a radar reflectivity factor threshold and morphological image processing. A threshold of 35 dBZ was first used for the initial identification of the storm cells, and the resulting binary areas were further processed through morphological closing (see, e.g., Gonzalez and Woods 2002) with a round structuring element 3 km in diameter and converted to polygons. The 35-dBZ threshold enables capturing the radar-observed life cycle of entire storm systems, such as multicellular storms or intense MCSs. For example, Mecikalski and Bedka (2006) applied the 35-dBZ threshold to capture the initial phases of a convective system life cycle. Additionally, Dixon and Wiener (1993) proposed a threshold of 30–40 dBZ for tracking these types of storms.
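As an illustration, the thresholding and morphological closing steps can be sketched with SciPy's image processing tools. This is a minimal sketch under stated assumptions: the function name, the handling of grid resolution, and the construction of the round footprint are ours, not the paper's implementation.

```python
import numpy as np
from scipy import ndimage

def identify_cells(dbz, threshold=35.0, diam_km=3.0, res_km=1.0):
    """Identify 2D storm cells: threshold the reflectivity field, then
    apply morphological closing with a round structuring element
    (~3 km diameter on a 1-km grid, following the text)."""
    binary = dbz >= threshold
    r = diam_km / (2.0 * res_km)
    yy, xx = np.mgrid[-int(r):int(r) + 1, -int(r):int(r) + 1]
    footprint = (xx**2 + yy**2) <= r**2
    closed = ndimage.binary_closing(binary, structure=footprint)
    labels, n_cells = ndimage.label(closed)  # one label per candidate cell
    return labels, n_cells
```

Closing bridges small gaps between nearby echo areas, so two reflectivity maxima separated by a narrow weak-echo channel end up in a single cell.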
After the identification phase, the cells were grouped and tracked with the clustering-based storm tracking (see the appendix). In this algorithm, several cells can belong to the same storm at time step k if they fulfill certain conditions. For example, a small cell lying in the vicinity of a larger cell can be part of a larger entity, and the algorithm therefore assigns both cells to the same storm. The algorithm also estimates splits and mergers of the tracked storms.
3. Kalman filtering model for probabilistic tracking and nowcasting of convective storms
a. Kalman filtering
In the field of object-oriented convective storm-tracking algorithms, the concept of probabilistic analysis is still rare. In various engineering tracking tasks, probabilistic algorithms have long been used (e.g., Bar-Shalom et al. 1995). The most widely used method in probabilistic target tracking is the Kalman filter, developed by Rudolf E. Kalman in the 1960s (Kalman 1960). Probabilistic approaches, including the Kalman filter, are also applied in NWP (e.g., Zhou et al. 2006).
Kalman filtering can be regarded as a statistical inversion method, where the unknown state x of a system is estimated through noisy measurements y1, …, yN. The state vector x consists of a set of variables that describes the properties of a dynamic system. When applying Kalman filtering with an object-oriented convective storm-tracking algorithm, each individual tracked storm is regarded as the system of interest, and the hidden state x of the system can include, for example, the position, velocity, and area of the storm. The goal is to provide an optimal estimate mk of the unknown state vector xk at the kth time step using the previous noisy observations y1, …, yk, a dynamic model for the temporal development of xk, and the previously estimated state mk−1. In radar-based storm tracking, the observation vector yk includes variables that can be directly measured from the radar-detected storm object, such as uncertain measurements of the area and the centroid position of the storm.
The generic Kalman filtering assumes that the system can be presented with a linear stochastic state-space model with Gaussian noises, formulated as

xk = Ak xk−1 + wk−1,
yk = Hk xk + vk,  (1)

where Ak is the state transition model, Hk is the observation model, and vk and wk−1 are additive noise variables. The state transition model Ak defines how the model state evolves between successive time steps. The random variable wk−1 is added to describe the uncertainties of the modeled system. Thus, wk−1 is also called the process noise, and it is assumed to follow the multivariate Gaussian distribution with zero mean and the process noise covariance Qk−1. The observation model Hk, in turn, defines how the true state variables are mapped to the measurements. The random vector vk is the measurement noise, which relates to the uncertainty of the measurements yk. Like the process noise, vk is assumed to follow the multivariate Gaussian distribution with zero mean and the measurement noise covariance Rk. It is also assumed that both vk and wk−1 are white noise processes and independent of the initial state x0. Moreover, all vk and wk−1 are assumed to be mutually independent.
The state estimate mk and its error covariance Pk at the kth time step are computed recursively with the following two-phase algorithm. The prediction step is

mk^− = Ak mk−1,
Pk^− = Ak Pk−1 Ak^T + Qk−1,  (2)

and the update step is

Kk = Pk^− Hk^T (Hk Pk^− Hk^T + Rk)^−1,
mk = mk^− + Kk (yk − Hk mk^−),
Pk = (I − Kk Hk) Pk^−.  (3)

The variables mk^− and Pk^− in the prediction step denote the predicted state estimate and covariance, that is, the state estimate and covariance before they are updated with the noisy measurement yk. In the update step, Kk is the Kalman gain, which defines the weight given to the observation with respect to the predicted state. Furthermore, it is assumed that at the first time step the system has an initial state m0 and covariance P0. From the Bayesian point of view, the estimates mk and Pk can be regarded as the mean and covariance of the posterior distribution of the state xk, given the noisy measurement yk and the prior mean and covariance. For further discussion on the general theory of Kalman filtering, we refer the reader to various sources in the literature (e.g., Bar-Shalom et al. 1995; Simon 2006).
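The two-phase recursion above can be sketched in a few lines of NumPy. This is a textbook implementation in our own notation, not the authors' code; the function names are illustrative.

```python
import numpy as np

def kf_predict(m, P, A, Q):
    """Prediction step: propagate state mean and covariance with the
    dynamic model A and process noise covariance Q."""
    m_pred = A @ m
    P_pred = A @ P @ A.T + Q
    return m_pred, P_pred

def kf_update(m_pred, P_pred, y, H, R):
    """Update step: correct the prediction with a noisy measurement y."""
    S = H @ P_pred @ H.T + R             # innovation covariance
    K = P_pred @ H.T @ np.linalg.inv(S)  # Kalman gain
    m = m_pred + K @ (y - H @ m_pred)
    P = P_pred - K @ S @ K.T
    return m, P
```

With a one-dimensional constant-position model and equal prior and measurement variances, the gain is 0.5 and the updated estimate lands halfway between prediction and measurement, which is a quick sanity check of the equations.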
b. Kalman filtering model for tracked storms
The Kalman filtering model for tracked convective storms estimates the state variables of storm cells. Here, the initial tracking—that is, storm identification and assignment—is performed with the clustering-based tracking algorithm described in section 2b and in the appendix. At each time step k, we apply the Kalman filter to the tracks, which estimates the state of the tracked storms. To improve the matching between storm cells in the tracking, the cells identified at the previous time step k − 1 are projected to the current time step k according to the Kalman filter–estimated velocities in the clustering phase of the algorithm (see the appendix). However, in this paper the main application of the Kalman filter–estimated state information is storm nowcasting, as explained in the following sections.
In many object-oriented storm nowcasting applications, the nowcasting of the storm cells is based on linear extrapolation of the position using previous position or velocity estimates retrieved from storm tracks (e.g., Dixon and Wiener 1993; Handwerker 2002). Similarly, the state variables considered in this study are the geometric center position (x, y) of an identified storm and the corresponding velocity components (vx, vy), denoted as a state vector x = (x, y, vx, vy)^T. The only measured variables are the centroid coordinates of the storms, and therefore the elements corresponding to the velocity components in the measurement matrix Hk are zero. The velocity is estimated later on using the applied dynamic model and the Kalman filtering equations. Thus, the linear observation model Hk = H of the system is

H = [1 0 0 0; 0 1 0 0]  (4)

and the measurement noise vk is zero-mean Gaussian white noise with covariance

R = [r² 0; 0 r²].  (5)
In Eq. (5), the parameter r (length) describes the noise intensity of the measured centroid position. The uncertainty comes from various sources, such as attenuation and the randomness of the reflectivity measurements, which result in random deformation of the identified storm shape and consequently in fluctuations of the centroid position. Objective estimation of the different noise components and their distributions is a challenging task and may be the subject of a future study. As a first approximation, it is considered that there are multiple independent additive noise sources. Therefore, the distribution of a superposition of those noises approaches the Gaussian distribution according to the central limit theorem. The magnitude of the measurement noise is also unknown and cannot be easily derived, and therefore in this paper the algorithm is tested with several values of r.
In addition to the measurements and the measurement model, Kalman filtering requires a dynamic model to estimate the state variables of the tracked storms. Here, we adopt the continuous white noise acceleration model, where the position and velocity coordinates of a target are modeled with a linear state-space model and the acceleration is assumed to be a zero-mean Gaussian random process (e.g., Bar-Shalom et al. 1995). This means that the velocity of the target stays unchanged on average, but over each time step it undergoes a small Gaussian random change. The state transition model Ak = A and the process noise covariance Qk = Q of the discretized white noise acceleration model are [see, e.g., Bar-Shalom et al. (1995) for a detailed derivation]

A = [1 0 Δt 0; 0 1 0 Δt; 0 0 1 0; 0 0 0 1],
Q = q [Δt³/3 0 Δt²/2 0; 0 Δt³/3 0 Δt²/2; Δt²/2 0 Δt 0; 0 Δt²/2 0 Δt],  (6)

where q [length²/(time step)³] defines the process noise intensity and Δt is the interval between two tracking time steps, that is, the interval between two consecutive CAPPI images. Random changes in the velocity over the time step Δt are of the order of συ = √(qΔt), which helps one choose a reasonable value for the process noise parameter. As with the measurement noise parameter r, the value of q can be chosen subjectively, for example, based on expert knowledge of storm velocity behavior.
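The discretized continuous white noise acceleration matrices for the state (x, y, vx, vy) can be built as follows, following the standard form in Bar-Shalom et al. (1995); the function name and state ordering are our assumptions.

```python
import numpy as np

def cwna_matrices(q, dt):
    """Transition matrix A and process noise covariance Q of the
    discretized continuous white noise acceleration model for the
    state (x, y, vx, vy)."""
    A = np.array([[1, 0, dt, 0],
                  [0, 1, 0, dt],
                  [0, 0, 1, 0],
                  [0, 0, 0, 1]], dtype=float)
    Q = q * np.array([[dt**3 / 3, 0, dt**2 / 2, 0],
                      [0, dt**3 / 3, 0, dt**2 / 2],
                      [dt**2 / 2, 0, dt, 0],
                      [0, dt**2 / 2, 0, dt]], dtype=float)
    return A, Q
```

Note that Q is symmetric and couples each position coordinate only with its own velocity component.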
An important part of the filtering is the selection of the initial values of the state vector m0 and the state covariance P0. In this study, the initial state vector m0 was selected by setting the position variables x and y of the state to the measured centroid position y. The velocity components vx and vy were computed by taking an inverse-distance-weighted average of the velocity components of the other existing storms in the radar composite map. If no other storms were available, then the storm was initialized to zero velocity.
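The inverse-distance-weighted velocity initialization can be sketched as below. The weighting exponent and the zero-distance guard are our assumptions; the paper does not specify them.

```python
import numpy as np

def idw_velocity(new_pos, positions, velocities, power=1.0):
    """Inverse-distance-weighted average of other storms' velocity
    components for initializing a new track; falls back to zero
    velocity when no other storms exist."""
    if len(positions) == 0:
        return np.zeros(2)
    d = np.linalg.norm(np.asarray(positions, float) - np.asarray(new_pos, float), axis=1)
    w = 1.0 / np.maximum(d, 1e-6) ** power  # guard against zero distance
    return (w[:, None] * np.asarray(velocities, float)).sum(axis=0) / w.sum()
```

Storms closer to the new cell thus dominate its initial velocity estimate.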
Selection of the initial error covariance P0 is difficult without any realistic prior information on the uncertainties of the storm track at the initial time step. Therefore, we choose to initialize the covariance to the so-called steady-state solution, which would be reached by tracking a storm over a large number of time steps. The steady-state solution is also used in many other real-world tracking applications with time-invariant state-space models (see, e.g., Gray and Murray 1993):

P0 = (I − K∞H) P∞^−,  (7)

where K∞ and P∞ are the steady-state Kalman gain and state covariance of the filter, respectively, which can be precalculated by iterating the Kalman filtering equations with suitable initial conditions until converged values are reached. Alternatively, one can first solve the steady-state one-step prediction covariance P∞^− from the discrete-time algebraic Riccati equation

P∞^− = A P∞^− A^T − A P∞^− H^T (H P∞^− H^T + R)^−1 H P∞^− A^T + Q,  (8)

after which the steady-state gain and covariance can be computed as

K∞ = P∞^− H^T (H P∞^− H^T + R)^−1,  (9)
P∞ = (I − K∞H) P∞^−.  (10)
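In practice, the steady-state prediction covariance can be obtained directly from SciPy's discrete algebraic Riccati equation solver instead of iterating the filter to convergence. This is a sketch under stated assumptions; note the transposition needed to map the filtering problem onto the control convention used by `solve_discrete_are`.

```python
import numpy as np
from scipy.linalg import solve_discrete_are

def steady_state_kf(A, Q, H, R):
    """Steady-state one-step prediction covariance from the discrete
    algebraic Riccati equation, followed by the steady-state gain and
    filtered covariance."""
    # solve_discrete_are uses the control convention; the filtering
    # Riccati equation corresponds to the transposed system (A^T, H^T).
    P_pred = solve_discrete_are(A.T, H.T, Q, R)
    S = H @ P_pred @ H.T + R
    K = P_pred @ H.T @ np.linalg.inv(S)          # steady-state gain
    P = (np.eye(A.shape[0]) - K @ H) @ P_pred    # steady-state covariance
    return K, P, P_pred
```

For a scalar random-walk model with unit noise variances, the Riccati equation reduces to P² = P + 1, whose positive root is the golden ratio, a convenient closed-form check.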
c. Nowcasting tracked storms
One of the main applications of the Kalman filter–derived position and velocity is naturally the short-term prediction of the storms. This can be done by applying the Kalman filtering prediction step [Eq. (2)] extended to the forecast time. Thus, the predicted state mean and covariance at the kth step with the lead time tf are

mk(tf) = A(tf) mk,
P(tf) = A(tf) P∞ A(tf)^T + Q(tf),  (11)

where A(tf) and Q(tf) are defined similarly to Eq. (6) as

A(tf) = [1 0 tf 0; 0 1 0 tf; 0 0 1 0; 0 0 0 1],
Q(tf) = q [tf³/3 0 tf²/2 0; 0 tf³/3 0 tf²/2; tf²/2 0 tf 0; 0 tf²/2 0 tf],  (12)

and the notation P(tf) indicates that, because of the applied steady-state assumption, the predicted state covariance depends on the lead time tf but not on the time step k. As in Eq. (6), Δt is the 5-min interval between two consecutive CAPPI images and συ = √(qΔt) defines the order of magnitude of the velocity changes over Δt.
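The lead-time prediction described above can be sketched as follows, with the transition and process noise matrices rebuilt with the time step replaced by the lead time; the function and variable names are ours.

```python
import numpy as np

def nowcast(m, P_steady, q, t_f):
    """Predicted state mean and covariance at lead time t_f for the
    state (x, y, vx, vy), using the steady-state filtered covariance."""
    A = np.eye(4)
    A[0, 2] = A[1, 3] = t_f          # position advected by velocity
    Q = q * np.array([[t_f**3 / 3, 0, t_f**2 / 2, 0],
                      [0, t_f**3 / 3, 0, t_f**2 / 2],
                      [t_f**2 / 2, 0, t_f, 0],
                      [0, t_f**2 / 2, 0, t_f]])
    m_f = A @ m
    P_f = A @ P_steady @ A.T + Q
    return m_f, P_f
```

With zero process noise the position is simply extrapolated linearly; with q > 0 the forecast covariance grows with lead time, which is what drives the widening uncertainty circles in the figures.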
d. Dealing with splitting and merging
Important special cases in convective storm tracking are storm splitting and merging. A storm can split into two or more new storms, or a group of storms can merge into a single storm. Although the initial storm tracking detects splits and mergers, they complicate the filtering, since the original Kalman filter is not designed to deal with such events.
In this study, the following heuristic approach is proposed to deal with the splitting and the merging. In the case of the merging, the predicted state vectors of the merging storms are simply combined with the area-weighted average of the merging storms. If a storm splits, then each split successor storm inherits velocity components from the predicted state estimate of the predecessor storm. However, the centroid position estimates are taken directly from the measured centroid positions of the split storms. The state covariance remains unchanged, because the steady-state solution is applied.
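The merging and splitting heuristics can be sketched as below; the function names and the state layout (x, y, vx, vy) are our assumptions.

```python
import numpy as np

def merge_states(states, areas):
    """Merging: combine the predicted state vectors of the merging
    storms with an area-weighted average."""
    w = np.asarray(areas, float)
    return (w[:, None] * np.asarray(states, float)).sum(axis=0) / w.sum()

def split_state(parent_state, child_centroid):
    """Splitting: a successor inherits the parent's predicted velocity
    components but takes its position from its own measured centroid."""
    s = np.array(parent_state, float)
    s[0:2] = child_centroid
    return s
```

Because the steady-state covariance is used, neither operation needs to modify the state covariance, matching the text.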
e. Ensemble nowcasting of convective storms
In addition to the estimated state parameters, the Kalman filter provides the error covariance matrix of the estimated state. Assuming that the accuracy of the state estimate follows a multidimensional Gaussian distribution, the state estimate and its error covariance uniquely define the probability density function of the unknown true state. This probability density function can be used for deriving different probabilistic nowcasting products for the tracked convective storms.
This study estimates the thunderstorm occurrence probability by generating ensembles with the forecast state estimate mk(tf) and its error covariance P(tf). For each storm, the ensembles are generated in the following way. First, mk(tf) and P(tf) are calculated with Eq. (11). Second, we draw a random sample from the multivariate normal distribution defined by mk(tf) and P(tf). This can be done, for example, through an algorithm based on the LU decomposition of the covariance matrix (see, e.g., Goovaerts 1997). Third, the forecast storm at time k + tf is generated by placing the center of the initial storm at the position of the random sample.
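The sampling step can be sketched with NumPy. Here we sample only the position block of the forecast state, and we use NumPy's built-in multivariate normal sampler rather than an explicit LU decomposition of the covariance; both choices are ours, not the paper's.

```python
import numpy as np

def ensemble_positions(m_f, P_f, n=100, rng=None):
    """Draw n centroid samples from the Gaussian defined by the
    position block of the forecast mean m_f and covariance P_f;
    each sample displaces the storm for one ensemble member."""
    rng = np.random.default_rng(rng)
    return rng.multivariate_normal(m_f[:2], P_f[:2, :2], size=n)
```

Each sampled position is then used to translate the storm polygon, and the fraction of members covering a pixel gives that pixel's occurrence probability.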
After the computation of ensembles, the forecast probabilities for each individual storm are estimated in a spatial grid equivalent to the radar composite map. Thereafter, the probabilities of individual storms are combined by taking the pointwise maximum of the estimated probability maps. This way, the combination can be regarded as the maximum threat among all storms influencing a given location. Another option is to assume statistical independence for the storms and combine probabilities pointwise according to the general rules of probability calculus (Dance et al. 2010). However, it is important to note that individual storms may not be statistically independent.
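The pointwise-maximum combination of the per-storm probability grids is a one-liner (a minimal sketch):

```python
import numpy as np

def combine_storm_probs(prob_maps):
    """Pointwise maximum over per-storm probability grids: the combined
    value is the maximum threat among all storms at each location."""
    return np.maximum.reduce([np.asarray(p, float) for p in prob_maps])
```

Unlike multiplication of complements under an independence assumption, the maximum never exceeds any single storm's probability.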
Dance et al. (2010) introduced an analogous method for deriving strike probability nowcasts for tracked storm objects, that is, the probability that a given location will be affected by a storm within the forecast time period, using constant predefined deviation parameters for the storm velocity. In this study, we estimate the probability of a storm influencing a certain area precisely at a given forecast time, where both the speed and the location uncertainties are estimated through Kalman filter and the assumed constant white noise acceleration model.
4. Case examples and algorithm evaluations
a. Application example
Figure 1 shows an example of the Kalman filtering tracking with r = 2.5 km and συ = 6.5 km h−1, where white polygons depict tracked storm objects. Kalman-filtered track positions (green lines) are smoother than the nonfiltered storm positions (red lines), resulting in a more consistent track. Stable, smooth behavior of the track improves the visualization of the storm tracks in end user products. Moreover, the Kalman filter–based nowcasts (magenta arrows) are consistent with the smoothed track history, although the tracked storms split and merge frequently. Figure 1b also shows development of the forecast position covariance. The dashed circle shows the 95% confidence boundary of the future position covariance, that is, an estimate for an area in which the forecast centroid falls with the probability of 95%. As illustrated in section 4c, this uncertainty can be used for storm probability forecasting. Figure 1c shows an example of storm probability forecasting with 100 ensembles, where forecasts of lead times 20, 40, and 60 min are overlaid on each other. The green ellipse in Fig. 1c also shows an alternative representation for the storm object, which is described below in section 4b.
b. Verification of deterministic nowcast
Whenever new algorithms are introduced, an assessment of the algorithm performance is necessary. Although the primary aim of this work is to produce probabilistic nowcasts, the filter can be also used for conventional deterministic nowcasting. Therefore, the first step of the evaluation included the verification of the algorithm in a deterministic manner.
A deterministic nowcast of the future position of the storms can be estimated with the nowcasting equations in Eq. (11). After the future position is estimated, the final nowcast of a storm is obtained simply by placing the storm in the estimated future position. For verification, we applied the commonly used probability of detection (POD), false alarm ratio (FAR), and critical success index (CSI) (e.g., Wilks 2011). The verification was performed using data from 59 days with widespread convective activity in Finland from years 2005–11. Overall, 14 536 weather radar frames with 5-min temporal resolution were analyzed.
In the POD, FAR, and CSI calculations, a resolution of 1 km was applied. A storm event was registered in a grid pixel if any point within the pixel was influenced by a storm cell. Consequently, a forecast was considered successful (a hit) if both the forecast and reference grid pixels included a storm event. A false alarm was counted if the forecast grid pixel had an event that was not observed in the reference grid. A miss occurred if the reference pixel included an event but the forecast grid pixel did not.
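The pixelwise counting and the resulting scores follow the standard contingency-table definitions and can be sketched as below; the function name is ours.

```python
import numpy as np

def skill_scores(forecast, observed):
    """POD, FAR, and CSI from binary forecast/observation grids, using
    the hit / false-alarm / miss definitions in the text."""
    f = np.asarray(forecast, bool)
    o = np.asarray(observed, bool)
    hits = np.sum(f & o)
    false_alarms = np.sum(f & ~o)
    misses = np.sum(~f & o)
    pod = hits / (hits + misses)
    far = false_alarms / (hits + false_alarms)
    csi = hits / (hits + misses + false_alarms)
    return pod, far, csi
```

A perfect forecast gives POD = CSI = 1 and FAR = 0; CSI penalizes both misses and false alarms.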
Gray lines in Fig. 2 show an example of the POD, FAR, and CSI values of the track-by-track analysis using the parameter settings r = 2.5 km and συ = 6.5 km h−1. Over 22 000 storm tracks were identified with these parameters, but the number of tracks did not vary significantly when other filtering parameters were applied. In the track-by-track analysis, skill scores are calculated only for storms whose history exceeds the lead time (Dixon and Wiener 1993). This is reasonable, as storm-tracking algorithms are developed to track and forecast the movement of existing storms; they are not able to capture temporal evolution or anticipate developing convection. Thus, the track-by-track analysis measures the nowcasting capability of the algorithm along the track.
In some storm-tracking applications, the storms are represented using a simplified form as ellipses (e.g., Dixon and Wiener 1993; Li and Lai 2004; Rotach et al. 2009; Peleg and Morin 2012; Novo et al. 2014). Representing objects as ellipses instead of polygons has a visual benefit: the simplified shape of a storm is easier to follow on a display or map, which is an advantage in the operational products. Therefore, the algorithm was also tested by approximating the predicted cells with ellipses, which was performed by applying the principal component analysis (e.g., Jolliffe 2002) to the boundary points of each identified cell. Ellipse axes were chosen such that they equal 2 times the corresponding principal component. The green ellipse boundary in Fig. 1c shows an example of the applied ellipse approximation.
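The ellipse approximation via principal component analysis can be sketched as below. We interpret "axes equal to 2 times the corresponding principal component" as two standard deviations along each eigenvector of the boundary-point covariance; that exact scaling is our assumption.

```python
import numpy as np

def fit_ellipse(boundary_xy):
    """Approximate a cell polygon by an ellipse: PCA of the boundary
    points, with each semi-axis set to twice the standard deviation
    along the corresponding eigenvector (assumed scaling)."""
    pts = np.asarray(boundary_xy, float)
    center = pts.mean(axis=0)
    cov = np.cov((pts - center).T)
    eigvals, eigvecs = np.linalg.eigh(cov)        # ascending eigenvalues
    axes = 2.0 * np.sqrt(eigvals)
    angle = np.arctan2(eigvecs[1, -1], eigvecs[0, -1])  # major-axis angle
    return center, axes[::-1], angle              # (major, minor) order
```

The returned center, axis lengths, and orientation suffice to draw the simplified storm shape on a display.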
Black lines in Fig. 2 show the POD, FAR, and CSI as a function of the lead time when the ellipse approximation is applied. Since the forecasts were created by predicting the storms approximated with ellipses, the observed storms were approximated with the same ellipse fitting procedure. As Fig. 2 shows, the POD and CSI of the ellipse representation are larger than those of the polygon representation. This is because the polygon representation estimates the storm boundary precisely, which makes it difficult to predict the constantly evolving storm area accurately. In the ellipse representation, small-scale deformations are smoothed out because of the simplified shape of the cells, resulting in larger skill score values. However, this result does not mean that the ellipse-based forecasting is necessarily better than the polygon-based forecasting from the end user point of view. Instead, the results highlight the capability of the Kalman filtering–based algorithm to nowcast the occurrence of two different object representations.
To get a better picture of the performance of the deterministic nowcasting algorithm, the track-by-track CSI analysis was repeated for several parameter combinations of r and συ. The checkerboard in Fig. 3 shows CSI values for 30-min nowcasts with respect to different combinations of r = [2.5, 5, 10, 20] km and συ = [2.5, 5, 10, 20] km h−1, using both the ellipse and polygon representations of the cells. The combinations were selected logarithmically to highlight an additional feature of the applied continuous white noise acceleration model: it can be shown that doubling both r and συ does not change the steady-state Kalman gain, which eventually defines the properties of the deterministic nowcasting. Hence, the diagonal elements in Fig. 3 reach the same CSI values. However, the steady-state error covariance changes even if the parameter combinations are selected from the diagonals of Fig. 3. This means that the diagonal parameter combinations in Fig. 3 perform differently in probabilistic nowcasting, as demonstrated in the following sections.
While the differences between the parameter combinations in Fig. 3 are not large, the combinations located on the main diagonal—that is, the (r, συ) pairs (2.5 km, 2.5 km h−1), (5 km, 5 km h−1), (10 km, 10 km h−1), and (20 km, 20 km h−1)—maximize the CSI of the nowcasting method in both polygon- and ellipse-based nowcasting. Note that it is a coincidence that the maximizing combinations occur on the main diagonal.
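The gain-invariance property can be checked numerically by iterating the Riccati recursion of a one-dimensional constant-velocity (continuous white noise acceleration) filter. The mapping of συ to the process-noise intensity used below is an assumption of this sketch, not the paper's exact parameterization:

```python
import numpy as np

def steady_state_gain(r, sigma_v, dt=5.0 / 60.0, n_iter=500):
    """Iterate predict/update Riccati steps of a constant-velocity model
    until the Kalman gain converges; r is the measurement noise std and
    sigma_v scales the process-noise intensity (assumed mapping)."""
    F = np.array([[1.0, dt], [0.0, 1.0]])        # state transition
    H = np.array([[1.0, 0.0]])                   # only position is measured
    Q = sigma_v ** 2 * np.array([[dt ** 3 / 3.0, dt ** 2 / 2.0],
                                 [dt ** 2 / 2.0, dt]])
    R = np.array([[r ** 2]])
    P = np.eye(2)
    K = np.zeros((2, 1))
    for _ in range(n_iter):
        P = F @ P @ F.T + Q                      # predict
        S = H @ P @ H.T + R                      # innovation covariance
        K = P @ H.T @ np.linalg.inv(S)           # Kalman gain
        P = (np.eye(2) - K @ H) @ P              # update
    return K

K1 = steady_state_gain(r=2.5, sigma_v=2.5)
K2 = steady_state_gain(r=5.0, sigma_v=5.0)       # both parameters doubled
# K1 and K2 coincide: scaling Q and R by the same factor scales P
# but leaves the gain K = P H^T (H P H^T + R)^(-1) unchanged.
```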
c. Reliability of the probabilistic nowcasts
As mentioned earlier, the prime objective of the presented method is to provide probabilistic forecasts for tracked convective storm objects. To produce realistic probability forecasts, the estimated probabilities should be in accordance with the true observed frequencies of the forecasts. This can be assessed with the reliability diagram, where observed frequencies are plotted against predicted probabilities (e.g., Jolliffe and Stephenson 2003; Wilks 2011).
In this study, reliability diagrams were estimated with the same dataset as in section 4b for the lead times of 20–60 min with the following procedure. First, predicted probabilities were calculated in the 1 km × 1 km grid encompassing the radar composite area using 100 ensemble members per storm. Second, each grid pixel was classified according to the probability value of the pixel. For each probability class yi = i%, where i = 1, …, 100, the comparison was made with the observed storm frequencies. Let o denote a binary observation at a prediction time, where o = 1 means that the pixel was influenced by a storm, while o = 0 indicates that it was not. In each discrete class yi, both the number of pixels influenced by observed storms ni(o = 1 | yi) and the total number of pixels ni(yi) were counted. The ratio of these, ni(o = 1 | yi)/ni(yi), estimates the observed frequency. It also corresponds to the conditional probability of storm occurrence given the probability value of the ith class, p(o = 1 | yi). Unlike the verification of deterministic forecasts in section 4b, the verification in this section was not carried out using the track-by-track comparison. This way the results enable better analysis of the effect of growth and decay on the forecasts.
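The counting procedure can be sketched as follows; assigning each pixel to the nearest 1% class by rounding is an assumption of this sketch, as is the function name:

```python
import numpy as np

def reliability_counts(pred_prob, observed, n_classes=100):
    """Observed storm frequency per predicted-probability class.

    pred_prob: per-pixel predicted probabilities in [0, 1].
    observed:  per-pixel binary observations (1 = storm, 0 = no storm).
    Returns class values y_i = i/n_classes and the observed frequencies
    n_i(o=1|y_i) / n_i(y_i); classes with no pixels yield NaN.
    """
    pred_prob = np.asarray(pred_prob, dtype=float)
    observed = np.asarray(observed, dtype=int)
    # Assign each pixel to the nearest 1% class (clipped to 1..n_classes)
    cls = np.clip(np.round(pred_prob * n_classes).astype(int), 1, n_classes)
    y = np.arange(1, n_classes + 1) / n_classes
    freq = np.full(n_classes, np.nan)
    for i in range(1, n_classes + 1):
        in_class = cls == i
        if in_class.any():
            freq[i - 1] = observed[in_class].mean()
    return y, freq

# Four pixels predicted at 50%, three of which verified
y, freq = reliability_counts([0.5, 0.5, 0.5, 0.5], [1, 0, 1, 1])
```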
The reliability diagrams were calculated for three parameter combinations of r and συ, selected according to the results of the deterministic analysis presented in section 4b. The combinations (r, συ) = (2.5 km, 2.5 km h−1), (r, συ) = (5 km, 5 km h−1), and (r, συ) = (10 km, 10 km h−1)—denoted as (r, συ)2.5, (r, συ)5, and (r, συ)10, respectively—were selected for the probabilistic analysis because they are among the parameters that reached the best performance values in deterministic nowcasting, as shown in Fig. 3. The selection of these parameters is justified, as one of the purposes of the study is to show that the same filter can be reasonably used for both deterministic and probabilistic forecasting tasks. The combination (r, συ) = (20 km, 20 km h−1), which reaches the same performance in deterministic nowcasting as (r, συ)2.5, (r, συ)5, and (r, συ)10, was also tested for reliability, but it exhibited significantly poorer results than the other combinations.
Figure 4 shows the reliability diagrams for lead times of 20, 30, 45, and 60 min with the polygon-based nowcasting system and the applied filter parameters. With a lead time of 20 min, (r, συ)5 reaches significantly more reliable forecasts than the other parameter settings in the predicted probability range of 0%–70%. In general, (r, συ)2.5 has a tendency to overestimate the probabilities, while (r, συ)10 results in underestimated probabilities. However, for larger predicted probabilities, the (r, συ)10 parameterization yields better reliabilities than (r, συ)2.5 and (r, συ)5. Similar behavior is observed with other lead times, but in general the overestimation of predicted probabilities is larger than with 20-min forecasts. With (r, συ)5 and (r, συ)2.5, the overestimation of predicted probabilities increases, especially with large probability values. This is as expected, because convective cells typically have a lifetime of less than 30 min (e.g., Byers and Braham 1949), and therefore many storms decay or undergo significant deformation during the forecast time.
Since the algorithm does not include any explicit model for storm evolution, its precision degrades at longer lead times. Nevertheless, the forecasts seem to improve in terms of reliability with (r, συ)10; the underestimation bias seen in the 20-min forecasts decreases with the 30-min forecasts, and the overall reliability seems to outperform (r, συ)5 and (r, συ)2.5 when lead times of 45 and 60 min are considered. The larger error covariance of (r, συ)10 spreads individual ensembles more than a small error covariance does, which can partially compensate for the uncertainty caused by the growth and decay of the storm. Note that the distribution of probabilities of the (r, συ)10-based prediction is concentrated on smaller values with very few predictions of large probabilities. In other words, the forecasts of (r, συ)10 exhibit lower sharpness (e.g., Jolliffe and Stephenson 2008) than the other parameterizations.
The correspondence between the predicted probabilities and observed frequencies changes when both forecast and observed cells are approximated with ellipses. Figure 5 shows that the reliabilities are improved with (r, συ)2.5 and (r, συ)5; although the method cannot reach perfect reliability values, the overall correspondence between the predicted probabilities and observed frequencies is better especially in 30-, 45-, and 60-min nowcasts. With (r, συ)10 the underestimation bias becomes slightly more prominent in 20- and 30-min nowcasts. On the other hand, (r, συ)10 reaches better reliabilities than (r, συ)5 and (r, συ)2.5, if 45- and 60-min nowcasts are considered.
The reliability diagrams are also smoother and more consistent for ellipse-based nowcasting than for polygon-based nowcasting, implying that the overall reliability diagram estimation is more trustworthy with the ellipse-based system. Furthermore, the observed frequencies increase monotonically as a function of the predicted probability; that is, an increment in the predicted probability value consistently indicates an increment in the observed frequency. A possible explanation for the better reliabilities is the simplified form of the ellipse representation, which eases the object forecasting task.
d. Accuracy of the probability forecasts
The reliability diagrams alone are insufficient to quantify how well the forecasts and the observations agree. For example, a simple forecasting system based on climatology has perfect reliability but, especially in a storm nowcasting task, would intuitively exhibit rather low accuracy. Therefore, other verification measures are needed to supplement the reliability diagrams.
A common scoring measure used to verify the accuracy of probabilistic weather forecasting algorithms is the Brier score (BS; e.g., Jolliffe and Stephenson 2003; Wilks 2011). A BS of zero indicates a perfect forecast, while the worst possible BS has the value of one. To get a better picture of how well the studied forecasting system performs in comparison to a reference system, the Brier skill score (BSS) is typically used instead of the sole BS. It is defined as

BSS = 1 − BS/BSref,     (13)

where BS and BSref are the Brier scores of the studied system and the reference system, respectively. Positive values indicate that the studied system has improved accuracy over the reference system. In the optimal case, the BSS has the value of one.
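The BS and BSS of Eq. (13) are straightforward to compute; the data below are purely illustrative:

```python
import numpy as np

def brier_score(pred_prob, observed):
    """Mean squared difference between the forecast probability and the
    binary outcome (1 = storm occurred, 0 = it did not)."""
    p = np.asarray(pred_prob, dtype=float)
    o = np.asarray(observed, dtype=float)
    return float(np.mean((p - o) ** 2))

def brier_skill_score(pred_prob, ref_prob, observed):
    """BSS = 1 - BS/BS_ref; positive values indicate improvement over
    the reference forecast, and 1 is the optimum."""
    return 1.0 - brier_score(pred_prob, observed) / brier_score(ref_prob, observed)

# Illustrative pixels: probabilistic nowcast vs. a sample-climatology reference
obs = np.array([1, 0, 0, 1, 0])
prob = np.array([0.8, 0.2, 0.1, 0.7, 0.3])
clim = np.full(obs.shape, obs.mean())        # constant climatological probability
bss = brier_skill_score(prob, clim, obs)     # > 0: beats climatology
```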
The reference forecasts in Eq. (13) can be selected in a number of ways, but typically forecasts based on sample climatology or observation persistence are applied (e.g., Jolliffe and Stephenson 2008; Wilks 2011). If observation persistence is used, then the binary observations can be regarded as probability nowcasts having values of either 0 or 1. However, the two references are very different in terms of the properties of the probability forecasts: the climatological forecast is reliable but exhibits extremely low sharpness, whereas the persistence forecast is sharp but less reliable.
In this study, the same dataset, parameterizations, and verification criteria were used for the BSS calculation as for the reliability estimation in section 4c. Tables 1–3 summarize the BSS values for different lead times and reference forecasts. With persistence forecasting as a reference, the BSS of both polygon- and ellipse-based nowcasts is on the order of 50% for all lead times and parameterizations, while with respect to the sample climatology-based reference the BSS varies approximately between 10% and 30% for the polygon-based nowcasts and between 20% and 45% for the ellipse-based nowcasts. This suggests that the proposed probabilistic nowcasting system has improved accuracy over both climatological and persistence forecasts. However, the smallest BSS values, reached with 60-min forecasts and sample climatology as a reference, may seem insignificant, as the reference forecast is only a statistical probability of occurrence calculated from the test data. This supports the previously noted fact that good nowcasting performance cannot be achieved beyond a few tens of minutes without a proper model for storm evolution.
Forecast skill scores are often computed against forecasting systems that the new system could replace (Jolliffe and Stephenson 2003). Since an objective of this paper is to compare the probabilistic nowcasting of storm objects to the conventional methods, it is reasonable to use deterministic forecasts as a reference in BSS. Like the persistence nowcast, a deterministic nowcast can be regarded as a probability forecast with possible probability values of 0 or 1. Tables 1–3 show BSS with deterministic reference forecasts that are computed using the nowcasting equations in Eq. (11) for lead times 20–60 min. In terms of BS, the probabilistic nowcasts are 24%–37% better than the deterministic nowcasts. This verifies that the probabilistic forecasts not only represent uncertainties related to the nowcast in a realistic manner but also improve the accuracy of the forecasts.
A more detailed examination of Tables 1–3 reveals that (r, συ)5 reaches the best accuracy among the studied parameterizations with both polygon-based and ellipse-based forecasts and all of the lead times, but the differences are not large; (r, συ)5 seems to have 1%–1.5% larger BSS values than (r, συ)2.5 when the polygon-based system is used, while with the ellipses the difference is approximately 1%–3%. The differences between BSS values computed with (r, συ)5 and (r, συ)10 are, however, somewhat larger—approximately 1%–6% with polygon-based forecasts and 1%–5% with ellipse-based forecasts, depending on the lead time.
5. Conclusions and future directions
This work presented a Kalman filtering–based algorithm for tracking convective storms. The algorithm applies the white noise acceleration model to produce smoothed and consistent estimates for the position and velocity of a storm. The primary motivation for applying the proposed Kalman filtering model to the convective storm tracking is its ability to predict storm occurrence in conjunction with associated uncertainties of the forecast. However, as demonstrated in section 4b, the model is also applicable to conventional deterministic storm nowcasting.
In this study, the algorithm was verified using approximately 22 000 convective storms tracked from over 14 500 composite weather radar images in Finland. The approach was tested with the clustering-based storm-tracking algorithm by applying a 35-dBZ threshold and morphological image processing for storm identification, but it is not tied to these specifics and can potentially be applied with other convective storm-tracking algorithms. Lead times of 20, 30, 45, and 60 min were analyzed using several parameterizations of the algorithm. The analysis demonstrates that, especially with 20- and 30-min lead times, the probability forecasts are consistent with observed frequencies, verifying that the proposed methodology reflects the true uncertainties of the forecasts. Moreover, in terms of the Brier skill score, the probabilistic storm occurrence forecasts have improved accuracy in comparison to the deterministic forecasts. In this paper, the accuracy of the probability forecasts exhibited an improvement of between 24% and 37% with respect to the deterministic forecasts, depending on the lead time, the storm object representation, and the parameterization of the algorithm. The results of the paper support the previously noted fact that good nowcasting performance cannot be achieved beyond a few tens of minutes without a proper model for storm evolution; this is the main limitation of the presented method. Still, given the significant socioeconomic impact of the storms, even a small improvement is important. Further optimization of the filter parameters and noise models could improve forecast reliability and skill, which will be a subject of future studies. Future studies should also explore approaches that take into account the uncertainty due to storm evolution.
An advantage of the presented model is that the noise parameters are expressed in physical quantities, which helps one choose a reasonable parameterization for the filter. In this paper, the noise parameters were selected heuristically, but if the noise behavior of the storm can be estimated objectively, then it can be directly included in the model. The Kalman filtering–based storm tracking is also a step toward adaptive tracking, since the parameters of the filter can be changed according to varying measurement conditions. For example, in the case of attenuation or beam blocking, the measurement noise covariance of the filter could be increased. In addition, different storm systems, such as mesoscale convective systems or supercell storms, can behave differently. Therefore, the noise parameters could be tuned adaptively depending on the type of the storm.
Kalman filtering is a well-established and compact way to fuse the estimation of various storm parameters in the same algorithm. The model presented in this paper includes only the centroid and velocity components of a tracked storm, but it can also be extended with other state variables, such as storm area or echo-top altitude, if their temporal behavior and relationship with other parameters can be formulated with a stochastic state-space presentation. Additionally, the model could use measurements from various instruments. As an example, lightning location data, containing information on the locations and movement of individual storms, could provide an additional contribution to the state estimation. Furthermore, forecasting of the associated severe phenomena, such as hail, lightning, and tornadoes, could be addressed better by selecting an appropriate multiparameter model.
The comments of Pertti Nurmi, Jarmo Koistinen, and the three anonymous reviewers are gratefully acknowledged. This study has been supported by the Cluster for Energy and Environment through the MMEA research program and the Academy of Finland (Grant 255718). Additionally, Pekka J. Rossi was supported by the Finnish Society of Automation and Väisälä Foundation. The contribution of Dmitri Moisseev was supported by the Academy of Finland (Grant 263333) and the Finnish Center of Excellence (Grant 272041). The participation of V. Chandrasekar in this research was supported by the National Science Foundation, and the Finnish Funding Agency for Technology and Innovation FiDiPro program.
Clustering-Based Storm-Tracking Algorithm
The storm-tracking algorithm of this paper applies cluster analysis, which classifies data into groups so that every object in each group has some properties in common. In the scope of convective storm tracking, a natural property is the spatial vicinity of the identified cells in two consecutive frames.
The tracking algorithm in this paper is founded on density-based spatial clustering of applications with noise (the DBSCAN algorithm; Ester et al. 1996), which searches for spatially dense groups in a set of multidimensional vector data in a metric space. The density is defined as the number of data points that fall in the neighborhood of a data point x. To be more precise, the algorithm considers a sphere Vε(x), centered at x, where ε is the user-defined radius of the sphere. In addition, the algorithm counts Nε(x), the number of data points lying inside Vε(x). The user also needs to define the minimum number of points that Nε(x) must exceed in order for x to be a so-called core point of a cluster. A cluster is defined as a set of points where each point belongs to the neighborhood of at least one core point. Points that do not belong to any cluster are considered noise.
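A brute-force sketch of the core-point criterion (a full DBSCAN would additionally expand clusters from the core points; the ≥ convention below is the common one and may differ slightly from the text's "exceed"):

```python
import numpy as np

def core_points(points, eps, min_pts):
    """Boolean mask of DBSCAN core points: points whose eps-neighborhood
    (including the point itself) contains at least min_pts points."""
    pts = np.asarray(points, dtype=float)
    # Pairwise Euclidean distances; fine for small illustrative data sets
    dists = np.linalg.norm(pts[:, None, :] - pts[None, :, :], axis=-1)
    return (dists <= eps).sum(axis=1) >= min_pts

points = [[0.0, 0.0], [0.5, 0.0], [0.0, 0.5],   # dense group
          [10.0, 10.0]]                          # isolated noise point
mask = core_points(points, eps=1.0, min_pts=3)
```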
Quite often data are not simple vector points but objects having more general properties, such as geographical shape representations. Therefore, Sander et al. (1998) generalized the DBSCAN algorithm for arbitrary objects. Instead of simply counting the number of objects in the neighborhood of an object, the generalized DBSCAN (GDBSCAN) algorithm introduces the concept of weighted cardinality, that is, the user-defined “weight” of objects lying in the neighborhood of an object. Moreover, the neighborhood can be defined in an arbitrary way in GDBSCAN.
The storm-tracking algorithm of this paper applies the GDBSCAN clustering algorithm to the identified storm cells in two consecutive frames. Additionally, the cells in the previous frame at time k − 1 are first moved to their predicted locations at time k. This is done by translating the cells at k − 1 according to their velocity vectors, which are estimated with the Kalman filtering approach described in section 3. This displacement is not necessary, but it improves the clustering, especially if the time interval between consecutive radar frames is more than 5 min. A similar displacement approach has been used in many other storm-tracking algorithms, for example, those by Johnson et al. (1998), Handwerker (2002), and Kyznarová and Novák (2009).
In this paper, a cell belongs to the neighborhood of another cell if the minimum distance between the cells is less than 2 km. The cardinality is defined through the total area of the cells in the neighborhood, and an area of at least 20 km2 in the neighborhood of an object is required to fulfill the criterion of a core object. For example, two cells form a cluster if each has an area of 10 km2 and they lie less than 2 km from each other.
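This weighted-cardinality criterion can be sketched as follows. Representing the cells by their areas and a precomputed matrix of minimum boundary-to-boundary distances is a simplification of this illustration:

```python
import numpy as np

def core_cells(areas_km2, min_dist_km, eps_km=2.0, min_area_km2=20.0):
    """Boolean mask of GDBSCAN core objects for storm cells.

    areas_km2:   (N,) cell areas in km^2.
    min_dist_km: (N, N) minimum boundary-to-boundary distances in km.
    A cell is a core object if the total area of the cells within
    eps_km of it (including itself) reaches min_area_km2.
    """
    areas = np.asarray(areas_km2, dtype=float)
    neighbors = np.asarray(min_dist_km, dtype=float) < eps_km
    weighted_cardinality = neighbors.astype(float) @ areas
    return weighted_cardinality >= min_area_km2

# The paper's example: two 10 km^2 cells less than 2 km apart
areas = [10.0, 10.0]
dist = [[0.0, 1.5], [1.5, 0.0]]
mask = core_cells(areas, dist)        # both cells qualify as core objects
```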
The clustering is repeated at each time step for the cells identified in the current and the previous frame. Thereafter, the resulting clusters are compared against the clusters produced at the previous time steps and the tracks are established according to definitions 1–5. The underlying idea is that overlapping clusters belong to the same storm track. Figure 6 shows a schematic example of the formation of a storm track.
Definition 1: Cluster of cells
A cluster C is a set of identified storm cells o_{j,k}, where j refers to the cell index and k to the index of the cell identification time.
Definition 2: Storm
A storm S_k consists of the cells that belong to the same cluster and are identified at the same time step k. In other words, S_k is a maximal subset of a cluster C in which all cells have the same time index k.
Definition 3: Cluster connectivity
Two clusters are connected to each other if there is a chain of overlapping clusters between them. In mathematical terms, two clusters C_a and C_b are connected to each other if there is an ordered chain of clusters C_a = C_1, C_2, …, C_n = C_b in which each pair of consecutive clusters overlaps, that is, C_i ∩ C_{i+1} ≠ ∅ for i = 1, …, n − 1. In addition, each cluster is connected to itself.
Definition 4: Storm track
Connected clusters form a storm track. Therefore, a track T is the maximal set of clusters {C_1, …, C_m} in which each cluster C_i is connected to every other cluster C_j.
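With these definitions, tracks are the connected components of the cluster overlap graph, which can be sketched with a union-find pass. The cell identifiers and helper names below are hypothetical:

```python
def build_tracks(clusters):
    """Group clusters into tracks: two clusters are connected if they
    share a cell, and a track is a connected component of clusters.

    clusters: list of sets of cell identifiers.
    Returns a sorted list of tracks, each a sorted list of cluster indices.
    """
    parent = list(range(len(clusters)))

    def find(i):
        while parent[i] != i:
            parent[i] = parent[parent[i]]   # path compression
            i = parent[i]
        return i

    def union(i, j):
        parent[find(i)] = find(j)

    # Union every pair of overlapping clusters
    for i in range(len(clusters)):
        for j in range(i + 1, len(clusters)):
            if clusters[i] & clusters[j]:
                union(i, j)

    tracks = {}
    for i in range(len(clusters)):
        tracks.setdefault(find(i), []).append(i)
    return sorted(sorted(track) for track in tracks.values())

# Clusters sharing cells chain into one track; an isolated cluster starts another
clusters = [{"o1", "o2"}, {"o2", "o3"}, {"o3", "o4"}, {"o9"}]
tracks = build_tracks(clusters)       # [[0, 1, 2], [3]]
```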
Definition 5: Splitting and merging of a storm track
As noted earlier, convective storms tend to split and merge frequently. A split occurs if two cells belong to the same cluster in the time frame k but to different clusters at the next time step k + 1. Merging is analogous to splitting; it occurs if cells belong to different clusters in the current frame k but to the same cluster in the next frame k + 1. A track is called complex if it includes splits or mergers.
Figure 6 gives an example of a track that consists of cells o1–o8 identified at time steps k0–k4. The track is formed based on overlapping clusters. For example, o2 belongs to the clusters C1 and C2, and therefore a one-to-one link is established between the cells o1 and o2. However, the objects o5 and o6 belong to the same cluster C3 at k3 but to two different clusters, C4 and C5, at k4; therefore, a split is identified. A merge would be identified analogously, with cells belonging to different clusters in the current frame but to the same cluster in the next frame. Note that a storm at time step k may have several cells. For example, o3 and o4, identified at k2, belong to the same cluster C3 and hence to the same storm according to definition 2.