Search Results

Showing 1 - 10 of 30 items for Author or Editor: Valliappa Lakshmanan
Valliappa Lakshmanan and Travis Smith

Abstract

Although storm-tracking algorithms are a key ingredient of nowcasting systems, evaluation of storm-tracking algorithms has been indirect, labor intensive, or nonspecific. A set of easily computable bulk statistics that can be used to directly evaluate the performance of tracking algorithms on specific characteristics is introduced. These statistics are used to evaluate five widely used storm-tracking algorithms on a diverse set of radar reflectivity data cases. Based on this objective evaluation, a storm-tracking algorithm is devised that performs consistently better than any of the previously suggested techniques.
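The abstract does not enumerate the statistics themselves, but bulk measures of this kind are easy to sketch. The fragment below computes two plausible ones, mean track duration and an RMS "linearity error" of each track's centroids about a best-fit constant-velocity line, purely as an illustration; the specific statistics and function names are assumptions, not the paper's.

```python
def linefit(ts, xs):
    """Least-squares fit x = a + b*t; returns (a, b)."""
    n = len(ts)
    tbar = sum(ts) / n
    xbar = sum(xs) / n
    denom = sum((t - tbar) ** 2 for t in ts) or 1.0
    b = sum((t - tbar) * (x - xbar) for t, x in zip(ts, xs)) / denom
    return xbar - b * tbar, b

def bulk_statistics(tracks):
    """tracks: list of [(t, x, y), ...] centroid sequences.
    Returns illustrative bulk measures: mean duration and mean RMS
    deviation of centroids from a constant-velocity fit."""
    durations, lin_errors = [], []
    for track in tracks:
        ts = [p[0] for p in track]
        durations.append(ts[-1] - ts[0])
        err2, npts = 0.0, 0
        for dim in (1, 2):  # x then y coordinate
            vals = [p[dim] for p in track]
            a, b = linefit(ts, vals)
            err2 += sum((v - (a + b * t)) ** 2 for t, v in zip(ts, vals))
            npts += len(ts)
        lin_errors.append((err2 / npts) ** 0.5)
    return {
        "mean_duration": sum(durations) / len(durations),
        "mean_linearity_error": sum(lin_errors) / len(lin_errors),
    }
```

A well-behaved tracker would yield long durations and small linearity errors on steadily translating storms.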

Full access
Valliappa Lakshmanan and Travis Smith

Abstract

A technique to identify storms and capture scalar features within the geographic and temporal extent of the identified storms is described. The identification technique relies on clustering grid points in an observation field to find self-similar and spatially coherent clusters that meet the traditional understanding of what storms are. From these storms, geometric, spatial, and temporal features can be extracted. These scalar features can then be data mined to answer many types of research questions in an objective, data-driven manner. This is illustrated by using the technique to answer questions of forecaster skill and lightning predictability.
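As a rough illustration of the identification step, the sketch below clusters grid points above a threshold into 4-connected components and extracts a few scalar features (area, centroid, peak value) per cluster. The paper's actual clustering finds self-similar, spatially coherent clusters and is more sophisticated than this, so treat it as a minimal stand-in.

```python
from collections import deque

def identify_storms(grid, threshold):
    """Label 4-connected clusters of grid points >= threshold and
    extract simple scalar features per cluster."""
    nrows, ncols = len(grid), len(grid[0])
    seen = [[False] * ncols for _ in range(nrows)]
    storms = []
    for i in range(nrows):
        for j in range(ncols):
            if grid[i][j] >= threshold and not seen[i][j]:
                cells, queue = [], deque([(i, j)])
                seen[i][j] = True
                while queue:  # breadth-first flood fill
                    r, c = queue.popleft()
                    cells.append((r, c))
                    for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        rr, cc = r + dr, c + dc
                        if (0 <= rr < nrows and 0 <= cc < ncols
                                and not seen[rr][cc]
                                and grid[rr][cc] >= threshold):
                            seen[rr][cc] = True
                            queue.append((rr, cc))
                storms.append({
                    "area": len(cells),
                    "centroid": (sum(r for r, _ in cells) / len(cells),
                                 sum(c for _, c in cells) / len(cells)),
                    "peak": max(grid[r][c] for r, c in cells),
                })
    return storms
```

The per-storm feature dictionaries are exactly the kind of scalar records that can then be accumulated over time and data mined.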

Full access
Valliappa Lakshmanan and John S. Kain

Abstract

Verification methods for high-resolution forecasts have been based either on filtering or on objects created by thresholding the images. The filtering methods do not easily permit the use of deformation, while identifying objects based on thresholds can be problematic. In this paper, a new approach is introduced in which the observed and forecast fields are broken down into a mixture of Gaussians, and the parameters of the Gaussian mixture model fit are examined to identify translation, rotation, and scaling errors. The advantages of this method relative to traditional filtering and object-based methods are discussed, and the resulting scores are interpreted on a standard verification dataset.
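A minimal sketch of the idea, assuming the degenerate one-component case: fit a single Gaussian to each field by moment matching and read the translation error off the difference in centroids. The function names and the moment-matching shortcut are illustrative, not the paper's estimation procedure (which fits a full mixture).

```python
def gaussian_moments(field):
    """Intensity-weighted centroid and standard deviations of a 2D
    field -- the single-Gaussian (K=1) case of a mixture fit."""
    total = mx = my = 0.0
    for r, row in enumerate(field):
        for c, v in enumerate(row):
            total += v
            mx += v * r
            my += v * c
    mx /= total
    my /= total
    sx = sy = 0.0
    for r, row in enumerate(field):
        for c, v in enumerate(row):
            sx += v * (r - mx) ** 2
            sy += v * (c - my) ** 2
    return (mx, my), ((sx / total) ** 0.5, (sy / total) ** 0.5)

def translation_error(obs, fcst):
    """Translation error as the shift between fitted centroids."""
    (ox, oy), _ = gaussian_moments(obs)
    (fx, fy), _ = gaussian_moments(fcst)
    return fx - ox, fy - oy
```

In the same spirit, a scaling error could be read off the ratio of the fitted standard deviations.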

Full access
Valliappa Lakshmanan, Jian Zhang, and Kenneth Howard

Abstract

Existing techniques of quality control of radar reflectivity data rely on local texture and vertical profiles to discriminate between precipitating echoes and nonprecipitating echoes. Nonprecipitating echoes may be due to artifacts such as anomalous propagation, ground clutter, electronic interference, sun strobe, and biological contaminants (e.g., birds, bats, and insects). The local texture of reflectivity fields suffices to remove most artifacts, except for biological echoes. Biological echoes, also called “bloom” echoes because of their circular shape and expanding size during the nighttime, have proven difficult to remove, especially in peak migration seasons of various biological species, because they can have local and vertical characteristics that are similar to those of stratiform rain or snow. In this paper, a technique is described that identifies candidate bloom echoes based on the range variance of reflectivity in areas of bloom and uses the global, rather than local, characteristic of the echo to discriminate between bloom and rain. Every range gate is assigned a probability that it corresponds to bloom using morphological (shape based) operations, and a neural network is trained using this probability as one of the input features. It is demonstrated that this technique is capable of identifying and removing echoes due to biological targets and other types of artifacts while retaining echoes that correspond to precipitation.
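The range-variance feature can be sketched simply as a sliding-window variance of gate values along a single radial. The window size and edge handling below are arbitrary choices for illustration, not values from the paper.

```python
def range_variance(radial, window=5):
    """Sliding-window variance along one radial of reflectivity gate
    values (dBZ). High gate-to-gate variance along range is one cue
    for flagging candidate bloom echo."""
    out = []
    half = window // 2
    for i in range(len(radial)):
        seg = radial[max(0, i - half): i + half + 1]  # clipped at edges
        m = sum(seg) / len(seg)
        out.append(sum((v - m) ** 2 for v in seg) / len(seg))
    return out
```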

Full access
Valliappa Lakshmanan, Kurt Hondl, and Robert Rabin

Abstract

Existing techniques for identifying, associating, and tracking storms rely on heuristics and are not transferable between different types of geospatial images. Yet, with the multitude of remote sensing instruments and the number of channels and data types increasing, it is necessary to develop a principled and generally applicable technique. In this paper, an efficient, sequential, morphological technique called the watershed transform is adapted and extended so that it can be used for identifying storms. The parameters available in the technique and the effects of these parameters are also explained.

The method is demonstrated on different types of geospatial radar and satellite images. Pointers are provided on the effective choice of parameters to handle the resolutions, data quality constraints, and dynamic ranges found in observational datasets.
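A greatly simplified version of the watershed idea, for intuition only: process pixels in order of decreasing value, attaching each to the basin of an already labeled neighbor or starting a new basin at a local maximum. The saliency, smoothing, and size parameters that the paper discusses are omitted here.

```python
def watershed_segment(grid, min_value):
    """Toy immersion-style watershed. Returns a dict mapping (row, col)
    to a basin label, and the number of basins found. Pixels below
    min_value are left unlabeled."""
    nrows, ncols = len(grid), len(grid[0])
    # Flood from the top down: highest values first.
    order = sorted(
        ((grid[r][c], r, c) for r in range(nrows) for c in range(ncols)
         if grid[r][c] >= min_value),
        reverse=True)
    labels = {}
    next_label = 0
    for _, r, c in order:
        basin = None
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            if (r + dr, c + dc) in labels:
                basin = labels[(r + dr, c + dc)]  # join existing basin
                break
        if basin is None:          # local maximum: start a new basin
            basin = next_label
            next_label += 1
        labels[(r, c)] = basin
    return labels, next_label
```

Two separated reflectivity peaks yield two basins, which is the behavior that makes the transform useful for storm identification.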

Full access
Valliappa Lakshmanan, Madison Miller, and Travis Smith

Abstract

Accumulating gridded fields over time greatly magnifies the impact of impulse noise in the individual grids. A quality control method that takes advantage of spatial and temporal coherence can reduce the impact of such noise in accumulation grids. Such a method can be implemented using the image processing techniques of hysteresis and multiple hypothesis tracking (MHT). These steps are described in this paper, and the method is applied to simulated data to quantify the improvements and to explain the effect of various parameters. Finally, the quality control technique is applied to some illustrative real-world datasets.
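The hysteresis step can be sketched as follows: values above a low threshold are kept only when they connect, through other above-low pixels, to a value above a high threshold, which suppresses isolated impulse noise. The thresholds below are placeholders, and the temporal MHT stage is not shown.

```python
from collections import deque

def hysteresis(grid, low, high):
    """Hysteresis thresholding: retain values >= low only when
    4-connected to some value >= high; everything else is zeroed."""
    nrows, ncols = len(grid), len(grid[0])
    keep = [[False] * ncols for _ in range(nrows)]
    queue = deque((r, c) for r in range(nrows) for c in range(ncols)
                  if grid[r][c] >= high)
    for r, c in queue:             # seed from the strong pixels
        keep[r][c] = True
    while queue:                   # grow through weak-but-connected pixels
        r, c = queue.popleft()
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            rr, cc = r + dr, c + dc
            if (0 <= rr < nrows and 0 <= cc < ncols and not keep[rr][cc]
                    and grid[rr][cc] >= low):
                keep[rr][cc] = True
                queue.append((rr, cc))
    return [[grid[r][c] if keep[r][c] else 0 for c in range(ncols)]
            for r in range(nrows)]
```

Isolated weak pixels, the impulse noise that would otherwise accumulate, never reach the high threshold and are removed.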

Full access
Valliappa Lakshmanan, Benjamin Herzog, and Darrel Kingfield

Abstract

Although existing algorithms for storm tracking have been designed to operate in real time, they are also commonly used to do postevent data analysis and research. Real-time algorithms cannot use information on the subsequent positions of a storm because it is not available at the time that associations between frames are made, but postevent analysis is not similarly constrained. Therefore, it should be possible to obtain better tracks for postevent analysis than those that a real-time algorithm is capable of producing. In this paper, a statistical procedure for determining storm tracks from a set of identified storm cells over time is described. It is found that this procedure results in fewer, longer-lived tracks at the potential cost of a small increase in positional error.

Full access
Yunsung Hwang, Adam J. Clark, Valliappa Lakshmanan, and Steven E. Koch

Abstract

Planning and managing commercial airplane routes to avoid thunderstorms requires very skillful and frequently updated 0–8-h forecasts of convection. The National Oceanic and Atmospheric Administration’s High-Resolution Rapid Refresh (HRRR) model is well suited for this purpose, being initialized hourly and providing explicit forecasts of convection out to 15 h. However, because of difficulties with depicting convection at the time of model initialization and shortly thereafter (i.e., during model spinup), relatively simple extrapolation techniques, on average, perform better than the HRRR at 0–2-h lead times. Thus, recently developed nowcasting techniques blend extrapolation-based forecasts with numerical weather prediction (NWP)-based forecasts, heavily weighting the extrapolation forecasts at 0–2-h lead times and transitioning emphasis to the NWP-based forecasts at the later lead times. In this study, a new approach to applying different weights to blend extrapolation and model forecasts based on intensities and forecast times is applied and tested. An image-processing method of morphing between extrapolation and model forecasts to create nowcasts is described and the skill is compared to extrapolation forecasts and forecasts from the HRRR. The new approach is called salient cross dissolve (Sal CD), which is compared to a commonly used method called linear cross dissolve (Lin CD). Examinations of forecasts and observations of the maximum altitude of echo-top heights ≥18 dBZ and measurement of forecast skill using neighborhood-based methods show that Sal CD significantly improves upon Lin CD, as well as the HRRR at 2–5-h lead times.
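Lin CD, the baseline blend, amounts to a pixelwise weighted average whose weight favors extrapolation at short lead times. The logistic weight function and its crossover/width values below are assumptions for illustration; the salient (Sal CD) morphing itself is not sketched.

```python
import math

def lead_time_weight(lead_hours, crossover=2.0, width=1.0):
    """Weight given to the extrapolation forecast: near 1 at short lead
    times, decaying toward 0 as NWP takes over. Illustrative values."""
    return 1.0 / (1.0 + math.exp((lead_hours - crossover) / width))

def linear_cross_dissolve(extrap, model, lead_hours):
    """Lin CD: pixelwise weighted average of the two forecast grids."""
    w = lead_time_weight(lead_hours)
    return [[w * e + (1.0 - w) * m for e, m in zip(erow, mrow)]
            for erow, mrow in zip(extrap, model)]
```

At the crossover lead time the two forecasts contribute equally; earlier, extrapolation dominates.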

Full access
Valliappa Lakshmanan, Christopher Karstens, John Krause, and Lin Tang

Abstract

Because weather radar data are commonly employed in automated weather applications, it is necessary to censor nonmeteorological contaminants, such as bioscatter, instrument artifacts, and ground clutter, from the data. With the operational deployment of a widespread polarimetric S-band radar network in the United States, it has become possible to fully utilize polarimetric data in the quality control (QC) process. At each range gate, a pattern vector consisting of the values of the polarimetric and Doppler moments, the local variance of some of these features, as well as 3D virtual volume features, is computed. Patterns that cannot be preclassified based on correlation coefficient ρHV, differential reflectivity Zdr, and reflectivity are presented to a neural network that was trained on historical data. The neural network and preclassifier produce a pixelwise probability of precipitation at that range gate. The range gates are then clustered into contiguous regions of reflectivity, with bimodal clustering carried out close to the radar and clustering based purely on spatial connectivity farther away from the radar. The pixelwise probabilities are averaged within each cluster, and the cluster is either retained or censored depending on whether this average probability is greater than or less than 0.5. The QC algorithm was evaluated on a set of independent cases and found to perform well, with a Heidke skill score (HSS) of about 0.8. A simple gate-by-gate classifier, consisting of three simple rules, is also introduced in this paper and can be used if the full QC method cannot be applied. The simple classifier has an HSS of about 0.6 on the independent dataset.
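The cluster-censoring step is easy to sketch in isolation: average the pixelwise probabilities within each cluster and censor whole clusters whose mean falls below 0.5. How the probabilities and cluster labels are produced (neural network, bimodal clustering) is not reproduced; `censor_clusters` and its arguments are illustrative names.

```python
def censor_clusters(reflectivity, prob, labels, missing=-999.0):
    """Retain or censor whole clusters by their mean precipitation
    probability. `labels` assigns a cluster id (or None) per gate;
    unclustered gates are censored as well."""
    sums, counts = {}, {}
    for row_p, row_l in zip(prob, labels):
        for p, l in zip(row_p, row_l):
            if l is not None:
                sums[l] = sums.get(l, 0.0) + p
                counts[l] = counts.get(l, 0) + 1
    keep = {l for l in sums if sums[l] / counts[l] >= 0.5}
    return [[z if l in keep else missing
             for z, l in zip(row_z, row_l)]
            for row_z, row_l in zip(reflectivity, labels)]
```

Deciding per cluster rather than per gate is what keeps the censoring from fragmenting real precipitation echoes.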

Full access
Valliappa Lakshmanan, John Crockett, Kenneth Sperow, Mamoudou Ba, and Lingyan Xin

Abstract

AutoNowcaster (ANC) is an automated system that nowcasts thunderstorms, including thunderstorm initiation. However, its parameters have to be tuned to regional environments, a process that is time consuming, labor intensive, and quite subjective. When the National Weather Service decided to explore using ANC in forecast operations, a faster, less labor-intensive, and objective mechanism to tune the parameters for all the forecast offices was sought. In this paper, a genetic algorithm approach to tuning ANC is described. The process consisted of choosing datasets, employing an objective forecast verification technique, and devising a fitness function. ANC was modified to create nowcasts offline using weights iteratively generated by the genetic algorithm. The weights were generated by probabilistically combining weights with good fitness, leading to better and better weights as the tuning process proceeded. The nowcasts created by ANC using the automatically determined weights are compared with the nowcasts created by ANC using weights that were the result of manual tuning. It is shown that nowcasts created using the automatically tuned weights are as skilled as the ones created through manual tuning. In addition, automated tuning can be done in a fraction of the time that it takes experts to analyze the data and tune the weights.
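The tuning loop can be sketched as a toy genetic algorithm: keep the fitter half of a population of weight vectors, breed children by uniform crossover plus a small mutation, and iterate. Population size, mutation scale, and the `fitness` callback are placeholders; in the real system the fitness would wrap ANC's offline nowcast verification.

```python
import random

def tune_weights(fitness, n_weights, pop_size=20, generations=30, seed=0):
    """Toy genetic algorithm: weight vectors with good fitness are
    probabilistically combined to breed better ones."""
    rng = random.Random(seed)
    pop = [[rng.random() for _ in range(n_weights)] for _ in range(pop_size)]
    for _ in range(generations):
        scored = sorted(pop, key=fitness, reverse=True)
        parents = scored[:pop_size // 2]          # keep the fitter half
        children = []
        while len(children) < pop_size - len(parents):
            a, b = rng.sample(parents, 2)
            child = [ai if rng.random() < 0.5 else bi
                     for ai, bi in zip(a, b)]      # uniform crossover
            k = rng.randrange(n_weights)           # mutate one weight
            child[k] = min(1.0, max(0.0, child[k] + rng.gauss(0, 0.1)))
            children.append(child)
        pop = parents + children
    return max(pop, key=fitness)
```

Because the parents survive each generation unchanged, the best weight vector never degrades as the search proceeds.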

Full access