Search Results
You are looking at 1–9 of 9 items for:
- Author or Editor: S. Lakshmivarahan
- Article
- Refine by Access: All Content
Abstract
Yoshikazu Sasaki developed a variational method of data assimilation, a cornerstone of modern-day analysis and prediction in meteorology. Fundamentally, he formulated data assimilation as a constrained minimization problem with equality constraints. The genesis of this idea is tracked by analyzing his education and research at the University of Tokyo in the immediate post–World War II (WWII) period. Despite austere circumstances—including limited financial support for education, poor living conditions, and a lack of educational resources—Sasaki was highly motivated and overcame these obstacles on his path to developing this innovative method of weather map analysis. The stages of his intellectual development are traced, drawing on his early publications, oral histories, and letters of reminiscence.
It has been argued that Sasaki’s unique contribution to meteorological data assimilation stems from his deterministic view of the problem—a view founded on the principles of variational mechanics. Sasaki’s approach to the problem is compared and contrasted with the stochastic view that was pioneered by Arnt Eliassen. Both of these optimal approaches are viewed in the context of the pragmatic–operational objective analysis schemes that were developed in the 1950s–1960s. Finally, current-day methods [e.g., three- and four-dimensional variational data assimilation (3DVAR and 4DVAR)] are linked to the optimal methods of Eliassen and Sasaki.
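The variational formulation the abstract traces to Sasaki survives in present-day 3DVAR. As a minimal sketch (not the paper's own formulation), the standard 3DVAR cost function J(x) = (x − xb)ᵀB⁻¹(x − xb) + (Hx − y)ᵀR⁻¹(Hx − y) has, for linear H, the closed-form minimizer below; all matrices and values here are small synthetic illustrations.

```python
import numpy as np

# Minimal 3DVAR-style analysis: minimize
#   J(x) = (x - xb)^T B^{-1} (x - xb) + (H x - y)^T R^{-1} (H x - y)
# For linear H the minimizer has the closed form
#   xa = xb + B H^T (H B H^T + R)^{-1} (y - H xb)

n, m = 4, 2                            # state and observation dimensions (illustrative)
xb = np.array([1.0, 2.0, 3.0, 4.0])    # background state
B = np.eye(n) * 0.5                    # background-error covariance (synthetic)
R = np.eye(m) * 0.1                    # observation-error covariance (synthetic)
H = np.array([[1.0, 0.0, 0.0, 0.0],
              [0.0, 0.0, 1.0, 0.0]])   # observe components 1 and 3
y = np.array([1.5, 2.5])               # observations

K = B @ H.T @ np.linalg.inv(H @ B @ H.T + R)   # gain matrix
xa = xb + K @ (y - H @ xb)                     # analysis state
```

The analysis pulls each observed component part of the way from the background toward the observation, weighted by the relative error covariances; unobserved components are untouched because B is diagonal here.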
Abstract
Short-range ensemble forecasts from the Storm and Mesoscale Ensemble Experiment (SAMEX) are examined to explore the importance of model diversity in short-range ensemble forecasting systems. Two basic techniques from multivariate data analysis are used: cluster analysis and principal component analysis. This 25-member ensemble is constructed of 36-h forecasts from four different numerical weather prediction models, including the Eta Model, the Regional Spectral Model (RSM), the Advanced Regional Prediction System (ARPS), and the Pennsylvania State University–National Center for Atmospheric Research fifth-generation Mesoscale Model (MM5). The Eta Model and RSM forecasts are initialized using the breeding of growing modes approach, the ARPS model forecasts are initialized using a scaled lagged average forecasting approach, and the MM5 forecasts are initialized using a random coherent structures approach. The MM5 forecasts also include different model physical parameterization schemes, allowing us to examine the role of intramodel physics differences in the ensemble forecasting process.
Cluster analyses of the 3-h accumulated precipitation, mean sea level pressure, convective available potential energy, 500-hPa geopotential height, and 250-hPa wind speed forecasts started at 0000 UTC 29 May 1998 indicate that the forecasts cluster largely by model, with few intermodel clusters found. This clustering occurs within the first few hours of the forecast and persists throughout the entire forecast period, even though the perturbed initial conditions from some of the models are very similar. This result further highlights the important role played by model physics in determining the resulting forecasts and the need for model diversity in short-range ensemble forecasting systems.
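The clustering-by-model result can be illustrated with hierarchical clustering of flattened forecast fields. This is a sketch only: the data below are synthetic stand-ins with a built-in systematic inter-model difference, not SAMEX forecasts, and the linkage choice (Ward) is mine, not necessarily the paper's.

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

rng = np.random.default_rng(0)

# Synthetic stand-in for forecast fields: 8 ensemble members, each a flattened
# 2D field. Members 0-3 share one "model climate", members 4-7 another.
base_a = rng.normal(0.0, 1.0, size=100)
base_b = base_a + 3.0                      # systematic inter-model difference
members = np.vstack([base_a + 0.1 * rng.normal(size=100) for _ in range(4)] +
                    [base_b + 0.1 * rng.normal(size=100) for _ in range(4)])

# Ward linkage on Euclidean distances between member fields,
# then cut the dendrogram into two clusters
Z = linkage(members, method="ward")
labels = fcluster(Z, t=2, criterion="maxclust")
```

Because the inter-model offset dwarfs the perturbation spread, the two clusters recover the two "models" exactly, mirroring the paper's finding that forecasts cluster by model rather than by initial perturbation.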
Abstract
A data assimilation strategy based on feedback control has been developed for the geophysical sciences—a strategy that uses model output to control the behavior of the dynamical system. Whereas optimal tracking through feedback control had its early history in applications to vehicle trajectories in space science, the methodology has been adapted to geophysical dynamics by forcing the trajectory of a deterministic model to follow observations in accord with observation accuracy. Fundamentally, this offline approach (in which all observations in a given assimilation window are assumed to be available) is based on Pontryagin’s minimum principle (PMP), where a least squares fit of idealized path to dynamic law follows from Hamiltonian mechanics. This utilitarian process optimally determines a forcing function that depends on the state (the feedback component) and the observations. It follows that this optimal forcing accounts for the model error. From this model error, a correction to the one-step transition matrix is constructed. The above theory and technique are illustrated using the linear Burgers’ equation, which transfers energy from the large scale to the small scale.
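The offline tracking formulation can be sketched in standard discrete-time PMP form. The symbols below (state x_k, control u_k, observations z_k, model M, observation operator H, weight matrices Q and R) are generic textbook notation, not necessarily the paper's:

```latex
% A generic discrete-time tracking problem of the kind described above:
% a least squares fit of model path to observations, with the model
% forced by a control u_k (all symbols illustrative).
\begin{aligned}
&\min_{u_0,\dots,u_{N-1}} \; J
   = \tfrac{1}{2}\sum_{k=0}^{N} (H x_k - z_k)^{\mathsf T} R^{-1} (H x_k - z_k)
   + \tfrac{1}{2}\sum_{k=0}^{N-1} u_k^{\mathsf T} Q^{-1} u_k,
 \qquad x_{k+1} = M x_k + u_k. \\[4pt]
&\text{Hamiltonian: } \mathcal{H}_k
   = \tfrac{1}{2}(H x_k - z_k)^{\mathsf T} R^{-1} (H x_k - z_k)
   + \tfrac{1}{2} u_k^{\mathsf T} Q^{-1} u_k
   + \lambda_{k+1}^{\mathsf T} (M x_k + u_k). \\[4pt]
&\text{PMP conditions: }
   \frac{\partial \mathcal{H}_k}{\partial u_k} = 0
   \;\Rightarrow\; u_k = -Q\,\lambda_{k+1},
 \qquad
   \lambda_k = M^{\mathsf T}\lambda_{k+1} + H^{\mathsf T} R^{-1}(H x_k - z_k).
\end{aligned}
```

The optimal forcing u_k depends on the costate λ, which is driven backward in time by the observation misfit; this is the feedback structure the abstract describes, with u_k playing the role of the model-error correction.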
Abstract
The goal of this paper is to provide a complete picture of the long-term behavior of Lorenz’s maximum simplification equations along with the corresponding meteorological interpretation for all initial conditions and all values of the parameter.
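Lorenz's maximum simplification reduces the barotropic vorticity equation to a quadratically coupled triad of Fourier amplitudes, structurally of the form dA/dt = aBC, dB/dt = bCA, dC/dt = cAB. The sketch below integrates a system of that triad form with RK4; the coefficients are illustrative placeholders, not the wavenumber-dependent values of Lorenz's paper.

```python
import numpy as np

# A quadratic triad of the structural form of Lorenz's maximum simplification
# equations; the coefficients a, b, c below are illustrative placeholders.
a, b, c = 1.0, -2.0, 1.0

def rhs(s):
    A, B, C = s
    return np.array([a * B * C, b * C * A, c * A * B])

def rk4_step(s, dt):
    # one classical fourth-order Runge-Kutta step
    k1 = rhs(s)
    k2 = rhs(s + 0.5 * dt * k1)
    k3 = rhs(s + 0.5 * dt * k2)
    k4 = rhs(s + dt * k3)
    return s + dt / 6.0 * (k1 + 2 * k2 + 2 * k3 + k4)

s = np.array([1.0, 0.1, 0.0])
for _ in range(1000):
    s = rk4_step(s, 0.01)
```

A useful check on the integration: for this triad form the quadratic quantity bA² − aB² is exactly conserved by the dynamics, so a good integrator should hold it nearly constant.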
Abstract
In his seminal 1962 paper, Saltzman developed a framework based on the spectral method for the analysis of the solution to the classical Rayleigh–Bénard convection problem using low-order models (LOMs), LOM (n) with n ≤ 52. By way of illustrating the power of these models, he singled out an LOM (7) and presented a very preliminary account of its numerical solution starting from one initial condition and for two values of the Rayleigh number, λ = 2 and 5. This paper provides a complete mathematical characterization of the solution of this LOM (7), herein called the Saltzman LOM (7) [S-LOM (7)]. Historically, Saltzman’s examination of the numerical solution of this low-order model contained two salient characteristics: 1) a periodic solution (in physical 3D space and time) that expands on Rayleigh’s classical study and 2) a nonperiodic solution (in the temporal space dealing with the evolution of the Fourier amplitudes) that served Lorenz in his fundamental study of chaos in the early 1960s. Interestingly, the presence of this nonperiodic solution was left unmentioned in Saltzman’s 1962 study but was explained in detail in Lorenz’s scientific biography in 1993. Both of these fundamental aspects of Saltzman’s study are fully explored in this paper and bring a sense of completeness to the work.
Abstract
An ensemble of 48-h forecasts from 23 cases during the months of July and August 2002, which was created as part of a National Oceanic and Atmospheric Administration pilot program on temperature and air quality forecasting, is evaluated using a clustering method. The ensemble forecasting system consists of 23 total forecasts from four different models: the National Centers for Environmental Prediction (NCEP) Eta Model (ETA), the NCEP Regional Spectral Model (RSM), the Rapid Update Cycle (RUC) model, and the fifth-generation Pennsylvania State University–National Center for Atmospheric Research (PSU–NCAR) Mesoscale Model (MM5). Forecasts of 2-m temperature, 850-hPa u-component wind speed, 500-hPa temperature, and 250-hPa u-component wind speed are bilinearly interpolated to a common grid, and a cluster analysis is conducted at each of the 17 output times for each of the case days using a hierarchical clustering approach.
Results from the clustering indicate that the forecasts largely cluster by model, with these intramodel clusters occurring quite often near the surface and less often at higher levels in the atmosphere. Results also indicate that model physics diversity plays a relatively larger role than initial condition diversity in producing distinct groupings of the forecasts. If the goal of ensemble forecasting is to have each model forecast represent an equally likely solution, then this goal remains distant as the model forecasts too often cluster based upon the model that produces the forecasts. Ensembles that contain both initial condition and model dynamics and physics uncertainty are recommended.
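Before clustering, the abstract notes that forecasts are bilinearly interpolated to a common grid. A minimal sketch of that step, using a synthetic field and synthetic grids (the grid coordinates and field are illustrative, not the study's):

```python
import numpy as np
from scipy.interpolate import RegularGridInterpolator

# Interpolate a forecast field from its native grid to a common verification
# grid; the field and both grids here are synthetic illustrations.
lat_src = np.linspace(30.0, 45.0, 16)      # native-grid latitudes
lon_src = np.linspace(-105.0, -90.0, 16)   # native-grid longitudes
field = np.add.outer(lat_src, 0.1 * lon_src)   # a smooth synthetic field

interp = RegularGridInterpolator((lat_src, lon_src), field, method="linear")

# Common grid: coarser, strictly inside the native grid's bounds
lat_c = np.linspace(31.0, 44.0, 8)
lon_c = np.linspace(-104.0, -91.0, 8)
pts = np.array([[la, lo] for la in lat_c for lo in lon_c])
common = interp(pts).reshape(len(lat_c), len(lon_c))
```

Because the synthetic field is affine in latitude and longitude, bilinear interpolation reproduces it exactly on the common grid, which makes this a convenient self-check.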
Abstract
An automated procedure for classifying rainfall systems (meso-α scale and larger) was developed using an operational analysis of hourly precipitation estimates from radar and rain gauge data. The development process followed two main phases: a training phase and a testing phase. First, 48 hand-selected cases were used to create a training dataset, from which a set of attributes related to morphological aspects of rainfall systems were extracted. A hierarchy of classes for rainfall systems, in which the systems are separated into general convective (heavy rain) and nonconvective (light rain) classes, was envisioned. At the next level of the classification hierarchy, convective events are divided into linear and cellular subclasses, and nonconvective events belong to the stratiform subclass. Essential attributes of precipitating systems, related to the rainfall intensity and degree of linear organization, were determined during the training phase. The attributes related to the rainfall intensity were chosen to be the parameters of the gamma probability distribution fit to observed rainfall amount frequency distributions using the generalized method of moments. Attributes related to the degree of spatial continuity of each rainfall system were acquired from correlogram analysis. Rainfall systems were categorized using hierarchical cluster analysis experiments with various combinations of these attributes. The combination of attributes that resulted in the best match between cluster analysis results and an expert classification was used as the basis for an automated classification procedure.
The development process shifted into the testing phase, where automated procedures for identifying and classifying rainfall systems were used to analyze every rainfall system in the contiguous 48 states during 2002. To allow for a feasible validation, a testing dataset was extracted from the 2002 data. The testing dataset consisted of 100 randomly selected rainfall systems larger than 40 000 km² as identified by an automated identification system. This subset was shown to be representative of the full 2002 dataset. Finally, the automated classification procedure classified the testing dataset into stratiform, linear, and cellular classes with 85% accuracy, as compared to an expert classification.
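The intensity attributes come from fitting a gamma distribution to rainfall amounts. The paper uses a generalized method of moments; the sketch below shows the plain two-moment version (for a gamma with shape k and scale θ, mean = kθ and variance = kθ², so k = mean²/var and θ = var/mean), applied to synthetic "rainfall" rather than the study's data.

```python
import numpy as np

# Plain method-of-moments fit of a gamma distribution to rainfall amounts:
#   mean = k * theta,  variance = k * theta**2
#   =>  k = mean**2 / var,  theta = var / mean
def gamma_mom(x):
    x = np.asarray(x, dtype=float)
    mean, var = x.mean(), x.var()
    shape = mean**2 / var
    scale = var / mean
    return shape, scale

rng = np.random.default_rng(1)
sample = rng.gamma(shape=2.0, scale=3.0, size=100_000)  # synthetic "rainfall"
k_hat, theta_hat = gamma_mom(sample)
```

On a large sample drawn from gamma(2, 3), the two-moment estimates land close to the true parameters; the fitted (shape, scale) pair is then what would feed the cluster analysis as the intensity attribute.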
The NOAA NWS announced at the annual meeting of the American Meteorological Society in February 2003 its intent to create an Internet-based pseudo-operational system for delivering Weather Surveillance Radar-1988 Doppler (WSR-88D) Level II data. In April 2004, the NWS deployed the Next-Generation Weather Radar (NEXRAD) Level II central collection functionality and set up a framework for distributing these data. The NWS action was the direct result of a successful joint government, university, and private sector development and test effort called the Collaborative Radar Acquisition Field Test (CRAFT) project. Project CRAFT was a multi-institutional effort among the Center for Analysis and Prediction of Storms, the University Corporation for Atmospheric Research, the University of Washington, and three NOAA organizations: the National Severe Storms Laboratory, the WSR-88D Radar Operations Center (ROC), and the National Climatic Data Center. The principal goal of CRAFT was to demonstrate the real-time compression and Internet-based transmission of Level II data from all WSR-88D radars with the vision of an affordable nationwide operational implementation. The initial test bed of six radars located in and around Oklahoma grew to include 64 WSR-88D radars nationwide before the system was adopted by the NWS for national implementation. A description of the technical aspects of the award-winning Project CRAFT is given, including data transmission, reliability, latency, compression, archival, data mining, and newly developed visualization and retrieval tools. In addition, challenges encountered in transferring this research project into operations are discussed, along with examples of uses of the data.
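The real-time compression step CRAFT demonstrated can be sketched with a general-purpose block compressor. This is an illustration only: bzip2 is assumed as the codec, and the "radar volume" below is synthetic bytes with some spatial redundancy, not actual Level II format data.

```python
import bz2
import numpy as np

# Sketch of lossless compression of a radar-like byte stream. The data are
# synthetic: reflectivity-like values quantized to bytes, with each value
# repeated to mimic spatial redundancy in a real volume scan.
rng = np.random.default_rng(2)
field = np.repeat(rng.integers(0, 80, size=50_000), 4).astype(np.uint8)
raw = field.tobytes()

compressed = bz2.compress(raw, compresslevel=9)   # compress for transmission
restored = bz2.decompress(compressed)             # receiver recovers bytes exactly

ratio = len(raw) / len(compressed)
```

Lossless round-trip is the essential property here: the receiving archive gets byte-identical Level II data, while redundancy in the volume keeps the transmitted size well below the raw size.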