Search Results

You are looking at 21–30 of 35 items for

  • Author or Editor: John M. Lewis
  • Refine by Access: Content accessible to me
Rodger A. Brown
and
John M. Lewis

In this historical paper, we trace the scientific- and engineering-based steps at the National Severe Storms Laboratory (NSSL) and in the larger weather radar community that led to the development of NSSL's first 10-cm-wavelength pulsed Doppler radar. This radar was the prototype for the current Next Generation Weather Radar (NEXRAD), or Weather Surveillance Radar-1988 Doppler (WSR-88D), network.

We track events, both political and scientific, that led to the establishment of NSSL in 1964. The vision of NSSL's first director, Edwin Kessler, is reconstructed through access to historical documents and oral histories. This vision included the development of Doppler radar, where research was to be meshed with the operational needs of the U.S. Weather Bureau and its successor—the National Weather Service.

Realization of the vision came through steps that were often fitful, complicated by personnel changes and persistent financial concerns. The historical research indicates that 1) the engineering prowess and mentorship of Roger Lhermitte was at the heart of Doppler radar development at NSSL; 2) key decisions by Kessler in the wake of Lhermitte's sudden departure in 1967 proved crucial to the ultimate success of the project; 3) research results indicated that Doppler velocity signatures of mesocyclones are precursors of damaging thunderstorms and tornadoes; and 4) results from field testing of the Doppler-derived products during the 1977–79 Joint Doppler Operational Project—especially the noticeable increase in the verification of tornado warnings and an associated marked decrease in false alarms—led to the government decision to establish the NEXRAD network.

Full access
JOHN M. LEWIS
and
CHARLES B. MOORE
Full access
John M. Lewis
and
Charles B. Moore

During the spring of 1924, U.S. Weather Bureau meteorologist LeRoy Meisinger conducted a series of experiments with a free balloon to determine the trajectories of air around extratropical cyclones. The 10th flight in the series ended with a crash of the balloon over central Illinois. Both Meisinger and the pilot, Army Air Service Lt. James Neely, were killed.

An effort has been made to reconstruct this accident using information from a review article by early twentieth-century meteorologist Vincent Jakl and newspaper accounts of the accident. The principal results of the study follow.

  1. Meisinger's balloon was caught in the downdraft of a newly developed thunderstorm over the Bement, Illinois, area on the evening of 2 June;

  2. a hard landing took place in a cornfield just north of Bement, and loss of ballast at the hard-landing site was sufficient to cause the balloon to rise again; and

  3. after rebounding from the ground, the balloon with the two aeronauts aboard was struck by lightning. A fire resulted that burned through the netting and led to a crash four miles northeast of the hard-landing site.

Full access
John M. Lewis
and
Robert A. Maddox

In an effort to encourage college students to consider careers in scientific research, NOAA's National Severe Storms Laboratory has instituted a Summer Employment Program. The program is centered around a scientific mentorship experience that matches each student with a laboratory scientist. During the nominal 12 weeks of the program, the scientist leads and directs a research project that is designed to be commensurate with the student's background. Along with the research experience, there is an educational component that encompasses both classroom work and experimentation. Additionally, students are introduced to a variety of research efforts in the laboratory through a continuing series of guest lectures by lab scientists.

The program has operated in 1987, 1989, and 1990, and has included 17 students, 12 of whom have come from under-represented groups in our society. We report on the evolution of the program and scrutinize the results after these three years of effort.

Full access
John M. Lewis
and
S. Lakshmivarahan

Abstract

A single-day meeting between two theoretical meteorologists took place in 1961 at the Travelers Research Center (TRC) in Hartford, Connecticut. The two scientists were Barry Saltzman and Edward Lorenz, former protégés of V. P. Starr at MIT. Several years before this meeting, Lorenz discovered the following profound result: extended-range weather forecasting was not feasible in the presence of slight errors in initial conditions. The model used was the geostrophic form of a two-level baroclinic model with twelve spectral variables. These results were presented a year earlier at the First Symposium on Numerical Weather Prediction (NWP) in Tokyo, Japan, and met with some skepticism from the NWP elite, dynamical meteorologists, and pioneers in operational NWP. Lorenz held faint hope that Saltzman's recently developed model of Rayleigh–Bénard convection would reproduce the profound result he had found earlier. One of the numerical experiments executed that eventful day with Saltzman's 7-mode truncated spectral model produced an unexpected result: inability of the model's 7 variables to settle down and approach a steady state. This occurred when the key parameter, the Rayleigh number, assumed an especially large value, one associated with turbulent convection. Further experimentation with this case delivered the sought-after result that Lorenz had found earlier, now convincingly obtained with a simpler model. It built the bridge to chaos theory. The pathway to this exceptional result is explored by revisiting Saltzman's and Lorenz's mentorship under V. P. Starr, the authors' interview with Lorenz in 2002 that complements information in Lorenz's scientific autobiography, and the authors' published perspective on Saltzman's 7-mode model.
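Lorenz later distilled the dominant amplitudes of Saltzman's convection model into his celebrated three-variable system, where the sensitive dependence on initial conditions described above is easy to demonstrate. The following is an illustrative sketch only; the parameter values σ = 10, ρ = 28, β = 8/3 are Lorenz's 1963 choices and are not drawn from the abstract above:

```python
import numpy as np

# Classical Lorenz (1963) parameters, obtained from Saltzman's convection modes
SIGMA, RHO, BETA = 10.0, 28.0, 8.0 / 3.0

def deriv(s):
    """Right-hand side of the three-variable Lorenz system."""
    x, y, z = s
    return np.array([SIGMA * (y - x), x * (RHO - z) - y, x * y - BETA * z])

def rk4(s, dt):
    """One fourth-order Runge-Kutta step."""
    k1 = deriv(s)
    k2 = deriv(s + 0.5 * dt * k1)
    k3 = deriv(s + 0.5 * dt * k2)
    k4 = deriv(s + dt * k3)
    return s + dt / 6.0 * (k1 + 2 * k2 + 2 * k3 + k4)

a = np.array([1.0, 1.0, 1.0])
b = a + np.array([1e-8, 0.0, 0.0])   # slightly perturbed initial condition

maxsep = 0.0
for _ in range(8000):                # integrate 40 time units with dt = 0.005
    a, b = rk4(a, 0.005), rk4(b, 0.005)
    maxsep = max(maxsep, np.linalg.norm(a - b))

assert maxsep > 1.0                  # the tiny initial error has grown large
```

The solutions remain bounded on the attractor yet never settle to a steady state, which is exactly the behavior the 7-mode experiments first exposed.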

Open access
S. Lakshmivarahan
,
John M. Lewis
, and
Junjun Hu

Abstract

In Saltzman’s seminal paper from 1962, the author developed a framework based on the spectral method for the analysis of the solution to the classical Rayleigh–Bénard convection problem using low-order models (LOMs), LOM (n) with n ≤ 52. By way of illustrating the power of these models, he singled out an LOM (7) and presented a very preliminary account of its numerical solution starting from one initial condition and for two values of the Rayleigh number, λ = 2 and 5. This paper provides a complete mathematical characterization of the solution of this LOM (7), herein called the Saltzman LOM (7) [S-LOM (7)]. Historically, Saltzman’s examination of the numerical solution of this low-order model contained two salient characteristics: 1) a periodic solution (in physical 3D space and time) that expands on Rayleigh’s classical study and 2) a nonperiodic solution (in the temporal space dealing with the evolution of Fourier amplitude) that served Lorenz in his fundamental study of chaos in the early 1960s. Interestingly, the presence of this nonperiodic solution was left unmentioned in Saltzman’s study in 1962 but explained in detail in Lorenz’s scientific biography in 1993. Both of these fundamental aspects of Saltzman’s study are fully explored in this paper and bring a sense of completeness to the work.

Full access
S. Lakshmivarahan
,
John M. Lewis
, and
Junjun Hu

Abstract

Over the decades, the role of observations in building and/or improving the fidelity of a model to a phenomenon has been well documented in the meteorological literature. More recently, adaptive/targeted observations have been routinely used to improve the quality of the analysis resulting from the fusion of data with models in a data assimilation scheme and the subsequent forecast. In this paper our goal is to develop an offline (preprocessing) diagnostic strategy for placing observations with the singular aim of reducing the forecast error/innovation in the context of classical 4D-Var. It is well known that the shape of the cost functional as measured by its gradient (also called the adjoint gradient or sensitivity) in the control (initial condition and model parameters) space determines the marching of the control iterates toward a local minimum. These iterates can become marooned in regions of control space where the gradient is small. An open question is how to avoid these “flat” regions by bounding the norm of the gradient away from zero. We answer this question in two steps. We derive, for the first time, a linear transformation defined by a symmetric positive semidefinite (SPSD) Gramian G = F̄^T F̄ that directly relates the control error to the adjoint gradient. It is then shown that by placing observations where the square of the Frobenius norm of F̄ (which is also the sum of the eigenvalues of G) is a maximum, we can indeed bound the norm of the adjoint gradient away from zero.
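The identity the placement criterion rests on—that the sum of the eigenvalues of G = F̄^T F̄ equals the squared Frobenius norm of F̄—can be checked numerically. The sketch below uses a random stand-in matrix, not any quantity from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)
F = rng.standard_normal((6, 4))   # random stand-in for the sensitivity matrix F-bar

G = F.T @ F                       # symmetric positive semidefinite Gramian
eig = np.linalg.eigvalsh(G)

assert np.all(eig >= -1e-12)      # SPSD: eigenvalues nonnegative (to round-off)
assert np.isclose(eig.sum(), np.linalg.norm(F, "fro") ** 2)  # tr(G) = ||F||_F^2
```

Because the eigenvalue sum is just the trace of G, maximizing it over candidate observation sites never requires a full eigendecomposition.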

Free access
John M. Lewis
,
Christopher M. Hayden
, and
Anthony J. Schreiner

Abstract

Comparisons between geopotential analyses derived from rawinsondes (RAOB) and the VISSR Atmospheric Sounder (VAS) generally exhibit differences that are ultimately related to the horizontal density and placement of the respective observations and the vertical resolution inherent in the instruments. In order to overcome some of the inconsistencies that appear, two strategies have been developed which allow the analyses to communicate through the derived variable, geostrophic potential vorticity. The first incorporates the statistics of RAOB derived potential vorticity into the VAS vorticity analysis. This is accomplished by making a least-squares adjustment to VAS while constraining it to have first and second moments identical to the RAOB analysis. The other approach makes mutual least-squares adjustments to RAOB and VAS vorticity analyses subject to the dynamic constraint that forecast and hindcast of potential vorticity to the time midway between analyses are equal. The forecast and hindcast are made from a two-parameter baroclinic model. In both procedures, the heights are recovered from adjusted vorticities by inverting the elliptic operators that relate height to vorticity.

Data from the GOES-East satellite at 1430 GMT 6 March 1982 are used along with rawinsonde data at 1200 GMT to test the schemes. The statistical adjustment approach makes synoptically meaningful adjustments to the VAS analysis over the Gulf of Mexico and Gulf coast region, but fails to correct the obvious discrepancies over the continental United States. The dynamic scheme succeeds in making meaningful adjustments over both the Gulf of Mexico and the continent which result in improved vertical motion fields.
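The first adjustment described above constrains the VAS analysis to share the first and second moments of the RAOB-derived potential vorticity. As an illustrative sketch only—a simple linear rescaling on synthetic samples, not the authors' constrained least-squares procedure on gridded analyses—such a moment-matching adjustment looks like:

```python
import numpy as np

rng = np.random.default_rng(1)
raob = rng.normal(5.0, 2.0, 500)   # stand-in RAOB-derived vorticity sample
vas = rng.normal(4.0, 3.0, 500)    # stand-in VAS-derived vorticity sample

# Rescale and shift the VAS values so their first two moments
# match those of the RAOB analysis
adjusted = (vas - vas.mean()) * (raob.std() / vas.std()) + raob.mean()

assert np.isclose(adjusted.mean(), raob.mean())   # first moments agree
assert np.isclose(adjusted.std(), raob.std())     # second moments agree
```

A linear map of this form is the minimal-change transformation that matches both moments, which is why a least-squares adjustment under those two constraints reduces to it in the scalar case.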

Full access
Qingfu Liu
,
John M. Lewis
, and
Jeanne M. Schneider

Abstract

The evolution of the mean characteristics of the marine boundary layer during cold-air outbreaks can be described with an integrated or slab model. In order to assess the practical applicability of this type of model to flows over the Gulf of Mexico, we use the observations collected during the Gulf of Mexico Experiment (GUFMEX) by an instrumented National Oceanic and Atmospheric Administration (NOAA) P-3 aircraft and a Cross-chain Loran Atmospheric Sounding System (CLASS) onboard the U.S. Coast Guard vessel Salvia. The numerical results show that the model successfully reproduced the changes in mean characteristics of momentum, moisture, and temperature under unstable conditions. The largest differences between the predictions and measurements are 0.8°C for the potential temperature, 0.15 g kg−1 for the specific humidity, 47 m for the mixed-layer height, and 1.5 m s−1 for the horizontal velocity components. A sensitivity analysis shows that the modeled mixed-layer height is slightly sensitive to changes in the specified sea surface temperature, while the other mean characteristics are relatively insensitive to the input parameters. Based upon the results of this single case study, the slab model appears to be a promising approach to account for the moistening and heating processes at the air-sea interface during cold-air outbreaks over the Gulf of Mexico.

Full access
John M. Lewis
,
Robert A. Maddox
, and
Charlie A. Crisp

The career of severe storm forecaster and teacher Colonel Robert Miller (1920–98) is historically reviewed and evaluated. His pathway to the position of recognized authority in severe storm forecasting is examined in light of his early education at Occidental College, his experiences as a weather officer in the Pacific Theatre during World War II (WWII), and his part in the bold and successful tornado forecast at Tinker Air Force Base in 1948.

We pay particular attention to Miller's development of a three-dimensional view of the severe storm environment in the precomputer age of the late 1940s—a viewpoint that remains central to current operational practice. This conceptual view led Miller and commander Ernest Fawbush to establish empirical criteria/rules that became the foundation of operational prediction at the military's Severe Weather Warning Center (SWWC). The success at the SWWC placed pressure on its civilian counterpart, the Severe Weather Unit (SWU) [later renamed the Severe Local Storms (SELS) unit] of the U.S. Weather Bureau. As part of our historical study, we explore and examine the circumstances that led to the spirit of competitiveness between these groups.

Finally, Miller's approach to forecaster training is discussed by reliance on reminiscences from his protégés. In the epilogue, we grapple with important issues related to forecaster education and training in light of Miller's philosophy.

Full access