Search Results

You are looking at 11–14 of 14 items for

  • Author or Editor: S. Mark Leidner
Christina Holt, Istvan Szunyogh, Gyorgyi Gyarmati, S. Mark Leidner, and Ross N. Hoffman

Abstract

The standard statistical model of data assimilation assumes that the background and observation errors are normally distributed, and the first- and second-order statistical moments of the two distributions are known or can be accurately estimated. Because these assumptions are never satisfied completely in practice, data assimilation schemes must be robust to errors in the underlying statistical model. This paper tests simple approaches to improving the robustness of data assimilation in tropical cyclone (TC) regions.

Analysis–forecast experiments are carried out with three types of data—Tropical Cyclone Vitals (TCVitals), DOTSTAR, and QuikSCAT—that are particularly relevant for TCs and with an ensemble-based data assimilation scheme that prepares a global analysis and a limited-area analysis in a TC basin simultaneously. The results of the experiments demonstrate that significant analysis and forecast improvements can be achieved for TCs that are category 1 and higher by improving the robustness of the data assimilation scheme.
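The abstract does not spell out which "simple approaches" to robustness were tested. As a generic illustration of one common device in this area, the sketch below down-weights large innovations with a Huber-type factor in a scalar analysis update; the function name, the threshold `k`, and the scalar setting are ours, not the authors' scheme.

```python
import numpy as np

def robust_update(xb, y, sig_b, sig_o, k=1.345):
    """Scalar analysis update with Huber-type down-weighting of the
    innovation -- a generic robustness device, not the paper's scheme.
    xb, y : background and observed values; sig_b, sig_o : error std devs."""
    d = y - xb                                   # innovation
    s = np.sqrt(sig_b**2 + sig_o**2)             # innovation std dev under the model
    w = 1.0 if d == 0 else min(1.0, k * s / abs(d))  # Huber weight: shrink outliers
    gain = sig_b**2 / (sig_b**2 + sig_o**2)      # scalar Kalman gain
    return xb + gain * w * d
```

For innovations consistent with the assumed statistics the weight is 1 and the update reduces to the usual Kalman formula; a gross outlier is pulled in only partially instead of dragging the analysis with it.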

S. Mark Leidner, Thomas Nehrkorn, John Henderson, Marikate Mountain, Tom Yunck, and Ross N. Hoffman

Abstract

Global Navigation Satellite System (GNSS) radio occultations (RO) over the last 10 years have proved to be a valuable and essentially unbiased data source for operational global numerical weather prediction. However, the existing sampling coverage is too sparse in both space and time to support forecasting of severe mesoscale weather. In this study, the case study or quick observing system simulation experiment (QuickOSSE) framework is used to quantify the impact of vastly increased numbers of GNSS RO profiles on mesoscale weather analysis and forecasting. The current study focuses on a severe convective weather event that produced both a tornado and flash flooding in Oklahoma on 31 May 2013. The WRF Model is used to compute a realistic and faithful depiction of reality. This 2-km “nature run” (NR) serves as the “truth” in this study. The NR is sampled by two proposed constellations of GNSS RO receivers that would produce 250 thousand and 2.5 million profiles per day globally. These data are then assimilated using WRF and a 24-member, 18-km-resolution, physics-based ensemble Kalman filter. The data assimilation is cycled hourly and makes use of a nonlocal, excess phase observation operator for RO data. The assimilation of greatly increased numbers of RO profiles produces improved analyses, particularly of the lower-tropospheric moisture fields. The forecast results suggest positive impacts on convective initiation. Additional experiments should be conducted for different weather scenarios and with improved OSSE systems.
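The study's ensemble Kalman filter is part of a full WRF-based cycling system; as a self-contained sketch, the function below implements only the textbook perturbed-observation EnKF analysis step that such a system builds on (the function name and toy dimensions are ours, and the real system uses a nonlocal excess-phase observation operator rather than a linear `H`).

```python
import numpy as np

rng = np.random.default_rng(0)

def enkf_update(X, y, H, r):
    """Perturbed-observation EnKF analysis step (textbook form).
    X : (n, m) ensemble of m state vectors; y : (p,) observations;
    H : (p, n) linear observation operator; r : obs error variance."""
    n, m = X.shape
    A = X - X.mean(axis=1, keepdims=True)        # ensemble anomalies
    HX = H @ X
    HA = HX - HX.mean(axis=1, keepdims=True)     # anomalies in obs space
    P_hh = HA @ HA.T / (m - 1) + r * np.eye(len(y))  # innovation covariance
    P_xh = A @ HA.T / (m - 1)                    # state-obs cross covariance
    K = P_xh @ np.linalg.inv(P_hh)               # Kalman gain
    Y = y[:, None] + rng.normal(0.0, np.sqrt(r), (len(y), m))  # perturbed obs
    return X + K @ (Y - HX)                      # analysis ensemble
```

Cycling hourly, as in the study, means feeding each analysis ensemble through the forecast model to become the next background.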

David R. Stauffer, Nelson L. Seaman, Glenn K. Hunter, S. Mark Leidner, Annette Lario-Gibbs, and Saffet Tanrikulu

Abstract

This paper describes a new methodology developed to provide objective guidance for cost-effective siting of meteorological observations on the mesoscale for air quality applications. This field-coherence technique (FCT) is based on a statistical analysis of the mesoscale atmospheric structure defined by the spatial and temporal “coherence” in the meteorological fields. The coherence, as defined here, is a measure of the distance scale over which there is temporal consistency in the spatial structure within a variable field. It indicates how well a measurement taken at one location can be used to estimate the value of that field at another location at a given analysis time. The FCT postulates that the larger the field coherence is, the fewer measurement sites are needed to resolve adequately the dominant characteristics of that field.
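The abstract gives the idea of coherence but not its formula. One plausible realization, sketched below with names of our own choosing, is to correlate the time series of every pair of measurement sites and examine how that correlation falls off with separation; the paper's exact statistic may differ.

```python
import numpy as np

def pairwise_coherence(series, coords):
    """For each station pair, return (separation distance, temporal
    correlation) -- one plausible stand-in for a field-coherence measure.
    series : (n_sites, n_times) time series; coords : (n_sites, 2) positions."""
    n = series.shape[0]
    dists, corrs = [], []
    for i in range(n):
        for j in range(i + 1, n):
            dists.append(np.linalg.norm(coords[i] - coords[j]))
            corrs.append(np.corrcoef(series[i], series[j])[0, 1])
    return np.array(dists), np.array(corrs)
```

Fitting a decay such as exp(-d/L) to these pairs yields a coherence length L; in the spirit of the FCT, a field with large L can be resolved with a sparser network.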

Proof of concept was demonstrated using real data from an extensive field-program database over the San Joaquin Valley in the summer of 1990. The FCT next was applied to numerical model results for the same period, which produced similar guidance. The transferability of the methodology from real data to numerical model results having been demonstrated, the FCT then was applied in a model-based study over California’s South Coast Air Basin to contribute to the design of a new field program, the Southern California Ozone Study (SCOS97). Interpretation of the FCT results mostly corroborated a preliminary field-program design produced by the design team and based on past experience, subjective evaluation of historical datasets, and other considerations. However, the FCT results also led the design team to make several changes, which were confirmed by experts familiar with the meteorological behavior of the region and were included in the final SCOS97 field-program plan.

Robert Atlas, Ross N. Hoffman, Joseph Ardizzone, S. Mark Leidner, Juan Carlos Jusem, Deborah K. Smith, and Daniel Gombos

Abstract

The ocean surface wind mediates exchanges between the ocean and the atmosphere. These air–sea exchange processes are critical for understanding and predicting atmosphere, ocean, and wave phenomena on many time and space scales. A cross-calibrated multiplatform (CCMP) long-term data record of satellite ocean surface winds is available from 1987 to 2008 with planned extensions through 2012. A variational analysis method (VAM) is used to combine surface wind data derived from conventional and in situ sources and multiple satellites into a consistent near-global analysis at 25-km resolution, every 6 h. The input data are cross-calibrated wind speeds derived from the Special Sensor Microwave Imager (SSM/I; F08–F15), the Tropical Rainfall Measuring Mission Microwave Imager (TMI), and the Advanced Microwave Scanning Radiometer for Earth Observing System (AMSR-E), and wind vectors from SeaWinds on the NASA Quick Scatterometer (QuikSCAT) and on the second Japanese Advanced Earth Observing Satellite (ADEOS-2; i.e., the Midori-2 satellite). These are combined with ECMWF reanalyses and operational analyses by the VAM. VAM analyses and derived data are currently available for interested investigators through the Jet Propulsion Laboratory (JPL) Physical Oceanography Distributed Active Archive Center (PO.DAAC). This paper describes the methodology used to assimilate the input data along with the validation and evaluation of the derived CCMP products.
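The operational VAM involves many terms and constraints not described in the abstract. The sketch below keeps only the two that define any variational analysis, a fit to the background and a fit to the observations, in a toy low-dimensional setting; the function name, dimensions, and inputs are illustrative, not the CCMP configuration.

```python
import numpy as np
from scipy.optimize import minimize

def vam_toy(xb, B_inv, obs, H_rows, r):
    """Toy variational analysis: blend a background state xb with point
    observations by minimizing the generic two-term cost function
    J(x) = (x - xb)^T B^-1 (x - xb) + sum_i (y_i - h_i . x)^2 / r.
    Schematic only -- the real VAM is far more elaborate."""
    def cost(x):
        jb = (x - xb) @ B_inv @ (x - xb)                 # fit to background
        jo = sum((y - h @ x) ** 2 / r for h, y in zip(H_rows, obs))  # fit to obs
        return jb + jo
    return minimize(cost, xb, method="BFGS").x
```

With equal background and observation weights, a single observation of one component pulls the analysis halfway toward it and leaves unobserved components at the background, which is the behavior any such two-term cost should reproduce.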

A supplement to this article is available online: DOI: 10.1175/2010BAMS2946.2
