Search Results

You are looking at 1 - 10 of 12 items for

  • Author or Editor: David M. Legler
David M. Legler

Abstract

Surface meteorological reports of wind components, wind speed, air temperature, and sea surface temperature from buoys located in equatorial and midlatitude regions are used in a simulation of random sampling to determine errors of the calculated means due to inadequate sampling. Subsampling the data with several different sample sizes leads to estimates of the accuracy of the subsampled means. The number N of random observations needed to compute mean winds with chosen accuracies of 0.5 (N0.5) and 1.0 (N1.0) m s−1, and mean air and sea surface temperatures with chosen accuracies of 0.1 (N0.1) and 0.2 (N0.2)°C, was calculated for each 5-day and 30-day period in the buoy datasets. Mean values of N for the various accuracies and datasets are given.
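
A minimal sketch of the subsampling idea described above, assuming a single buoy record held in a NumPy array (function and variable names are hypothetical, and the 95% criterion is an illustrative stand-in for the paper's error definition):

```python
import numpy as np

def required_n(record, accuracy, n_trials=500, rng=None):
    """Smallest sample size N whose random-subsample means fall within
    `accuracy` of the full-record mean in ~95% of trials (illustrative
    criterion; the paper's exact error definition may differ)."""
    rng = np.random.default_rng(rng)
    true_mean = record.mean()
    for n in range(1, record.size + 1):
        means = np.array([rng.choice(record, size=n, replace=False).mean()
                          for _ in range(n_trials)])
        if np.quantile(np.abs(means - true_mean), 0.95) <= accuracy:
            return n
    return record.size

# Hypothetical 5-day record of hourly wind speed (m/s)
wind = 6.0 + 2.0 * np.random.default_rng(0).standard_normal(120)
print(required_n(wind, accuracy=0.5))   # N0.5
print(required_n(wind, accuracy=1.0))   # N1.0
```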

A second-order polynomial relation is established between N and the variability of the data record. This relationship demonstrates that, for the same accuracy, N increases as the variability of the data record increases. The relationship is also independent of the data source. Volunteer observing ship data do not meet the recommended minimum number of observations for 0.5 m s−1 and 0.2°C accuracy at most locations. The effect of incorporating remotely sensed data is discussed.
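
A brief sketch of fitting such a second-order polynomial relation, assuming N and the record standard deviation have already been tabulated per period (the numbers below are placeholders, not values from the paper):

```python
import numpy as np

# Hypothetical per-period values: record standard deviation (m/s) and the
# N needed for 0.5 m/s accuracy, e.g. as produced by required_n() above.
sigma = np.array([1.2, 1.8, 2.3, 2.9, 3.5, 4.1])
n_05  = np.array([5,   11,  18,  28,  41,  57])

# Second-order polynomial N(sigma); coefficients returned highest power first.
coeffs = np.polyfit(sigma, n_05, deg=2)
print(np.polyval(coeffs, 2.5))  # predicted N for a record with sigma = 2.5 m/s
```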

Full access
David M. Legler
Full access
David M. Legler

The method of empirical orthogonal function analysis is applied to wind stress vectors over the tropical Pacific Ocean from 1961 through 1978. This vector analysis enables a more thorough examination of the wind components than do comparable scalar analyses. The method is presented and discussed. By partitioning the spatial variance of the data into many patterns, each modulated by a complex time series, the technique provides further insight into the variability of the tropical Pacific wind field.
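
One common way to carry out such a vector EOF analysis is to represent each wind stress vector as a complex number u + iv and take an SVD of the anomaly matrix; a minimal sketch under that assumption follows (array shapes and names are hypothetical, and this is not necessarily the exact formulation used in the paper):

```python
import numpy as np

# Hypothetical wind-stress components, shape (n_months, n_gridpoints)
taux = np.random.default_rng(1).standard_normal((216, 500))
tauy = np.random.default_rng(2).standard_normal((216, 500))

w = taux + 1j * tauy                      # complex vector field
w_anom = w - w.mean(axis=0)               # remove the time mean at each point

# SVD: rows of vh are spatial patterns, u*s are complex amplitude time series
u, s, vh = np.linalg.svd(w_anom, full_matrices=False)
explained = s**2 / np.sum(s**2)           # fraction of variance per mode
pattern1 = vh[0]                          # leading spatial pattern (complex)
amplitude1 = u[:, 0] * s[0]               # its complex time series
print(explained[:3])
```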

The trade winds in each hemisphere are strongest during the respective winter and early spring. In addition, the northeast trades are more variable and stronger than the southeast trades. The 1960s were characterized by a relatively flat interannual signal, but the 1970s were more variable and were characterized by an equatorial convergent zone in the western and middle Pacific. The El Niño signal also indicates a difference between the '60s and the '70s. The Los Niños of 1972 and 1976 were preceded by a one-year period of anomalous southeasterlies. The Los Niños of the 1960s, on the other hand, showed no such pattern. Common to all the Los Niños during the analysis period was the onset of westerlies in the middle Pacific at the beginning of each El Niño event. This period of westerlies extended for a time proportional to the relative strength of the associated El Niño. The consistency of these westerlies also is related to the relative strength of each El Niño.

Full access
I. M. Navon
and
David M. Legler

Abstract

During the last few years, new meteorological variational analysis methods have evolved that require large-scale minimization of a nonlinear objective function described in terms of discrete variables. The conjugate-gradient method was found to represent a good compromise, in convergence rate and computer memory requirements, between simpler and more complex methods of nonlinear optimization. In this study, different available conjugate-gradient algorithms are presented with the aim of assessing their use in typical large-scale minimization problems in meteorology. Computational efficiency and accuracy are the principal criteria.
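
As an illustration of this kind of minimization, a short sketch using SciPy's nonlinear conjugate-gradient routine on a generic smooth objective with an analytic gradient (the objective and problem size are placeholders, not one of the meteorological problems studied here):

```python
import numpy as np
from scipy.optimize import minimize

n = 10_000                      # placeholder problem size

def objective(x):
    # Generic smooth nonlinear objective and its gradient
    f = np.sum((x - 1.0)**2) + 0.1 * np.sum(np.sin(x)**2)
    g = 2.0 * (x - 1.0) + 0.2 * np.sin(x) * np.cos(x)
    return f, g

result = minimize(objective, x0=np.zeros(n), jac=True, method="CG",
                  options={"maxiter": 200, "gtol": 1e-6})
print(result.nit, result.fun)
```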

Four different conjugate-gradient methods, representative of up-to-date scientific software, were compared by applying them to two meteorological problems of interest, using criteria of computational economy and accuracy. Conclusions are presented on the adequacy of the different conjugate-gradient algorithms for large-scale minimization problems in different meteorological applications.

Full access
David M. Legler
and
James J. O'Brien

Abstract

A simple algorithm is developed and tested to derive a regularly spaced wind field in a limited area from simulated multi-orbit scatterometer data. The data are generated by sampling a time-varying known wind field (the 1000-mb FGGE data) with a simulated scatterometer. A simple assimilation technique is used to derive a regularly spaced wind field representation of two-day averages from the simulated data. This technique averages the generated scatterometer data in time and space and applies a low-pass filter primarily in the zonal direction. The resultant vectors were compared to the known two-day averages calculated from the FGGE data. To test the technique, synthetic noise was added to the generated data to simulate scatterometer inaccuracies in speed and direction.
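
A rough sketch of the averaging-plus-zonal-filter step under simple assumptions: samples have already been accumulated on a regular grid over the two-day window, and the low-pass filter is taken to be a running zonal mean, which may differ from the filter used in the paper (all names are hypothetical):

```python
import numpy as np

def assimilate(samples_sum, samples_count, zonal_halfwidth=3):
    """samples_sum, samples_count: (nlat, nlon) accumulators of scatterometer
    u (or v) samples over a two-day window. Returns a filtered mean field."""
    with np.errstate(invalid="ignore", divide="ignore"):
        mean = samples_sum / samples_count          # time/space average per cell
    mean = np.where(samples_count > 0, mean, 0.0)   # simple fill of empty cells

    # Low-pass filter applied primarily in the zonal (longitude) direction:
    # a running mean over 2*halfwidth+1 points along each latitude row.
    kernel = np.ones(2 * zonal_halfwidth + 1)
    kernel /= kernel.size
    filtered = np.apply_along_axis(
        lambda row: np.convolve(row, kernel, mode="same"), axis=1, arr=mean)
    return filtered
```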

Three cases were tested. In the first case, the simulated scatterometer data contained no noise or errors. The average magnitude of the difference wind field, known minus resultant, was less than the natural variability of the known wind field. In the second case, random white noise with a standard deviation of 2 m s−1 (and then 4 m s−1) about a zero mean was added to each sampled vector component, simulating the inaccuracies in speed and direction inherent in scatterometer sampling. The added noise had little effect on the resultant wind field representation. In the third case, spatially correlated noise was added to each simulated swath, simulating data with noise in both speed and direction to reflect errors due to sampling a wind field containing both synoptic and mesoscale components. The standard deviation of the spatially correlated noise was initially 2 m s−1 in each vector component. The average magnitude of the difference vectors increased slightly. In addition, when the noise was increased to 3 m s−1 in each component, the error did not increase significantly.
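
A hedged sketch of one way to generate spatially correlated noise with a target standard deviation, by smoothing white noise with a Gaussian filter and rescaling (the correlation model actually used in the paper may differ):

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def correlated_noise(shape, target_std, corr_sigma=3.0, rng=None):
    """White noise smoothed to introduce spatial correlation, then rescaled
    so its standard deviation matches target_std (e.g. 2 or 3 m/s)."""
    rng = np.random.default_rng(rng)
    noise = gaussian_filter(rng.standard_normal(shape), sigma=corr_sigma)
    return noise * (target_std / noise.std())

# Hypothetical swath of 400 along-track rows by 24 cross-track cells
swath_noise_u = correlated_noise((400, 24), target_std=2.0, rng=0)
```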

To test the technique on another time period, a final case was run with a 24-hour time window. When spatially correlated noise and random white noise, each with a standard deviation of 3 m s−1, were added to the sampled vector components, the error did not increase significantly over the noise-free case.

This assimilation technique provides representations of two-day averages on a regularly spaced 100 km grid and might therefore be applicable to large-scale ocean or atmospheric models. The results indicate the relative importance of errors introduced by applying assimilation schemes to scatterometer data. These errors appear to be more significant, and more limiting to the application of scatterometer data, than errors from scatterometer inaccuracy.

Full access
David M. Legler
,
I. M. Navon
, and
James J. O'Brien

Abstract

A variational approach is used to develop an objective analysis technique that produces monthly average 1-deg pseudostress vector fields over the Indian Ocean. A cost functional is constructed which consists of five terms, each expressing a lack of fit to prescribed conditions. The first expresses proximity to the input (first-guess) field. The second deals with closeness of fit to the climatological value for that month. The third is a measure of data roughness, and the fourth and fifth are kinematic constraints requiring agreement of the curl and divergence of the results with the curl and divergence of the climatology. Each term also has a coefficient (weight) that determines how closely the minimization fits the corresponding condition. These weights are determined by comparing results obtained with various weight combinations to an independent subjective analysis of the same dataset. The cost functional is minimized using the conjugate-gradient method.
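
A compact sketch of a weighted multi-term cost functional of this general shape, minimized by conjugate gradient; for brevity it treats a single scalar component and only three of the five terms, with simplified discrete operators and hypothetical weights (not the paper's exact formulation):

```python
import numpy as np
from scipy.optimize import minimize

def laplacian(f):
    """Simple five-point Laplacian over interior points, used as a roughness measure."""
    return (f[1:-1, 2:] + f[1:-1, :-2] + f[2:, 1:-1] + f[:-2, 1:-1]
            - 4.0 * f[1:-1, 1:-1])

def cost(x, first_guess, clim, weights, shape):
    u = x.reshape(shape)
    a, b, c = weights                      # hypothetical weights on three of the terms
    j1 = a * np.sum((u - first_guess)**2)  # proximity to the input (first-guess) field
    j2 = b * np.sum((u - clim)**2)         # closeness to the monthly climatology
    j3 = c * np.sum(laplacian(u)**2)       # data-roughness (smoothing) penalty
    # Curl and divergence constraints would add two further terms for a vector field.
    return j1 + j2 + j3

shape = (15, 20)                           # hypothetical coarse grid
first_guess = np.random.default_rng(0).standard_normal(shape)
climatology = np.zeros(shape)
res = minimize(cost, first_guess.ravel(),
               args=(first_guess, climatology, (1.0, 0.3, 0.05), shape),
               method="CG", options={"maxiter": 50})
analysis = res.x.reshape(shape)
```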

Results from various weight combinations are presented for January and July 1984, and the results are examined in terms of these selections. Quantitative and qualitative comparisons to the subjective analysis are made to find which weight combination provides the best results. The weight on the second term balances the influence of the original (first-guess) field and the climatology. The weight on the smoothing term determines how large an area is affected by deviations of the first guess from climatology. The weights on the kinematic terms are fine-tuning parameters.

Full access
William M. Putman
,
David M. Legler
, and
James J. O’Brien

Abstract

A technique is applied to seamlessly blend height-adjusted Florida State University (FSU) surface wind pseudostress with National Centers for Environmental Prediction–National Center for Atmospheric Research (NCEP–NCAR) reanalysis-based pseudostress over the Pacific Ocean. The FSU pseudostress is shown to be of higher quality in the equatorial Pacific and thus dominates the analysis in that region, while the NCEP–NCAR reanalysis-based pseudostress is used outside the equatorial region. The blending technique is based on a direct minimization approach. The functional to minimize consists of five constraints; each constraint is given a weight that determines its influence on the solution. The first two constraints are misfits for the FSU and NCEP–NCAR reanalysis datasets. A spatially dependent weighting that highlights the regional strengths of each dataset is designed for these misfit constraints. Climatological structure information is used as a weak smoothing constraint on the solution through Laplacian and kinematic (divergence and curl) constraints. The weights for the smoothing constraints are selected using a sensitivity analysis and evaluation of solution fields. The resulting 37 yr of monthly pseudostress fields are suitable for use in a variety of modeling and climate variability studies.
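
A minimal sketch of the spatially dependent weighting idea: a latitude-dependent weight that favors the FSU field near the equator and the NCEP–NCAR field elsewhere. For illustration it is shown as a direct weighted average rather than as misfit weights inside the minimization, and the Gaussian form and e-folding scale are placeholders:

```python
import numpy as np

def equatorial_weight(lat_deg, halfwidth=10.0):
    """Weight near 1 within ~halfwidth degrees of the equator, decaying toward 0
    at higher latitudes (Gaussian form chosen purely for illustration)."""
    return np.exp(-(lat_deg / halfwidth) ** 2)

lats = np.arange(-40.0, 41.0, 1.0)                 # 1-deg latitudes, 40S-40N
w_fsu = equatorial_weight(lats)[:, None]           # weight favoring FSU pseudostress
w_ncep = 1.0 - w_fsu                               # weight favoring NCEP-NCAR field

# Hypothetical gridded pseudostress fields, shape (nlat, nlon)
fsu = np.random.default_rng(3).standard_normal((81, 120))
ncep = np.random.default_rng(4).standard_normal((81, 120))
blended = w_fsu * fsu + w_ncep * ncep              # simple weighted blend
```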

The monthly mean analyses are produced for 1961 through 1997, over the domain 40°S–40°N, 125°E–70°W. NCEP–NCAR reanalysis data from 40° to 60°N are added to the minimization solution fields, and the monthly mean climatologies, based on the solution fields, are removed from the combined fields. The resulting pseudostress anomalies are filtered with an 18-month low-pass filter to focus on interannual and ENSO timescales, and a complex empirical orthogonal function (CEOF) analysis is performed on the filtered anomalies. The CEOF analysis reveals tropical and extratropical linkages, for example, a strengthening of the Aleutian low in the North Pacific coincident with the anomalous westerlies along the equator associated with El Niño events. The analysis also reveals a weakening of the Aleutian low during the winter–spring preceding the El Niño events of 1973 and 1983, and during the peak period of El Viejo, the cold phase of ENSO. A change in the nature of the tropical and extratropical linkages is observed from the warm events of the 1960s to those of the 1980s. These linkages are not found using NCEP–NCAR reanalysis data alone.
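
A sketch of the filtering-plus-CEOF step under simple assumptions: a centered 18-month running mean stands in for the low-pass filter, and the CEOF is formed by Hilbert-transforming each grid point's anomaly series before an SVD (array sizes are hypothetical and the details differ from a production analysis):

```python
import numpy as np
from scipy.signal import hilbert

# Hypothetical monthly pseudostress anomalies, shape (n_months, n_points)
anom = np.random.default_rng(5).standard_normal((444, 300))   # 1961-1997

# Crude 18-month low-pass filter: centered running mean along the time axis
kernel = np.ones(18) / 18.0
lowpass = np.apply_along_axis(
    lambda s: np.convolve(s, kernel, mode="same"), axis=0, arr=anom)

# Complexify each time series with its Hilbert transform, then take an SVD
analytic = hilbert(lowpass, axis=0)            # complex-valued analytic series
u, s, vh = np.linalg.svd(analytic - analytic.mean(axis=0), full_matrices=False)
print(s[:4]**2 / np.sum(s**2))                 # variance fraction of leading CEOFs
```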

Full access
Paul T. Beaudoin
,
David M. Legler
, and
James J. O'Brien

Abstract

This study examines ERS-1 3-day repeat orbit scatterometer wind data from January through March 1992. The study region encompasses the North Pacific from 30° to 50°N and 160°E to 130°W. The data are separated by orbit trajectory and binned to 26 km. These data are examined by direct comparison with surface European Centre for Medium-Range Weather Forecasts (ECMWF) model analyses on daily, monthly, and 3-month timescales. The scatterometer wind fields compare favorably, but distinct, nonisolated differences exist. These differences, exhibited in the scatterometer winds, include slightly stronger wind speeds, more distinct curvature, and detail on structures smaller than the ECMWF resolution. Systematic relative northward displacements of cyclonic centers, 1°–3° in latitude, in ECMWF surface winds are also indicated. The scatterometer wind retrieval algorithm (CMODFD/NSCAT MLE) demonstrates some difficulty in selecting the true wind vector; problems are generally identifiable by inspection. Complex empirical orthogonal function (EOF) analysis of the ascending and descending scatterometer wind fields reveals frequency and amplitude information about the sampled variance. The first four EOFs, for which the results suggest physically motivated phenomena, account for 50%–60% of the total variance sampled in the data. The EOF results partition the sampled variance in the ascending and descending data and suggest that the more significant EOFs depict spatiotemporal “bands” of 18–21, 8–10, and 6–8 days, reflecting the planetary wave cycle, large-scale general circulation systems, and smaller-scale storm structures, respectively. The partitioning of the variance demonstrates only limited filtering capability in identifying erroneous ERS-1 wind vectors.
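
A brief sketch of binning along-track samples at a fixed 26-km spacing, assuming along-track distances and wind speeds have already been extracted for one orbit segment (the 26-km bin size comes from the abstract; everything else is illustrative):

```python
import numpy as np

def bin_along_track(distance_km, values, bin_km=26.0):
    """Average `values` into consecutive bin_km bins along the track."""
    edges = np.arange(0.0, distance_km.max() + bin_km, bin_km)
    idx = np.digitize(distance_km, edges) - 1
    sums = np.bincount(idx, weights=values, minlength=edges.size - 1)
    counts = np.bincount(idx, minlength=edges.size - 1)
    with np.errstate(invalid="ignore"):
        return sums / counts                   # NaN where a bin has no samples

# Hypothetical along-track distances (km) and wind speeds (m/s)
d = np.sort(np.random.default_rng(6).uniform(0, 1000, 2000))
spd = 8.0 + np.random.default_rng(7).standard_normal(2000)
binned = bin_along_track(d, spd)
```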

Full access
Shawn R. Smith
,
David M. Legler
, and
Kathleen V. Verzone

Abstract

The uncertainties in the NCEP–NCAR reanalysis (NCEPR) products are not well known. Using a newly developed, high-resolution, quality controlled, surface meteorology dataset from research vessels participating in the World Ocean Circulation Experiment (WOCE), regional and global uncertainties are quantified for the NCEPR air–sea fluxes and the component fields used to create those fluxes.

For the period 1990–95, WOCE vessel and gridded NCEPR fields are matched in time and space. All in situ data are subject to data quality review to remove suspect data. Adjustment of ship observations to the reference height of the NCEPR variables, and calculation of air–sea fluxes from the in situ data are accomplished using bulk formulas that take atmospheric stability, height of the measurements, and other adjustments into consideration. The advantages of using this new set of WOCE ship observations include the ability to compare 6-h integrated fluxes (much of the ship data originate from automated observing systems recording continual measurements), and the ability to perform more exhaustive quality control on these measurements. Over 4500 6-h component (sea level pressure, air and sea temperature, winds, and specific humidity) and flux (latent, sensible, and momentum) matches are statistically evaluated to quantify uncertainties between the ship observations and the NCEPR.
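
As one concrete example of a height adjustment, a sketch of the neutral logarithmic wind profile used to bring a ship anemometer wind to a 10-m reference height; a full bulk algorithm, such as the one applied to the WOCE data, also accounts for atmospheric stability, and the roughness length here is a fixed placeholder:

```python
import numpy as np

def adjust_wind_to_10m(u_obs, z_obs, z0=1.52e-4):
    """Neutral log-profile adjustment: u(z) is proportional to ln(z/z0).
    z0 is a typical open-ocean roughness length (m), held fixed for illustration."""
    return u_obs * np.log(10.0 / z0) / np.log(z_obs / z0)

print(adjust_wind_to_10m(u_obs=12.3, z_obs=30.0))  # hypothetical anemometer at 30 m
```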

Primary results include a significant underestimation of NCEPR near-surface wind speed at all latitudes. The magnitude of the low bias increases at higher ship wind speeds and may be related to large (rms = 2.7 hPa) errors in sea level atmospheric pressure over the entire globe. The pressure biases show the NCEPR to underestimate the amplitude and/or position of both high and low pressure systems. The NCEPR slightly underestimates the momentum flux, in part due to the weaker winds. The NCEPR sensible and latent heat fluxes are substantially overestimated when compared to the WOCE ship data. Potential sources of this overestimation (e.g., the NCEPR model flux parameterization) are discussed. Using the NCEPR meteorological variables and an independent flux parameterization, the revised NCEPR sensible heat fluxes are closer to the observations, and the biases of the revised NCEPR latent heat flux change sign. Furthermore, while the revised latent heat flux values reduce the magnitude of the bias at higher wind speeds, they increase the bias at (more frequently occurring) moderate wind speeds and thus may not be suitable for many applications.

Full access
Shawn R. Smith
,
David M. Legler
,
Mylene J. Remigio
, and
James J. O’Brien

Abstract

No abstract available.

Full access