Search Results

You are looking at 1 - 10 of 23 items for Author or Editor: R. Seaman
R. S. Seaman

Abstract

The dependence of the root-mean-square analysis error of an observed element, its gradient, and its Laplacian upon 1) observational density, 2) observational error, and 3) spatial correlation of observational error has been assessed using optimum interpolation theory. The results have been generalized by scaling the observational separation s according to a length-scale parameter L, which corresponds to the Gaussian population spatial autocorrelation function μ(s) = exp(−s²L⁻²) of the forecast error of the observed element. The responses of different synoptic regimes (subpopulations) were discriminated according to the subpopulation length-scale parameter. The number of observations considered at a grid point was limited to either 12 or 4.
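To make the scaling concrete, the following minimal sketch (illustrative only, not the paper's code) computes the normalized optimum-interpolation analysis error at a grid point for a small one-dimensional observation network, using the Gaussian forecast-error autocorrelation μ(s) = exp(−s²L⁻²) quoted above; the variable names and the ratio eps of observational to forecast error variance are assumptions made for the example.

```python
# Minimal sketch: normalized optimum-interpolation (OI) analysis error at a
# grid point for a small 1D observation network, assuming the Gaussian
# forecast-error autocorrelation mu(s) = exp(-s^2 / L^2) from the abstract.
# Names (L, eps, obs_x, grid_x) are illustrative only.
import numpy as np

def gaussian_mu(s, L):
    """Forecast-error spatial autocorrelation, mu(s) = exp(-s^2 / L^2)."""
    return np.exp(-(s / L) ** 2)

def oi_analysis_error(grid_x, obs_x, L, eps, obs_err_corr=None):
    """Normalized RMS analysis error at grid_x.

    eps is the observation-to-forecast error variance ratio; obs_err_corr,
    if given, is a function of separation giving the spatial correlation of
    observational error (uncorrelated errors if None).
    """
    obs_x = np.asarray(obs_x, dtype=float)
    sep = np.abs(obs_x[:, None] - obs_x[None, :])   # obs-obs separations
    B = gaussian_mu(sep, L)                         # forecast-error correlations
    R = eps * (obs_err_corr(sep) if obs_err_corr else np.eye(len(obs_x)))
    c = gaussian_mu(np.abs(obs_x - grid_x), L)      # grid-to-obs correlations
    w = np.linalg.solve(B + R, c)                   # optimal interpolation weights
    var = 1.0 - c @ w                               # normalized analysis error variance
    return np.sqrt(max(var, 0.0))

# Example: four observations with spacing expressed relative to the length scale L.
print(oi_analysis_error(grid_x=0.0, obs_x=[-1.5, -0.5, 0.5, 1.5], L=1.0, eps=0.25))
```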

It has been shown 1) that an increasing spatial correlation of observational error has a different effect on the analysis errors of absolute and differential quantities, and 2) that the point of diminishing returns, below which an increasing observational density produces little improvement in analysis accuracy, is rather sensitive to the subpopulation length-scale parameter. The results highlight the necessity, for network planning, of considering the relative importance of the absolute value and the differential characteristics of an observed element, and the response of defined “extreme” synoptic regimes in addition to the gross population response.

Some of the experiments were repeated with an inverse polynomial autocorrelation function instead of the Gaussian. The results suggest that if, as indicated by some climatologically based autocorrelations, the inverse polynomial function agrees better with observed temperature data, then use of the Gaussian function may tend to underestimate the benefit of high-resolution remote soundings.
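As a point of comparison, the short sketch below prints the two candidate shapes side by side; the abstract does not give the exact inverse polynomial form, so μ(s) = 1/(1 + s²/L²) is assumed here purely to illustrate how such a function decays more slowly than the Gaussian at large separations.

```python
# Sketch comparing the Gaussian autocorrelation with one plausible
# "inverse polynomial" form, mu(s) = 1 / (1 + s^2 / L^2).  The exact
# polynomial used in the experiments is not stated, so this form is an
# assumption chosen only to show the broader tail.
import numpy as np

L = 1.0
s = np.linspace(0.0, 3.0, 7)
gaussian = np.exp(-(s / L) ** 2)
inverse_poly = 1.0 / (1.0 + (s / L) ** 2)
for si, g, p in zip(s, gaussian, inverse_poly):
    print(f"s/L = {si:4.2f}   Gaussian = {g:5.3f}   inverse polynomial = {p:5.3f}")
```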

Full access
R. S. Seaman

Abstract

No abstract available.

Full access
R. S. Seaman

Abstract

The parameters of the Barnes objective analysis scheme are often chosen on the basis of a desired frequency response, but they can also be chosen using the criterion of the theoretical root-mean-square interpolation error (E). Using the latter criterion, it is shown how the parameters of a common two-parameter Barnes implementation can be optimized for any specified irregular observational distribution. The problem is then generalized by means of design curves that enable the parameters to be chosen according to (i) average data spacing relative to the correlation coefficient function length scale of the field being analyzed, and (ii) the observational error variance relative to the variance of the true field (noise-to-signal ratio).
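For orientation, the sketch below shows one common two-pass, two-parameter form of a Barnes analysis; it is an illustrative implementation, not the one used in the paper, and the parameter names kappa (Gaussian weight scale) and gamma (second-pass convergence factor) are assumptions standing in for the two parameters whose values would be chosen to minimize E.

```python
# Minimal sketch (illustrative, not the authors' implementation) of a
# two-parameter, two-pass Barnes analysis of scattered observations.
import numpy as np

def barnes_two_pass(grid_pts, obs_pts, obs_vals, kappa, gamma):
    """Two-pass Barnes analysis onto the points in grid_pts."""
    grid_pts = np.asarray(grid_pts, float)   # (m, 2)
    obs_pts = np.asarray(obs_pts, float)     # (n, 2)
    obs_vals = np.asarray(obs_vals, float)   # (n,)

    def weights(targets, scale):
        d2 = ((targets[:, None, :] - obs_pts[None, :, :]) ** 2).sum(-1)
        w = np.exp(-d2 / scale)
        return w / w.sum(axis=1, keepdims=True)

    # First pass: smooth estimate at the grid points and at the observation sites.
    first_grid = weights(grid_pts, kappa) @ obs_vals
    first_obs = weights(obs_pts, kappa) @ obs_vals
    # Second pass: add back the observation residuals with a sharper weight (gamma < 1).
    resid = obs_vals - first_obs
    return first_grid + weights(grid_pts, gamma * kappa) @ resid

# Example with an irregular observation distribution.
obs_pts = [(0.0, 0.0), (1.2, 0.3), (0.4, 1.5), (2.0, 2.1)]
obs_vals = [10.0, 12.0, 9.5, 11.0]
print(barnes_two_pass([(1.0, 1.0)], obs_pts, obs_vals, kappa=1.0, gamma=0.3))
```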

A large set of real data was analyzed using parameters chosen on the basis of interpolation theory. The analyses were assessed by comparison against a set of withheld data. The results suggest that minimum E is a satisfactory criterion for objectively choosing the Barnes parameters when the statistical properties of the true field and of the observational error are known in advance. It is also shown that the chosen two-parameter Barnes implementation is robust, in the sense that a large region of parameter space corresponds to values of E only slightly above the minimum.

Full access
David R. Stauffer and Nelson L. Seaman

Abstract

Four-dimensional data assimilation (FDDA) schemes capable of effectively analyzing asynoptic, near-continuous data streams are especially important on the mesobeta scale for both model initialization and dynamic analysis. A multiscale nudging approach that utilizes grid nesting is investigated for the generation of complete, dynamically consistent datasets for the mesobeta scale. These datasets are suitable for input into air quality models, but can also be used for other diagnostic purposes including model initialization. A multiscale nudging strategy is used here to simulate the wind flow for two cases over the Colorado Plateau and Grand Canyon region during the winter of 1990, when a special mesobeta-scale observing system was deployed in the region to study the canyon's visibility impairment problem. The special data included Doppler sodars, profilers, rawinsondes, and surface stations. Combinations of these data and conventional mesoalpha-scale data were assimilated into a nested version of the Pennsylvania State University-National Center for Atmospheric Research Mesoscale Model to investigate the importance of scale interaction and scale separation during FDDA.
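The configuration sketch below summarizes that multiscale strategy in schematic form: the coarse grid is relaxed toward mesoalpha-scale analyses while the nested fine grid is relaxed toward the special mesobeta-scale observations. The class, field names, and coefficient values are hypothetical placeholders, not the model's actual configuration interface.

```python
# Illustrative configuration of a grid-nested, multiscale nudging strategy.
# All names and values here are assumptions for the sketch.
from dataclasses import dataclass

@dataclass
class NudgingConfig:
    grid: str          # which nest is being nudged
    target: str        # what the model state is relaxed toward
    variables: tuple   # fields to nudge
    g: float           # nudging coefficient (1/s)

multiscale_fdda = [
    NudgingConfig("coarse (mesoalpha)", "gridded synoptic analyses",
                  ("u", "v", "T"), g=3.0e-4),
    NudgingConfig("fine (mesobeta)", "sodar/profiler/rawinsonde/surface observations",
                  ("u", "v"), g=4.0e-4),
]
for cfg in multiscale_fdda:
    print(f"{cfg.grid}: nudge {cfg.variables} toward {cfg.target} (G = {cfg.g})")
```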

Mesoalpha-scale forcing was shown to be important for accurate simulation of the mesobeta-scale flow over the 48-h period of the simulations. Direct assimilation of mesoalpha-scale analyses on a finescale grid was shown to be potentially harmful to the simulation of mesobeta-scale features. Nudging to mesoalpha-scale analyses on the coarse grid enabled nudging to mesobeta-scale observations on the inner fine grid to be more effective. This grid-nesting multiscale FDDA strategy produced the most accurate simulation of the low-level wind fields. It is demonstrated that when designing an FDDA strategy, scale interactions of different flow regimes cannot be ignored, particularly for simulation periods of several days on the mesobeta scale.

Full access
D. J. Gauntlett and R. S. Seaman

Abstract

An attempt has been made to isolate some of the problems likely to be associated with the practical implementation of four-dimensional data assimilation schemes in the Southern Hemisphere. In particular, the requirement for a reference-level specification over the Southern Hemisphere, and the importance of assimilation frequency (i.e., the period between data insertions) are investigated.

The assimilation scheme used consists of three components: a “sigma” surface analysis model to “insert” data as they become available, an initialization module of the Nitta-Hovermale type to remove high-frequency inertio-gravitational oscillations, and a multi-level primitive equation model to “advect” the assimilated atmospheric state forward in time. In all experiments, real data of both the conventional and satellite-derived types are used. Verifications concentrate on the synoptic verisimilitude of the assimilation process and, where possible, the impact of various assimilation procedures on subsequent numerical prognosis.
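The loop below is a schematic, runnable sketch of that three-component cycle; the three step functions are trivial stand-ins (assumed names, not the scheme's actual routines) for the sigma-surface analysis, the Nitta-Hovermale initialization, and the primitive-equation forecast.

```python
# Schematic sketch of an insert / initialize / advect assimilation cycle.
# The step functions are stubs that only track what happened to the state.
def analyze_on_sigma_surfaces(state, obs):
    # "Insert" the data; in the real scheme, an analysis on sigma surfaces.
    return {**state, "inserted_obs": state.get("inserted_obs", 0) + len(obs)}

def nitta_hovermale_initialize(state):
    # Remove high-frequency inertia-gravity oscillations introduced by insertion.
    return {**state, "initialized": True}

def integrate_pe_model(state, hours):
    # "Advect" the assimilated state forward with the primitive-equation model.
    return {**state, "valid_hour": state.get("valid_hour", 0) + hours}

def assimilation_cycle(state, data_batches, insertion_period_hours=6):
    for obs in data_batches:                       # one batch per insertion time
        state = analyze_on_sigma_surfaces(state, obs)
        state = nitta_hovermale_initialize(state)
        state = integrate_pe_model(state, hours=insertion_period_hours)
    return state

print(assimilation_cycle({}, data_batches=[["sonde", "satem"], ["sonde"]]))
```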

Results underline the critical importance of reference-level pressure in the scheme evaluated. There is also some suggestion of improved performance when the assimilation frequency is increased.

Full access
R. Seaman, on behalf of the local organizing committee
Full access
K. Puri, W. Bourke, and R. Seaman

Abstract

A five-day data assimilation experiment is described in which linear normal mode initialization is used to suppress the transient gravity waves. The initialization is performed on increments to the model fields resulting from the insertion of data. The scheme is well behaved and is found to produce analyses which are similar to those produced by a scheme in which the more usual nonlinear normal mode initialization is used. The incremental linear scheme, however, is found to maintain a much stronger meridional circulation in the tropics than the nonlinear scheme. An examination of the noise characteristics shows that the incremental linear scheme allows a higher level of transient gravity wave activity.
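A toy sketch of the incremental idea follows: initialization is applied to the increment (analysis minus model state) rather than to the full fields. The orthogonal "mode" basis and gravity-mode mask below are placeholders for illustration; the actual scheme uses the model's linear normal modes.

```python
# Toy sketch of incremental linear normal mode initialization: the gravity-mode
# projection of the analysis increment is suppressed before the increment is
# added back to the model state.  The identity basis here is a placeholder.
import numpy as np

def incremental_linear_nmi(model_state, analysis, modes, is_gravity_mode):
    """Remove the gravity-mode projection of the analysis increment."""
    increment = analysis - model_state
    coeffs = modes.T @ increment          # project the increment onto the modes
    coeffs[is_gravity_mode] = 0.0         # suppress transient gravity modes
    return model_state + modes @ coeffs   # add back the filtered increment

# Example with a 4-dimensional state and an identity "mode" basis.
modes = np.eye(4)
is_gravity = np.array([False, False, True, True])
xb = np.array([1.0, 2.0, 3.0, 4.0])      # model state before insertion
xa = np.array([1.5, 1.8, 3.4, 3.9])      # state after data insertion
print(incremental_linear_nmi(xb, xa, modes, is_gravity))
```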

Full access
W. Bourke, K. Puri, and R. Seaman

Abstract

The quality of numerical weather prediction available for the Southern Hemisphere from the FGGE data base has been examined. The Australian Numerical Meteorology Research Centre (ANMRC) spectral prediction model has been initialized with analyses produced with the operational system of the Australian Bureau of Meteorology, and experimentally from the ANMRC assimilation scheme. The predictions to 48 hours from the assimilation-based analyses were at least comparable with operational levels of performance, even though the latter require substantial manual interaction.

The impact upon predictions of TIROS-N retrievals and drifting buoy surface pressures was assessed by omitting these observing systems in turn from the assimilations. Both systems impact positively upon predictions. The effect of the buoys is greatest at sea level. The effect of the retrievals is evident both at sea level and aloft.

Full access
G. A. Mills and R. S. Seaman

Abstract

A new limited-area data assimilation system has been developed in the BMRC for operational use by the Australian Bureau of Meteorology. The system analyzes deviations from a primitive-equation model forecast, using two-dimensional univariate statistical interpolation (SI) to analyze mass data and three-dimensional univariate SI to analyze wind data. The mass and wind increment analyses may mutually influence each other through variational techniques.

Analysis increments are vertically interpolated to prognosis model sigma surfaces, added to the forecast variables, and the model integrated forward to the next analysis time. This ongoing analysis-forecast cycle is now being implemented operationally.
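The single-column sketch below illustrates the increment step just described: analysis increments on pressure levels are vertically interpolated to the prognosis model's sigma surfaces and added to the forecast before the next integration. The column layout and the values are assumptions for the example, not the BMRC system's code.

```python
# Minimal single-column sketch: interpolate pressure-level analysis increments
# to sigma surfaces and add them to the model forecast.
import numpy as np

def add_increments_on_sigma(forecast_sigma, sigma_levels, p_surface,
                            increment_p, p_levels):
    """Interpolate pressure-level increments to sigma surfaces and add them."""
    p_on_sigma = np.asarray(sigma_levels) * p_surface     # sigma -> pressure (hPa)
    incr_on_sigma = np.interp(p_on_sigma, p_levels, increment_p)
    return np.asarray(forecast_sigma) + incr_on_sigma

# Example: three sigma levels, temperature increments given on pressure levels.
print(add_increments_on_sigma(forecast_sigma=[285.0, 270.0, 250.0],
                              sigma_levels=[0.9, 0.7, 0.5],
                              p_surface=1000.0,
                              increment_p=[0.5, 1.0, -0.5, 0.0],
                              p_levels=[500.0, 700.0, 850.0, 1000.0]))
```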

This paper describes in detail the analysis methodology, and presents results from a 17-day trial period. The analyses are compared with operationally prepared analyses for the same period individually, as means, and by data fitting statistics. It is shown that the assimilated analyses have stronger jet streams and greatly improved detail in the moisture analyses. It is also shown that vertical motion patterns in the guess fields are preserved through the analysis initialization phase of the assimilation cycle, and that these vertical motion fields correlate well with the areas of cloud seen in satellite imagery.

Prognoses from this trial period show a much more rapid spinup of forecast rainfall rate than did a series of control forecasts based on operational analyses; both mean rainfall for the 17-day period and individual cases are presented to demonstrate the improved skill of forecasts from the assimilated analyses. Objective verification of mass-field forecasts showed considerable sensitivity of the forecasts to the particular set of bogus mean sea level pressure data used in the analysis. However, preliminary verification statistics from the first 15 days of operational parallel running showed that the assimilation system produced forecasts of skill similar to the operational forecasts of MSLP at 24 hours, greater skill at the upper levels, and greater skill at all levels for the 36-hour forecast.

Full access
David R. Stauffer and Nelson L. Seaman

Abstract

A four-dimensional data assimilation (FDDA) scheme based on Newtonian relaxation or “nudging” is tested using standard rawinsonde data in the Penn State/NCAR limited-area mesoscale model. It is imperative that we better understand these FDDA-generated datasets, which are widely used for model initialization and diagnostic analysis. The main hypothesis to be tested is that use of coarse-resolution rawinsonde observations throughout a model integration, rather than at only the initial time, can limit large-scale model error growth (amplitude and phase errors) while the model generates realistic mesoscale structures not resolved by the data.
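A minimal sketch of the relaxation term follows, assuming a single model variable and an explicit time step; the nudging coefficient G, weight w, and time step are illustrative values, not those of the Penn State/NCAR model.

```python
# Minimal sketch of Newtonian relaxation ("nudging"): an extra tendency term
# relaxes the model variable toward an observed or analyzed value.
def nudged_step(x, x_target, physics_tendency, G, w, dt):
    """One explicit time step of dx/dt = F(x) + G * w * (x_target - x)."""
    return x + dt * (physics_tendency + G * w * (x_target - x))

# Example: relax a wind component toward a 12-hourly analysis value.
u = 10.0                   # model value (m/s)
u_analysis = 12.0          # analyzed/observed value (m/s)
print(nudged_step(u, u_analysis, physics_tendency=0.0, G=3.0e-4, w=1.0, dt=120.0))
```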

The main objective of this study is to determine what assimilation strategies and what meteorological fields (mass, wind, or both) have the greatest positive impact via FDDA on the numerical simulations for two midlatitude, real-data cases using the full-physics version of a limited-area model. Seven experiments are performed for each case: one control experiment (no nudging), five experiments which nudge the model solution toward analyses of observations, and a seventh experiment in which the actual rawinsonde observations are assimilated directly into the model. Subjective and statistical evaluations of the results include verification of the primitive variable fields, plus a detailed precipitation verification, which is especially valuable since rainfall results from many complex physical processes and is usually characterized by small-scale variability, making it much more difficult to simulate accurately than the other variables.

The results show that the assimilation of both wind and thermal data throughout the model atmosphere had a consistently positive impact on the synoptic-scale and mesoscale mass and wind fields for both cases and for the precipitation simulations in the case dominated by large-scale forcing. However, in the other case, for which small-scale convection was the dominant precipitation mechanism, the FDDA system using only rawinsonde data showed only a minor improvement in the rainfall. This may be attributed to 1) the fact that the time scales of small convective systems are less than 12 h, the temporal resolution of the data used for FDDA, and 2) the possibility that assimilation of 12-hourly temperature data near the surface may adversely affect the model's diurnal cycle and low-level stability, which are very important for convection.

Other results show that nudging vorticity or the rawinsonde-based mixing ratio analyses tended to seriously degrade the precipitation simulations for both cases and should be avoided. The transfer of information on the mesoscale from the wind (mass) fields to the mass (wind) fields was found to be significant: for shallow forcing (small equivalent depth), the winds were shown to adjust to the mass fields, while for large-scale forcing through the depth of the troposphere (large equivalent depth), wind data were generally more effective than mass data. The most accurate mass and wind fields in both cases, however, were produced by assimilating both wind and temperature information. Nudging the model's wind and temperature fields directly to the rawinsonde observations generally produced results comparable to nudging to the gridded analyses of these data.

Full access