Search Results

You are looking at 1 - 10 of 20 items for

  • Author or Editor: James A. Hansen
James A. Hansen

Abstract

Accurate forecasts require accurate initial conditions. For systems of interest, even given a perfect model and an infinitely long time series of observations, it is impossible to determine a system's exact initial state. This motivates a probabilistic approach to both state estimation and forecasting. Two approaches to probabilistic state estimation, the ensemble Kalman filter and a probabilistic approach to 4DVAR, are compared in the perfect model framework using a two-dimensional chaotic map. Probabilistic forecasts are fed back into the probabilistic state estimation routines in the form of background weighting information. It is found that both approaches are capable of producing correct probabilistic forecasts when a perfect model is in hand, but the probabilistic approach to 4DVAR appears to be the less sensitive of the two to nonlinearities.
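
A minimal sketch of the ensemble Kalman filter analysis step on a two-dimensional chaotic map (the Hénon map stands in for the paper's map, which is not named in the abstract; all function and variable names are illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)

def henon(x, a=1.4, b=0.3):
    """One step of the Henon map, a stand-in 2D chaotic system."""
    return np.array([1.0 - a * x[0]**2 + x[1], b * x[0]])

def enkf_analysis(ensemble, y_obs, H, R, rng):
    """Perturbed-observation EnKF update; ensemble has shape (m, 2)."""
    m = ensemble.shape[0]
    X = ensemble - ensemble.mean(axis=0)          # ensemble anomalies
    P = X.T @ X / (m - 1)                         # sample covariance
    K = P @ H.T @ np.linalg.inv(H @ P @ H.T + R)  # Kalman gain
    # each member assimilates an independently perturbed observation
    y_pert = y_obs + rng.multivariate_normal(np.zeros(len(y_obs)), R, size=m)
    return ensemble + (y_pert - ensemble @ H.T) @ K.T

# truth spun onto the attractor, observations of the first component
truth = np.array([0.1, 0.1])
for _ in range(100):
    truth = henon(truth)
H = np.array([[1.0, 0.0]])
R = np.array([[0.01]])
ens = truth + 0.1 * rng.standard_normal((50, 2))
for _ in range(10):
    ens = np.array([henon(x) for x in ens])       # forecast step
    truth = henon(truth)
    y = H @ truth + rng.multivariate_normal([0.0], R)
    ens = enkf_analysis(ens, y, H, R, rng)        # analysis step
```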

When only imperfect models are available (i.e., always), one does not have access to the distribution that produces truth, and it is therefore impossible to produce a correct probabilistic forecast. A multimodel approach to ensemble forecasting provides an opportunity to generate ensembles that systematically bound the true system state. Results suggest that a beneficial approach to ensemble construction is to produce multimodel ensemble members that lie on their respective model attractors, and to select model attractors that systematically bound the system attractor. The inclusion of multimodel uncertainty information in ensemble Kalman filter–like approaches allows ensemble members to be drawn off their respective model attractors, while the dynamical constraints intrinsic to probabilistic 4DVAR enable the approach to ignore ensemble spread due to model differences and to produce analyses that remain on their respective model attractors. Rather than implying that probabilistic 4DVAR is the preferred technique for multimodel data assimilation, these results suggest that it is best to ignore multimodel information during data assimilation, either implicitly (e.g., probabilistic 4DVAR) or explicitly (e.g., the “poor man's ensemble”).

Full access
James A. Hansen and Cecile Penland

Abstract

The delicate (and computationally expensive) nature of stochastic numerical modeling naturally leads one to look for efficient and/or convenient methods for integrating stochastic differential equations. Concomitantly, one may wish to sensibly add stochastic terms to an existing deterministic model without having to rewrite that model. In this note, two possibilities in the context of the fourth-order Runge–Kutta (RK4) integration scheme are examined. The first approach entails a hybrid of deterministic and stochastic integration schemes. In these examples, the hybrid RK4 generates time series with the correct climatological probability distributions. However, it is doubtful that the resulting time series are approximate solutions to the stochastic equations at every time step. The second approach uses the standard RK4 integration method modified by appropriately scaling stochastic terms. This is shown to be a special case of the general stochastic Runge–Kutta schemes considered by Ruemelin and has global convergence of order one. Thus, it gives excellent results for cases in which real noise with small but finite correlation time is approximated as white. This restriction on the type of problems to which the stochastic RK4 can be applied is strongly compensated by its computational efficiency.
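
A minimal sketch of the second approach for additive noise: a single white-noise increment is drawn per time step, scaled by the square root of the step size, and included in every Runge–Kutta stage. This is a Ruemelin-type scheme consistent with the description above, though the paper's exact formulation may differ in detail:

```python
import numpy as np

def stochastic_rk4_step(f, g, x, dt, rng):
    """One RK4 step for dx = f(x) dt + g dW with additive noise.

    A single increment dW ~ N(0, dt) is drawn per step and shared
    across all four stages; the stage weights sum to one, so the net
    noise contribution is exactly g * dW.
    """
    dW = np.sqrt(dt) * rng.standard_normal(np.shape(x))
    k1 = f(x) * dt + g * dW
    k2 = f(x + 0.5 * k1) * dt + g * dW
    k3 = f(x + 0.5 * k2) * dt + g * dW
    k4 = f(x + k3) * dt + g * dW
    return x + (k1 + 2.0 * k2 + 2.0 * k3 + k4) / 6.0

# example: a damped, noise-driven linear oscillator
rng = np.random.default_rng(1)
A = np.array([[0.0, 1.0], [-1.0, -0.1]])
f = lambda x: A @ x
x = np.array([1.0, 0.0])
for _ in range(1000):
    x = stochastic_rk4_step(f, 0.05, x, 0.001, rng)
```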

Full access
Daniel Gombos and James A. Hansen

Abstract

Hakim and Torn (HT) presented a statistical piecewise potential vorticity (PV) regression technique that uses flow-dependent analysis covariances from an ensemble square root filter to statistically infer the relationship between the PV and state fields. This paper illustrates that the PV perturbation effectively used by HT’s regression is the projection of the specified PV perturbation onto the ensemble PV anomalies that define the regression operator. It is shown that the piecewise PV inversion of this effective PV perturbation via the technique presented in Davis and Emanuel yields heights nearly identical to those from an HT regression performed in the subspace of the leading PV singular vectors.
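
A minimal sketch of the projection interpretation: the effective PV perturbation is the component of a specified perturbation lying in the span of the ensemble PV anomalies (toy dimensions; all names are illustrative):

```python
import numpy as np

rng = np.random.default_rng(2)

n_grid, n_ens = 200, 20
# columns of Z are ensemble PV anomalies (ensemble mean removed)
Z = rng.standard_normal((n_grid, n_ens))
Z -= Z.mean(axis=1, keepdims=True)

p = rng.standard_normal(n_grid)     # a specified PV perturbation

# orthonormal basis for the anomaly subspace via thin SVD
U, s, _ = np.linalg.svd(Z, full_matrices=False)
U = U[:, s > 1e-10 * s[0]]          # drop the null direction from mean removal
p_effective = U @ (U.T @ p)         # projection onto span of anomalies

# the residual component is invisible to the ensemble-based regression
residual = p - p_effective
```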

Full access
Andrew A. Lacis and James Hansen

Abstract

A method is described for rapidly computing the amount of solar energy absorbed at the earth's surface and in the atmosphere as a function of altitude. The method is a parametric treatment, but the form of the solution and the coefficients involved are based on accurate multiple-scattering computations. In this treatment the absorption varies with the amount and type of clouds, the humidity, the zenith angle of the sun, and the albedo of the earth's surface. Within the stratosphere the absorption also depends on the vertical distribution of ozone.

This parameterization for solar radiation is being used in current versions of the global atmospheric circulation model developed at the Goddard Institute for Space Studies.
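
For flavor, below are the widely reproduced parametric fits for ozone and water vapor absorption associated with this treatment; the coefficients are quoted from commonly cited forms of the paper's formulas and should be checked against the original before use:

```python
import numpy as np

def magnification(mu0):
    """Slant-path magnification factor for cosine of solar zenith angle mu0."""
    return 35.0 / np.sqrt(1224.0 * mu0**2 + 1.0)

def ozone_absorption(x):
    """Fraction of solar flux absorbed by an ozone path x (cm NTP)."""
    a_vis = 0.02118 * x / (1.0 + 0.042 * x + 3.23e-4 * x**2)   # Chappuis band
    a_uv = (1.082 * x / (1.0 + 138.6 * x)**0.805
            + 0.0658 * x / (1.0 + (103.6 * x)**3))             # UV bands
    return a_vis + a_uv

def water_vapor_absorption(y):
    """Fraction of solar flux absorbed by an effective water vapor path y (cm)."""
    return 2.9 * y / ((1.0 + 141.5 * y)**0.635 + 5.925 * y)

# example: overhead sun through 0.3 cm of ozone and 1 cm of water vapor
m = magnification(1.0)
print(ozone_absorption(0.3 * m), water_vapor_absorption(1.0 * m))
```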

Full access
Leonard A. Smith and James A. Hansen

Abstract

Uncertainty in the initial condition is one of the factors that limits the utility of single-model-run predictions of even deterministic nonlinear systems. In practice, an ensemble of initial conditions is often used to generate forecasts with the dual aims of 1) estimating the reliability of the forecasts and 2) estimating the probability distribution of the future state of the system. Current rank histogram ensemble verification techniques can evaluate only scalars drawn from an ensemble and the associated verification; a new method is presented that allows verification in high-dimensional spaces, including verification of 10^6-dimensional numerical weather prediction forecasts.
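
For context, a minimal sketch of the standard scalar rank histogram that the new method generalizes to high dimensions (standard construction; names are illustrative):

```python
import numpy as np

def rank_histogram(ensembles, verifications):
    """Histogram of the rank of each verification within its ensemble.

    ensembles: (n_cases, n_members) array; verifications: (n_cases,).
    A flat histogram is consistent with the verification being drawn
    from the same distribution as the ensemble members.
    """
    n_cases, n_members = ensembles.shape
    ranks = np.sum(ensembles < verifications[:, None], axis=1)
    return np.bincount(ranks, minlength=n_members + 1)

rng = np.random.default_rng(3)
ens = rng.standard_normal((5000, 10))
ver = rng.standard_normal(5000)       # drawn from the same distribution
counts = rank_histogram(ens, ver)     # should be roughly flat
```

The new method provides an analogous consistency diagnostic without first reducing the forecast to a scalar.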

Full access
James A. Hansen and Leonard A. Smith

Abstract

Adaptive observation strategies in numerical weather prediction aim to improve forecasts by exploiting additional observations at locations that are themselves optimized with respect to the current state of the atmosphere. The role played by an inexact estimate of the current state of the atmosphere (i.e., error in the “analysis”) in restricting adaptive observation strategies is investigated; necessary conditions, valid across a broad class of modeling strategies, are identified for strategies based on linearized model dynamics to be productive. It is demonstrated that the assimilation scheme, or more precisely, the magnitude of the analysis error, is crucial in limiting the applicability of dynamically based strategies. In short, strategies based on linearized dynamics require that the analysis error be small enough that the model linearization about the analysis is relevant to the linearized dynamics of the full system about the true system state. Inasmuch as the analysis error depends on the assimilation scheme, the level of observational error, the spatial distribution of observations, and model imperfection, so too will the preferred adaptive observation strategy. For analysis errors of sufficiently small magnitude, dynamically based selection schemes will outperform those based only upon uncertainty estimates; it is in this limit that singular-vector-based adaptive observation strategies will be productive. A test to evaluate the relevance of this limit is demonstrated.
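
A minimal sketch of the kind of relevance test described: compare the nonlinear evolution of a finite, analysis-sized perturbation with its tangent-linear evolution (approximated here by finite differences), using the Lorenz-63 system purely as an illustrative stand-in for the paper's setup:

```python
import numpy as np

def lorenz63(x, s=10.0, r=28.0, b=8.0 / 3.0):
    return np.array([s * (x[1] - x[0]),
                     x[0] * (r - x[2]) - x[1],
                     x[0] * x[1] - b * x[2]])

def integrate(x, dt, n):
    """Integrate the Lorenz-63 system with a simple RK4 scheme."""
    for _ in range(n):
        k1 = lorenz63(x)
        k2 = lorenz63(x + 0.5 * dt * k1)
        k3 = lorenz63(x + 0.5 * dt * k2)
        k4 = lorenz63(x + dt * k3)
        x = x + dt * (k1 + 2.0 * k2 + 2.0 * k3 + k4) / 6.0
    return x

def linearization_relevance(x0, delta, dt, n):
    """Relative disagreement between nonlinear and tangent-linear
    perturbation evolution; small values indicate the linearization
    about the analysis is relevant at this perturbation size."""
    eps = 1e-6
    nonlinear = integrate(x0 + delta, dt, n) - integrate(x0, dt, n)
    tangent = (integrate(x0 + eps * delta, dt, n)
               - integrate(x0, dt, n)) / eps
    return np.linalg.norm(nonlinear - tangent) / np.linalg.norm(tangent)

x0 = integrate(np.array([1.0, 1.0, 1.0]), 0.01, 1000)  # spin onto attractor
for size in (1e-4, 1e-2, 1.0):                         # analysis error sizes
    print(size, linearization_relevance(x0, size * np.ones(3), 0.01, 100))
```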

Full access
Andrew R. Lawrence and James A. Hansen

Abstract

An ensemble-based data assimilation approach is used to transform old ensemble forecast perturbations with more recent observations for the purpose of inexpensively increasing ensemble size. The impact of the transformations is propagated forward in time over the ensemble’s forecast period without rerunning any models, and these transformed ensemble forecast perturbations can be combined with the most recent ensemble forecast to sensibly increase forecast ensemble sizes. Because the transform takes place in perturbation space, the transformed perturbations must be centered on the ensemble mean from the most recent forecasts. Thus, the benefit of the approach is in terms of improved ensemble statistics rather than improvements in the mean. Larger ensemble forecasts can be used for numerous purposes, including probabilistic forecasting, observation targeting, and providing boundary conditions to limited-area models. This transformed lagged ensemble forecasting approach is explored and shown to give positive results in the context of a simple chaotic model. By incorporating a suitable perturbation inflation factor, the technique was found to generate forecast ensembles whose skill is statistically comparable to that of ensembles produced by adding nonlinear model integrations. Implications for ensemble forecasts generated by numerical weather prediction models are briefly discussed, including multimodel ensemble forecasting.
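
A minimal sketch of the re-centering step, assuming a transform matrix T has already been computed from an ensemble-based assimilation scheme using the more recent observations (all names, and the identity placeholder for T, are illustrative):

```python
import numpy as np

def augment_with_lagged(ens_new, ens_old, T, inflation=1.0):
    """Combine a new forecast ensemble with transformed lagged members.

    ens_new, ens_old: (n_members, n_state) forecast ensembles valid at
    the same time; T: (n_members, n_members) transform from an
    ensemble-based assimilation scheme. Transformed old perturbations
    are inflated and re-centered on the new ensemble mean, so only the
    ensemble statistics (not the mean) are improved.
    """
    mean_new = ens_new.mean(axis=0)
    pert_old = ens_old - ens_old.mean(axis=0)
    pert_transformed = inflation * (T @ pert_old)
    return np.vstack([ens_new, mean_new + pert_transformed])

rng = np.random.default_rng(4)
ens_new = rng.standard_normal((20, 100))
ens_old = rng.standard_normal((20, 100))
T = np.eye(20)                        # identity transform as a placeholder
big_ens = augment_with_lagged(ens_new, ens_old, T, inflation=1.1)
assert big_ens.shape == (40, 100)     # doubled ensemble, no new model runs
```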

Full access
W. Gregory Lawson and James A. Hansen

Abstract

The concept of alternative error models is suggested as a means to redefine estimation problems with non-Gaussian additive errors so that familiar and near-optimal Gaussian-based methods may still be applied successfully. The specific example of a mixed error model including both alignment errors and additive errors is examined. Using the specific form of a soliton, an analytical solution to the Korteweg–de Vries equation, the total (additive) errors of states following the mixed error model are demonstrably non-Gaussian for large enough alignment errors, and an ensemble of such states is handled poorly by a traditional ensemble Kalman filter, even if position observations are included. Consideration of the mixed error model itself naturally suggests a two-step approach to state estimation where the alignment errors are corrected first, followed by application of an estimation scheme to the remaining additive errors; the first step aims to remove most of the non-Gaussianity so that the second step can proceed successfully. Taking an ensemble approach for the soliton states in a perfect-model scenario, this two-step approach shows a great improvement over traditional methods across a wide range of observational densities, observing frequencies, and observational accuracies. In cases where the two-step approach is not successful, it is often attributable to the first step not having sufficiently removed the non-Gaussianity, indicating the problem strictly requires an estimation scheme that does not make Gaussian assumptions. However, in these cases a convenient approximation to the two-step approach is available, which trades a minimum variance estimate of the ensemble mean for more physically sound updates of the individual ensemble members.
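
A minimal sketch of the two-step idea for a periodic one-dimensional field: estimate each member's alignment error by circular cross-correlation against a reference, correct it, and hand the now nearly Gaussian additive errors to a standard ensemble scheme (the details are illustrative, not the paper's exact configuration):

```python
import numpy as np

def align_member(member, reference):
    """Shift a periodic 1D field to best match `reference` by maximizing
    circular cross-correlation; step 1 of the two-step approach,
    intended to remove most of the non-Gaussianity."""
    corr = np.fft.ifft(np.fft.fft(reference) *
                       np.conj(np.fft.fft(member))).real
    shift = np.argmax(corr)
    return np.roll(member, shift)

rng = np.random.default_rng(5)
x = np.linspace(0.0, 2.0 * np.pi, 128, endpoint=False)
truth = 1.0 / np.cosh(4.0 * np.sin(x / 2.0))   # soliton-like periodic bump
ensemble = np.array([np.roll(truth, rng.integers(-20, 20))   # alignment error
                     + 0.01 * rng.standard_normal(128)       # additive error
                     for _ in range(30)])

aligned = np.array([align_member(m, truth) for m in ensemble])
# step 2 (not shown): apply an EnKF-style update to the remaining,
# now nearly Gaussian, additive errors in `aligned`
```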

Full access
W. Gregory Lawson and James A. Hansen

Abstract

Accurate numerical prediction of fluid flows requires accurate initial conditions. Monte Carlo methods have become a popular and realizable approach to estimating the initial conditions necessary for forecasting, and have generally been divided into two classes: stochastic filters and deterministic filters. Both filters strive to achieve the error statistics predicted by optimal linear estimation, but accomplish their goal in different fashions, the former by way of random number realizations and the latter via explicit mathematical transformations. Inspection of the update process of each filter in a one-dimensional example and in a two-dimensional dynamical system offers a geometric interpretation of how their behavior changes as nonlinearity becomes appreciable. This interpretation is linked to three ensemble assessment diagnostics: rms analysis error, ensemble rank histograms, and measures of ensemble skewness and kurtosis. Similar expressions of these diagnostics exist in a hierarchy of models. The geometric interpretation and the ensemble diagnostics suggest that both filters perform as expected in a linear regime, but that stochastic filters can better withstand regimes with nonlinear error growth.
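
A minimal scalar sketch contrasting the two filter classes: a stochastic (perturbed-observation) update achieves the optimal posterior variance only in expectation, via random realizations, while a deterministic (square root) update rescales the anomalies to match it exactly (illustrative example):

```python
import numpy as np

rng = np.random.default_rng(6)

def stochastic_update(ens, y, r, rng):
    """Perturbed-observation update: each member assimilates an
    independently perturbed copy of the observation."""
    p = ens.var(ddof=1)
    k = p / (p + r)                      # scalar Kalman gain
    y_pert = y + np.sqrt(r) * rng.standard_normal(ens.shape)
    return ens + k * (y_pert - ens)

def deterministic_update(ens, y, r):
    """Square root update: shift the mean, then rescale anomalies so the
    sample variance matches the optimal posterior variance exactly."""
    p = ens.var(ddof=1)
    k = p / (p + r)
    mean_a = ens.mean() + k * (y - ens.mean())
    alpha = np.sqrt(1.0 - k)             # anomaly scaling factor
    return mean_a + alpha * (ens - ens.mean())

ens = rng.standard_normal(100)           # prior ensemble: N(0, 1)
y, r = 0.5, 0.5                          # observation and its error variance
print(stochastic_update(ens, y, r, rng).var(ddof=1),   # optimal in expectation
      deterministic_update(ens, y, r).var(ddof=1))     # optimal exactly
```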

Full access
James A. Hansen, James S. Goerss, and Charles Sampson

Abstract

A method to predict an anisotropic expected forecast error distribution for consensus forecasts of tropical cyclone (TC) tracks is presented. The method builds upon the Goerss predicted consensus error (GPCE), which predicts the isotropic radius of the 70% isopleth of expected TC track error. Consensus TC track forecasts are computed as the mean of a collection of TC track forecasts from different models and are basin dependent. A novel aspect of GPCE is that it uses not only the uncertainty in the collection of constituent models to predict expected error, but also other features of the predicted storm, including initial intensity, forecast intensity, and storm speed. The new method, called GPCE along–across (GPCE-AX), takes a similar approach but separates the predicted error into across-track and along-track components. GPCE-AX has been applied to consensus TC track forecasts in the Atlantic (CONU/TVCN, where CONU is consensus version U and TVCN is the track variable consensus) and in the western North Pacific (consensus version W, CONW). The results for both basins indicate that GPCE-AX either outperforms or is equal in quality to GPCE in terms of reliability (the fraction of time the verification is bounded by the 70% uncertainty isopleth) and sharpness (the area bounded by the 70% isopleth). GPCE-AX has been implemented at both the National Hurricane Center and the Joint Typhoon Warning Center for real-time testing and evaluation.
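
A minimal sketch of the along-track/across-track decomposition underlying GPCE-AX: project a track-error vector onto unit vectors parallel and perpendicular to the forecast storm motion in a local flat plane (all names and units are illustrative):

```python
import numpy as np

def along_across_error(motion, error):
    """Split a track-error vector (in a local east/north plane) into
    along-track and across-track components relative to the forecast
    storm motion vector."""
    along_hat = motion / np.linalg.norm(motion)
    across_hat = np.array([-along_hat[1], along_hat[0]])  # left of motion
    return error @ along_hat, error @ across_hat

motion = np.array([10.0, 5.0])    # forecast motion (east, north components)
error = np.array([30.0, -40.0])   # verification minus forecast position, km
along, across = along_across_error(motion, error)
```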

Full access