Pierre Tandeo, Pierre Ailliot, Marc Bocquet, Alberto Carrassi, Takemasa Miyoshi, Manuel Pulido, and Yicun Zhen


Data assimilation combines forecasts from a numerical model with observations. Most current data assimilation algorithms treat the model and observation error terms as additive Gaussian noise, specified by their covariance matrices Q and R, respectively. These error covariances, and specifically their respective amplitudes, determine the weights given to the background (i.e., the model forecasts) and to the observations in the solution of data assimilation algorithms (i.e., the analysis). Consequently, the Q and R matrices significantly impact the accuracy of the analysis. This review aims to present and to discuss, within a unified framework, different methods to jointly estimate the Q and R matrices using ensemble-based data assimilation techniques. Most of the methods developed to date use the innovations, defined as the differences between the observations and the projection of the forecasts onto the observation space. These methods are based on two main statistical criteria: 1) the method of moments, in which the theoretical and empirical moments of the innovations are assumed to be equal, and 2) methods that use the likelihood of the observations, which is contained in the innovations. The reviewed methods assume that innovations are Gaussian random variables, although extension to other distributions is possible for likelihood-based methods. The methods also differ in their levels of complexity and applicability to high-dimensional systems. The conclusion of the review discusses the key challenges in further developing estimation methods for Q and R. These challenges include accounting for time-varying error covariances, coping with limited observational coverage, estimating additional deterministic error terms, and accounting for correlated noise.
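The method-of-moments criterion described in the abstract rests on the identity that, for additive Gaussian errors and a linear observation operator H, the innovation covariance equals H B H^T + R, where B is the background error covariance. The sketch below illustrates that identity numerically in a toy linear-Gaussian setting; all dimensions, matrices, and sample sizes are illustrative assumptions, not taken from the review.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy setup (all values are illustrative assumptions):
n, p, n_samples = 4, 3, 200_000          # state dim, obs dim, number of samples
H = rng.standard_normal((p, n))          # linear observation operator
B = np.diag([2.0, 1.0, 0.5, 0.25])       # background (forecast) error covariance
R = 0.3 * np.eye(p)                      # observation error covariance

# Draw background and observation errors, then form innovations
# d = y - H x_b = eps_obs - H eps_b, whose covariance is H B H^T + R.
eps_b = rng.multivariate_normal(np.zeros(n), B, size=n_samples)
eps_o = rng.multivariate_normal(np.zeros(p), R, size=n_samples)
d = eps_o - eps_b @ H.T

# Method of moments: equate empirical and theoretical innovation covariances.
S_empirical = d.T @ d / n_samples
S_theory = H @ B @ H.T + R
print(np.abs(S_empirical - S_theory).max())  # small for large sample sizes
```

In practice the estimation problem runs the other way: B (or Q) and R are unknown, and this moment identity is inverted, or a likelihood of the innovations is maximized, to recover them.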

Jeff Kingwell, Junichiro Shimizu, Kaneaki Narita, Hirofumi Kawabata, and Itsuro Shimizu

Many of the techniques employed for rocket meteorology—“rocket-casting”—have been adapted from aviation. However, the unique characteristics and requirements of rocketry demand special meteorological procedures and instrumentation, which are only recently becoming satisfactorily defined.

The influence of weather parameters on operational rocketry is examined, with special emphasis on the Tanegashima Space Center, Japan. It is concluded that the fundamental requirement for efficient launch operations is a highly sophisticated nowcasting facility, backed by an effective research and development program.

On 13 August 1986, the National Space Development Agency of Japan (NASDA) launched three payloads from the Osaki rocket range in Tanegashima on the inaugural flight of the H-1 launch vehicle.

The launch weather was expected to be fine at the range. In the event, a thunderstorm commenced close to the launch area during the last few seconds before launch; the launch nevertheless proceeded successfully. This incident highlights the uncertainties of rocket operations, particularly in the critical area of providing reliable weather information and forecasts.

The synoptic conditions at the time of the 13 August launch incident are examined, and a qualitative forecast checklist is suggested to assist in forecasting similar summertime early-morning maritime storms in the future.
