Search Results

You are looking at 131–140 of 5,467 items for:

  • Forecasting techniques
  • Monthly Weather Review

Thomas M. Hamill and Michael Scheuerer

) to increase training sample size and improve the postprocessed NBM temperature guidance. It could also be used to improve longer-lead postprocessed guidance such as week +2 to week +4 temperature forecast products generated by the Climate Prediction Center. The forecast temperature training data in the current NBM use a decaying-average bias correction technique (Cui et al. 2012) that requires archival of only the most recent forecast and analysis. While this procedure is attractive from the

Free access
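The decaying-average bias correction cited in the excerpt above (Cui et al. 2012) keeps only a running bias estimate, updated each cycle from the latest forecast–analysis pair. A minimal sketch with synthetic data; the weight w and all values are illustrative, not the operational NBM settings:

```python
import numpy as np

def update_bias(prev_bias, forecast, analysis, w=0.05):
    """One decaying-average update: blend the previous bias estimate
    with the current forecast-minus-analysis departure."""
    return (1.0 - w) * prev_bias + w * (forecast - analysis)

# Illustrative use with synthetic gridded fields (hypothetical data).
rng = np.random.default_rng(0)
bias = np.zeros((10, 10))                 # no prior bias estimate
for _ in range(100):                      # 100 forecast/analysis cycles
    analysis = rng.normal(280.0, 5.0, (10, 10))
    forecast = analysis + 1.5             # forecast with a +1.5 K warm bias
    bias = update_bias(bias, forecast, analysis)

corrected = forecast - bias               # bias-corrected guidance
print(round(float(bias.mean()), 2))       # converges toward 1.5
```

Only the running `bias` field and the current forecast/analysis pair need to be archived, which is the storage advantage the excerpt alludes to.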
Manuel Gebetsberger, Jakob W. Messner, Georg J. Mayr, and Achim Zeileis

1. Introduction Nonhomogeneous regression is a popular regression-based technique to statistically correct an ensemble of numerical weather predictions (NWP; Leith 1974). Such corrections are often necessary since current NWP models cannot consider all error sources (Lorenz 1963; Hamill and Colucci 1998; Mullen and Buizza 2002; Bauer et al. 2015), so that the raw forecasts are often biased and uncalibrated. In statistical postprocessing, various approaches have been developed to

Open access
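Nonhomogeneous regression as introduced above fits a predictive distribution whose mean and variance both depend on the ensemble, e.g. N(a + b·m, c + d·s²) for ensemble mean m and variance s². A minimal sketch fitted by maximum likelihood on synthetic data (CRPS minimization is the other common estimator); all names and values here are illustrative:

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

def fit_ngr(ens_mean, ens_var, obs):
    """Fit N(a + b*mean, c + d*var) by minimizing the negative log-likelihood."""
    def nll(p):
        a, b, c, d = p
        mu = a + b * ens_mean
        sigma = np.sqrt(np.maximum(c + d * ens_var, 1e-6))  # keep variance positive
        return -norm.logpdf(obs, mu, sigma).sum()
    return minimize(nll, x0=[0.0, 1.0, 1.0, 1.0], method="Nelder-Mead").x

# Synthetic training set: a biased, underdispersive 20-member ensemble.
rng = np.random.default_rng(1)
truth = rng.normal(15.0, 4.0, 500)
ens = truth[:, None] + 1.0 + rng.normal(0.0, 1.0, (500, 20))
a, b, c, d = fit_ngr(ens.mean(axis=1), ens.var(axis=1, ddof=1), truth)
print(a, b, c, d)   # a absorbs the +1 bias; c + d*var inflates the spread
```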
Piet Termonia, Daan Degrauwe, and Rafiq Hamdi

the temporal interpolation (see Termonia et al. 2009). The weakness of the restarts proposed in that paper will be reiterated and a nudging-based workaround will be presented. In section 3 the results of a validation of this method are presented, and it will be shown how the gridpoint nudging solves the temporal resolution problem of an incoming storm while keeping the forecast farther inside the domain quasi-intact. This paper will then be concluded with a discussion of how this technique may

Full access
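The gridpoint nudging discussed above relaxes the limited-area forecast toward host-model fields, with weights that vanish toward the interior so the inner solution stays quasi-intact. A toy sketch, not the ALADIN implementation; the linear weight profile and zone width are illustrative stand-ins for the operational relaxation function:

```python
import numpy as np

def nudge_boundary(field, host, width=8, alpha_max=1.0):
    """Relax a LAM field toward host values in a boundary zone of `width`
    points; the weight is 1 at the edge and falls to 0 in the interior."""
    ny, nx = field.shape
    j, i = np.ogrid[:ny, :nx]
    dist = np.minimum(np.minimum(j, ny - 1 - j), np.minimum(i, nx - 1 - i))
    alpha = alpha_max * np.clip(1.0 - dist / width, 0.0, 1.0)
    return field + alpha * (host - field)

# Toy case: a storm signal entering through the western boundary.
lam  = np.full((50, 50), 290.0)
host = np.full((50, 50), 290.0)
host[:, :5] -= 10.0
nudged = nudge_boundary(lam, host)
print(nudged[25, 0], nudged[25, 25])   # edge follows the host; interior unchanged
```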
Junkyung Kay and Xuguang Wang

, and I/O techniques such as parallel I/O (Balle and Johnsen 2016), we do not include the I/O cost in the estimate. It is worth noting that, following Lei and Whitaker (2017), the same time step is used for both the low-resolution and the high-resolution ensemble forecasts because the GFS employs an unconditionally stable semi-Lagrangian time integration scheme, which allows stable integration with long time steps (Williamson 2007). The cost of the 4DEnVar update is sensitive to the resolution of

Free access
Maxime Taillardat, Olivier Mestre, Michaël Zamo, and Philippe Naveau

biased and underdispersed (Hamill and Colucci 1997; Hamill and Whitaker 2006). Several techniques for the statistical postprocessing of ensemble model output have been developed to address these shortcomings. Local quantile regression and probit regression were used for probabilistic forecasts of precipitation by Bremnes (2004). Other regression techniques, such as censored quantile regression, have been applied to extreme precipitation (Friederichs and Hense 2007) and logistic regression

Full access
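Logistic regression of the kind cited above turns an ensemble predictor into a probability of exceeding a precipitation threshold. A minimal sketch on synthetic data, assuming scikit-learn is available; the cube-root transform and 1-mm threshold are illustrative choices, not those of the cited studies:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(2)
ens_mean = rng.gamma(2.0, 2.0, 1000)                  # ensemble-mean precip (mm)
obs = 0.5 * ens_mean + 0.5 * rng.gamma(2.0, 2.0, 1000)
y = (obs > 1.0).astype(int)                           # binary exceedance event

X = np.cbrt(ens_mean).reshape(-1, 1)                  # variance-stabilizing transform
model = LogisticRegression().fit(X, y)
p = model.predict_proba(np.cbrt([[0.5], [5.0]]))[:, 1]
print(p)    # exceedance probabilities for a dry and a wet case
```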
F. Harnisch, S. B. Healy, P. Bauer, and S. J. English

, observations derived from a known “truth,” represented by a “nature run,” in order to estimate the forecast impact of the proposed observing system. OSSEs are computationally expensive because all observations have to be simulated. Recently, Tan et al. (2007) have proposed and developed an alternative approach to analyze the impact of simulated observations using the ensemble of data assimilations (EDA) technique. They applied the EDA approach to estimate the impact of the European Space Agency's (ESA

Full access
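The EDA approach referenced above gauges the impact of a (possibly simulated) observation by how much it reduces the spread of an ensemble of data assimilations, without running a full OSSE. A toy scalar stand-in using a perturbed-observation Kalman update; nothing here reflects the actual ECMWF EDA configuration:

```python
import numpy as np

def analysis_spread(prior, obs_err_var=1.0, assimilate=True, seed=3):
    """Ensemble spread after (optionally) assimilating one observation
    with a stochastic, perturbed-observation Kalman update."""
    ens = prior.copy()
    if assimilate:
        rng = np.random.default_rng(seed)
        var_b = ens.var(ddof=1)
        gain = var_b / (var_b + obs_err_var)           # scalar Kalman gain
        y = rng.normal(0.0, np.sqrt(obs_err_var), ens.size)  # perturbed obs
        ens = ens + gain * (y - ens)
    return ens.std(ddof=1)

prior = np.random.default_rng(4).normal(0.0, 2.0, 50)
s0 = analysis_spread(prior, assimilate=False)
s1 = analysis_spread(prior, assimilate=True)
print(s0, s1, 1.0 - s1 / s0)   # fractional spread reduction ~ observation impact
```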
Claude Fischer and Ludovic Auger

D) state is achieved. The first of these issues leads to the definition of a “spinup model,” as opposed to a model run in assimilation mode, which would start using a background state obtained from a “domestic” forecast. In the Aire Limitée Adaptation Dynamique Développement International (ALADIN)-France community, the spinup model is also often called a dynamical adaptation model, when the limited area model (LAM) is started with a global uninitialized analysis coming from the Action de

Full access
William F. Campbell, Elizabeth A. Satterfield, Benjamin Ruston, and Nancy L. Baker

the forward model, error of representation, and forecast model bias among other sources. Temperature channel error correlations are much lower, with the exception of some of the ATMS channels, due to correlated instrument error (Bormann et al. 2013). Despite the known shortcomings of the Desroziers technique, small modifications to the computed correlation matrix along with judicious variance inflation yielded large positive impact on forecasts. Our error variances were in line with those used

Full access
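The Desroziers technique mentioned above diagnoses the observation-error covariance from archived departures: R ≈ E[(y − Hx_a)(y − Hx_b)ᵀ]. A minimal sketch with synthetic departures built from a known R so the estimate can be checked; a real application would use a system's stored innovations and analysis residuals:

```python
import numpy as np

def desroziers_R(d_ob, d_oa):
    """R ~ E[(y - Hx_a)(y - Hx_b)^T] from N paired samples (one per row)."""
    R = d_oa.T @ d_ob / d_ob.shape[0]
    return 0.5 * (R + R.T)                        # symmetrize the sample estimate

rng = np.random.default_rng(5)
p, N = 4, 20000
true_R = 0.5 ** np.abs(np.subtract.outer(np.arange(p), np.arange(p)))
eps = rng.normal(size=(N, p)) @ np.linalg.cholesky(true_R).T  # obs-error draws
bkg = rng.normal(size=(N, p))                     # background errors, HBH^T = I
d_ob = eps + bkg                                  # innovations  y - Hx_b
K = np.linalg.inv(true_R + np.eye(p))             # toy gain consistent with HBH^T = I
d_oa = d_ob - d_ob @ K                            # residuals    y - Hx_a
print(np.round(desroziers_R(d_ob, d_oa), 2))      # ~ true_R up to sampling noise
```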
Mohamad El Gharamti

localization and hybridization parameters simultaneously. Their technique tackles sampling error by minimizing the quadratic error between the localized-hybrid covariance and an asymptotic covariance matrix (obtained with a very large ensemble). The authors validated their method using a 48-h WRF ensemble forecast system in North America. Satterfield et al. (2018) estimated the weighting factor by first finding the distribution of the true error variance given the ensemble variance and then computing

Open access
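The localized-hybrid covariance being tuned in the excerpt above is a weighted blend of a tapered ensemble covariance and a static covariance. A minimal sketch with fixed, illustrative values of the weight and localization length (the cited work estimates these adaptively; a Gaussian taper stands in for the usual Gaspari–Cohn function):

```python
import numpy as np

def hybrid_covariance(members, B_static, alpha=0.7, loc_scale=2.0):
    """alpha * (localized ensemble covariance) + (1 - alpha) * static covariance."""
    n = members.shape[1]
    P_ens = np.cov(members, rowvar=False)
    dist = np.abs(np.subtract.outer(np.arange(n), np.arange(n)))
    taper = np.exp(-0.5 * (dist / loc_scale) ** 2)   # Schur-product localization
    return alpha * (taper * P_ens) + (1.0 - alpha) * B_static

rng = np.random.default_rng(6)
members = rng.normal(size=(20, 8))    # 20 members, 8 state variables
P = hybrid_covariance(members, np.eye(8))
print(np.round(P[:3, :3], 2))
```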
Caren Marzban, Scott Sandgathe, James D. Doyle, and Nicholas C. Lederer

1. Introduction Sensitivity analysis (SA) refers to a wide suite of techniques for assessing the effect of one set of quantities on another. In meteorological circles, the latter (here called output) is usually some forecast quantity of interest (e.g., total accumulated precipitation, 2-m temperature, 10-m wind speed, etc.). The former (here called input) is usually either analyzed initial conditions or model/algorithm parameters, or both. In some cases the input is an observation, an entire

Full access
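In the simplest form of the sensitivity analysis described above, each input is perturbed one at a time and the change in the output is recorded; variance-based methods are preferred when inputs interact. A minimal sketch with a hypothetical scalar "model" standing in for a forecast quantity:

```python
import numpy as np

def one_at_a_time_sa(model, x0, frac=0.01):
    """Finite-difference sensitivity of a scalar output to each input."""
    x0 = np.asarray(x0, dtype=float)
    base = model(x0)
    sens = np.empty_like(x0)
    for k in range(x0.size):
        x = x0.copy()
        dx = frac * max(abs(x0[k]), 1e-8)   # relative step with a floor
        x[k] += dx
        sens[k] = (model(x) - base) / dx
    return sens

# Hypothetical output: strong dependence on x[0], weak on x[2].
f = lambda x: 5.0 * x[0] + np.sin(x[1]) + 0.1 * x[2] ** 2
print(one_at_a_time_sa(f, [1.0, 0.5, 2.0]))   # ~ [5.0, cos(0.5), 0.4]
```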