Search Results

Showing items 21–24 of 24 for:

  • Author or Editor: John W. Nielsen-Gammon
  • Refine by Access: All Content
Fuqing Zhang, Andrew M. Odins, and John W. Nielsen-Gammon

Abstract

A mesoscale model is used to investigate the mesoscale predictability of an extreme precipitation event over central Texas that lasted from 29 June through 7 July 2002. Both the intrinsic and practical aspects of warm-season predictability, especially quantitative precipitation forecasts up to 36 h, were explored through experiments with various grid resolutions, initial and boundary conditions, physics parameterization schemes, and the addition of small-scale, small-amplitude random initial errors. It is found that the high-resolution convection-resolving simulations (with grid spacing down to 3.3 km) do not produce the best simulation or forecast. It is also found that both realistic initial condition uncertainty and model errors can result in large forecast errors for this warm-season flooding event. Thus, practically, there is room to gain higher forecast accuracy through improving the initial analysis with better data assimilation techniques or enhanced observations, and through improving the forecast model with better-resolved or -parameterized physical processes. However, even if a perfect forecast model is used, small-scale, small-amplitude initial errors, such as those in the form of undetectable random noise, can grow rapidly and subsequently contaminate the short-term deterministic mesoscale forecast within 36 h. This rapid error growth is caused by moist convection. The limited deterministic predictability of such a heavy precipitation event, both practically and intrinsically, illustrates the need for probabilistic forecasts at the mesoscales.

John W. Nielsen-Gammon, Xiao-Ming Hu, Fuqing Zhang, and Jonathan E. Pleim

Abstract

Meteorological model errors caused by imperfect parameterizations generally cannot be overcome simply by optimizing initial and boundary conditions. However, advanced data assimilation methods are capable of extracting significant information about parameterization behavior from the observations, and thus can be used to estimate model parameters while they adjust the model state. Such parameters should be identifiable, meaning that they must have a detectable impact on observable aspects of the model behavior, their individual impacts should be a monotonic function of the parameter values, and the various impacts should be clearly distinguishable from each other.

A sensitivity analysis is conducted for the parameters within the Asymmetrical Convective Model, version 2 (ACM2) planetary boundary layer (PBL) scheme in the Weather Research and Forecasting model in order to determine the parameters most suited for estimation. A total of 10 candidate parameters are selected from what is, in general, an infinite number of parameters, most being implicit or hidden. Multiple sets of model simulations are performed to test the sensitivity of the simulations to these 10 particular ACM2 parameters within their plausible physical bounds. The most identifiable parameters are found to govern the vertical profile of local mixing within the unstable PBL, the minimum allowable diffusivity, the definition of the height of the unstable PBL, and the Richardson number criterion used to determine the onset of turbulent mixing in stable stratification. Differences in observability imply that the specific choice of parameters to be estimated should depend upon the characteristics of the observations being assimilated.

John W. Nielsen-Gammon, Christina L. Powell, M. J. Mahoney, Wayne M. Angevine, Christoph Senff, Allen White, Carl Berkowitz, Christopher Doran, and Kevin Knupp

Abstract

An airborne microwave temperature profiler (MTP) was deployed during the Texas 2000 Air Quality Study (TexAQS-2000) to make measurements of boundary layer thermal structure. An objective technique was developed and tested for estimating the mixed layer (ML) height from the MTP vertical temperature profiles. The technique identifies the ML height as a threshold increase of potential temperature from its minimum value within the boundary layer. To calibrate the technique and evaluate the usefulness of this approach, coincident estimates from radiosondes, radar wind profilers, an aerosol backscatter lidar, and in situ aircraft measurements were compared with each other and with the MTP. Relative biases among all instruments were generally less than 50 m, and the agreement between MTP ML height estimates and other estimates was at least as good as the agreement among the other estimates. The ML height estimates from the MTP and other instruments are utilized to determine the spatial and temporal evolution of ML height in the Houston, Texas, area on 1 September 2000. An elevated temperature inversion was present, so ML growth was inhibited until early afternoon. In the afternoon, large spatial variations in ML height developed across the Houston area. The highest ML heights, well over 2 km, were observed to the north of Houston, while downwind of Galveston Bay and within the late afternoon sea breeze ML heights were much lower. The spatial variations that were found away from the immediate influence of coastal circulations were unexpected, and multiple independent ML height estimates were essential for documenting this feature.
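The threshold technique described above can be sketched in a few lines of code. This is a minimal illustration, not the authors' implementation: the threshold value (here 0.5 K) and the search cap are hypothetical placeholders standing in for the calibrated values the study derived by comparison against radiosonde, profiler, lidar, and aircraft estimates.

```python
import numpy as np

def ml_height(z, theta, delta=0.5, z_max=4000.0):
    """Estimate mixed-layer (ML) height from a potential-temperature profile.

    z     : altitudes (m), increasing with index
    theta : potential temperature (K) at each altitude
    delta : threshold increase (K) above the boundary-layer minimum
            (illustrative value, not the study's calibrated threshold)
    z_max : altitude cap for the search (m), also illustrative
    """
    z = np.asarray(z, dtype=float)
    theta = np.asarray(theta, dtype=float)
    in_bl = z <= z_max
    # Minimum potential temperature within the boundary layer
    theta_min = theta[in_bl].min()
    i_min = np.flatnonzero(in_bl)[np.argmin(theta[in_bl])]
    # ML top: first level at or above the minimum where theta
    # exceeds the minimum by the threshold
    above = np.flatnonzero(theta[i_min:] >= theta_min + delta)
    if above.size == 0:
        return None  # no capping increase found below z_max
    return z[i_min + above[0]]

# Idealized profile: well mixed (constant theta) up to 1500 m,
# then a 5 K/km increase above
z = np.arange(0.0, 3000.0, 100.0)
theta = np.where(z <= 1500.0, 300.0, 300.0 + 0.005 * (z - 1500.0))
print(ml_height(z, theta))  # first level exceeding the threshold
```

Real MTP profiles are noisier than this idealized case, which is why the study cross-calibrated the threshold against independent instruments before applying it to map ML height across the Houston area.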

Martin Hoerling, Arun Kumar, Randall Dole, John W. Nielsen-Gammon, Jon Eischeid, Judith Perlwitz, Xiao-Wei Quan, Tao Zhang, Philip Pegion, and Mingyue Chen

Abstract

The record-setting 2011 Texas drought/heat wave is examined to identify physical processes, underlying causes, and predictability. October 2010–September 2011 was Texas’s driest 12-month period on record. While the summer 2011 heat wave magnitude (2.9°C above the 1981–2010 mean) was larger than the previous record, events of similar or larger magnitude appear in preindustrial control runs of climate models. The principal factor contributing to the heat wave magnitude was a severe rainfall deficit during antecedent and concurrent seasons related to anomalous sea surface temperatures (SSTs) that included a La Niña event. Virtually all the precipitation deficits appear to be due to natural variability. About 0.6°C warming relative to the 1981–2010 mean is estimated to be attributable to human-induced climate change, with warming observed mainly in the past decade. Quantitative attribution of the overall human-induced contribution since preindustrial times is complicated by the lack of a detected century-scale temperature trend over Texas. Multiple factors altered the probability of climate extremes over Texas in 2011. Observed SST conditions increased the frequency of severe rainfall deficit events from 9% to 34% relative to 1981–2010, while anthropogenic forcing did not appreciably alter their frequency. Human-induced climate change increased the probability of a new temperature record from 3% during the 1981–2010 reference period to 6% in 2011, while the 2011 SSTs increased the probability from 4% to 23%. Forecasts initialized in May 2011 demonstrate predictive skill in anticipating much of the SST-enhanced risk for an extreme summer drought/heat wave over Texas.
