Search Results

You are looking at 1 - 10 of 28 items for

  • Author or Editor: Kenneth Mitchell
  • Refine by Access: All Content
Wayne M. Angevine and Kenneth Mitchell

Abstract

Atmospheric models are a basic tool for understanding the processes that produce poor air quality, for predicting air quality problems, and for evaluating proposed solutions. At the base of many air quality models is a mesoscale meteorological model. The National Centers for Environmental Prediction (NCEP) is now using a model with spatial resolution better than that used for many previous air quality studies. Mixing depth and wind and temperature profiles in the convective boundary layer are the key parameters that must be predicted correctly by a meteorological model for air quality applications. This paper describes an evaluation of the Eta Model predictions of these parameters based on comparisons to measurements made by boundary layer wind profilers at sites in Illinois and Tennessee. The results indicate that the Eta Model is quite usable as a meteorological driver for air quality modeling under reasonably simple terrain and weather conditions. The model estimates of mixing depth, boundary layer winds, and temperature profiles are reasonably accurate. This performance stems from a combination of recent Eta Model advancements in PBL and surface layer physics, land surface physics, 4D data assimilation, and vertical and horizontal resolution.
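
As a hedged illustration of the kind of comparison such an evaluation entails (hypothetical arrays and values, not the study's actual profiler data handling), bias and RMSE of afternoon mixing depth could be computed as follows:

    import numpy as np

    def mixing_depth_errors(model_zi, obs_zi):
        """Bias and RMSE (m) between modeled and profiler-derived mixing depths.

        model_zi, obs_zi: 1-D arrays matched in time; NaN marks missing profiler hours.
        """
        valid = ~np.isnan(model_zi) & ~np.isnan(obs_zi)
        diff = model_zi[valid] - obs_zi[valid]
        return diff.mean(), np.sqrt((diff ** 2).mean())

    # Hypothetical afternoon mixing depths (m) at one profiler site
    model = np.array([1450.0, 1600.0, 1200.0, 1750.0])
    obs = np.array([1300.0, 1550.0, np.nan, 1800.0])
    bias, rmse = mixing_depth_errors(model, obs)
    print(f"bias = {bias:.0f} m, rmse = {rmse:.0f} m")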

Full access
Kenneth F. Mitchell and John A. Dutton

Abstract

A truncated spectral model of the forced, dissipative, barotropic vorticity equation on a cyclic β-plane is examined for multiple stationary and periodic solutions. External forcing on one scale of the motion provides a barotropic analog to thermal heating.
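
For reference, the governing equation named here is commonly written in terms of the streamfunction ψ as

    \frac{\partial}{\partial t}\nabla^{2}\psi + J\!\left(\psi,\nabla^{2}\psi\right) + \beta\,\frac{\partial \psi}{\partial x} = F - \nu\,\nabla^{2}\psi ,

where J is the Jacobian, F is the imposed vorticity forcing, and ν a dissipation coefficient; the precise forms of the forcing and dissipation terms used in the paper may differ from this schematic version.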

For forcing of any (finite) magnitude at the maximum or minimum scale in the truncation, the truncated solution converges in the limit as t → ∞ to the known solution of the corresponding linear model. If the forcing is constant, this limit solution represents a globally attracting stationary point in phase space. These results extend the well-known spectral blocking theorem of Fjørtoft (1953) to forced, dissipative flows.

The main results, however, obtain from a low-order model describing two disturbance components interacting with a constant, forced, basic-flow component of intermediate scale. The zonal dependence of either the basic flow or the disturbances is flexible and determined by the choice of component wave vectors. For low-wavenumber disturbances and β ≠ 0, the basic flow represents a unique stationary solution, which becomes unstable when the forcing exceeds a critical value. An application of the Hopf bifurcation theorem in the neighborhood of critical forcing reveals the existence of a periodic solution or limit cycle, which is then derived explicitly in phase space as a closed circular orbit whose frequency is described by a linear combination of the normal-mode Rossby-wave frequencies.
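
As a hedged aid to the reader, the generic behavior near a supercritical Hopf bifurcation can be written in polar phase-space coordinates (amplitude r, phase θ) as

    \frac{dr}{dt} = \alpha\,(F - F_{c})\,r - \gamma\,r^{3}, \qquad
    \frac{d\theta}{dt} = \omega, \qquad
    r_{\infty} = \sqrt{\alpha\,(F - F_{c})/\gamma},

so the squared limit-cycle radius grows linearly with the forcing excess above its critical value F_c; the coefficients α, γ, and ω are placeholders for the model-specific constants derived in the paper.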

The limit cycle radius, which physically represents the ultimate enstrophy of the disturbances, can be depicted as a response surface on the control plane defined by the independent forcing and beta parameters. If the forcing is zonally dependent, the response surface may exhibit a pronounced fold, which arises from the existence of a snap-through bifurcation. The projection of this fold onto the parameter control plane defines a bimodal or hysteresis region in which multiple stable solutions exist for given parameters. The boundary of the hysteresis region represents parameter states at which the model can exhibit sudden flow regime transitions, analogous to those observed in the laboratory rotating annulus.

This study demonstrates that the degree of nonlinearity, the scale of the forcing, and the spatial dependence of the disturbances and the forcing all crucially influence both the multiplicity and temporal nature of the stable limit solutions in a low-order, forced, dissipative model. Thus, choices in this rather complex array of physical degrees of freedom must be carefully considered in any model of the long-term evolution of large-scale atmospheric flow.

Full access
Kenneth E. Mitchell and John B. Hovermale

Abstract

The structure of the thunderstorm gust front is investigated by a nonhydrostatic, two-dimensional (x–z) numerical model. In the model, which is dry, the production of negatively buoyant air by evaporation is parameterized via an externally imposed, local-cooling function. This parameterization sustains a steady cold downdraft, which drives the surface outflow and associated gust front.

It is shown that two dominant factors influencing gust front structure in the vertical plane are the solenoidal field coincident with the front and surface friction, modeled by means of a simple bulk aerodynamic drag formulation. The circulation theorem is invoked to illustrate how solenoidal accelerations oppose the deceleration by surface friction. After the onset of a downdraft in the model, these opposing tendencies soon reach a balance. Thus, following a brief transient stage, the model gust front exhibits a persistent configuration as it propagates rapidly forward. The essential features of this configuration are examined and compared with both tower observations of gust fronts and laboratory models of gravity currents.
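
The competing effects described above can be summarized schematically (this is a generic statement of the circulation theorem and of a bulk drag law, not the model's exact formulation): for a closed material circuit spanning the frontal interface,

    \frac{d\Gamma}{dt} = -\oint \frac{dp}{\rho} + \oint \vec{F}_{fr} \cdot d\vec{l},
    \qquad \vec{\tau}_{sfc} = \rho\, C_{D}\, \lvert \vec{u} \rvert\, \vec{u},

where the solenoidal (pressure–density) term generates circulation at the cold-outflow/warm-air interface and the frictional force per unit mass F_fr, tied to the surface stress through the drag coefficient C_D, opposes it.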

Full access
Rongqian Yang, Kenneth Mitchell, Jesse Meng, and Michael Ek

Abstract

To examine the impact of land model upgrades and different land initializations on the National Centers for Environmental Prediction (NCEP)’s Climate Forecast System (CFS), extensive T126 CFS experiments are carried out for 25 summers with 10 ensemble members using the old Oregon State University (OSU) land surface model (LSM) and the new Noah LSM. The CFS using the Noah LSM, initialized in turn with land states from the NCEP–Department of Energy Global Reanalysis 2 (GR-2), the Global Land Data Assimilation System (GLDAS), and GLDAS climatology, is compared to the CFS control run using the OSU LSM initialized with the GR-2 land states. Using anomaly correlation as a primary measure, the summer-season prediction skill of the CFS using different land models and different initial land states is assessed for SST, precipitation, and 2-m air temperature over the contiguous United States (CONUS) on an ensemble basis.
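
For context, the anomaly correlation used as the primary skill measure is the standard pattern correlation of forecast and observed anomalies about a common climatology c,

    AC = \frac{\sum_{i} (f_{i} - c_{i})(o_{i} - c_{i})}
              {\sqrt{\sum_{i} (f_{i} - c_{i})^{2} \sum_{i} (o_{i} - c_{i})^{2}}},

with f, o, and c the forecast, observed, and climatological values at grid point i (area weighting, normally included in operational verification, is omitted here for brevity).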

Results from these CFS experiments indicate that upgrading from the OSU LSM to the Noah LSM improves the overall CONUS June–August (JJA) precipitation prediction, especially during ENSO-neutral years. This enhancement in CFS performance requires that the GLDAS be executed with the very same Noah LSM used in the land component of the CFS; initializing the Noah LSM with the GR-2 land states instead leads to degraded CFS performance. In comparison with precipitation, the land upgrades have a relatively small impact on both the SST and 2-m air temperature predictions.

Full access
Curtis H. Marshall, Kenneth C. Crawford, Kenneth E. Mitchell, and David J. Stensrud

Abstract

On 31 January 1996, the National Centers for Environmental Prediction/Environmental Modeling Center (NCEP/EMC) implemented a state-of-the-art land surface parameterization in the operational Eta Model. The purpose of this study is to evaluate its performance and to demonstrate its impact on the diurnal cycle of the modeled planetary boundary layer (PBL). Operational Eta Model output from the summer of 1997 is evaluated against the unique observations of near-surface and subsurface fields provided by the Oklahoma Mesonet. The evaluation is partially extended to July 1998 to examine the effects of significant changes that were made to the operational model configuration during the intervening time.

Results indicate a severe positive bias in top-layer soil moisture, which was significantly reduced in 1998 by a change in the initialization technique. Net radiation was overestimated, largely because of a positive bias in the downward shortwave component. Also, the ground heat flux was severely underestimated. Given energy balance constraints, the combination of these two factors resulted in too much available energy for the turbulent fluxes of sensible and latent heat. Comparison of model and observed vertical thermodynamic profiles demonstrates that these errors had a marked impact on the model PBL throughout its entire depth. Evidence also is presented that suggests a systematic underestimation of the downward entrainment of relatively warmer, drier air at the top of the PBL during daylight hours.
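
The energy balance constraint invoked here is the usual surface budget,

    R_{n} = H + LE + G,

so with net radiation R_n overestimated and ground heat flux G underestimated, the residual available to the sensible (H) and latent (LE) turbulent fluxes is necessarily too large.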

Analyses of the monthly mean bias of 2-m temperature and specific humidity revealed a cool, moist bias over western Oklahoma, and a warm, dry bias over the eastern portion of the state. A very sharp transition existed across central Oklahoma between these two regimes. The sharp spatial gradient in both the air temperature and humidity bias fields is strikingly correlated with a sharp west–east gradient in the model vegetation greenness database. This result suggests too much (too little) latent heat flux over less (more) vegetated areas of the model domain.

A series of sensitivity tests, designed to explore the reasons for the documented errors in the simulated surface fluxes, is presented. These tests have been used as supporting evidence for changes to the operational model. Specifically, an alternative specification of the soil thermal conductivity yields a more realistic ground heat flux. When combined with a slight adjustment to the thermal roughness length, the alternative thermal conductivity also yields much better internal consistency between the simulated skin temperature and surface fluxes, and better agreement with observations.
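
As a rough, schematic illustration of why these two parameters matter (not the Eta Model's actual discretization), the ground heat flux scales with the soil thermal conductivity λ and the sensible heat flux depends on the thermal-roughness-dependent exchange coefficient C_h:

    G \approx \lambda\,\frac{T_{skin} - T_{soil}}{\Delta z}, \qquad
    H = \rho\, c_{p}\, C_{h}(z_{0t})\, U\, (\theta_{skin} - \theta_{air}),

so adjusting λ and z_0t repartitions the available energy between the ground and the atmosphere.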

Full access
Yu-Tai Hou, Kenneth A. Campana, Kenneth E. Mitchell, Shi-Keng Yang, and Larry L. Stowe

Abstract

CLAVR [clouds from AVHRR (Advanced Very High Resolution Radiometer)] is a global cloud dataset under development at NOAA/NESDIS (National Environmental Satellite, Data, and Information Service). Total cloud amount from two experimental cases, 9 July 1986 and 9 February 1990, is intercompared with two independent products: the Air Force Real-Time Nephanalysis (RTNEPH) and the International Satellite Cloud Climatology Project (ISCCP). The ISCCP cloud database is a climate product processed retrospectively some years after the data are collected; thus, only CLAVR and RTNEPH can satisfy the real-time requirements of numerical weather prediction (NWP) models. Whereas RTNEPH and ISCCP use only two channels in daytime retrievals and one at night, CLAVR utilizes all five AVHRR channels in daytime and three at night. This gives CLAVR a greater ability to detect certain cloud types, such as thin cirrus and low stratus. Designed to be an operational product, CLAVR is also compared with total cloud forecasts from the National Meteorological Center (NMC) Medium Range Forecast (MRF) Model. The datasets are mapped to the orbits of the NOAA polar satellites so that errors from temporal sampling are minimized. A set of statistical scores, histograms, and maps is used to display the characteristics of the datasets. The results show that the CLAVR data can realistically resolve global cloud distributions. The spatial variation is, however, less than that of RTNEPH and ISCCP because of current constraints in the CLAVR treatment of partial cloudiness. The results suggest that if satellite cloud data are available in real time, they can be used to improve the cloud parameterization in numerical forecast models and data assimilation systems.
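
The flavor of such an intercomparison can be sketched with a few lines of hypothetical code (invented fields on a common grid; not the actual CLAVR, RTNEPH, or ISCCP processing):

    import numpy as np

    def cloud_comparison_stats(a, b):
        """Bias, RMSE, and correlation between two gridded total-cloud-amount fields (0-1)."""
        diff = a - b
        bias = diff.mean()
        rmse = np.sqrt((diff ** 2).mean())
        corr = np.corrcoef(a.ravel(), b.ravel())[0, 1]
        return bias, rmse, corr

    # Hypothetical cloud fractions already mapped to the same satellite orbits
    rng = np.random.default_rng(0)
    clavr = rng.random((180, 360))
    rtneph = np.clip(clavr + 0.1 * rng.standard_normal((180, 360)), 0.0, 1.0)
    print(cloud_comparison_stats(clavr, rtneph))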

Full access
Weizhong Zheng, Michael Ek, Kenneth Mitchell, Helin Wei, and Jesse Meng

Abstract

This study examines the performance of the NCEP Global Forecast System (GFS) surface layer parameterization scheme under strongly stable conditions over land, in which turbulence is weak or even disappears because of high near-surface atmospheric stability. Cases of both deep snowpack and snow-free conditions are investigated. The results show that decoupling and excessive near-surface cooling may appear in the late afternoon and nighttime, manifesting as a severe cold bias in the 2-m air temperature that persists for several hours or more. Concurrently, because of negligible downward heat transport from the atmosphere to the land, a warm temperature bias develops at the first model level. The authors test changes to the stable surface layer scheme, including the introduction of a stability-parameter constraint that prevents the land–atmosphere system from fully decoupling and a modification to the roughness-length formulation. GFS sensitivity runs demonstrate the ability of these two changes to reduce the excessive near-surface cooling in forecasts of 2-m air temperature. The proposed changes prevent both the collapse of turbulence in the stable surface layer over land and the possibility of numerical instability resulting from thermal decoupling between the atmosphere and the surface. The authors also execute and evaluate daily 7-day GFS test forecasts with the proposed changes spanning a one-month period in winter. The assessment reveals that the systematic deficiencies and substantial errors in GFS near-surface 2-m air temperature forecasts are considerably reduced, along with a notable reduction of temperature errors throughout the lower atmosphere and an improvement of forecast skill scores for light and medium precipitation amounts.
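
A minimal sketch of the idea behind a stability-parameter constraint (illustrative only; the constants and functional form of the actual GFS surface layer scheme are not reproduced here) might look like:

    def heat_exchange_coefficient(rib, ch_neutral=2.0e-3, rib_cap=0.25):
        """Bulk heat exchange coefficient damped by stability but never driven to zero.

        rib        : bulk Richardson number of the surface layer (> 0 when stable)
        ch_neutral : neutral-stability exchange coefficient (illustrative value)
        rib_cap    : cap on the stability parameter, the 'constraint' that keeps
                     the land and atmosphere weakly coupled in very stable cases
        """
        rib_limited = min(rib, rib_cap)        # apply the stability constraint
        if rib_limited <= 0.0:                 # neutral or unstable: no damping here
            return ch_neutral
        # Linear damping toward a small but nonzero floor instead of zero
        return ch_neutral * max(1.0 - rib_limited / rib_cap, 0.01)

    # Without the cap, a very stable layer (rib = 0.6) would shut off turbulent
    # heat transport entirely; with it, a small residual coupling remains.
    print(heat_exchange_coefficient(0.6))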

Full access
Yongkang Xue, Ratko Vasic, Zavisa Janjic, Fedor Mesinger, and Kenneth E. Mitchell

Abstract

This study investigates the capability of the dynamic downscaling method (DDM) in a North American regional climate study using the Eta/Simplified Simple Biosphere (SSiB) Regional Climate Model (RCM). The main objective is to understand whether the Eta/SSiB RCM is capable of simulating North American regional climate features, mainly precipitation, at different scales under imposed boundary conditions. The summer of 1998 was selected for this study and the summers of 1993 and 1995 were used to confirm the 1998 results. The observed precipitation, NCEP–NCAR Global Reanalysis (NNGR), and North American Regional Reanalysis (NARR) were used for evaluation of the model’s simulations and/or as lateral boundary conditions (LBCs). A spectral analysis was applied to quantitatively examine the RCM’s downscaling ability at different scales.
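
The scale decomposition implied by such a spectral analysis can be illustrated with a short sketch (a one-dimensional FFT of a single grid row, shown only to convey the idea; the study's actual two-dimensional spectral method is not reproduced):

    import numpy as np

    def power_spectrum(row, dx_km):
        """Power spectrum of one row of a gridded field (e.g., precipitation).

        row   : 1-D array of values along a grid row
        dx_km : grid spacing in km; returns (wavelength_km, spectral_power)
        """
        coeffs = np.fft.rfft(row - row.mean())
        power = np.abs(coeffs) ** 2
        freqs = np.fft.rfftfreq(row.size, d=dx_km)          # cycles per km
        wavelengths = np.where(freqs > 0, 1.0 / np.maximum(freqs, 1e-12), np.inf)
        return wavelengths, power

    # Hypothetical 32-km grid row: how much power does the RCM retain at short
    # wavelengths relative to the coarser field used as its lateral boundary condition?
    rng = np.random.default_rng(1)
    row = np.sin(np.linspace(0.0, 20.0 * np.pi, 256)) + 0.3 * rng.standard_normal(256)
    wavelengths, power = power_spectrum(row, dx_km=32.0)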

The simulations indicated that the choice of domain size, LBCs, and grid spacing was crucial for the DDM. Several tests with different domain sizes indicated that the model in the North American climate simulation was particularly sensitive to its southern boundary position because of the importance of moisture transport by the southerly low-level jet (LLJ) for summer precipitation. Among these tests, only the RCM with 32-km resolution and the NNGR LBC, or with 80-km resolution and the NARR LBC, in conjunction with appropriate domain sizes, was able to properly simulate precipitation and other atmospheric variables (especially humidity over the southeastern United States) during all three summer months, to produce a better spectral power distribution than that associated with the imposed LBC (for the 32-km case), and to retain spectral power at large wavelengths (for the 80-km case). The analysis suggests that there might be strong atmospheric components of high-frequency variability over the Gulf of Mexico and the southeastern United States.

Full access
Alan Basist, Don Garrett, Ralph Ferraro, Norman Grody, and Kenneth Mitchell

Abstract

A comparison between two satellite-derived snow cover products demonstrates the strengths and weaknesses of each procedure. The current NESDIS operational product is subjectively derived from visible satellite imagery. The analysis is performed once a week, using the most recent clear view of the surface. The experimental product is objectively derived from daily microwave measurements observed by polar-orbiting satellites. The operational product uses a high albedo in the visible spectrum to identify snow cover, whereas the experimental product uses a passive microwave scattering signature.

Comparisons between the operational and experimental products show good agreement in the extent and distribution of snow cover during the middle of the winter and summer seasons. However, the agreement weakens in the transition seasons and along the southern edge of the snowpack. The analysis suggests that the operational procedure is better at observing snow under a densely vegetated canopy, whereas the experimental procedure is better over rugged terrain and under persistent cloud cover. The experimental product is also better at observing rapid fluctuations in the snowpack, since it has higher temporal resolution and can see through nonprecipitating clouds.
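
The extent and distribution agreement described here reduces to categorical statistics between two binary snow maps; a toy example (hypothetical grids, not the operational or experimental products):

    import numpy as np

    def snow_agreement(map_a, map_b):
        """Overall agreement fraction and fraction of cells snow-covered in both maps."""
        agree = np.mean(map_a == map_b)
        both_snow = np.mean((map_a == 1) & (map_b == 1))
        return agree, both_snow

    op_map = np.array([[1, 1, 0], [0, 1, 0]])   # hypothetical weekly visible analysis
    mw_map = np.array([[1, 0, 0], [0, 1, 1]])   # hypothetical daily microwave product
    print(snow_agreement(op_map, mw_map))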

An integration of the two products, currently under development by the NOAA Office of Research and Applications and the National Meteorological Center, would represent true snow cover better than either procedure alone. However, it would probably introduce a discontinuity into the 5-yr time series of the current operational product, which is the longest record of snow cover over the Northern Hemisphere. Averaging the experimental product between two consecutive weeks effectively brings the two datasets into closer agreement throughout the global time series. However, this technique does not resolve the regional biases between the two datasets. Surface observations would help identify the source of these biases; unfortunately, these reports are severely limited over many areas.

Full access
Dingchen Hou, Kenneth Mitchell, Zoltan Toth, Dag Lohmann, and Helin Wei

Abstract

Hydrological processes are strongly coupled with atmospheric processes related, for example, to precipitation and temperature, and a coupled atmosphere–land surface system is required for a meaningful hydrological forecast. Since the atmosphere is a chaotic system with limited predictability, ensemble forecasts offer a practical tool for predicting the future state of the coupled system in a probabilistic fashion, potentially leading to a more complete and informative hydrologic prediction. As ensemble forecasts with coupled meteorological–hydrological models are now run operationally at major numerical weather prediction centers, it is currently possible to produce a gridded streamflow prognosis in the form of a probabilistic forecast based on ensembles. Evaluation and improvement of such products require a comprehensive assessment of both components of the coupled system.

In this article, the atmospheric component of a coupled ensemble forecasting system is evaluated in terms of its ability to provide reasonable forcing to the hydrological component and in terms of the effect of the uncertainty represented in the atmospheric ensemble system on the predictability of streamflow as a hydrological variable. The Global Ensemble Forecast System (GEFS) of NCEP is evaluated following a “perfect hydrology” approach, in which its hydrological component, comprising the Noah land surface model and an attached river routing model, is considered free of errors and the initial conditions of the hydrological variables are assumed accurate. The evaluation is performed over the continental United States (CONUS) domain for various sizes of river basins. The results from the experiment suggest that the coupled system is capable of generating useful gridded streamflow forecasts when the land surface model and the river routing model successfully simulate the hydrological processes, and that the ensemble strategy significantly improves the forecast. The expected forecast skill increases with increasing size of the river basin. With the current GEFS system, positive skill in short-range (one to three days) predictions can be expected for all significant river basins; for major rivers with mean streamflow greater than 500 m³ s⁻¹, significant skill can be expected from extended-range (second week) predictions. Possible causes of the loss of skill, including systematic error and insufficient ensemble spread, are discussed, and possible approaches for improving the atmospheric ensemble forecast system are proposed.
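
Under the “perfect hydrology” assumption, verification reduces to comparing ensemble streamflow forecasts against the streamflow produced by the same hydrologic component when driven by analyzed forcing; a schematic of an ensemble-mean skill measure (hypothetical arrays, not the study's verification code):

    import numpy as np

    def ensemble_mean_correlation(ens_fcst, reference):
        """Anomaly correlation of the ensemble-mean streamflow with the reference run.

        ens_fcst  : array (n_members, n_days) of forecast basin-mean streamflow
        reference : array (n_days,) of streamflow from the analysis-driven run
        """
        ens_mean = ens_fcst.mean(axis=0)
        fa = ens_mean - ens_mean.mean()
        ra = reference - reference.mean()
        return float(np.sum(fa * ra) / np.sqrt(np.sum(fa ** 2) * np.sum(ra ** 2)))

    ens = np.array([[450.0, 520.0, 480.0], [470.0, 540.0, 460.0]])  # two hypothetical members
    ref = np.array([460.0, 530.0, 470.0])
    print(ensemble_mean_correlation(ens, ref))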

Full access