Search Results

Showing 1–10 of 20 items for:

  • Author or Editor: Robert J. Anderson
  • All content
Robert Pincus, R. J. Patrick Hofmann, Jeffrey L. Anderson, Kevin Raeder, Nancy Collins, and Jeffrey S. Whitaker

Abstract

This paper explores the degree to which short-term forecasts with global models might be improved if clouds were fully included in a data assimilation system, so that observations of clouds affected all parts of the model state and cloud variables were adjusted during assimilation. The question is examined using a single ensemble data assimilation system coupled to two present-generation climate models with different treatments of clouds. “Perfect-model” experiments using synthetic observations, taken from a free run of the model used in subsequent assimilations, are used to circumvent complications associated with systematic model errors and observational challenges; these provide a rough upper bound on the utility of cloud observations with these models. A series of experiments is performed in which direct observations of the model’s cloud variables are added to the suite of observations being assimilated. In both models, observations of clouds reduce the 6-h forecast error, with much greater reductions in one model than in the other. Improvements are largest in regions where other observations are sparse. The two cloud schemes differ in their complexity and number of degrees of freedom; the model using the simpler scheme makes better use of the cloud observations because of the stronger correlations between cloud-related and dynamical variables (particularly temperature). This implies that the impact of real cloud observations will depend on both the strength of the instantaneous, linear relationships between clouds and other fields in the natural world, and how well each assimilating model’s cloud scheme represents those relationships.
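The ensemble update described above can be sketched in a toy setting: a hypothetical two-variable state (temperature and one cloud quantity) in which a single synthetic cloud observation adjusts the unobserved temperature through the ensemble covariance. The variable names, values, and Gaussian setup are illustrative assumptions, not the paper's configuration.

```python
import numpy as np

rng = np.random.default_rng(0)
n_ens = 100

# Toy joint ensemble: a cloud variable linearly related to temperature
temperature = 280.0 + rng.normal(0.0, 1.0, n_ens)
cloud = 0.5 * (temperature - 280.0) + rng.normal(0.0, 0.3, n_ens)

obs, obs_err_var = 0.8, 0.1 ** 2        # one synthetic cloud observation

# Ensemble Kalman filter update of the observed (cloud) variable
prior_var = np.var(cloud, ddof=1)
gain = prior_var / (prior_var + obs_err_var)
cloud_increment = gain * (obs - cloud)

# Regress the cloud increment onto temperature via the ensemble covariance
cov_tc = np.cov(temperature, cloud, ddof=1)[0, 1]
temperature_increment = (cov_tc / prior_var) * cloud_increment

cloud_posterior = cloud + cloud_increment
temperature_posterior = temperature + temperature_increment
```

The regression step is why a stronger instantaneous cloud–temperature correlation, as in the model with the simpler cloud scheme, extracts more value from the same observation: the temperature increment scales with the ensemble covariance.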

Full access
Mark A. Donelan, Fred W. Dobson, Stuart D. Smith, and Robert J. Anderson

Abstract

No abstract available

Full access
Mark A. Donelan, Fred W. Dobson, Stuart D. Smith, and Robert J. Anderson

Abstract

The aerodynamic roughness of the sea surface, z₀, is investigated using data from Lake Ontario, from the North Sea near the Dutch coast, and from an exposed site in the Atlantic Ocean off the coast of Nova Scotia. Scaling z₀ by rms wave height gives consistent results for all three datasets, except where wave heights in the Atlantic Ocean are dominated by swell. The normalized roughness depends strongly on wave age: younger waves (traveling slower than the wind) are rougher than mature waves. Alternatively, the roughness may be normalized using the friction velocity, u*, of the wind stress. Again, young waves are rougher than mature waves. This contradicts some recent deductions in the literature, but the contradiction arises from attempts to describe z₀ in laboratory tanks and in the field with a single simple parameterization. Here, it is demonstrated that laboratory waves are inappropriate for direct comparison with field data, being much smoother than their field equivalents. In the open ocean there is usually a mixture of swell and wind-driven sea, and more work is needed before the scaling of surface roughness in these complex conditions can be understood.
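The wave-age dependence reported above can be illustrated with a hypothetical power-law parameterization. Only the qualitative behavior (young waves are rougher) comes from the abstract; the coefficient and exponent below are placeholders, not the paper's fitted values.

```python
import numpy as np

def normalized_roughness(wave_age, a=0.01, b=-2.5):
    """Illustrative power law z0/sigma = a * (c_p/u*)**b.

    wave_age is c_p/u* (wave phase speed over friction velocity).
    The negative exponent makes young waves (small wave age) rougher
    than mature waves.  a and b are placeholder values.
    """
    return a * np.asarray(wave_age, dtype=float) ** b

# Young wind sea vs. mature sea (wave ages are illustrative)
young = normalized_roughness(5.0)
mature = normalized_roughness(25.0)
```

Any single (a, b) pair is exactly the kind of "single simple parameterization" the abstract warns against applying across both laboratory and field conditions; the sketch only shows the sign of the wave-age dependence.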

Full access
Theodore L. Anderson, Robert J. Charlson, David M. Winker, John A. Ogren, and Kim Holmén

Abstract

Tropospheric aerosols are calculated to cause global-scale changes in the earth's heat balance, but these forcings are space/time integrals over highly variable quantities. Accurate quantification of these forcings will require an unprecedented synergy among satellite, airborne, and surface-based observations, as well as models. This study considers one aspect of achieving this synergy—the need to treat aerosol variability in a consistent and realistic way. This need creates a requirement to rationalize the differences in spatiotemporal resolution and coverage among the various observational and modeling approaches. It is shown, based on aerosol optical data from diverse regions, that mesoscale variability (specifically, for horizontal scales of 40–400 km and temporal scales of 2–48 h) is a common and perhaps universal feature of lower-tropospheric aerosol light extinction. Such variation is below the traditional synoptic or “airmass” scale (where the aerosol is often assumed to be essentially homogeneous except for plumes from point sources) and below the scales that are readily resolved by chemical transport models. The present study focuses on documenting this variability. Possible physical causes and practical implications for coordinated observational strategies are also discussed.
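One simple way to quantify sub-synoptic variability of the kind documented here is the lag autocorrelation of an along-track extinction record. The synthetic series below is invented for illustration (scales, amplitudes, and sampling are assumptions); it only shows how correlation decaying across the 40–400 km band would be measured.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic along-track light-extinction record (units arbitrary):
# a synoptic-scale component, a mesoscale component, and noise
x_km = np.arange(0.0, 2000.0, 10.0)                     # 10 km sampling
extinction = (0.05
              + 0.02 * np.sin(2 * np.pi * x_km / 1500.0)  # synoptic scale
              + 0.01 * np.sin(2 * np.pi * x_km / 200.0)   # mesoscale
              + 0.003 * rng.normal(size=x_km.size))        # measurement noise

def autocorrelation(series, lag):
    """Lag autocorrelation of a 1-D series (lag in samples)."""
    return np.corrcoef(series[:-lag], series[lag:])[0, 1]

# Correlation falls off across the 40-400 km mesoscale band
r_40km = autocorrelation(extinction, 4)     # 4 samples = 40 km
r_400km = autocorrelation(extinction, 40)   # 40 samples = 400 km
```

A record that were truly homogeneous at the "airmass" scale would stay highly correlated across this whole band; the mesoscale variability described in the abstract shows up as the drop between the two lags.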

Full access
Kevin Raeder, Jeffrey L. Anderson, Nancy Collins, Timothy J. Hoar, Jennifer E. Kay, Peter H. Lauritzen, and Robert Pincus

Abstract

The Community Atmosphere Model (CAM) has been interfaced to the Data Assimilation Research Testbed (DART), a community facility for ensemble data assimilation. This provides a large set of data assimilation tools for climate model research and development. Aspects of the interface to the Community Earth System Model (CESM) software are discussed and a variety of applications are illustrated, ranging from model development to the production of long series of analyses. CAM output is compared directly to real observations from platforms ranging from radiosondes to global positioning system satellites. Such comparisons use the temporally and spatially heterogeneous analysis error estimates available from the ensemble to provide very specific forecast quality evaluations. The ability to start forecasts from analyses, which were generated by CAM on its native grid and have no foreign model bias, contributed to the detection of a code error involving Arctic sea ice and cloud cover. The potential of parameter estimation is discussed. A CAM ensemble reanalysis has been generated for more than 15 yr. Atmospheric forcings from the reanalysis were required as input to generate an ocean ensemble reanalysis that provided initial conditions for decadal prediction experiments. The software enables rapid experimentation with differing sets of observations and state variables, and the comparison of different models against identical real observations, as illustrated by a comparison of forecasts initialized by interpolated ECMWF analyses and by DART/CAM analyses.
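A generic version of the forecast-quality diagnostic mentioned above (a common ensemble-verification measure, not DART's actual code) is the innovation normalized by its expected spread: observation minus ensemble mean, divided by the square root of ensemble variance plus observation-error variance. The ensemble values and observation below are invented for illustration.

```python
import numpy as np

def spread_normalized_innovation(forecast_ensemble, observation, obs_err_var):
    """Innovation (observation minus ensemble mean) scaled by its expected
    standard deviation: ensemble variance plus observation-error variance.
    For a well-calibrated ensemble these values behave like draws from N(0, 1);
    values consistently far outside that range flag spread or bias problems.
    """
    innovation = observation - forecast_ensemble.mean()
    total_var = forecast_ensemble.var(ddof=1) + obs_err_var
    return innovation / np.sqrt(total_var)

# Hypothetical 5-member forecast ensemble and one observation
ensemble = np.array([1.0, 2.0, 3.0, 4.0, 5.0])   # mean 3.0, variance 2.5
z = spread_normalized_innovation(ensemble, observation=5.0, obs_err_var=1.5)
```

Because the normalization uses the temporally and spatially varying ensemble spread rather than a fixed climatological error, the same statistic can compare, say, DART/CAM-initialized forecasts against forecasts initialized from interpolated ECMWF analyses on an equal footing.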

Full access
John H. Seinfeld, Ralph A. Kahn, Theodore L. Anderson, Robert J. Charlson, Roger Davies, David J. Diner, John A. Ogren, Stephen E. Schwartz, and Bruce A. Wielicki

Aerosols are involved in a complex set of processes that operate across many spatial and temporal scales. Understanding these processes, and ensuring their accurate representation in models of transport, radiation transfer, and climate, requires knowledge of aerosol physical, chemical, and optical properties and the distributions of these properties in space and time. To derive aerosol climate forcing, aerosol optical and microphysical properties and their spatial and temporal distributions, and aerosol interactions with clouds, need to be understood. Such data are also required in conjunction with size-resolved chemical composition in order to evaluate chemical transport models and to distinguish natural and anthropogenic forcing. Other basic parameters needed for modeling the radiative influences of aerosols are surface reflectivity and three-dimensional cloud fields. This large suite of parameters mandates an integrated observing and modeling system of commensurate scope. The Progressive Aerosol Retrieval and Assimilation Global Observing Network (PARAGON) concept, designed to meet this requirement, is motivated by the need to understand climate system sensitivity to changes in atmospheric constituents, to reduce climate model uncertainties, and to analyze diverse collections of data pertaining to aerosols. This paper highlights several challenges resulting from the complexity of the problem. Approaches for dealing with them are offered in the set of companion papers.

Full access
Wiebe A. Oost, Christopher W. Fairall, James B. Edson, Stuart D. Smith, Robert J. Anderson, John A.B. Wills, Kristina B. Katsaros, and Janice DeCosmo

Abstract

Several methods are examined for correction of turbulence and eddy fluxes in the atmospheric boundary layer, two of them based on a potential-flow approach initiated by Wyngaard. If the distorting object is cylindrical or if the distance to the sensor is much greater than the size of the body, the undisturbed wind stress can be calculated solely from measurements made by the sensor itself; no auxiliary measurements or lengthy model calculations are needed. A more general potential-flow correction has been developed in which distorting objects of complex shape are represented as a number of ellipsoidal elements.
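For the simplest case above, a cylindrical distorting body with the sensor on the upstream stagnation line, the potential-flow slowdown can be inverted directly. This is the textbook cylinder result, not the more general ellipsoidal-element model; the radius and distance values are illustrative.

```python
def undisturbed_wind(measured, radius, distance):
    """Invert the potential-flow slowdown upstream of a circular cylinder.

    Along the upstream stagnation line, u(r) = U * (1 - (a/r)**2), where a is
    the cylinder radius and r the distance from its axis, so the undisturbed
    wind U follows from the measured speed alone (no auxiliary measurements).
    """
    if distance <= radius:
        raise ValueError("sensor must sit outside the cylinder")
    return measured / (1.0 - (radius / distance) ** 2)

# A sensor two radii from the axis of a 0.1 m cylinder reads 7.5 m/s;
# the undisturbed free-stream wind is faster
u_free = undisturbed_wind(7.5, radius=0.1, distance=0.2)
```

The sketch makes the abstract's point concrete: the correction depends only on the geometry and the sensor's own reading, which is why no lengthy model calculations are needed in this limit.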

These models are applied to data from three turbulence anemometers with differing amounts of flow distortion, operated simultaneously in the Humidity Exchange over the Sea (HEXOS) Main Experiment. The results are compared with wind-stress estimates by the inertial-dissipation technique; these are much less sensitive to local flow distortion and are consistent with the corrected eddy correlation results. From these comparisons it is concluded that the commonly used “tilt correction” is not sufficient to correct eddy wind stress for distortion by nearby objects, such as probe supports and neighboring sensors.

Neither potential-flow method is applicable to distortion by larger bodies of a scale comparable to the measuring height, such as the superstructure of the Meetpost Noordwijk (MPN) platform used in HEXOS. Flow distortion has been measured around a model of MPN in a wind tunnel study. The results were used to correct mean winds, but simulation of distortion effects on turbulence levels and wind stress turned out not to be feasible.

Full access
David J. Diner, Robert T. Menzies, Ralph A. Kahn, Theodore L. Anderson, Jens Bösenberg, Robert J. Charlson, Brent N. Holben, Chris A. Hostetler, Mark A. Miller, John A. Ogren, Graeme L. Stephens, Omar Torres, Bruce A. Wielicki, Philip J. Rasch, Larry D. Travis, and William D. Collins

A comprehensive and cohesive aerosol measurement record with consistent, well-understood uncertainties is a prerequisite to understanding aerosol impacts on long-term climate and environmental variability. Objectives toward attaining such an understanding include improving upon the current state of the art in sensor calibration and developing systematic validation methods for remotely sensed microphysical properties. While advances in active and passive remote sensors will lead to needed improvements in retrieval accuracies and capabilities, ongoing validation is essential so that the changing sensor characteristics do not mask atmospheric trends. Surface-based radiometer, chemical, and lidar networks have critical roles within an integrated observing system, yet they currently undersample key geographic regions, have limitations in certain measurement capabilities, and lack stable funding. In situ aircraft observations of size-resolved aerosol chemical composition are necessary to provide important linkages between active and passive remote sensing. A planned, systematic approach toward a global aerosol observing network, involving multiple sponsoring agencies and surface-based, suborbital, and spaceborne sensors, is required to prioritize trade-offs regarding capabilities and costs. This strategy is a key ingredient of the Progressive Aerosol Retrieval and Assimilation Global Observing Network (PARAGON) framework. A set of recommendations is presented.

Full access
Jerome M. Schmidt, Piotr J. Flatau, Paul R. Harasti, Robert D. Yates, David J. Delene, Nicholas J. Gapp, William J. Kohri, Jerome R. Vetter, Jason E. Nachamkin, Mark G. Parent, Joshua D. Hoover, Mark J. Anderson, Seth Green, and James E. Bennett

Abstract

Descriptions of the experimental design and research highlights obtained from a series of four multiagency field projects held near Cape Canaveral, Florida, are presented. The experiments featured a 3 MW, dual-polarization, C-band Doppler radar that serves in a dual capacity as both a precipitation and cloud radar. This duality stems from a combination of the radar’s high sensitivity and extremely small-resolution volumes produced by the narrow 0.22° beamwidth and the 0.543 m along-range resolution. Experimental highlights focus on the radar’s real-time aircraft tracking capability as well as the finescale reflectivity and eddy structure of a thin nonprecipitating stratus layer. Examples of precipitating storm systems focus on the analysis of the distinctive and nearly linear radar reflectivity signatures (referred to as “streaks”) that are caused as individual hydrometeors traverse the narrow radar beam. Each streak leaves a unique radar reflectivity signature that is analyzed with regard to estimating the underlying particle properties such as size, fall speed, and oscillation characteristics. The observed along-streak reflectivity oscillations are complex and discussed in terms of diameter-dependent drop dynamics (oscillation frequency and viscous damping time scales) as well as radar-dependent factors governing the near-field Fresnel radiation pattern and inferred drop–drop interference.
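A back-of-the-envelope version of the streak analysis described above: the range-time slope of a streak gives an apparent fall speed, which an empirical terminal-velocity fit converts to a drop diameter. The fit used here is the well-known Atlas et al. (1973) form, chosen for illustration and not necessarily the relation used in these projects; the streak numbers are invented.

```python
import math

def fall_speed_from_streak(range_change_m, dwell_time_s):
    """Apparent fall speed from the range-time slope of a streak
    (assumes vertical incidence and negligible vertical air motion)."""
    return range_change_m / dwell_time_s

def diameter_from_fall_speed(v_ms):
    """Invert the Atlas et al. (1973) terminal-velocity fit
    v = 9.65 - 10.3 * exp(-0.6 * D), with v in m/s and D in mm."""
    if not 0.0 < v_ms < 9.65:
        raise ValueError("fit valid only for 0 < v < 9.65 m/s")
    return -math.log((9.65 - v_ms) / 10.3) / 0.6

# Hypothetical streak: range to the scatterer decreases 4 m over 1 s
v = fall_speed_from_streak(4.0, 1.0)
d_mm = diameter_from_fall_speed(v)   # roughly a 1 mm drop
```

In the real analysis the along-streak reflectivity oscillations add a second, independent handle on drop size through the diameter-dependent oscillation frequency; the slope-only estimate above is the simplest piece of that picture.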

Free access
Robert E. Dickinson, Stephen E. Zebiak, Jeffrey L. Anderson, Maurice L. Blackmon, Cecelia De Luca, Timothy F. Hogan, Mark Iredell, Ming Ji, Ricky B. Rood, Max J. Suarez, and Karl E. Taylor

A common modeling infrastructure ad hoc working group evolved from an NSF/NCEP workshop in 1998, in recognition of the need for the climate and weather modeling communities to develop a more organized approach to building the software that underlies modeling and data analyses. With its significant investment of pro bono time, the working group took the first steps in this direction. It suggested standards for model data and model physics and explored the concept of a modeling software framework. An overall software infrastructure would facilitate separation of the scientific and computational aspects of comprehensive models. Consequently, it would allow otherwise isolated scientists to contribute effectively to core U.S. modeling activities, and would provide a larger market to computational scientists and computer vendors, hence encouraging their support.

Full access