Search Results

You are looking at 1–9 of 9 items for

  • Author or Editor: B. B. Edwards
  • All content
Howard B. Edwards

Abstract

It has been demonstrated that the methods used to measure daily mean temperatures are subject to bias errors and therefore may not be accurate enough to determine climatic trends. To eliminate bias errors, temperature must be measured continuously, with readings recorded often enough to provide at least two data points for each cycle of the highest-frequency fluctuation discernible in the transducer signal (the Nyquist criterion). This can be accomplished by a system having a transducer with a 7 min time constant whose signal is sampled every 30 s. The resulting data can be filtered to show smoothed daily cycles and then reduced to 12 points per day with the daily cycle eliminated. These points are true mean temperatures and can be used through succeeding filtering steps to smooth and eliminate annual and solar cycles and reveal climatic trends.
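
A minimal numeric sketch of this sample-then-reduce scheme follows; the synthetic signal, the 2 h moving-average filter, and the use of simple block means for the 12 daily points are all illustrative assumptions, not the paper's exact procedure:

```python
import numpy as np

# One day of 30 s samples: 2880 readings
dt = 30.0                               # sampling interval (s)
n = int(86400 / dt)
t = np.arange(n) * dt

# Synthetic transducer signal: daily cycle plus fast fluctuations (illustrative)
rng = np.random.default_rng(0)
temp = 15.0 + 5.0 * np.sin(2 * np.pi * t / 86400) + 0.3 * rng.standard_normal(n)

# Low-pass filter: a 2 h moving average (assumed width), standing in for the
# smoothing implied by the 7 min transducer time constant plus digital filtering
win = int(7200 / dt)
smoothed = np.convolve(temp, np.ones(win) / win, mode="same")

# Reduce to 12 points per day (one block mean per 2 h), as in the abstract
twelve_points = smoothed.reshape(12, n // 12).mean(axis=1)

daily_mean = twelve_points.mean()       # daily mean free of undersampling bias
print(np.round(twelve_points, 2), round(daily_mean, 2))
```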

Full access
Howard B. Edwards

Abstract

Of all the errors discussed in the climatology literature, aliasing errors caused by undersampling of unsmoothed or improperly smoothed temperature data seem to be completely overlooked. This is a serious oversight in view of long-term trends of 1 K or less. Adequate sampling of properly smoothed data is demonstrated with a Hamming digital filter. It is also demonstrated that hourly temperatures, daily averages, and annual averages free of aliasing errors can be obtained by adding a microprocessor to standard weather sensors and recorders.
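
A minimal sketch of alias-free averaging with a Hamming-window FIR low-pass filter, in the spirit of the abstract; the sampling rates, filter length, and cutoff fraction are assumptions:

```python
import numpy as np
from scipy import signal

# One week of temperatures sampled every 5 min (synthetic, for illustration)
fs = 1.0 / 300.0                        # sampling frequency (Hz)
t = np.arange(7 * 288) * 300.0          # seconds
rng = np.random.default_rng(1)
temp = 10.0 + 8.0 * np.sin(2 * np.pi * t / 86400) + rng.standard_normal(t.size)

# Hamming-window FIR low-pass with the cutoff below the post-decimation
# Nyquist frequency, so hourly values can be taken without aliasing
hourly_fs = 1.0 / 3600.0
taps = signal.firwin(101, 0.8 * hourly_fs / 2.0, window="hamming", fs=fs)
smoothed = signal.filtfilt(taps, [1.0], temp)

hourly = smoothed[::12]                             # every 12th 5 min sample
daily = hourly.reshape(7, 24).mean(axis=1)          # alias-free daily averages
```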

Full access
C. P. Chang, F. T. Jacobs, and B. B. Edwards

Abstract

A diagnostic model is proposed to use digitized satellite cloud brightness data to estimate objectively the large-scale flow patterns over data-void tropical regions. The model utilizes a linear barotropic vorticity equation with two primary assumptions: 1) that the area-averaged cloud brightness is positively correlated with large-scale divergence in the tropical upper troposphere; and 2) that the large-scale tropical flow is quasi-barotropic and quasi-non-divergent. It is designed to be used at any upper tropospheric level where divergence is important in determining the vorticity field. Three types of information are required: 1) boundary conditions determined from surrounding wind reports, 2) a mean zonal flow determined from climatology, and 3) an equivalent divergence forcing function constructed empirically from the brightness data.
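
The central inversion implied by assumption 2, recovering a streamfunction from vorticity with boundary values prescribed from surrounding wind reports, amounts to a Poisson solve. A minimal sketch follows; the Jacobi iteration, grid size, and placeholder forcing are illustrative assumptions, not the paper's numerics:

```python
import numpy as np

def invert_vorticity(zeta, psi_bc, dx, n_iter=5000):
    """Jacobi iteration for  del^2 psi = zeta  on a uniform grid.
    psi_bc holds the prescribed boundary values (from surrounding wind
    reports); interior entries serve as the initial guess."""
    psi = psi_bc.astype(float).copy()
    for _ in range(n_iter):
        psi[1:-1, 1:-1] = 0.25 * (
            psi[2:, 1:-1] + psi[:-2, 1:-1] + psi[1:-1, 2:] + psi[1:-1, :-2]
            - dx**2 * zeta[1:-1, 1:-1])
    return psi

# Illustrative use: 50 x 50 grid at 100 km spacing; in the model the vorticity
# field would reflect the brightness-derived divergence forcing (zeros here)
zeta = np.zeros((50, 50))
psi = invert_vorticity(zeta, psi_bc=np.zeros((50, 50)), dx=1.0e5)
```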

The model is tested daily over a western North Pacific region for July–August 1971. Results for an 8-day representative period are presented and discussed. For 25% of the days tested, the model produces a flow field that closely resembles the major features of the streamfunction field analyzed by the National Meteorological Center (NMC). On another 30% of the days, it provides valuable information about the flow patterns that would be difficult to obtain from boundary information alone. Experiments are also performed for two days in which the brightness data are enhanced by time-interpolated satellite infrared data. The resultant flow fields bear a closer resemblance to the NMC analyses. It is thus suggested that improved results may be expected when infrared and other types of advanced satellite data are available.

Full access
K. P. Edwards, F. E. Werner, and B. O. Blanton

Abstract

Lagrangian particle tracking using three-dimensional (3D) numerical modeling approaches has become an important tool in coastal oceanography. In this note, an approach is described that can reduce the difference between observed and numerical drifter trajectories in the coastal ocean. The algorithm that estimates the numerical trajectory includes corrections to the water velocity that account for differences between observed winds and the wind field used to drive the 3D circulation model, as well as for specific characteristics of the observed drifters. Quantitative improvements are obtained whereby the separation distance between the numerical and observed drifters is almost halved (in this particular field case, from 2.6 to 1.4 km day−1).
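
A minimal sketch of this kind of wind-error correction, assuming a simple linear windage form and a forward-Euler trajectory step (both illustrative choices, not necessarily the note's formulation):

```python
import numpy as np

def corrected_velocity(u_model, wind_obs, wind_model, windage=0.01):
    """Water velocity used to advect the numerical drifter: the model
    velocity plus a linear correction for the wind error.  The windage
    coefficient is an assumed, drifter-specific constant."""
    return u_model + windage * (wind_obs - wind_model)

def step_drifter(pos, u_model, wind_obs, wind_model, dt=600.0):
    # Forward-Euler advection over one time step (s); positions in meters
    return pos + corrected_velocity(u_model, wind_obs, wind_model) * dt

pos = np.array([0.0, 0.0])
pos = step_drifter(pos,
                   u_model=np.array([0.2, 0.05]),     # modeled current (m/s)
                   wind_obs=np.array([5.0, -2.0]),    # observed wind (m/s)
                   wind_model=np.array([4.0, -1.0]))  # model forcing wind (m/s)
```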

Full access
Jason Naylor, Matthew S. Gilmore, Richard L. Thompson, Roger Edwards, and Robert B. Wilhelmson

Abstract

The accuracy, reliability, and skill of several objective supercell identification methods are evaluated using 113 simulations from an idealized cloud model with 1-km horizontal grid spacing. Horizontal cross sections of vorticity and radar reflectivity at both mid- and low levels were analyzed every 5 min of simulation time for the presence of a supercell to develop a “truth” database. Supercells were identified using well-known characteristics such as hook echoes, inflow notches, bounded weak-echo regions (BWERs), and the presence of significant vertical vorticity.

The three objective supercell identification techniques compared were the Pearson correlation (PC) using an analysis window centered on the midlevel storm updraft; a modified Pearson correlation (MPC), which calculates the PC at every point in the horizontal using a small 3 km × 3 km analysis window; and updraft helicity (UH). Results show that the UH method, integrated from 2 to 5 km AGL and using a threshold value of 180 m2 s−2, was as accurate as the MPC technique (averaged from 2 to 5 km AGL and using a minimum updraft threshold of 7 m s−1 with a detection threshold of 0.3) in discriminating between supercells and nonsupercells for 1-km horizontal grid spacing simulations. At coarser resolutions, the UH technique performed best, while the MPC technique produced the largest threat scores for higher-resolution simulations. In addition, requiring that the supercell detection thresholds last at least 20 min reduced the number of false alarms.
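
For reference, updraft helicity is the vertical integral of the product of vertical velocity and vertical vorticity over the 2–5 km AGL layer. A minimal sketch, assuming approximately uniform model-level spacing (the field shapes and threshold usage below are illustrative):

```python
import numpy as np

def updraft_helicity(w, zeta, z, z_bot=2000.0, z_top=5000.0):
    """UH = integral of w * zeta dz over 2-5 km AGL, per grid column.
    w, zeta: (nz, ny, nx) vertical velocity (m/s) and vertical vorticity (1/s)
    z: (nz,) model-level heights AGL (m), assumed roughly uniformly spaced."""
    mask = (z >= z_bot) & (z <= z_top)
    dz = np.diff(z).mean()                      # assumed-uniform layer depth
    return (w * zeta)[mask].sum(axis=0) * dz    # (ny, nx) field of UH values

# Columns exceeding the paper's 180 m^2 s^-2 threshold would be flagged:
# uh = updraft_helicity(w, zeta, z)
# supercell_mask = uh >= 180.0
```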

Full access
H. W. Barker, G. L. Stephens, P. T. Partain, J. W. Bergman, B. Bonnel, K. Campana, E. E. Clothiaux, S. Clough, S. Cusack, J. Delamere, J. Edwards, K. F. Evans, Y. Fouquart, S. Freidenreich, V. Galin, Y. Hou, S. Kato, J. Li, E. Mlawer, J.-J. Morcrette, W. O'Hirok, P. Räisänen, V. Ramaswamy, B. Ritter, E. Rozanov, M. Schlesinger, K. Shibata, P. Sporyshev, Z. Sun, M. Wendisch, N. Wood, and F. Yang

Abstract

The primary purpose of this study is to assess the performance of 1D solar radiative transfer codes that are used currently both for research and in weather and climate models. Emphasis is on interpretation and handling of unresolved clouds. Answers are sought to the following questions: (i) How well do 1D solar codes interpret and handle columns of information pertaining to partly cloudy atmospheres? (ii) Regardless of the adequacy of their assumptions about unresolved clouds, do 1D solar codes perform as intended?

One clear-sky and two plane-parallel, homogeneous (PPH) overcast cloud cases serve to elucidate 1D model differences due to varying treatments of gaseous transmittances, cloud optical properties, and basic radiative transfer. The remaining four cases involve 3D distributions of cloud water and water vapor as simulated by cloud-resolving models. Results for 25 1D codes, which included two line-by-line (LBL) models (clear and overcast only) and four 3D Monte Carlo (MC) photon transport algorithms, were submitted by 22 groups. Benchmark, domain-averaged irradiance profiles were computed by the MC codes. For the clear and overcast cases, all MC estimates of top-of-atmosphere albedo, atmospheric absorptance, and surface absorptance agree with one of the LBL codes to within ±2%. Most 1D codes underestimate atmospheric absorptance by typically 15–25 W m−2 at overhead sun for the standard tropical atmosphere regardless of clouds.

Depending on assumptions about unresolved clouds, the 1D codes were partitioned into four genres: (i) horizontal variability, (ii) exact overlap of PPH clouds, (iii) maximum/random overlap of PPH clouds, and (iv) random overlap of PPH clouds. A single MC code was used to establish conditional benchmarks applicable to each genre, and all MC codes were used to establish the full 3D benchmarks. There is a tendency for 1D codes to cluster near their respective conditional benchmarks, though intragenre variances typically exceed those for the clear and overcast cases. The majority of 1D codes fall into the extreme category of maximum/random overlap of PPH clouds and thus generally disagree with full 3D benchmark values. The fairly limited scope of these tests and the inability of any one code to perform extremely well for all cases suggest that a paradigm shift is due for modeling 1D solar fluxes for cloudy atmospheres.
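
These overlap genres correspond to simple rules for combining layer cloud fractions into a total cloud cover. A minimal sketch of the random and maximum/random cases, using the standard Geleyn–Hollingsworth product form (the numerical guard is an implementation detail, not from the paper):

```python
import numpy as np

def total_cloud_random(c):
    """Total cloud cover if every layer overlaps randomly."""
    return 1.0 - np.prod(1.0 - np.asarray(c, dtype=float))

def total_cloud_max_random(c):
    """Total cloud cover under maximum/random overlap: adjacent cloudy
    layers overlap maximally; layers separated by clear air overlap
    randomly (Geleyn-Hollingsworth product form)."""
    prod, prev = 1.0, 0.0
    for ck in np.asarray(c, dtype=float):
        prod *= (1.0 - max(prev, ck)) / (1.0 - min(prev, 0.999999))
        prev = ck
    return 1.0 - prod

layers = np.array([0.3, 0.3, 0.0, 0.4])   # layer cloud fractions, top to bottom
print(total_cloud_random(layers))          # ~0.706
print(total_cloud_max_random(layers))      # 0.58: upper pair overlaps maximally
```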

Full access
Ariel E. Cohen, Richard L. Thompson, Steven M. Cavallo, Roger Edwards, Steven J. Weiss, John A. Hart, Israel L. Jirak, William F. Bunting, Jaret W. Rogers, Steven F. Piltz, Alan E. Gerard, Andrew D. Moore, Daniel J. Cornish, Alexander C. Boothe, and Joel B. Cohen

Abstract

During the 2014–15 academic year, the National Oceanic and Atmospheric Administration (NOAA) National Weather Service Storm Prediction Center (SPC) and the University of Oklahoma (OU) School of Meteorology jointly created the first SPC-led course at OU focused on connecting traditional theory taught in the academic curriculum with operational meteorology. This class, “Applications of Meteorological Theory to Severe-Thunderstorm Forecasting,” began in 2015. From 2015 through 2017, this spring-semester course engaged 56 students in theoretical skills and related hands-on weather analysis and forecasting applications, taught by over a dozen meteorologists from the SPC, the NOAA National Severe Storms Laboratory, and the NOAA National Weather Service Forecast Offices. Following introductory material, which addresses many theoretical principles relevant to operational meteorology, numerous presentations and hands-on activities focused on instructors’ areas of expertise are provided to students. Topics include the following: storm-induced perturbation pressure gradients and their enhancement of supercells, tornadogenesis, tropical cyclone tornadoes, severe wind forecasting, surface and upper-air analyses and their interpretation, and forecast decision-making. This collaborative approach has strengthened bonds between meteorologists in operations, research, and academia, while introducing OU meteorology students to the vast array of severe thunderstorm forecast challenges, state-of-the-art operational and research tools, communication of high-impact weather information, and teamwork skills. The methods of collaborative instruction and experiential education have been found to strengthen both operational–academic relationships and students’ appreciation of the intricacies of severe thunderstorm forecasting, as detailed in this article.

Open access
Kirsten Zickfeld, Michael Eby, Andrew J. Weaver, Kaitlin Alexander, Elisabeth Crespin, Neil R. Edwards, Alexey V. Eliseev, Georg Feulner, Thierry Fichefet, Chris E. Forest, Pierre Friedlingstein, Hugues Goosse, Philip B. Holden, Fortunat Joos, Michio Kawamiya, David Kicklighter, Hendrik Kienert, Katsumi Matsumoto, Igor I. Mokhov, Erwan Monier, Steffen M. Olsen, Jens O. P. Pedersen, Mahe Perrette, Gwenaëlle Philippon-Berthier, Andy Ridgwell, Adam Schlosser, Thomas Schneider Von Deimling, Gary Shaffer, Andrei Sokolov, Renato Spahni, Marco Steinacher, Kaoru Tachiiri, Kathy S. Tokos, Masakazu Yoshimori, Ning Zeng, and Fang Zhao

Abstract

This paper summarizes the results of an intercomparison project with Earth System Models of Intermediate Complexity (EMICs) undertaken in support of the Intergovernmental Panel on Climate Change (IPCC) Fifth Assessment Report (AR5). The focus is on long-term climate projections designed to 1) quantify the climate change commitment of different radiative forcing trajectories and 2) explore the extent to which climate change is reversible on human time scales. All commitment simulations follow the four representative concentration pathways (RCPs) and their extensions to year 2300. Most EMICs simulate substantial surface air temperature and thermosteric sea level rise commitment following stabilization of the atmospheric composition at year-2300 levels. The meridional overturning circulation (MOC) is weakened temporarily and recovers to near-preindustrial values in most models for RCPs 2.6–6.0. The MOC weakening is more persistent for RCP8.5. Elimination of anthropogenic CO2 emissions after 2300 results in slowly decreasing atmospheric CO2 concentrations. At year 3000 atmospheric CO2 is still at more than half its year-2300 level in all EMICs for RCPs 4.5–8.5. Surface air temperature remains constant or decreases slightly and thermosteric sea level rise continues for centuries after elimination of CO2 emissions in all EMICs. Restoration of atmospheric CO2 from RCP to preindustrial levels over 100–1000 years requires large artificial removal of CO2 from the atmosphere and does not result in the simultaneous return to preindustrial climate conditions, as surface air temperature and sea level response exhibit a substantial time lag relative to atmospheric CO2.

Full access
Nirnimesh Kumar, James A. Lerczak, Tongtong Xu, Amy F. Waterhouse, Jim Thomson, Eric J. Terrill, Christy Swann, Sutara H. Suanda, Matthew S. Spydell, Pieter B. Smit, Alexandra Simpson, Roland Romeiser, Stephen D. Pierce, Tony de Paolo, André Palóczy, Annika O’Dea, Lisa Nyman, James N. Moum, Melissa Moulton, Andrew M. Moore, Arthur J. Miller, Ryan S. Mieras, Sophia T. Merrifield, Kendall Melville, Jacqueline M. McSweeney, Jamie MacMahan, Jennifer A. MacKinnon, Björn Lund, Emanuele Di Lorenzo, Luc Lenain, Michael Kovatch, Tim T. Janssen, Sean R. Haney, Merrick C. Haller, Kevin Haas, Derek J. Grimes, Hans C. Graber, Matt K. Gough, David A. Fertitta, Falk Feddersen, Christopher A. Edwards, William Crawford, John Colosi, C. Chris Chickadel, Sean Celona, Joseph Calantoni, Edward F. Braithwaite III, Johannes Becherer, John A. Barth, and Seongho Ahn

Abstract

The inner shelf, the transition zone between the surfzone and the midshelf, is a dynamically complex region in which the evolution of circulation and stratification is driven by multiple physical processes. Cross-shelf exchange through the inner shelf has important implications for coastal water quality, ecological connectivity, and lateral movement of sediment and heat. The Inner-Shelf Dynamics Experiment (ISDE) was an intensive, coordinated, multi-institution field experiment conducted in September–October 2017 from the midshelf, through the inner shelf, and into the surfzone near Point Sal, California. Satellite, airborne, shore- and ship-based remote sensing, in-water moorings and ship-based sampling, and numerical ocean circulation models forced by winds, waves, and tides were used to investigate the dynamics governing the circulation and transport in the inner shelf and the role of coastline variability in regional circulation dynamics. Here, the following physical processes are highlighted: internal wave dynamics from the midshelf to the inner shelf; flow separation and eddy shedding off Point Sal; offshore ejection of surfzone waters by rip currents; and wind-driven subtidal circulation dynamics. The extensive dataset from ISDE allows for unprecedented investigations into the role of physical processes in creating spatial heterogeneity and into nonlinear interactions between the various inner-shelf physical processes. Overall, the spatially and temporally well-resolved oceanographic measurements and numerical simulations of ISDE provide a central framework for studies exploring this complex and fascinating region of the ocean.

Full access