Search Results

You are looking at 51 - 60 of 70 items for

  • Author or Editor: Robert H. Johns
  • All content

AIRS: Improving Weather Forecasting and Providing New Data on Greenhouse Gases

MOUSTAFA T. CHAHINE, THOMAS S. PAGANO, HARTMUT H. AUMANN, ROBERT ATLAS, CHRISTOPHER BARNET, JOHN BLAISDELL, LUKE CHEN, MURTY DIVAKARLA, ERIC J. FETZER, MITCH GOLDBERG, CATHERINE GAUTIER, STEPHANIE GRANGER, SCOTT HANNON, FREDRICK W. IRION, RAMESH KAKAR, EUGENIA KALNAY, BJORN H. LAMBRIGTSEN, SUNG-YUNG LEE, JOHN Le MARSHALL, W. WALLACE MCMILLAN, LARRY MCMILLIN, EDWARD T. OLSEN, HENRY REVERCOMB, PHILIP ROSENKRANZ, WILLIAM L. SMITH, DAVID STAELIN, L. LARRABEE STROW, JOEL SUSSKIND, DAVID TOBIN, WALTER WOLF, and LIHANG ZHOU

The Atmospheric Infrared Sounder (AIRS) and its two companion microwave sounders, AMSU and HSB, were launched into polar orbit onboard the NASA Aqua satellite in May 2002. NASA required the sounding system to provide high-quality research data for climate studies and to meet NOAA's requirements for improving operational weather forecasting. The NOAA requirement translated into global retrieval of temperature and humidity profiles with accuracies approaching those of radiosondes. AIRS also provides new measurements of several greenhouse gases, such as CO2, CO, CH4, O3, SO2, and aerosols.

The assimilation of AIRS data into operational weather forecasting has already demonstrated significant improvements in global forecast skill. At NOAA/NCEP, the improvement in forecast skill at 6 days is equivalent to extending forecast capability by 6 hours. This improvement is quite significant when compared to other forecast improvements over the last decade. In addition to NCEP, ECMWF and the Met Office have also reported positive forecast impacts due to AIRS.

AIRS is a hyperspectral sounder with 2,378 infrared channels between 3.7 and 15.4 μm. NOAA/NESDIS routinely distributes AIRS data within 3 hours to NWP centers around the world. The AIRS design represents a breakthrough in infrared space instrumentation, with measurement stability and accuracies far surpassing any current research or operational sounder. The results we describe in this paper are “work in progress,” and although significant accomplishments have already been made, much more work remains in order to realize the full potential of this suite of instruments.
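For orientation only, and not drawn from the paper, the short sketch below shows the standard inverse Planck relation that underlies infrared sounding channels such as those of AIRS: a measured spectral radiance at a given wavenumber is converted to an equivalent brightness temperature. The channel wavenumber and radiance value are hypothetical examples.

```python
# Illustrative sketch (not from the paper): convert an infrared spectral
# radiance to brightness temperature with the inverse Planck function,
# the basic quantity behind hyperspectral sounding channels like AIRS's.
# The channel wavenumber and radiance below are hypothetical example values.
import numpy as np

H = 6.626e-34    # Planck constant, J s
C = 2.998e8      # speed of light, m s^-1
KB = 1.381e-23   # Boltzmann constant, J K^-1

def brightness_temperature(radiance, wavenumber_cm):
    """Brightness temperature (K) for a radiance in mW m^-2 sr^-1 (cm^-1)^-1
    at a channel centered at wavenumber_cm (cm^-1)."""
    nu = wavenumber_cm * 100.0        # wavenumber in m^-1
    rad_si = radiance * 1e-3 / 100.0  # radiance in W m^-2 sr^-1 (m^-1)^-1
    c1 = 2.0 * H * C**2 * nu**3       # Planck-law numerator
    c2 = H * C * nu / KB
    return c2 / np.log(c1 / rad_si + 1.0)

# Hypothetical 11-um window-channel radiance near 900 cm^-1
print(brightness_temperature(100.0, 900.0))  # ~289 K for this example
```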

Full access
T. C. Johns, C. F. Durman, H. T. Banks, M. J. Roberts, A. J. McLaren, J. K. Ridley, C. A. Senior, K. D. Williams, A. Jones, G. J. Rickard, S. Cusack, W. J. Ingram, M. Crucifix, D. M. H. Sexton, M. M. Joshi, B.-W. Dong, H. Spencer, R. S. R. Hill, J. M. Gregory, A. B. Keen, A. K. Pardaens, J. A. Lowe, A. Bodas-Salcedo, S. Stark, and Y. Searl

Abstract

A new coupled general circulation climate model developed at the Met Office's Hadley Centre is presented, and aspects of its performance in climate simulations run for the Intergovernmental Panel on Climate Change Fourth Assessment Report (IPCC AR4) documented with reference to previous models. The Hadley Centre Global Environmental Model version 1 (HadGEM1) is built around a new atmospheric dynamical core; uses higher resolution than the previous Hadley Centre model, HadCM3; and contains several improvements in its formulation including interactive atmospheric aerosols (sulphate, black carbon, biomass burning, and sea salt) plus their direct and indirect effects. The ocean component also has higher resolution and incorporates a sea ice component more advanced than HadCM3 in terms of both dynamics and thermodynamics. HadGEM1 thus permits experiments including some interactive processes not feasible with HadCM3. The simulation of present-day mean climate in HadGEM1 is significantly better overall in comparison to HadCM3, although some deficiencies exist in the simulation of tropical climate and El Niño variability. We quantify the overall improvement using a quasi-objective climate index encompassing a range of atmospheric, oceanic, and sea ice variables. It arises partly from higher resolution but also from greater fidelity in modeling dynamical and physical processes, for example, in the representation of clouds and sea ice. HadGEM1 has a similar effective climate sensitivity (2.8 K) to a CO2 doubling as HadCM3 (3.1 K), although there are significant regional differences in their response patterns, especially in the Tropics. HadGEM1 is anticipated to be used as the basis both for higher-resolution and higher-complexity Earth System studies in the near future.
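As a point of reference only (not from the paper), the sketch below illustrates the textbook relation between an effective climate sensitivity, the radiative forcing from a CO2 doubling, and the implied climate feedback parameter; the logarithmic forcing approximation and its coefficient are standard rounded values, not quantities reported by the authors.

```python
# Illustrative sketch (not from the paper): relate effective climate
# sensitivity (warming per CO2 doubling) to the climate feedback parameter,
# using the common approximation F = 5.35 * ln(C/C0) for CO2 forcing.
import math

def co2_forcing(concentration_ratio):
    """Approximate CO2 radiative forcing (W m^-2) for a concentration ratio."""
    return 5.35 * math.log(concentration_ratio)

f_2x = co2_forcing(2.0)  # ~3.7 W m^-2 for a doubling of CO2

for model, dt_2x in [("HadGEM1", 2.8), ("HadCM3", 3.1)]:
    feedback = f_2x / dt_2x  # W m^-2 K^-1; larger feedback -> lower sensitivity
    print(f"{model}: sensitivity {dt_2x} K -> feedback ~{feedback:.2f} W m^-2 K^-1")
```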

Full access
David A. R. Kristovich, George S. Young, Johannes Verlinde, Peter J. Sousounis, Pierre Mourad, Donald Lenschow, Robert M. Rauber, Mohan K. Ramamurthy, Brian F. Jewett, Kenneth Beard, Elen Cutrim, Paul J. DeMott, Edwin W. Eloranta, Mark R. Hjelmfelt, Sonia M. Kreidenweis, Jon Martin, James Moore, Harry T. Ochs III, David C. Rogers, John Scala, Gregory Tripoli, and John Young

A severe 5-day lake-effect storm resulted in eight deaths, hundreds of injuries, and over $3 million in damage to a small area of northeastern Ohio and northwestern Pennsylvania in November 1996. In 1999, a blizzard associated with an intense cyclone disabled Chicago and much of the U.S. Midwest with 30–90 cm of snow. Such winter weather conditions have many impacts on the lives and property of people throughout much of North America. Each of these events is the culmination of a complex interaction between synoptic-scale, mesoscale, and microscale processes.

An understanding of how the multiple size scales and timescales interact is critical to improving forecasting of these severe winter weather events. The Lake-Induced Convection Experiment (Lake-ICE) and the Snowband Dynamics Project (SNOWBAND) collected comprehensive datasets on processes involved in lake-effect snowstorms and snowbands associated with cyclones during the winter of 1997/98. This paper outlines the goals and operations of these collaborative projects. Preliminary findings are given with illustrative examples of new state-of-the-art research observations collected. Analyses associated with Lake-ICE and SNOWBAND hold the promise of greatly improving our scientific understanding of processes involved in these important wintertime phenomena.

Full access
John S. Kain, Steve Willington, Adam J. Clark, Steven J. Weiss, Mark Weeks, Israel L. Jirak, Michael C. Coniglio, Nigel M. Roberts, Christopher D. Karstens, Jonathan M. Wilkinson, Kent H. Knopfmeier, Humphrey W. Lean, Laura Ellam, Kirsty Hanley, Rachel North, and Dan Suri

Abstract

In recent years, a growing partnership has emerged between the Met Office and the designated U.S. national centers for expertise in severe weather research and forecasting, that is, the National Oceanic and Atmospheric Administration (NOAA) National Severe Storms Laboratory (NSSL) and the NOAA Storm Prediction Center (SPC). The driving force behind this partnership is a compelling set of mutual interests related to predicting and understanding high-impact weather and using high-resolution numerical weather prediction models as foundational tools to explore these interests.

The forum for this collaborative activity is the NOAA Hazardous Weather Testbed, where annual Spring Forecasting Experiments (SFEs) are conducted by NSSL and SPC. For the last decade, NSSL and SPC have used these experiments to find ways that high-resolution models can help achieve greater success in the prediction of tornadoes, large hail, and damaging winds. Beginning in 2012, the Met Office became a contributing partner in annual SFEs, bringing complementary expertise in the use of convection-allowing models, derived in their case from a parallel decadelong effort to use these models to advance prediction of flash floods associated with heavy thunderstorms.

The collaboration between NSSL, SPC, and the Met Office has been enthusiastic and productive, driven by strong mutual interests at a grassroots level and generous institutional support from the parent government agencies. In this article, a historical background is provided, motivations for collaborative activities are emphasized, and preliminary results are highlighted.

Full access
Rolf H. Reichle, Gabrielle J. M. De Lannoy, Qing Liu, Randal D. Koster, John S. Kimball, Wade T. Crow, Joseph V. Ardizzone, Purnendu Chakraborty, Douglas W. Collins, Austin L. Conaty, Manuela Girotto, Lucas A. Jones, Jana Kolassa, Hans Lievens, Robert A. Lucchesi, and Edmond B. Smith

Abstract

The Soil Moisture Active Passive (SMAP) mission Level-4 Soil Moisture (L4_SM) product provides 3-hourly, 9-km resolution, global estimates of surface (0–5 cm) and root-zone (0–100 cm) soil moisture and related land surface variables from 31 March 2015 to present with ~2.5-day latency. The ensemble-based L4_SM algorithm assimilates SMAP brightness temperature (Tb) observations into the Catchment land surface model. This study describes the spatially distributed L4_SM analysis and assesses the observation-minus-forecast (O−F) Tb residuals and the soil moisture and temperature analysis increments. Owing to the climatological rescaling of the Tb observations prior to assimilation, the analysis is essentially unbiased, with global mean values of ~0.37 K for the O−F Tb residuals and practically zero for the soil moisture and temperature increments. There are, however, modest regional (absolute) biases in the O−F residuals (under ~3 K), the soil moisture increments (under ~0.01 m3 m−3), and the surface soil temperature increments (under ~1 K). Typical instantaneous values are ~6 K for O−F residuals, ~0.01 (~0.003) m3 m−3 for surface (root zone) soil moisture increments, and ~0.6 K for surface soil temperature increments. The O−F diagnostics indicate that the actual errors in the system are overestimated in deserts and densely vegetated regions and underestimated in agricultural regions and transition zones between dry and wet climates. The O−F autocorrelations suggest that the SMAP observations are used efficiently in western North America, the Sahel, and Australia, but not in many forested regions and the high northern latitudes. A case study in Australia demonstrates that assimilating SMAP observations successfully corrects short-term errors in the L4_SM rainfall forcing.
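The sketch below is a minimal, hedged illustration of the kind of O−F and increment diagnostics summarized above; it uses hypothetical synthetic arrays and does not reproduce the L4_SM ensemble assimilation or the Catchment land surface model.

```python
# Illustrative sketch (not the L4_SM algorithm): basic observation-minus-
# forecast (O - F) diagnostics of the kind summarized in the abstract, using
# hypothetical arrays of rescaled brightness-temperature observations and
# model forecasts. A near-zero mean indicates an essentially unbiased
# analysis; the standard deviation is a typical instantaneous residual.
import numpy as np

rng = np.random.default_rng(0)
tb_obs = 250.0 + rng.normal(0.0, 6.0, size=10_000)    # hypothetical rescaled obs (K)
tb_fcst = tb_obs - rng.normal(0.4, 6.0, size=10_000)  # hypothetical model forecasts (K)

o_minus_f = tb_obs - tb_fcst
print(f"mean O-F bias:     {o_minus_f.mean():+.2f} K")  # small global-mean bias
print(f"typical O-F value: {o_minus_f.std():.2f} K")    # instantaneous spread

# Lag-1 autocorrelation of the residual series: values near zero suggest the
# assimilation is extracting most of the usable observation information.
r1 = np.corrcoef(o_minus_f[:-1], o_minus_f[1:])[0, 1]
print(f"lag-1 autocorrelation: {r1:+.2f}")
```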

Full access
David M. Tratt, John A. Hackwell, Bonnie L. Valant-Spaight, Richard L. Walterscheid, Lynette J. Gelinas, James H. Hecht, Charles M. Swenson, Caleb P. Lampen, M. Joan Alexander, Lars Hoffmann, David S. Nolan, Steven D. Miller, Jeffrey L. Hall, Robert Atlas, Frank D. Marks Jr., and Philip T. Partain

Abstract

The prediction of tropical cyclone rapid intensification is one of the most pressing unsolved problems in hurricane forecasting. The signatures of gravity waves launched by strong convective updrafts are often clearly seen in airglow and carbon dioxide thermal emission spectra under favorable atmospheric conditions. By continuously monitoring the Atlantic hurricane belt from the main development region to the vulnerable sections of the continental United States at high cadence, it will be possible to investigate the utility of storm-induced gravity wave observations for the diagnosis of impending storm intensification. Such a capability would also enable significant improvements in our ability to characterize the 3D transient behavior of upper-atmospheric gravity waves and point the way to future observing strategies that could mitigate the risk to human life caused by severe storms. This paper describes a new mission concept involving a midinfrared imager hosted aboard a geostationary satellite positioned at approximately 80°W longitude. The sensor’s 3-km pixel size ensures that the gravity wave horizontal structure is adequately resolved, while a 30-s refresh rate enables improved definition of the dynamic intensification process. In this way the transient development of gravity wave perturbations caused by both convective and cyclonic storms may be discerned in near–real time.

Open access
Lynn M. Russell, Armin Sorooshian, John H. Seinfeld, Bruce A. Albrecht, Athanasios Nenes, Lars Ahlm, Yi-Chun Chen, Matthew Coggon, Jill S. Craven, Richard C. Flagan, Amanda A. Frossard, Haflidi Jonsson, Eunsil Jung, Jack J. Lin, Andrew R. Metcalf, Robin Modini, Johannes Mülmenstädt, Greg Roberts, Taylor Shingler, Siwon Song, Zhen Wang, and Anna Wonaschütz

Aerosol–cloud–radiation interactions are widely held to be the largest single source of uncertainty in climate model projections of future radiative forcing due to increasing anthropogenic emissions. The underlying causes of this uncertainty among modeled predictions of climate are the gaps in our fundamental understanding of cloud processes. There has been significant progress with both observations and models in addressing these important questions, but quantifying them correctly is nontrivial, thus limiting our ability to represent them in global climate models. The Eastern Pacific Emitted Aerosol Cloud Experiment (E-PEACE) 2011 was a targeted aircraft campaign with embedded modeling studies, using the Center for Interdisciplinary Remotely-Piloted Aircraft Studies (CIRPAS) Twin Otter aircraft and the research vessel Point Sur in July and August 2011 off the central coast of California, with a full payload of instruments to measure particle and cloud number, mass, composition, and water uptake distributions. E-PEACE used three emitted particle sources to separate particle-induced feedbacks from dynamical variability, namely 1) shipboard smoke-generated particles with 0.05–1-μm diameters (which produced tracks measured by satellite and had drop composition characteristic of organic smoke), 2) combustion particles from container ships with 0.05–0.2-μm diameters (which were measured in a variety of conditions with droplets containing both organic and sulfate components), and 3) aircraft-based milled salt particles with 3–5-μm diameters (which showed enhanced drizzle rates in some clouds). The aircraft observations were consistent with past large-eddy simulations of deeper clouds in ship tracks and aerosol–cloud parcel modeling of cloud drop number and composition, providing quantitative constraints on aerosol effects on warm-cloud microphysics.

Full access
Randall M. Dole, J. Ryan Spackman, Matthew Newman, Gilbert P. Compo, Catherine A. Smith, Leslie M. Hartten, Joseph J. Barsugli, Robert S. Webb, Martin P. Hoerling, Robert Cifelli, Klaus Wolter, Christopher D. Barnet, Maria Gehne, Ronald Gelaro, George N. Kiladis, Scott Abbott, Elena Akish, John Albers, John M. Brown, Christopher J. Cox, Lisa Darby, Gijs de Boer, Barbara DeLuisi, Juliana Dias, Jason Dunion, Jon Eischeid, Christopher Fairall, Antonia Gambacorta, Brian K. Gorton, Andrew Hoell, Janet Intrieri, Darren Jackson, Paul E. Johnston, Richard Lataitis, Kelly M. Mahoney, Katherine McCaffrey, H. Alex McColl, Michael J. Mueller, Donald Murray, Paul J. Neiman, William Otto, Ola Persson, Xiao-Wei Quan, Imtiaz Rangwala, Andrea J. Ray, David Reynolds, Emily Riley Dellaripa, Karen Rosenlof, Naoko Sakaeda, Prashant D. Sardeshmukh, Laura C. Slivinski, Lesley Smith, Amy Solomon, Dustin Swales, Stefan Tulich, Allen White, Gary Wick, Matthew G. Winterkorn, Daniel E. Wolfe, and Robert Zamora

Abstract

Forecasts by mid-2015 for a strong El Niño during winter 2015/16 presented an exceptional scientific opportunity to accelerate advances in understanding and predictions of an extreme climate event and its impacts while the event was ongoing. Seizing this opportunity, the National Oceanic and Atmospheric Administration (NOAA) initiated an El Niño Rapid Response (ENRR), conducting the first field campaign to obtain intensive atmospheric observations over the tropical Pacific during El Niño.

The overarching ENRR goal was to determine the atmospheric response to El Niño and the implications for predicting extratropical storms and U.S. West Coast rainfall. The field campaign observations extended from the central tropical Pacific to the West Coast, with a primary focus on the initial tropical atmospheric response that links El Niño to its global impacts. NOAA deployed its Gulfstream-IV (G-IV) aircraft to obtain observations around organized tropical convection and poleward convective outflow near the heart of El Niño. Additional tropical Pacific observations were obtained by radiosondes launched from Kiritimati, Kiribati, and the NOAA ship Ronald H. Brown, and in the eastern North Pacific by the National Aeronautics and Space Administration (NASA) Global Hawk unmanned aerial system. These observations were all transmitted in real time for use in operational prediction models. An X-band radar installed in Santa Clara, California, helped characterize precipitation distributions. This suite supported an end-to-end capability extending from tropical Pacific processes to West Coast impacts. The ENRR observations were used during the event in operational predictions. They now provide an unprecedented dataset for further research to improve understanding and predictions of El Niño and its impacts.

Open access
Masashi Nagata, Lance Leslie, Yoshio Kurihara, Russell L. Elsberry, Masanori Yamasaki, Hirotaka Kamahori, Robert Abbey Jr., Kotaro Bessho, Javier Calvo, Johnny C. L. Chan, Peter Clark, Michel Desgagne, Song-You Hong, Detlev Majewski, Piero Malguzzi, John McGregor, Hiroshi Mino, Akihiko Murata, Jason Nachamkin, Michel Roch, and Clive Wilson

The Third Comparison of Mesoscale Prediction and Research Experiment (COMPARE) workshop was held in Tokyo, Japan, on 13–15 December 1999, cosponsored by the Japan Meteorological Agency (JMA), the Japan Science and Technology Agency, and the World Meteorological Organization. The third COMPARE case focuses on the explosive development of a tropical cyclone [Typhoon Flo (9019)] that occurred during three cooperative field experiments, the Tropical Cyclone Motion experiment 1990, the Special Experiment Concerning Recurvature and Unusual Motion, and TYPHOON-90, conducted in the western North Pacific in August and September 1990. Fourteen models from nine countries participated in at least part of a set of experiments using a combination of four provided initial conditions and three horizontal resolutions. The resultant forecasts were collected, processed, and verified against analyses and observational data at JMA. Archived datasets have been prepared for distribution to participating members for use in further evaluation studies.

In the workshop, preliminary conclusions from the evaluation study were presented and discussed in light of the initiatives of the experiment and from the viewpoints of tropical cyclone experts. Initial conditions, which depend on both the large-scale analyses and the vortex bogusing, have a large impact on tropical cyclone intensity predictions. Some models succeeded in predicting the explosive deepening of the target typhoon, at least qualitatively, in terms of the time evolution of central pressure. Horizontal grid spacing has a very large impact on tropical cyclone intensity prediction, while the impact of vertical resolution is less clear, with some models being very sensitive and others less so. The structure of, and processes in, the eyewall clouds with subsidence inside, as well as boundary layer and moist physical processes, are considered important in the explosive development of tropical cyclones. Follow-up research activities for this case were proposed to examine possible working hypotheses related to the explosive development.

New strategies for selection of future COMPARE cases were worked out, including seven suitability requirements to be met by candidate cases. The VORTEX95 case was withdrawn as a candidate, and two other possible cases were presented and discussed.

Full access
Jonathan Spinoni, Paulo Barbosa, Edoardo Bucchignani, John Cassano, Tereza Cavazos, Jens H. Christensen, Ole B. Christensen, Erika Coppola, Jason Evans, Beate Geyer, Filippo Giorgi, Panos Hadjinicolaou, Daniela Jacob, Jack Katzfey, Torben Koenigk, René Laprise, Christopher J. Lennard, M. Levent Kurnaz, Delei Li, Marta Llopart, Niall McCormick, Gustavo Naumann, Grigory Nikulin, Tugba Ozturk, Hans-Juergen Panitz, Rosmeri Porfirio da Rocha, Burkhardt Rockel, Silvina A. Solman, Jozef Syktus, Fredolin Tangang, Claas Teichmann, Robert Vautard, Jürgen V. Vogt, Katja Winger, George Zittis, and Alessandro Dosio

Abstract

Two questions motivated this study: 1) Will meteorological droughts become more frequent and severe during the twenty-first century? 2) Given the projected global temperature rise, to what extent does the inclusion of temperature (in addition to precipitation) in drought indicators play a role in future meteorological droughts? To answer, we analyzed the changes in drought frequency, severity, and historically undocumented extreme droughts over 1981–2100, using the standardized precipitation index (SPI; including precipitation only) and the standardized precipitation-evapotranspiration index (SPEI; indirectly including temperature), under two representative concentration pathways (RCP4.5 and RCP8.5). As input data, we employed 103 high-resolution (0.44°) simulations from the Coordinated Regional Climate Downscaling Experiment (CORDEX), based on a combination of 16 global climate models (GCMs) and 20 regional climate models (RCMs). This is the first study of global drought projections to include RCMs and to be based on such a large ensemble of simulations. Based on precipitation only, ~15% of the global land is likely to experience more frequent and severe droughts during 2071–2100 versus 1981–2010 for both scenarios. This increase is larger (~47% under RCP4.5, ~49% under RCP8.5) when precipitation and temperature are used. Both SPI and SPEI project more frequent and severe droughts, especially under RCP8.5, over southern South America, the Mediterranean region, southern Africa, southeastern China, Japan, and southern Australia. A decrease in drought is projected for high latitudes in the Northern Hemisphere and Southeast Asia. If temperature is included, drought characteristics are projected to increase over North America, Amazonia, central Europe and Asia, the Horn of Africa, India, and central Australia; if only precipitation is considered, they are found to decrease over those areas.
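As a minimal sketch only (not the study's code), the example below shows the basic SPI idea for a single accumulation window: fit a gamma distribution to accumulated precipitation and map each value to a standard-normal quantile. The synthetic precipitation series, the fixed 3-month window, and the single overall gamma fit (rather than one fit per calendar month) are simplifying assumptions for illustration.

```python
# Minimal sketch (not the study's code): standardized precipitation index
# (SPI) for one accumulation window. SPEI follows the same logic but uses
# precipitation minus potential evapotranspiration, which is how temperature
# enters the indicator.
import numpy as np
from scipy import stats

def spi(precip, window=3):
    """SPI for a monthly precipitation series (mm), simple gamma fit."""
    # Accumulate precipitation over the chosen window (e.g., 3 months).
    acc = np.convolve(precip, np.ones(window), mode="valid")
    shape, loc, scale = stats.gamma.fit(acc, floc=0.0)  # gamma fit, loc fixed at 0
    cdf = stats.gamma.cdf(acc, shape, loc=loc, scale=scale)
    return stats.norm.ppf(cdf)                          # standard-normal transform

rng = np.random.default_rng(1)
monthly_precip = rng.gamma(2.0, 40.0, size=360)         # 30 years of synthetic data
print(spi(monthly_precip)[:5])                          # negative values = drier than normal
```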

Open access