Abstract
Are we going to have a white Christmas? That is a question that scientists at the National Oceanic and Atmospheric Administration (NOAA) receive each autumn from members of the media and general public. NOAA personnel typically respond by way of a press release and map depicting the climatological probability of observing snow on the ground on 25 December at stations across the contiguous United States. This map has become one of the most popular applications of NOAA’s 1981–2010 U.S. Climate Normals.
The purpose of this paper is to expand upon the annual press release in two ways. First, the methodology for empirically calculating the probabilities of snow on the ground is documented. Second, additional maps describing the median snow depth on 25 December as well as the probability and amount of snowfall are presented.
The results are consistent with a climatologist’s intuitive expectations. In the Sierras, the Cascades, the leeward side of the Great Lakes, and northern New England, snow cover is a near certainty. In these regions, most precipitation falls as snow, and the probability of snowfall can exceed 25%. At higher elevations of the Rocky Mountains and at many locations between the northern Rockies and New England, snowfall is considerably less frequent on Christmas Day, yet the probability of snow on the ground exceeds 50%. For those who would like to escape the snow, the best places to be in late December are in Southern California, the lower elevations of the Southwest, and Florida.
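The empirical probability mapped in the press release can be sketched as a simple count over the normals period. The sketch below is a generic illustration, not NOAA's documented procedure: the station record is fabricated, and the 1-inch depth threshold is the commonly used white-Christmas criterion.

```python
# Illustrative sketch: empirical probability of snow on the ground on
# 25 December, computed as the fraction of years in the 1981-2010 normals
# period with snow depth at or above a threshold. Station data are made up.

def white_christmas_probability(depths_by_year, threshold_in=1.0):
    """Fraction of years with 25 December snow depth >= threshold (inches).

    depths_by_year: mapping of year -> observed snow depth in inches;
    years with a missing observation (None) are excluded from the count.
    """
    valid = [d for d in depths_by_year.values() if d is not None]
    if not valid:
        return float("nan")
    hits = sum(1 for d in valid if d >= threshold_in)
    return hits / len(valid)

# Hypothetical station record for 1981-2010 (inches of snow on 25 December).
record = {1981 + i: d for i, d in enumerate(
    [3, 0, 5, 0, 1, 2, 0, 0, 4, 1, 0, 6, 2, 0, 0,
     1, 3, 0, 0, 2, 5, 0, 1, 0, 0, 4, 2, 0, 1, 3])}

print(round(white_christmas_probability(record), 2))
```

The median snow depth mapped alongside the probabilities would follow the same pattern: sort the valid depths for the station and take the middle value.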
Abstract
In a recent BAMS article, it is argued that community-based Open Source Software (OSS) could foster scientific progress in weather radar research, and make weather radar software more affordable, flexible, transparent, sustainable, and interoperable.
Nevertheless, it can be challenging for potential developers and users to realize these benefits: tools are often cumbersome to install; different operating systems may have particular issues, or may not be supported at all; and many tools have steep learning curves.
To overcome some of these barriers, we present an open, community-based virtual machine (VM). This VM can be run on any operating system, and guarantees reproducibility of results across platforms. It contains a suite of independent OSS weather radar tools (BALTRAD, Py-ART, wradlib, RSL, and Radx), and a scientific Python stack. Furthermore, it features a suite of recipes that work out of the box and provide guidance on how to use the different OSS tools alone and together. The code to build the VM from source is hosted on GitHub, which allows the VM to grow with its community.
We argue that the VM presents another step toward Open (Weather Radar) Science. It can be used as a quick way to get started, for teaching, or for benchmarking and combining different tools. It can foster the idea of reproducible research in scientific publishing. Being scalable and extendable, it might even allow for real-time data processing.
We expect the VM to catalyze progress toward interoperability, and to lower the barrier for new users and developers, thus extending the weather radar community and user base.
Abstract
Massive economic growth, population growth, and urbanization are expected to lead to a tripling of anthropogenic emissions in southern West Africa (SWA) between 2000 and 2030. However, the impacts of this on human health, ecosystems, food security, and the regional climate are largely unknown. An integrated assessment is challenging due to (a) a superposition of regional effects with global climate change; (b) a strong dependence on the variable West African monsoon; (c) incomplete scientific understanding of interactions between emissions, clouds, radiation, precipitation, and regional circulations; and (d) a lack of observations. This article provides an overview of the DACCIWA (Dynamics–Aerosol–Chemistry–Cloud Interactions in West Africa) project. DACCIWA will conduct extensive fieldwork in SWA to collect high-quality observations, spanning the entire process chain from surface-based natural and anthropogenic emissions to impacts on health, ecosystems, and climate. Combining the resulting benchmark dataset with a wide range of modeling activities will allow (a) assessment of relevant physical, chemical, and biological processes; (b) improvement of the monitoring of climate and atmospheric composition from space; and (c) development of the next generation of weather and climate models capable of representing coupled cloud–aerosol interactions. The latter will ultimately contribute to reducing uncertainties in climate predictions. DACCIWA collaborates closely with operational centers, international programs, policymakers, and users to actively guide sustainable future planning for West Africa. It is hoped that some of DACCIWA’s scientific findings and technical developments will be applicable to other monsoon regions.
Abstract
Since the advent of computers midway through the twentieth century, computational resources have increased exponentially. It is likely they will continue to do so, especially when accounting for recent trends in multicore processors. History has shown that such an increase tends to directly lead to weather and climate models that readily exploit the extra resources, improving model quality and resolution. We show that Large-Eddy Simulation (LES) models that utilize modern, accelerated (e.g., by GPU or coprocessor), parallel hardware systems can now provide turbulence-resolving numerical weather forecasts over a region the size of the Netherlands at 100-m resolution. This approach has the potential to speed the development of turbulence-resolving numerical weather prediction models.
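The scale of such a simulation can be made concrete with back-of-the-envelope arithmetic. The land area and the number of vertical levels below are illustrative assumptions, not figures from the study.

```python
# Back-of-the-envelope size of a 100-m LES domain covering the Netherlands.
# The land area and the vertical level count are illustrative assumptions.

area_km2 = 41_500          # approximate area of the Netherlands
dx_m = 100.0               # horizontal grid spacing
levels = 128               # assumed number of vertical levels

columns = area_km2 * 1e6 / dx_m**2   # horizontal grid columns
cells = columns * levels             # total grid cells

print(f"{columns:.2e} columns, {cells:.2e} cells")
```

Roughly four million columns, on the order of half a billion grid cells: a workload that explains why accelerated (GPU or coprocessor) hardware is what brings regional turbulence-resolving forecasts within reach.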
Abstract
NOAA’s NWS implemented the new Local Climate Analysis Tool (LCAT) on 1 July 2013. The tool supports the delivery of climate services by quickly providing information to help with climate-sensitive decisions and to facilitate the development of local climate studies and assessments. LCAT provides its users with the ability to conduct local climate variability and change analyses using scientific techniques and the most trusted data, identified through consultation and approval with NOAA subject matter experts. LCAT data include climate-relevant surface observations for individual stations, regional divisions, and gridded reanalysis output. LCAT methods include trend-fitting techniques to assess the local rate of climate change, frequency and conditional probability analyses, and correlation studies to identify existing relationships between local climate and modes of climate variability, such as El Niño–Southern Oscillation (ENSO). The tool produces customized output for individual users through a web interface. This output includes graphical and tabular numeric data that can be either saved in the LCAT online environment or exported in standard formats for further analysis. For each query, LCAT provides an explanation for all graphical output to help users interpret the scientific results. LCAT also offers training modules explaining usability, data, scientific methods, and potential applications, with emphasis on the tool’s appropriate and inappropriate uses. Examples of LCAT applications include guidance for planning, resources management, and assessment purposes. LCAT has the potential for expansion to include a wide variety of datasets for broader application in environmental and socioeconomic decision support.
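One of the listed methods, trend fitting to assess a local rate of change, can be sketched with an ordinary least-squares fit. This is a generic textbook illustration, not LCAT's implementation, and the annual-mean series below is fabricated.

```python
# Sketch of an LCAT-style analysis: an ordinary least-squares linear trend
# fitted to an annual series, reported as a rate of change per decade.
# Generic illustration; the data below are fabricated.

def ols_trend(years, values):
    """Slope (units per year) of the least-squares line through (years, values)."""
    n = len(years)
    mx = sum(years) / n
    my = sum(values) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(years, values))
    sxx = sum((x - mx) ** 2 for x in years)
    return sxy / sxx

years = list(range(1981, 2011))
# Fabricated annual means (deg C): a 0.03 deg C/yr trend plus alternating noise.
temps = [10.0 + 0.03 * (y - 1981) + ((-1) ** y) * 0.1 for y in years]

slope = ols_trend(years, temps)
print(f"trend: {slope * 10:.2f} deg C per decade")
```

A production tool would add a significance test on the slope before reporting a trend; the fit itself is the easy part.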
Abstract
Televised media is one of the most frequently accessed sources of weather information. The local weathercaster is the link between weather information and the public, and as such weathercaster characteristics, from vocal cadence to physical appearance, can impact viewer understanding. This study considers the role of weathercaster gesturing on viewer attention during weather forecasts. Two variations of a typical weather forecast were viewed by a total of 36 students during an eye tracking session. The first forecast variation contained physical gestures toward forecast text by the weathercaster (Gesture condition) while the second variation contained minimal gesturing (No Gesture condition). Following each eye tracking session, students completed a retention survey related to the forecast. These data were used to identify areas of interest to which students attended during viewing and to ascertain how well the forecast was retained across the gesturing treatments. Study results suggest that the weathercaster’s gesturing during forecasts may have induced confusion among participants, but did not affect retention of the weather information investigated in the study. Gesturing diverted attention from other areas of interest within the forecast by encouraging participants to focus on the weathercaster’s hands. This study indicates that minor modifications to weathercaster behavior can produce significant changes in viewer behavior.
Abstract
Performance in the prediction of hurricane intensity and associated hazards has been evaluated for a newly developed convection-permitting forecast system that uses ensemble data assimilation techniques to ingest high-resolution airborne radar observations from the inner core. This system performed well for three of the ten costliest Atlantic hurricanes: Ike (2008), Irene (2011), and Sandy (2012). Four to five days before these storms made landfall, the system produced good deterministic and probabilistic forecasts of not only track and intensity, but also of the spatial distributions of surface wind and rainfall. Averaged over all 102 applicable cases that have inner-core airborne Doppler radar observations during 2008–2012, the system reduced the day-2-to-day-4 intensity forecast errors by 25%–28% compared to the corresponding National Hurricane Center’s official forecasts (which have seen little or no decrease in intensity forecast errors over the past two decades). Empowered by sufficient computing resources, advances in both deterministic and probabilistic hurricane prediction will enable emergency management officials, the private sector, and the general public to make more informed decisions that minimize the losses of life and property.
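The ensemble data assimilation at the heart of such a system can be illustrated with a toy scalar ensemble Kalman filter (EnKF) update. This is a generic textbook sketch, not the operational system's code, and every number in it is made up.

```python
import random
import statistics

# Toy scalar ensemble Kalman filter (EnKF) update: the kind of step an
# ensemble-based assimilation system applies when ingesting an observation
# (e.g. a radar-derived wind). Generic textbook sketch with made-up numbers.

def enkf_update(ensemble, obs, obs_var, rng):
    """Nudge each ensemble member toward a perturbed observation.

    Uses the scalar Kalman gain K = P / (P + R), where P is the ensemble
    (background) variance and R is the observation-error variance.
    """
    p = statistics.variance(ensemble)   # background variance from the ensemble
    k = p / (p + obs_var)               # Kalman gain
    return [x + k * (obs + rng.gauss(0.0, obs_var ** 0.5) - x)
            for x in ensemble]

rng = random.Random(0)
prior = [rng.gauss(40.0, 5.0) for _ in range(50)]   # background wind (m/s)
posterior = enkf_update(prior, 50.0, 4.0, rng)      # observe 50 m/s, R = 4

print(f"prior mean     {statistics.mean(prior):.1f} m/s")
print(f"posterior mean {statistics.mean(posterior):.1f} m/s")
```

The update pulls the ensemble mean toward the observation and shrinks the ensemble spread, which is exactly what assimilating dense inner-core Doppler radar data does for a hurricane analysis, repeated over thousands of observations and model state variables.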