Abstract
A tailored camera setup has been used to take photographs of the atmosphere and the environment as seen in the ultraviolet (UV) wavelength band. These photographs make visible what the human eye cannot normally perceive—in particular, the effects of the increasingly strong scattering of UV radiation by the molecular atmosphere. This scattering of sunlight by air molecules is commonly known as Rayleigh scattering, and its scattering efficiency is inversely proportional to the fourth power of the wavelength; the shorter the wavelength, the stronger the scattering. The blue color of the cloud-free sky is a well-known consequence of this, while it is also known that radiation in the UV band is even more diffuse than blue light. The UV photographs presented here demonstrate these effects of Rayleigh scattering. They show, for example, how clouds are much harder to distinguish from the background (the sky) in the UV than in the visible band, and how shadows tend to disappear in the UV. In this way, the photographs provide intuitive insight into the physics of Rayleigh scattering and help connect the typically abstract, theoretical treatment found in textbooks and scientific articles with a more concrete understanding of its effects.
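The fourth-power wavelength dependence described in this abstract can be checked with a few lines of arithmetic. The wavelengths below (about 450 nm for blue light and about 350 nm for near-UV) are illustrative values chosen for this sketch, not figures taken from the article.

```python
# Rayleigh scattering efficiency is proportional to 1 / wavelength**4,
# so the ratio of efficiencies at two wavelengths is (lam_ref / lam)**4.
def rayleigh_ratio(lam_ref_nm, lam_nm):
    """Scattering efficiency at lam_nm relative to that at lam_ref_nm."""
    return (lam_ref_nm / lam_nm) ** 4

# Near-UV (~350 nm) vs. blue light (~450 nm): illustrative wavelengths.
print(round(rayleigh_ratio(450, 350), 2))  # ~2.73
```

So near-UV radiation at these wavelengths scatters almost three times as strongly as blue light, consistent with the UV sky appearing even more diffuse than the familiar blue sky.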
Abstract
Sudden local severe weather is a threat, and we explore what the highest-end supercomputing and sensing technologies can do to address this challenge. Here we show that using the Japanese flagship “K” supercomputer, we can synergistically integrate “big simulations” of 100 parallel simulations of a convective weather system at 100-m grid spacing and “big data” from the next-generation phased array weather radar that produces a high-resolution 3-dimensional rain distribution every 30 s—two orders of magnitude more data than the currently used parabolic-antenna radar. This “big data assimilation” system refreshes 30-min forecasts every 30 s, 120 times more rapidly than the typical hourly updated systems operated at the world’s weather prediction centers. A real high-impact weather case study shows encouraging results of the 30-s-update big data assimilation system.
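The 120-times figure quoted in this abstract follows from simple cycle arithmetic; the numbers below are the ones stated in the text.

```python
# Refresh-cycle comparison: a 30-s update cycle versus a typical hourly
# (3600-s) update cycle at operational weather prediction centers.
hourly_cycle_s = 3600
bda_cycle_s = 30
speedup = hourly_cycle_s // bda_cycle_s
print(speedup)  # 120, matching the "120 times more rapidly" claim
```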
Abstract
Local TV meteorologists are optimally positioned to educate the public about the local implications of global climate change: They have high public trust as a source of climate science information, local TV is the #1 source of weather information in America, and most weathercasters have relevant scientific training and excellent communication skills. Surveys show that most TV meteorologists would like to report on climate change, but lack of time, lack of broadcast-quality graphics, and lack of access to appropriate experts are barriers that inhibit such coverage.
With funding from the National Science Foundation and philanthropic foundations, we developed Climate Matters as an educational resources program to help interested local TV meteorologists educate their viewers about the local impacts of global climate change. Currently, the program provides more than 160 participating weathercasters nationwide with weekly localized broadcast-ready graphics and script ideas, short videos, and opportunities for brief (hour-long webinars) and more intensive (day-long seminars) professional development sessions—at no cost to participating weathercasters. We aim to more than double participation in the program over the next several years.
This article will chronicle the development of Climate Matters over the past five years—beginning with a pilot test at a single news station in Columbia, South Carolina, that was shown to be effective at helping viewers better understand climate change and culminating in a comprehensive national educational resource program that is available to all interested weathercasters.
Abstract
For users of climate services, the ability to quickly determine the datasets that best fit one’s needs would be invaluable. The volume, variety, and complexity of climate data make this judgment difficult. The ambition of CHARMe (Characterization of metadata to enable high-quality climate services) is to give a wider interdisciplinary community access to a range of supporting information, such as journal articles, technical reports, or feedback on previous applications of the data. The capture and discovery of this “commentary” information, often created by data users rather than data providers, and currently not linked to the data themselves, has not been significantly addressed previously. CHARMe applies the principles of Linked Data and open web standards to associate, record, search, and publish user-derived annotations in a way that can be read both by users and automated systems. Tools have been developed within the CHARMe project that enable annotation capability for data delivery systems already in wide use for discovering climate data. In addition, the project has developed advanced tools for exploring data and commentary in innovative ways, including an interactive data explorer and comparator (“CHARMe Maps”), and a tool for correlating climate time series with external “significant events” (e.g., instrument failures or large volcanic eruptions) that affect the data quality. Although the project focuses on climate science, the concepts are general and could be applied to other fields. All CHARMe system software is open-source and released under a liberal license, permitting future projects to reuse the source code as they wish.
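As a rough sketch of the kind of machine-readable commentary record the abstract describes, the snippet below constructs an annotation loosely modeled on the W3C Web Annotation vocabulary, one of the Linked Data standards in this space. All URIs are hypothetical placeholders, not actual CHARMe identifiers.

```python
import json

# A minimal, hypothetical commentary annotation linking a dataset to a
# journal article, loosely following the W3C Web Annotation data model.
# Every URI below is a made-up placeholder for illustration only.
annotation = {
    "@context": "http://www.w3.org/ns/anno.jsonld",
    "type": "Annotation",
    "motivation": "linking",
    "body": "https://doi.example.org/10.0000/hypothetical-validation-study",
    "target": "https://data.example.org/climate/hypothetical-sst-dataset",
}

# Serialized as JSON-LD, the record can be read by both humans and
# automated systems, as the abstract describes.
print(json.dumps(annotation, indent=2))
```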
Abstract
The ease of sharing tornado photographs via the Internet, together with storm-report archiving in the European Severe Weather Database, has made it apparent that the occurrence of tornadoes over Europe has been underestimated. Together with weak waterspouts and tornadoes, large and intense vortices are occasionally observed. Among these, an EF3 multivortex tornado with a path width of several hundred meters affected southeastern Italy on 28 November 2012, causing one casualty and an estimated 60M in damage to the largest steel plant in Europe. A tide gauge positioned near the location of tornado landfall and a vertical atmospheric profile available a few hours later near the affected region represent unique sources of information for these events in the Mediterranean. During its transit across the port of Taranto, the waterspout that was to become the tornado was observed to have induced a sea level rise of about 30 cm. The supercell responsible for the tornado developed from convective cells triggered by orographic uplift over the Apennines. The 0–1-km wind shear was exceptional in comparison with other Italian tornadoes, and was remarkable even in comparison with U.S. events. Other indices for severe-convection diagnosis also showed extremely high values. The occasional occurrence of events of similar or stronger intensity over Italy emphasizes the need for the Distributed National Weather Service—which will integrate Italian meteorological institutions under one agency and is currently under development—to devise a warning system dedicated to the monitoring and prediction of severe convective events.
Abstract
The Mesoscale Predictability Experiment (MPEX) was a field campaign conducted 15 May through 15 June 2013 within the Great Plains region of the United States. One of the research foci of MPEX regarded the upscaling effects of deep convective storms on their environment, and how these feed back to the convective-scale dynamics and predictability. Balloon-borne GPS radiosondes, or “upsondes,” were used to sample such environmental feedbacks. Two of the upsonde teams employed dual-frequency sounding systems that allowed for upsonde observations at intervals as fast as 15 min. Because these dual-frequency systems also had the capacity for full mobility during sonde reception, highly adaptive and rapid storm-relative sampling of the convectively modified environment was possible. This article documents the mobile sounding capabilities and unique sampling strategies employed during MPEX.
Abstract
In meteorological investigations, the reference variable or “ground truth” typically comes from an instrument. This study uses human observations of surface precipitation types to evaluate the same variables that are estimated from an automated algorithm. The NOAA/National Severe Storms Laboratory’s Multi-Radar Multi-Sensor (MRMS) system relies primarily on observations from the Next Generation Radar (NEXRAD) network and model analyses from the Earth System Research Laboratory’s Rapid Refresh (RAP) system. Each hour, MRMS yields quantitative precipitation estimates and surface precipitation types as rain or snow. To date, the surface precipitation type product has received little attention beyond case studies. This study uses precipitation type reports collected by citizen scientists who have contributed observations to the meteorological Phenomena Identification Near the Ground (mPING) project. Citizen scientist reports of rain and snow during the winter season from 19 December 2012 to 30 April 2013 across the United States are compared to the MRMS precipitation type products. Results show that while the mPING reports have a limited spatial distribution (they are concentrated in urban areas), they yield similar critical success indexes of MRMS precipitation types in different cities. The remaining disagreement is attributed to an MRMS algorithmic deficiency of yielding too much rain, as opposed to biases in the mPING reports. The study also shows reduced detectability of snow compared to rain, which is attributed to lack of sensitivity at S band and the shallow nature of winter storms. Some suggestions are provided for improving the MRMS precipitation type algorithm based on these findings.
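The critical success index used for the comparison above has a standard definition in forecast verification. A minimal sketch follows; the contingency counts are hypothetical, purely to illustrate the score, not values from the study.

```python
# Critical success index (CSI): hits / (hits + misses + false alarms).
# Hits: both MRMS and the observer report the type; misses: observed but
# not produced by MRMS; false alarms: produced by MRMS but not observed.
def critical_success_index(hits, misses, false_alarms):
    return hits / (hits + misses + false_alarms)

# Hypothetical counts, e.g. MRMS "snow" versus mPING snow reports in one
# city over a season:
print(round(critical_success_index(80, 15, 25), 2))  # 0.67
```

A perfect set of estimates (no misses, no false alarms) gives CSI = 1, while the score falls toward 0 as either error type grows, which is why it is a common summary for yes/no precipitation-type verification.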