Search Results
You are looking at 1–10 of 13 items for
- Author or Editor: Andreas Becker
Abstract
The World Ocean Circulation Experiment has established Lagrangian observations with neutrally buoyant floats as a routine tool in the study of deep-sea currents. Here a novel variant of the well-proven RAFOS concept is introduced for seeding floats at locations where they can be triggered on a timed basis. This cost-effective method obviates the need to revisit sites with a high-priced research vessel each time floats are to be deployed. It enables multiple independent Lagrangian time series with identical start points, for example, for the observation of intermediate point sources of water masses. This can be done even in environmentally challenging regions such as below the ice. The successfully tested autonomous float park concept does not rely on a release carousel moored on the seafloor. Instead, a second release was added to the standard RAFOS float for optional delay of regular drift missions. A float park can easily be installed using a conductivity–temperature–depth (CTD) recorder system with a slightly modified rosette sampler.
Abstract
As the nonhydrostatic regional model of the Consortium for Small-Scale Modelling in Climate Mode (COSMO-CLM) is increasingly employed for studying the effects of urbanization on the environment, the authors extend its surface-layer parameterization by the Town Energy Budget (TEB) parameterization using the “tile approach” for a single urban class. The new implementation, COSMO-CLM+TEB, is used for a 1-yr reanalysis-driven simulation over Europe at a spatial resolution of 0.11° (~12 km) and over the area of Berlin at a spatial resolution of 0.025° (~2.8 km) to evaluate the new coupled model. The results at the coarse spatial resolution of 0.11° show that the standard and the new models provide 2-m temperature and daily precipitation fields that differ only slightly, by −0.1 to +0.2 K per season and ±0.1 mm day⁻¹, respectively, with very similar statistical distributions. This indicates only a negligibly small effect of the urban parameterization on the model's climatology. Therefore, it is suggested that an urban parameterization may be omitted in model simulations on this scale. At the spatial resolution of 0.025°, the model COSMO-CLM+TEB is better able to represent the magnitude of the urban heat island in Berlin than the standard model COSMO-CLM. This finding shows the importance of using the parameterization for urban land in model simulations on fine spatial scales. It is also suggested that models could benefit from resolving multiple urban land use classes to better simulate the spatial variability of urban temperatures for large metropolitan areas on spatial scales below ~3 km.
Abstract
The uncertainty of the precipitation parameter in the ECMWF twentieth-century reanalysis (ERA-20C) is assessed by means of a comparison with the GPCC in situ product Full Data Monthly Version 7 (FDM-V7). For the spatial and temporal validation of ERA-20C, global temporal scores were calculated on monthly, seasonal, and annual time scales. These include contingency table scores, correlations, and differences in trend, along with time series analyses. Not surprisingly, the regions with the strongest deviations correspond to regions with data scarcity, such as mountainous regions with their upwind and downwind effects, and monsoon regions. They all show a strong systematic bias (ERA-20C minus FDM-V7) and significant breaks in the time series. The mean annual global bias is about 37 mm yr⁻¹, and the median is about 8 mm yr⁻¹. Among the largest mean annual biases are, for example, 3361 mm in the southern Andes, 2603 mm in the Western Ghats, and 2682 mm in Papua New Guinea. However, where station density is high, the precipitation distribution is correctly reproduced, even in orographically demanding regions such as the Alps.
Abstract
Weather radars have been widely used to detect and quantify precipitation and to nowcast severe weather for more than 50 years. Operational weather radars generate huge three-dimensional datasets that can accumulate to terabytes per day. It is therefore essential to review what can be done with the vast amounts of existing data and how present datasets should be managed for future climatologists. All weather radars provide the reflectivity factor, and this is the main parameter to be archived. Saving reflectivity as volumetric data in the original spherical coordinates allows for studies of the three-dimensional structure of precipitation, which can be applied to understand a number of processes, for example, analyzing hail or thunderstorm modes. Doppler velocity and polarimetric moments also have numerous applications for climate studies, for example, quality improvement of reflectivity and rain-rate retrievals, and for interrogating microphysical and dynamical processes. However, observational data alone are not useful if they are not accompanied by sufficient metadata. Since the lifetime of a radar ranges between 10 and 20 years, instruments are typically replaced or upgraded during climatologically relevant time periods. As a result, present metadata often do not apply to past data. This paper outlines the work of the Radar Task Team set up by the Atmospheric Observation Panel for Climate (AOPC) and summarizes results from a recent survey on the existence and availability of long time series. We also provide recommendations for archiving current and future data, along with examples of climatological studies in which radar data have already been used.
Abstract
The measurement of global precipitation, both rainfall and snowfall, is critical to a wide range of users and applications. Rain gauges are indispensable in the measurement of precipitation, remaining the de facto standard for precipitation information across Earth’s surface for hydrometeorological purposes. However, their distribution across the globe is limited: over land their distribution and density are variable, while over the oceans very few gauges exist, and where measurements are made they may not adequately reflect the rainfall amounts of the broader area. Critically, the number of gauges available, or appropriate for a particular study, varies greatly across the Earth owing to temporal sampling resolutions, periods of operation, data latency, and data access. Numbers of gauges range from a few thousand available in near real time, to about 100,000 for all “official” gauges, to possibly hundreds of thousands if all possible gauges are included. Gauges routinely used in the generation of global precipitation products cover an equivalent area of between about 250 and 3,000 m². For comparison, the center circle of a soccer pitch or a tennis court is about 260 m². Although each gauge should represent more than just the gauge orifice, autocorrelation distances of precipitation vary greatly with regime and integration period. Assuming each Global Precipitation Climatology Centre (GPCC)–available gauge is independent and represents a surrounding area of 5-km radius, this represents only about 1% of Earth’s surface. The situation is further confounded for snowfall, which has a greater measurement uncertainty.
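The "about 1%" figure can be sanity-checked in a few lines. The gauge count below is an assumption chosen for illustration (the abstract cites roughly 100,000 "official" gauges; the number available to GPCC is smaller and varies by product and period):

```python
import math

# Sanity check of the "~1% of Earth's surface" estimate.
# N_GAUGES is an illustrative assumption, not a figure from the abstract.
N_GAUGES = 65_000          # assumed number of GPCC-available gauges
RADIUS_KM = 5.0            # representative radius per gauge (from the abstract)
EARTH_RADIUS_KM = 6_371.0  # mean Earth radius

area_per_gauge = math.pi * RADIUS_KM ** 2            # ~78.5 km^2 per gauge
earth_surface = 4 * math.pi * EARTH_RADIUS_KM ** 2   # ~5.1e8 km^2
fraction = N_GAUGES * area_per_gauge / earth_surface

print(f"Represented fraction of Earth's surface: {fraction:.2%}")
```

With these assumptions the covered area is on the order of 5 million km² out of roughly 510 million km², i.e. close to 1%, consistent with the abstract's estimate.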
Abstract
There is high demand and a growing expectation for predictions of environmental conditions that go beyond 0–14-day weather forecasts, with outlooks extending to one or more seasons and beyond. This is driven by the needs of the energy, water management, and agriculture sectors, to name a few. There is an increasing realization that, unlike weather forecasts, prediction skill on longer time scales can leverage specific climate phenomena or conditions for a predictable signal above the weather noise. Currently, it is understood that these conditions are intermittent in time and have spatially heterogeneous impacts on skill, hence providing strategic windows of opportunity for skillful forecasts. Research points to such windows of opportunity, including El Niño or La Niña events, active periods of the Madden–Julian oscillation, disruptions of the stratospheric polar vortex, periods when certain large-scale atmospheric regimes are in place, and times when persistent anomalies occur in the ocean or at the land surface. Gains could be obtained by increasingly developing prediction tools and metrics that strategically target these specific windows of opportunity. Across the globe, reevaluating forecasts in this manner could find value in forecasts previously discarded as not skillful. Users’ expectations for prediction skill could be more adequately met, as they become better aware of when and where to expect skill and whether the prediction is actionable. Given that there is still untapped potential in terms of process understanding and prediction methodologies, it is safe to expect that forecast opportunities will expand in the future. Process research and the development of innovative methodologies will aid such progress.
Abstract
The European Reanalysis of Global Climate Observations 2 (ERA-CLIM2) is a European Union Seventh Framework Project started in January 2014 and due to be completed in December 2017. It aims to produce coupled reanalyses, which are physically consistent datasets describing the evolution of the global atmosphere, ocean, land surface, cryosphere, and the carbon cycle. ERA-CLIM2 has contributed to advancing the capacity for producing state-of-the-art climate reanalyses that extend back to the early twentieth century. ERA-CLIM2 has led to the generation of the first European ensemble of coupled ocean, sea ice, land, and atmosphere reanalyses of the twentieth century. The project has funded work to rescue and prepare observations and to advance the data-assimilation systems required to generate operational reanalyses, such as the ones planned by the European Union Copernicus Climate Change Service. This paper summarizes the main goals of the project, discusses some of its main areas of activities, and presents some of its key results.