Search Results

Showing 1–10 of 27 items for Author or Editor: Paul Joe

Paul Joe and Roland List

Abstract

Two laboratory optical array spectrometers with greyscale were evaluated for their sizing, depth-of-field, and timing performance; these three factors are needed to calculate concentrations and liquid water contents. The probes had resolutions of 10 and 150 μm, with threshold detection at 40%, 60%, and 80% and at 25%, 50%, and 75%, respectively. Sizing was a function of the particle location in the sample area and could be in error by up to four channels; the effect is most pronounced for smaller particles. The greyscale feature is a quantitative measure of the sharpness of focus of the particle image. It permits the determination of the particle location in the sample area, which in turn allows extraneous or missized particles to be rejected. It also allows a more accurate, user-definable sample volume. The timing check revealed substantial errors, which were corrected in software. Comparison with several methods of estimating liquid water content showed consistent results. The sizing and depth-of-field results are applicable to the more popular non-greyscale probes, since these effects depend on the optics and not on the greyscale feature. The greyscale probes provide definite advantages that would be useful in airborne applications.
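
As a rough illustration of how sizing, depth of field, and timing enter the concentration and liquid water content calculation, the Python sketch below walks through the arithmetic. Every number (bin edges, counts, airspeed, array width, and the greyscale-bounded depth of field) is invented for the example and is not a value from the study.

```python
import numpy as np

# How sizing, depth of field, and timing combine into concentration and liquid
# water content (LWC). All values here are invented for illustration.

RHO_W = 1000.0                                           # density of water, kg m^-3

bin_edges_um = np.array([50, 100, 150, 200, 300, 400])   # size bins (micrometres)
counts = np.array([120, 80, 35, 12, 3])                  # accepted particles per bin
airspeed = 100.0                                         # sample speed, m s^-1
sample_time = 10.0                                       # timing-corrected, s
array_width = 0.006                                      # effective array width, m
depth_of_field = 0.05                                    # greyscale-bounded, m

# Sample volume: the cross section swept through the air during the sample.
sample_volume = array_width * depth_of_field * airspeed * sample_time   # m^3

bin_mid_m = 0.5 * (bin_edges_um[:-1] + bin_edges_um[1:]) * 1e-6   # midpoints, m
concentration = counts / sample_volume                            # particles m^-3

# LWC: number density times the mass of a water sphere of each bin's diameter.
lwc = np.sum(concentration * (np.pi / 6.0) * RHO_W * bin_mid_m**3)       # kg m^-3

print(f"total concentration {concentration.sum():.0f} m^-3, LWC {lwc*1e3:.3f} g m^-3")
```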

Full access
Joe L. Sutherland and Paul M. Ostapuk

Abstract

No abstract available.

Full access
Thomas Hengstebeck, Kathrin Wapler, Dirk Heizenreder, and Paul Joe

Abstract

The radar network of the German Weather Service [Deutscher Wetterdienst (DWD)] provides 3D Doppler data at high spatial and temporal resolution, supporting the identification and tracking of dynamic small-scale weather phenomena. The software framework Polarimetric Radar Algorithms (POLARA) has been developed at DWD to better exploit the capabilities of the existing remote sensing data. The data processing and quality assurance implemented in POLARA include a dual-PRF dealiasing algorithm with error correction. Azimuthal shear information is derived and processed in the mesocyclone detection algorithm (MCD). Low- and midlevel azimuthal shear and track products are available as composite (multiradar) products. Azimuthal shear may be considered a proxy for rotation. The MCD results and azimuthal shear products are part of DWD's severe weather detection algorithms and are provided to the forecaster on the NinJo meteorological workstation system. The forecaster analyzes potentially severe cells by combining near-storm environment data with the MCD product as well as with the instantaneous azimuthal shear products (mid- and low level) and their tracks. These products and tracks are used to diagnose threat potential by means of azimuthal shear intensity and track longevity. Feedback from forecasters has shown the utility of the algorithms for analyzing and diagnosing severe convective cells in Germany and adjacent parts of Europe. In this paper, the abovementioned algorithms and products are presented in detail, and case studies illustrating usability and performance are shown.
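
A minimal sketch of azimuthal shear as a rotation proxy: the difference in Doppler radial velocity between adjacent beams, divided by the arc length between them. This is an illustrative calculation on toy data, not the POLARA implementation.

```python
import numpy as np

# Azimuthal shear as a rotation proxy: radial-velocity difference between
# adjacent beams divided by the arc length between them. Toy data, not the
# POLARA implementation.

def azimuthal_shear(vr, ranges_m, dazimuth_deg):
    """vr: radial velocity, shape (n_azimuths, n_ranges); returns shear in s^-1."""
    dtheta = np.deg2rad(dazimuth_deg)
    dv = np.diff(vr, axis=0)                  # velocity change beam to beam
    arc = ranges_m[np.newaxis, :] * dtheta    # arc length between beams (m)
    return dv / arc

# A +/-20 m s^-1 velocity couplet across one degree of azimuth at 30-km range.
ranges = np.arange(250.0, 60000.0, 250.0)
vr = np.zeros((360, ranges.size))
gate = int(np.argmin(np.abs(ranges - 30000.0)))
vr[100, gate], vr[101, gate] = -20.0, 20.0

shear = azimuthal_shear(vr, ranges, dazimuth_deg=1.0)
print(f"peak azimuthal shear: {shear.max():.3f} s^-1")   # about 0.076 s^-1
```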

Open access
Robert Benoit, Pierre Pellerin, Nick Kouwen, Harold Ritchie, Norman Donaldson, Paul Joe, and E. D. Soulis

Abstract

The purpose of this study is to present the possibilities offered by coupled atmospheric and hydrologic models as a new tool to validate and interpret results produced by atmospheric models. The advantages offered by streamflow observations differ from those offered by conventional precipitation observations. The dependence between basins and subbasins can be very useful, and the integrating effect of large basins facilitates the evaluation of state-of-the-art atmospheric models by filtering out some of the spatial and temporal variability that complicates the point-by-point verifications more commonly used. Streamflow permits a better estimate of the amount of water that has fallen over a region. A comparison of the streamflow predicted by the coupled atmospheric–hydrologic model with the measured streamflow is sufficiently sensitive to clearly assess atmospheric model improvements resulting from increased horizontal resolution and altered treatment of precipitation processes in the model.

A case study using the WATFLOOD hydrologic model, developed at the University of Waterloo, is presented for several southern Ontario river basins. WATFLOOD is one-way coupled to a nonhydrostatic mesoscale atmospheric model integrated at horizontal resolutions of 35, 10, and 3 km. The hydrologic model is also driven by radar-derived precipitation amounts from the King City radar. Rain gauge observations and measured streamflows are also available for this case, permitting multiple validation comparisons. These experiments reveal the uncertainties associated with each tool independently, as well as the complementary nature of the tools when used together. The predicted precipitation patterns are also compared directly with rain gauge observations and with radar data. It is demonstrated that the hydrologic model is sufficiently sensitive and accurate to diagnose model and radar errors. This tool brings an additional degree of verification that will be very important in the improvement of technologies associated with atmospheric models, radar observations, and water resource management.
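
The streamflow-based verification described above amounts to comparing simulated and observed hydrographs. One common way to score such a comparison is volume bias plus the Nash–Sutcliffe efficiency; the sketch below uses synthetic hydrographs standing in for WATFLOOD runs, not the study's data.

```python
import numpy as np

# Scoring simulated against observed streamflow with volume bias and the
# Nash-Sutcliffe efficiency (NSE). The hydrographs are synthetic, standing in
# for runs driven by coarse- and fine-resolution atmospheric forecasts.

def nse(sim, obs):
    """Nash-Sutcliffe efficiency: 1 is perfect; 0 is no better than the mean."""
    sim, obs = np.asarray(sim, float), np.asarray(obs, float)
    return 1.0 - np.sum((sim - obs)**2) / np.sum((obs - obs.mean())**2)

obs = np.array([12.0, 15.0, 40.0, 85.0, 60.0, 33.0, 20.0, 15.0])       # m^3 s^-1
sim_35km = np.array([10.0, 13.0, 28.0, 60.0, 55.0, 35.0, 24.0, 18.0])  # smoothed peak
sim_3km = np.array([11.0, 14.0, 37.0, 80.0, 62.0, 34.0, 21.0, 16.0])   # sharper peak

for name, sim in (("35 km", sim_35km), ("3 km", sim_3km)):
    bias = (sim.sum() - obs.sum()) / obs.sum() * 100.0
    print(f"{name}: volume bias {bias:+.1f}%, NSE {nse(sim, obs):.2f}")
```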

Full access
Mingling R. Wu, Bradley J. Snyder, Ruping Mo, Alex J. Cannon, and Paul I. Joe

Abstract

The East Vancouver Island region on the west coast of Canada is prone to heavy snow in winter because of its unique geographical setting, which involves complicated interactions among the atmosphere, the ocean, and the local topography. The challenge for operational meteorologists is to distinguish a weather system that produces extreme snow amounts in this region from one that produces modest amounts. In this study, subjective, objective, and hybrid classification techniques are used to analyze the characteristics of 81 snowstorms observed in the region over a 10-yr period (2000–09). It is demonstrated that four principal weather patterns (occluded front, lee low, warm advection, and convective storm) are conducive to heavy snow over East Vancouver Island. The occluded front pattern produces snow events most frequently, while the lee low pattern produces the most extreme snowfalls and poses the biggest forecast challenge.

Based on the identified weather patterns and a further investigation of five key weather ingredients, four conceptual models are developed to illustrate the meteorological processes leading to significant snowfalls over East Vancouver Island. These conceptual models can help meteorologists better understand and identify weather systems that produce heavy snowfalls in this region and, therefore, improve forecasting and warning performance.
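
The abstract does not detail the objective technique, but a common choice for objective weather-pattern classification is k-means clustering of standardized per-storm descriptors. The sketch below is an illustrative stand-in on invented features, not the authors' method; the four-cluster target simply mirrors the four patterns found in the paper.

```python
import numpy as np
from sklearn.cluster import KMeans

# Illustrative stand-in for the objective step: k-means on standardized
# per-storm descriptors. Feature choices and data are invented; treating the
# circular wind direction as a linear variable is a deliberate simplification.

rng = np.random.default_rng(0)
n_storms = 81
features = np.column_stack([
    rng.normal(-6.0, 4.0, n_storms),      # sea level pressure anomaly (hPa)
    rng.normal(-4.0, 2.0, n_storms),      # 850-hPa temperature (deg C)
    rng.uniform(0.0, 360.0, n_storms),    # low-level wind direction (deg)
])
z = (features - features.mean(axis=0)) / features.std(axis=0)

labels = KMeans(n_clusters=4, n_init=10, random_state=0).fit_predict(z)
for k in range(4):
    print(f"pattern {k}: {np.count_nonzero(labels == k)} storms")
```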

Full access
Armin Dehghan, Zen Mariani, Sylvie Leroyer, David Sills, Stéphane Bélair, and Paul Joe

Abstract

Canadian Global Environmental Multiscale (GEM) numerical model output was compared with meteorological data from an enhanced observational network to investigate the model's ability to predict Lake Ontario lake breezes and their characteristics for two cases in the Greater Toronto Area: one in which the large-scale wind opposed the lake breeze and one in which it blew in the same direction as the lake breeze. An enhanced observational network of surface meteorological stations, a C-band radar, and two Doppler wind lidars was deployed, among other sensors, during the 2015 Pan and Parapan American Games in Toronto. The GEM model was run for three nested domains with grid spacings of 2.5, 1, and 0.25 km. Comparisons between the model predictions and ground-based observations showed that the model successfully predicted lake breezes for the two events. The results indicated that the 1- and 0.25-km GEM configurations increased the forecast accuracy of the lake-breeze location, updraft intensity, and depth. The accuracy of the modeled lake-breeze timing was approximately ±135 min. The model underpredicted the surface cooling caused by the lake breeze. The GEM 0.25-km model significantly improved the temperature forecast accuracy during the lake-breeze circulations, reducing the bias by up to 72%, but it mainly underpredicted the moisture and overpredicted the surface wind speed. Root-mean-square errors of the wind direction forecasts were generally high because of large biases and high variability of the errors.
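
Wind-direction statistics such as those mentioned above require care because direction is circular: errors must be wrapped to the interval [-180°, 180°) before computing bias and RMSE. The synthetic example below shows how a large systematic bias inflates the RMSE, one plausible reading of the reported behavior; the numbers are not the study's data.

```python
import numpy as np

# Wind-direction verification on synthetic data. Differences are wrapped into
# [-180, 180) degrees before computing bias and RMSE; with a large systematic
# bias, RMSE is dominated by the bias term.

def wrap180(diff_deg):
    """Map angular differences into [-180, 180) degrees."""
    return (diff_deg + 180.0) % 360.0 - 180.0

obs_dir = np.array([200.0, 210.0, 190.0, 205.0, 215.0])   # observed (deg)
mod_dir = np.array([250.0, 255.0, 240.0, 260.0, 245.0])   # modeled, veered (deg)

err = wrap180(mod_dir - obs_dir)
bias = err.mean()
rmse = np.sqrt(np.mean(err**2))
print(f"wind-direction bias {bias:+.1f} deg, RMSE {rmse:.1f} deg")
```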

Open access
Bart van den Hurk, Martin Best, Paul Dirmeyer, Andy Pitman, Jan Polcher, and Joe Santanello

Abstract

No abstract available.

Full access
Rodger A. Brown, Thomas A. Niziol, Norman R. Donaldson, Paul I. Joe, and Vincent T. Wood

Abstract

During the winter, lake-effect snowstorms that form over Lake Ontario represent a significant weather hazard for the populace around the lake. These storms, which typically are only 2 km deep, frequently produce narrow swaths (20–50 km wide) of heavy snowfall (2–5 cm h⁻¹ or more) that extend 50–75 km inland over populated areas. Subtle changes in the low-altitude flow direction can mean the difference between accumulations that last for 1–2 h and accumulations that last 24 h or more at a given location. Therefore, it is vital that the radars surrounding the lake be able to detect the presence and strength of these shallow storms. Starting in 2002, the Canadian operational radars on the northern side of the lake at King City, Ontario, and Franktown, Ontario, began using elevation angles as low as −0.1° and 0.0°, respectively, during the winter to more accurately estimate snowfall rates at the surface. Meanwhile, the Weather Surveillance Radars-1988 Doppler (WSR-88Ds) in New York State on the southern and eastern sides of the lake, at Buffalo (KBUF), Binghamton (KBGM), and Montague (KTYX), all operate at 0.5° and above. KTYX is located on a plateau that overlooks the lake from the east at a height of 0.5 km. With its upward-pointing radar beams, KTYX's detection of shallow lake-effect snowstorms is limited to the eastern quarter of the lake and the surrounding terrain. The purpose of this paper is to show, through simulations, the dramatic increase in snowstorm coverage that would be possible if KTYX were able to scan downward toward the lake's surface. Furthermore, if KBUF and KBGM were to scan as low as 0.2°, the five radars would completely cover at least the upper portions of lake-effect storms over Lake Ontario and all of the surrounding land area. Overlake coverage in the lower half (0–1 km) of a typical lake-effect snowstorm would increase from about 40% to about 85%, resulting in better estimates of snowfall rates in landfalling snowbands over a much broader area.
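
The coverage argument follows from simple beam geometry. Under the standard 4/3 effective-Earth-radius model, the beam centerline height at range r and elevation θ is h = sqrt(r² + R′² + 2 r R′ sin θ) − R′ + h_ant, with R′ = (4/3) × 6371 km. The Python sketch below evaluates this for the tilts discussed; the antenna height is a rough assumption, not a surveyed value.

```python
import numpy as np

# Beam-centerline height under the 4/3 effective-Earth-radius model. This
# shows why a 0.5-degree tilt overshoots a 2-km-deep storm at long range while
# near-zero or negative tilts do not. The antenna height is a rough assumption.

KE_RE = (4.0 / 3.0) * 6371000.0   # effective Earth radius, m

def beam_height(range_m, elev_deg, antenna_m):
    """Height of the beam centerline above the radar's base level (m)."""
    theta = np.deg2rad(elev_deg)
    return (np.sqrt(range_m**2 + KE_RE**2 + 2.0 * range_m * KE_RE * np.sin(theta))
            - KE_RE + antenna_m)

ranges = np.array([50e3, 100e3, 150e3])              # 50, 100, 150 km
for elev in (-0.1, 0.0, 0.2, 0.5):
    h = beam_height(ranges, elev, antenna_m=500.0)   # 0.5-km plateau, as at KTYX
    levels = ", ".join(f"{x / 1000.0:.2f}" for x in h)
    print(f"elev {elev:+.1f} deg: beam height at 50/100/150 km = {levels} km")
```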

Full access
Paul Joe, Don Burgess, Rod Potts, Tom Keenan, Greg Stumpf, and Andrew Treloar

Abstract

One of the main goals of the Sydney 2000 Forecast Demonstration Project was to demonstrate the efficacy and utility of automated severe weather detection radar algorithms. As a contribution to this goal, this paper describes the radar-based severe weather algorithms used in the project, their performance, and related radar issues. Participants in this part of the project included the National Severe Storms Laboratory (NSSL) Warning Decision Support System (WDSS), the Meteorological Service of Canada Canadian Radar Decision Support (CARDS) system, the National Center for Atmospheric Research Thunderstorm Initiation, Tracking, Analysis, and Nowcasting (TITAN) system, and a precipitation-typing algorithm from the Bureau of Meteorology Research Centre C-band polarimetric (C-Pol) radar. Three radars were available: the S-band reflectivity-only operational radar, the C-band Doppler Kurnell radar, and the C-band Doppler polarimetric C-Pol radar.

The radar algorithms attempt to diagnose the presence of storm cells; provide storm tracks; identify mesocyclone circulations, downbursts and/or microbursts, and hail; and provide a storm ranking. The tracking and identification of cells was undertaken using TITAN and WDSS. Three versions of TITAN were employed to track weak and strong cells. Results show that the TITAN cell detection thresholds influence the ability of the algorithm to clearly identify storm cells and to correctly track the storms. The WDSS algorithms were set up with lower volume thresholds and provided many more tracks. The WDSS and CARDS circulation algorithms were adapted to the Southern Hemisphere. CARDS had lower detection thresholds and, hence, detected more circulations than WDSS. Radial-velocity-based and reflectivity-based downburst algorithms were available from CARDS. Since the reflectivity-based algorithm was based on features aloft, it provided an earlier indication of strong surface winds. Three different hail algorithms from WDSS, CARDS, and C-Pol provided output on the presence, the probability, and the size of hail. Although the algorithms differed considerably, they provided similar results, and the size distributions were similar to observations. The WDSS provided a ranking algorithm to identify the most severe storm.

Many of the algorithms had been adapted and altered to account for differences in radar technology, configuration, and meteorological regime. The various combinations of different algorithms and different radars provided an unprecedented opportunity to study the impact of radar technology on the performance of the severe weather algorithms. The algorithms were able to operate on both single- and dual-pulse repetition frequency Doppler radars and on C- and S-band radars with minimal changes. The biggest influence on the algorithms was data quality. Beamwidth smoothing limited the effective range of the algorithms, and ground clutter and ground clutter filtering affected the quality of the low-level radial velocities and the detection of low-level downbursts. The cycle time of the volume scans significantly affected the tracking results.
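
As a toy illustration of the threshold sensitivity noted above, the sketch below identifies "cells" as connected regions exceeding a reflectivity threshold (the basic idea behind TITAN-style cell detection) and shows how the number of detected cells changes with the threshold. The reflectivity field is synthetic and the thresholds are illustrative, not the project's settings.

```python
import numpy as np
from scipy import ndimage

# Threshold sensitivity of cell detection: label connected regions above a
# reflectivity threshold and count how many survive a minimum-size filter.
# Synthetic field, illustrative thresholds.

rng = np.random.default_rng(1)
dbz = rng.normal(15.0, 8.0, size=(100, 100))   # noisy background reflectivity
dbz[40:55, 40:60] += 35.0                      # one strong embedded storm
dbz[70:76, 20:25] += 20.0                      # one weaker cell

for threshold in (30.0, 40.0, 50.0):
    mask = dbz >= threshold
    labeled, n_regions = ndimage.label(mask)
    sizes = ndimage.sum(mask, labeled, range(1, n_regions + 1))
    n_cells = int(np.sum(np.asarray(sizes) >= 10))     # keep regions >= 10 pixels
    print(f"{threshold:.0f} dBZ: {n_regions} regions, {n_cells} cells >= 10 px")
```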

Full access
Chris Kidd, Andreas Becker, George J. Huffman, Catherine L. Muller, Paul Joe, Gail Skofronick-Jackson, and Dalia B. Kirschbaum

Abstract

The measurement of global precipitation, both rainfall and snowfall, is critical to a wide range of users and applications. Rain gauges are indispensable in the measurement of precipitation, remaining the de facto standard for precipitation information across Earth's surface for hydrometeorological purposes. However, their distribution across the globe is limited: over land their distribution and density are variable, while over the oceans very few gauges exist, and where measurements are made they may not adequately reflect the rainfall amounts of the broader area. Critically, the number of gauges available, or appropriate for a particular study, varies greatly across the Earth owing to temporal sampling resolutions, periods of operation, data latency, and data access. Gauge numbers range from a few thousand available in near real time, to about 100,000 for all "official" gauges, to possibly hundreds of thousands if all possible gauges are included. Gauges routinely used in the generation of global precipitation products cover a combined orifice area of between about 250 and 3,000 m². For comparison, the center circle of a soccer pitch or a tennis court is about 260 m². Although each gauge should represent more than just its orifice, autocorrelation distances of precipitation vary greatly with regime and integration period. Assuming each Global Precipitation Climatology Centre (GPCC)–available gauge is independent and represents a surrounding area of 5-km radius, the gauges cover only about 1% of Earth's surface. The situation is further confounded for snowfall, which has a greater measurement uncertainty.
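
The coverage numbers above can be reproduced with back-of-envelope arithmetic, as in the Python sketch below. The 200-mm orifice diameter and the GPCC gauge count of about 65,000 are assumptions chosen so the results match the quoted figures; neither value is stated in the abstract.

```python
import numpy as np

# Back-of-envelope check of the coverage figures quoted above. The orifice
# diameter (200 mm) and GPCC gauge count (~65,000) are assumptions chosen to
# reproduce the abstract's round numbers, not values from the paper.

EARTH_SURFACE = 4.0 * np.pi * 6371000.0**2       # m^2, about 5.1e14

orifice = np.pi * 0.1**2                         # m^2 per gauge (200-mm diameter)
for n_gauges in (8_000, 100_000):
    area = n_gauges * orifice
    print(f"{n_gauges:>7} gauges -> combined orifice area ~{area:,.0f} m^2")
# ~250 m^2 for a few thousand gauges, ~3,100 m^2 for 100,000 gauges

n_gpcc = 65_000                                  # assumed GPCC-available count
footprint = np.pi * 5000.0**2                    # 5-km-radius circle per gauge
coverage = n_gpcc * footprint / EARTH_SURFACE
print(f"assumed 5-km footprints cover {coverage:.1%} of Earth's surface")
```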

Full access