Search Results

You are looking at 1–8 of 8 items for

  • Author or Editor: B. Katz
Akira Kasahara, Ramesh C. Balgovind, and B. Katz

Abstract

A scheme is proposed to incorporate satellite radiometric imagery data into the specification of initial conditions for the National Meteorological Center (NMC) operational global prediction model in order to improve the analysis of the divergent wind field in the tropics. The basic assumptions are that outgoing longwave radiation (OLR) data can provide 1) the division between convective (upward motion) and clear-sky (downward motion) areas, and 2) the height of convection cells. The intensity of ascending motion in the convective areas is estimated from the OLR data. The intensity of descending motion is evaluated from the thermodynamic energy balance between radiative cooling and adiabatic warming, since the local time change of temperature is small in the tropics. Once the vertical motion field is determined, the horizontal divergence field is calculated from the mass continuity equation, and a divergent wind field is then derived. The total wind is the sum of the new divergent wind and the rotational part, which is assumed to be unchanged. The proposed scheme is tested using the NMC analysis dataset of 21 January 1985 with satisfactory results.
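To make the diagnostic chain concrete, the relations below summarize it in standard pressure-coordinate notation; the symbols (ω for pressure vertical velocity, Q_R for the radiative heating rate, S_p for a static-stability parameter, χ for the velocity potential) are our illustrative choices, not necessarily the paper's.

```latex
% Clear-sky descent from the balance between radiative cooling and adiabatic
% warming, then divergence and divergent wind from the prescribed vertical motion.
\begin{align}
  \omega_{\mathrm{clear}} &\approx -\frac{Q_R}{c_p\,S_p}
    && \text{subsidence balancing radiative cooling } (Q_R < 0,\ S_p > 0), \\
  D &= -\frac{\partial \omega}{\partial p}
    && \text{horizontal divergence from mass continuity}, \\
  \nabla^{2}\chi &= D
    && \text{Poisson equation for the velocity potential}, \\
  \mathbf{V} &= \mathbf{V}_{\psi} + \nabla\chi
    && \text{total wind, with the rotational part } \mathbf{V}_{\psi} \text{ left unchanged}.
\end{align}
```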

Full access
Richard W. Katz and Marc B. Parlange

Abstract

Simple stochastic models fit to time series of daily precipitation amount have a marked tendency to underestimate the observed (or interannual) variance of monthly (or seasonal) total precipitation. By considering extensions of one particular class of stochastic model known as a chain-dependent process, the extent to which this “overdispersion” phenomenon is attributable to an inadequate model for high-frequency variation of precipitation is examined. For daily precipitation amount in January at Chico, California, fitting more complex stochastic models greatly reduces the underestimation of the variance of monthly total precipitation. One source of overdispersion, the number of wet days, can be completely eliminated through the use of a higher-order Markov chain for daily precipitation occurrence. Nevertheless, some of the observed variance remains unexplained and could possibly be attributed to low-frequency variation (sometimes termed “potential predictability”). Of special interest is the fact that these more complex stochastic models still underestimate the monthly variance, more so than does an alternative approach, in which the simplest form of chain-dependent process is conditioned on an index of large-scale atmospheric circulation.
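As a concrete illustration of the simplest form of chain-dependent process (first-order Markov occurrence plus independent wet-day amounts), the sketch below simulates monthly totals whose variance can be compared with an observed interannual variance; the parameter values are invented for illustration, not the fitted Chico values.

```python
# Minimal sketch of a first-order chain-dependent process for daily precipitation.
# Parameter values are illustrative only (not fitted to the Chico, CA data).
import numpy as np

rng = np.random.default_rng(0)

p01, p11 = 0.25, 0.60          # P(wet | previous day dry), P(wet | previous day wet)
shape, scale = 0.8, 8.0        # gamma parameters for wet-day amounts (mm)
n_months, n_days = 1000, 31    # simulate many independent "Januaries"

monthly_totals = []
for _ in range(n_months):
    wet = rng.random() < p01   # crude initialization from the dry state
    total = 0.0
    for _ in range(n_days):
        wet = rng.random() < (p11 if wet else p01)
        if wet:
            total += rng.gamma(shape, scale)
    monthly_totals.append(total)

monthly_totals = np.asarray(monthly_totals)
print(f"mean monthly total:         {monthly_totals.mean():6.1f} mm")
print(f"variance of monthly totals: {monthly_totals.var():6.1f} mm^2")
# Comparing this model-implied variance with the observed interannual variance
# of monthly totals quantifies the "overdispersion" discussed above.
```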

Full access
Marc B. Parlange and Richard W. Katz

Abstract

The Richardson model is a popular technique for stochastic simulation of daily weather variables, including precipitation amount, maximum and minimum temperature, and solar radiation. This model is extended to include two additional variables, daily mean wind speed and dewpoint, because these variables (or related quantities such as relative humidity) are required as inputs for certain ecological/vegetation response and agricultural management models. To allow for the positively skewed distribution of wind speed, a power transformation is applied. Solar radiation also is transformed to make the shape of its modeled distribution more realistic. A model identification criterion is used as an aid in determining whether the distributions of these two variables depend on precipitation occurrence. The approach can be viewed as an integration of what is known about the statistical properties of individual weather variables into a single multivariate model.

As an application, this extended model is fitted to weather data in the Pacific Northwest. To aid in understanding how such a stochastic weather generator works, considerable attention is devoted to its statistical properties. In particular, marginal and conditional distributions of wind speed and solar radiation are examined; the model is capable of representing relationships between variables in which the variance is not constant, as well as certain forms of nonlinearity.
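A minimal sketch of the power-transformation step for wind speed is given below; the synthetic data and the grid of candidate exponents are assumptions for illustration, not the paper's fitted choices.

```python
# Sketch of a power transformation for positively skewed daily mean wind speed,
# so that a Gaussian model (as in Richardson-type generators) is more defensible.
# Data and exponent grid are illustrative, not taken from the paper.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
wind = 4.0 * rng.weibull(1.6, 5000)      # stand-in for observed daily mean wind (m/s)

# Pick the exponent that makes the transformed variable least skewed.
candidates = np.linspace(0.1, 1.0, 19)
p_best = min(candidates, key=lambda p: abs(stats.skew(wind ** p)))
z = wind ** p_best                        # approximately symmetric variable

print(f"chosen exponent: {p_best:.2f}")
print(f"skewness before: {stats.skew(wind):.2f}, after: {stats.skew(z):.2f}")
# Values of z simulated by the multivariate generator would be back-transformed
# via z ** (1 / p_best) to recover wind speeds.
```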

Full access
Mari R. Tye, David B. Stephenson, Greg J. Holland, and Richard W. Katz

Abstract

Reliable estimates of future changes in extreme weather phenomena, such as tropical cyclone maximum wind speeds, are critical for climate change impact assessments and the development of appropriate adaptation strategies. However, global and regional climate model outputs are often too coarse for direct use in these applications, with variables such as wind speed having truncated probability distributions compared to those of observations. This poses two problems: How can model-simulated variables best be adjusted to make them more realistic? And how can such adjustments be used to make more reliable predictions of future changes in their distribution?

This study investigates North Atlantic tropical cyclone maximum wind speeds from observations (1950–2010) and regional climate model simulations (1995–2005 and 2045–55 at 12- and 36-km spatial resolutions). The wind speed distributions in these datasets are well represented by the Weibull distribution, albeit with different scale and shape parameters.

A power-law transfer function is used to recalibrate the Weibull variables and obtain future projections of wind speeds. Two different strategies, bias correction and change factor, are tested by using 36-km model data to predict future 12-km model data (pseudo-observations). The strategies are also applied to the observations to obtain likely predictions of the future distributions of wind speeds. The strategies yield similar predictions of likely changes in the fraction of events within Saffir–Simpson categories—for example, an increase from 21% (1995–2005) to 27%–37% (2045–55) for category 3 or above events and an increase from 1.6% (1995–2005) to 2.8%–9.8% (2045–55) for category 5 events.
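The sketch below illustrates the kind of power-law transfer function that arises when two Weibull-distributed wind-speed samples are matched quantile by quantile; the data and parameters are synthetic, and this is a generic recalibration sketch rather than the paper's exact procedure.

```python
# Power-law transfer function between two Weibull-distributed wind-speed samples
# (synthetic data; shape/scale values are illustrative, not the fitted ones).
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
v_model = stats.weibull_min.rvs(c=2.0, scale=40.0, size=2000, random_state=rng)
v_obs   = stats.weibull_min.rvs(c=1.7, scale=50.0, size=2000, random_state=rng)

# Fit Weibull shape (c) and scale to each sample, fixing the location at zero.
c_m, _, s_m = stats.weibull_min.fit(v_model, floc=0)
c_o, _, s_o = stats.weibull_min.fit(v_obs, floc=0)

# Matching Weibull quantiles, F_model(v) = F_obs(v'), reduces to a power law:
#   v' = s_o * (v / s_m) ** (c_m / c_o)
def transfer(v):
    return s_o * (v / s_m) ** (c_m / c_o)

v_corrected = transfer(v_model)
print(f"95th percentile  model: {np.percentile(v_model, 95):5.1f}  "
      f"corrected: {np.percentile(v_corrected, 95):5.1f}  "
      f"obs: {np.percentile(v_obs, 95):5.1f}")
```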

Full access
R. van Hout, W. Zhu, L. Luznik, J. Katz, J. Kleissl, and M. B. Parlange

Abstract

Particle image velocimetry (PIV) measurements just within and above a mature corn canopy have been performed to clarify the small-scale spatial structure of the turbulence. The smallest resolved scales are about 15 times the Kolmogorov length scale (η ≈ 0.4 mm), the Taylor microscales are about 100η, and the Taylor-scale Reynolds numbers range between Rλ = 2000 and 3000. The vertical profiles of mean flow and turbulence parameters match those found in previous studies. Frequency spectra, obtained using the data as time series, are combined with instantaneous spatial spectra to resolve more than five orders of magnitude of length scales. They display an inertial range spanning three decades. However, the small-scale turbulence in the dissipation range exhibits anisotropy at all measurement heights, in spite of apparent agreement with conditions for reaching local isotropy, following a high-Reynolds-number wind tunnel study. The directly calculated subgrid-scale (SGS) energy flux, determined by spatially filtering the PIV data, increases significantly with decreasing filter size, providing support for the existence of a spectral shortcut that bypasses the cascading process and injects energy directly into small scales. The highest measured SGS flux is about 40% of the estimated energy cascading rate as determined from a −5/3 fit to the spectra. Terms appearing in the turbulent kinetic energy budget that can be calculated from the PIV data are in agreement with previous results. A very strong correlation between the dissipation rate and the out-of-plane component of the vorticity is demonstrated by a striking similarity between their time series.
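A generic version of the directly calculated SGS energy flux from planar velocity fields is sketched below; the synthetic field, filter choice, and in-plane-only stress components are assumptions for illustration and not the authors' exact processing.

```python
# Sketch: SGS energy flux Pi = -tau_ij * S~_ij from a planar (x, z) velocity field,
# using a box filter of `width` grid points. Synthetic data; illustrative only.
import numpy as np
from scipy.ndimage import uniform_filter

rng = np.random.default_rng(3)
dx = 15 * 0.4e-3                      # grid spacing ~15 Kolmogorov scales (m)
u = rng.standard_normal((256, 256))   # streamwise velocity (axis 0 = x, axis 1 = z)
w = rng.standard_normal((256, 256))   # vertical velocity in the same plane

def sgs_flux(u, w, dx, width=8):
    filt = lambda f: uniform_filter(f, size=width, mode="wrap")
    uf, wf = filt(u), filt(w)
    # SGS stresses (only the in-plane components are available from planar PIV)
    t11 = filt(u * u) - uf * uf
    t13 = filt(u * w) - uf * wf
    t33 = filt(w * w) - wf * wf
    # Filtered strain rates from finite differences
    dudx, dudz = np.gradient(uf, dx)
    dwdx, dwdz = np.gradient(wf, dx)
    s11, s33, s13 = dudx, dwdz, 0.5 * (dudz + dwdx)
    return -(t11 * s11 + t33 * s33 + 2.0 * t13 * s13)   # energy flux to subgrid scales

print("mean SGS flux (synthetic units):", sgs_flux(u, w, dx).mean())
```

Decreasing `width` mimics the decreasing filter size referred to above.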

Full access
Peter Rogowski, Mark Otero, Joel Hazard, Thomas Muschamp, Scott Katz, and Eric Terrill

Abstract

Accurate surface meteorological (MET) observations reported reliably and in near–real time remain a critical component of on-scene environmental observation systems. Presented is a system developed by Scripps Institution of Oceanography that allows for rapid, global deployment of ground-based weather observations to support both timely decision-making and collection of high-quality weather time series for science or military applications in austere environments. Named the Expeditionary Meteorological (XMET), these weather stations have been deployed in extreme, infrastructure-free conditions spanning tropical, polar, maritime, and desert environments, where near-continuous observations were reported. To date, over 1 million weather observations have been collected during 225 deployments around the world, with a data report success rate of 99.5%. XMET had its genesis during Operation Iraqi Freedom (OIF), when the U.S. Marine Corps 3rd Marine Aircraft Wing identified an immediate capability gap in environmental monitoring of its area of operations due to the high spatiotemporal variability of dust storms in the region. To address the sensing gap, XMET was developed to be a portable, expendable, ruggedized, self-contained, bidirectional weather observation station that can be quickly deployed anywhere in the world to autonomously sample and report aviation weather observations. This paper provides an overview of the XMET's design, its reliability in different environments, and examples of unique meteorological events that highlight both the unit's reliability and its ability to provide quality time series. The overview shows that expeditionary MET sensing systems such as XMET can provide the long-term continuous observational records in remote and austere locations that are essential for regional spatiotemporal MET characterization.

Restricted access
M. Kanamitsu, J.C. Alpert, K.A. Campana, P.M. Caplan, D.G. Deaven, M. Iredell, B. Katz, H.-L. Pan, J. Sela, and G.H. White

Abstract

A number of improvements were implemented on 6 March 1991 in the National Meteorological Center's global model, which is used in the global data assimilation system (GDAS), the aviation (AVN) forecast, and the medium-range forecast (MRF):

  • The horizontal resolution of the forecast model was increased from triangular truncation T80 to T126, corresponding to an increase in grid resolution from roughly 160 to 105 km (a rough conversion is sketched below).

  • Enhanced orography was discontinued and replaced by mean orography.

  • A new marine-stratus parameterization was introduced.

  • A new mass-conservation constraint was implemented.

  • The horizontal diffusion in the medium scales was reduced by adopting the Leith formulation.

  • A new, more accurate sea-surface temperature analysis is now used.

In this note, we discuss each of the changes and briefly review the new model performance.
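For the resolution change in the first item, a rough back-of-the-envelope conversion from triangular truncation to equatorial grid spacing is sketched below; the assumption of a quadratic Gaussian grid rounded up to an FFT-friendly number of longitudes is ours, not stated in the note.

```python
# Rough arithmetic behind the quoted grid resolutions. Assumes a quadratic
# Gaussian grid with at least 3N + 1 longitudes, rounded up to an FFT-friendly
# value (256 for T80, 384 for T126) -- our assumption, not from the note.
EARTH_CIRCUMFERENCE_KM = 40075.0

def grid_spacing_km(n_lon):
    return EARTH_CIRCUMFERENCE_KM / n_lon

for trunc, n_lon in (("T80", 256), ("T126", 384)):
    print(f"{trunc}: ~{grid_spacing_km(n_lon):.0f} km")   # ~157 km and ~104 km
```

These values are consistent with the roughly 160 and 105 km figures quoted above.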

Full access
Thomas C. Peterson, Richard R. Heim Jr., Robert Hirsch, Dale P. Kaiser, Harold Brooks, Noah S. Diffenbaugh, Randall M. Dole, Jason P. Giovannettone, Kristen Guirguis, Thomas R. Karl, Richard W. Katz, Kenneth Kunkel, Dennis Lettenmaier, Gregory J. McCabe, Christopher J. Paciorek, Karen R. Ryberg, Siegfried Schubert, Viviane B. S. Silva, Brooke C. Stewart, Aldo V. Vecchia, Gabriele Villarini, Russell S. Vose, John Walsh, Michael Wehner, David Wolock, Klaus Wolter, Connie A. Woodhouse, and Donald Wuebbles

Weather and climate extremes have been varying and changing on many different time scales. In recent decades, heat waves have generally become more frequent across the United States, while cold waves have been decreasing. While this is in keeping with expectations in a warming climate, it turns out that decadal variations in the number of U.S. heat and cold waves do not correlate well with the observed U.S. warming during the last century. Annual peak flow data reveal that river flooding trends on the century scale do not show uniform changes across the country. While flood magnitudes in the Southwest have been decreasing, flood magnitudes in the Northeast and north-central United States have been increasing. Confounding the analysis of trends in river flooding is multiyear and even multidecadal variability likely caused by both large-scale atmospheric circulation changes and basin-scale “memory” in the form of soil moisture. Droughts also have long-term trends as well as multiyear and decadal variability. Instrumental data indicate that the Dust Bowl of the 1930s and the drought in the 1950s were the most significant twentieth-century droughts in the United States, while tree ring data indicate that the megadroughts over the twelfth century exceeded anything in the twentieth century in both spatial extent and duration. The state of knowledge of the factors that cause heat waves, cold waves, floods, and drought to change is fairly good, with heat waves being the best understood.

Full access