Search Results

1–9 of 9 items for Author or Editor: Richard W. Katz in Journal of Climate
Richard W. Katz

Abstract

A dynamic decision-making problem is considered involving the use of information about the autocorrelation of a climate variable. Specifically, an infinite horizon, discounted version of the dynamic cost-loss ratio model is treated, in which only two states of weather (“adverse” or “not adverse”) are possible and only two actions are permitted (“protect” or “do not protect”). To account for the temporal dependence of the sequence of states of the occurrence (or nonoccurrence) of adverse weather, a Markov chain model is employed. It is shown that knowledge of this autocorrelation has potential economic value to a decision maker, even without any genuine forecasts being available. Numerical examples are presented to demonstrate that a decision maker who erroneously follows a suboptimal strategy based on the belief that the climate variable is temporally independent could incur unnecessary expense. This approach also provides a natural framework for extension to the situation in which forecasts are available for an autocorrelated climate variable.
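
As a rough illustration of the kind of decision problem described above (not the paper's actual formulation), the sketch below runs value iteration for a two-state, two-action discounted cost-loss model in which the occurrence of adverse weather follows a first-order Markov chain. The cost, loss, discount factor, and transition probabilities are hypothetical.

```python
import numpy as np

# Hypothetical parameters for illustration (not values from the paper):
C, L = 1.0, 5.0           # protection cost; loss incurred if unprotected in adverse weather
beta = 0.95               # discount factor
# Markov chain for adverse-weather occurrence:
# state 0 = "not adverse", state 1 = "adverse"; P[i, j] = Pr(tomorrow j | today i)
P = np.array([[0.85, 0.15],
              [0.40, 0.60]])

V = np.zeros(2)           # expected discounted cost given today's observed state
for _ in range(10000):
    cost_protect = C + beta * (P @ V)                 # pay C, avoid any loss
    cost_no_protect = P[:, 1] * L + beta * (P @ V)    # expected loss if adverse tomorrow
    V_new = np.minimum(cost_protect, cost_no_protect)
    if np.max(np.abs(V_new - V)) < 1e-10:
        V = V_new
        break
    V = V_new

# Because the weather is unaffected by the action, the optimal rule compares
# the protection cost with the state-conditional expected loss.
optimal = np.where(C <= P[:, 1] * L, "protect", "do not protect")
print(dict(zip(["after a not-adverse day", "after an adverse day"], optimal)), V)
```

A decision maker who believed the weather sequence to be temporally independent would replace the state-conditional probabilities P[:, 1] with the single stationary probability of adverse weather, taking the same action every day; that is the suboptimal strategy the abstract refers to.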

Richard W. Katz

Abstract

A statistical procedure is described for making inferences about changes in climate variability. The fundamental question of how to define climate variability is first addressed, and a definition of intrinsic climate variability based on a “prewhitening” of the data is advocated. A test for changes in variability that is not sensitive to departures from the assumption of a Gaussian distribution for the data is outlined. In addition to establishing whether observed differences in variability are statistically significant, the procedure provides confidence intervals for the ratio of variability. The technique is applied to time series of daily mean surface air temperature generated by the Oregon State University atmospheric general circulation model. The test application provides estimates of the magnitude of change in variability that the procedure would be likely to detect.
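
A minimal sketch of the general idea, assuming an AR(1) prewhitening step and using a median-centered Levene (Brown–Forsythe) test as one example of a variance test that tolerates non-Gaussian data; this illustrates the concept rather than reproducing the paper's exact procedure.

```python
import numpy as np
from scipy import stats

def prewhiten_ar1(x):
    """Remove lag-1 autocorrelation, returning the innovation (prewhitened) series."""
    x = np.asarray(x, dtype=float)
    x = x - x.mean()
    phi = np.sum(x[1:] * x[:-1]) / np.sum(x[:-1] ** 2)   # AR(1) coefficient estimate
    return x[1:] - phi * x[:-1]

def variance_change(series1, series2):
    """Ratio of intrinsic variabilities plus a robust test for a change in variance."""
    e1, e2 = prewhiten_ar1(series1), prewhiten_ar1(series2)
    ratio = e2.var(ddof=1) / e1.var(ddof=1)
    stat, pval = stats.levene(e1, e2, center="median")    # robust to non-Gaussian data
    return ratio, pval

# Synthetic daily-mean temperature anomalies with a 30% increase in variability
rng = np.random.default_rng(0)
x1 = rng.standard_normal(500)
x2 = 1.3 * rng.standard_normal(500)
print(variance_change(x1, x2))
```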

Mary W. Downton and Richard W. Katz

Abstract

Many climatic applications, including detection of climate change, require temperature time series that are free from discontinuities introduced by nonclimatic events such as relocation of weather stations. Although much attention has been devoted to discontinuities in the mean, possible changes in the variance have not been considered. A method is proposed to test and possibly adjust for nonclimatic inhomogeneities in the variance of temperature time series. The method is somewhat analogous to that developed by Karl and Williams to adjust for nonclimatic inhomogeneities in the mean. It uses the nonparametric bootstrap technique to compute confidence intervals for the discontinuity in variance. The method is tested on 1901–88 summer and winter mean maximum temperature data from 21 weather stations in the midwestern United States. The reasonableness, reliability, and accuracy of the estimated changes in variance are evaluated.

The bootstrap technique is found to be a valuable tool for obtaining confidence limits on the proposed variance adjustment. Inhomogeneities in variance are found to be more frequent than would be expected by chance in the summer temperature data, indicating that variance inhomogeneity is indeed a problem. Precision of the estimates in the test data indicates that changes of about 25%–30% in standard deviation can be detected if sufficient data are available. However, estimates of the changes in the standard deviation may be unreliable when fewer than 10 years of data are available before or after a potential discontinuity. This statistical test can be a useful tool for screening out stations that have unacceptably large discontinuities in variance.
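
A minimal sketch of a percentile-bootstrap confidence interval for the ratio of variances before and after a candidate discontinuity. The resampling scheme, interval type, and data here are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

def bootstrap_variance_ratio_ci(before, after, n_boot=2000, alpha=0.05, seed=0):
    """Nonparametric bootstrap CI for the ratio of variances (after/before) across a
    candidate nonclimatic discontinuity, e.g., a station relocation."""
    rng = np.random.default_rng(seed)
    before, after = np.asarray(before, float), np.asarray(after, float)
    ratios = np.empty(n_boot)
    for b in range(n_boot):
        rb = rng.choice(before, size=before.size, replace=True)
        ra = rng.choice(after, size=after.size, replace=True)
        ratios[b] = ra.var(ddof=1) / rb.var(ddof=1)
    lo, hi = np.quantile(ratios, [alpha / 2, 1 - alpha / 2])
    return after.var(ddof=1) / before.var(ddof=1), (lo, hi)

# Hypothetical summer mean maximum temperatures (deg C) around a relocation
rng = np.random.default_rng(1)
tmax_before = 30 + 2.0 * rng.standard_normal(40)
tmax_after = 30 + 2.6 * rng.standard_normal(40)
print(bootstrap_variance_ratio_ci(tmax_before, tmax_after))
```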

Richard W. Katz and Xiaogu Zheng

Abstract

Stochastic models fit to time series of daily precipitation amount generally ignore any year-to-year (i.e., low frequency) source of random variation, and such models are known to underestimate the interannual variance of monthly or seasonal total precipitation. To explicitly account for this “overdispersion” phenomenon, a mixture model is proposed. A hidden index, taking on one of two possible states, is assumed to exist (perhaps representing different modes of atmospheric circulation). To represent the intermittency of precipitation and the tendency of wet or dry spells to persist, a stochastic model known as a chain-dependent process is applied. The parameters of this stochastic model are permitted to vary conditionally on the hidden index.

Data for one location in California (whose previous study motivated the present approach), as well as for another location in New Zealand, are analyzed. To estimate the parameters of a mixture of two conditional chain-dependent processes by maximum likelihood, the “expectation-maximization algorithm” is employed. It is demonstrated that this approach can either eliminate or greatly reduce the extent of the overdispersion phenomenon. Moreover, an attempt is made to relate the hidden indexes to observed features of atmospheric circulation. This approach to dealing with overdispersion is contrasted with the more prevalent alternative of fitting more complex stochastic models for high-frequency variations to time series of daily precipitation.
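
The simulation sketch below illustrates the overdispersion mechanism: monthly totals are generated from a chain-dependent process whose parameters switch according to a hidden two-state index, and their variance is compared with that of a single (unconditional) chain-dependent process. The parameter values, exponential amount distribution, and mixing weight are assumptions for illustration only; the paper fits such a mixture by maximum likelihood via the EM algorithm rather than by simulation.

```python
import numpy as np

rng = np.random.default_rng(2)

def simulate_month(n_days, p01, p11, mean_amt):
    """One month of a chain-dependent process: first-order Markov occurrences
    (wet-follows-dry prob p01, wet-follows-wet prob p11) and exponential wet-day amounts."""
    wet, total = 0, 0.0
    for _ in range(n_days):
        p = p11 if wet else p01
        wet = rng.random() < p
        if wet:
            total += rng.exponential(mean_amt)
    return total

def monthly_totals(n_months, params_by_state=None, single_params=None, w=0.5):
    totals = np.empty(n_months)
    for m in range(n_months):
        if params_by_state is not None:
            # hidden index: each month drawn from one of two circulation "states"
            params = params_by_state[bool(rng.random() < w)]
        else:
            params = single_params
        totals[m] = simulate_month(31, *params)
    return totals

wet_state, dry_state = (0.5, 0.8, 12.0), (0.2, 0.5, 6.0)   # (p01, p11, mean amount in mm)
mix = monthly_totals(5000, params_by_state={True: wet_state, False: dry_state})
single = monthly_totals(5000, single_params=(0.35, 0.65, 9.0))
print("mixture variance:", round(mix.var()), "single-process variance:", round(single.var()))
```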

Barbara G. Brown and Richard W. Katz

Abstract

The statistical theory of extreme values is applied to daily minimum and maximum temperature time series in the U.S. Midwest and Southeast. If the spatial pattern in the frequency of extreme temperature events can be explained simply by shifts in location and scale parameters (e.g., the mean and standard deviation) of the underlying temperature distribution, then the area under consideration could be termed a “region.” A regional analysis of temperature extremes suggests that the Type I extreme value distribution is a satisfactory model for extreme high temperatures. On the other hand, the Type III extreme value distribution (possibly with common shape parameter) is often a better model for extreme low temperatures. Hence, our concept of a region is appropriate when considering maximum temperature extremes, and perhaps also for minimum temperature extremes.

Based on this regional analysis, if a temporal climate change were analogous to a spatial relocation, then it would be possible to anticipate how the frequency of extreme temperature events might change. Moreover, if the Type III extreme value distribution were assumed instead of the more common Type I, then the sensitivity of the frequency of extremes to changes in the location and scale parameters would be greater.
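
As a hedged sketch of the kind of fitting involved (synthetic data, scipy's parameterization rather than the paper's): the Type I (Gumbel) distribution is fit to annual maxima, while the negated annual minima are fit with a general GEV, for which a positive scipy shape parameter corresponds to the bounded Type III case.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)

# Synthetic annual temperature extremes (deg C); real applications use station records
ann_max = 38.0 + 2.5 * rng.gumbel(size=60)       # annual maximum temperatures
ann_min = -15.0 - 4.0 * rng.gumbel(size=60)      # annual minimum temperatures

# Type I (Gumbel) fit for extreme highs
loc_hi, scale_hi = stats.gumbel_r.fit(ann_max)

# General GEV fit to the negated minima (block minima treated as block maxima);
# in scipy's genextreme parameterization, c > 0 indicates the bounded Type III case
shape_lo, loc_lo, scale_lo = stats.genextreme.fit(-ann_min)

# Example use: 20-yr return level of maximum temperature under the Gumbel model
print(stats.gumbel_r.ppf(1 - 1 / 20, loc_hi, scale_hi), "low-extreme shape:", shape_lo)
```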

Lesley F. Tarleton and Richard W. Katz

Abstract

A reanalysis of the same Phoenix daily minimum and maximum temperature data examined by Balling et al. has been performed. As evidenced by substantial increasing trends in both the mean minimum and maximum temperatures, this area has experienced a marked heat island effect in recent decades. Balling et al. found that a statistical model for climate change in which simply a trend in the mean is permitted is inadequate to explain the observed trend in occurrence of extreme maximum temperatures. The present reanalysis establishes that allowing for the observed decrease in the standard deviation reduces the tendency to overestimate the frequency of extreme high-temperature events. Thus, the urban heat island provides a real-world application in which trends in variability need to be taken into account to anticipate changes in the frequency of extreme events.
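
A toy calculation (with made-up numbers, not the Phoenix estimates) showing why the variability matters: under a Gaussian model of daily maximum temperature, the probability of exceeding a fixed extreme threshold is computed with the mean trend alone and then with a simultaneous decrease in the standard deviation.

```python
from scipy import stats

# Hypothetical illustrative numbers only
threshold = 43.0              # "extreme" daily maximum temperature, deg C
mu0, sd0 = 38.0, 3.0          # earlier-period mean and standard deviation
mu1 = mu0 + 1.5               # warming trend in the mean

# Exceedance probability if only the mean shifts...
p_mean_only = stats.norm.sf(threshold, loc=mu1, scale=sd0)
# ...versus if the standard deviation also decreases, as observed in the reanalysis
p_mean_and_sd = stats.norm.sf(threshold, loc=mu1, scale=0.85 * sd0)
print(p_mean_only, p_mean_and_sd)   # mean-only model gives the larger frequency
```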

Richard W. Katz and Marc B. Parlange

Abstract

Simple stochastic models fit to time series of daily precipitation amount have a marked tendency to underestimate the observed (or interannual) variance of monthly (or seasonal) total precipitation. By considering extensions of one particular class of stochastic model known as a chain-dependent process, the extent to which this “overdispersion” phenomenon is attributable to an inadequate model for high-frequency variation of precipitation is examined. For daily precipitation amount in January at Chico, California, fitting more complex stochastic models greatly reduces the underestimation of the variance of monthly total precipitation. One source of overdispersion, the number of wet days, can be completely eliminated through the use of a higher-order Markov chain for daily precipitation occurrence. Nevertheless, some of the observed variance remains unexplained and could possibly be attributed to low-frequency variation (sometimes termed “potential predictability”). Of special interest is the fact that these more complex stochastic models still underestimate the monthly variance, more so than does an alternative approach, in which the simplest form of chain-dependent process is conditioned on an index of large-scale atmospheric circulation.
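
For orientation, a commonly used large-sample approximation for the simplest chain-dependent process gives the variance of the monthly total in closed form; comparing it with the observed interannual variance quantifies the overdispersion. The sketch below uses that approximation with hypothetical parameter values (not the Chico estimates), so the numbers are illustrative only.

```python
import numpy as np

def cdp_monthly_variance(n_days, p01, p11, mu, sigma):
    """Approximate variance of the monthly total for a first-order chain-dependent
    process: Markov occurrences with transition probs p01 (wet after dry) and p11
    (wet after wet), plus independent wet-day amounts with mean mu and std sigma.
    Large-n_days approximation ignoring edge effects."""
    pi = p01 / (1.0 + p01 - p11)      # stationary probability of a wet day
    d = p11 - p01                     # persistence (lag-1 autocorrelation of occurrences)
    return n_days * (pi * sigma**2 + mu**2 * pi * (1 - pi) * (1 + d) / (1 - d))

# Hypothetical January parameters (amounts in mm)
model_var = cdp_monthly_variance(31, p01=0.25, p11=0.60, mu=8.0, sigma=10.0)
observed_var = 2400.0                 # hypothetical interannual sample variance
print("model:", round(model_var), "observed:", observed_var,
      "overdispersion ratio:", round(observed_var / model_var, 2))
```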

Jana Sillmann, Mischa Croci-Maspoli, Malaak Kallache, and Richard W. Katz

Abstract

North Atlantic atmospheric blocking conditions explain part of the winter climate variability in Europe, being associated with anomalous cold winter temperatures. In this study, the generalized extreme value (GEV) distribution is fitted to monthly minima of European winter 6-hourly minimum temperatures from the ECHAM5/MPI-OM global climate model simulations and the ECMWF reanalysis product known as ERA-40, with an indicator of atmospheric blocking conditions used as a covariate. It is demonstrated that relating the location and scale parameters of the GEV distribution to atmospheric blocking improves the fit to extreme minimum temperatures in large areas of Europe. The climate model simulations agree reasonably with ERA-40 in the present climate (1961–2000). Under the influence of atmospheric blocking, a decrease in the 0.95 quantiles of extreme minimum temperatures can be distinguished. This cooling effect of atmospheric blocking is, however, diminished in future climate simulations because of a shift in blocking location, which reduces the chances of very cold winters in northeastern parts of Europe.
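
A minimal maximum-likelihood sketch of a GEV fit in which the location and log-scale depend linearly on a blocking indicator; the data, parameterization details, and optimizer are assumptions for illustration and not the study's actual code.

```python
import numpy as np
from scipy import stats, optimize

def fit_gev_with_covariate(minima, blocking):
    """Fit a GEV to block minima (negated so minima become maxima) with location and
    log-scale depending linearly on a 0/1 blocking indicator."""
    y = -np.asarray(minima, float)
    x = np.asarray(blocking, float)

    def nll(theta):
        mu0, mu1, ls0, ls1, shape = theta
        mu = mu0 + mu1 * x
        scale = np.exp(ls0 + ls1 * x)
        # scipy's genextreme uses c = -shape relative to the usual GEV shape xi
        return -np.sum(stats.genextreme.logpdf(y, c=-shape, loc=mu, scale=scale))

    res = optimize.minimize(nll, x0=[np.mean(y), 0.0, np.log(np.std(y)), 0.0, 0.0],
                            method="Nelder-Mead")
    return res.x   # (mu0, mu1, log-scale0, log-scale1, shape)

# Synthetic monthly minimum temperatures: colder extremes in blocked months
rng = np.random.default_rng(4)
blocking = rng.integers(0, 2, size=200)            # 1 = blocked month, 0 = unblocked
tmin = -(5 + 3 * blocking + 2.0 * rng.gumbel(size=200))
print(fit_gev_with_covariate(tmin, blocking))
```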

Mari R. Tye, David B. Stephenson, Greg J. Holland, and Richard W. Katz

Abstract

Reliable estimates of future changes in extreme weather phenomena, such as tropical cyclone maximum wind speeds, are critical for climate change impact assessments and the development of appropriate adaptation strategies. However, global and regional climate model outputs are often too coarse for direct use in these applications, with variables such as wind speed having truncated probability distributions compared to those of observations. This poses two problems: How can model-simulated variables best be adjusted to make them more realistic? And how can such adjustments be used to make more reliable predictions of future changes in their distribution?

This study investigates North Atlantic tropical cyclone maximum wind speeds from observations (1950–2010) and regional climate model simulations (1995–2005 and 2045–55 at 12- and 36-km spatial resolutions). The wind speed distributions in these datasets are well represented by the Weibull distribution, albeit with different scale and shape parameters.

A power-law transfer function is used to recalibrate the Weibull variables and obtain future projections of wind speeds. Two different strategies, bias correction and change factor, are tested by using 36-km model data to predict future 12-km model data (pseudo-observations). The strategies are also applied to the observations to obtain likely predictions of the future distributions of wind speeds. The strategies yield similar predictions of likely changes in the fraction of events within Saffir–Simpson categories—for example, an increase from 21% (1995–2005) to 27%–37% (2045–55) for category 3 or above events and an increase from 1.6% (1995–2005) to 2.8%–9.8% (2045–55) for category 5 events.
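
As a minimal sketch of the bias-correction idea under the stated Weibull assumption (synthetic data, not the study's datasets): quantile mapping between two Weibull distributions reduces to a power-law transfer function of the wind speed.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(5)

# Hypothetical maximum wind speeds (m/s); coarse-model values are weaker/truncated
obs = stats.weibull_min.rvs(c=2.0, scale=50.0, size=500, random_state=rng)
model = stats.weibull_min.rvs(c=2.8, scale=40.0, size=500, random_state=rng)

# Fit two-parameter Weibulls (location fixed at zero)
c_o, _, a_o = stats.weibull_min.fit(obs, floc=0)
c_m, _, a_m = stats.weibull_min.fit(model, floc=0)

def power_law_transfer(v, c_from, a_from, c_to, a_to):
    """Quantile mapping between two Weibull distributions: matching the CDFs
    1 - exp(-(v/a)**c) gives v_to = a_to * (v_from / a_from) ** (c_from / c_to)."""
    return a_to * (v / a_from) ** (c_from / c_to)

# Bias-correct the model winds toward the observed distribution
corrected = power_law_transfer(model, c_m, a_m, c_o, a_o)
print(np.percentile(obs, 95), np.percentile(model, 95), np.percentile(corrected, 95))
```

The same transfer function, fitted between a coarse and a fine simulation (bias correction) or between present and future simulations (change factor), can then be applied to observed wind speeds to project their future distribution, which is the comparison the abstract describes.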
