Search Results
You are looking at 1–10 of 28 items for Author or Editor: H. M. van den Dool
Abstract
A bias in skill may exist in statistical forecast methods in which the verification datum is withheld from the developmental data (cross-validation methods). Under certain circumstances this bias in skill can become troublesome. By way of example, it is shown that the judgment of the quality of forecasts based on analogues and anti-analogues may severely suffer from a bias in skill. A cure to the problem is discussed. Some implications for published results of long-range weather forecasting models based on analogues are discussed.
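To make the cross-validation setting concrete, here is a minimal sketch of a leave-one-out evaluation of analogue and anti-analogue forecasts of the kind the abstract refers to: for each target case the verification datum is withheld, climatologies are recomputed from the remaining cases, the best analogue and anti-analogue (mirror image) are selected, and anomaly-correlation skill is accumulated. The data are pure noise and all array names are hypothetical, so the sketch only illustrates the bookkeeping of the procedure, not the bias itself.

    import numpy as np

    rng = np.random.default_rng(0)
    n_cases, n_points = 200, 50                         # synthetic "years" and gridpoints
    state = rng.standard_normal((n_cases, n_points))    # initial fields (pure noise)
    verif = rng.standard_normal((n_cases, n_points))    # fields valid at forecast time (pure noise)

    def acc(a, b):                                      # anomaly correlation of two anomaly fields
        return np.corrcoef(a, b)[0, 1]

    skill_analog, skill_antilog = [], []
    for i in range(n_cases):
        pool = np.array([j for j in range(n_cases) if j != i])   # verification case withheld
        clim_0 = state[pool].mean(axis=0)               # initial-time climatology without case i
        clim_v = verif[pool].mean(axis=0)               # valid-time climatology without case i
        a0 = state - clim_0                             # initial-state anomalies
        j_an  = pool[np.argmin(np.sum((a0[pool] - a0[i]) ** 2, axis=1))]  # best analogue
        j_ant = pool[np.argmin(np.sum((a0[pool] + a0[i]) ** 2, axis=1))]  # best anti-analogue
        skill_analog.append(acc(verif[j_an] - clim_v, verif[i] - clim_v))
        skill_antilog.append(acc(-(verif[j_ant] - clim_v), verif[i] - clim_v))

    # Average anomaly-correlation skill of the analogue and anti-analogue forecasts.
    print(np.mean(skill_analog), np.mean(skill_antilog))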
Abstract
An empirical study based on three years (1981–83) of monthly mean data revealed that colocated anomalies in precipitation (P̂) and vertical motion at 500 mb (ω̂) are moderately well correlated over the United States in winter. The P̂ data are spatial averages over 344 Climate Divisions, while the ω̂ are derived from initialized fields of the ECMWF and NMC NWP models. To a first-order approximation, the deficit of rain associated with anomalous downward motion is just as large as the surplus of rain associated with anomalous upward motion. Therefore, it should be possible to generate some of the latent heat of condensation in a linear model for the time-mean atmosphere by expressing P̂ linearly in ω̂. Monthly mean ω̂ of the ECMWF and NMC are highly correlated with each other and relate about equally well to rainfall over the United States. The empirical constant a in the relation P̂ = aω̂ turns out to be about −0.6 mm day⁻¹ per 10⁻² N m⁻² s⁻¹, which is of the same order of magnitude as the theoretical amount of precipitation produced by diabatic ascent of magnitude 10⁻² N m⁻² s⁻¹. Attempts to empirically extract the role of atmospheric moisture in the relation between P̂ and ω̂ were made by comparing summer to winter, high latitudes to lower latitudes, and the United States to India, but the results are at best modest.
Implementation of parameterized latent heat sources and sinks in a linear steady-state anomaly model for the time-mean atmosphere is equivalent to reducing its static stability by a sizable amount. This leads to an increased response to a prescribed forcing.
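As a rough illustration of how a constant like a in P̂ = aω̂ can be estimated, the sketch below regresses synthetic monthly precipitation anomalies on colocated 500-mb ω anomalies; the data, the noise level, and the slope used to generate them are placeholders, not the paper's values.

    import numpy as np

    rng = np.random.default_rng(1)
    # Hypothetical monthly anomalies at a set of Climate-Division-like points:
    # omega anomaly in units of 10^-2 N m^-2 s^-1, precipitation anomaly in mm/day.
    omega_anom = rng.standard_normal(1000)                               # ω̂
    precip_anom = -0.6 * omega_anom + 0.8 * rng.standard_normal(1000)    # P̂ with noise

    # Least-squares slope through the origin, P̂ = a ω̂, and the associated correlation.
    a = np.sum(omega_anom * precip_anom) / np.sum(omega_anom ** 2)
    r = np.corrcoef(omega_anom, precip_anom)[0, 1]
    print(f"a = {a:.2f} mm/day per 10^-2 N m^-2 s^-1, correlation = {r:.2f}")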
Abstract
The influence of cloud amount on the earth's climate is studied with an energy balance climate model. Planetary albedo and infrared radiation are parameterized in terms of cloud amount and surface temperature. For the present climate a prescribed change in cloud amount (independent of latitude) leads to a negligible change in the global mean temperature (∂T̄/∂A_c ≈ 0). For global temperatures lower than present, ∂T̄/∂A_c rapidly becomes positive; higher temperatures lead to negative values of ∂T̄/∂A_c. The sensitivity of the global mean temperature to a 1% change in the solar constant (∂T̄/∂S) is ∼1.5 K. With reduced cloud amount ∂T̄/∂S becomes larger because the snow-ice feedback is active in the larger cloud-free portion; with increased cloud amount ∂T̄/∂S becomes smaller. Because of the strong absorption of solar radiation by clouds, deep-freeze solutions are possible only for very low values of the solar constant. The response of the model to changes in cloud amount or incoming radiation should be studied as a function of latitude. Two expressions that quantify the sensitivity to changes in the solar constant and cloud amount as a function of latitude are defined. If cloud amount is assumed to increase with temperature in a certain latitude belt and to decrease with temperature elsewhere, ∂T/∂S (as a function of latitude) and ∂T̄/∂S can change considerably.
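The paper's sensitivities come from a latitude-dependent energy balance model; the following zero-dimensional caricature (all parameter values are illustrative and not taken from the paper) only shows how a sensitivity such as ∂T̄/∂S for a 1% change in the solar constant is diagnosed from a balance of the form (S/4)(1 − albedo) = A + BT with a crude ice-albedo switch and a fixed cloud fraction.

    import numpy as np

    def equilibrium_T(S, cloud_amount=0.5):
        """Zero-dimensional energy balance: (S/4)*(1 - albedo) = A + B*T.
        Illustrative parameter values only."""
        A, B = 205.0, 2.1                      # OLR = A + B*T (W m-2, T in deg C)
        alpha_cloud = 0.5                      # albedo of the cloudy fraction
        T = 10.0
        for _ in range(200):                   # simple fixed-point iteration
            alpha_clear = 0.6 if T < -10.0 else 0.3   # crude snow/ice-albedo switch
            albedo = cloud_amount * alpha_cloud + (1 - cloud_amount) * alpha_clear
            T = ((S / 4.0) * (1 - albedo) - A) / B
        return T

    S0 = 1365.0                                # solar constant (W m-2)
    warming = equilibrium_T(1.01 * S0) - equilibrium_T(S0)   # response to a 1% increase
    print(f"Global-mean warming for a 1% solar-constant increase: {warming:.2f} K")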
Abstract
We address whether there are pairs of instantaneous 500-mb flow patterns that are, relative to the climatology, as much as possible each other's opposite (which we term “mirror images” or “antilogs”), and we investigate whether such flows are followed by opposite 12-h time tendencies.
Over eastern North America for almost all wintertime flow patterns in a 15-yr dataset it is almost as easy to find an antilog as an analog. Exceptions are very deep lows for which no mirror-imaged highs exist. In addition, antilogs make for 12-h height forecasts at a skill level almost as good as those based on analogs.
Therefore the multivariate height distribution (i.e., flow patterns) is almost symmetric, and mirror-imaging an observed flow is likely to yield a physically plausible pattern, although not necessarily one observed so far. Note that time tendencies tend to be opposite for opposite initial conditions over short periods of time, even though the perturbations (full anomalies) are not small. An explanation of the latter is sought by running a global barotropic model from both regular and mirror-imaged initial conditions. Out to 12 h the tendency of the midlatitude streamfunction is primarily determined by the linear part of the absolute vorticity advection. However, on small scales (i.e., the vorticity field) forecasts deteriorate after about 6 h when the nonlinear term is either omitted (linear run) or represented wrongly (in the mirror-imaged run).
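To spell out the linearity argument in the last two sentences (a standard decomposition of the barotropic vorticity equation, not notation taken from the paper): writing ψ = ψ̄ + ψ′ for the climatological and anomalous streamfunction, the anomaly vorticity tendency is approximately

    ∂∇²ψ′/∂t ≈ −J(ψ̄, ∇²ψ′) − J(ψ′, ∇²ψ̄ + f) − J(ψ′, ∇²ψ′).

Replacing ψ′ by −ψ′ (the mirror-imaged initial condition) reverses the sign of the two linear advection terms but leaves the quadratic term J(ψ′, ∇²ψ′) unchanged; the mirror-imaged run therefore produces the opposite tendency only to the extent that the nonlinear term is negligible, consistent with the small-scale forecasts deteriorating after about 6 h.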
Abstract
In the literature, the use of analogues for short-range weather forecasting has practically been discarded. This is because no good matches for today's extratropical large-scale flow patterns can be found in a 30-year data library. We propose here a limited-area approach to analogue forecasting (AF). In order to make a 12-hour AF valid at a target point, we require analogy in initial states only over a circle with a radius of about 900 km. On a limited area there are usually several good analogues, sometimes to within observational error. Different historical analogues may be used at different target points.
The usefulness of the limited-area approach is first demonstrated with some examples. We then present verification statistics of 3000 12-hour 500-mb height point forecasts in the Northern Hemisphere winter at 38°N, 80°W (over West Virginia, U.S.A.). In order to beat persistence at 12 hours at this point we need an analogue which differs by about 40 geopotential meters or less from the base case. This requirement is met almost all of the time using a 15-year dataset for analogue searching. We find a few percent of the analogue pairs to be within observational error. In the mean over the 3000 cases, the initial discrepancy is 33 gpm. When averaging over the first five analogues, 12-hour AF over the eastern United States can be characterized by a 52 gpm rms error and 0.95 (0.77) anomaly (tendency) correlation. The forecasts have the correct amplitude, i.e., no damping, in spite of the averaging over five individual forecasts. We then show an example of a 500-mb height forecast map on a (roughly) 2000 × 2000 km area over the eastern part of North America. Although different analogues were used to arrive at the 12-hour forecast at each of the 25 gridpoints, the resulting map looks meteorological and the forecast is moderately successful. A verification of a large set of 12-hour forecast maps shows that the height gradients are indeed forecast with some skill. We then proceed to make 24-hour point forecasts by finding historical limited-area matches to the 12-hour forecast maps. This second time step indicates that the AF-process holds up, with forecast accuracy increasing its margin over persistence.
Two applications are discussed. Comparing initial and 12-hour forecast error tells us something about “error growth” and predictability at that spot according to a perfect model. Given that there are usually several good analogues, Monte Carlo experiments and probabilistic forecasts naturally suggest themselves. In particular, we find the spread of analogue forecasts to be an excellent predictor of forecast skill.
Refinements, applications, and extensions to longer-range forecasts are discussed in the final section.
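To make the limited-area analogue procedure concrete, here is a minimal sketch with synthetic data and hypothetical dimensions: rank historical cases by rms mismatch over the gridpoints within roughly 900 km of the target, average the 12-h-later heights of the five best analogues for the point forecast, and keep their spread as a candidate predictor of forecast skill.

    import numpy as np

    rng = np.random.default_rng(2)
    n_hist, n_grid = 5000, 25                 # historical cases; gridpoints within ~900 km of the target
    hist_now  = rng.standard_normal((n_hist, n_grid)) * 80 + 5500   # 500-mb heights (gpm), synthetic
    hist_12h  = hist_now + rng.standard_normal((n_hist, n_grid)) * 40
    base_case = rng.standard_normal(n_grid) * 80 + 5500             # today's limited-area pattern
    target_idx = 12                           # gridpoint at which the point forecast is wanted

    rms = np.sqrt(np.mean((hist_now - base_case) ** 2, axis=1))     # initial-state mismatch (gpm)
    best5 = np.argsort(rms)[:5]                                     # five closest limited-area analogues

    forecast = hist_12h[best5, target_idx].mean()                   # averaged analogue point forecast
    spread   = hist_12h[best5, target_idx].std()                    # spread among the five forecasts
    print(f"12-h point forecast: {forecast:.0f} gpm, analogue spread: {spread:.0f} gpm, "
          f"mean initial mismatch: {rms[best5].mean():.0f} gpm")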
Abstract
A study of long records of monthly mean air temperature (MMAT) for many stations in the Netherlands indicates that the atmosphere's response to surface boundary forcing is often of a very simple local nature. In the Dutch area, the atmosphere seems to respond to a sea surface temperature (SST) anomaly in the North Sea with an air temperature anomaly of the same sign. Because of the abrupt change in lower boundary forcing near the coastline, very small spatial scales are introduced in air temperature anomalies at long time scales. Over the sea MMAT anomalies have much larger time scales than over the land; a similar increase in time scale can be found in the delay of the climatologically normal temperature with respect to the solar forcing. When extended to the United States, the study showed very similar results; that is, monthly mean surface air temperature anomalies live longest in areas where the air temperature response is slowest to the annual cycle in incoming radiation. Apart from boundary forcing by the oceans, the Gulf of Mexico and the Great Lakes, there is some evidence of forcing by snow cover in the Northeast.
Since surface boundary forcing by SST anomalies can be quite persistent, MMAT anomalies are more predictable over the sea and in the coastal zone than in the interior of big land masses. This explains why Madden and Shea (1978) found that the potentially predictable part of the interannual variance in MMAT is largest in predominantly coastal areas, California, in particular. A sizable fraction of the potential predictability in these areas can be effected by such simple tools as linear regression onto antecedent MMAT.
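As a concrete, purely illustrative version of the last sentence, the sketch below fits a damped-persistence forecast of next month's MMAT anomaly by linear regression onto the antecedent month's anomaly; the station series is a synthetic placeholder, not observed data.

    import numpy as np

    rng = np.random.default_rng(3)
    # Hypothetical monthly mean air temperature (MMAT) anomalies at one station, deg C.
    mmat = rng.standard_normal(480)            # 40 years of months, placeholder data
    prev, curr = mmat[:-1], mmat[1:]

    # Damped-persistence forecast of next month's anomaly from this month's anomaly.
    slope = np.sum(prev * curr) / np.sum(prev ** 2)
    forecast = slope * prev
    skill = np.corrcoef(forecast, curr)[0, 1]
    print(f"lag-1 regression coefficient = {slope:.2f}, forecast correlation = {skill:.2f}")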
Abstract
The level of month-to-month persistence of anomalies in the monthly mean atmospheric circulation was determined from a 29-year data set of Northern Hemisphere analyses of 500 mb height, surface pressure and 500–1000 mb thickness. A well-defined annual march is found, with greatest persistence from January to February and from July to August. The minima occur in spring and fall. Expressed as a linear correlation coefficient, the largest persistence amounts to no more than 0.3.
A qualitative explanation for the double peak in the annual march was sought in linear theory. The response of a stationary linear atmospheric model to the forcing of an anomalous heat source depends on the properties of the basic state around which the model is linearized. Model runs were made with climatological mean basic states corresponding to all 12 calendar months. In all runs the forcing was kept the same. As climatology changes least from January to February and from July to August, the model response to the constant forcing then is almost 100% persistent. The persistence is low from April to May and from October to November because in these months the basic state changes rather drastically.
Although the maxima in persistence on the monthly time scale in the observed circulation are indeed found in summer and winter, the level of persistence is far below 100%. This can be interpreted as observational evidence of the very large forcing of the time-mean atmosphere by high-frequency transient eddies. The forcing associated with long-lived anomalies in external factors (oceans, snow, etc.) seems to control only a small part of the observed anomalies in the atmospheric circulation.
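For reference, month-to-month persistence of the kind quoted above is simply the lag correlation between anomalies in successive calendar months; a minimal sketch, with placeholder data standing in for the 29-year analyses, is:

    import numpy as np

    rng = np.random.default_rng(4)
    years, months = 29, 12
    # Hypothetical monthly mean 500-mb height anomalies at one gridpoint (placeholder data).
    anom = rng.standard_normal((years, months))

    # Month-to-month persistence: correlate each calendar month with the next one.
    for m in range(months - 1):
        r = np.corrcoef(anom[:, m], anom[:, m + 1])[0, 1]
        print(f"month {m+1:2d} -> {m+2:2d}: r = {r:+.2f}")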
Abstract
This paper presents a study on simple and inexpensive techniques for extension of NMC's Medium Range Forecast (MRF) model. Three control forecasts are tested to make 1-day extensions of 500-mb height fields initiated from the MRF at days 0–9. They are persistence (PER), a divergent anomaly vorticity advection model (dAVA), and the empirical wave propagation (EWP) method.
First the traditional 1–10-day global forecasts made by the MRF and the three controls from a common set of 361 initial conditions are discussed. Taking this as a basis, 1-day extension control forecasts starting from MRF predictions over four successive winters are examined next. Experiments show that regardless of the presence or absence of the systematic error in the MRF model output, there exists some point (T0 = n) into the forecast after which the 1-day extension of the day n MRF out to day n + 1 by a control forecast is as good as or better than the continued integration of the full-blown MRF model. In particular, the EWP provides a 1-day extension that beats the MRF most consistently after about 6 days in the Northern Hemisphere. Decomposition of the forecasts in terms of zonal harmonics further indicates that the skill improvement over the MRF is primarily in the long waves, but contributions from shorter waves are not negligible.
Efforts have been made to understand the mechanisms by which simple methods are superior to complicated models for low-frequency prediction at extended range. It seems that at least two simplifications made in one or all of the control forecasts are crucial in outperforming the MRF beyond day 6. The first one is well known, that is, the contaminating effects of synoptic-scale baroclinic eddies have been filtered out in the simple models considered. More generally, the nonlinear terms (whether barotropic or baroclinic) contribute to skill deterioration beyond day 6. The second reason is the explicit elimination of the divergence process in the control forecasts, as the MRF model may contain significant errors in forecasting the divergence.
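The EWP control can be sketched as follows. This toy version advances each zonal harmonic of a latitude-circle height field eastward by a prescribed climatological phase speed over 24 h, leaving amplitudes unchanged; the grid spacing, the uniform 3° day⁻¹ speed, and the wave-5 test pattern are placeholders, not the operational settings.

    import numpy as np

    def ewp_extension(height, phase_speed_deg_per_day, dt_days=1.0):
        """Advance each zonal harmonic of a latitude-circle field eastward by its
        (climatological) phase speed in degrees of longitude per day; amplitudes unchanged."""
        n = height.size
        spec = np.fft.rfft(height)
        k = np.arange(spec.size)                          # zonal wavenumbers 0, 1, ..., n/2
        shift = k * np.deg2rad(phase_speed_deg_per_day) * dt_days
        spec = spec * np.exp(-1j * shift)                 # e^{-ik*dlon} moves the wave eastward
        return np.fft.irfft(spec, n)

    # Toy example: one latitude circle, 144 points (2.5 deg spacing), a wave-5 pattern.
    lon = np.deg2rad(np.arange(0, 360, 2.5))
    z = 5500 + 120 * np.cos(5 * lon)
    speeds = np.full(73, 3.0)                             # hypothetical 3 deg/day eastward for all waves
    z_day1 = ewp_extension(z, speeds, dt_days=1.0)
    # Sanity check: the wave-5 pattern should simply have moved 3 deg of longitude eastward.
    assert np.allclose(z_day1, 5500 + 120 * np.cos(5 * (lon - np.deg2rad(3.0))))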
Abstract
A method for time interpolation, based on the empirically determined climatological speed of large-scale atmospheric waves, is proposed. When tested on a 7-yr dataset this method is found to be easy to use and accurate; in fact, it is considerably more accurate than the much-used linear time interpolation. The gain in accuracy is particularly large for mobile synoptic waves. Several applications of a time-continuous description of the atmosphere are discussed.
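A minimal sketch of the idea, with a single synthetic wave and a made-up climatological travel of 10° of longitude per 12 h: propagate the earlier analysis forward and the later one backward to the target time via their Fourier phases, average the two, and compare with plain linear interpolation. For a mobile wave, the wave-based estimate recovers the intermediate field while the linear average damps it.

    import numpy as np

    def shift_east(field, dlon_rad):
        """Shift a periodic latitude-circle field eastward by dlon_rad via its Fourier phases."""
        spec = np.fft.rfft(field)
        k = np.arange(spec.size)
        return np.fft.irfft(spec * np.exp(-1j * k * dlon_rad), field.size)

    lon = np.deg2rad(np.arange(0, 360, 2.5))
    travel = np.deg2rad(10.0)                  # hypothetical climatological travel of the waves in 12 h
    z0  = np.cos(5 * lon)                      # analysis at t = 0 h
    z12 = np.cos(5 * (lon - travel))           # analysis at t = 12 h
    truth = np.cos(5 * (lon - travel / 2))     # the field halfway in between

    linear = 0.5 * (z0 + z12)                                            # linear time interpolation
    wavewise = 0.5 * (shift_east(z0, travel / 2) + shift_east(z12, -travel / 2))

    for name, est in [("linear", linear), ("wave-based", wavewise)]:
        print(name, "rms error:", np.sqrt(np.mean((est - truth) ** 2)))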
Abstract
An attempt is made to estimate the thermal inertia of the upper ocean, relevant to climatic change. This is done by assuming that the annual variation in sea surface temperature (SST) can, to a first-order approximation, be described by a simple energy-balance equation. From the observed climatological annual variation in SST and in absorbed solar radiation we can then estimate a typical value of the heat capacity (C) of the active layer of the ocean. We can also estimate how fast the SST is damped towards an equilibrium value (damping coefficient b). Within the same theoretical framework the decay time of SST anomalies allows us to estimate the seasonality of C/b.
The method is first tested on SST at six ocean weather ships and two coastal stations. The calculated depth of the active layer looks reasonable, though somewhat small, and it is encouraging that the seasonality in C/b, derived from daily SST data at one station, is similar to the observed seasonality in mixed layer depth. One of the problems seems to be that we need rather precise observations of the solar radiation reaching the earth's surface. At many places such knowledge is not available. The spatial distribution of calculated active layer depth over the North Pacific is very similar to that of observed annual mean mixed layer depth, but the mixed layer seems to be twice as deep as the active layer. Also the effective mixed layer defined and used by Manabe and Stouffer is substantially deeper than our calculated active layer.
The results are discussed in the context of both the surface energy balance and the vertically averaged energy balance. One of the interesting findings of this study is that the layer of the ocean involved in the annual cycle should be taken as 20–50 m rather than the more customary 60–80 m. Another conclusion is that the SST seems to damp (towards equilibrium) at least five times faster than the vertically integrated energy content of the climate system as a whole (including the ocean!).
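Under an energy balance of the form C dT/dt = Q − b(T − T_eq), an annual forcing harmonic Q₁cos(ωt) produces an SST response T₁cos(ωt − φ) with T₁ = Q₁/√(b² + ω²C²) and tanφ = ωC/b, so C and b follow from the observed amplitude ratio and phase lag: b = (Q₁/T₁)cosφ and ωC = (Q₁/T₁)sinφ. The sketch below inverts those two relations; the amplitudes and the 40-day lag are placeholders, not values from the paper.

    import numpy as np

    # Annual harmonic of absorbed solar radiation and of SST at a hypothetical site.
    Q1 = 100.0                      # forcing amplitude (W m-2), placeholder
    T1 = 5.0                        # SST amplitude (K), placeholder
    phase_lag_days = 40.0           # SST lags the forcing by this much, placeholder

    omega = 2 * np.pi / (365.25 * 86400.0)          # annual frequency (s-1)
    phi = omega * phase_lag_days * 86400.0          # phase lag in radians

    # b = (Q1/T1) cos(phi),  omega*C = (Q1/T1) sin(phi)
    b = (Q1 / T1) * np.cos(phi)                     # damping coefficient (W m-2 K-1)
    C = (Q1 / T1) * np.sin(phi) / omega             # heat capacity (J m-2 K-1)

    rho, cp = 1025.0, 3990.0                        # seawater density and specific heat
    depth = C / (rho * cp)                          # equivalent active-layer depth (m)
    decay_time_days = C / b / 86400.0               # e-folding time of SST anomalies
    print(f"b = {b:.1f} W m-2 K-1, active-layer depth = {depth:.0f} m, "
          f"decay time = {decay_time_days:.0f} days")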