Search Results
Abstract
A simple linear two-level model of the atmosphere is developed that gives a reasonable representation of the external mode and the tropically important baroclinic mode. By blocking the lower layer of the model with a meridional wall, the interaction between diabatic forcing centered in the Amazon basin and the Andes is studied. This forcing can be considered a source of Rossby waves that scatter from this partial barrier. The scattering process is examined analytically by making the long-wave approximation, with the conclusion that transmission of external Rossby waves, reflection of baroclinic Kelvin waves, and the creation of topographic jets are likely to be important.
Numerical solutions without the long-wave approximation are then considered and the effects of the above scattering process are examined. The upper-level circulation is shown to be qualitatively similar to that obtained without a barrier. The low-level circulation west of the barrier has weak winds and consists of a positive geopotential response centered at approximately 30°S. To the east, the circulation near the barrier shows similarity to models of the Somali jet. The model produces quite strong trade winds and places a low-pressure center approximately where one is actually observed.
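For orientation, the equatorial β-plane shallow-water dispersion relation that underlies these wave arguments is the standard textbook result below (not taken from the paper itself); the long-wave approximation corresponds to dropping the ω²/c² term, which filters the gravity waves and leaves the Kelvin and low-frequency Rossby branches:

\[
\frac{\omega^2}{c^2} - k^2 - \frac{\beta k}{\omega} = (2m+1)\frac{\beta}{c}, \qquad m = 0, 1, 2, \ldots
\]
\[
\text{Kelvin:}\quad \omega = ck, \qquad
\text{long-wave Rossby:}\quad \omega \approx -\frac{\beta k}{k^2 + (2m+1)\beta/c} \;\xrightarrow{\;k \to 0\;}\; -\frac{ck}{2m+1},
\]

where c is the gravity-wave speed of the relevant (external or baroclinic) vertical mode.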
Abstract
The nature of statistical predictability is analyzed in a T42 global atmospheric model that is able to adequately capture the main features of the midlatitude atmosphere. Key novel features of the present study include very large prediction ensembles and information theoretic techniques. It is found globally that predictability declines in a quasi-linear fashion with time for short-term predictions (3–25 days), while for long ranges (30–45 days) there is an exponential tail. In general, beyond 45 days the prediction and climatological ensembles have essentially converged, which means that beyond that point, atmospheric initial conditions are irrelevant to atmospheric statistical prediction.
Regional predictions show considerable variation in behavior. Both of the (northern) winter storm-track regions show a close-to-quasi-linear decline in predictability toward a cutoff at around 40 days. The (southern) summer storm track shows a much more exponential and considerably slower decline, with a small amount of predictability still in evidence even at 90 days. Because the winter storm tracks dominate global variance, the behavior of their predictability tends to dominate the global measure, except at longer lags. Variability in predictability with respect to initial conditions is also examined, and it is found to be related more strongly to the ensemble signal than to the ensemble spread. This result may serve to explain why the relation between weather forecast skill and ensemble spread is often observed to be significantly less than perfect. Results herein suggest that variations in the ensemble signal, as well as in the spread, may be a major contributor to skill variations. Finally, the sensitivity of the calculated global predictability to changes in model horizontal resolution is found not to be large; results from a T85-resolution model are not qualitatively very different from the T42 case.
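A minimal sketch (not the paper's code) of how the signal-versus-spread point could be checked under a univariate Gaussian approximation: for each set of initial conditions, a relative-entropy utility is computed from the prediction ensemble and the climatology and then correlated with the ensemble signal and with the ensemble spread. The ensemble arrays below are synthetic placeholders.

```python
import numpy as np

def gaussian_relative_entropy(mu_p, var_p, mu_q, var_q):
    """Relative entropy D(p||q) of two univariate Gaussians: signal + dispersion."""
    signal = (mu_p - mu_q) ** 2 / (2.0 * var_q)
    dispersion = 0.5 * (np.log(var_q / var_p) + var_p / var_q - 1.0)
    return signal + dispersion

rng = np.random.default_rng(0)

# Synthetic stand-ins: a climatological sample and 200 prediction ensembles of 100 members.
climatology = rng.normal(0.0, 1.0, size=100_000)
mu_q, var_q = climatology.mean(), climatology.var()

n_cases, n_members = 200, 100
predictions = rng.normal(rng.normal(0.0, 0.8, (n_cases, 1)),    # case-dependent signal
                         rng.uniform(0.3, 0.9, (n_cases, 1)),   # case-dependent spread
                         size=(n_cases, n_members))

signal = np.abs(predictions.mean(axis=1) - mu_q)   # ensemble signal
spread = predictions.std(axis=1)                   # ensemble spread
utility = np.array([gaussian_relative_entropy(m, s ** 2, mu_q, var_q)
                    for m, s in zip(predictions.mean(axis=1), spread)])

print("corr(utility, signal):", np.corrcoef(utility, signal)[0, 1])
print("corr(utility, spread):", np.corrcoef(utility, spread)[0, 1])
```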
Abstract
In a weather prediction, information flows from the initial conditions to a later prediction. The uncertainty in the initial conditions implies that such a flow should be quantified with tools from probability theory. Using several recent developments in information theory, this flow is explored using a moderate-resolution primitive equation atmospheric model with simplified physics. Consistent with operational experience and other methodologies explored in the literature, such as singular vectors, it is found that the midlatitude flow of information is mainly in an easterly direction. At upper levels, the flow is primarily steered by advection by the jet stream; however, at low levels there is clear evidence that synoptic dynamics are important, and this makes the direction of flow more complex. Horizontal rather than vertical flow is generally found to be more important, although there was evidence for propagation of zonal velocity information from the mid- to the upper troposphere.
As expected, as the length of the prediction increases, more remote areas become important to local predictions. To obtain reliable, stable results, rather large ensembles are used; however, it is found that the basic qualitative results can be obtained with ensembles within present practical reach. The present method has the advantage that it makes no assumptions concerning linearity or ensemble Gaussianity.
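A minimal sketch, under a bivariate Gaussian assumption, of the kind of information-flow diagnostic alluded to above: the mutual information between an initial-condition variable at one location and a later forecast variable at another, estimated from ensemble samples. The variable names and synthetic data are illustrative only.

```python
import numpy as np

def gaussian_mutual_information(x, y):
    """Mutual information (nats) between two scalar ensembles,
    assuming joint Gaussianity: I = -0.5 * ln(1 - rho**2)."""
    rho = np.corrcoef(x, y)[0, 1]
    return -0.5 * np.log(1.0 - rho ** 2)

rng = np.random.default_rng(1)
n_members = 10_000

# Synthetic stand-ins for an upstream initial-condition variable and a
# downstream forecast variable that is partially determined by it.
u_initial = rng.normal(size=n_members)
u_forecast = 0.6 * u_initial + 0.8 * rng.normal(size=n_members)

print("information flow estimate (nats):",
      gaussian_mutual_information(u_initial, u_forecast))
```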
Abstract
A new parameter of dynamical system predictability is introduced that measures the potential utility of predictions. It is shown that this parameter satisfies a generalized second law of thermodynamics in that for Markov processes utility declines monotonically to zero at very long forecast times. Expressions for the new parameter in the case of Gaussian prediction ensembles are derived and a useful decomposition of utility into dispersion (roughly equivalent to ensemble spread) and signal components is introduced. Earlier measures of predictability have usually considered only the dispersion component of utility. A variety of simple dynamical systems with relevance to climate and weather prediction is introduced, and the behavior of their potential utility is analyzed in detail. For the climate systems examined here, the signal component is at least as important as the dispersion in determining the utility of a particular set of initial conditions. The simple “weather” system examined (the Lorenz system) exhibited different behavior with the dispersion being more important than the signal at short prediction lags. For longer lags there appeared no relation between utility and either signal or dispersion. On the other hand, there was a very strong relation at all lags between utility and the location of the initial conditions on the attractor.
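In the univariate Gaussian case, the utility (the relative entropy of the prediction density p with respect to the climatological density q) and its signal and dispersion components take the standard form below; this is the usual textbook expression, shown here only as a sketch of the decomposition the abstract refers to:

\[
R = \underbrace{\frac{(\mu_p - \mu_q)^2}{2\sigma_q^2}}_{\text{signal}}
  \;+\; \underbrace{\frac{1}{2}\!\left(\ln\frac{\sigma_q^2}{\sigma_p^2}
  + \frac{\sigma_p^2}{\sigma_q^2} - 1\right)}_{\text{dispersion}} .
\]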
Abstract
Simple linear models with additive stochastic forcing have been rather successful in explaining the observed spectrum of important climate variables. Motivated by this, the authors analyze the spectral character of such a general stochastic system of finite dimension. The spectral matrix is derived for the case in which the quantities of interest are linear combinations of the dynamical variables and their stochastic forcings. It is found that the most convenient basis for analysis is provided by the normal modes. In general the spectrum consists of two pieces. The first, "diagonal" piece is a symmetric Lorentzian curve centered on the normal-mode frequencies, with breadth and strength determined by the normal-mode dissipation. The second, cross-spectrum piece usually derives from the coherency of the stochastic forcing of two different normal modes. The cross-spectrum is smaller in magnitude than the corresponding two diagonal pieces. This relative magnitude is controlled by the Wiener coherency, which is equal to the magnitude of the correlation of the stochastic forcings of different normal modes. This new analysis framework is studied in detail for the ENSO case, for which a two-dimensional stochastically forced oscillator has previously been suggested as a minimal model. It is found that the observed spectrum is rather easily reproduced given appropriate dissipation. Further, it is found that the cross-spectrum results in a phase-dependent enhancement or suppression of frequencies smaller than the dominant ENSO frequency. This therefore provides a new mechanism for decadal ENSO variability. Since the cross-spectrum is phase dependent, the decadal variability generated has a distinctive spatial character. The significance of the cross-spectrum depends on the Wiener coherency, which in turn depends on the statistics of the stochastic forcing.
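A minimal sketch (with assumed, illustrative parameters, not those of the paper) of the standard cross-spectral matrix for a stochastically forced linear system dx/dt = A x + ξ with white-noise covariance Q, namely S(ω) = (iωI − A)⁻¹ Q (iωI − A)⁻ᴴ. For a damped two-dimensional oscillator this reproduces the Lorentzian "diagonal" peak near the oscillation frequency, and an off-diagonal cross-spectrum appears whenever the stochastic forcings of the two components are correlated.

```python
import numpy as np

# Damped oscillator, period ~4 yr, decay time ~8 yr (illustrative ENSO-like values).
omega0 = 2 * np.pi / 4.0       # rad / yr
lam = 1.0 / 8.0                # 1 / yr
A = np.array([[-lam, -omega0],
              [omega0, -lam]])

# White-noise covariance; the off-diagonal entry sets the forcing correlation.
Q = np.array([[1.0, 0.4],
              [0.4, 1.0]])

def spectral_matrix(omega):
    """Cross-spectral matrix S(omega) = (i w I - A)^-1 Q (i w I - A)^-H."""
    G = np.linalg.inv(1j * omega * np.eye(2) - A)
    return G @ Q @ G.conj().T

freqs = np.linspace(0.01, 4.0, 400)            # rad / yr
S = np.array([spectral_matrix(w) for w in freqs])

peak = freqs[np.argmax(S[:, 0, 0].real)]
print("diagonal spectrum peaks near omega0 =", round(peak, 2), "rad/yr")
print("max |cross-spectrum| relative to diagonal peak:",
      round(np.max(np.abs(S[:, 0, 1])) / np.max(S[:, 0, 0].real), 2))
```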
Abstract
Space–time spectral analysis has been used frequently in studying observational evidence of convectively coupled equatorial waves. Here 23 yr of brightness temperature (Tb) data and dynamical reanalysis data are analyzed by an appropriate projection onto the meridional basis functions of the β-plane linear shallow-water equations. Evidence of peaks in power along linear equatorial mode dispersion curves in the Tb, zonal and meridional wind, divergence, and geopotential spectra is presented.
Another feature of all space–time spectra considered is the redness in frequency, zonal wavenumber, and meridional mode number. It is found that spectral peaks in the dynamical variable spectra are largely consistent with linear shallow-water waves, but peaks related to barotropic waves and extratropical storm track activity are also apparent. The convectively coupled wave signals are seen to be confined to the first few meridional basis functions, suggesting a filtering method to reduce noise for these signals. This result also has implications for future modeling and theoretical work. A comparison of the results herein for two different reanalysis products shows only minor differences, adding confidence in the robustness of the results presented. This work implies that any comprehensive theory of tropical convection should explain the ubiquity of moist linear waves as well as the spectral redness with respect to all temporal and horizontal scales.
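A minimal sketch of the two analysis steps described above, with synthetic data standing in for Tb or reanalysis fields: (i) projection of a latitude-dependent field onto the parabolic-cylinder (Hermite) meridional basis functions of the equatorial β-plane shallow-water equations, and (ii) a space–time power spectrum in zonal wavenumber and frequency. The grid, the 10° trapping scale, and the synthetic field are placeholders, not the data of the paper.

```python
import numpy as np
from numpy.polynomial.hermite import hermval
from math import factorial, pi, sqrt

def meridional_basis(n, y):
    """Normalized parabolic cylinder function:
    psi_n(y) = H_n(y) exp(-y^2/2) / sqrt(2^n n! sqrt(pi)), y = lat / trapping scale."""
    coeffs = np.zeros(n + 1)
    coeffs[n] = 1.0
    norm = sqrt(2.0 ** n * factorial(n) * sqrt(pi))
    return hermval(y, coeffs) * np.exp(-0.5 * y ** 2) / norm

rng = np.random.default_rng(2)

# Synthetic field (time, lat, lon): a slow eastward-moving, equatorially trapped
# wavenumber-5 signal plus noise.
ntime, nlat, nlon = 512, 41, 144
lat = np.linspace(-20.0, 20.0, nlat)       # degrees
y = lat / 10.0                             # assumed trapping scale of 10 degrees
t = np.arange(ntime)                       # days
lon = np.linspace(0.0, 2 * pi, nlon, endpoint=False)

signal = (meridional_basis(0, y)[None, :, None]
          * np.cos(5 * lon[None, None, :] - 2 * pi / 40 * t[:, None, None]))
field = signal + 0.5 * rng.normal(size=(ntime, nlat, nlon))

# (i) project onto a chosen meridional basis function (trapezoid rule in y).
def project(field, n):
    psi = meridional_basis(n, y)
    return np.trapz(field * psi[None, :, None], y, axis=1)

# (ii) space-time power spectrum of the n = 0 projection.
proj0 = project(field, 0)                                  # (time, lon)
spec = np.abs(np.fft.fft2(proj0)) ** 2 / proj0.size        # (frequency, wavenumber)
f_idx, k_idx = np.unravel_index(np.argmax(spec[1:, :]), spec[1:, :].shape)
print("strongest nonzero-frequency power at (freq index, wavenumber index):",
      f_idx + 1, k_idx)
```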
Abstract
The nature of predictability is examined in a numerical model relevant to the midlatitude atmosphere and oceans. The approach followed is novel and uses new theoretical tools from information theory, namely entropy functionals, as measures of information content, together with their application to finite ensembles. Particular attention is paid here to the practical application of these methods to the problem of ensemble prediction in dynamical systems with state spaces of high dimensionality. In this case, typically only an estimate of the prediction probability distribution function is available, at coarse resolution. A methodology for estimating the information loss implied by this limited knowledge is introduced and applied to the practical problem of measuring prediction information content in a model able to generate geophysical turbulence. The application studied here generates such turbulence through the mechanism of baroclinic instability via an imposed, constant mean vertical shear. In traditional studies in this area, considerable attention has been paid to variations in ensemble spread as the major determinant of how predictability may change as prediction initial conditions vary. The analysis here reveals that such a scenario neglects the important role of the so-called ensemble signal, which is related to the difference in the first moments of the prediction and climatological distributions. It is found, in fact, that this quantity is often a strong control on variations in predictability of the large-scale barotropic flow. An initial investigation of the role of non-Gaussian effects shows that, for the univariate large-scale barotropic case, they are only of minor importance to variations in predictability.
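A minimal sketch of the coarse-graining idea: estimate the relative entropy between a prediction ensemble and a climatological ensemble from histogram counts at a chosen (coarse) bin resolution. The bin count, ensemble sizes, and distributions below are placeholders, and the paper's treatment of the information lost to this limited resolution is not reproduced here.

```python
import numpy as np

def coarse_relative_entropy(prediction, climatology, nbins=10):
    """Histogram estimate of D(p||q) in nats at a coarse bin resolution.
    Bin edges are defined from the climatological ensemble."""
    edges = np.linspace(climatology.min(), climatology.max(), nbins + 1)
    p, _ = np.histogram(prediction, bins=edges)
    q, _ = np.histogram(climatology, bins=edges)
    p = p / p.sum()
    q = q / q.sum()
    mask = p > 0               # 0 * log(0/q) = 0 by convention
    return np.sum(p[mask] * np.log(p[mask] / q[mask]))

rng = np.random.default_rng(3)
climatology = rng.normal(0.0, 1.0, 100_000)
prediction = rng.normal(0.8, 0.5, 1_000)    # shifted, sharpened forecast ensemble

print("coarse-grained relative entropy (nats):",
      round(coarse_relative_entropy(prediction, climatology), 3))
```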
Abstract
It is argued that a major fundamental limitation on the predictability of the El Niño–Southern Oscillation phenomenon is provided by the stochastic forcing of the tropical coupled ocean–atmosphere system by atmospheric transients. A new theoretical framework is used to analyze in detail the sensitivity of a skillful coupled forecast model to this stochastic forcing. The central concept in this analysis is the so-called stochastic optimal, which represents the spatial pattern of noise most efficient at causing variance growth within a dynamical system. A number of interesting conclusions are reached. (a) Sensitivity to forcing is greatest during the northern spring season and prior to warm events. (b) There is little sensitivity to meridional wind stress noise. (c) A western Pacific dipole pattern in heat flux noise is most efficient in forcing eastern Pacific SST variance. An estimate of the actual wind stress stochastic forcing is obtained from recent ECMWF analyses, and it is found that "unavoidable" error growth within the model due to this stochastic forcing saturates at approximately 0.5°C in the NINO3 region, with very rapid error growth during the first 6 months. The noise projects predominantly onto the first stochastic optimal and, in addition, around 95% of the error growth can be attributed to stochastic forcing with a strong synoptic character.
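A minimal sketch of the stochastic-optimal calculation for a linear system dx/dt = A x + forcing: the stochastic optimals are the eigenvectors of Z = ∫₀ᵀ exp(Aᵀt) exp(At) dt, ranked by how efficiently stationary noise with that spatial structure excites variance over the horizon T. The 3×3 matrix and 12-month horizon below are arbitrary stable examples, not the coupled forecast model of the paper.

```python
import numpy as np
from scipy.linalg import expm

# Arbitrary stable, non-normal dynamical operator (units: 1/month).
A = np.array([[-0.10,  0.50,  0.00],
              [ 0.00, -0.20,  0.60],
              [ 0.00,  0.00, -0.30]])

T, nt = 12.0, 600                       # 12-month horizon
times = np.linspace(0.0, T, nt)
dt = times[1] - times[0]

# Z = int_0^T expm(A^T t) expm(A t) dt  (simple Riemann sum).
Z = sum(expm(A.T * t) @ expm(A * t) for t in times) * dt

eigvals, eigvecs = np.linalg.eigh(Z)    # Z is symmetric positive definite
order = np.argsort(eigvals)[::-1]
print("variance excited by each unit-amplitude noise pattern:", np.round(eigvals[order], 2))
print("first stochastic optimal:", np.round(eigvecs[:, order[0]], 2))
```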
Abstract
A predictability framework, based on relative entropy, is applied here to low-frequency variability in a standard T21 barotropic model on the sphere with realistic orography. Two types of realistic climatology, corresponding to different heights in the troposphere, are used. The two dynamical regimes with different mixing properties, induced by the two types of climate, allow the testing of the predictability framework in a wide range of situations. The leading patterns of empirical orthogonal functions, projected onto physical space, mimic the large-scale teleconnections of observed flow, in particular the Arctic Oscillation, Pacific–North American pattern, and North Atlantic Oscillation. In the ensemble forecast experiments, relative entropy is utilized to measure the lack of information in three different situations: the lack of information in the climate relative to the forecast ensemble, the lack of information incurred by using only the mean state and variance of the forecast ensemble, and information flow—the time propagation of the lack of information in the direct product of marginal probability densities relative to the joint probability density in a forecast ensemble. A recently developed signal–dispersion–cross-term decomposition is utilized for the relative entropy with respect to climatology to determine different physical sources of forecast information. It is established that although dispersion controls both the mean state and variability of relative entropy, the sum of the signal and cross-term governs physical correlations between a forecast ensemble and EOF patterns. Information flow is found to be responsible for correlated switches in the EOF patterns within a forecast ensemble.
Abstract
In this study, ensemble predictions were constructed using two realistic ENSO prediction models and stochastic optimals. By applying a recently developed theoretical framework, the authors have explored several important issues relating to ENSO predictability including reliability measures of ENSO dynamical predictions and the dominant precursors that control reliability. It was found that prediction utility (R), defined by relative entropy, is a useful measure for the reliability of ENSO dynamical predictions, such that the larger the value of R, the more reliable the prediction. The prediction utility R consists of two components, a dispersion component (DC) associated with the ensemble spread and a signal component (SC) determined by the predictive mean signals. Results show that the prediction utility R is dominated by SC.
Using a linear stochastic dynamical system, SC was examined further and found to be intrinsically related to the leading eigenmode amplitude of the initial conditions. This finding was validated by actual model prediction results and is also consistent with other recent work. The relationship between R and SC has particular practical significance for ENSO predictability studies, since it provides an inexpensive and robust method for exploring forecast uncertainties without the need for costly ensemble runs.