In the late 1960s, well before the availability of computer power to produce ensemble weather forecasts, Edward Epstein (1931–2008) developed a stochastic–dynamic prediction (SDP) method for calculating the temporal evolution of mean value, variance, and covariance of the model variables: the statistical moments of a time-varying probability density function that define an ensemble forecast. This statistical–dynamical approach to ensemble forecasting is an alternative to the Monte Carlo formulation that is currently used in operations. The stages of Epstein's career that led to his development of this methodology are presented with the benefit of his oral history and supporting documentation that describes the retreat of strict deterministic weather forecasting. The important follow-on research by two of Epstein's protégés, Rex Fleming and Eric Pitcher, is also presented.
A low-order nonlinear dynamical system is used to discuss the rudiments of SDP and Monte Carlo and to compare these approximate methods with the exact solution found by solving Liouville's equation. Graphical results from these various methods of solution are found in the main body of the paper while mathematical development is contained in an online supplement. The paper ends with a discussion of SDP's strengths and weaknesses and its possible future as an operational and research tool in probabilistic–dynamic weather prediction.
Without regard for limitations of computer resources that prohibited ensemble weather forecasting in the 1960s, Edward Epstein forged ahead and developed a stochastic–dynamic system that stimulated dynamicists worldwide.
As we read about developments of early-twentieth-century physics in books like Freeman Dyson's Infinite in All Directions, David Bohm's Causality and Chance in Modern Physics, and Kenneth Ford's The World of Elementary Particles, we are vicariously drawn into the intellectual conflict between the deterministic view associated with classical physics and the probabilistic view that came with quantum mechanics (Dyson 1988; Bohm 1957; Ford 1963). Quoting from Ford (1963, p. 53),
The probability of the macroscopic world (and of classical physics) is a probability of ignorance; the probability of the microscopic world is a fundamental probability of nature. The only reason the slot in which the roulette ball stops cannot be calculated in advance of the spin is ignorance of what the physicist calls “initial conditions”. . . The difference in the quantum mechanical law of probability is that one can not, in principle, as well as in fact, calculate the exact course of an atomic event, no matter how precisely the initial conditions are known.1
Mathematicians and physicists at Los Alamos developed the so-called Monte Carlo method in the late 1940s to deal with the uncertainty of branching events in the life of elementary particles (Metropolis and Ulam 1949). For example, these computationally demanding algorithms relied on repeated random sampling to determine the fate of neutrons in fissionable material such as uranium. With advances in computational power, these Monte Carlo methods began to enter the minds of dynamic meteorologists and turbulence theorists by the mid-1960s (Lorenz 1965; Leith 1997). The dynamic–probabilistic approach to operational numerical weather prediction (NWP) has become mainstream today (Hirschberg et al. 2011).
At about the same time that quantum mechanics came into full bloom, L. F. Richardson adopted Vilhelm Bjerknes's principle of weather prediction as an initial value problem in classical physics (Richardson 1922). His bold manual execution of an NWP experiment failed for reasons only fully appreciated decades later (Platzman 1967; Lynch 2006). With the advent of the programmable digital computer in the immediate post–World War II (WWII) period, Jule Charney and the team at Princeton's Institute for Advanced Study (IAS) made several 24-h forecasts of the large-scale features of the hemispheric circulation based on quasigeostrophic principles: advection of the geostrophic vorticity by the geostrophic wind (Charney 1948; Charney et al. 1950). Forecasts initialized on 30 January and 13 February 1949 were impressive, but the forecast initialized on 5 January was not particularly good. It is instructive to read the even-handed accounts of events surrounding these forecasts by two of the participants, George Platzman and Joseph Smagorinsky (Platzman 1979; Smagorinsky 1983). An informative history of these events is found in Harper (2012, chapters 4 and 5).
The meteorological community was well aware of the high profile work at the IAS. Even before the success of the numerical experiment was announced, Eric Eady—a fresh Ph.D. in mathematics out of Imperial College in London—waved an amber-colored flag of warning regarding the perils of deterministic NWP. In clear-cut and trenchant arguments found in his dissertation (Eady 1948) and abridged versions of it (Eady 1949, 1951), he discounted strict determinism in favor of an ensemble approach to weather forecasting: “. . . we must extend our analysis and consider the properties of a set or ‘ensemble’ (corresponding to the Gibbs-ensemble of statistical mechanics) of all possible developments” (Eady 1951). The statement was made in consideration of his dissertation results related to development of baroclinic weather systems. Namely, small perturbations below a certain margin of error in the initial state can grow at an exponential rate along with the unstable disturbance, and the forecast error grows to the point where signal is masked by noise.2 A photo of Eady is shown in Fig. 1.
This insightful vision that heralded the need for caution regarding extended-range forecasting was not well received by the worldwide community of meteorologists. As succinctly stated by Philip Thompson, “They didn't really want to introduce any element of uncertainty into what was pleasingly deterministic” (Thompson 1983). Nevertheless, a body of evidence that came from experiences with operational NWP and simulations with general circulation models (GCMs) lent credibility to Eady's conjecture by the mid-1960s. Into this environment of question regarding the limits of deterministic weather prediction came Edward Epstein (1931–2008), a meteorologist with a penchant for applying statistics to weather. From his post alongside the dynamicists, he offered a novel view of ensemble prediction that fundamentally linked dynamics with statistics: a methodology that he called stochastic–dynamic prediction (SDP).
We review the steps that prepared Epstein for his major contribution to ensemble weather prediction (as found in Epstein 1969). These steps are viewed in the context of his academic experiences and the limits of deterministic weather forecasting. Further, a study of SDP is conducted with a low-order dynamical constraint that is simpler than the one used by Epstein (1969) but true to the spirit of his work. The mathematical underpinning of SDP is contained in an online supplement (http://dx.doi.org/10.1175/BAMS-D-13-00036.2) that complements the graphical displays and qualitative discussion in the main body of text. Comparison and contrast of SDP with Monte Carlo ensemble prediction and the true probabilistic–dynamic prediction is presented at several junctures in the paper. The paper ends with a summary of the strengths and weaknesses of SDP and conjectures related to its future as a vehicle for making ensemble forecasts.
STEPS ON EPSTEIN'S PATH TO SDP.
Family background and youth.
The family tree of Edward Epstein, a heritage traced to nineteenth-century Russia and Hungary, gives little evidence of intellectual or academic tradition. He grew up in Highbridge, a working-class neighborhood of the Bronx borough of New York City. His father was a movie theatre projectionist with a fourth-grade education and his mother fell one year short of graduating from high school. His parents, nevertheless, stressed the value of education to Edward and his older sister.
Edward became a precocious prodigy of astronomy. He read every adult book in the public library on the subject and made frequent solo trips via bus and subway to the American Museum of Natural History, which housed the Hayden Planetarium. He was elected president of the Junior Astronomy Club of New York City, became editor of the club's quarterly journal, and became known at the Hayden as “the boy who answered the questions put to the audience by the lecturers” (E. Epstein 2002, personal communication, hereafter E2002). He graduated from the Bronx High School of Science in 1947 at age 16 and “. . . entered my freshman year at Harvard on a scholarship” (E2002). Epstein took residence at Lowell House, Harvard University, and a photograph of him as a member of the house's tackle football team is shown in Fig. 2.
Mentors: Whipple to Mosteller to Panofsky.
Upon entry into Harvard, Epstein initially elected to major in mathematics; but this would change as he recounted (E2002):
I was assigned to an advisor who tried hard to steer me into pure mathematics. Since my interest was in applied math, I decided to switch to astronomy. I had already, in my first semester, taken a course called Practical Astronomy taught by Professor Fred Whipple. I very much liked his approach to science, emphasizing quantitative considerations and making sure that results made physical sense. I later took a second course from Professor Whipple, this time on the computation of cometary orbits. I learned to be a whiz with all varieties of desk calculators. This was before the advent of computing machines.
However, as often happens in the presence of gifted teachers, the career path changes. In Epstein's case it was a course in statistics under Frederick Mosteller, a newly appointed professor with a Ph.D. out of Princeton who would soon become the central figure in statistics at Harvard. He was indeed a gifted teacher as students around the country would observe in 1960 when he taught statistics on the National Broadcasting Company's (NBC's) Continental Classroom, a series of television-based courses in mathematics and science over the 6-yr period of 1958–63 (Carlisle 1974). The textbook for the course with exercises that vitalize statistics stands as a testament to the teaching and writing skills of Mosteller and his two coauthors (Mosteller et al. 1961).3 As Epstein recalled, “I didn't do particularly well in the course (my grade was some variety of B), but toward the end of the semester I grew enthusiastic about the subject, so much so that I decided I would like to do graduate work in it” (E2002).
Indeed, upon graduation from Harvard in 1951 with a B.A. (cum laude) in astronomy, Epstein pursued the study of statistics in graduate school. Somewhat surprisingly, he studied it in the context of business at the Columbia University Graduate School of Business Administration. The decision was made in part because he was able to live at home with his parents and was able to secure a half-time job as “statistical tabulator” on a human resources research project at the university.4 He took an MBA from Columbia in early 1953, but upon graduation he lost his deferment from military service (the Korean War was still ongoing at this time). Rather than being drafted, he opted for a U.S. Air Force (USAF) officer-training program in meteorology. As he remembered, “I knew nothing at all about meteorology, but I very much liked the idea of continuing my education” (E2002).
Epstein was sent to Pennsylvania State University (PSU) for meteorology training and “I was singled out for special attention because of my background in astronomy and statistics plus pretty good grades from good schools” (E2002). He was placed under the supervision of Hans Panofsky, the resident theoretician at PSU, and the two of them hit it off perfectly. Panofsky and his 1-yr-younger brother, Wolfgang, were youthful prodigies who studied physics and astronomy at Princeton in the late 1930s (Wheeler and Ford 1998; Panofsky 2007). As remembered by Epstein, “This was a particularly good match with Hans Panofsky who became my mentor” (E2002). With his strong scientific background, Epstein was able to complete the M.S. degree in the year he went through training in meteorology. Between 1954 and 1957, Epstein was assigned to work for the Air Force Cambridge Research Laboratories (AFCRL), centered in Cambridge, Massachusetts, with a satellite facility on the campus of Northern Arizona University in Flagstaff, Arizona [Air Force Cambridge Research Center (AFCRC)].5 While at AFCRC, he took a leadership role in the investigation of stratospheric ozone through use of ground-based infrared radiance measurements. He developed a mathematical inversion technique to convert infrared radiance measurements into ozone profiles (Epstein et al. 1956). Upon completion of military service in 1957, he returned to PSU and parlayed his research on ozone into a dissertation on stratospheric structure (Epstein 1960). Epstein successfully defended his dissertation in January 1960 but was self-deprecating about this work: “I am not proud of my dissertation; unlike my work in Arizona, it was not particularly original or of much significance, but it satisfied those in authority at Penn State” (E2002).
In spite of this disappointment, the dissertation gives clear evidence of strength in meteorological analysis, demonstrated through adroit arguments and impressive hand-drawn analyses of the stratospheric ozone and the associated circulation patterns. Without doubt, he had gained much under the tutelage of Hans Panofsky.
Even before Epstein received the Ph.D., University of Michigan (UM) meteorology professor H. Wendel Hewson invited him to join the Air Pollution Aeroallergens Project, a well-funded interdisciplinary project sponsored by the National Institutes of Health (NIH). As Epstein recalled, “They [UM] offered me a salary I couldn't match anywhere else” (E2002). A photograph of Epstein soon after accepting the UM faculty position at the beginning of the 1960/61 academic year is shown in Fig. 3. He joined a small contingent of meteorologists in the department of civil engineering, soon to be transferred to the department of engineering mechanics. In the order of seniority, this contingent consisted of Hewson, Gerald Gill, Donald Portman, and Nelson Dingle. In 1963, UM decided to create a separate department of meteorology and oceanography (M&O) [“Great Lakes” oceanography as labeled by oceanography professor John Winchester (J. Winchester 2012, personal communication)]. The newly appointed chair of the M&O department was Aksel Wiin-Nielsen, a Rossby protégé who was central to Sweden's entry into operational NWP (Wiin-Nielsen 1991). He changed the academic complexion of meteorology at UM from a strictly applied instrument-oriented program into one with emphasis on atmospheric dynamics. Wiin-Nielsen is pictured in Fig. 4.
Epstein brought his version of statistics into the classroom (offering a graduate course in statistical methods) and into research (serving as a meteorological statistician on the aeroallergens project under the direction of Hewson and Gill). Epstein's memory of the statistics course follows (E2002):
In the graduate statistics course, I tried to simultaneously learn and teach . . . This led me to my first experience with Monte Carlo calculations . . . We all had to learn to program the university's top-of-the-line computer, the IBM 650 . . . I presented the students with work on decision making by Tom Gleeson [a faculty member in Florida State University's meteorology department (1960s–80s)]. Gleeson stimulated my thoughts although I disagreed with his interpretation . . . I found his resulting decisions greatly counterintuitive and unreasonably pessimistic.6 I discussed this with an acquaintance in the School of Public Health. He immediately replied, “Oh you're a Bayesian!” and directed me to Leonard Savage, who was a visiting faculty member in the math department, I believe.
Leonard (“Jimmie”) Savage was a gifted statistician who made a 4-yr stop at UM (1960–63) between professorships at University of Chicago and Yale University. He was a well-known “Bayesian,” as was Mosteller. Both of these renowned statisticians were devotees to the use of the “prior,” the a priori probability, revised by later experience to yield the a posteriori probability. As found in the following recollection, Savage's influence on Epstein was lifelong (E2002):
Savage explained to me his view of probability as a personal degree of belief. This is spelled out in his book The Foundations of Statistics [Savage 1954]. I strongly adhere to this viewpoint and it has colored all of my scientific efforts since.
Figure 5 shows photographs of Mosteller and Savage, the statisticians who most influenced Epstein.
In regard to mentorship, Epstein had the view that “a finishing graduate student should be more current in his knowledge of the literature and the details of his subject, but that his mentor should be expected to be wiser” (E2002). He also had a key characteristic of Carl Rossby as remembered by Horace Byers: “Rossby encouraged all of us [doctoral students] to proceed on our own, but he himself never lost interest in the work” (H. Byers, 1992, personal communication). The following vignettes from Epstein's doctoral students add substance to these generalizations (years of association in parentheses):
JOHN LEESE (1961–64):
During much of my stay at the University of Michigan, Ed was on leave of absence in Washington working with Robert White [chief of the U.S. Weather Bureau] in restructuring the Weather Bureau to form the Environmental Science Services Administration (ESSA) . . . [Nevertheless], he expressed great confidence in me to direct the TIROS [Television Infrared Observation Satellite] project and use this work as the basis of my Ph.D.
ROLAND DRAYSON (1963–67):
[Epstein's] style was informal. His delivery [in the classroom] was somewhat hesitant . . . [and] he encouraged us to explore special interests . . . He gave me the freedom to pursue my interests in research and that was ideal for me.
REX FLEMING (1968–70):
Ed Epstein was a very intelligent man. He was easy to work with; he listened to new ideas . . . Unfortunately Ed and I did not have the many hours together that a normal situation would have demanded between a Ph.D. student and his advisor. The few hours we did have were very good.
ERIC PITCHER (1970–74):
It was a pleasure working with him [Epstein]. He had an easy manner and was always accessible. Whenever I conferred with him, I learned something new or received a new insight . . . As a person, he was kind and humble, yet confident.
Another doctoral student under Epstein was Allan Hunt Murphy (1931–97). He was one of those special students who became more than a protégé, one who became a collaborator (Murphy and Epstein 1967a,b). Epstein credits Murphy with helping him “. . . focus on meteorological statistics including probability forecasts and forecast verification as opposed to my initially diverse academic interests in radiation, aeronomy, and satellite image interpretation” (E2002).
As stated in Leese's vignette, Epstein served under bureau chief Robert M. White (during the 1962/63 academic year). This service as a junior colleague under the able leadership of White would have a dramatic influence on his career as discussed later. He was singled out to serve under White by authorities at AFCRL and J. Herbert Holloman in the Department of Commerce (E. Bierly 2013, personal communication). Epstein's strength as a motivated scientific leader had been recognized at AFCRC.
A CRACK IN DETERMINISM'S ARMOR.
The “2-week limit.”
Despite knowledge that errors in dynamical models grew in response to initial-condition uncertainty, operational deterministic NWP exhibited marked improvement in skill during the decades following its inception (Lewis 1998, 2005, 2008). During the 1960s–70s, simulations with deterministic GCMs clarified complex ocean–land–atmosphere interactions. Nevertheless, the time limits of useful deterministic prediction were documented by the mid-1960s. The GCM numerical experiments that established these limits have been well documented (Committee on Atmospheric Sciences 1966; in abbreviated form in GARP 1969). It is worthwhile to repeat an argument by one of the principal investigators involved in establishing this limit, Professor Akio Arakawa (A. Arakawa 2002, personal communication; Lewis 2005):
The report [Committee on Atmospheric Sciences 1966] represents one of the major steps toward the planning of the GARP [Global Atmospheric Research Program] . . . It showed, for the first time using a realistic model of the atmosphere, the existence of a deterministic predictability limit the order of weeks. The report specifically says that the limit is two weeks, which became a matter of controversy later. To me, there is no reason that it is a fixed number. It should depend on many factors, such as the part of the time/space spectrum, climate and weather regimes, region of the globe and height in the vertical, season, etc. The important indication of the report is that the limit is not likely to be the order of days or the order of months for deterministic prediction of middle-latitude synoptic disturbances.
Edward Lorenz (1982) added substance to this result when he examined an entire winter season of operational deterministic forecasts generated by the European Centre for Medium-Range Weather Forecasts (ECMWF). Lorenz's primary conclusions from this study were 1) the doubling time for small errors is about 2.5 days and 2) the limit of extended-range forecasting is slightly greater than 2 weeks.6 A photograph of Lorenz is shown in Fig. 6.
Lorenz's outline for Monte Carlo weather prediction.
Lorenz outlined a plan for Monte Carlo weather prediction in 1964 at the International Union of Geodesy and Geophysics–World Meteorological Organization (IUGG–WMO) conference in Boulder, Colorado (Lorenz 1965; WMO 1965). Details of the execution plan are found in Lewis (2005, section 4). It is safe to argue that Lorenz outlined this plan in response to the GCM-generated predictability limit and his studies with low-order nonperiodic models that exhibited extreme sensitivity to incorrect initial conditions (Lorenz 1962, 1963). Further, earlier research experiences while working with Victor Starr and Robert White on the Massachusetts Institute of Technology (MIT) general circulation project influenced his thinking. As he recalled (E. N. Lorenz 2002, personal communication),
Any interest I had in ensembles at that time [early 1950s] was to explain the typical behavior of the atmosphere—you can't do this by looking at one or two solutions. You can do it by looking at a large ensemble of solutions. I was trying to explain why the angular momentum transport in the Northern Hemisphere was toward the north . . . There are individual days when it is directed the other way so that it isn't something that has to happen all the time. It has to happen more of the time than not (Lorenz 1953).
The Hartford, Connecticut, conference.
In late May 1968, Epstein, Lorenz, and Gleeson delivered talks at the American Meteorological Society's First Statistical Meteorology Conference in Hartford, Connecticut (AMS 1968; Epstein 1968; Lorenz 1968; Gleeson 1968). Lorenz presented an example of ensemble forecasting that had a profound effect on Epstein. He remembered the presentation as follows (E2002):
Lorenz's paper [on Monte Carlo ensemble prediction] gave . . . a clear presentation that greatly sharpened my view of phase space and the correspondence of uncertainty with a glob of points each of which would follow its own deterministic path. Central to my train of thought was the notion that one wanted, indeed needed, the dispersion of this glob as a measure of the uncertainty.
Epstein was led to consider an alternate strategy for following the glob of points in phase space (spectral-component space) based on his background in statistics. He wanted to predict the statistical moments directly: the means, variances, and covariances that describe evolution of the multivariate probability density function (pdf). That is, instead of following the Monte Carlo philosophy that indirectly finds these moments by averaging over a large number of deterministic paths associated with perturbed initial conditions, he wanted moment equations with structures similar to the governing dynamical equations. He did not realize that there was a body of literature in physics, mathematics, and electrical engineering that approached the stochastic problem in much the same manner (six seminal papers on the subject are found in Wax 1954). These earlier contributions focused on evolution of the pdf for problems such as Brownian motion/random walk through solution of Liouville's equation or its generalization, the Fokker–Planck equation (discussed at length in section 2 of Chandrasekhar 1943). Fokker–Planck considers random forcing in the dynamical equations as well as uncertainty in the initial conditions, while Liouville only accounts for uncertain initial conditions.
Epstein's limited knowledge of the earlier work aside, he wrote a prospectus that described his research plan for SDP in meteorology. The plan rested on phase-space representation as opposed to physical-space representation: that is, spectral as opposed to gridpoint representation. The inordinate increase in the number of equations for SDP compared to deterministic prediction dictated this strategy (discussed further in the next section). Epstein showed the prospectus to Wiin-Nielsen, his colleague with an impressive background in dynamics and NWP. Wiin-Nielsen immediately gave the prospectus “thumbs up,” a level of support that encouraged Epstein to take a sabbatical leave from UM and apply for a year's study at the International Institute of Meteorology, University of Stockholm. This was “Rossby's institute,” with a sterling reputation for welcoming visiting meteorologists. With Wiin-Nielsen working on both academic ends, Epstein received a letter of invitation from Bert Bolin in early summer: an invitation to spend the 1968/69 academic year at Stockholm. A photograph of Bolin is shown in Fig. 7. Upon arrival in Stockholm, Epstein joyfully remembered the heroic welcome he received (E2002):
I had no notion if Bert Bolin would be agreeable with my plan. But with my assignment of an office, paper, and pencils, I was told that I had a sizeable allotment of computer time and even asked if that would be sufficient! I had many conversations with Bert, Bo Döös, and Hilding Sundquist. If I ran into difficulty I would seek out one of them and bounce ideas off them. Hilding would do the same with me. I ran out of computer time and more was provided. Early in the year  I gave a seminar on stochastic methods, although I cannot remember exactly what I talked about. Later in the year I presented my results once the paper [Epstein 1969] was accepted for Tellus. I thought both presentations were well received.
The fundamental ideas behind Epstein's SDP system are explored in the next section.
RUDIMENTS OF EPSTEIN'S SDP SYSTEM.
The best way to investigate and determine the distinguishing characteristics of SDP compared to other prediction methods is to examine SDP and the other methods in the context of a simplified yet nontrivial dynamical model. Platzman's truncated spectral version of the nonlinear advection equation is chosen to test these prediction methods (Platzman 1964). This dynamical constraint bears a strong resemblance to the nonlinear barotropic advection constraint used by Epstein (1969)—the three-component spectral solution to Lorenz's “minimum equations” (Lorenz 1960). Both dynamical systems describe the transfer of energy between spectral modes. There is a pedagogical advantage that comes with the two-component model; namely, the full pdf associated with the ensemble forecast can be displayed as the third dimension of a Cartesian coordinate system.
Although Platzman's spectral model is a simplified nonlinear system, the associated mathematical underpinning for statistical–dynamical prediction is substantive. Accordingly, an online supplement contains the mathematical development of deterministic, Monte Carlo, SDP, and the exact dynamical–probabilistic systems. The qualitative discussion and graphic results from the study follow in the main body of the text.
Deterministic and Monte Carlo prediction.
As shown in the online supplement, nondimensional equations governing evolution of primary and secondary spectral amplitudes of nonlinear advection take the form

du1/dt = u1u2,   du2/dt = −u1²,

where t represents time and u1(t) and u2(t) represent the amplitudes of the primary wave and secondary wave, respectively.
For deterministic prediction, the amplitude pair [u1(0), u2(0)] = [1.25, −0.35] is taken as the initial condition. This pair is labeled the cardinal initial condition (IC). The deterministic solution with the cardinal IC over the time period t = 0 → 3 is displayed as the thick curved line in Fig. 8. This solution indicates that the primary wave amplitude decreases while the amplitude of the secondary wave increases.
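To make the deterministic step concrete, here is a minimal Python sketch of the forecast from the cardinal IC. The quadratic mode-coupling pair du1/dt = u1u2, du2/dt = −u1² is assumed here for the truncated system (an energy-conserving form consistent with the behavior described above; the supplement's nondimensional coefficients may differ by a constant factor), and a standard fourth-order Runge–Kutta scheme stands in for whatever integrator the supplement uses.

```python
import numpy as np

def rhs(u):
    """Assumed two-mode nonlinear advection tendencies."""
    u1, u2 = u
    return np.array([u1 * u2, -u1 ** 2])

def rk4_step(u, dt):
    """One fourth-order Runge-Kutta step."""
    k1 = rhs(u)
    k2 = rhs(u + 0.5 * dt * k1)
    k3 = rhs(u + 0.5 * dt * k2)
    k4 = rhs(u + dt * k3)
    return u + (dt / 6.0) * (k1 + 2 * k2 + 2 * k3 + k4)

def integrate(u0, t_end, dt=0.01):
    """March the assumed system from u0 to time t_end."""
    u = np.array(u0, dtype=float)
    for _ in range(int(round(t_end / dt))):
        u = rk4_step(u, dt)
    return u

# Cardinal initial condition, integrated over t = 0 -> 3
u3 = integrate([1.25, -0.35], 3.0)
```

Because the assumed coupling conserves u1² + u2², the primary wave's loss is exactly the secondary wave's gain; checking that invariant after the integration is a quick sanity test on the scheme.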
For the probabilistic–dynamic forecast methods, we assume the initial state is given by a bivariate normal distribution with the cardinal IC as mean and variances of 0.09 (standard deviation of 0.3) for both amplitudes. The initial covariance is assumed to be zero.
The Monte Carlo ensemble prediction is achieved by first creating a set of random initial conditions chosen from the bivariate normal distribution. The number of members is represented by the integer m. Eight member pairs (m = 8) are chosen from this distribution and the associated trajectories up through t = 3 are plotted as thin curves in Fig. 8. The Monte Carlo–derived pdf at time t is found by ex post facto counting of amplitude pairs [u1(t), u2(t)] that fall within elemental areas of the phase space. As evident from the array of trajectory end points at t = 3, an accurate estimate of the pdf at this time demands inclusion of many more members.
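A Monte Carlo sketch under the same assumed two-mode dynamics (du1/dt = u1u2, du2/dt = −u1²): draw members from the specified bivariate normal initial distribution, integrate each member deterministically, and estimate the moments ex post facto from the ensemble. The random seed and the choice m = 80 (rather than the m = 8 plotted in Fig. 8) are illustrative assumptions.

```python
import numpy as np

def rhs(u):
    # Assumed two-mode tendencies: du1/dt = u1*u2, du2/dt = -u1**2
    return np.array([u[0] * u[1], -u[0] ** 2])

def rk4_step(u, dt):
    k1 = rhs(u)
    k2 = rhs(u + 0.5 * dt * k1)
    k3 = rhs(u + 0.5 * dt * k2)
    k4 = rhs(u + dt * k3)
    return u + (dt / 6.0) * (k1 + 2 * k2 + 2 * k3 + k4)

rng = np.random.default_rng(seed=1)
mean0 = np.array([1.25, -0.35])   # cardinal IC as ensemble mean
cov0 = np.diag([0.09, 0.09])      # variances 0.09, zero initial covariance
m = 80                            # ensemble size (illustrative)

members0 = rng.multivariate_normal(mean0, cov0, size=m)
members = members0.copy()
dt = 0.01
for _ in range(300):              # integrate each member to t = 3
    members = np.array([rk4_step(u, dt) for u in members])

mc_mean = members.mean(axis=0)            # first moments
mc_cov = np.cov(members, rowvar=False)    # second moments
```

The pdf estimate sharpens only as m grows: the sample mean and covariance converge like 1/sqrt(m), which is why the eight-member example in Fig. 8 gives a noisy picture of the distribution.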
Stochastic–dynamic prediction.
Whereas Monte Carlo prediction has a mathematically discrete foundation—discrete amplitude pairs are drawn from the bivariate distribution—SDP has a mathematical continuum foundation. The statistical structure of the pdf is governed by the solution to a set of coupled equations (derived in the online supplement). For the simplest form of SDP, which discards third moments, the SDP system consists of five coupled nonlinear differential equations in the first and second moments of the assumed bivariate normal distribution. Using the probabilistic initial conditions mentioned in the previous subsection, the solution to the moment equations delivers the pdf. The SDP-generated pdfs at t = 1 and 3 are shown beside the initially specified distribution in Fig. 9. As expected, the secondary wave amplitude u2 increases with time at the expense of a decrease in the primary wave amplitude u1. There is a slight positive correlation between the amplitudes at t = 1 and negative correlation at t = 3. Most obvious is the reduction in primary wave variance compared to secondary wave variance at both times.
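The structure of the five-equation system can be sketched directly. Assuming the two-mode dynamics du1/dt = u1u2, du2/dt = −u1², taking expectations and setting the third central moments to zero closes the hierarchy at second order. The tendencies below follow from that assumption; they illustrate the form of Epstein's moment equations rather than reproduce the supplement's exact coefficients.

```python
import numpy as np

def sdp_rhs(s):
    """Tendencies of the five SDP moments under third-moment discard.

    s = (m1, m2, v11, v22, c12): the means of u1 and u2, their
    variances, and their covariance, for the assumed dynamics
    du1/dt = u1*u2, du2/dt = -u1**2.
    """
    m1, m2, v11, v22, c12 = s
    return np.array([
        m1 * m2 + c12,                          # d<u1>/dt = <u1 u2>
        -(m1 ** 2 + v11),                       # d<u2>/dt = -<u1^2>
        2.0 * (m2 * v11 + m1 * c12),            # d var(u1)/dt
        -4.0 * m1 * c12,                        # d var(u2)/dt
        m1 * v22 + m2 * c12 - 2.0 * m1 * v11,   # d cov(u1,u2)/dt
    ])

def rk4_step(s, dt):
    k1 = sdp_rhs(s)
    k2 = sdp_rhs(s + 0.5 * dt * k1)
    k3 = sdp_rhs(s + 0.5 * dt * k2)
    k4 = sdp_rhs(s + dt * k3)
    return s + (dt / 6.0) * (k1 + 2 * k2 + 2 * k3 + k4)

# Probabilistic IC: cardinal mean, variances 0.09, zero covariance
s = np.array([1.25, -0.35, 0.09, 0.09, 0.0])
dt = 0.01
for _ in range(300):   # integrate the moment equations to t = 3
    s = rk4_step(s, dt)
```

A useful property of this closure is that the expected energy, m1² + m2² + v11 + v22, is exactly conserved by the five equations, mirroring the energy exchange of the underlying two-mode system.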
For dynamical systems with many spectral components, the dimension of the SDP problem—the number of equations—is extraordinarily large. In Table 1, this dimension is enumerated for deterministic, Monte Carlo, and SDP systems under the assumption that an N-component spectral model is used for prediction. In the order of deterministic, Monte Carlo, and SDP models, the required numbers of equations exhibit the ratios

N : mN : N(N + 3)/2,

where m is the number of members in the Monte Carlo ensemble. Using the latest estimates from ECMWF,7 we take N ∼ 10⁹ and m ∼ 50. Under these conditions, the ratios are ∼10⁹:10¹⁰:10¹⁸. Thus, the billion spectral components for the planned ECMWF model will translate into a quintillion coupled nonlinear differential equations for SDP. For our two-mode system where m = 8 and N = 2, the ratios are 2:16:5.
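The bookkeeping behind these ratios is easy to check: an N-component model requires N equations for the means, N for the variances, and N(N − 1)/2 for the distinct covariances, giving N(N + 3)/2 in all for SDP. A short sketch (the function name is illustrative, not from Table 1):

```python
def equation_counts(N, m):
    """Equations required: (deterministic, Monte Carlo, SDP)."""
    deterministic = N
    monte_carlo = m * N
    sdp = N + N * (N + 1) // 2   # N means + N(N+1)/2 second moments
    return deterministic, monte_carlo, sdp

print(equation_counts(2, 8))         # two-mode example -> (2, 16, 5)
print(equation_counts(10 ** 9, 50))  # ECMWF-scale estimate, SDP ~ 5e17
```

The quadratic growth of the SDP count with N, against the linear growth of the Monte Carlo count, is the crux of SDP's computational burden.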
Exact probability density function.
The analytic solution for the two-mode system has been found and presented in (ES6)–(ES8) (see online supplement). This is the general analytic solution as opposed to the special-case solution in Platzman (1964). From this general solution, the exact pdf for the two-mode nonlinear advection equation has been found by solving Liouville's equation [solution displayed in (ES23)].
The first, second, and third moments from SDP, Monte Carlo, and the exact solution at t = 2 are shown in Table 2. As mentioned in the online supplement, there is little difference in the results for third- and fourth-moment discard. Up through the second moments, the exact and SDP exhibit small differences except for the covariance (the SDP errors are generally <10%). Errors in the third moments are substantial. The SDP-generated pdf (third-moment discard version) is compared to the exact pdf in Fig. 10. The positive difference between SDP and exact is displayed at the bottom of this figure (the negative difference is not shown but its magnitude is comparable to the positive difference and its position is slightly displaced). The structural error is clearly linked to the covariance discrepancy, yet the maximum pdf error is only ±10%. The error at t = 5 is more severe, as shown in Fig. 11. The exact pdf has a bimodal structure: that is, a “valley” between two narrow elongated zones of high-valued probability density. It is impossible for the bivariate normal distribution to capture this complex structure. The maximum errors in the SDP-generated pdf are about ±25% at t = 5.
The Monte Carlo prediction for m = 8 exhibits substantial error as expected, but the results for m = 80 are good. The third-moment values and correlation ρ are especially impressive.
Even though the fourth-moment discard version of SDP delivers third moments, they cannot be incorporated into an SDP-generated pdf. That is, a unique pdf does not exist from knowledge of first, second, and third moments. The third moments affect second moments through terms in the governing SDP equations, but entries in Table 2 make it clear that improvements are minimal in this instance.
In summary, the SDP faces two problems: 1) the need for some form of closure including moment discard and 2) limitations that come from an assumption of a multivariate normal probability distribution no matter the level of moment discard.
EPSTEIN PROTÉGÉS IN SDP: FLEMING AND PITCHER.
Two doctoral students at the University of Michigan were drawn into research on SDP: Rex Fleming and Eric Pitcher. They produced the following dissertations under Epstein's guidance:
Generally stated, these research efforts were aimed at application of SDP to models more realistic than the one used in Epstein (1969). The following recollection from Fleming gives an idea of how his interest in this problem was generated (R. Fleming 2012, personal communication):
I had read Ed's first “prospective” document on stochastic dynamic prediction (SDP) (which led to his 1969 Tellus paper) and liked it. I wanted to marry that work with the energetics background I learned from Aksel [Wiin-Nielsen]. Ed, Aksel, and Warren [Washington]8 all agreed to serve on my dissertation committee with Ed as the lead. I was able to go to NCAR (registered at the U. of Michigan in absentia) and use their marvelous computer facility.
Fleming's thesis made use of a two-level quasigeostrophic model in spectral form with 14 components for each field: the streamfunction and temperature fields. As can be determined from entries in Table 1 for the case of N = 28, there are 28 equations for the mean values and 406 variance and covariance equations, giving a total of 434 equations for the first and second moments of the pdf for this model. Results from this experiment are reviewed in Epstein and Fleming (1971), where emphasis was placed on interpretation of pdf's, similar in fashion to the discussion in the previous section but with more complicated dynamics. Fleming's goal of “marrying” energetics with SDP was presented in Fleming (1971a,b). Here he examined the energy exchange and made estimates of uncertainty in these exchanges that served as a complement to the deterministic approaches with general circulation model simulations (see, e.g., Oort 1964). A novel partitioning process that divided energy into “certain” and “uncertain” components assessed the energy exchanges, with the certain and uncertain components dependent on the mean values and variances, respectively. Fleming (1971a) also investigated various methods of closure based on advances in turbulence modeling.
Eric Pitcher's memory of his entry into SDP follows (E. Pitcher 2013, personal communication):
As part of my acceptance at U of M, I was given a research assistantship and assigned to Ed to assist with his research. I recall meeting Ed on my first day in Ann Arbor. He gave me several papers to read including his 1969 Tellus paper. I still have distinct memory of sitting on the floor later that day in my empty apartment, reading those papers, and being captivated by these ideas. During the first year I worked with Ed extending the two-level quasigeostrophic model with 28 degrees of freedom developed by Fleming. We incorporated an analysis procedure based upon Bayesian statistics, and published the results in Epstein and Pitcher (Epstein and Pitcher 1972). When I got around to choosing a Ph.D. dissertation topic, it seemed perfectly natural to extend the previous study to a model with greater resolution, using real data. This became the core of my thesis. I traveled to NCAR periodically and used the CDC [Control Data Corporation] 6600 and 7600 for most of the computations.
Epstein took great pride in the accomplishments of Fleming and Pitcher. As he remembered (E2002),
Their dissertation topics grew out of the ideas that were outlined in my long letter [prospectus] to Aksel. I was fortunate to find such bright and conscientious students to attack these problems. I learned a great deal about the limits of SDP from the efforts of Eric Pitcher and Rex Fleming.
The sheer magnitude of computations required to accomplish SDP led to some skepticism in the research community.9 Nevertheless, dynamicists and atmospheric turbulence theorists Phil Thompson and Chuck Leith “. . . responded most favorably to my Tellus paper [Epstein 1969]” (E2002). Nearly 15 years after publication of this paper, Thompson applauded Epstein and his students for their effort: “I regard this development [Epstein's SDP method] as being highly significant and promising. It has been pushed further by two of his students, Fleming . . . and Pitcher” (Thompson 1983).
Joe Tribbia, a doctoral student under the direction of Ferdinand Baer at University of Michigan in the early 1970s, was close to the developments regarding SDP and Monte Carlo ensemble prediction. His recollection follows (J. Tribbia 2012, personal communication):
As for SD versus ensemble [Monte Carlo], I think that even Ed knew that the moment method would eventually lose to ensemble methods because 1) moment methods must make some nasty approximations like discard that may give wrong even nonsensical results and 2) they are computationally prohibitive O(N²) where N is big. I think there was some hope at first because moment methods were being used by Kraichnan and Leith [Kraichnan 1964, 1970; Leith and Kraichnan 1972] in turbulence studies . . . But when Chuck Leith wrote the paper on Monte Carlo methods [Leith 1974] and showed that the prediction of the mean was accurate with only 10 realizations, Ed felt that this might be the most practical way forward. Ed never voiced any objections to me regarding ensemble methods when I talked to him in his NMC [National Meteorological Center] days in any case [1983–93].
Nils Wedi provided a complementary view (N. Wedi 2012, personal communication):
Given the projected developments towards exascale computing and the improved computational efficiency of fast spectral transforms, I would be less pessimistic regarding the computational effort [of SDP] not being affordable . . . How massively parallel and executable the SDP method would be is more the question together with the (what Joe [Tribbia] calls nasty) scientific assumptions.
Ed Epstein left University of Michigan and academia in 1973 to spend the remainder of his career in high-level administrative posts within the National Oceanic and Atmospheric Administration (NOAA).10 Tribbia remembered Chuck Leith making the statement “Ed has Potomac fever,” the desire to be involved in national decisions at a post in Washington, D.C. However, the value of SDP in assessing uncertainty in climate energetics may have also played a role in enticing him to leave Ann Arbor for Washington, D.C. As stated by Fleming (R. Fleming 2013, personal communication), “I was acquainted with Robert White in the early 1970s [administrator at NOAA at that time], and I had the impression that White invited Epstein to assume a leadership role in the climate division of NOAA.” Eugene Bierly, a top-level manager at NSF during this period, was aware of factors that led Epstein to Washington. The following statement validates Fleming's impression (E. Bierly 2013, personal communication):
The World Weather Watch and GARP brought climate into the national scientific picture and Bob White wanted Epstein to be a key administrator for this component of NOAA. So White coaxed Epstein to leave Michigan in 1973. Epstein had talent as an administrator and as an academic scholar . . . Both Bob and I noticed that Ed's normal exuberance and energy were flagging in 1979. Sadly, we were informed that he was suffering from Parkinson's disease.
In Epstein's case, Parkinson's disease progressed slowly, and he had especially productive years at the Climate Analysis Center, where he wrote a book on Bayesian statistics (Epstein 1985) and took an active role in the operational 6–10-day forecasts and the formulation of the 5-day-mean climatology.
Interestingly, both Fleming and Pitcher took positions with the premier large-scale computer companies of the 1970s–80s after graduating from University of Michigan: Fleming with Texas Instruments (1972–75) and Pitcher initially with Cray (1987–2002) and then with Linux Networx (2003–08).
DISCUSSION AND CONCLUSIONS.
From the late 1940s through the 1960s, widely varying and rapidly changing views on the prospects for dynamical/numerical weather prediction were in evidence. Edward Lorenz's experiences reflect the swiftness of attitude change on this subject. He initially had an optimistic view that exhibited retreat over the short span of 10 years. Lorenz's view upon entry into the MIT doctoral program follows (E. N. Lorenz 2002, personal communication)11:
. . . not knowing about chaos and those things then [late 1940s], I had the feeling that this [weather forecasting] was a problem that could be solved, but the reason we didn't make perfect forecasts was that they hadn't mastered the technique yet.
This optimism was dampened by results from his dissertation (Lorenz 1948), an exploration of short-range forecasting based on Taylor series expansions (in time) of the variables in an adiabatic primitive equation model. The 3–6-h simulations with structures typical of midlatitude cyclonic systems produced poor results. When Lorenz inadvertently introduced small truncation error into a low-order nonperiodic model in the late 1950s, the exponential growth of this error led him to question the feasibility of extended-range weather forecasting. The 2-week limit on useful weather prediction that came from the general circulation model simulations reinforced Lorenz's view.
Edward Epstein, a little-known academic meteorologist with a passion for statistics, had the good fortune of exposure to Lorenz's forward-looking view on weather forecasting at the 1968 American Meteorological Society (AMS) conference on statistics. At this meeting, Lorenz laid out his framework for ensemble weather prediction based on the Monte Carlo method. Lorenz's graphic showing a “glob of points,” each following its own deterministic path, immediately stimulated Epstein. Soon afterward he developed a strategy for following these points without Monte Carlo's piecemeal construction of the pdf. The SDP method he developed predicted the moments of an assumed multivariate normal probability density function that fed into optimal data assimilation via the Bayesian framework, where a priori estimates of model variance/covariance are combined with observation variance/covariance to minimize the error variance of the system state. In the succeeding decades, the work of Epstein and his students inspired others to make the extended Kalman filter workable (reviewed in Evensen 1994). Their work also invigorated researchers in the climate dynamics and general circulation communities (see, e.g., Kurihara 1970; Opsteegh and Van den Dool 1980).
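The variance-minimizing combination of forecast and observation statistics described above can be illustrated with a minimal scalar sketch (the vector/matrix generalization underlies the Kalman filter); the function name is ours.

```python
def bayesian_update(prior_mean, prior_var, obs, obs_var):
    """Minimum-error-variance (Bayesian) combination of a Gaussian
    forecast prior with a Gaussian observation. The gain weights the
    observation increment by the relative uncertainty of the forecast,
    and the posterior variance is smaller than either input variance."""
    gain = prior_var / (prior_var + obs_var)
    post_mean = prior_mean + gain * (obs - prior_mean)
    post_var = (1.0 - gain) * prior_var
    return post_mean, post_var
```

Here the forecast variance is exactly the quantity SDP predicts, which is why Epstein saw the moment equations as a natural front end to Bayesian data assimilation.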
SDP's performance under the constraint of nonlinear advection produced creditable estimates for first and second moments in those cases with relatively smooth density functions (at times t < 3 in the example). The errors were on the order of 10% at these earlier times. When the density functions began to exhibit complex geometrical structure (at times t > 4)—evidence of impact from higher-order moments—the SDP's bivariate distribution fell short and gave rise to errors on the order of 25%.
Epstein was never discouraged by the computational demands of SDP. His approach was more philosophical than pragmatic, one in which the majesty and romanticism of statistics in service to science reigned supreme, similar in nature to renowned physicist John Wheeler's view of cosmic rays: “To me, cosmic radiation was a romantic subject . . . [where] particle collisions at energies far higher than any accelerator could reach . . . [could] make new particles never seen before” (Wheeler and Ford 1998, p. 132). Scientific idealism drove Epstein, as captured in his reflective assessment of his work several years before his passing (E2002):
I believe that most of the work now going on in spectral analysis, ensemble prediction and portraying probabilistic predictions was at least suggested in the Tellus paper or in the papers by Epstein and Fleming [Epstein and Fleming 1971] and by Epstein and Pitcher [Epstein and Pitcher 1972]. There are flaws in both the original SDP approach and in the current techniques [Monte Carlo method]. The closure problem is more severe than I originally thought . . . The actual ensemble distribution rapidly loses dimension as each distinct member goes off on its own manifold. In my original calculations I was forced to take extremely small time steps to assure that the matrix of correlation coefficients remain positive, which it must be for any real distribution. If I have given up on a closure solution, there remain the Monte Carlo calculations. Underestimating the growth of computer power, I never took it seriously as an operational possibility until I saw them being produced. But remember, I was, and still am, interested in the variance and covariance and the influence of uncertainty in the initial conditions. I don't think the present scheme for ensemble calculations (as I understand them: I haven't been involved for several years) does the trick.
Whether or not advances in the “present scheme”—Monte Carlo predictions—during the past several years have alleviated any of Epstein's concerns is of course impossible to tell. Nevertheless, a relentless research effort has been mounted that strives to improve Monte Carlo ensemble prediction and the associated data assimilation component of the problem. Presentations at the recent conference celebrating ECMWF's 20th anniversary of operational ensemble forecasts give evidence of this research thrust.12 Based on these presentations, it is apparent that uncertainty in forcing as well as initial conditions is a central concern, a concern that focuses on “external error generated by the discrepancy between the dynamics of the model and the real atmosphere” as stated by Chuck Leith in his eloquent essay on statistical–dynamical prediction (Leith 1974, section 2). Roberto Buizza, one of the speakers at this celebratory conference, has itemized strategies that hold promise for overcoming deficiencies in ensemble forecasting (R. Buizza 2012, personal communication): 1) simulation of physical processes that take uncertainty into account, 2) improving the link with existing ensembles of data assimilation and assessment of alternate methods for creating initial conditions, and 3) coupling to a better and higher-resolution ocean–wave–sea ice model from initial time. This latter work is expected to further extend the predictability limit to the monthly and seasonal time scale.
Will SDP ever be given serious consideration as an alternative to Monte Carlo ensemble weather forecasting? Nils Wedi's view, as expressed in his personal communication found in the previous section, hits close to the bull's-eye of the target question. SDP is not likely to be limited by computational power—exascale computing with speeds of ∼10¹⁸ instructions per second is expected by the early 2020s—or by the speed of spectral transforms; however, limitations linked to the severity of closure assumptions stand in the path of SDP as a viable alternative to Monte Carlo prediction. It is fair to say that the verdict on SDP's performance to date takes the form of the third verdict in Scotland's legal system: “not proven.”
Despite skepticism and doubt from members of the larger meteorological community, Epstein's probabilistic–dynamic approach to weather forecasting found support from a cadre of Rossby protégés (Aksel Wiin-Nielsen, Bert Bolin, Bo Döös, and Hilding Sundquist) along with theoreticians Phil Thompson and Chuck Leith. Buoyed by support from this elite group of meteorologists, Edward Epstein and his two doctoral students forged a fresh path into NWP. We commend their effort and hope that meteorology will continue to produce researchers not fettered or constrained by practicality or ease of operational implementation. Progress depends on it.
I am grateful for the informative oral histories I received from Edward Epstein, Edward Lorenz, and Malcolm Walker: Epstein for his comprehensive and candid responses to a host of questions related to his life and work, Lorenz for the first-hand account of his changing attitude toward weather prediction, and Walker for his memories of Eric Eady. I also appreciate the conversations I had with Nelson Wax in the early 1970s that served as my introduction to stochastic–dynamic prediction. Alice Epstein, Edward's wife, was most generous in expanding on her husband's life beyond science.
Bill Bourke, S. Lakshmivarahan, Rex Fleming, Eric Pitcher, and Jim Purser provided informal reviews of the manuscript that served as valued complements to the thoughtful formal reviews. Insightful suggestions for improvement of the presentation came from Roberto Buizza and Nils Wedi, and Gene Bierly filled in many gaps in my knowledge of the fledgling department of meteorology and oceanography at the University of Michigan.
Reminiscences from former students and colleagues of Epstein made it possible to define his academic style. Those who contributed were the following: Gene Bierly, Roland Drayson, Rex Fleming, Tom Grayson, John Leese, Eric Pitcher, Alan Strong, Joe Tribbia, and John Winchester. The librarians and archivists who used their skills to search for documents and photographs related to this study are the following: John Ford (Desert Research Institute), Margaret Leary (University of Michigan), Robin McElheny (Harvard University), and Kristen McDonald (Yale University). I also thank Norman Phillips and Hal Klieforth for offering photographs from their collections.
A supplement to this article is available online (10.1175/BAMS-D-13-00036.2)
1 The phrase “probability of ignorance” was introduced into scientific literature by Henri Poincaré (Poincaré 1952, chapter 6).
2 A stimulating discussion of Eady's and Charney's fundamental contributions to midlatitude cyclone development is found in Gill (1982, chapter 13).
3 The teaching skills exhibited by Mosteller in his Continental Classroom lectures have been extolled by Carlisle (1974).
4 Epstein served as a statistician on a project directed by Columbia University economics professor Eli Ginsberg. It resulted in a book titled The Uneducated (Ginsberg and Bray 1953). Epstein is listed as a staff member on the project in the book's front pages.
5 Northern Arizona University was named Arizona State College during the period that Epstein worked there.
7 The 25 June 2013 scheduled upgrade of ECMWF's global model will specify complex (two component) spectral arrays for each of the following variables: momentum (two components), temperature, and moisture, at each of 137 vertical levels with T1279 triangular truncation (15-km horizontal resolution in each horizontal direction). Thus, the number of complex spectral components is (0.5 × 1279² × 4 × 137) ∼ 4 × 10⁸, or 8 × 10⁸ ∼ 10⁹ real components, where the factor of 0.5 is associated with triangular truncation (www.ecmwf.int/products/changes/ifs_cycle_38r2/#timetable_for_implementation).
8 Warren Washington took leave of his position at the National Center for Atmospheric Research (NCAR) to serve as adjunct professor at University of Michigan during the 1968/69 academic year.
9 The author remembers Eric Pitcher's seminar at University of Illinois in the mid-1970s. He discussed results from his dissertation research that used ∼10² spectral components to investigate SDP with a barotropic model (Pitcher 1974, 1977). This demanded solution of a coupled set of ∼5,000 nonlinear differential equations (see Table 1). Even the computer-savvy 3D thunderstorm modelers, experts at coding the massively parallel Illinois Automatic Computer, version 4 (ILLIAC IV), were mildly stunned by the magnitude of the SDP problem under a relatively simple dynamical constraint.
10 Epstein's positions within the Department of Commerce included associate administrator of NOAA (1973–78), director of NOAA's National Climate Program (1978–81), chief of NOAA's Climate and Earth Sciences Laboratory (1981–83), and his final position as chief scientist of the Climate Analysis Center of the National Weather Service's National Meteorological Center (1983–93).
11 A nearly identical statement is found in Lorenz (1966).
12 A summary of the eight presentations at this celebratory conference is found in ECMWF Newsletter 134 for winter 2012/13 and online (www.ecmwf.int/publications/cms/get/ecmwfnews/305).