Although atmospheric observing systems were already an important part of meteorology before the American Meteorological Society was established in 1919, the past 100 years have seen a steady increase in their numbers and types. Examples of how observing systems were developed and how they have enabled major scientific discoveries are presented. These examples include observing systems associated with the boundary layer, the upper air, clouds and precipitation, and solar and terrestrial radiation. Widely used specialized observing systems such as radar, lidar, and research aircraft are discussed, and examples of applications to weather forecasting and climate are given. Examples drawn from specific types of chemical measurements, such as ozone and carbon dioxide, are included. Sources of information on observing systems, including other chapters of this monograph, are also discussed. The past 100 years have been characterized by synergism between societal needs for weather observations and the needs of fundamental meteorological research into atmospheric processes. In the latter half of the period, observing system improvements have been driven by increasing demands for higher-resolution data for numerical models, for long-term measurements, and for more global coverage. This has resulted in a growing demand for data access and for integrating data from an increasingly wide variety of observing system types and networks. These trends will likely continue.
The modern science of meteorology has its roots in the development of the basic instrumentation for measuring the fundamental properties of the atmosphere, such as temperature, pressure, humidity, and wind speed and direction. These were in use long before the American Meteorological Society (AMS) was established in 1919. Historical accounts [e.g., National Research Council (NRC) 1958] credit the invention of the telegraph in the mid-nineteenth century as a turning point in meteorology, as it provided a way for atmospheric measurements and observations from different locations to be analyzed as a connected whole, that is, as an observing system. Atmospheric observing systems have been an essential driver of progress in meteorology, although the widespread use of the term itself appears to be a relatively modern phenomenon. No general definition appears in the Glossary of Meteorology (Glickman 2000); related terms (“global observing system” and “observational network”) appear but do not cover the range of common usage of the term in the meteorological literature. We define an atmospheric observing system as an instrument or group of instruments that can be used to generate a set of connected observational data. Thus, the definition is based upon the type of data that are generated. A scanning radar, although it might in some sense be thought of as a stand-alone instrument, generates a set of measurements (e.g., reflectivity) connected in space and time and is therefore commonly referred to as an observing system.
This chapter examines the role of some major observing systems in enabling milestones in meteorology and atmospheric sciences over the last 100 years. The objective is to provide examples to illustrate the relationship between scientific progress and the development and implementation of atmospheric observing systems. With the growth in the number and types of observing systems, especially in recent years, it is not possible to cover all systems and their varied uses over this time period. Furthermore, many important observing systems are described in other monograph chapters and there are many excellent references that provide much more information than can be presented here. A second objective of this chapter is to provide a guide to sources of additional information on observing systems, including where they can be found in other chapters of this monograph. Finally, the relationships between science, society, and observing systems are explored by examining a few illustrative examples from the past 100 years.
Observing systems are an essential component of most areas of meteorology, and their development and uses are tied to several needs. First, fundamental research into basic atmospheric processes inspires the development of new observing systems. Second, operational meteorology, such as weather forecasting, depends heavily on timely observations that are geographically relevant. Weather forecasts and the numerical models that contribute to the forecasts are critically dependent on initial conditions, which drive the requirements for observations. The monograph chapter on weather forecasting and numerical weather prediction describes the history of weather observations used for preparing forecasts during the past 100 years (Benjamin et al. 2019; Table 2-1). Third, many industries (aviation, agriculture, hydrology, air quality, etc.) depend heavily on specialized observing systems, and these societal needs often drive the development of these systems. Some examples are provided below, and additional examples are provided in the chapters on applied meteorology (Haupt et al. 2019a,b,c). Finally, progress in understanding the Earth system and the changing nature of its climate is fundamentally tied to progress in observing systems that are implemented over long time scales and large geographical areas. Unlike weather forecasting, climate studies are not as dependent on initial conditions, resulting in a different approach to collecting and distributing data from observing systems.
This chapter focuses on observing systems for the atmosphere, although it is emphasized that observations of the entire Earth system provide critical connections with atmospheric observations, especially with regard to climate. In keeping with the theme of the 100th anniversary of the American Meteorological Society, specific examples of observing systems implemented by some U.S. federal agencies are provided. Particular events in other countries and international efforts, such as those by the World Meteorological Organization, are also discussed.
This chapter provides a summary of progress in observing the planetary boundary layer (PBL) in section 2 and the upper air (sounding systems) in section 3. Section 4 provides an overview of some common remote sensing systems, and section 5 covers solar and terrestrial radiation observing systems. Research aircraft are discussed in section 6. The relationship between societal needs and observing systems is briefly discussed in section 7.
a. Progress in atmospheric observing systems in the twentieth century
Figure 2-1 provides a chronology of some important scientific discoveries in atmospheric sciences during the past 100 years overlaid with some observing system milestones. For Fig. 2-1, seminal papers are called out as milestone events, even though the papers in most cases relied on previous work and work on the topic continued after the papers were published. In some cases, several papers, field experiments, or installations spanning longer time periods were included, as discussed below.
At the start of this 100-yr period there were already important observing systems established. In the United States, the collection of weather observations was mandated by Congress in 1870 and further advanced by the establishment of the Weather Bureau and the Cooperative Observer Program (COOP) in 1890–91 and the Air Commerce Act in 1926 (Fig. 2-1). The Air Commerce Act assigned responsibility for observation, forecasts, and warnings to the U.S. Weather Bureau (NRC 2009). This was expanded in 1938 to include hydrology and water resources (NRC 2009). The COOP is the largest and oldest network of weather observations in the United States; today it relies on more than 11 000 observers devoting more than one million hours each year.1 The current program is managed by the National Weather Service (NWS), which provides training, data acquisition and processing, quality control, archiving, and publishing through the National Centers for Environmental Information. Instrumentation and siting must meet NWS standards. Of the 11 000 COOP stations, nearly 5000 are climate stations and more than 6000 support hydrologic observations. Recent modernization of surface observing systems began in 1991, with the introduction of the Automated Surface Observing System (ASOS), which expanded in coverage throughout the 1990s.
Section 3 of this chapter and the review by Lewis (2003) provide an overview of the observing systems in use for measuring upper-level winds in the early twentieth century. Section 3 provides some additional historical details and descriptions of systems that are not covered by Lewis (2003). A comprehensive historical upper-air dataset from the first half of the twentieth century has been compiled by Stickler et al. (2010), who also document the growth in upper-air observations (their Fig. 3) and provide a detailed history of these observations. Early progress in mapping the upper-level winds also benefited from extrapolating surface data into the upper air, for example, by the use of the thermal wind equation. This provides an example of how theory is able to extend the usefulness of observing systems, a feature that is common throughout the history of observing systems.
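As a brief reminder of the theory mentioned above, the thermal wind relation links the vertical shear of the geostrophic wind to the horizontal temperature gradient on a pressure surface; in pressure coordinates it can be written as

```latex
\[
\frac{\partial \mathbf{V}_g}{\partial \ln p} = -\frac{R}{f}\,\mathbf{k} \times \nabla_p T
\]
```

where $\mathbf{V}_g$ is the geostrophic wind, $R$ the gas constant for dry air, $f$ the Coriolis parameter, and $\nabla_p T$ the temperature gradient on a constant-pressure surface. Given near-surface winds and a horizontal temperature field, vertical integration of this relation allows upper-level geostrophic winds to be estimated even where direct upper-air observations are sparse.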
The creation of AMS nearly coincided with the publication of the classic book by Vilhelm Bjerknes: On the Dynamics of the Circular Vortex: With Applications to the Atmosphere and Atmospheric Vortex and Wave Motions (Bjerknes 1921). This book provided a theoretical basis for the observations of the polar front on the synoptic charts of the Norwegian Weather Service, which relied on the network of weather stations that he and his colleagues established throughout Norway during World War I (e.g., Fiolek 2004). Much of the basis for modern meteorology, such as the concept of an air mass and frontal systems, arose during this period because of the availability of networked weather stations and the earlier work by Bjerknes and collaborators.
One of the most important historical examples of the advances in meteorology brought about by upper-air observing systems is the understanding of planetary waves (Rossby waves), which was pioneered by Carl-Gustaf Rossby in 1939–40—for example, see the reviews of Rossby’s work by Byers (1960) and Platzman (1968). Other important phenomena discovered using upper-air observations include the quasi-biennial oscillation (e.g., see section 3 and the review by Wallace 1973) and the Madden–Julian Oscillation (MJO). Madden and Julian (1971) analyzed nearly 10 years of daily rawinsonde data for Canton Island to observe a 41–53-day oscillation, which is now known as the MJO. Further details on the history of upper-air sounding systems and their impacts on science are described in section 3.
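The kind of analysis that revealed the MJO can be illustrated with a simple spectral calculation. The sketch below is a minimal illustration, not a reproduction of Madden and Julian's method: it applies a periodogram to a synthetic daily time series containing a built-in 45-day oscillation plus noise and recovers the dominant period.

```python
import numpy as np

# Synthetic stand-in for a daily station time series (e.g., zonal wind):
# a ~45-day oscillation plus noise. Not actual Canton Island data.
rng = np.random.default_rng(0)
n_days = 3650  # roughly 10 years of daily data
t = np.arange(n_days)
series = np.sin(2 * np.pi * t / 45.0) + 0.5 * rng.standard_normal(n_days)

# Remove the mean and compute the periodogram via the FFT.
anom = series - series.mean()
power = np.abs(np.fft.rfft(anom)) ** 2
freq = np.fft.rfftfreq(n_days, d=1.0)  # cycles per day

# Locate the spectral peak among periods between 20 and 100 days.
mask = (freq > 1 / 100.0) & (freq < 1 / 20.0)
peak_freq = freq[mask][np.argmax(power[mask])]
peak_period = 1.0 / peak_freq
print(f"dominant period: {peak_period:.1f} days")
```

With real station data, spectral peaks are broader and significance testing against a red-noise background is required; the point here is only that a multiweek oscillation stands out clearly in roughly a decade of daily observations.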
A milestone in trace-gas observing systems occurred with the development of the Dobson spectrophotometer for measuring ozone in the early 1920s and its subsequent applications in different parts of the world (e.g., see the review by Dobson 1968). This observing system, together with the development of methods for measuring upper-air humidity, enabled our understanding of the global stratospheric mass circulation and the exchange processes (the Brewer–Dobson circulation) that occur between the troposphere and stratosphere [e.g., see the review by Butchart (2014) and section 3 of this chapter] and the discovery of the Chapman mechanism for stratospheric ozone in 1931. More recently, the Nobel Prize in chemistry for 1995 was awarded to Crutzen, Molina, and Rowland for their work in understanding the depletion of ozone in the stratosphere by chlorofluorocarbons (CFCs). It is clear that their work would not have been possible without observing systems for ozone, which had earlier resulted in the discovery of the ozone hole (Farman et al. 1985). Their discovery also relied on advances in technology for measuring CFCs (e.g., NRC 1996), as well as other trace species, and solar radiation. The development and application of observing systems and instrumentation for atmospheric chemistry expanded greatly during the past 100 years, with applications to meteorology as well as many areas of societal interest. These are discussed in the monograph chapter on atmospheric chemistry (Wallington et al. 2019) and in NRC (1998).
Although hurricane warnings by the U.S. Army Signal Corps date from much earlier, the first hurricane warning system was established by the Weather Bureau in 1935.2 The improvements in the quantity and quality of sounding information, first by the deployment of the radiosonde and later by the introduction of droppable sounding systems (dropsondes), have resulted in continuous improvements in hurricane forecasting over a multiyear period (e.g., Burpee et al. 1996). The monograph chapter by Emanuel (2019) provides further details on the history of hurricane and tropical cyclone research.
Although networked in situ observing systems such as radiosondes are essential components of modern meteorology, remote sensing systems have come to play a major role, especially after satellites became available as remote sensing platforms. Remote sensing systems are typically classified into passive and active systems. Passive systems sample naturally occurring phenomena, such as ambient electromagnetic radiation, while active systems transmit and receive signals. Radars and lidars are two examples of active systems. Meteorological radars are perhaps the most successful and widely applied atmospheric remote sensing systems. The modern field of radar meteorology has its origins in the radar technology developed during the Second World War (WWII); the development of radar itself grew out of radio technology developed in earlier years. The ability of radars to rapidly scan storms and display a connected field of reflectivity (and more recently other variables, such as Doppler velocities and polarization parameters) has made radar an indispensable part of storm research and operational weather forecasting. In the United States, weather radar networks using modified war surplus systems first appeared after WWII. Radars specifically designed for weather observations appeared in the 1950s, and the National Weather Service network of WSR-57 systems began on 26 June 1959 with the installation of the first WSR-57 at the new Hurricane Forecast Center in Miami, Florida (NRC 2009). The network was updated starting in the 1990s with the WSR-88D Doppler radar, known as NEXRAD3 (Fig. 2-1). The NEXRAD system was further upgraded after 2010 to incorporate a capability for dual-polarization measurements that reveal much greater detail about storm structure and hydrometeor characteristics and improve data quality. Further details on the development of radar in meteorology are described in section 4.
Radar remains an essential tool for spotting thunderstorms that either contain a tornado or are likely to develop one. The first radar hook echo associated with a tornadic storm was observed at the Illinois State Water Survey on 9 April 1953.4 Fujita’s seminal work on documenting damage from tornadoes (the Fujita scale) relied heavily on relating damage assessment to observations of wind speed and developing a conceptual model (e.g., Fig. 2 in Fujita 1971) of the process. Such conceptual models have proven to be essential in explaining observations.
One of the early demonstrations of radar’s impact on the field of meteorology was the Thunderstorm Project (Byers and Braham 1948, 1949), which was certainly an important scientific milestone in our understanding of these storms (Fig. 2-1). The Byers and Braham (1949) description of the thunderstorm life cycle remains in most textbooks on storms. Although the Thunderstorm Project relied heavily on ground-based radar and surface measurements, it also made use of dedicated research aircraft such as the P-61 (nicknamed “Black Widow”; Fig. 2-2), which was one of the early military applications of airborne weather radar (see also section 4). The P-61 was the first U.S. military aircraft designed specifically to carry radar and was adapted for research. For the Thunderstorm Project, the aircraft were equipped with instrumentation for measuring temperature, updrafts and downdrafts, turbulence, lightning (electric field mills), icing, and precipitation, along with provisions for recording both the measurements and the aircraft location and altitude (Byers and Braham 1949). Although aircraft had been used as observing platforms before (e.g., see section 3), in many ways this project was pioneering in its use of aircraft by carrying multiple instruments together with specialized recording and communication equipment (referred to as a “data acquisition system” in present-day research aircraft). As with most modern research aircraft, the aircraft characteristics (payload, range, power, altitude, speed, endurance, and handling) were matched to the specific mission requirements. The Thunderstorm Project P-61 aircraft were adapted for storm research, not only through the selection of a rugged airframe and provision of onboard capabilities (such as radar and instrumentation), but also through the training and selection of pilots and ground personnel to safely meet mission requirements.
The project implemented an experimental program that relied on coordination between the different observing systems—a strategy that was widely adopted by others (e.g., Browning et al. 1973). Many of the strategies developed during the Thunderstorm Project continue today, especially for field deployments that utilize research aircraft for storm-related research. Further details on research aircraft as observing systems are provided in section 6. The influence of radar and the Byers and Braham (1949) study were evident some years later in the naming of certain thunderstorms as “supercells” by Browning (1964).
The Thunderstorm Project offers an early example of a field campaign designed to study a particular weather phenomenon using several coordinated techniques and types of observing systems (e.g., concurrent in situ and remote sensing observations). Similar techniques are still used today in field campaigns. One of the largest field campaigns of the past 100 years was the Global Atmospheric Research Program (GARP) Atlantic Tropical Experiment (GATE; Fig. 2-1), which was the first major experiment of GARP.5 The goal was to understand the predictability of the atmosphere and extend the time range of daily weather forecasts to over two weeks (e.g., Kuettner 1974).
GATE brought together 40 research ships, 12 research aircraft, and numerous buoys from 20 participating countries. Data from the project were made available without restrictions, setting an important precedent for many future campaigns. GATE focused on weather in the tropical Atlantic. Nineteen years later, the Tropical Ocean and Global Atmosphere Coupled Ocean–Atmosphere Response Experiment (TOGA COARE) project (Webster and Lukas 1992) brought 20 nations together to study the weather associated with the Pacific warm pool (Fig. 2-1). GATE and TOGA COARE are among the largest research campaigns of the past 100 years. They have made extensive contributions to many areas of tropical meteorology, such as our understanding of tropical mesoscale convective systems (e.g., Houze 2019). One of the hallmarks of the second half of the past 100 years has been an increasing number of field campaigns focusing on a variety of scientific and operational objectives. One of the most comprehensive summaries of recent field campaigns is provided by Kramer (2002, appendix Q), which, although focused primarily on campaigns after 1985 with a satellite component, includes information on many different field campaigns and their objectives.
Lidar, like weather radar, is an active scanning remote sensing observing system, but it offers access to atmospheric variables, such as aerosols and trace gases, that are generally not accessible by radar. While radar provides an example of a mature technology that continues to be improved, lidar is a newer technology that is finding wide applications and great potential for future growth. Early development of laser-based systems occurred in the early 1960s (e.g., NCAR 1967), but widespread use of lidar observing systems occurred somewhat later than radar (Fig. 2-1). Further details on the development of lidar in meteorology are described in section 4.
Another group of observing systems that has improved our understanding of thunderstorms and our forecasting of severe weather has been the development of advanced lightning-detection systems, such as the National Lightning Detection Network (NLDN). The review by Orville (2008) provides a history of the development of the NLDN, its capabilities, and related lightning-detection networks. Lightning Mapping Arrays (LMA), which locate the sources of VHF radiation in space and time from lightning discharges (e.g., Rison et al. 1999), also make up an important observing system for studying storm electrification and the characteristics of the lightning discharge. The World Wide Lightning Location Network (WWLLN)6 provides lightning data from over 50 collaborating universities and institutions.
Theoretical milestones for numerical modeling, such as Kessler’s theory of autoconversion (Fig. 2-1)—the initial stage of the collision–coalescence process whereby cloud droplets collide and coalesce to form drizzle (Glickman 2000)—were influenced heavily by earlier research using radar (e.g., Kessler 1969, 1995). Later development of numerical modeling techniques for storms was motivated by observations of storm-echo splitting (e.g., Wilhelmson and Klemp 1981). Other advances in theory, such as explaining how pollution may affect the albedo of clouds (the Twomey effect; Twomey 1974, 1977), have relied on airborne measurements of cloud properties (e.g., cloud droplet sizes and concentration) and aerosol properties, such as the subset of aerosol types that act as nuclei for cloud droplet condensation [cloud condensation nuclei (CCN)]. Twomey’s work is considered a milestone in research on the radiative impact of clouds, an area of study that continues to the present day, typically using research aircraft and satellite measurements. Other theoretical work, such as the pioneering work in chaos theory by Lorenz (1963), fundamentally changed our understanding of the limits of observations for predicting a future state of the atmosphere.
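Kessler's autoconversion parameterization, referenced above, is commonly quoted in a simple threshold form (e.g., following Kessler 1969):

```latex
\[
\left( \frac{\partial q_r}{\partial t} \right)_{\mathrm{auto}} = k \, (q_c - q_{c0}) \, H(q_c - q_{c0})
\]
```

where $q_c$ and $q_r$ are the cloud-water and rainwater mixing ratios, $q_{c0}$ is a threshold (often taken near $0.5\ \mathrm{g\,kg^{-1}}$), $k$ is a rate constant, and $H$ is the Heaviside step function; no drizzle is produced until the cloud water exceeds the threshold. This simple form made warm-rain initiation tractable in early numerical cloud models.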
Around the same time as the U.S. national network of weather surveillance radars began installation (Fig. 2-1), the launch of the first weather satellite, TIROS I, in April of 1960, ushered in a new era of observing systems with global coverage, a trend that has only strengthened in subsequent years. This period can be viewed as a turning point in atmospheric observing systems, toward more emphasis on large geographic and temporal coverage, making use of the opportunity to integrate satellite observations with other Earth observing systems. As described by Kramer (2002), “Prior to the space age … humankind had never been able to take in the whole of a hemisphere in a single glance. In fact it had never had a global view of the world in which it lived.” Coming near the midpoint of the past 100 years, the availability of satellite platforms resulted in a revolution in the uses of remote sensing observing systems for both fundamental and applied meteorology. The monograph chapter on atmospheric satellite observations (Ackerman et al. 2019) covers many of the scientific advances that have resulted from these observing systems.
Even before the launch of TIROS I, it was recognized by a National Academy of Sciences report (NRC 1958) that “the global nature of the weather problem and the imminent availability of Earth satellites … are going to affect the complexity of meteorological research at least tenfold.” The prophetic nature of this remark is illustrated, for example, by developments in the late 1980s. The term “Earth system science” became popular, recognizing the interconnectedness of Earth’s biosphere, atmosphere, and hydrosphere, the study of which requires integrating data from many global Earth-observing systems. Earth system science is the study of the Earth and its components as an integrated set of systems (NRC 1986, 1988, 1990). The Intergovernmental Panel on Climate Change (IPCC), established in 1988 (Fig. 2-1), has produced five assessment reports since 1990,7 each relying on multiple Earth-observing systems and models to produce a summary of the state of the science behind climate change.
A point made by the NRC (1958) report was “another requirement for substantial progress toward a solution of the scientific problem of meteorology is that the scale on which research is conducted must be commensurate with the scale of the problem.” At the time of that report Keeling began his historic observations of CO2 at the Mauna Loa observatory (Fig. 2-3). No history of atmospheric observing systems would be complete without reference to the Keeling curve, which shows the rise in CO2 concentrations as well as the seasonal variations and is often cited as one of the most important geophysical observations of all time. Figure 2-1 shows the establishment of the observatory at Mauna Loa in 1958, although Keeling did prior work developing these measurements.8 He reported on the implications of these measurements in 1960 (Keeling 1960). One of the important lessons from the Keeling curve is a confirmation of the comments in the 1958 NRC report and of the recommendations made by advocates of establishing these types of sustained measurements (e.g., Callendar 1958; Revelle and Suess 1957): the need for continuous measurements consistent with the scale of the problem, which for CO2 involves both seasonal and long-term (i.e., climatological) scales. This strategy is evident in the current Global Greenhouse Gas Reference Network.9
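The Keeling curve's two signals, a long-term rise and a seasonal cycle, can be separated with an ordinary least-squares fit of a linear trend plus an annual harmonic. The sketch below uses synthetic numbers chosen only to resemble the record qualitatively; it is not the Mauna Loa data.

```python
import numpy as np

# Illustrative decomposition of a Keeling-style record into a long-term
# trend and a seasonal cycle. All values are synthetic.
t = np.arange(120) / 12.0  # 10 years of monthly samples, in years
co2 = (315.0 + 1.5 * t + 3.0 * np.sin(2 * np.pi * t)
       + 0.3 * np.random.default_rng(1).standard_normal(t.size))

# Least-squares fit of constant + linear trend + annual harmonic pair.
X = np.column_stack([np.ones_like(t), t,
                     np.sin(2 * np.pi * t), np.cos(2 * np.pi * t)])
coef, *_ = np.linalg.lstsq(X, co2, rcond=None)
trend_per_year = coef[1]
seasonal_amp = np.hypot(coef[2], coef[3])
print(f"trend: {trend_per_year:.2f} ppm/yr, "
      f"seasonal amplitude: {seasonal_amp:.2f} ppm")
```

In practice the record also contains interannual variability and a slowly accelerating trend, but this simple decomposition captures the essential point: sustained, continuous measurements are needed to resolve both time scales at once.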
b. Sources of information on atmospheric observing systems
Atmospheric observing systems are essential to nearly every aspect of meteorology. Although there is not enough space to describe the many systems in use today, there are a number of useful references, including other chapters of this monograph. Table 2-1 provides a brief summary of where information on some specific types of observing systems can be found in this monograph. Satellite observing systems have become so widely used that two chapters of this monograph are devoted to observing systems from these platforms (Fu et al. 2019; Ackerman et al. 2019). Additional information on some common satellite remote sensing systems is provided in sections 4 and 5 of this chapter. Systems for measuring precipitation are covered in the chapter on hydrology (Peters-Lidard et al. 2019). Examples from this monograph of specific applications to various areas of meteorology are given in Table 2-1.
One of the most comprehensive sources of contemporary atmospheric observing systems information is Kramer (2002), which emphasizes satellite remote sensors but contains an inventory of airborne platforms and their instrumentation and a summary of recent field campaigns.
A review of the status of observing systems and future needs at the end of the twentieth century is provided by NRC (1998), which discusses observing systems related to atmospheric physics, chemistry, dynamics and weather forecasting, upper-atmospheric and near-Earth space, and climate/climate change research.
Two twenty-first-century reviews of atmospheric observing systems are found in NRC (2003) and NRC (2009). Appendix C in NRC (2003) provides an overview of major U.S. observing systems from public and private sources, including surface, upper-air, profiler, and Meteorological Data Collection and Reporting System (MDCRS) commercial aircraft data; Doppler radar; ocean observations; lightning detection; and satellites. Descriptions of the technology involved, the number of sensors, data collection strategy, data products, quality control, and data dissemination are discussed. NRC (2003) also describes the substantial changes in declining instrument costs, increased computing power, increasing bandwidth, and related networking capabilities that are associated with networked atmospheric observing systems. For example, advances in computing and data assimilation/networking have allowed for numerical modeling capability at finer scales, which drives the need for higher-resolution observations of atmospheric and land-use variables. A related concern is processing and archiving the increasing amounts of data that are generated.
NRC (2009) provides a summary of atmospheric observing networks, which, within the United States, have enormous diversity and different support mechanisms. For example, appendix B.1 of NRC (2009) lists over 500 different surface-based networks devoted to meteorological data, and appendix B.2 lists the many surface-based networks devoted to air quality monitoring. These appendices are useful for finding measurement parameters, numbers of sites, operating agencies, and locations where the data can be obtained. They also provide links to databases that describe and map what is available (e.g., http://www.eol.ucar.edu/projects/hydrometnet covers hydrometeorological networks in the United States, including national, regional, state and local, precipitation and radar, upper-air, radiation and flux, soils, hydrology, and other networks). Chapter 4 of NRC (2009) provides an overview of current and emerging observing systems, observational challenges, and the global context for observing systems. NRC (2009) also discusses societal needs for observations and a strategy for a “network of networks” to better integrate the many existing networks. Both NRC (2009) and NRC (2003) provide details on the late twentieth/early twenty-first-century growth of networking and the impact this has had on observing systems and how they are used. They discuss the need to coordinate the increasing amounts of data that are generated, which is likely to continue to be a major feature of twenty-first-century meteorology.
The World Meteorological Organization (WMO) has a long history of fostering international cooperation in observing systems. Their Global Observing System (GOS; WMO 2017) is part of the World Weather Watch and includes requirements and reporting practices that have been widely adopted by WMO partners from a wide variety of international groups. WMO (2017) reports that the GOS includes some 11 000 surface stations, 1300 upper-air stations, 4000 ships (1000 reporting every day), reports from over 3000 cooperating aircraft via the Aircraft Meteorological Data Relay (AMDAR) system, data from a constellation of satellites, and radar data from national and regional networks, plus solar radiation observations, lightning-detection observations, tide gauge observations, and wind profiler data.
The WMO Guide to Meteorological Instruments and Methods of Observation (GIMO; WMO 2014) is a guide to measurements of meteorological variables, a description of observing systems, a guide to space-based observations, and a description of methods for quality assurance and management of observing systems. GIMO provides a comprehensive reference and standards for making many of the meteorological measurements used in networked observing systems, such as used in the GOS.
2. Boundary layer observing systems and systems for measuring turbulence from fixed and mobile locations
Because meteorological conditions near the surface are relatively easy to access, the past 100 years have seen major improvements in PBL observing systems. A critical requirement for PBL studies is the measurement of turbulence variables, since pervasive turbulence is a fundamental property of the PBL that distinguishes it from the overlying free troposphere (FT). This inherent turbulence property means that quantifying PBL structure requires a combination of vertical profiling techniques for measuring both mean scalars (e.g., temperature, humidity, and chemical species) and turbulence variables such as vertical fluxes of scalars and momentum, and variances of both scalar and dynamic quantities. Although in situ measurements are a mainstay in PBL measurements they have important limitations. One is the impact of the sensor and its associated structure on the properties being measured. Another is the limited number of observation points. The latter can be addressed to some extent by arrays of sensors (e.g., Patton et al. 2011), but still the number of sensors has practical limitations. One approach to obtaining concurrent observations along extended paths is to use remote sensing techniques, which includes both active remote sensing such as radars, lidars, sonic detection and ranging (sodars), acoustic tomography, and scintillometers and passive techniques such as microwave radiometry and spectroscopy. Boundary layer height is another important PBL property that lends itself to remote sensing (e.g., Luo et al. 2014). Many of these techniques are discussed in Wilczak et al. (1996) and radar and lidar techniques are discussed in section 4. Geerts et al. (2017) provide a summary of the next generation of technologies for observing convection and turbulence.
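The vertical-flux measurements mentioned above are typically obtained by the eddy-covariance technique: decompose fast-response measurements into means and fluctuations (Reynolds decomposition) and average the product of the fluctuating parts. The sketch below uses synthetic stand-ins for a sonic anemometer/thermometer record, with a built-in correlation between vertical velocity and temperature; it is not data from any real instrument.

```python
import numpy as np

# Eddy-covariance estimate of the kinematic heat flux <w'theta'> from
# synthetic high-rate samples, as a sonic anemometer/thermometer pair
# would provide.
rng = np.random.default_rng(2)
n = 20 * 60 * 30  # 30 minutes of samples at 20 Hz

# Build correlated synthetic signals: vertical-velocity fluctuations (m/s)
# and a temperature series (K) partially driven by them.
w_fluct_true = 0.5 * rng.standard_normal(n)
theta = 300.0 + 0.4 * w_fluct_true + 0.2 * rng.standard_normal(n)
w = 0.05 + w_fluct_true  # include a small mean vertical velocity

# Reynolds decomposition: subtract block means, then average the product.
w_prime = w - w.mean()
theta_prime = theta - theta.mean()
heat_flux = np.mean(w_prime * theta_prime)  # K m/s
print(f"kinematic heat flux: {heat_flux:.3f} K m/s")
```

Real processing adds coordinate rotation, despiking, and detrending over the averaging block, but the covariance of the fluctuating parts is the core of the measurement.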
Profiling platforms, such as kites, balloons, manned and unpowered aircraft, unmanned aerial vehicles (UAVs), and blimps, each with their own advantages and limitations, have a long history. Balsley et al. (1998) describe recent developments using kites to measure profiles of both mean and turbulence variables in the lower atmosphere under moderate wind conditions. Similarly, Siebert et al. (2003) summarize applications of tethered balloons in the PBL, with particular reference to the cloud-topped PBL, and describe a system that measures both mean and turbulent scalar and dynamic variables, including microphysical measurements. Further details on the use of kites for soundings are given in section 3.
The history of thermodynamic measurements in the PBL is extensive (e.g., LeMone et al. 2019 and references therein). Thermometers have been in use for hundreds of years, mostly based on the volumetric changes with temperature that occur in gases, liquids, and solids. For the most part, their output is not easily recordable; exceptions include bimetallic strips, which transform a temperature change into a displacement that can be recorded, and thermocouples, which produce a recordable voltage. Similarly, the measurement of humidity has a long history, going back to the eighteenth century, when the change in length of human hair with relative humidity was used to obtain a displacement.
Early wind measurements in the PBL were made with cup and propeller-vane anemometers. Cup anemometers go back at least to the mid-nineteenth century (Robinson 1847) and have been in use (albeit with some improvements) ever since, as they are rugged and inexpensive, have a linear response, and can resolve a significant fraction of the turbulence spectrum. In 1991, Weston modified the three-cup anemometer to measure both wind speed and direction by adding a tag to one cup; the tag causes the rotation rate to increase and decrease as it moves alternately with and against the wind, which allows the wind direction to be determined. Similarly, propeller-vane anemometers, with the propeller axis in the horizontal plane and the vane in the vertical plane, have been used to measure both wind speed and direction in the PBL, and they are also able to resolve a significant fraction of the turbulence spectrum.
These early pioneering efforts at quantifying PBL structure have now been mostly superseded in the research community by newer technologies that are discussed in the following sections. A review of boundary layer measurement techniques as of the mid-1980s is presented in Lenschow (1984), and a review of surface-based remote sensing of the PBL as of the mid-1990s is presented in Wilczak et al. (1996).
a. Turbulence measurements from fixed sites
One of the most important turbulence variables is the vertical momentum transport, or stress. Even before sensors were developed to measure stress, estimates were obtained from simple but elegant alternative approaches. Richardson (1920), for example, estimated stress near the surface by following the trajectories of thistledown (i.e., the soft feathery material surrounding a thistle seed) released from a surface point source, and later by estimating the force needed to bend wheat stalks in a field of wheat (Richardson 1922).
Early direct eddy-correlation measurements of turbulent heat and water vapor fluxes in the surface layer were pioneered by Australian investigators using the techniques of hot-wire anemometry combined with small-diameter dry and wet thermocouples (Swinbank 1951). The fluxes were obtained from the averaged products of the vertical wind fluctuations with temperature and humidity fluctuations. This means that the responses of the anemometer and thermocouples must be sufficient to resolve all the scales that contribute to the vertical fluxes. These early measurements led to considerable insight into the structure of turbulence in the surface layer—especially for the unstably stratified PBL (Priestley 1959). Although hot-wire anemometers have sufficiently fast response to resolve the flux contributions, they are inherently delicate and finicky and do not work well in light winds (e.g., Hicks 1988). However, they are still the standard technology for specialized very high-frequency (fine spatial resolution) turbulence measurements in the surface layer (e.g., Metzger et al. 2007). A more rugged approach for vertical flux measurement, the “Fluxatron,” was adapted in 1965 using a propeller anemometer to measure the vertical wind component (Hicks 1988). This approach continued to be improved and utilized until it was eventually replaced by the sonic anemometer.
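The averaged-products computation described above is the now-standard eddy-covariance technique. A minimal sketch (pure Python, with synthetic data and illustrative variable names) of the kinematic heat flux ⟨w′T′⟩:

```python
import random

def eddy_covariance_flux(w, c):
    """Kinematic eddy flux <w'c'> from collocated fast-response time series.
    Reynolds decomposition: fluctuations are departures from the record mean.
    w : vertical wind samples (m/s); c : scalar samples (e.g., temperature, K).
    """
    n = len(w)
    w_mean = sum(w) / n
    c_mean = sum(c) / n
    return sum((wi - w_mean) * (ci - c_mean) for wi, ci in zip(w, c)) / n

# Synthetic example: updrafts (w' > 0) carrying warmer air (T' > 0)
# produce a positive (upward) kinematic heat flux <w'T'>.
rng = random.Random(0)
w = [0.5 * rng.gauss(0.0, 1.0) for _ in range(20000)]
T = [300.0 + 0.4 * wi + 0.2 * rng.gauss(0.0, 1.0) for wi in w]
flux = eddy_covariance_flux(w, T)   # ≈ 0.4 * var(w) ≈ 0.1 K m/s
```

As noted above, the sensors must respond fast enough to resolve all flux-carrying scales; with sluggish sensors the averaged product systematically underestimates the flux.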
The sonic anemometer–thermometer was a breakthrough development for measuring turbulent wind velocity and temperature fluctuations in the PBL. An early version was developed and deployed during the Great Plains Turbulence Field Program in summer 1953 (Lettau and Davidson 1957a,b) and in Project Prairie Grass (Haugen 1959) in summer 1956 near O’Neill, Nebraska, but with limited success, as discussed by Kaimal (2013) in his history of sonic anemometry. These early field programs demonstrated the usefulness of comprehensive round-the-clock measurements for obtaining an overall detailed quantification of PBL processes. These pioneering efforts, along with further development of sonic anemometry by Kaimal and others, led to the very successful Kansas Experiment in 1968, which quantified the surface-layer structure of the PBL by applying Monin–Obukhov similarity theory for a horizontally homogeneous surface to the extensive datasets for both stable and unstable stratification.
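Monin–Obukhov similarity collapses surface-layer statistics onto the dimensionless height z/L, where L is the Obukhov length. A minimal sketch using the standard definition, with illustrative flux values:

```python
KAPPA = 0.4   # von Kármán constant
G = 9.81      # gravitational acceleration, m/s^2

def obukhov_length(u_star, w_theta_v, theta_v):
    """Obukhov length L = -u*^3 * theta_v / (kappa * g * <w'theta_v'>).

    u_star    : friction velocity (m/s)
    w_theta_v : surface kinematic virtual heat flux (K m/s)
    theta_v   : mean virtual potential temperature (K)
    L < 0 indicates unstable (convective) stratification; L > 0, stable.
    """
    return -u_star**3 * theta_v / (KAPPA * G * w_theta_v)

# Illustrative daytime convective values: upward heat flux gives L < 0.
L = obukhov_length(u_star=0.3, w_theta_v=0.15, theta_v=300.0)  # ≈ -13.8 m
zeta = 10.0 / L   # dimensionless stability parameter z/L at z = 10 m
```

The Kansas datasets were used to determine the empirical similarity functions of z/L that relate dimensionless gradients and variances to this stability parameter.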
Subsequent refinements in sonic anemometers, along with other instruments were incorporated in follow-on deployments. These included the Minnesota Experiment in 1973 that utilized tethered balloon-borne sensors to probe the entire PBL and a long-term deployment at the Boulder Atmospheric Observatory (Kaimal and Gaynor 1983), which featured a 300-m instrumented tower, starting in 1978 and continuing until it was shut down in 2016. A similar 213-m instrumented tower at Cabauw, the Netherlands, started operating in 1972 and continues today (Monna and Bosveld 2013). These towers have been used for a variety of studies and have played a major role in documenting the mean and turbulent structure of the PBL up to the heights of the towers for both stable and unstable stratification.
b. Turbulence measurements from mobile platforms
Aircraft measurements of mean thermodynamic variables in the PBL go back to the early twentieth century, not long after airplanes came into general use. Accurate measurement of all three wind components, including turbulent fluctuations, came later because of the inherent difficulty of measuring a vector quantity from a mobile platform: this requires measuring the platform’s location, velocity, and angular orientation and calculating the difference between the platform velocity and the velocity of the air relative to the platform in an Earth-based coordinate system. The mean horizontal wind along the longitudinal axis of the airplane can be approximately estimated from the difference between the aircraft speed with respect to Earth’s surface (the ground speed) and the speed with which it moves through the air (the true airspeed), and the mean horizontal wind normal to the longitudinal axis can be estimated from the aircraft drift angle (the angle between the longitudinal axis of the aircraft and the flight direction). The true airspeed is obtained by measuring the difference between the pressure in a pitot tube (where the air has been compressed as it is slowed in the tube) and the static (undisturbed) air pressure (e.g., Wendisch and Brenguier 2013); this difference is referred to as the pitot-static pressure difference. Fujita (1966) described how to implement this approach for measuring mesoscale wind fields using a navigational Doppler radar, a technology first developed in the 1950s (e.g., Tull 1996).
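The airspeed and along-track wind estimates described above can be sketched as follows, using the low-speed (incompressible) pitot-static relation; the density and pressure values are illustrative:

```python
import math

def true_airspeed(dp, rho):
    """True airspeed (m/s) from the pitot-static pressure difference dp (Pa)
    and air density rho (kg/m^3), low-speed incompressible approximation:
    TAS = sqrt(2 * dp / rho)."""
    return math.sqrt(2.0 * dp / rho)

def along_track_wind(ground_speed, tas):
    """Mean wind component along the aircraft's longitudinal axis (m/s);
    positive means a tailwind (ground speed exceeds true airspeed)."""
    return ground_speed - tas

rho = 1.0                     # air density, kg/m^3 (illustrative)
dp = 5000.0                   # pitot-static pressure difference, Pa
tas = true_airspeed(dp, rho)  # sqrt(10000) = 100 m/s
wind = along_track_wind(ground_speed=110.0, tas=tas)  # 10 m/s tailwind
```

At jet speeds a compressibility correction is required, but the incompressible form conveys the principle.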
An early effort at measuring vertical air velocity fluctuations was carried out by Bunker (1955), who used an accelerometer to measure vertical acceleration and a gyroscope to measure the departures of the hard-mounted accelerometer from vertical. He then estimated the vertical air velocity from the gust response characteristics of the aircraft and combined this with true airspeed fluctuations estimated from pitot-static pressure differences to also estimate vertical momentum flux along the longitudinal axis of the aircraft. Also in the mid-1950s, a turbulence measuring system was used on a McDonnell FH-1 (an early all-jet airplane) to measure vertical velocity spectra in the PBL. This system used either a rotating vane or a differential pressure probe mounted on a nose boom to measure the airplane attack angle, an accelerometer, and a rate gyroscope (Lappe and Davidson 1963). MacCready (1964), starting in the early 1960s, used a different approach to measure turbulence intensity: he disregarded the long-wavelength contributions to the longitudinal air velocity fluctuations by bandpass filtering the output of an airspeed sensor. This allowed him to estimate a standard measure of turbulence, the eddy dissipation rate (EDR; the rate at which turbulent kinetic energy cascades from eddy breakdown into smaller scales until it is converted into heat by viscosity), using the Kolmogorov inertial subrange hypothesis.
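MacCready’s approach rests on Kolmogorov’s inertial-subrange form of the longitudinal velocity spectrum, E₁₁(k) = c₁ ε^(2/3) k^(−5/3). A minimal sketch of inverting that relation for the dissipation rate ε; the constant c₁ ≈ 0.5 is an empirical value that varies somewhat in the literature:

```python
def edr_from_spectrum(E11, k, c1=0.5):
    """Eddy dissipation rate eps (m^2/s^3) from the longitudinal velocity
    spectral density E11 (m^3/s^2) at wavenumber k (rad/m) within the
    inertial subrange, inverting E11(k) = c1 * eps**(2/3) * k**(-5/3).
    c1 ≈ 0.5 is an empirical constant (literature values vary slightly)."""
    return (E11 * k ** (5.0 / 3.0) / c1) ** 1.5

# Round trip: synthesize a subrange spectral value for a known eps,
# then recover eps from it.
eps_true = 1.0e-3
k = 1.0
E11 = 0.5 * eps_true ** (2.0 / 3.0) * k ** (-5.0 / 3.0)
eps = edr_from_spectrum(E11, k)   # recovers ≈ 1.0e-3
```

Bandpass filtering the airspeed signal isolates wavenumbers within the subrange, so the filtered variance depends only on ε and the filter band.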
The next step in complexity and accuracy for vertical velocity measurement came in the early 1960s in Australia, with the development of a system that combined a nose-boom-mounted vane with a free gyroscope and a vertically stabilized (using signals from the free gyroscope) accelerometer (Telford and Warner 1962). This approach reduced errors present in previous systems because of the varying contribution of gravity to the measured acceleration resulting from attitude angle variations. They also incorporated fast-response dry- and wet-bulb thermometers to measure heat and water vapor fluxes.
In the late 1960s, inertial navigation systems (INSs), with much improved accuracy and reduced drift rates, began to be used to measure the translational and rotational airplane motions, as well as the absolute location of the aircraft (e.g., Lenschow 1972). An INS provides location and motion data in an Earth-referenced system, and INSs became standard systems for aircraft navigation near the midpoint of the last 100 years. Today, global positioning system (GPS) receivers are also utilized in combination with inertial measuring units to provide a lighter and less expensive alternative to INSs. The combination of INS and GPS, and the more recent development of a laser-based air motion sensing system, have led to even more accurate technology for measuring both mean and turbulent fluctuations in the three wind components from aircraft (Cooper et al. 2016).
One disadvantage of manned aircraft is that their minimum safe flight altitude may not allow direct measurements in the stably stratified boundary layer and the surface layer of the convective boundary layer. One way to address this is the use of UAVs. The development and utilization of UAVs are increasing rapidly. Some can fly lower and slower and cost much less to deploy than manned aircraft, but with typically smaller payload and power capabilities and more limited deployment options. There are now miniaturized systems and instrumentation available using similar measurement techniques as for larger manned aircraft that can be used to measure air velocity from UAVs, but with less accuracy. UAVs have been deployed from land-based sites, aircraft, and ships. Over the ocean they have been used not only for measuring mean and turbulent atmospheric variables but also ocean surface wave structure (Reineman et al. 2016).
3. Upper-air observing systems
Until the end of the nineteenth century, atmospheric observations had been taken largely from the surface, except for a few occasional observations from manned hot-air or gas balloons. As discussed below, the past 100 years have seen a major increase in the capabilities for upper-air measurements, with concurrent advances in our understanding of the atmosphere and our capabilities to provide upper-air data for a variety of meteorological applications.
a. From kites to radiosondes: The development of modern upper-air observations
In 1894, Abbott Lawrence Rotch, who 10 years earlier had founded the Blue Hill Observatory near Boston, Massachusetts, flew the first kite equipped with a recording instrument to measure a profile of the atmospheric conditions. This moment marks the beginning of systematic upper-air observations. Balsley et al. (1998) summarized kite meteorological applications, which continued to be used by the U.S. Weather Bureau for routine meteorological measurements until 1933, when aircraft assumed that role. An example of the application of kites is given by Gregg (1922), who describes the thermodynamic and dynamic structure of the lower atmosphere from a network of kite sounding stations east of the Rocky Mountains.
In 1899 Richard Assmann founded the Aeronautical Observatory at Reinickendorf near Berlin, Germany, which in 1905 was relocated to Lindenberg, Germany, and which continues today as a preeminent institute for atmospheric observations. Assmann used kites and tethered balloons but also free-flying “registering” instruments, for which he had invented rubber balloons (Fergusson 1909). The instruments used in these measurements were sophisticated barothermohygrographs, better known at the time as meteorographs. They typically measured pressure using aneroid cans, temperature using bimetal strips, and humidity using hair hygrometers. Measurements were recorded on mechanical strip charts, and in the case of free-flying registering balloons, the instruments had to be recovered to retrieve the data. Registering balloons and the simpler pilot balloons, which did not carry any instruments, were tracked by optical theodolite to measure wind profiles aloft.
From about 1925 to 1943 the U.S. Weather Bureau and Army Air Corps operated a network of up to 30 aircraft sounding stations across the country that were used to profile the lower atmosphere. However, like kites, the aircraft could not be flown in poor weather and data were available only after the soundings were completed. The airplanes carried simple meteorographs, which recorded temperature, pressure, and humidity in much the same manner as the balloon-borne meteorographs of that period (Bemis 1951).
By the early decades of the twentieth century, a network of upper-air stations had been established in Europe and in the United States, performing regular soundings for weather forecasting. In Japan, Wasaburo Ooishi founded the first meteorological upper-air observatory at Tateno in 1920 (Lewis 2003). Ooishi had traveled to Lindenberg in 1911 to learn about upper-air observations and returned to Japan in 1913. Because of World War I, the creation of his observatory was significantly delayed; however, by the mid-1920s, Tateno was a well-established observatory. Between March of 1923 and February of 1925, Tateno measured 1288 profiles, which is nearly two soundings by kite or pilot balloon per day. Ooishi observed unusually strong winds in the upper troposphere, which he published in Esperanto (Ooishi 1926). This publication is effectively the first climatology of what 13 years later Heinrich Seilkopf called Strahlströmung (Seilkopf 1939) or “jet stream” in its English translation. In Europe, Vilhelm Bjerknes used the network of weather stations that he and his colleagues established throughout Norway (e.g., Fiolek 2004) with coordinated ascents at 18 stations in a set of four different case studies. The last of these (Bjerknes and Palmén 1937) also found narrow regions of high wind speeds and was able to put these into a much more theoretical context. Further approaches for a theoretical explanation of the jet stream followed in the 1940s (University of Chicago, Department of Meteorology 1947; Rossby 1947; Riehl 1948).
Profiling by kites was very labor-intensive work. Despite a number of improvements (e.g., Grund’s self-regulating kite), escaping kites were a common and dangerous occurrence. Furthermore, observations by kites were limited to the lowest few kilometers of the atmosphere; typical operational profiles at Lindenberg generally reached up to 4 km. In 1919 a string of 8 kites raised at Lindenberg reached an altitude of 9740 m, a record that still stands today (Adam et al. 2005). Measuring the atmosphere above that altitude required flying registering devices on free-flying balloons, which were not always recovered. This motivated the development of instruments that could radio transmit their data rather than recording it on paper charts that had to be recovered. Early development of balloon-borne transmitters for meteorological work started at Lindenberg in the early 1920s and was followed by work in the Soviet Union, France, and the United States (DuBois et al. 2002). In 1924, William Blair at the U.S. Signal Corps laboratory at McCook Field, Ohio, built and flew what can be considered the first radiosonde, which, however, included only a temperature measurement. This work was abandoned when Blair was reassigned and was not published until 1931. The first published radiosonde launch was likely conducted in 1929 by Robert Bureau in France, who also coined the term “radiosonde.” His first sonde also contained only a temperature sensor, but he soon added an aneroid pressure sensor. In 1930 Pavel Moltchanoff built a similar design in Russia, and Paul Duckert in Lindenberg followed with a design that was similar to Blair’s. Duckert soon added a sensor for humidity to complete the set of sensors still used in today’s radiosondes. In 1930, Vilho Vaisala, a Finnish engineer, also designed a radiosonde that used capacitive sensors for pressure, temperature, and humidity and combined these with fixed capacitive references to characterize the transmitter.
His design was not only innovative but also less costly. This basic design principle lasted through all of Vaisala’s radiosondes for nearly 80 years.
Many radiosonde designs followed and allowed a significant expansion of the upper-air network. By the late 1940s almost all kite-based profiling had transitioned to observations by radiosondes (Adam et al. 2005). Measurements of winds, however, still required tracking of the balloon by optical theodolite, which limited wind measurements to fair weather or below cloud base. The invention of radio theodolites and radar tracking offered two alternative solutions to this problem, which would both be weather independent. Both systems for radio wind finding were implemented by different manufacturers and some are still in use today. Later wind-finding systems were based on radio navigation systems such as loran and Omega, which have meanwhile been replaced by Global Navigation Satellite Systems (GNSS), such as GPS or Galileo. These systems allow measurements of temperature and winds well into the stratosphere independent of weather and are one of the backbones of the global upper-air observing systems.
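GNSS wind finding exploits the fact that a sonde drifts passively with the air, so successive position fixes give the horizontal wind directly. A flat-Earth sketch with illustrative coordinates (real systems additionally use GNSS Doppler velocities and apply smoothing, which this omits):

```python
import math

EARTH_RADIUS = 6.371e6  # mean Earth radius, m

def gnss_wind(lat1, lon1, lat2, lon2, dt):
    """Horizontal wind (u eastward, v northward, m/s) from two GNSS fixes
    (degrees) taken dt seconds apart, small-separation flat-Earth
    approximation. Assumes the sonde drifts passively with the air."""
    lat_mid = math.radians(0.5 * (lat1 + lat2))
    v = EARTH_RADIUS * math.radians(lat2 - lat1) / dt
    u = EARTH_RADIUS * math.cos(lat_mid) * math.radians(lon2 - lon1) / dt
    return u, v

# Two fixes 10 s apart (coordinates illustrative): drift toward the NE.
u, v = gnss_wind(52.2100, 14.1200, 52.2109, 14.1215, dt=10.0)  # each ≈ 10 m/s
```

The same differencing principle underlay theodolite and radio-theodolite wind finding; GNSS simply made the position fixes weather independent and far more precise.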
Jean Piccard was the first to develop lightweight plastic balloons (Winker 1986), which could carry heavy payloads into the stratosphere. However, World War II interrupted most of these developments. In 1945, the German engineer Otto C. Winzen partnered with Jean Piccard to make plastic balloons using polyethylene for manned stratospheric flights. These balloons provided the capacity to lift heavy instruments into the stratosphere and to sample a part of the atmosphere that was previously out of reach. In 1961, the National Scientific Balloon Facility (NSBF) was founded and initially operated by the National Center for Atmospheric Research (NCAR) to provide scientists access to altitudes above more than 99% of the atmosphere. These stratospheric balloons initially focused on astrophysics and astronomy; however, their value for stratospheric chemistry was almost immediately recognized (NRC 1976).
b. Solving the problem of ocean soundings: The dropsonde
In situ soundings over the oceans, which cover two-thirds of our planet, have been notoriously difficult to obtain because of the logistical effort of launching soundings at sea, especially in storms. Nevertheless, these measurements are most needed when storms threaten populated regions on land. In 1943, Colonel Joe Duckworth flew into the eye of a hurricane near Houston, starting the era of hurricane reconnaissance. However, since flights through severe storms are inherently dangerous for aircraft, adding measurements by launching instruments from aircraft into storms became attractive. In the late 1960s, NCAR developed a dropsonde system that could be deployed from aircraft into thunderstorms (Bushnell et al. 1973). Wind finding based on the Omega global navigation system was added in the early 1970s (Govind 1975), followed by loran wind finding and later GPS and more advanced sensors (Hock and Franklin 1999; Fig. 2-4). Dropsondes have become an essential instrument for hurricane surveillance and have significantly improved the ability to forecast how hurricanes develop and which regions they may impact (Burpee et al. 1996). They fill an important gap for many research programs that rely on targeted observations in data-sparse regions.
c. Upper-air observations and the stratosphere
The analysis of stratospheric winds from radiosonde launches at Nairobi, Kenya; Kanton Island, Republic of Kiribati; and Christmas Island between 1955 and 1960 (McCreary 1959; Reed et al. 1961) showed that the wind reversal descends with time throughout the lower stratosphere in what became known as the quasi-biennial oscillation (QBO; Ebdon and Veryard 1961). A first theoretical explanation of the QBO was provided by Lindzen and Holton (1968), who proposed a new idea of wave–mean flow interaction (Fig. 2-1).
The existence of the ozone layer in the stratosphere has been known since the early measurements of the solar spectrum at Earth’s surface by Fabry and Buisson (Fabry and Buisson 1913). Chapman (1930) proposed that ozone is formed in a photochemical cycle through the reaction of atomic oxygen and molecular oxygen. Without further observations, this was the accepted explanation for stratospheric ozone for over 40 years. In 1968, David Murcray used an NSBF stratospheric balloon to fly a solar infrared absorption spectrometer into the stratosphere, where he discovered nitric acid (Murcray et al. 1968). Paul Crutzen showed that even in the parts-per-billion range nitric acid is an indicator that stratospheric ozone may be catalytically destroyed by a photochemical cycle involving the nitrogen oxides NO and NO2. This important discovery rectified the deficiencies of the Chapman cycle and brought the theoretical calculations of stratospheric ozone into good agreement with stratospheric observations. Spurred by concerns about the impact of nitrogen oxide emissions by planned supersonic air traffic and later the impact of chlorofluorocarbons on stratospheric ozone, extensive observations of the stratospheric chemical composition took place using large scientific balloons as one of the essential platforms. With the discovery of the Antarctic ozone hole and the recognition that heterogeneous chemistry plays an important role in stratospheric chemistry, observations of aerosols, volcanic ash, and polar stratospheric clouds gained high importance. Instruments to measure condensed matter in the stratosphere, developed by Rosen and Hoffmann, were flown on large and small scientific balloons in all climate regions (Hoffmann et al. 1972).
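The catalytic destruction that Crutzen identified can be written in its standard textbook form (shown here for illustration): the nitrogen oxides consume odd oxygen while being continuously regenerated,

```latex
\begin{align*}
\mathrm{NO} + \mathrm{O_3} &\longrightarrow \mathrm{NO_2} + \mathrm{O_2}\\
\mathrm{NO_2} + \mathrm{O} &\longrightarrow \mathrm{NO} + \mathrm{O_2}\\[2pt]
\text{net:}\qquad \mathrm{O} + \mathrm{O_3} &\longrightarrow 2\,\mathrm{O_2}
\end{align*}
```

Because NO is recycled rather than consumed, even parts-per-billion abundances of nitrogen oxides can destroy far larger amounts of ozone, which is why the detection of nitric acid was so significant.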
In addition to remote sensing techniques, monitoring of stratospheric ozone requires frequent in situ observations. Few instruments for small balloons were available in the 1960s, when Walter Komhyr designed the Electrochemical Concentration Cell ozonesonde (Komhyr 1969). This instrument has been used extensively on small rubber and plastic balloons and is recognized as the in situ reference instrument for vertical profiles of ozone in the troposphere and lower to middle stratosphere. It is the only in situ instrument that provides accurate measurements of the vertical extent of the annual ozone hole.
In the early 1940s, Alan West Brewer made the first observations of stratospheric water vapor on meteorological research flights of the Royal Air Force over England. These measurements were largely in support of WWII military efforts to understand the formation of aircraft contrails. However, they revolutionized the understanding of stratospheric dynamics. He and Gordon M. B. Dobson deduced that air from the troposphere is injected into the stratosphere in the tropics (Dobson et al. 1946; Brewer 1949). Their model was the first description of the general stratospheric circulation. However, it took another 20 years before stratospheric water vapor measurements in the tropics could be taken. In the late 1950s, work on stratospheric frost-point hygrometers had been taking place at a number of locations. John Mastenbrook at the Naval Research Laboratory developed a cryogenically cooled frost-point hygrometer small enough to be launched on rubber balloons (Mastenbrook and Dinger 1961). Unlike measurements of temperature, water vapor measurements on ascent were usually contaminated by the water vapor carried aloft with the balloon and the payload itself. To avoid this issue, he developed a method to release gas from small rubber balloons, which allowed water vapor measurements on descent. This instrument was flown at several Pacific islands, in India, and at Trinidad (Mastenbrook 1965, 1966), where he achieved a 2-yr dataset of tropical stratospheric water vapor that would not be repeated for another 40 years. These measurements confirmed the stratospheric dryness hypothesized by Brewer and Dobson. In 1980 this instrument was transferred to NOAA in Boulder, Colorado, where derivatives of the original instrument are still flown and maintain the longest data series for stratospheric water vapor worldwide (Hurst et al. 2011), a record that is a sensitive indicator of climate change (Solomon et al. 2010).
With the miniaturization of instruments, it is now possible to build lightweight payloads for small sounding balloons that measure a multitude of parameters. The combination of frost-point hygrometers and ozonesondes flown across the tropical tropopause has been used to study the mechanisms for troposphere–stratosphere exchange and Antarctic dehydration (Vömel et al. 1995, 2002). Small versions of cloud backscatter sensors, icing probes, and aerosol counters have been built with sufficient quality to obtain scientific observations with high vertical resolution using small sounding balloons. This technology is highly flexible and can be transported to even the most remote locations.
In situ upper-air observations using small and large balloons continue to play an important role for atmospheric research, weather forecasting, and climate change studies. In particular for small sounding balloons, great emphasis is being placed on how changes in sensor technology, data processing, and operating procedures impact time series of temperature, humidity, and ozone. Standardization of procedures and a detailed quantification of measurement uncertainty are essential elements in providing in situ upper-air observations, which are most useful for advancing forecast systems and of high enough quality to extract small but highly relevant signals of how our climate is changing (Bodeker et al. 2016).
4. Remote sensing observing systems
In contrast to measurements and observations made in situ, many measurements must be made remotely or are best made remotely. For example, remote sensing observing systems on satellites transformed the way we observe the atmosphere during the second half of the past 100 years. Today, space-based remote sensing systems are networked with the surface-based networks of radars, meteorological observations, upper-air observations, and other data sources. Satellite remote sensing systems rely on both passive and active remote sensing techniques and exploit most of the microwave and optical electromagnetic spectrum. The monograph chapter on atmospheric satellite observations (Ackerman et al. 2019) discusses the scientific advances that have resulted from their implementation. Kramer (2002) provides a comprehensive review of these techniques and their associated satellite missions, plus an extensive summary of related field campaigns (on an accompanying CD-ROM). Additional references to satellite techniques and applications are discussed below.
Figure 2-5 provides some examples of common remote sensing techniques and the portion of the electromagnetic spectrum they employ. Many of these techniques have been developed relatively recently in the past 100 years and are expected to continue to advance in their capabilities and uses. Radar is undoubtedly the most widely used remote sensing application, and, although it might be considered a mature technology, it has continued to evolve and expand its applications as discussed in section 4a. Newer active remote sensing technologies, such as lidar, appear to be following a similar trajectory, using sophisticated techniques to exploit the information content from their area of the electromagnetic spectrum, which are described in section 4b. Finally, passive remote sensing technologies are widely used in space but also for many surface and airborne applications. Examples of some popular techniques are described in section 4c.
a. Meteorological radar systems
Radar observations have made significant contributions to more than half of the topics covered in other chapters of this volume. The history of radar and its contributions to meteorology is well documented in the published literature. Essays appear in Bigler (1981), Hitschfeld (1986), and Rogers and Smith (1996); more detailed reviews covering the technology, applications, and history can be found in Atlas (1964, 1990) and Whiton et al. (1998a,b). Here we cite only a few key aspects of the evolution of meteorological radar and its contributions to the broader field of meteorology itself. The section on research aircraft observing systems (section 6d) discusses the specialized use of meteorological radars on research aircraft.
As discussed in the introduction, radar observing systems have had major impacts on both the science and the applications of meteorology. Radar provides critical information on the evolution of storms to distances of up to a few hundred kilometers, making nowcasting possible and more recently impacting numerical weather prediction. Storm avoidance radars on aircraft have made air travel much safer and more comfortable. Radars focused on observations of clouds (that may or may not contain precipitation) have contributed to research in cloud physics and climate studies. Radar observations led to the whole field of mesoscale meteorology (Ligda 1951). On a larger scale, composites compiled from observations by many radars (such as the “Ligda montages”; see also Carbone et al. 2002) show how precipitation echoes relate to the synoptic-scale structure of the atmosphere. Radar climatologies show spatial and temporal distributions of storm events and such things as frequency of occurrence of high-impact events or preferred locations of storm formation. Radar rainfall climatologies contribute to hydrology, climate science and even distant fields such as the engineering design of public works; and satellite-borne radar systems are producing estimates of global precipitation climatologies.
The initial recognition of microwave radar as a weather observing system took place during World War II. Details of who, when, and where are somewhat obscured in the mists of wartime secrecy, but useful reviews of the early Allied work appear in Atlas (1990) and Whiton et al. (1998a). Radar-like techniques had been used to measure the height of the ionosphere in the 1920s, and radiolocation of thunderstorms using sferics had already occurred prior to the war. The earliest applications of radar for meteorological observations involved tracking balloons carrying corner reflectors or similar strong targets for wind-finding purposes. Weather echoes were seen on radar (wavelength 10 cm) in England around the beginning of 1941 and in the United States only a little later. Concern about the effect of clouds and precipitation on microwave propagation had led Ryde (1946) to produce an analysis of the scattering properties of hydrometeors and the results explained how the echoes originated. Over the next few years radars installed for other purposes (or sometimes intended primarily for weather observations) provided weather support for varied military operations. The development of stratiform precipitation by a series of snow-generating cells aloft, and the “brightband” region of stronger echo where snow melts into rain, were discovered in this era. The first paper in an AMS journal detailing radar observations of weather appeared in the December 1945 issue of the Journal of Meteorology, just a few months after the end of hostilities (Maynard 1945). Already recognized were the characteristic features of thunderstorms, squall lines, and frontal and hurricane echoes, and the limitations caused by attenuation of microwaves of wavelengths shorter than 10 cm were appreciated.
The Weather Bureau (forerunner of the National Weather Service) had received airborne radars from the U.S. Navy in 1942 for modification and use as ground weather observing systems. A hybrid radar network in Panama was providing weather surveillance in 1944; in addition to operations support, research was carried out on such things as storm movement and storm life cycles. The same year, some weather reconnaissance squadron B-25s in India and elsewhere were equipped with AN/APQ-13 radar sets (wavelength 3.2 cm) for storm detection. Some APQ-13s were modified for ground use at military weather stations, and that use continued into the 1970s (Whiton et al. 1998a). Researchers seized on the capability of radar to provide insight into storm processes, and the importance of radar in the postwar Thunderstorm Project has been discussed in the chapter introduction. The early knowledge of storm characteristics, evolution and movement provided by these systems showed the potential value of radar for storm detection and tracking in the civilian world.
With the wartime experiences as background, the Weather Bureau acquired a number of surplus military 10-cm radars following the end of hostilities. They were modified for weather surveillance and formed the beginning of the Basic Weather Radar Network (BWRN). These were relatively low-powered radars with wide antenna beams, which limited their usefulness for operational purposes; however, they quickly proved their value. Several universities had also acquired surplus radars, and the first “hook echo” characterizing tornadic (and often other severe) storms was identified on a modified airborne 3.2-cm radar at the Illinois State Water Survey. The techniques of estimating echo intensity by varying the attenuation in the receiver or using “isoecho contouring” to reveal details of storm structure were developed to begin making radar into a quantitative observing instrument. Information about the distributions of raindrop sizes in the atmosphere was developed to connect the echo intensity to characteristics of the rainfall and then to determine relationships between the measured reflectivity and the associated rainfall rate. An important milestone that occurred in 1947 was the first Weather Radar Conference held at the Massachusetts Institute of Technology (MIT)—a pioneering stage in the series of specialty conferences held by AMS on radar and other topics since the 1950s. It is worth noting that over the first dozen or so of these conferences, documentation in the conference proceedings (later termed “preprints,” and then abandoned in the Internet era) was considered by many tantamount to publication. Consequently, most of the early advances in radar meteorology appeared in these proceedings a few years prior to (if at all in) any journal publication.
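The reflectivity–rainfall connection described above is conventionally expressed as a power law, Z = aR^b. As an illustration only (not any particular operational algorithm), a minimal sketch using the classic Marshall–Palmer coefficients (a = 200, b = 1.6) might look like:

```python
import math

def rain_rate_from_dbz(dbz, a=200.0, b=1.6):
    """Invert a Z = a * R**b relation: Z in mm^6 m^-3, R in mm h^-1.
    Default coefficients are the classic Marshall-Palmer values."""
    z_linear = 10.0 ** (dbz / 10.0)      # convert dBZ to linear reflectivity
    return (z_linear / a) ** (1.0 / b)   # solve the power law for R

# A 40-dBZ echo corresponds to roughly 11.5 mm/h with these coefficients.
rate = rain_rate_from_dbz(40.0)
```

Operational systems use coefficients tuned to precipitation type (and, today, polarimetric information), but the structure of the calculation is the same.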
None of the early radars had been designed for weather observations; the first one so intended was the AN/CPS-9, already conceived before the end of the war and eventually coming into service in the mid-1950s. This 3.2-cm system with a 1° beamwidth providing high-resolution observations was installed at many military facilities as well as at some research laboratories, training facilities, and universities. In the mid-1950s, hurricanes struck the East Coast, and the limited radar observations available over the area stimulated the development of the 10-cm WSR-57 radar (the “57” signifying the year the design was completed) to expand and improve the BWRN. While the design focused on hurricane observation, units were also installed in the Midwest to provide tornado detection and warning. The “weak-echo region” indicating the presence of strong updrafts was recognized as a significant characteristic of many severe thunderstorms.
The advent of the transistor in the 1950s and the stimulus of the space program in the 1960s led to great advances in miniaturization and reliability of electronics. By the mid-1960s, components for 5.5-cm radars were widely available; this wavelength, longer than the 3.2-cm systems, suffers less attenuation by precipitation and can provide similar beamwidths at less cost than 10-cm systems. The U.S. Air Force acquired 5.5-cm AN/FPS-77 weather radars in the mid-1960s, and by the end of the decade television stations were beginning to acquire 5.5-cm radars to support their weathercasts. In the 1970s, the NWS also acquired 5.5-cm WSR-74C radars to supplant the aging World War II radars still in use as local warning radars. Many transportable (and more recently mobile) radars operating at 5.5 cm, and sometimes at 3.2-cm or shorter wavelengths, have become available to the research community for studies of cloud-, storm-, and mesoscale processes (often carried out in conjunction with research aircraft and other ground observing facilities).
In the same time frame, computer technology evolved beyond the mainframe category, and digitization and computer processing of radar data came into practice. The improving technology soon allowed digitization of actual signals within the radar system, and signal processing has become an important part of weather radar development. Battan (1973) gives a good summary of the early work and summaries of more recent developments appear in Doviak and Zrnić (1993) and Bringi and Chandrasekar (2001). Today it is possible to manipulate both the transmitted signal (e.g., by coding the phases of the transmitted pulses) and the received echoes (e.g., by adding random phases to the echoes to “whiten” the signals); such techniques help improve the scanning speeds and measurement accuracies. The measured variables can then be processed in a wide variety of algorithms designed to elucidate features of the echo patterns useful to the meteorologist. Such algorithms identify features like the mesocyclone or the tornado vortex signature and also indicate potentially hail-bearing storms. Data processing and display systems like the Warning Decision Support System–Integrated Information (WDSS-II) are able to blend data from multiple radars with data from other sources such as satellites and surface stations to provide a more comprehensive picture of the current weather situation.
Prior to these developments, the measurement focus of operational weather radars was echo intensity and the inference of rainfall rates and accumulations. But Doppler radar techniques to measure target motion had been developed in the war, and the potential for measuring winds was recognized. A “pseudo-Doppler” system observed vertical wind (and precipitation) motions in the United Kingdom in the early 1950s. The first attempts to observe horizontal winds used cumbersome dual-antenna continuous-wave (CW) systems with limited range and lacking range resolution; however, one measured tornado winds in excess of 200 mph (322 km h−1; Smith and Holmes 1961). “Pulsed-Doppler” systems providing both range and velocity data soon appeared (e.g., Lhermitte 1962), though they taxed the analog technologies of the day. The advent of digital signal-processing capabilities provided quantitative Doppler velocity data, but the radars observe only the radial wind component, and attempts to display the wind field in a meaningful way were marginally successful at best. However, in conditions with horizontal gradients that were not too strong, velocity–azimuth display (VAD) scans did provide reasonable estimates of wind and divergence profiles above the radar site.
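The VAD idea can be sketched simply: at low elevation angle, the radial velocity around an azimuth circle is a sinusoid whose amplitude and phase give the horizontal wind. A hypothetical least-squares version (ignoring elevation angle, hydrometeor fall speed, and the divergence term that a full VAD analysis also retrieves) is:

```python
import math

def vad_fit(azimuths_deg, radial_velocities):
    """Least-squares fit of Vr(az) = u*sin(az) + v*cos(az) around one
    azimuth circle; returns (u, v), the eastward and northward wind
    components at the height sampled by the scan."""
    sss = scc = ssc = svs = svc = 0.0
    for az, vr in zip(azimuths_deg, radial_velocities):
        s, c = math.sin(math.radians(az)), math.cos(math.radians(az))
        sss += s * s
        scc += c * c
        ssc += s * c
        svs += vr * s
        svc += vr * c
    det = sss * scc - ssc * ssc          # normal-equation determinant
    u = (svs * scc - svc * ssc) / det
    v = (svc * sss - svs * ssc) / det
    return u, v

# Synthetic check: a pure 10 m/s westerly wind (u = 10, v = 0).
az = list(range(0, 360, 10))
vr = [10.0 * math.sin(math.radians(a)) for a in az]
u, v = vad_fit(az, vr)
```

Repeating the fit for rings at successive ranges (and hence heights) yields the wind profile above the radar site described in the text.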
To obtain vector winds required dual-Doppler systems, with attendant challenges of data processing limitations and the constraint to storms that were not too far from the pair and not too close to the baseline. The requirement for two radars spaced a few tens of kilometers apart limited use of dual-Doppler systems to research projects, but many aspects of storm-scale wind fields—such as the mesocyclone and the rear-flank downdraft—were revealed by these studies (e.g., see the review by Markowski 2002). A major advance occurred when color displays arrived in the mid-1970s; now the full radial wind field from a plan position indicator (PPI) azimuth scan could be presented in a readily interpretable way. This quickly led to recognition of patterns such as the tornado vortex signature and the interpretation of patterns observed in both widespread and convective precipitation in meaningful ways (Wood and Brown 1983). Signal-processing technology was advancing rapidly and research groups and television stations were early adopters of the color displays and Doppler technology. Coming at just the time when the need for replacing the aging WSR-57 radars was being recognized and offering potential for improved weather surveillance, this capability was incorporated in the WSR-88D (NEXRAD) Doppler radar system procured for use by the NWS as well as the Federal Aviation Administration (FAA) and the U.S. Air Force (USAF). The system was deployed in the 1990s (Crum and Alberty 1993) and continues to provide the backbone of the BWRN. A major accomplishment has been the radar contribution to the improvement in tornado warnings (Simmons and Sutter 2005), which was a prime motivation for adoption of the system.
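The geometric core of a dual-Doppler synthesis is a small linear system: each radar measures the projection of the wind onto its own beam direction, and two sufficiently different viewing angles determine the horizontal wind. A toy sketch (neglecting vertical motion and elevation angle; the function name and scenario are illustrative):

```python
import math

def dual_doppler_wind(vr1, az1_deg, vr2, az2_deg):
    """Recover the horizontal wind (u, v) from two radial velocities.

    Each az is the azimuth (degrees from north) of the beam pointing
    from that radar toward the common target.  Solves
        vr_i = u*sin(az_i) + v*cos(az_i)
    by Cramer's rule for the two radars i = 1, 2.
    """
    s1, c1 = math.sin(math.radians(az1_deg)), math.cos(math.radians(az1_deg))
    s2, c2 = math.sin(math.radians(az2_deg)), math.cos(math.radians(az2_deg))
    det = s1 * c2 - s2 * c1   # near zero when the beams are nearly parallel,
    u = (vr1 * c2 - vr2 * c1) / det   # i.e., for targets near the baseline,
    v = (s1 * vr2 - s2 * vr1) / det   # where the retrieval breaks down
    return u, v

# Target due east of radar 1 (az 90) and due north of radar 2 (az 0):
# a southwesterly wind u = v = 5 m/s projects fully onto each beam.
u, v = dual_doppler_wind(5.0, 90.0, 5.0, 0.0)
```

The determinant going to zero for nearly parallel beams is the algebraic expression of the geometric constraint noted above: storms too close to the baseline (or too far from the pair) cannot be analyzed.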
The year 1990, with the publication of Radar in Meteorology (Atlas 1990), provides a convenient demarcation point. With development of the NEXRAD system proceeding, preparation was also underway for launch of the first spaceborne weather radar system on the Tropical Rainfall Measuring Mission (TRMM) satellite (Kummerow et al. 1998). Previous satellite-derived estimates of precipitation rates had been based on indirect radiometric techniques (e.g., Arkin and Ardanuy 1989). The TRMM Precipitation Radar (PR), launched in 1997, was a 2.2-cm system with a phased-array antenna that electronically scanned a 215-km-wide swath across the satellite track. The PR provided direct observations of the vertical structure of precipitation with a footprint pixel size < 5 km. The TRMM orbit extended the available radar observations of precipitation rates over the globe between latitudes 35°N and 35°S, including ocean areas previously accessible only via occasional ship visits. The satellite also carried an imaging microwave radiometer operating at five wavelengths from 2.8 cm down to 3.5 mm (with four wavelengths using dual polarization), along with other sensors (Kummerow et al. 1998; Stephens and Kummerow 2007). Synthesis of data from the multiple observing systems enhanced the capabilities of the analysis techniques (e.g., Grecu and Olson 2006; see also Ackerman et al. 2019).
The TRMM satellite functioned for 17 years and was followed by the launch of the Global Precipitation Measurement (GPM) Core Observatory satellite in 2014. This satellite carries a dual-frequency precipitation radar operating at wavelengths 2.2 and 0.86 cm (Hou et al. 2014). The 2.2-cm radar scans a slightly wider swath and extends the time history of the TRMM PR observations; the 0.86-cm radar has the same footprint size (about 5 km), but the swath is about half as wide. The GPM radar system has better sensitivity to light rainfall rates than the TRMM PR, and the dual-frequency data add a capability to estimate raindrop size distributions (Williams et al. 2014) in the swath overlap region. Attenuation by the precipitation is a significant issue, and substantial effort is required to correct for this attenuation to achieve the desired accuracy in the rainfall rate estimates (e.g., Seto and Iguchi 2015).
Meanwhile, investigation of the polarization properties of precipitation echoes had begun in the 1950s—though for meteorologists the technology of the era was limited and interest subsided. The polarization of the transmitted microwaves is established in the antenna system, and information about nonspherical particles such as ice particles or large raindrops can be gleaned from examining the echoes at different polarization states. “Precipitation-clutter suppression” in such things as air traffic control radars, on the other hand, was making use of a property of circular polarization that can reduce the strength of echoes from spherical drops in relation to those from aircraft or similar irregular targets. Renewed study occurred with the advent of satellite communications, where it was hoped that channel capacities might be doubled by carrying separate signals on right- and left-hand circularly polarized waves. Unfortunately for that purpose, the presence of nonspherical hydrometeors along the propagation path engenders cross talk between the two signals. Further investigation of the polarization properties of propagation through, and echoes from, precipitation ensued, with much of the early work on circular polarization taking place in Canada (e.g., McCormick and Hendry 1975). From the weather radar standpoint, circular polarization presents two problems; one is that the component of the echo due to depolarization by the particles is typically more than 20 dB weaker than the main component. This means that the depolarized component can only be detected to a range of about one-tenth that of the main component. The second problem is that propagation through the nonspherical particles affects the phases of the horizontal and vertical components of the signal differently. This means that a circularly polarized signal gradually degrades as it travels through precipitation.
Seliga and Bringi (1976) made a substantial advance by reviving the alternative of dual linear polarization, which as explained in Atlas (1990) had been employed by Newell and others in early exploratory work. This approach gains two advantages, one being that measurements of the “differential reflectivity” (difference between the echo intensities measured at horizontal and vertical polarization) give useful information about raindrop sizes without requiring measurement of a weak depolarized signal component. The other is that the degradation of circular polarization now becomes an asset; measuring the “differential phase” between the two propagating linear components gives added information about rain intensity. To be sure, measurement of the depolarized component (when possible) does give information about particle shapes. Technical challenges impeded the early implementations of this concept, which required fast switching between the two transmit polarizations and in effect doubled the scan time. Adoption of the “simultaneous transmit and receive” approach (meaning that both components operate at the same time) eliminated the switching requirement. Intensive research has shown the value of polarimetric radar in elucidating details of the microstructure of precipitation systems, and “hydrometeor classification algorithms” are now available (e.g., Park et al. 2009) to describe storm structure. The polarimetric data help improve quantitative precipitation estimates as well as overall radar data quality. This has led to the recent polarimetric upgrade of the NEXRAD radar systems.
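Both quantities have simple definitions: differential reflectivity is a log ratio of the horizontal and vertical returns, and specific differential phase (K_DP) is half the range derivative of the differential propagation phase. A hedged sketch of the two calculations (the finite-difference K_DP estimator is a simplification; real processing smooths and filters the phase data):

```python
import math

def differential_reflectivity(z_h, z_v):
    """Z_DR in dB from linear reflectivities at horizontal and vertical
    polarization; oblate raindrops give Z_DR > 0, while near-spherical
    drizzle drops or tumbling hail give Z_DR near zero."""
    return 10.0 * math.log10(z_h / z_v)

def specific_differential_phase(phi_dp_deg, range_km):
    """K_DP in deg/km: half the range derivative of the differential
    propagation phase, estimated with a simple finite difference."""
    return [(phi_dp_deg[i + 1] - phi_dp_deg[i])
            / (2.0 * (range_km[i + 1] - range_km[i]))
            for i in range(len(phi_dp_deg) - 1)]

# A 2:1 power ratio between the H and V channels is about 3 dB of Z_DR.
zdr = differential_reflectivity(10000.0, 5000.0)
kdp = specific_differential_phase([0.0, 4.0, 8.0], [0.0, 1.0, 2.0])
```

Because K_DP comes from a propagation phase rather than echo power, it is immune to attenuation and calibration errors, which is part of why the differential-phase measurement proved so valuable for rain-intensity estimation.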
There have been many other types of radar observing systems; two will be mentioned here. The first derives from observations of “clear air” echoes, which were initially referred to as “radar angels.” These were a hot topic of discussion in the early days of weather radar and were subsequently shown to be primarily due to insects and birds (e.g., see the review by Hardy and Katz 1969). (Radar ornithology is a legitimate field of science, as is radar monitoring of insect migrations.) It was also realized that detectable echoes could occur from the optically clear air at longer wavelengths. This led to the development of wind profilers operating at 30-cm wavelength or longer to give continuous profiles up to heights that can reach into the stratosphere (Balsley and Gage 1982; Strauch et al. 1984).
The other type of radar observing system operates at wavelengths as short as 3 mm to provide observations of clouds with reflectivities too low to be observed by operational centimetric radars. The first operational system of this type was the vertically pointing 0.86-cm AN/TPQ-11 used by the USAF to observe clouds above airfields. Developments in the communications industry led to the availability of components at even shorter wavelengths and research with these systems has been quite active (e.g., Lhermitte 1987; Kollias et al. 2007). They are commonly known as “cloud radars,” and compact systems with very high sensitivities can be constructed at reasonable cost. Atmospheric attenuation limits the useful range of these systems, and they are often operated in a vertically pointing mode; however, they can have very narrow beamwidths that provide high resolution. At wavelengths shorter than about 1 cm the scattering properties of raindrops and small ice particles exhibit wavelength variations that are taken advantage of in multiwavelength radar systems to infer properties of the hydrometeors. Cloud radars are also used to study the boundary layer (LeMone et al. 2019).
The short-wavelength systems are especially attractive for use on research aircraft (as described in section 6d) and satellites. The CloudSat mission launched in 2006 has a nadir-looking 3-mm radar system (Stephens et al. 2002; Posselt et al. 2008) using an offset-feed antenna configuration. The radar provides data on the vertical structure of clouds and precipitation, with the analysis incorporating data from other satellites in the “A-Train” (Stephens et al. 2002).
As AMS moves into its second century, exciting new developments in meteorological radar are on the horizon. One is the use of dense networks of linked small 3.2-cm radars to provide high-resolution low-level coverage over urban areas and in complex terrain (Junyent et al. 2010). Comprehensive low-level coverage is a problem for widely spaced surveillance radars because of the masking effects of Earth curvature. The 3.2-cm systems circumvent the precipitation attenuation problem by providing different viewing angles of the same event and can provide rapid updates in severe-storm situations. Another is the increasing capabilities of high-power solid-state microwave devices, which are beginning to replace the previously needed vacuum-tube transmitters in many radar systems with attendant improvements in reliability and life cycle costs. A third is the prospect of converting ground-based radars to phased-array antenna systems that can provide a rapid scan capability with adaptive control along with elimination of the often-troublesome moving parts of the antenna pedestal. These and other technological advances will contribute to more effective, as well as newer, applications of radar data and to further scientific advances.
b. Lidar remote sensing systems
Light detection and ranging (lidar) remote sensing occupies a unique niche in atmospheric observing systems. Although conceptually similar to radar, lidar utilizes shorter wavelengths in the ultraviolet, optical, and infrared regions of the electromagnetic spectrum, enabling investigation of atmospheric aerosols, molecules, chemical species, clouds, and air motion (i.e., winds and turbulence). Because it can sample many atmospheric constituents that are of interest (e.g., aerosols and trace gases), growth of lidar systems is likely to continue. Lidar has not been in use as long as radar, so its history is not as extensive (Fig. 2-1), but it is following a similar trajectory, with improvements in capability, complexity, and impact occurring since Synge (1930) first suggested the concept of measuring light scattered by the atmosphere using multiple searchlight sources intersecting at a common height.
Although searchlight-based probing yielded interesting profiles of atmospheric scatter, the invention of the laser in 1960 (e.g., Bromberg 1988), which offered a more powerful, monochromatic light source, catapulted lidar remote sensing into the mainstream as a tool for atmospheric monitoring. Within three years, application of ruby lasers in elastic-backscatter (i.e., scatter from aerosols and molecules where the wavelength does not change during the scattering process) lidar systems to observe aerosol layers was being reported (e.g., Collis and Ligda 1964; Fiocco and Grams 1964). From the 1960s to the present, demonstrations of new lidar remote sensing have proceeded hand in hand with technological advances in the optical components that make up a lidar system, especially including the laser transmitter and receiver. Over the past 50 years, new sources and detectors with improved performance and robustness have become available across the optical spectrum, stimulating new measurement capabilities.
From these early aerosol studies, elastic-backscatter lidar probing of aerosol and cloud structure has continued to advance as new multiwavelength, depolarization, and filtering techniques have been applied. A fundamental parameter measured by lidar for aerosol studies is the backscatter ratio, defined as the ratio of aerosol to total (aerosol plus molecular) scatter, which is related to aerosol loading and affects transmission and reflection of light in the atmosphere. The capability of lidars to measure the backscatter ratio into the stratosphere has enabled investigation of the injection and decay of aerosols lofted into the atmosphere by volcanic eruptions. The 1982 eruption of El Chichón, to that date one of the strongest volcanic perturbations of the twentieth century, was extensively studied by ground-based lidars (e.g., Jager and Carnuth 1987), which chronicled, from several locations around the world, the evolution, transport, and dispersion of the aerosol plume injected by the volcano. The ground-based observations were complemented by an airborne lidar study in the Southern Hemisphere (McCormick and Osborn 1986), which provided information on backscatter ratio versus altitude and location. Nine years later, experience gained from the El Chichón studies was important in lidar characterization of the ash cloud formed by the Mount Pinatubo eruption in the Philippines, which injected more than twice the mass of sulfate as the El Chichón eruption. Lidar data, combined with satellite measurements and in situ observations using balloon-borne instruments, showed that stratospheric aerosols increased dramatically after the eruption and had a major impact on climate, with lower-stratospheric temperatures increasing and tropospheric temperatures decreasing (McCormick et al. 1995).
Elastic backscatter lidars have been extensively employed from surface, ship-based, and airborne platforms to study pollution-related phenomena such as arctic haze, Saharan dust transport, wildfire plumes, plume dispersion, and pollution sources and transport. Because of the mobility of the platform, airborne studies have been particularly useful for observing plumes of pollution over extended distances. A topic related to pollution studies, the impacts of aerosols and clouds from both anthropogenic and natural sources on Earth’s radiation budget and the resultant effect on climate change, has been a major focus of lidar research over the past several decades. Of primary interest for investigations of the radiative effect of aerosols are their scattering and extinction properties; however, for elastic backscatter lidars the problem is ill constrained in that multiple mathematical solutions exist for a given measured backscattered signal profile. Fernald et al. (1972), Klett (1981, 1985), and Fernald (1984) addressed this limitation by investigating inversion algorithms based on the single-scattering lidar equation and the assumption of a power-law relationship between extinction and backscatter. Under appropriate conditions, where an extinction-to-backscatter ratio (commonly called the lidar ratio) can be assumed based on anticipated aerosol type and where a reference molecular layer is available in the profile, the Klett–Fernald algorithms produce a stable solution for extinction and backscatter profiles. Over the ensuing decades, the Klett–Fernald technique has been applied in numerous studies of anthropogenic and natural aerosol phenomena such as Saharan dust (Gobbi et al. 2003), volcanic ash (Marenco and Hogan 2011), and South American biomass burning (Marenco et al. 2016).
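Under the stated assumptions (an assumed lidar ratio and a known extinction value at a far reference range), the backward Klett solution is stable and compact. The sketch below implements the k = 1 form of Klett (1981), in which backscatter is taken proportional to extinction, and checks it against a synthetic homogeneous atmosphere; the profile values are purely illustrative:

```python
import math

def klett_backward(r, power, sigma_ref):
    """Backward Klett (1981) inversion for the extinction profile sigma(r)
    from raw lidar return power(r), assuming backscatter proportional to
    extinction (power-law exponent k = 1) and a known extinction value
    sigma_ref at the far reference range r[-1].  Ranges in km."""
    n = len(r)
    S = [math.log(power[i] * r[i] ** 2) for i in range(n)]  # range-corrected log signal
    Sm = S[-1]
    expo = [math.exp(S[i] - Sm) for i in range(n)]
    sigma = [0.0] * n
    sigma[-1] = sigma_ref
    integral = 0.0   # trapezoidal integral of expo from r[i] out to r[-1]
    for i in range(n - 2, -1, -1):
        integral += 0.5 * (expo[i] + expo[i + 1]) * (r[i + 1] - r[i])
        sigma[i] = expo[i] / (1.0 / sigma_ref + 2.0 * integral)
    return sigma

# Synthetic check: homogeneous atmosphere with extinction 0.1 km^-1.
sigma0 = 0.1
r = [0.1 + 0.01 * i for i in range(500)]                        # 0.1-5.1 km
power = [math.exp(-2 * sigma0 * ri) / ri ** 2 for ri in r]      # P ~ beta*T^2/r^2
retrieved = klett_backward(r, power, sigma0)
```

Integrating backward from the far reference range is what makes the solution numerically stable; the forward form of the same equation amplifies errors and is rarely used.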
As aerosol lidars evolved to include polarization channels (e.g., Sassen 1991) or multiple-wavelength sources (e.g., Sasano and Browell 1989), the depolarization and/or wavelength dependence of the backscattered signal was used to aid in characterization of specific aerosol or cloud types, which reduced the uncertainties in the inversion introduced by a lack of knowledge of the extinction-to-backscatter ratio.
Early on, researchers recognized that elastic backscatter lidars could be augmented by adding the capability to measure depolarization of radiation scattered by clouds (e.g., Schotland et al. 1971; Derr et al. 1976) and aerosol particles (Shimizu et al. 2004). Polarization sensitive lidars have been used to differentiate between ice and water clouds, investigate ice particle habit and orientation in cirrus clouds, and locate and map regions of supercooled cloud water droplets (e.g., Sassen 1991). For example, Sassen (2002) deployed a polarization lidar to investigate the effects of springtime Asian dust storms on the formation of unusually warm cirrus ice clouds, showing the effect of transported Asian dust aerosols on the radiative properties of clouds several thousand kilometers away. Gobbi et al. (1998) analyzed polarization lidar observations collected at McMurdo Station in Antarctica over a full winter to infer phase and nucleation patterns of polar stratospheric clouds (PSCs). The study indicated that mixed phase PSCs occurred during the full winter chiefly in the 12–20-km-height range where maximum ozone depletion takes place. Recently, a class of elastic backscatter lidars has incorporated the so-called micropulse technique, in which the transmit laser emits low-energy (on the order of tens of microjoules), high-pulse-repetition-frequency pulses, and the receiver incorporates photon counting and a narrowband filter to reduce solar background effects (Spinhirne 1993). Micropulse lidars are eye safe and highly reliable, enabling continuous unattended monitoring for weeks and months without the need for significant maintenance. Researchers from a number of countries have established a network of micropulse lidars to provide continuous, year-round observation of cloud and aerosol profiles (Berkoff et al. 2003) for climate forcing and air quality applications.
The need to improve understanding of the role of aerosols and clouds in the climate system led to development of the Cloud–Aerosol Lidar and Infrared Pathfinder Satellite Observations (CALIPSO) space-based lidar mission (Winker et al. 2010), which commenced in April 2006 with the launch of the CALIOP dual-wavelength (1064 and 532 nm) polarization lidar. As one component of the A-Train constellation of Earth sensors (see Ackerman et al. 2019), the CALIOP lidar observations have provided a global three-dimensional view of the aerosol and cloud structure (Winker et al. 2010, 2013) and significantly improved knowledge of global and local aerosol and cloud properties (e.g., Young and Vaughan 2009; Delanoë and Hogan 2010), transport (e.g., Liu et al. 2008), and radiative effects (Kato et al. 2011). The combination of the CALIOP lidar with the CloudSat 94-GHz nadir-viewing radar has allowed both the aerosol and cloud fields to be mapped from space, an example of the synergistic use of multiple observing systems.
Given the success of CALIPSO, future space-based aerosol/cloud missions are already being planned (Illingworth et al. 2015) to provide improved information on aerosol type and properties by utilizing the High Spectral Resolution Lidar (HSRL) technique (Grund 1991; She et al. 1992; Hair et al. 2001), which can independently retrieve aerosol or cloud extinction and backscatter without a priori assumptions on lidar ratio or aerosol type. Development of HSRL techniques represented a significant step forward in using lidar to characterize aerosol and cloud optical properties and to remotely differentiate between different aerosol types and sources. First demonstrated in the 1980s, the HSRL instrument individually measures the signal backscattered from molecules and aerosols based on its spectral characteristics—because the aerosol backscattered signal is spectrally narrower, it can be separated from the broader molecular signal by means of a filtering element such as an optical etalon or absorption cell. Extinction is then computed based on the deviation of the measured molecular signal profile from that computed from a known atmospheric density profile. Although initially demonstrated as a ground-based instrument, the HSRL methodology has also been used for airborne instruments (e.g., Hair et al. 2008; Groß et al. 2013). Early HSRL deployments, often in combination with cloud radars and microwave radiometers, focused on characterizing cloud optical and microphysical properties (Grund and Eloranta 1990). The basic HSRL implementation has evolved over the past decade to include polarization differentiation and multiple wavelengths spanning the infrared, visible, and UV spectral regions, which enables measurement of aerosol properties including lidar ratio, depolarization ratio, backscatter, extinction, backscatter color ratio, and particle effective radius.
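The extinction retrieval described here has a simple form: the HSRL molecular channel departs from the known molecular backscatter profile only through two-way attenuation, so extinction follows from a range derivative. A minimal sketch under those assumptions (plain finite differences; real processing must also smooth noise and correct for receiver overlap):

```python
import math

def hsrl_extinction(s_mol, beta_mol, r):
    """Extinction alpha(r) from an HSRL molecular channel.

    The measured molecular signal S_mol ~ beta_mol(r) * T^2(r) / r^2 departs
    from the known molecular backscatter profile beta_mol only through the
    two-way transmission T^2, so
        alpha(r) = -(1/2) d/dr ln( S_mol * r^2 / beta_mol ).
    A forward finite difference stands in for the derivative here."""
    y = [math.log(s_mol[i] * r[i] ** 2 / beta_mol[i]) for i in range(len(r))]
    return [-(y[i + 1] - y[i]) / (2.0 * (r[i + 1] - r[i]))
            for i in range(len(r) - 1)]

# Synthetic check: constant extinction of 0.1 km^-1, flat molecular profile.
r = [0.5 + 0.1 * i for i in range(50)]                       # km
s_mol = [math.exp(-2 * 0.1 * ri) / ri ** 2 for ri in r]
alpha = hsrl_extinction(s_mol, [1.0] * len(r), r)
```

Because the molecular backscatter profile comes from an independently known density profile, no lidar-ratio assumption enters, which is the key advantage over the Klett–Fernald approach.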
These advanced HSRL instruments have been used to investigate aerosol optical properties during outbreaks of pollution (e.g., Müller et al. 2014) and Saharan dust (Groß et al. 2013). By cataloging the HSRL-measured optical parameters for known aerosol types, methodologies have been developed to classify aerosols into as many as eight different types (Burton et al. 2012) and to apportion optical depth measurements among these different aerosol types.
Another type of lidar uses the Raman effect (a change in wavelength exhibited by scattering by specific molecules) to provide a remote sensing capability for measurement of both temperature and water vapor and to offer an alternative to HSRL instruments for estimating aerosol backscatter and extinction properties. Because a Raman lidar can separate the weak inelastic scattering of light (i.e., scattering that causes a change in the wavelength) by molecules in the atmosphere from the elastic backscattered radiation from aerosols, it can also provide independent observations of extinction and backscatter used to characterize aerosol properties (e.g., Ansmann et al. 1990). Examples of Raman lidar characterization of aerosol and cloud properties include measurements of Saharan dust (Groß et al. 2015), cirrus cloud particles (Ansmann et al. 1992), transport of dust and smoke from Africa to the Amazon rain forest (Ansmann et al. 2009), and the ash layer from the Eyjafjallajökull volcano (Sicard et al. 2012). Although the Raman technique requires relatively powerful laser transmitters and large optical receivers (i.e., a large telescope with appropriate filters) to gather sufficient signal from the weak inelastic molecular scattering, it has continued to be applied since its demonstration in the 1960s because of both the value of the measurement and its relative ease of implementation. A network of mostly Raman lidars currently comprises the multicountry European Aerosol Research Lidar Network (EARLINET) aimed at creating a long-term database of aerosol observations on a continental scale (Pappalardo et al. 2014). A major contribution of the work has been establishment of standards for calibrating each instrument in the network and analyzing the returns.
Raman lidar provides unique capabilities to measure profiles of the important thermodynamic variables water vapor and temperature. The water vapor estimates are obtained by comparing the inelastic scattered signal from atmospheric water vapor molecules to the signal scattered from nitrogen molecules, while temperature profiles are computed by observing the temperature-dependent changes in the rotational Raman radiation backscattered by atmospheric nitrogen and oxygen. Water vapor measurements were first made at visible wavelengths in the late 1960s (Cooney 1970), followed by profiles of temperature (Cohen et al. 1976). Availability of more powerful lasers (including those that operated in the ultraviolet region where scattering is more efficient) and improved receivers stimulated a renewal of interest in Raman lidar in the late 1980s. The improved technology enabled profiling of atmospheric temperature into the stratosphere (Keckhut et al. 1990; Nedeljkovic et al. 1993) as well as investigation of lower-atmospheric moisture structure (Melfi and Whiteman 1985), mid- and upper-tropospheric moisture (Sherlock et al. 1999), and stratospheric intrusions (Di Girolamo et al. 2009). Raman lidar studies of upper-troposphere water led to identification of a dry bias in humidity measurements from a commonly used radiosonde in situ sensor (Soden et al. 2004). Currently, many Raman lidars are configured to measure aerosol properties and water vapor simultaneously (e.g., Ansmann et al. 1992) and may include multiple wavelengths and depolarization as well (Althausen et al. 2000).
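The water vapor retrieval just described reduces, at its core, to a calibrated channel ratio. A sketch under that simplification (the calibration constant and signal values are purely illustrative, and real systems must first correct for background light, receiver overlap, and differential atmospheric transmission):

```python
def raman_mixing_ratio(s_h2o, s_n2, cal):
    """Water vapor mixing ratio profile from the ratio of the H2O and N2
    Raman channel signals; cal folds in the channel efficiencies and Raman
    cross sections and is usually fixed by comparison with a collocated
    radiosonde."""
    return [cal * sh / sn for sh, sn in zip(s_h2o, s_n2)]

# Illustrative two-gate profile: the channel ratio halves with height,
# so the retrieved mixing ratio (g/kg) halves as well.
w = raman_mixing_ratio([2.0, 1.0], [100.0, 100.0], 400.0)
```

Ratioing against the nitrogen channel, rather than using the water vapor channel alone, cancels the range dependence and most of the aerosol transmission terms, which is what makes the method robust.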
Initially, Raman lidars operated in a vertically pointing mode primarily at night (to minimize interference from solar background light) as research systems requiring highly skilled operators. However, as engineering and technology have improved, the feasibility of scanning Raman systems for mapping water vapor structure has been demonstrated (e.g., Cooper et al. 1997; Whiteman et al. 2006). A significant achievement was the development and deployment of a Raman lidar system, designed to operate with limited manual intervention, at the Department of Energy’s Atmospheric Radiation Measurement (ARM) program Southern Great Plains site to collect information on atmospheric aerosols, water vapor, temperature, and liquid water (Turner et al. 2016; Newsom et al. 2013; Goldsmith et al. 1998). The ARM Raman lidar demonstrated extended measurements of aerosol properties and moisture evolution during several intensive field programs beginning in the 1990s. Raman lidar technology currently allows for portable, unattended operation (Althausen et al. 2009). Such instruments continue to be deployed for long-term measurements at a variety of locations, such as the Amazon basin (Baars et al. 2012) and central Asia (Hofer et al. 2017), to improve understanding of local and transported aerosol properties and moisture structure.
The differential absorption lidar (DIAL) takes advantage of the fine spectral absorption structure of atmospheric gases to estimate gas concentrations. By operating the laser transmitter to produce spectrally narrow radiation at two wavelengths that are differentially absorbed by the gas and comparing the backscattered signal power at each wavelength, a range-resolved estimate of the gas concentration is obtained. The first DIAL measurements were reported by Schotland (1964) soon after the invention of the laser in 1960. Schotland thermally tuned a ruby laser on and off the wavelength of an atmospheric water vapor absorption line at 694.38 nm to estimate dewpoint temperature at several heights extending from the ground up to 1500 m above ground level.
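The two-wavelength comparison can be written as the now-standard DIAL equation, in which gas number density follows from the range derivative of the log ratio of the online and offline returns. Below is a minimal numerical sketch under idealized assumptions (uniform differential cross section; the aerosol backscatter and differential-extinction correction terms treated in the literature are neglected):

```python
from math import exp, log

def dial_number_density(p_on, p_off, delta_sigma, delta_r):
    """Range-resolved gas number density (m^-3) via the DIAL equation.

    p_on, p_off : backscattered powers at the absorbed ("online") and
                  reference ("offline") wavelengths, one per range bin
    delta_sigma : differential absorption cross section (m^2)
    delta_r     : range-bin spacing (m)
    """
    return [log((p_off[i + 1] * p_on[i]) / (p_off[i] * p_on[i + 1]))
            / (2.0 * delta_sigma * delta_r)
            for i in range(len(p_on) - 1)]

# Synthetic check: a uniform density of 1e22 m^-3 with a differential
# cross section of 1e-24 m^2 over 100-m bins attenuates only the
# online return (round-trip absorption of exp(-2) per bin)
p_on = [1.0, exp(-2.0), exp(-4.0)]
p_off = [1.0, 1.0, 1.0]
density = dial_number_density(p_on, p_off, 1e-24, 100.0)
```

The logarithmic ratio makes the retrieval self-calibrating with respect to system constants, which is part of why the technique has proved so durable.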
Although potentially applicable to measurements of a variety of gases, DIAL techniques have historically been applied primarily to atmospheric measurements of water vapor and ozone. Recently, with the increased scientific focus on climate forcing by greenhouse gases, DIAL measurements of CO2 and methane concentrations have been demonstrated and proposed for deployment in space to measure global distributions (e.g., Ehret et al. 2008). As with nearly all types of lidar measurements, DIAL progress has closely tracked advancements in laser and optical technology. A number of DIAL water vapor measurements utilized tunable dye lasers as sources (Bösenberg 1991; Ehret et al. 1993); however, by the late 1990s, sources incorporating alexandrite (Wulfmeyer and Bösenberg 1996; Bruneau et al. 2001), Ti:sapphire (Ismail et al. 2000; Wagner et al. 2013), or optical parametric oscillators (OPOs; Ehret et al. 1998) were replacing the difficult-to-use dye lasers as DIAL laser transmitters. DIAL water vapor instruments have been extensively deployed for atmospheric research in both ground-based and airborne configurations. Ground-based water vapor DIAL has been applied to study the structure and turbulent fluctuations of boundary layer water vapor (Wulfmeyer 1999). Recently, a scanning DIAL lidar mapped out the three-dimensional heterogeneity of water vapor and related observed humidity profiles to differences in land cover (Späth et al. 2016). DIAL systems have also been deployed fairly extensively on research aircraft flown to study, for example, aerosol and moisture distributions associated with haze layers off the U.S. East Coast (Ismail et al. 2000), airmass transitions in flow across the intertropical convergence zone (Browell et al. 2001), Saharan dust impacts on tropical storm formation (Ismail et al. 2010), and water vapor distribution in the upper troposphere and lower stratosphere (Kiemle et al. 2008).
During the International H2O Project (IHOP) in 2002, three airborne DIAL water vapor lidars were deployed, along with a variety of other sensors, to investigate how characterization of the three-dimensional structure and evolution of the water vapor field could improve understanding and prediction of convective processes (Weckwerth et al. 2004).
The DIAL technique is also well matched to measurements of atmospheric ozone and since the late 1980s has played an important role in observing changes in tropospheric and stratospheric ozone levels and in improving understanding of anthropogenic air pollution. Ozone DIAL instruments operate in the ultraviolet region of the spectrum, typically incorporating excimer lasers and/or Raman cells, which are used to shift the optical wavelength, to produce the online and offline wavelengths for ozone profiling. Advances in laser technology have enabled the use of all-solid-state ultraviolet lasers (e.g., Alvarez et al. 2011). An international network of surface-based ozone lidars has operated for more than 20 years as part of the Network for the Detection of Atmospheric Composition Change to observe and understand the physical and chemical state of the upper troposphere, stratosphere, and mesosphere (e.g., Leblanc and McDermid 2000; Godin-Beekmann et al. 2003). Because ozone is also a pollutant affecting human health, ozone lidars have been applied in air quality studies over the years from both surface (Carnuth et al. 2002; Lin et al. 2015) and aircraft (Ancellet and Ravetta 2003; Browell et al. 2003; Senff et al. 2010) platforms.
Like Raman lidars, DIAL instruments to measure water vapor have for many years been operated as complex research instruments applied intermittently in focused research projects (e.g., Weckwerth et al. 2004; Wulfmeyer et al. 2018). However, NRC (2009) noted the need for a “network of networks” of observations at the mesoscale and specified high-resolution profiles of humidity as a critical measurement need. The needs articulated in NRC (2009) stimulated efforts to develop robust, affordable DIAL instruments that could be deployed in networks and operated continuously with minimal intervention (Machol et al. 2004; Repasky et al. 2004). As a result of steady progress in this area during the 2010s, an unattended water vapor DIAL instrument utilizing high-pulse-rate diode laser sources (Nehrir et al. 2009; Spuler et al. 2015; Weckwerth et al. 2016) has been developed. This system is capable of measuring moisture and aerosol structure through the boundary layer and into the lower free troposphere, and there are plans to deploy multiple instruments as a network.
Another lidar technique involves measuring the Doppler shift of laser radiation scattered from atmospheric aerosol particles that are small enough to move primarily with the air motions. Doppler lidars fill a unique observing system niche in their capability to observe atmospheric motions. They are complementary to and have often been deployed along with meteorological Doppler radars (e.g., Rothermel et al. 1985), which measure the Doppler shift of microwave radiation scattered from rain and cloud droplets. Early Doppler lidar instruments used continuous-wave (CW) CO2 laser sources in the thermal infrared at 10.6-μm wavelength and heterodyne detection to observe aircraft trailing vortices (Huffaker et al. 1970) and to measure fall velocities of hydrometeors (Abshire et al. 1974) and the velocity structures of dust devils (Schwiesow and Cupp 1976) at ranges of a few hundred meters. Doppler lidar transmitter technology rapidly progressed to incorporate pulsed laser sources; a pulsed system deployed on a NASA aircraft observed winds around severe storms (e.g., Bilbro and Vaughan 1978; McCaul et al. 1986) to ranges out to ~10 km. In the early 1980s the feasibility of applying Doppler lidar for satellite-based measurements of winds from space was studied by NOAA (Huffaker et al. 1984), which stimulated development of a high-pulse-energy transmitter to demonstrate lidar system performance (Post et al. 1982). The initial NOAA pulsed system, with a maximum range extending beyond 15 km, was employed in a number of atmospheric studies, including flows in complex terrain (Post and Neff 1986). An additional upgrade of the NOAA system (Post and Cupp 1990) further extended the maximum range to beyond 20 km, enabling the first two-dimensional mapping of severe downslope windstorms (Neiman et al. 1988), flows in the Grand Canyon (Banta et al. 1999), and California sea-breeze structure (Banta et al. 1993).
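The measurement underlying all of these systems is the Doppler relation between the frequency shift of the backscattered light and the line-of-sight air velocity. A brief sketch of that relation follows; the 10.6-μm default matches the CO2 systems discussed above, and the function name is illustrative:

```python
def radial_velocity(doppler_shift_hz, wavelength_m=10.6e-6):
    """Line-of-sight velocity (m/s) from a lidar Doppler shift (Hz).

    The factor of 1/2 accounts for the round trip: the moving aerosol
    particles both receive and re-radiate the frequency-shifted signal.
    """
    return 0.5 * wavelength_m * doppler_shift_hz

# At 10.6 um, a shift near 1.9 MHz corresponds to roughly 10 m/s
v = radial_velocity(1.887e6)
```

The same relation explains why the shorter-wavelength 2- and 1.5-μm systems mentioned below must resolve proportionally smaller frequency shifts for the same wind speed.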
Because the high-power CO2 Doppler lidars employed by NOAA and others in the 1980s and 1990s were very much research instruments requiring skilled operators, attention turned to development of more user-friendly approaches better suited to both unattended operation and deployment in space (Huffaker and Hardesty 1996). Solid-state laser transmitters operating near 2 μm first appeared in the 1990s (Henderson et al. 1993; Grund et al. 2001) and saw use for wind shear monitoring near airports (Chan and Lee 2012), ship-based measurements of marine boundary layer dynamics (Yamaguchi et al. 2013), and nocturnal low-level jet research (Banta et al. 2003). In a novel application, a solid-state Doppler lidar was codeployed with a water vapor lidar on a research aircraft during the IHOP experiment to measure the two-dimensional structure of water vapor (Tollerud et al. 2008); an airborne lidar has also been deployed to study katabatic flows off Greenland (Marksteiner et al. 2011). A significant step in the evolution of Doppler lidars in recent years has been application of off-the-shelf telecommunications technology in commercial stand-alone, unattended Doppler lidars operating around 1.5 μm. These instruments are being applied alone or in arrays for boundary layer studies of mixing and turbulence (e.g., Bonin et al. 2018) and in support of wind energy production (Choukulkar et al. 2017).
Forty years after it was first proposed, the goal of deploying a Doppler lidar measuring global winds from space remains a high priority for improving weather forecasting (NAS 2018). Because heterodyne Doppler lidars are not able to obtain measurements in low-aerosol regions such as the middle troposphere, alternative methods that employ interferometers to measure the Doppler shift of radiation scattered from atmospheric molecules have been developed (Korb et al. 1992; McKay 1998; Bruneau 2001; Tucker et al. 2018). Aeolus, a Doppler lidar mission incorporating Fabry–Perot and Fizeau interferometers to measure global wind profiles from space, was launched in August 2018 by the European Space Agency (Lux et al. 2018). Aeolus will provide single-line-of-sight wind profiles for assimilation into numerical forecast models.
Lidar techniques have been developed for sampling the structure and dynamics of the upper stratosphere, mesosphere, and lower thermosphere using resonance scatter from sodium, iron, and other metallic layers. Bowman et al. (1969) made the first resonance scattering observations of the metallic sodium layer at altitudes between 80 and 100 km. The technique improved over the ensuing years with advances in tunable, spectrally narrow dye laser sources. Gardner and Voelz (1987) analyzed high-resolution measurements of the sodium layer to observe gravity waves. Temperature measurements, important for modeling chemistry in the region near the mesopause, were first demonstrated by Gibson et al. (1979), who probed the thermal broadening of a sodium resonance line with a narrowband lidar. Because studies of the middle atmosphere are incomplete without knowledge of the wind structure, She and Yu (1994) developed a method for determining the radial wind speed based on the Doppler shift of resonance scatter from the sodium layer. Work continues to extend the capability of resonance scatter lidars; a study by Gardner and Liu (2014) showed that sodium and iron resonance scatter lidars could be used to measure vertical transport by turbulent mixing, which plays a fundamental role in establishing the thermal and constituent structure of the upper mesosphere. Recently, Guo et al. (2017) extended this work to show the capability of sodium wind–temperature lidar to measure turbulence perturbations in temperature and vertical wind, enabling derivation of eddy heat flux, turbulence thermal diffusivity, and energy dissipation rate. Resonance lidar probing of the middle atmosphere remains an active research area, with measurements ongoing at several sites worldwide, including in Arctic and Antarctic regions (e.g., Hildebrand et al. 2017; Chen et al. 2016).
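The temperature retrieval pioneered by Gibson et al. (1979) rests on the fact that the Doppler (thermal) width of a resonance line scales as the square root of temperature. The sketch below inverts that relation under simplifying assumptions: constants and defaults are for the sodium D2 line near 589 nm, and real retrievals also fit the hyperfine structure of the line and wind-induced shifts:

```python
from math import sqrt

K_B = 1.380649e-23    # Boltzmann constant (J/K)
C = 2.99792458e8      # speed of light (m/s)
AMU = 1.66053907e-27  # atomic mass unit (kg)

def temperature_from_doppler_width(sigma_hz, wavelength_m=589.0e-9,
                                   mass_amu=22.99):
    """Temperature (K) from the 1-sigma Gaussian width (Hz) of a
    thermally broadened atomic resonance line."""
    nu0 = C / wavelength_m
    mass = mass_amu * AMU
    return mass * (sigma_hz * C / nu0) ** 2 / K_B

# Round trip at a typical mesopause temperature of 200 K; the thermal
# width of the sodium line works out to a few hundred MHz, which is why
# spectrally narrow, tunable transmitters were the enabling technology
sigma = (C / 589.0e-9) * sqrt(K_B * 200.0 / (22.99 * AMU)) / C
t_retrieved = temperature_from_doppler_width(sigma)
```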
Much as radar has evolved to expand its capabilities and applications, a number of developments suggest that lidar is following a similar path, especially with the need to better measure trace gases (including water vapor) and aerosols for understanding human impacts on climate and air quality. For example, two areas that seem especially primed for growth and increased impact are the application of stand-alone, unattended instruments and the continued deployment of space-based instruments. Over the past decade, unattended instruments utilizing Raman, DIAL, and Doppler technology have shown the capability to provide temporally continuous observations of water vapor, winds, and aerosol properties over extended periods; as technology continues to advance, this class of instruments will likely continue to replace the large, complex systems that characterized lidar measurements in the first four decades following the invention of the laser.
c. Passive remote sensing systems
Passive remote sensing technology relies on naturally emitted and reflected electromagnetic radiation from a scene, in contrast to active remote sensors, which supply their own energy to illuminate a scene. An early example of a passive remote sensing device was the camera; aerial photography used for military reconnaissance in World War I is an application of remote sensing employed at the outset of the past 100 years. Since the space age began, sensors with increasing complexity have evolved on satellites, with corresponding improvements to sensors on airborne platforms and on the surface. For example, the GPM Core Observatory carries a microwave radiometer system with state-of-the-art capabilities, including two shorter-wavelength channels to provide enhanced sensitivity to light rain and ice particles (Draper et al. 2015). The latter observations are made from an orbit that provides coverage between 68°N and 68°S. The GPM Core Observatory also serves as a baseline system to support the calibration and analysis of data from a constellation of other satellites that carry microwave radiometers of varying characteristics (Hou et al. 2014).
Atmospheric properties are retrieved from remote sensing devices using visible, infrared, and microwave portions of the electromagnetic spectrum (Fig. 2-5). Measurements of cloud and aerosol properties are a common application for the visible and infrared portions of the spectrum. An early example is the airborne Multichannel Cloud Radiometer (MCR), which was used by King (1987) to derive cloud optical depth from measurements of visible reflectance, and later applied by Nakajima and King (1990) to obtain effective particle radius using visible and near-infrared measurements. Subsequent upgrades to this technology resulted in the MODIS/ASTER airborne simulator (MASTER), which supports the calibration, validation, and algorithm development for satellite-borne instruments (i.e., MODIS and ASTER) (King et al. 1996; Hook et al. 2001), a common goal of airborne campaigns (e.g., Kramer 2002).
The microwave portion of the spectrum is often used for passive remote sensing of the atmosphere (Fig. 2-5). Microwave radiometers, used to sense quantities such as column water vapor and cloud liquid water, have been developed in both ground-based (e.g., Guiraud et al. 1979) and satellite-borne (e.g., Staelin et al. 1976) configurations. They rely on radiometric measurements of atmospheric thermal emission at specific absorption lines for obtaining temperature profiles as well as profiles and integrated quantities of liquid water and water vapor (Janssen 1993). Askne and Westwater (1986) describe methods used in the NOAA Profiler Radiometric system for obtaining water vapor (1.455-cm channel), liquid water (0.947 cm), and temperature profiles (0.6–0.5 cm).
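The basic quantity such radiometers report is conventionally expressed as a brightness temperature: the temperature of a blackbody that would produce the measured radiance at that frequency. A minimal Planck-inversion sketch (function and variable names are illustrative, not from any cited system):

```python
from math import exp, log

H = 6.62607015e-34    # Planck constant (J s)
K_B = 1.380649e-23    # Boltzmann constant (J/K)
C = 2.99792458e8      # speed of light (m/s)

def brightness_temperature(radiance, frequency_hz):
    """Invert the Planck function: the temperature (K) a blackbody
    would need to emit the measured spectral radiance
    (W m^-2 sr^-1 Hz^-1) at the given frequency."""
    x = 2.0 * H * frequency_hz ** 3 / (C ** 2 * radiance)
    return H * frequency_hz / (K_B * log(1.0 + x))

# Round trip: Planck radiance of a 250-K blackbody at 60 GHz, within
# the oxygen absorption complex used for temperature sounding
nu = 60.0e9
b = 2.0 * H * nu ** 3 / C ** 2 / (exp(H * nu / (K_B * 250.0)) - 1.0)
t_b = brightness_temperature(b, nu)
```

Profile retrievals then map sets of brightness temperatures at channels of differing opacity onto temperature and moisture profiles, typically with statistical or physical inversion methods.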
Retrieval methods that combine radiosonde data with the measured microwave brightness temperatures were developed to enhance the vertical resolution of the retrieved profiles. Airborne microwave radiometers using a similar array of channels are used to derive temperature and moisture information at higher altitudes. The airborne Microwave Temperature Profiler (Gary 1989) measures brightness temperature from a set of oxygen absorption lines around 0.545 08 cm to obtain temperature structure above and below flight level, providing meteorological context for coincident airborne measurements of chemistry and cloud properties. A more capable atmospheric sounder, the High-Altitude Monolithic Microwave Integrated Circuit (MMIC) Sounding Radiometer (HAMSR) developed at the Jet Propulsion Laboratory, employs additional channels for enhanced temperature sounding as well as a series of channels around 0.1638 cm for humidity profiling. Lambrigtsen et al. (2016) used HAMSR observations to describe the three-dimensional structure of hurricanes.
5. Solar and terrestrial radiation observing systems
The past century has been witness to remarkable progress in our ability to monitor the flow of radiative energy through the Earth system and apply it to problems in fundamental atmospheric radiative transfer, radiative energy budget assessment, remote sensing, and improving climate and weather models. The focus here is on some of the major achievements and the most important observing systems.
a. Achievements in understanding the sun’s input
Radiative energy from the sun establishes the basic climate of the Earth’s surface and atmosphere and defines the terrestrial environment that supports all life on the planet. The energy that the sun provides to Earth’s atmosphere is almost 4000 times larger than all other sources combined (Kren et al. 2017). Solar variability on a wide range of time scales affects the Earth system and combines with internal forcings, including anthropogenic changes in greenhouse gases and aerosols, and natural modes such as El Niño–Southern Oscillation (ENSO) and volcanic forcing to define past, present, and future climates. Understanding these effects requires continuous measurements of total and spectrally resolved solar irradiance that meet the stringent requirements of climate-quality accuracy and stability over time. Early surface-based measurements, such as the Smithsonian Astrophysical Observatory Solar Constant Program (Hoyt 1979), were instrumental in characterizing solar variability, and modern satellite observations of total and spectral irradiance might be considered the crowning achievement of over a century of progress, reducing the uncertainty in total solar irradiance from 3% during the twentieth century to approximately 0.03% today (Kopp and Lean 2011).
Since 1978, total solar irradiance (TSI; sometimes called the solar constant) has been measured continuously from space by several different systems (described in Kyle et al. 1993; Willson 1994; Lee et al. 1995; Willson 2001; Fröhlich and Lean 2002; Kopp et al. 2005). These systems include the Earth Radiation Budget (ERB) on Nimbus-7; Active Cavity Radiometer Irradiance Monitor-I (ACRIM-I) on the Solar Maximum Mission (SMM); ACRIM-II on the Upper Atmosphere Research Satellite (UARS) and ACRIM-III on ACRIMSAT; Earth Radiation Budget Experiment (ERBE) on three Earth Radiation Budget Satellites (ERBS); the Solar Constant and Variability Instrument (SOVA) on the European Retrievable Carrier (EURICA); Variability of Solar Irradiance and Gravity Oscillations (VIRGO) on the Solar and Heliospheric Observatory (SOHO); and the Total Irradiance Monitor (TIM) on the Solar Radiation and Climate Experiment (SORCE). Evident in this combined record is an 11-yr cycle with peak-to-peak amplitude of approximately 0.1% and variations 2–3 times greater associated with the short-term transits of sunspots over the disk of the sun.
Prior to 2010, the combined record of TSI measurements made by individual radiometers exhibited a spread of nearly 1% that was of instrumental rather than solar origin, far exceeding the 11-yr or rotational solar variability. While instrument offsets were large, each instrument had high precision and was able to detect small changes in the TSI caused by variability in solar activity. These data were all recorded with ambient temperature sensors, each of which has its own stated instrumental uncertainty, typically on the order of 0.1% (1000 ppm), with the exception of the Total Irradiance Monitor on SORCE, which had a 350-ppm uncertainty (Kopp et al. 2005).
In 2005, a workshop conducted at the National Institute of Standards and Technology (NIST) in Gaithersburg, Maryland (Butler et al. 2008), led to investigations into the effects of diffraction and of aperture area measurements as a cause of the differences between instruments. Additional recommendations resulting from the workshop included power and irradiance calibrations, and comparisons with a standard cryogenic electrical substitution radiometer. In response to these issues, a new TSI Radiometer Facility (TRF) has been established to provide such calibrations (Kopp et al. 2007). The results of several years of measurements at the TRF revealed that scattered light was the primary source of the large offsets between other instruments and the SORCE TIM. A new TSI standard of 1360.8 W m−2 has been established by Kopp and Lean (2011).
Continuous measurements of solar ultraviolet radiation began in 1978 with the Nimbus-7 Solar Backscatter Ultraviolet Radiometer (SBUV; Schlesinger and Cebula 1992). These measurements were followed by those from the Solar Mesosphere Explorer (SME; Rottman 1988), NOAA-9 SBUV/2, NOAA-11 SBUV/2, the Solar Stellar Intercomparison Experiment (SOLSTICE) on UARS (Rottman et al. 2001), and the Solar Ultraviolet Spectral Irradiance Monitor (SUSIM; Floyd et al. 2002), also on UARS. SOLSTICE is one of four solar irradiance measurement experiments that were part of SORCE. The present-day SORCE SOLSTICE and SORCE Spectral Irradiance Monitor (SIM) extend this continuous (albeit with different spectral coverage, resolution, and instrument accuracies and stabilities) record of the solar ultraviolet and its variability. The record of the continuous, full solar irradiance spectrum, which is much shorter than that of TSI, commenced with measurements by SIM on the SORCE satellite in 2003 (Harder et al. 2009).
b. Advances in radiometry
Progress in radiometric observing systems of the atmosphere closely tracks the evolution of radiometry itself. Motivation for most of the radiometers of the nineteenth and early twentieth centuries largely came from a need to understand the amount of solar energy reaching Earth and its atmosphere and surface. The book Solar and Infrared Radiation Measurements, by Vignola et al. (2017), provides an extensive review of the development of solar and terrestrial radiometric instruments over this period. Chapter 3 in particular, on “historic milestones in solar and infrared radiation measurement,” is a detailed accounting of the history and evolution of radiometric instrumentation. By the end of the nineteenth century, Ångström had developed a method of balancing absorbed radiative power with electrical power, the so-called electrical compensation or substitution radiometer. Remarkably, this type of radiometer continues to serve as a reference standard today and electrical substitution detectors form the basis of the most accurate measurement of any component of the radiative energy budget, total solar irradiance.
Fröhlich (1991) provides a detailed history of solar radiometry that begins with Pouillet’s instrument developed in 1837 and covers the creation of the Ångström and Smithsonian radiometric scales. The twentieth-century introduction of modern absolute radiometers subsequently led to the World Radiometric Reference (0.3% accuracy and guaranteed homogeneity of radiation measurements within 0.1% precision), which has been used by the meteorology community since 1981.
c. Surface observing networks: BSRN, Aeronet, and ARM
The role of Earth’s surface in transforming radiative energy is central to many climate processes and the regulation of climate in general. In the 1980s it was recognized that the existing radiometric networks lacked the accuracy required for a basic understanding of climate, let alone for validating model simulations of climate change. In 1988 the World Meteorological Organization/International Council of Scientific Unions (WMO/ICSU) Joint Scientific Committee for the World Climate Research Programme (WCRP) proposed the international Baseline Surface Radiation Network (BSRN) to monitor the surface shortwave and longwave radiation components for validating satellite estimates of the radiative energy budget, validating radiation codes in climate models, and monitoring trends in the surface radiation budget. The U.S. Department of Energy (DOE) initiated the ARM Program in 1989 to establish long-term measurements to better define the problem of cloud radiative feedbacks in climate.11
The BSRN was implemented in 1992 to support the research projects of the WCRP and other scientific programs. It was intended not only to carry out the measurements but also to improve fundamental measurement capabilities.
The BSRN opened with 9 stations, expanding to 22 over the following decade, with coverage from 80°N to 90°S. In early 2000 it was designated the Global Baseline Surface Radiation Network of the Global Climate Observing System (GCOS). By 2015, 58 stations had provided data to the BSRN data archive. (For a current listing of BSRN stations, see http://bsrn.awi.de/stations/listings/.) The fundamental station measurements are downwelling global, diffuse, and direct shortwave radiation and downwelling longwave radiation, plus air temperature, relative humidity, and pressure. Many stations also provide upwelling radiation components and spectrally resolved shortwave radiation in the ultraviolet, visible, and near-infrared spectral regions.
The Aerosol Robotic Network (AERONET) project is a federation of ground-based aerosol networks established by NASA and Photométrie pour le Traitement Opérationnel de Normalization Satellitaire [PHOTONS; University of Lille 1, CNES, and CNRS-National Institute for Earth Sciences and Astronomy (INSU)] and is greatly expanded by networks [e.g., the Red Ibérica de Medida Fotométrica de Aerosoles (RIMA), AeroSpan, Aerosol Canada (AEROCAN), and the China Aerosol Remote Sensing Network (CARSNET)] and collaborators from national agencies, institutes, universities, individual scientists, and partners. For more than 25 years, the project has provided a long-term, continuous, and readily accessible public domain database of aerosol optical, microphysical, and radiative properties for aerosol research and characterization, validation of satellite retrievals, and synergism with other databases. The network imposes standardization of instruments, calibration, processing, and distribution.
6. Airborne observing systems
The growth in aviation in the twentieth century brought with it a new capability to easily access the lower atmosphere by carrying instrumentation on aircraft. As is evident in section 7 and in Haupt et al. (2019a), the growth in aviation created a huge demand for atmospheric observation systems of many types, including on commercial aircraft themselves. There are several repositories of information about existing airborne observing capabilities. The European Facility for Airborne Research (EUFAR)12 coordinates and facilitates access to 17 European research aircraft through 12 European-based operators. They also sponsor expert working groups on airborne measurement techniques as well as education and training on these methods. NASA research aircraft and instrumentation are described through the NASA Airborne Science Program,13 while NOAA research aircraft are described through their Office of Marine and Aviation Operations program.14 Recent updates on research aircraft are also included in Geerts et al. (2017). While these descriptions do not cover all the research aircraft capabilities presently deployed across the globe, they provide a comprehensive cross section of the types and capabilities of the modern research aircraft fleet.
As might be expected, there is a wide variety of capabilities in research aircraft, ranging from platforms that carry a single instrument to platforms designed to be multipurpose mobile research laboratories. Many of the larger aircraft in the inventories described above fall into the latter category, with specialized instrument racks, external pylons, power and signal distribution wiring, and inlets to provide air samples. Special consideration must be given to the placement of inlets and hydrometeor probes so that they collect representative samples. For example, air inlets are often placed on the underside of the fuselage because most flight configurations are at a positive angle of attack (the angle between the chord line of an airfoil and the direction of the surrounding undisturbed flow), which makes it easier for inlets to reach into the undisturbed air (e.g., Fig. 2-6).
Although scientific milestones such as the Thunderstorm Project relied heavily on airborne observing systems, aircraft were widely used before then for various atmospheric observing and measurement tasks. Wendisch and Brenguier (2013) provide a recent summary of airborne measurement techniques and numerous examples of early uses of aircraft in research. An earlier look at airborne sampling techniques is provided by NCAR (1966), which covers airborne instrumentation for measurement of air motion, cloud physics and kinematics, and data handling. Comparing these two works, both focused on airborne methods, illustrates the extensive expansion of airborne measurement capabilities over the past five decades, with the growth of disciplines such as atmospheric chemistry and remote sensing reflected in the types of instrumentation carried and a general increase in better and more varied measurements of individual atmospheric constituents.
Kramer (2002, section P) provides a comprehensive summary of modern sensors flown on research aircraft and a summary of many recent field campaigns (section Q). These summaries illustrate the extensive variety of payloads and uses of research aircraft as both a primary source of Earth observations and as a provider of data that supplements or validates space-borne observations. They also illustrate the growth in sensor capabilities on research aircraft, particularly in remote sensing, that have evolved during the satellite era. Most of these capabilities are to support basic and targeted research objectives; however, routine airborne missions for operational hurricane forecasting utilize dropsondes and airborne radar.
There has been steady progress in aircraft observing systems over the past 100 years. Some of the major milestones related to the capabilities of airborne observing systems are the following:
The number, types, and overall capabilities of aviation platforms have grown enormously with the development of jet aircraft and the growth of the aviation industry overall. This has allowed for a high degree of specialization in the types of missions and a wide range of platform capabilities. Examples include the specialized modification for convective storm penetrations described in section 6c and the specialized installation of airborne weather radar described in section 6d.
Improvements in navigation and aircraft attitude measurements, beginning with the introduction of INS in the 1950s and continuing with the later incorporation of GPS on aircraft, have greatly improved the measurements that can be carried out from aircraft. More details on the breakthroughs in wind and turbulence measurement from aircraft are provided in section 2b.
The introduction of digital computing and data handling (collection and recording) on aircraft has greatly expanded the capabilities of research aircraft overall by largely replacing photographic film and pen and ink recording methods (such as used in the Thunderstorm Project) with digital recording and a subsequent increase in the number and types of data that can be created, usually with much higher spatial or temporal resolution. This is illustrated by the marriage of digital and optical techniques for hydrometeor sampling pioneered by Robert Knollenberg in the 1970s (Knollenberg 1970), which has essentially replaced older methods with a greatly improved ability to collect representative samples. A brief overview of progress in these methods is described in section 6b.
The development of satellite communication systems has brought near global ground-to-air networking coverage to airborne research. As the cost and capabilities of these systems have improved, more deployments are taking advantage of these technologies, thus transforming how research aircraft are used and largely replacing older communication methods such as radio. This real-time digital communication capability has enabled tools15 to be developed for coordination between aircraft and the ground to improve sampling strategies and to provide information to the aircraft to guide sampling around significant weather.
Meteorological sampling of the atmosphere predates human flight, for example, with the use of kites as sampling platforms (section 3). Today, unmanned aircraft systems (UAS), also referred to as drones, UAVs, or remotely piloted vehicles (RPVs), are an evolving technology with many applications for airborne research. As regulations allow access to more airspace, these airborne systems are likely to play an increasing role in future scientific applications. Recent reviews of the capabilities of the various types of UAS and their scientific uses (e.g., NASA 2006; Vömel et al. 2018) indicate that a robust set of platforms currently exists and is likely to take over some missions from traditional research platforms. More importantly, UAS will likely perform some missions that cannot be accomplished with traditional research aircraft.
a. Airborne observing methods for solar and terrestrial radiation
Aircraft are often used as platforms for radiometric observations because of their ability to make radiometric measurements at different levels of the atmosphere and to simultaneously collect information on radiatively important atmospheric constituents, such as trace gases, clouds and aerosols, and black carbon. A summary of airborne observing systems for solar and terrestrial radiation is provided in chapter 7 of Wendisch and Brenguier (2013). In addition to fundamentals of atmospheric radiation, this chapter addresses broadband and spectral shortwave (solar) and longwave (thermal infrared) irradiance radiometers, sunphotometers and shadow-band radiometers for measuring directly transmitted and diffuse solar irradiance, longwave interferometers, actinic flux radiometers, and microwave radiometers. Principles of operation, calibration, and characterization for each class of instrument are provided along with examples of historical applications. One highlight of note is the role that spectrally resolved airborne measurements played in the resolution of what became known as the cloud absorption anomaly (Cess et al. 1995). The chapter also covers an important advance in airborne solar radiometry, the development of the active leveling platform that greatly reduced geometry-induced error in shortwave irradiance measurements (Wendisch and Brenguier 2013).
b. Progress in airborne cloud probes
As reported by Pruppacher and Klett (2010), the earliest airborne measurements of cloud particles were made at the beginning of the twentieth century by Albert Wigand, who presumably captured them on a flat surface for observation. As shown in Table 2-2, beginning some 40 years later, ice crystals were impacted on oil-coated slides extended out from the aircraft cockpit (Weickmann 1947, 1949). Over the following 30 years, a number of impaction techniques were perfected, capturing impressions of cloud droplets and crystals on surfaces exposed to the passing airstream. These ranged from simple frames holding a single slide to more sophisticated multislide cloud “guns” (Golitzine 1950; Clague 1965; Spyers-Duran and Braham 1967; Hindman 1987) and continuous foil impactors. Glass slides were prepared with a coating of carbon black (soot), MgO powder, viscous oil, or formvar and then exposed to the airstream.
The introduction of high-speed photography (Cannon 1960), the optical array probe (Knollenberg 1970), and holography (Thompson 1974; Trolinger 1975; Lawson and Cormack 1995; Brown 1989; Fugal et al. 2004) to image cloud particles in real time, thus acquiring much larger samples, led to the phasing out of the impaction devices. One exception is the video impactor (Murakami and Matsuo 1990; Miloshevitch and Heymsfield 1997), which captures cloud particles on a moving transparent ribbon that is then photographed with a video camera. Droplet measurements using light scattering were first developed in 1972 (Knollenberg 1976, 1981), followed by many similar spectrometers that differed only in size range, frequency response, and collection angles (Baumgardner et al. 2001; Hirst et al. 2001).
In parallel with sensors to distinguish the size and shape of individual cloud particles, instruments were developed to measure the total liquid water and/or the total condensed water. Supercooled liquid water is especially important as a cause of aircraft icing problems. There are at least four techniques that have been implemented: 1) measurement of the amount of power required to heat a wire or cylinder when impacted by water droplets (Owens 1957; Neel 1973; King et al. 1978; Nevzorov 1980; Lilie et al. 2005), 2) measurement of the water vapor concentration produced by water evaporation (Kyle 1975; Nicholls et al. 1990; Morgan et al. 2000; Weinstock et al. 2006; Davis et al. 2007; Schiller et al. 2008), 3) optical diffraction patterns (Gerber 1991), and 4) changes in the natural frequency of a vibrating cylinder due to accretion of ice on its surface from impaction of supercooled water (see Wendisch and Brenguier 2013).
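The hot-wire principle behind technique 1 can be illustrated with a short calculation. In steady state, the excess power needed to heat and evaporate impacting droplets is proportional to the liquid water content (LWC), the true airspeed, and the sensor's frontal area, so LWC follows by inversion. The sketch below is a hedged illustration of the general energy balance in the style of King et al. (1978), not the calibration of any particular instrument; the sensor dimensions and power values are illustrative assumptions.

```python
# Sketch of the hot-wire liquid-water-content (LWC) energy balance.
# All numerical values are illustrative, not an instrument calibration.

L_V = 2.26e6     # latent heat of vaporization, J kg^-1 (near 100 C)
C_W = 4190.0     # specific heat of liquid water, J kg^-1 K^-1
T_EVAP = 100.0   # assumed droplet evaporation temperature, deg C

def lwc_from_power(p_total, p_dry, airspeed, sensor_area, t_air):
    """Invert the steady-state energy balance for LWC (kg m^-3).

    p_total     : total power dissipated by the heated element, W
    p_dry       : dry-air (convective) power loss, W
    airspeed    : true airspeed, m s^-1
    sensor_area : frontal area of the sensing element, m^2
    t_air       : ambient air temperature, deg C
    """
    energy_per_kg = L_V + C_W * (T_EVAP - t_air)   # J per kg of water
    return (p_total - p_dry) / (airspeed * sensor_area * energy_per_kg)

# Illustrative case: a 2-cm by 1.8-mm heated wire at 100 m/s, 0 C ambient,
# with ~4.8 W of excess ("wet") power above the dry-air baseline.
area = 0.02 * 0.0018                      # m^2
lwc = lwc_from_power(p_total=34.8, p_dry=30.0,
                     airspeed=100.0, sensor_area=area, t_air=0.0)
print(f"LWC ~ {lwc * 1e3:.2f} g m^-3")    # roughly 0.5 g m^-3 here
```

Note the practical difficulty hidden in `p_dry`: the dry-air term varies with airspeed, pressure, and temperature, and separating it from the droplet signal is a central part of real probe design.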
c. Specialized airborne observing: The T-28 storm penetrating aircraft
Aircraft used for atmospheric observations are often specially modified to meet specific mission demands and to carry specialized payloads. An unusual example of this strategy is the modification of an aircraft to penetrate thunderstorms, which are avoided by normal aircraft operations (see section 7). The idea that an aircraft could be modified to safely penetrate hail-bearing thunderstorms was advanced by Paul MacCready during “Project Hailswath” (Goyer et al. 1966). With support from NSF, a post–World War II T-28 pilot trainer plane was acquired and armored on leading edges and other critical surfaces to withstand impacts of hailstones as large as 7.5 cm at flight speeds. It was equipped with instrumentation to measure temperature, vertical wind, and hydrometeor characteristics from cloud droplet through hailstone sizes as well as a data acquisition and recording system; all these were upgraded as the technologies advanced. The T-28 began storm penetration work in 1970 and over its 30+ years of service (the aircraft was retired in 2004) contributed observations to many storm research projects across the United States as well as in Switzerland and Canada; examples include the National Hail Research Experiment (Knight and Squires 1982a,b), the Cooperative Convective Precipitation Experiment (CCOPE; Knight 1982), Grossversuch IV (Waldvogel et al. 1987), the Severe Thunderstorm Electrification and Precipitation Study (STEPS; Lang et al. 2004), and the Joint Polarization Experiment (JPOLE; Scharfenberg et al. 2005). A summary of the many scientific contributions of T-28 observations appears in Detwiler et al. (2004).
d. Specialized research aircraft: Airborne radar
Many weather systems occur over the oceans, and observations of these often require either airborne or shipborne radars. Near the end of WWII, the U.S. military established several Weather Reconnaissance Squadrons with 3.2-cm bombing radars adapted for weather surveillance and installed on B-24s and B-25s (Fletcher 1990). A variety of airborne weather radars ranging from 3.2- to 10-cm wavelengths were deployed in Project Stormfury (Gentry et al. 1970) to gather precipitation data before, during, and after seeding multiple clouds near the hurricane eyewall region (Black et al. 1972). The design considerations and tradeoffs for putting a Doppler radar on an aircraft and the sampling strategy to probe hydrometeor vertical velocity (pointing vertically) and horizontal wind vectors (pointing horizontally) were established in the early 1970s (Lhermitte 1971). Since then, a variety of airborne Doppler radars (wavelengths from 3 mm to 5 cm) have been developed for mounting in the tail, on the side of the fuselage, in the bomb/cargo bay, or in a wing pod.
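Airborne radars are described sometimes by wavelength (the 3 mm to 5 cm range above) and sometimes by frequency (the GHz designations used for satellite-emulating radars later in this section); the two are related by f = c/λ. A minimal conversion sketch, with band labels given only as common approximate designations:

```python
# Convert radar wavelength to frequency: f = c / wavelength.
C = 299_792_458.0  # speed of light, m s^-1

def freq_ghz(wavelength_m):
    """Frequency in GHz for a given wavelength in meters."""
    return C / wavelength_m / 1e9

# The 3-mm to 5-cm span of airborne Doppler radars runs from roughly
# 94 GHz (W band, cloud radars) down to 6 GHz (C band); the common
# 3.2-cm weather radar wavelength is ~9.4 GHz (X band).
for wl_cm in (0.32, 3.2, 5.0):
    print(f"{wl_cm:5.2f} cm -> {freq_ghz(wl_cm / 100):6.1f} GHz")
```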
The modern airborne tail Doppler radar (TDR) era began in 1976 with the installation of scanning tail radars (rotating about the longitudinal axis of the fuselage) whose steerable parabolic antennas sweep out a conical surface (similar to an RHI scan) at tilt angles of ±25° from the plane normal to the fuselage and rotation speeds of up to 60° s−1. These radars were given Doppler capability in 1981 (Jorgensen et al. 1983). By flying L-shaped flight patterns (two nearly orthogonal tracks) and scanning perpendicular to the flight track, these TDRs revealed three-dimensional wind fields of tropical cyclones, convective storms, and frontal rainbands at a spatial resolution of 1 km (with a stationarity assumption of 30 min up to 2 h; e.g., Marks and Houze 1984; Hildebrand and Mueller 1985; Jorgensen et al. 1983; Lee et al. 1994a; Roux and Marks 1996). By alternating the antenna scanning fore and aft of the flight track, the fore–aft scanning technique (FAST; Jorgensen et al. 1996) allows pseudo-dual-Doppler radar data to be collected with one straight-line flight leg in half the time. Sophisticated procedures have been developed to perform data quality control (Bell et al. 2013) and remove aircraft motion from the measured Doppler velocities (Lee et al. 1994b; Testud et al. 1995; Georgis et al. 2000; Bosart et al. 2002). The two NOAA P-3 TDRs and the NOAA G-IV TDR have been essential components of the annual NOAA hurricane reconnaissance field program, where the TDR data have formed the basis for understanding hurricane kinematics and dynamics (e.g., Marks et al. 1992; Marks 2003; Lorsolo et al. 2010). They have also participated in many field campaigns on winter storms, tropical convection, supercells, squall lines, and orographic precipitation (e.g., Wakimoto et al. 1995; Wakimoto and Atkins 1996; Jorgensen et al. 1997; Jorgensen and Smull 1993; Yu et al. 2007).
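The geometry behind pseudo-dual-Doppler synthesis can be sketched as a small linear inversion: each look direction measures only the radial projection of the wind, and two quasi-orthogonal looks at the same point determine the horizontal components. The toy example below illustrates only that core 2 × 2 inversion; it deliberately ignores vertical motion, hydrometeor fall speed, and the platform-motion corrections cited above, all of which real processing must handle.

```python
import math

# Toy pseudo-dual-Doppler retrieval: two radial velocities measured at
# different horizontal look azimuths (deg, direction the beam points)
# determine the horizontal wind (u eastward, v northward).
# vr_i = u*sin(az_i) + v*cos(az_i); solve the 2x2 system by Cramer's rule.

def dual_doppler_uv(vr1, az1_deg, vr2, az2_deg):
    a1, a2 = math.radians(az1_deg), math.radians(az2_deg)
    s1, c1 = math.sin(a1), math.cos(a1)
    s2, c2 = math.sin(a2), math.cos(a2)
    det = s1 * c2 - c1 * s2          # = sin(a1 - a2); ~0 if looks align
    if abs(det) < 1e-6:
        raise ValueError("look angles too close for a stable solution")
    u = (vr1 * c2 - c1 * vr2) / det
    v = (s1 * vr2 - vr1 * s2) / det
    return u, v

# Forward-model a known wind, then invert. Orthogonal looks are used here
# for clarity; FAST fore/aft tilts give a smaller but still usable
# angular separation between beams.
u_true, v_true = 10.0, 5.0
az1, az2 = 45.0, 135.0
vr1 = u_true * math.sin(math.radians(az1)) + v_true * math.cos(math.radians(az1))
vr2 = u_true * math.sin(math.radians(az2)) + v_true * math.cos(math.radians(az2))
print(dual_doppler_uv(vr1, az1, vr2, az2))   # recovers (10.0, 5.0)
```

The conditioning of this system degrades as the beam separation shrinks, which is why flight patterns and scan geometries are designed to keep the two looks as close to orthogonal as practical.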
In the past 10 years, these real-time dual-Doppler wind fields in tropical cyclones have been transmitted to NCEP via satellite link and assimilated into hurricane models to improve hurricane intensity forecasts (Zhang and Weng 2015).
The development of the NSF/NCAR Electra Doppler Radar (ELDORA), jointly developed with France [where it is known as Analyse Stéréoscopique par Impulsions Aéroportée (ASTRAIA)], began in the mid-1980s and marked the next technological advance for the TDR (Hildebrand et al. 1994, 1996). The system adopted slotted-waveguide antennas, a high-power transmitter, an improved transmission frequency, a higher antenna rotation rate, and dual pulse repetition frequencies to improve the sampling statistics. At a typical airspeed of ~130 m s−1, the along-track resolution was ~300 m. From 1993 to its retirement in 2012, ELDORA participated in nine U.S. and international field campaigns, collected data with unprecedented spatial resolution, and produced scientific discoveries in tropical convection, supercells, clear-air boundary layers, squall lines, cold and warm fronts, orographic precipitation, and tropical cyclones (e.g., Hildebrand 1998; Wakimoto et al. 1998; Atkins et al. 1998; Wakimoto et al. 2006; Wakimoto and Bosart 2000, 2001; Bousquet and Smull 2003; Houze et al. 2007). The two NOAA P-3 TDRs were upgraded in 2016 with two solid-state transmitters and a faster antenna rotation speed of up to 120° s−1, bringing their capability close to that of the NSF/NCAR ELDORA system.
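The relationship between airspeed, antenna sweep rate, and along-track resolution quoted above can be checked with a back-of-envelope calculation: successive sweeps in a given direction are separated by the distance the aircraft flies during one sweep period. The effective sweep rate used below is an illustrative assumption, not a documented ELDORA specification.

```python
# Along-track sweep spacing of a tail Doppler radar: the distance flown
# during one full antenna revolution in a given look direction.

def along_track_spacing(airspeed_ms, sweep_rate_deg_s):
    """Spacing in meters between successive sweeps."""
    return airspeed_ms * 360.0 / sweep_rate_deg_s

# ELDORA-like case: ~130 m/s airspeed with an assumed effective sweep
# rate of ~144 deg/s gives ~325 m, consistent with the ~300-m
# along-track resolution quoted in the text.
print(along_track_spacing(130.0, 144.0))

# By contrast, an earlier single-beam TDR sweeping at 60 deg/s at the
# same airspeed would give ~780 m between sweeps.
print(along_track_spacing(130.0, 60.0))
```

This is why the faster antenna rotation rates in the 2016 P-3 upgrades translate directly into finer along-track sampling.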
NASA developed several airborne Doppler radars to emulate satellite radar systems (e.g., TRMM and later the GPM radars), which are used to deduce storm internal structures. The radars included 1) Airborne Rain-Mapping Radar (ARMAR), which is a cross-track scanning 2.2-cm radar mounted on the cargo bay of the NASA DC-8 (Durden et al. 1994); 2) the ER-2 Doppler radar (EDOP), which is a fixed-nadir and forward-looking 3.2-cm radar flown on the NASA ER-2 aircraft (Heymsfield et al. 1996); 3) the High-Altitude Imaging Wind and Rain Airborne Profiler (HIWRAP), which is a dual-wavelength (2.2 and 0.86 cm) dual-beam (incidence angles of 30° and 40°) radar that flies on the NASA Global Hawk (Guimond et al. 2014) (HIWRAP can also be flown on NASA ER-2 with only nadir-pointing capability; polarization capability is available in ARMAR and EDOP); and 4) the Airborne Second Generation Precipitation Radar (APR-2), a dual-frequency (13 and 35 GHz), Doppler, dual-polarization radar.16
At a shorter wavelength (3 mm), there are two airborne Doppler and polarimetric cloud radar systems capable of sensing the preconvective boundary layer and clouds before heavy precipitation develops. The Wyoming Cloud Radar (WCR) was developed in the mid-1990s and mounted in the cabin of the University of Wyoming King Air aircraft (Vali et al. 1998). The WCR can deduce a curtain (either horizontal or vertical) of pseudo-dual-Doppler wind vectors. The WCR can also be deployed on the NSF/NCAR C-130, but with limited pointing options. The newly developed NSF/NCAR High-Performance Instrumented Airborne Platform for Environmental Research (HIAPER) Cloud Radar (HCR; Vivekanandan et al. 2015) is mounted in a wing pod on the NSF/NCAR G-V aircraft and is capable of sensing cloud properties from altitudes of about 14 km and below, in either a vertical staring mode or a cross-track scanning mode.
These are intended only as examples of some current airborne radar systems, illustrating the utility of aircraft as remote sensing platforms. Many other remote sensors, such as lidars and passive remote sensors, are routinely deployed from aircraft (e.g., see section 4).
e. Airborne data systems
Aircraft often carry multiple types of instrumentation, requiring different recording techniques and data formats. Early digital airborne data systems (starting in the 1970s) replaced strip-chart recorders and were based on small-footprint computers, such as minicomputers, for data acquisition and processing, with tape drives for recording. These data systems served as the heart of the data handling capabilities on the aircraft, with many instruments wired directly to the data system through interface boards (e.g., analog-to-digital cards). Data systems have gradually become more compact and lighter while gaining larger storage capacity, allowing much higher sampling frequencies and a corresponding increase in the quantity of data that can be handled. The technology has evolved so that many instruments now have their own data acquisition and processing capabilities, which are connected to a central data system through high-rate local area networks. The present generation of data systems on multipurpose research aircraft must manage satellite communications with the ground; communications with many onboard instrument data systems, often including remote control of instrumentation; a wide variety of data formats; and a multitude of ground and airborne display capabilities. These features have allowed modern research aircraft deployments to include more participants on the ground than in the air, a trend that is likely to continue as UAS technology becomes more widely used, with onboard mission specialists likely reserved for nonstandard applications.
7. Societal needs and observing systems
As discussed in the introduction, atmospheric observing systems are created in response to the need to study fundamental atmospheric processes and the needs of operational meteorology (e.g., for preparing forecasts, scheduling aircraft, and snow removal). Often, these needs have resulted in highly specialized observing capabilities to support particular industries and sectors (e.g., NRC 2009, 2003). Moreover, weather data often have to be combined with industry-specific information to yield guidance for effectively dealing with weather impacts. Societal interest is also a factor. For example, interest in climate-related observing systems has certainly increased during the latter part of the past 100 years as new evidence for climate change has emerged. This section examines examples of relationships between observing systems, societal interests/needs, and science.
a. Societal interest
Observations of the environment have arisen from basic societal needs, such as protection from harm and locating food and other resources (e.g., water). Interest in exploration and basic research has also played an important role. Societal interests in the past century have grown to include observations fostering efficiency and sustainability.
As conceptually conveyed in Fig. 2-7, observations advance scientific understanding (often based on insights gained from detailed field experiments) and lead to models, which then require further validation through additional observations (the gray curve in Fig. 2-7). Societal benefits arise from predictions (green curve in Fig. 2-7) that enable effective planning to mitigate weather and climate impacts, as discussed by Ziolkowska et al. (2017). This results in a feedback loop as society invests in the scientific enterprise to increase benefits through improved observation and prediction capabilities [e.g., see Benjamin et al.’s (2019) review of weather forecasting]. Smith (2010) provides a gripping account of how advances in meteorology virtually eliminated airline crashes due to wind shear and how thousands of lives are saved by hurricane warnings. Yet quantifying the societal return on investment in science and technology is not easy and, depending on the field of research, is often not immediately apparent, as contemplated by Bornmann (2012).
Environmental observations that capture aspects of both the atmosphere and Earth’s surface (including land, ocean, and cryosphere) have become essential to sustainably satisfy basic human needs, like water (too much, not enough, and quality), food (availability and quality), health (temperature, air quality, radiation, diseases), transportation (safety and efficiency of travel on road, water, rail, and air), and energy (sources, distribution). In addition, public safety and defense are other key areas that greatly benefit from observations of the environment. Often, these societal needs have resulted in emerging, highly specialized observing capabilities to support particular industries and sectors (NRC 2003, 2009). Moreover, weather and climate data alone satisfy only part of the societal needs—these data have to be combined with sector- or industry-specific information in smart decision support capabilities to yield actionable guidance for effectively dealing with high-impact weather and climate events. What follows is meant to provide a flavor of the range of societal benefits resulting from meteorological observations rather than offer a comprehensive overview. Reference will be made to other monograph chapters for further details as warranted.
b. Water, food, and human health
Freshwater and nutrition are pillars of public health and thus among the most basic concerns of humanity. Too much water may cause flooding, while not enough water may result in drought and famine—both put lives at risk. Furthermore, a lack of quality freshwater or nutrition makes people less resilient and therefore more prone to suffering from diseases. Because of their importance, issues related to water, food, and human health are considered national security concerns (e.g., Haupt et al. 2019a, section 4).
Peters-Lidard et al. (2019) elaborate on water-related aspects across the full spectrum from catchment to global scales, including observations and modeling capabilities. Their discussion of the hydrometeorologic observing infrastructure (section 3) is comprehensive and provides examples where advanced scientific understanding has enabled many decision support capabilities, like providing alerts for imminent flash flooding or guidance related to operating dam and water reservoirs.
Another water-related activity where research and societal interest have played a large role is cloud seeding to increase precipitation or to suppress hail. The history of weather modification offers important lessons about the role of societal interest in new technologies and how this interest originates. For example, the discoveries of the glaciogenic properties of silver iodide and dry ice occurred just after the scientific successes of WWII (Fig. 2-1), during a time of optimism about what science could accomplish. Prominent scientists promoted cloud seeding as a promising new technology for controlling the weather (e.g., Strand 2015). This resulted in a long-standing interest in weather modification in various sectors of society, and numerous research programs were developed to test the various seeding hypotheses. These field projects have had major positive impacts on the field of cloud physics and on the development of instrumentation and research aircraft for measuring cloud particles (section 6), even though many of the seeding hypotheses did not work out as planned. For example, the U.S. National Hail Research Experiment, which was created in response to Soviet claims of hail suppression seeding technology, resulted in major advances in our understanding of midwestern hailstorms, even though the project failed to confirm the claims of the Soviet hail suppression methods. A thorough review of weather modification is provided in Haupt et al. (2019a, section 2).
Haupt et al. (2019c, section 2) provides a comprehensive review of the applied meteorology relevant to agriculture and food security. Meteorological information is essential for optimizing food production, from seasonal outlooks that guide decisions on what and when to plant to shorter-time-scale guidance on how much irrigation is needed. Today’s satellite observing capabilities have become a key enabler for large-scale monitoring of crop growth and health (e.g., soil moisture, temperature, leaf area index), and they are heavily leveraged by agricultural modeling and decision support applications. Detailed in situ observations relevant for agriculture are provided by several mesoscale networks (e.g., MesoWest,17 Oklahoma Mesonet18), the Soil Climate Analysis Network (SCAN; Schaefer et al. 2007), and AmeriFlux19 (Baldocchi et al. 2001), among others (NRC 2009).
Today more than half the global population lives in cities. Yet cities, with their many large buildings of varying heights, heavy traffic, and paved streets and parking areas, can create their own distinct local weather (e.g., urban heat island, changes in local precipitation patterns, elevated concentrations of gaseous pollutants and aerosols, and the channeling of wind between buildings) and respond to weather hazards (street flooding as a result of heavy rainfall), as discussed in NRC (2012). Sustained poor air quality can significantly affect human health. The physical and chemical processes that take place in the atmosphere leading to urban air quality issues are addressed by Wallington et al. (2019) and also Haupt et al. (2019b, sections 2 and 4). The dispersion of pollution and airborne toxic agents has been extensively studied based on data collected during several specialized field experiments20 (see also LeMone et al. 2019, section 10). Many urban areas include specialized observing networks for air quality monitoring, as detailed in NRC (2009, Table B2). For example, the U.S. Environmental Protection Agency (EPA), in collaboration with many partners, operates a national air quality notification and forecasting system (AIRNow21) that provides the public with easy access to air pollution data and maps, air quality forecasts, information about the effects of air pollution on public health and the environment, and actions people can take to protect their own health and reduce pollution-forming emissions (adding to the value of the observations).
In addition to improvements in geophysical observations, improving societal welfare requires interdisciplinary research on social, economic, and health activities related to climate and weather at local, regional, and global scales. For example, disease outbreaks can be significantly modulated by weather and climatic conditions, as in the case of the breeding and survival of virus-transmitting mosquitos, which depends on the location of warm and wet weather. By combining meteorological information to capture the mosquito seasonality with socioeconomic and travel factors, Monaghan et al. (2016) were able to advance the understanding of when and where the Zika virus will spread across the United States (Fig. 2-8). Interdisciplinary research like this, that effectively blends meteorological observations with other data, will provide health organizations with specialized forecasts that predict the weather conditions associated with the beginning and end of outbreaks of often-deadly diseases like Zika, dengue, meningitis, and plague among others.
The electric grid requires delicate balancing of the distributed load from various power sources, such as coal, nuclear, and renewable energy (water, wind, solar, and geothermal) plants. In 2017, 18% of all electricity in the United States was produced by renewable sources, including solar, wind, and hydroelectric dams, and this percentage continues to increase.22 Harnessing above-ground renewable energy sources depends upon understanding the potential yield from wind turbines, solar panels, and water flow, and how that may be changing over time, which requires detailed meteorological observations and location-specific prediction capabilities related to wind, solar radiation, clouds, and precipitation. In addition, the demand for energy from the power grid needs to be properly anticipated as well for the grid to remain balanced, and this demand depends on the weather. For example, urban heat waves or cold snaps significantly affect energy consumption through extensive use of air conditioning or heating, respectively. Last, extreme weather, like thunderstorms (lightning), hurricanes (wind and heavy rainfall), and ice storms, can cause failures in the power grid. Meteorological observations greatly assist in anticipating where such failures may occur and effectively deploying resources for fixing problems. A review of renewable energy applications, especially related to wind and solar energy, is provided by Haupt et al. (2019b, section 3). Societal interest in renewable sources such as wind and solar energy is likely to increase because of the air quality, public health, and greenhouse gas emission benefits they offer (e.g., Millstein et al. 2017).
Surface transportation, such as travel along highways, by rail, and on water can be heavily affected by weather, as reviewed by Haupt et al. (2019b, section 5). Examples of extreme weather impacts on the transportation system are discussed in a recent National Cooperative Highway Research Program (NCHRP) synthesis report (NCHRP 2014). It is clear from these reviews that weather has a major impact on both the safety and productivity of a wide variety of industries that rely on surface transportation. Particular challenges are related to limited visibility, strong wind (e.g., blowing over trucks, high waves as a threat to boating or wind-induced currents as a hazard for precision navigation of large vessels through narrow channels), heavy precipitation (local flooding or washouts of roads or tracks), wintry conditions (slippery roads, risk of avalanches), and extreme temperatures (highway and railway track buckling) that can significantly impair safe and expedient travel. The Road Weather Information System (RWIS)23 is an example of a specialized reporting system that provides access to Environmental Sensor Stations (ESS) that collect atmospheric and road condition data at several thousand sites across the United States. These data, often combined with additional information (e.g., webcams, traffic counts and speed), serve the specific purpose of enhancing safety of travel on highways by enabling timely alerts of hazardous driving conditions and decision guidance for winter weather road treatments. Similarly, railway operations benefit from sensors installed along the tracks to monitor the weather conditions and alert of potential hazards.
Marine transportation typically requires wind speed and direction, wave heights and direction, and tropical weather updates (NRC 2009). Except for ship reports and buoys, comparatively few surface observations are available for operational decision-making at sea. Automated reports from marine buoys (available from the National Data Buoy Center24) and satellite observations of sea state and surface winds (e.g., inferred from scatterometer data) provide situational awareness and input to forecasts. Port operations require good tidal information, water depth, wind velocity, wave heights and direction, and other information to navigate large vessels precisely and avert collisions with structures or other ships.
Nowhere is the synergy between transportation and meteorology stronger than in aviation. The rapidly growing aviation industry over the past century has significantly shaped meteorological observing systems, as chronicled in Haupt et al. (2019a, section 3). The aviation history serves as a key example of how societal needs drive advances in observing systems and, conversely, how the aviation industry and the flying public benefit from meteorological research using these observing systems. Since the dawn of human flight, observations have been essential to support a range of aviation operations, both on the ground and in the air. Specific details about the necessary observations and reporting in support of the international aviation industry can be found in Annex 3 of the Meteorological Services for International Air Navigation from the International Civil Aviation Organization (ICAO 2016). Furthermore, the textbook by Brock and Richardson (2001) provides valuable insights about meteorological measurement systems in general—their chapters 7 (wind measurements), 11 (visibility and cloud height), and 12 (upper-air measurements) are particularly relevant to aviation.
Today’s global aviation industry has come to rely on specialized forecasts of a range of weather conditions at the origin and destination of a flight several hours in advance, as well as weather updates en route, especially for long-distance, transoceanic flights between continents. Moreover, the management of air traffic flows increasingly relies on tailored weather information to achieve safe and efficient flight operations. This increasing demand for weather guidance for the aviation industry has resulted in a significant increase in weather observations aloft (including data collected by aircraft) to capture the atmospheric conditions at levels where airplanes are flying. At the same time, these upper-air observations have enabled great insights into the free atmosphere and the general circulation, which in turn advanced the ability to predict weather both near the surface and aloft (e.g., Benjamin et al. 2019).
The need and desire to fly in all weather conditions posed a great challenge to early commercial flight, and even today weather can still wreak havoc in the global aviation system. Major aircraft accidents and incidents have had a profound impact on aviation weather safety by stimulating research programs and specialized atmospheric observations (e.g., McCarthy et al. 1982; Rasmussen et al. 1992; Kulesa et al. 2003; Mecikalski et al. 2007; Haggerty and Black 2014) to understand the relevant weather phenomena and develop appropriate weather guidance toward mitigating avoidable impacts (see Haupt et al. 2019a, section 3 for more details). These research and development efforts continue to this day, as many aviation weather safety problems remain to be solved. Unexpected turbulence (whether generated naturally by various mechanisms in the atmosphere or in the wake of a nearby aircraft) injures hundreds of passengers and flight attendants every year. Engine icing (a particular problem of modern fuel-efficient engines, attributed to encounters with high concentrations of tiny ice crystals) may result in flameouts that endanger flight safety. Wintry weather causes dangerous conditions and costly flight delays. Severe thunderstorms produce hazardous winds, hail, and lightning, resulting in frequent delays and damage to aircraft every year. And new observational challenges arise with the emergence of UAS and their integration into the national airspace system.
Thunderstorms provide a range of challenges both on the ground and in the air. Radar is heavily used today to track storms; assess their structure, intensity, and evolution; identify wind-related hazards to aviation; and much more. Similarly, satellite-based imagery (and profiling) of the atmosphere at increasingly high resolution in space and time is of great utility for identifying cloud-related hazards to aviation during all phases of flight (e.g., Mecikalski et al. 2007; Schmetz and Menzel 2015). Moreover, satellites provide the backbone of weather information available to pilots and dispatchers during the execution of flights, especially long-distance transoceanic flights (Kessinger et al. 2017), significantly enhancing the short-range weather depiction enabled by the onboard radar.
Commercial aircraft have become an essential part of the atmospheric observing network, providing readings of pressure, air temperature, wind speed and direction, and increasingly also turbulence (EDR) and water vapor along the flight path. Fahey et al. (2016) provide a history of these measurements. In addition, these aircraft-based upper-air observations significantly improve numerical weather prediction, as demonstrated by Benjamin et al. (2010, 2019), Moninger et al. (2010), De Haan and Stoffelen (2012), Petersen (2016), Petersen et al. (2016), Hoover et al. (2017), and many others.
In the future, more specialized observations targeting aviation weather hazards such as turbulence and icing will be needed. More commercial aircraft will be equipped to routinely generate in situ water vapor and EDR readings and make them available in real time. With regard to icing, two particular problems require observations: airframe icing caused by large supercooled droplets (e.g., Thompson et al. 2017) and engine flameout attributed to ingestion of large amounts of tiny ice crystals (e.g., Beswick et al. 2015); both are active areas of research. More observations are also needed to address the significant risks posed by space weather: impacts on GNSS, degradation of high-frequency (HF) radio communication, and increased radiation exposure of humans during high-altitude flights (e.g., Wiltberger 2016).
New airspace users are emerging, such as commercial operators of UAS for a variety of applications (mostly focused on the lower parts of the atmosphere, although some target the upper troposphere and lower stratosphere as well). New societal interests will also affect observing system requirements, such as the concept of urban air mobility (UAM),25 the resurgence of interest in supersonic flight, and the growing demand for space launch and travel into outer space. There will be opportunities for collecting weather observations from parts of the atmosphere that are currently undersampled—for example, UAS can provide detailed in situ observations of the atmospheric boundary layer and of areas that are dangerous for manned aircraft to fly in.
Aviation is a global industry that critically depends on weather observations. The call to action (NRC 1994, 1995) for a concerted effort to improve aviation weather services through cooperation among government agencies, private weather services, research organizations, and user groups remains valid today. Moreover, connectivity today enables access to information in real time from almost anywhere. In the near future, aircraft and other transportation systems will function as connected nodes sensing and sharing information for improved safety and efficiency. The wealth of data in the data “cloud” environment is attractive for harvesting by smart data analytics and artificial intelligence to detect unusual situations, guide maintenance schedules, and serve many other beneficial applications.
e. Public safety and defense
Across many nations, efforts are geared toward preparing for and responding to the impacts of weather, water, and climate-related hazards that affect public safety, human health, the environment, the economy, and security. Key high-impact weather events of concern include major winter storms (blizzards, ice storms) and summer storms (thunderstorms), floods, tropical cyclone landfalls, heat waves, and wildfires. Addressing these hazards requires a core observing infrastructure, such as a wide array of surface stations, instrumented buoys, radiosondes, radar, satellites, and aircraft (e.g., the GOS; WMO 2017), along with other observational assets (NRC 2003, 2009). Ackerman et al. (2019, section 4) provide a history of the positive impact satellite imagery has exerted on our understanding of Earth’s atmosphere and global weather, and Ackerman et al. (2019, section 6) discuss some societal benefits of satellite observing systems. Similarly, radar networks are key for monitoring precipitating weather systems and associated hazards, as reviewed in section 4 of this chapter and in Brooks et al. (2019).
Wildfires are a serious threat to life and property. Smoke plumes produced by fires cause air quality issues, and burned areas exacerbate flash flooding and erosion. Weather information is important for determining not only when the fire danger will be high (e.g., Erickson et al. 2016; Page et al. 2018) but also how quickly a fire will spread and how dangerous conditions will be for the firefighting crews. The environmental drivers for the drying of combustible materials are related to atmospheric conditions, including temperature, humidity, precipitation, and wind, yet the near-surface conditions alone are inadequate for determining how fast a fire will spread, as large fires may evolve into complex, coupled nonlinear dynamic systems (e.g., Coen et al. 2013; Johnson et al. 2014; Peace et al. 2016). To accurately forecast wildland fires, computer models have to simulate the highly localized winds (e.g., in complex terrain) that drive the flames. Adding to the complexity, a major blaze can alter its local weather, creating winds within the fire that may be significantly stronger than those outside. These internal winds can contribute to potentially deadly accelerations, increases in intensity, unexpected shifts in direction, or splits in which the flames go in multiple directions. Haupt et al. (2019c, section 4) and a recent NRC report (NRC 2017) provide a historical perspective on wildland fire research and management. Increasingly, modern observing tools enable assessment of different vegetation and fuel attributes (e.g., by satellite remote sensing), monitoring of potential fire ignition hazards (lightning information), and examination of fire weather and its characteristics using rapidly deployable mobile platforms (e.g., Kiefer et al. 2012; Clements et al. 2018).
Since World War I, advancements in applied meteorology and climatology have been transformational for national security and defense applications, as discussed by Haupt et al. (2019a, section 4). Intelligence about weather has long been recognized as a key element of national security, which explains why in the United States each branch of the defense services has its own weather support. There are well-known examples of how weather has changed the course of history, including the defeat of the Spanish Armada by the English fleet and dreadful storms in 1588; the unsuccessful invasions of Russia by Napoleon in 1812 and later Hitler during World War II (on both occasions thwarted in part by severe winter conditions); and the now famous D-day invasion of German-occupied France, based on an outstanding weather forecast (that included German weather observations) compiled by the Allied meteorologists (Ross 2014).
f. The role of field experimentation and test beds
Atmospheric observing systems typically undergo a maturing process from an exploratory tool (for fundamental research) to an operational tool (for a specific forecasting application) that meets a societal need. Work by Fujita and colleagues (e.g., Fujita 1985, 1986; McCarthy et al. 1982; Wilson et al. 1984) illustrates this process. They used radar and mesonet observing stations extensively in field campaigns such as the Northern Illinois Meteorological Research on Downbursts (NIMROD) in 1978 and the Joint Airport Weather Studies (JAWS) in 1982 to understand the cause of dangerous wind shear near the ground, especially near airports. Their work was possible because of the availability of sophisticated research-grade observing systems that could be used to study a specific phenomenon of societal concern—wind shear. Fujita’s ability to develop a realistic conceptual model of the phenomenon was essential in interpreting the data from radar and ground-based observing systems as well as from flight recorders on aircraft. He brought the term microburst into common usage as the cause of the most dangerous wind shear phenomenon. Once the physical characteristics of downbursts/microbursts were understood, a radar observing system specifically tailored to detect hazardous wind shear near airports became possible: the Terminal Doppler Weather Radar (TDWR), now deployed at 45 airports across the United States. Together with specialized ground-based observing systems, such as the Low Level Windshear Alert System (LLWAS),26 these systems provide the backbone of observing systems for preventing wind shear aviation accidents. The education of pilots about wind shear and microbursts is based on scientific research using observing systems (NRC 1983; Wilson et al. 1984); this training has been an essential component in substantially reducing the accident rate (Sand and Biter 1997).
The approach of testing and evaluating new observing system capabilities in a test bed (Fig. 2-9) is discussed in an NRC report (NRC 2009), although the idea has been around much longer. Such a prototyping approach is also applicable to introducing new support capabilities for decision-makers, who can learn to use the new capability in a controlled environment while providing valuable user feedback. Urban test beds are now emerging, for example, in Helsinki, Finland (Koskinen et al. 2011); Shanghai, China (Tan et al. 2015); and Dallas–Fort Worth, Texas (Pulkkinen et al. 2018). Muller et al. (2013) assess the status of urban meteorological networks and examine the fundamental scientific and logistical issues related to such networks.
It is instructive to compare predictions made by the NRC (1958) with the NRC (1998) report. In NRC (1958), they state, “There is a serious danger that our desire for observational material and our ability to use such data may soon outstrip our technical capabilities and, quite likely, our economic capacity to provide them.” After 40 years of progress in creating new satellite, radar, and many other important observing systems, NRC (1998) states, “In surveying the state of basic research in weather dynamics, time after time we came to the conclusion that further progress was limited by the lack of appropriate measurement capabilities.” They further suggest a strategy of predicting how improvements in observing systems would result in better forecasts. This strategy of relating the investments in observing systems to economic benefits of better forecasts is certainly a contemporary theme in modern observing systems.
However, the needs of forecasting the weather differ from those of forecasting climate: weather forecasting depends heavily on initial conditions, whereas climate forecasting depends on long-term data. A lesson from the Keeling curve is the need for a strong and consistent method of quality control of the data that can be sustained over long periods, a point that was made in earlier studies of atmospheric CO2 by Callendar (1958) but that also applies to many long-term observing strategies, such as those conducted for the GOS of the WMO. Our understanding of the role of CO2 and other greenhouse gases relied not only on earlier work in sampling them directly, but also on careful and accurate early observing systems for infrared radiation (e.g., Callendar 1938). The strategy of long-term, sustainable ecological measurements that is the hallmark of Keeling’s Mauna Loa record has more recently been applied to a number of networked ecological observing systems, such as the Global Greenhouse Gas Reference Network;27 the Long Term Ecological Research Network (LTER), which was organized in 1980;28 the AmeriFlux network, which was established in 1996 (Boden et al. 2013); and the National Ecological Observatory Network (NEON), which is expected to enter full operations in 2018.29 The DOE ARM program utilizes a similar long-term approach to documenting the role of clouds in Earth’s radiation budget.
a. Trends of the past 100 years
In looking at the past 100 years of atmospheric observing systems and their future, several trends are evident:
Unlike in the first half of the past 100 years, a major driver of atmospheric observations in the second half has been the increasing resolution of numerical models and their need for observational data. These models are essential for transforming atmospheric observations into forecasts and assessments of the societal impacts of many Earth processes. Thus, the current trends in numerical modeling are likely to increase the demand for more and better observing systems.
Improvements in atmospheric observing systems are often tied to the needs of specific industries, such as transportation, which provide direct benefits to society. However, they often produce broader societal benefits. There is a need to better quantify the benefits of new atmospheric observing systems so that our economic capacity to provide the observations is applied to the highest priorities.
There is an increasing need to better coordinate and integrate the many observing systems, through a network of networks (NRC 2009) and similar strategies, to improve access to and the utility of atmospheric observing systems. This situation reflects the NRC (1958) prediction that there are limits to our ability to use existing data effectively. Thus, future progress in observing systems will depend on improving data access and data utility, in addition to improving the types and numbers of observing systems. This will also require continued focus on data management, including strategies for how data are collected, how they are archived, how they are accessed, and how they eventually are used.
b. Looking ahead
The WMO recognizes that rapid urbanization necessitates new types of services that make the best use of science and technology and considers the challenge of delivering these as one of the main priorities for the meteorological community (Baklanov et al. 2018). Such integrated urban weather, environment, and climate services will assist cities in facing hazards such as storm surges, flooding, heat waves, and air pollution episodes, especially in changing climates. The aim is to build smart cities whose urban services meet these special needs through a combination of dense observation networks, high-resolution forecasts, multihazard early warning systems, and climate services for reducing emissions, enabling resilient, thriving, and sustainable cities.
Miniaturization of meteorological sensors now provides low-cost consumer weather monitoring devices that can be placed almost anywhere, including on ground-based and aerial vehicles. Connected vehicles (e.g., Mahoney and O’Sullivan 2013) and crowdsourcing (Muller et al. 2015) greatly enhance access to weather-related information that has not previously been available in real time. The Internet of Things (IoT) enables access to and manipulation of data through low-power wireless communication and cloud-based storage. Chapman and Bell (2018) discuss examples that showcase the transformative potential of the IoT for observations and forecasting.
Mountains, coastlines, and cities exhibit a range of small-scale, localized weather phenomena that are largely undersampled (i.e., poorly resolved) with the present observing infrastructure, yet strong gradients in atmospheric (and chemical) variables across short distances may be of vital importance to life and property (NRC 2009). There remains a need for more (denser station spacing) and better (more frequent reporting) meso- and microscale observing networks, increased coastal and offshore observing capabilities, and better coverage of the vertical dimension (i.e., above the surface). The new technologies for transportation create opportunities for additional sampling of the lower parts of the atmosphere, as is happening in aviation, and the IoT may provide a means to access such data in real time.
Efforts to enhance coordination among networks of data providers and to facilitate wider access to those data in real time should continue as recommended by NRC (2009). Central data repositories30 can facilitate effective collection of data, quality control, and distribution for a wide range of applications.
The combination of large amounts of meteorological data with targeted information enables big data mining using artificial intelligence and deep learning to create new insights and develop smart decision support capabilities, as discussed by Haupt et al. (2019c, section 5). There may also be surprising benefits from the synergy between atmospheric observing system data and data collected for other purposes. For example, Thornton et al. (2017) compared lightning frequency from the WWLLN database with ocean shipping emissions and found a remarkable correlation between lightning frequency and emissions from ships in the eastern Indian Ocean and the South China Sea. The ocean shipping emissions inventory [from the Emissions Database for Global Atmospheric Research (EDGAR)] was made possible because of the shipboard Automatic Identification System (AIS), which was implemented for collision avoidance. Thus, a system designed for avoiding collisions at sea, when coupled with an atmospheric observing system for lightning, has provided important insights into maritime lightning.
Finally, the history of progress in atmospheric observing systems demonstrates that many scientific milestones over the past 100 years, such as our understanding of thunderstorms, to name only one example, would not have been possible without observing systems (e.g., radar) that were developed over decades and often for other purposes (e.g., detecting enemy aircraft). This scientific understanding led to new observing systems (e.g., TDWR) designed to satisfy specific societal needs, such as preventing aircraft wind shear accidents. Similar examples can be found in operational meteorology (e.g., hurricane forecasting and dropsondes), basic research (e.g., into precipitation formation with improved airborne cloud probes), and climate (long-term sampling networks). It is noteworthy that radar was not generally available to the meteorological community until after World War II. When radar technology became available, there was a healthy scientific meteorological community able to apply this new technology to the study of storms, and meteorological uses of radar rapidly followed, producing remarkable societal benefits. This example should be kept in mind as the technologies of the next 100 years unfold.
The authors are grateful to the American Meteorological Society for its support of this work. The National Center for Atmospheric Research is supported by the National Science Foundation. Additional support from NOAA (Hardesty), the University of Colorado (Pilewskie), South Dakota School of Mines and Technology (Smith), and Droplet Measurement Technology (Baumgardner) is acknowledged. The Aviation Weather Program at NCAR (Steiner) is supported by the FAA and NASA. The contributions of the coauthors in leading the writing of individual sections are recognized: J. Stith (sections 1, 6, 8, and parts of other sections), D. Lenschow (section 2), H. Vömel (section 3), P. Smith (sections 4 and 6), R. M. Hardesty (section 4), W.-C. Lee (section 6), J. Haggerty (sections 4, 5, and 6), P. Pilewskie (sections 5 and 6), D. Baumgardner (section 6), and M. Steiner (sections 7 and 8). Thanks are due to M. LeMone, J. A. Moore, A. Blyth, and two anonymous reviewers for helpful comments.