• Ackerman, T. P., and G. M. Stokes, 2003: The Atmospheric Radiation Measurement Program. Phys. Today, 56, 38–44, doi:10.1063/1.1554135.

  • Ahlgrimm, M., R. Forbes, J.-L. Morcrette, and R. Neggers, 2016: ARM’s impact on numerical weather prediction at ECMWF. The Atmospheric Radiation Measurement (ARM) Program: The First 20 Years, Meteor. Monogr., No. 57, Amer. Meteor. Soc., doi:10.1175/AMSMONOGRAPHS-D-15-0032.1.

  • Andrews, E., P. J. Sheridan, J. A. Ogren, and R. Ferrare, 2004: In situ aerosol profiles over the Southern Great Plains cloud and radiation test bed site: 1. Aerosol optical properties. J. Geophys. Res., 109, D06208, doi:10.1029/2003JD004025.

  • Bony, S., and J.-L. Dufresne, 2005: Marine boundary layer clouds at the heart of tropical cloud feedback uncertainties in climate models. Geophys. Res. Lett., 32, L20806, doi:10.1029/2005GL023851.

  • Clothiaux, E. E., M. A. Miller, B. A. Albrecht, T. P. Ackerman, J. Verlinde, D. M. Babb, R. M. Peters, and W. J. Syrett, 1995: An evaluation of a 94-GHz radar for remote sensing of cloud properties. J. Atmos. Oceanic Technol., 12, 201–229, doi:10.1175/1520-0426(1995)012<0201:AEOAGR>2.0.CO;2.

  • Clothiaux, E. E., G. G. Mace, T. P. Ackerman, T. J. Kane, J. D. Spinhirne, and V. S. Scott, 1998: An automated algorithm for detection of hydrometeor returns in micropulse lidar data. J. Atmos. Oceanic Technol., 15, 1035–1042, doi:10.1175/1520-0426(1998)015<1035:AAAFDO>2.0.CO;2.

  • Clothiaux, E. E., and Coauthors, 1999: The Atmospheric Radiation Measurement program cloud radars: Operational modes. J. Atmos. Oceanic Technol., 16, 819–827, doi:10.1175/1520-0426(1999)016<0819:TARMPC>2.0.CO;2.

  • Clothiaux, E. E., T. P. Ackerman, G. G. Mace, K. P. Moran, R. T. Marchand, M. A. Miller, and B. E. Martner, 2000: Objective determination of cloud heights and radar reflectivities using a combination of active remote sensors at the ARM CART sites. J. Appl. Meteor., 39, 645–665, doi:10.1175/1520-0450(2000)039<0645:ODOCHA>2.0.CO;2.

  • Cress, T. S., and D. L. Sisterson, 2016: Deploying the ARM sites and supporting infrastructure. The Atmospheric Radiation Measurement (ARM) Program: The First 20 Years, Meteor. Monogr., No. 57, Amer. Meteor. Soc., doi:10.1175/AMSMONOGRAPHS-D-15-0049.1.

  • Ellingson, R. G., R. D. Cess, and G. L. Potter, 2016: The Atmospheric Radiation Measurement Program: Prelude. The Atmospheric Radiation Measurement (ARM) Program: The First 20 Years, Meteor. Monogr., No. 57, Amer. Meteor. Soc., doi:10.1175/AMSMONOGRAPHS-D-15-0029.1.

  • Feingold, G., and A. McComiskey, 2016: ARM’s aerosol–cloud–precipitation research (aerosol indirect effects). The Atmospheric Radiation Measurement (ARM) Program: The First 20 Years, Meteor. Monogr., No. 57, Amer. Meteor. Soc., doi:10.1175/AMSMONOGRAPHS-D-15-0022.1.

  • Feltz, W. F., W. L. Smith, H. B. Howell, R. O. Knuteson, H. Woolf, and H. E. Revercomb, 2003: Near-continuous profiling of temperature, moisture, and atmospheric stability using the Atmospheric Emitted Radiance Interferometer (AERI). J. Appl. Meteor., 42, 584–595, doi:10.1175/1520-0450(2003)042<0584:NPOTMA>2.0.CO;2.

  • Ferrare, R. A., and Coauthors, 2004: Characterization of upper tropospheric water vapor measurements during AFWEX using LASE. J. Atmos. Oceanic Technol., 21, 1790–1808, doi:10.1175/JTECH-1652.1.

  • Ghan, S., and J. Penner, 2016: ARM-led improvements in aerosols in climate and climate models. The Atmospheric Radiation Measurement (ARM) Program: The First 20 Years, Meteor. Monogr., No. 57, Amer. Meteor. Soc., doi:10.1175/AMSMONOGRAPHS-D-15-0033.1.

  • Haeffelin, M., and Coauthors, 2016: Parallel developments and formal collaboration between European atmospheric profiling observatories and the U.S. ARM research program. The Atmospheric Radiation Measurement (ARM) Program: The First 20 Years, Meteor. Monogr., No. 57, Amer. Meteor. Soc., doi:10.1175/AMSMONOGRAPHS-D-15-0045.1.

  • Kollias, P., and Coauthors, 2016: Development and applications of ARM millimeter-wavelength cloud radars. The Atmospheric Radiation Measurement (ARM) Program: The First 20 Years, Meteor. Monogr., No. 57, Amer. Meteor. Soc., doi:10.1175/AMSMONOGRAPHS-D-15-0037.1.

  • Krueger, S. K., H. Morrison, and A. M. Fridlind, 2016: Cloud-resolving modeling: ARM and the GCSS story. The Atmospheric Radiation Measurement (ARM) Program: The First 20 Years, Meteor. Monogr., No. 57, Amer. Meteor. Soc., doi:10.1175/AMSMONOGRAPHS-D-15-0047.1.

  • Long, C. N., and Y. Shi, 2008: An automated quality assessment and control algorithm for surface radiation measurements. Open Atmos. Sci. J., 2, 23–37, doi:10.2174/1874282300802010023.

  • Long, C. N., J. H. Mather, and T. P. Ackerman, 2016: The ARM Tropical Western Pacific (TWP) sites. The Atmospheric Radiation Measurement (ARM) Program: The First 20 Years, Meteor. Monogr., No. 57, Amer. Meteor. Soc., doi:10.1175/AMSMONOGRAPHS-D-15-0024.1.

  • Marchand, R., 2016: ARM and satellite cloud validation. The Atmospheric Radiation Measurement (ARM) Program: The First 20 Years, Meteor. Monogr., No. 57, Amer. Meteor. Soc., doi:10.1175/AMSMONOGRAPHS-D-15-0038.1.

  • Mather, J. H., and J. W. Voyles, 2013: The ARM climate research facility: A review of structure and capabilities. Bull. Amer. Meteor. Soc., 94, 377–392, doi:10.1175/BAMS-D-11-00218.1.

  • Mather, J. H., D. D. Turner, and T. P. Ackerman, 2016: Scientific maturation of the ARM Program. The Atmospheric Radiation Measurement (ARM) Program: The First 20 Years, Meteor. Monogr., No. 57, Amer. Meteor. Soc., doi:10.1175/AMSMONOGRAPHS-D-15-0053.1.

  • McComiskey, A., and R. A. Ferrare, 2016: Aerosol physical and optical properties and processes in the ARM Program. The Atmospheric Radiation Measurement (ARM) Program: The First 20 Years, Meteor. Monogr., No. 57, Amer. Meteor. Soc., doi:10.1175/AMSMONOGRAPHS-D-15-0028.1.

  • McCord, R., and J. W. Voyles, 2016: The ARM data system and archive. The Atmospheric Radiation Measurement (ARM) Program: The First 20 Years, Meteor. Monogr., No. 57, Amer. Meteor. Soc., doi:10.1175/AMSMONOGRAPHS-D-15-0043.1.

  • McFarlane, S. A., J. H. Mather, and E. J. Mlawer, 2016: ARM’s progress on improving atmospheric broadband radiative fluxes and heating rates. The Atmospheric Radiation Measurement (ARM) Program: The First 20 Years, Meteor. Monogr., No. 57, Amer. Meteor. Soc., doi:10.1175/AMSMONOGRAPHS-D-15-0046.1.

  • Michalsky, J., F. Denn, C. Flynn, G. Hodges, P. Kiedron, A. Koontz, J. Schemmer, and S. E. Schwartz, 2010: Climatology of aerosol optical depth in north-central Oklahoma: 1992–2008. J. Geophys. Res., 115, D07203, doi:10.1029/2009JD012197.

  • Miller, M. A., and A. Slingo, 2007: The ARM Mobile Facility and its first international deployment. Bull. Amer. Meteor. Soc., 88, 1229–1244, doi:10.1175/BAMS-88-8-1229.

  • Miller, M. A., K. Nitschke, T. P. Ackerman, W. R. Ferrell, N. Hickmon, and M. Ivey, 2016: The ARM mobile facilities. The Atmospheric Radiation Measurement (ARM) Program: The First 20 Years, Meteor. Monogr., No. 57, Amer. Meteor. Soc., doi:10.1175/AMSMONOGRAPHS-D-15-0051.1.

  • Mlawer, E. J., and D. D. Turner, 2016: Spectral radiation measurements and analysis in the ARM Program. The Atmospheric Radiation Measurement (ARM) Program: The First 20 Years, Meteor. Monogr., No. 57, Amer. Meteor. Soc., doi:10.1175/AMSMONOGRAPHS-D-15-0027.1.

  • Peppler, R., K. Kehoe, J. Monroe, A. Theisen, and S. Moore, 2016: The ARM data quality program. The Atmospheric Radiation Measurement (ARM) Program: The First 20 Years, Meteor. Monogr., No. 57, Amer. Meteor. Soc., doi:10.1175/AMSMONOGRAPHS-D-15-0039.1.

  • Randall, D. A., A. D. Del Genio, L. J. Donner, W. D. Collins, and S. A. Klein, 2016: The impact of ARM on climate modeling. The Atmospheric Radiation Measurement (ARM) Program: The First 20 Years, Meteor. Monogr., No. 57, Amer. Meteor. Soc., doi:10.1175/AMSMONOGRAPHS-D-15-0050.1.

  • Schmid, B., R. G. Ellingson, and G. M. McFarquhar, 2016: ARM aircraft measurements. The Atmospheric Radiation Measurement (ARM) Program: The First 20 Years, Meteor. Monogr., No. 57, Amer. Meteor. Soc., doi:10.1175/AMSMONOGRAPHS-D-15-0042.1.

  • Shupe, M. D., J. M. Comstock, D. D. Turner, and G. G. Mace, 2016: Cloud property retrievals in the ARM Program. The Atmospheric Radiation Measurement (ARM) Program: The First 20 Years, Meteor. Monogr., No. 57, Amer. Meteor. Soc., doi:10.1175/AMSMONOGRAPHS-D-15-0030.1.

  • Sisterson, D., R. Peppler, T. S. Cress, P. Lamb, and D. D. Turner, 2016: The ARM Southern Great Plains (SGP) site. The Atmospheric Radiation Measurement (ARM) Program: The First 20 Years, Meteor. Monogr., No. 57, Amer. Meteor. Soc., doi:10.1175/AMSMONOGRAPHS-D-16-0004.1.

  • Stamnes, K., R. G. Ellingson, J. A. Curry, J. E. Walsh, and B. D. Zak, 1999: Review of science issues, deployment strategy, and status for the ARM North Slope of Alaska–Adjacent Arctic Ocean climate research site. J. Climate, 12, 46–63, doi:10.1175/1520-0442-12.1.46.

  • Stephens, G., and Coauthors, 2000: The Department of Energy’s Unmanned Aerospace Vehicle (UAV) Program. Bull. Amer. Meteor. Soc., 81, 2915–2938, doi:10.1175/1520-0477(2000)081<2915:TDOESA>2.3.CO;2.

  • Stokes, G. M., 2016: Original ARM concept and launch. The Atmospheric Radiation Measurement (ARM) Program: The First 20 Years, Meteor. Monogr., No. 57, Amer. Meteor. Soc., doi:10.1175/AMSMONOGRAPHS-D-15-0021.1.

  • Turner, D. D., and Coauthors, 2004: The QME AERI LBLRTM: A closure experiment for downwelling high spectral resolution infrared radiance. J. Atmos. Sci., 61, 2657–2675, doi:10.1175/JAS3300.1.

  • Turner, D. D., and Coauthors, 2007a: Thin liquid water clouds: Their importance and our challenge. Bull. Amer. Meteor. Soc., 88, 177–190, doi:10.1175/BAMS-88-2-177.

  • Turner, D. D., S. A. Clough, J. C. Liljegren, E. E. Clothiaux, K. E. Cady-Pereira, and K. L. Gaustad, 2007b: Retrieving liquid water path and precipitable water vapor from the Atmospheric Radiation Measurement (ARM) microwave radiometers. IEEE Trans. Geosci. Remote Sens., 45, 3680–3689, doi:10.1109/TGRS.2007.903703.

  • Turner, D. D., E. J. Mlawer, and H. E. Revercomb, 2016: Water vapor observations in the ARM Program. The Atmospheric Radiation Measurement (ARM) Program: The First 20 Years, Meteor. Monogr., No. 57, Amer. Meteor. Soc., doi:10.1175/AMSMONOGRAPHS-D-15-0025.1.

  • U.S. Department of Energy, 1990: Atmospheric Radiation Measurement Program Plan. DOE/ER-0441, 121 pp.

  • U.S. Department of Energy, 1996: Science Plan for the Atmospheric Radiation Measurement (ARM) Program. DOE/ER-0670T, 74 pp.

  • U.S. Department of Energy, 2004: Atmospheric Radiation Measurement Program Science Plan. DOE/ER-0402, 62 pp.

  • U.S. Department of Energy, 2007: Report on the ARM Climate Research Facility Expansion Workshop. DOE/SC-ARM-0707, 50 pp.

  • U.S. Department of Energy, 2008: ARM Climate Research Facility Workshop Report. DOE/SC-ARM-0804, 23 pp.

  • U.S. Department of Energy, 2010: Atmospheric System Research (ASR) Science and Program Plan. DOE/SC-ASR-10-001, 77 pp. [Available online at http://science.energy.gov/~/media/ber/pdf/Atmospheric_system_research_science_plan.pdf.]

  • U.S. Department of Energy, 2014: Atmospheric Radiation Measurement Climate Research Facility Decadal Vision. DOE/SC-ARM-14-029, 21 pp.

  • Uttal, T., and Coauthors, 2002: Surface heat budget of the Arctic Ocean. Bull. Amer. Meteor. Soc., 83, 255–275, doi:10.1175/1520-0477(2002)083<0255:SHBOTA>2.3.CO;2.

  • Verlinde, J., and Coauthors, 2007: The Mixed-Phase Arctic Cloud Experiment (M-PACE). Bull. Amer. Meteor. Soc., 88, 205–221, doi:10.1175/BAMS-88-2-205.

  • Verlinde, J., B. Zak, M. D. Shupe, M. Ivey, and K. Stamnes, 2016: The ARM North Slope of Alaska (NSA) sites. The Atmospheric Radiation Measurement (ARM) Program: The First 20 Years, Meteor. Monogr., No. 57, Amer. Meteor. Soc., doi:10.1175/AMSMONOGRAPHS-D-15-0023.1.

  • Vogelmann, A. M., and Coauthors, 2012: RACORO extended-term aircraft observations of boundary layer clouds. Bull. Amer. Meteor. Soc., 93, 861–878, doi:10.1175/BAMS-D-11-00189.1.

  • Voyles, J. W., and L. A. Chapman, 2012: Field campaign guidelines. DOE/SC-ARM-11-003, 20 pp.

  • Westwater, E. R., B. B. Stankov, D. Cimini, Y. Han, J. A. Shaw, B. M. Lesht, and C. N. Long, 2003: Radiosonde humidity soundings and microwave radiometers during Nauru99. J. Atmos. Oceanic Technol., 20, 953–971, doi:10.1175/1520-0426(2003)20<953:RHSAMR>2.0.CO;2.

  • Wulfmeyer, V., and Coauthors, 2011: The Convective and Orographically Induced Precipitation Study (COPS): The scientific strategy, the field phase, and first highlights. Quart. J. Roy. Meteor. Soc., 137, 3–30, doi:10.1002/qj.752.

  • Xie, S., and Coauthors, 2010: Clouds and more: ARM climate modeling best estimate data. Bull. Amer. Meteor. Soc., 91, 13–20, doi:10.1175/2009BAMS2891.1.

  • Zhang, M., and J. L. Lin, 1997: Constrained variational analysis of sounding data based on column-integrated budgets of mass, heat, moisture, and momentum: Approach and application to ARM measurements. J. Atmos. Sci., 54, 1503–1524, doi:10.1175/1520-0469(1997)054<1503:CVAOSD>2.0.CO;2.

  • Zhang, M., R. C. J. Somerville, and S. Xie, 2016: The SCM concept and creation of ARM forcing datasets. The Atmospheric Radiation Measurement (ARM) Program: The First 20 Years, Meteor. Monogr., No. 57, Amer. Meteor. Soc., doi:10.1175/AMSMONOGRAPHS-D-15-0040.1.

  • Figure: Location of the permanent ARM sites, ARM Mobile Facility deployments, and field campaigns as of 2012.

  • Figure: Cumulative growth in files and megabytes requested from the ARM Data Archive.


The Programmatic Maturation of the ARM Program

  • 1 Joint Institute for the Study of the Atmosphere and Ocean, University of Washington, Seattle, Washington
  • 2 Pacific Northwest National Laboratory, Richland, Washington
  • 3 U.S. Department of Energy, Germantown, Maryland
  • 4 Pacific Northwest National Laboratory, Richland, Washington
  • 5 NOAA/National Severe Storms Laboratory, Norman, Oklahoma

Corresponding author address: Thomas P. Ackerman, JISAO, University of Washington, 3737 Brooklyn Ave. NE, Seattle, WA 98105. E-mail: tpa2@uw.edu

Retired.


1. Introduction

The early years of the Atmospheric Radiation Measurement (ARM) Program (see Stokes 2016, chapter 2) were devoted to the establishment of ground-based remote sensing sites at the Southern Great Plains (SGP), Tropical Western Pacific (TWP), and North Slope of Alaska (NSA; see Sisterson et al. 2016, chapter 6; Long et al. 2016, chapter 7; Verlinde et al. 2016, chapter 8). The ARM Program focused a great deal of its activity on site selection, instrument choices, site development, data management, and basic software development and implementation, as described in Cress and Sisterson (2016, chapter 5). The period around 1998 marked a transition for ARM. The SGP site was largely completed (as initially conceived), the TWP Manus and NSA Barrow sites were up and running, and the TWP Nauru site was about to be installed. Data were being collected and archived routinely, and data rates were increasing exponentially from year to year.1

During this period, the Science Team was actively engaged in learning how to use the available and expected data effectively. Initially, the ARM Science Team was organized around two broad research themes: the instantaneous radiative flux (IRF) concept and the single-column model (SCM) approach. The IRF approach was derived from the SPECTRE experience (Ellingson et al. 2016, chapter 1) and focused on measuring all the radiatively active components in an atmospheric vertical column, computing downwelling radiative fluxes from these components, and then asking whether the computations matched observations. The SCM approach attempted to represent experimentally a single grid column in a climate model, including the advective forcing and the clouds generated in such a column. It became apparent that both of these concepts required expansion and redefinition. The IRF concept worked well for noncloudy skies but had significant problems in cloudy skies because the cloud radiative properties could not be adequately summarized from the measurements alone. Similarly, ground-based measurements of aerosol optical depth were insufficient to resolve the impact of aerosols on solar fluxes. Boundary conditions for the SCM were a significant challenge, as was determining the area-averaged cloud properties for comparison with model output. In addition, there was a perceived need for higher-resolution (2D and 3D) cloud modeling.

Technological progress also had a significant impact on the ARM Program and science during this early period. Electronic data transmission was in its infancy, with low transmission rates and high costs, especially for large data files from relatively remote sites. Thus, data were transported on physical media, often with relatively long delays. Computers and storage media were expensive and limited by today’s standards. During the 1990s, computers and the Internet were expanding rapidly, which required program management to revise and update plans continually. This created both the opportunity to reduce costs as computer prices fell and the demand to purchase new systems with greater capability.

By the late 1990s, ARM had grown from an initial vision into a vigorous and dynamic program. It had expanded, however, to the limits of its financial resources, in part because the proposed budget was never actually allocated. The completion of the initial site development, the increased availability of ARM data, and the expansion of the Science Team in new directions began to shift management needs and interests. In the early stages, ARM project management was carried out largely by the original Chief Scientist, Gerry Stokes, and the technical management at the Pacific Northwest National Laboratory (PNNL) along with advice and guidance from the Science Team Executive Committee (STEC), which comprised senior members of the ARM Science Team (Mather et al. 2016, chapter 4).

In 1999, Gerry Stokes left the program for a different position, and Thomas Ackerman was hired as the ARM Chief Scientist.2 At the same time, the Department of Energy (DOE) commissioned the Chief Scientist and the STEC to assess ARM accomplishments, evaluate the science direction of ARM, identify areas that required improvement, and prepare a revised science plan for ARM. The goal of these activities was to create a stable model for the ARM Program as it transitioned from a developing to a mature program.

The planning for a mature ARM Program occurred under the constraints of a fixed budget and a time horizon of roughly another decade of data collection. The ARM Program received its first funding in 1990, but its funding profile plateaued after a few years and remained relatively flat for the remainder of the decade. The budget was sufficient to establish only three fixed locales (SGP, TWP, and NSA), and expansion plans for both the TWP and NSA were curtailed. The initial program plan had called for a decade of data collection; substantive data collection, however, started in 1996 with the deployment of the millimeter cloud radar (MMCR; Kollias et al. 2016, chapter 17) at the SGP and the establishment of the TWP Manus site, followed by the NSA Barrow site by 1998. Science and budget realities forced the ARM management to find operational efficiencies, increase synergy across the program, and shift research emphases to focus on critical science issues.

2. The ARM questions

The fundamental goal of the ARM Program was to improve cloud parameterizations in global climate models (GCMs) through improved understanding of cloud and radiation processes obtained from a combination of modeling and data analysis (Mather et al. 2016, chapter 4). In 1999, the DOE program management asked the Chief Scientist to assess how well the ARM Program was doing in answering the ARM questions. To a large degree, this question was concerned with process. The question was not whether the program had obtained the answers but whether the program scientists had the tools and structure required to seek the answers. The obvious follow-on was then to decide what could be done to improve the tools, structure, and process.

The ARM Chief Scientist summarized the initial results of this analysis at the 1999 ARM Science Team Meeting. Assessments included a qualitative scoring as to whether a dataset or process understanding was 1) sufficient for ARM purposes, 2) good but requiring some further work, 3) reasonably well understood but requiring substantial work, or 4) important but lacking a well-conceived idea of how to proceed. It is worth a brief look at this assessment to understand what drove ARM development over roughly the next decade and where progress has been made.

Clear-sky3 radiative flux understanding was deemed to be largely sufficient in the infrared but in need of some additional work on the solar side. Cloudy-sky radiative flux understanding was likewise deemed to be largely sufficient in the infrared but in need of research on aspects of broken cloudiness. Interestingly, solar radiative transfer on the grid box or SCM domain was placed in the last category: important but lacking good ideas on how to proceed. ARM continued to carry out research on the infrared problem (see Mlawer and Turner 2016, chapter 14), and one of its huge successes has been the resolution of many of the questions associated with this issue.

Water vapor column and liquid water path measurements were deemed to be sufficient as well. This was largely the result of considerable improvement in microwave radiometry (see Turner et al. 2016, chapter 13) brought about by ARM research and partnership with manufacturers of microwave radiometers. Water vapor profiling in the lower atmosphere was deemed sufficient, but much less so in the upper troposphere. Ice water path and its spatial distribution were both placed in the fourth category. The recently deployed scanning millimeter radars (Mather and Voyles 2013) may finally bring some much needed clarity to this latter question.

Cloud retrievals from ground-based sensors were in their infancy at this point and were generally considered to be in the bottom two categories (categories 3 and 4). The only exception was stratus clouds, which were placed in category 2, largely because of an increased ability to measure liquid water path and retrieve optical depth values. ARM research on retrievals is another area of huge success (Shupe et al. 2016, chapter 19), including extensions to satellite instruments (Marchand 2016, chapter 30).

SCM investigations required a set of both surface fluxes and lateral boundary fluxes (Zhang et al. 2016, chapter 24). Radiative fluxes were deemed largely sufficient. Temperature and water fluxes were seen as good, but requiring further work at the SGP site. For the TWP and NSA sites, however, these fluxes were placed in category 4, a reflection of the fact that these sites were located in areas where there were limited options for carrying out extended site measurements. The final area, parameterization testing and development, was deemed to be largely in category 4. This was and is an ongoing issue for ARM.

This assessment and the discussions that followed led to the identification of several issues/questions that shaped the direction of the ARM Program over the next decade: 1) data quality and continuity, 2) data fusion and value-added products, 3) intensive observing period (IOP) process and management, 4) ARM Mobile Facilities, 5) Science Team refocusing, and 6) an improved parameterization development process.

Intertwined throughout these issues was the need for instrument development. The long-term focus of the ARM Program allowed scientists time to analyze and critique data streams, which in turn led to identification of issues and proposed improvements to existing instruments or new instruments. Specific examples of this process are provided elsewhere in this monograph.4 The constrained resources of the program during this period, however, required management to weigh instrument priorities and find innovative ways of partnering with other agencies and private companies.

The bulk of this article is devoted to a discussion of how these issues molded ARM Program development over the subsequent decade (1999–2009). For reasons of clarity, we have chosen to discuss them largely as separate themes, but they were, and are, clearly interconnected. This connectivity required ARM management to engage in a continuous trade-off between recognized and defined program needs and programmatic and financial feasibility. Setting priorities and making decisions were difficult and sometimes contentious tasks (e.g., deciding the relative priority of repairing existing instruments and/or purchasing new instruments requested by the different working groups). One of the real hallmarks of the ARM Program during this period, however, was a genuine and thoughtful partnership among program management, infrastructure personnel, and Science Team members. The success of the ARM Program was built largely upon this partnership and acceptance of shared sacrifices to accomplish a desired end, namely the advancement of ARM science.

a. Program organization and evaluations

Before turning to the issues identified above, we think it useful to review briefly the organizational structure of the ARM Program and the program evaluations that were carried out during its formative years. In February 1996, the ARM Program released a Science Plan, written primarily by the STEC (U.S. Department of Energy 1996; see appendix B therein). The 1996 program plan provides a view of the ARM Program as it prepared to launch into a fully operational phase. The science strategy focused largely on two problems, the IRF problem (question 1 above) and the SCM approach (question 2), and led to the formation of two working groups within the ARM Science Team to address these issues (Mather et al. 2016, chapter 4). This Science Plan, along with several program evaluations, laid the foundation for the next phase of the program.

Early on, as mentioned by Stokes (2016, chapter 2), the ARM Program management turned to a group of prominent scientists, known as the JASONS, for a periodic review of the Program. One of their earliest reports on the ARM Program appeared in 1995 and is fairly brief. In 1997, the Washington Advisory Group (WAG, chaired by Robert M. White) provided a program review that was quite detailed in its recommendations. Both of these reports are generally laudatory reviews of ARM Program progress and stature within the research community, and both note the pressures inherent in maintaining sustained funding and suggest the need to address this problem. Both reports also comment on the important role of intensive field campaigns and the need to manage these activities, a subject we address below. It is interesting that this tension regarding support for continuous ground-based measurements, routine in situ observations, and/or intensive campaigns has existed from the very beginning of the ARM Program and continues through to the present. The WAG report also identified the need for maintaining data integrity including calibration and data management, another issue of continuing importance to ARM.

The reports, the ongoing concerns of the Science Team about science management, and the appointment of a new ARM Chief Scientist in 1999 led to a reorganization of the Science Team and the Science Team Executive Committee around 2000–01. By now, the STEC representation had been built around four working group themes: two large groups focused on the IRF and on cloud parameterization and modeling, and two smaller groups focused on cloud properties and on aerosols (see Fig. 4-1 in Mather et al. 2016, chapter 4). The growth of ARM science and the associated scientific community made these groups too large for easy communication and sometimes unresponsive to new needs. The STEC appointed formal working group steering committees for each of these four groups with the intent of distributing work and increasing communication. In addition, it provided a mechanism for the creation of smaller, temporary working groups for specific projects, particularly those that cut across the four major groups. This structure worked well and remained in place for much of the next decade.

From a data and data flow point of view, the computational component of the ARM Program was also undergoing significant change. As discussed by Stokes (2016, chapter 2), the Experiment Center functioned as the data acquisition focal point for the program, acquiring data from the ARM sites as well as external sources. Early on, the Experiment Center was tasked with virtually all aspects of data handling (with the exception of data archival), including ascertaining data continuity and quality. As data streams from sophisticated instruments began to arrive and technically involved data processing and quality control became necessary, this arrangement had to evolve. This became a key element of the evolution of the program and is discussed in detail below and in McCord and Voyles (2016, chapter 11).

DOE program management also experienced some significant changes during this period. The growth of the program required the addition of management staff. In the late 1990s, management responsibilities were divided principally between the Science Team management under the direction of Pat Crowley and the ARM infrastructure under the direction of Wanda Ferrell. With the sudden retirement of Crowley in 2000, Ferrell inherited responsibility for the management of the entire ARM Program, a very challenging endeavor that she then managed for much of the next decade.

The Chief Scientist and the reorganized STEC began the process of developing a revised Science Plan in about 2001 and produced a final version of that new plan in 2004 (U.S. Department of Energy 2004; see appendix C therein). It provided the blueprint for ARM development from the early 2000s to the formation of the Atmospheric System Research (ASR) program (described later in this chapter). The 2004 plan identified a set of key science goals:

  • Maintain the data record at the remote sites at least through the next 5-yr period.

  • Significantly improve our understanding of, and ability to parameterize, the 3D cloud-radiation problem at scales from the local atmospheric column to the GCM grid square.

  • Develop new techniques to retrieve the properties of ice clouds and mixed-phase clouds and thereby improve our understanding of the life cycle processes in these clouds and their interaction with atmospheric radiation (see Shupe et al. 2016, chapter 19).

  • Develop a focused research effort on the indirect aerosol problem that spans observations, physical models, and climate model parameterizations (see Feingold and McComiskey 2016, chapter 22; Ghan and Penner 2016, chapter 27).

  • Implement and evaluate an operational methodology to calculate broadband heating rates in atmospheric columns at the ARM sites (see McFarlane et al. 2016, chapter 20).

  • Develop and implement methodologies to use ARM data more effectively to confront atmospheric models, both at the cloud-resolving model (CRM) and the GCM scale (see Zhang et al. 2016, chapter 24; Krueger et al. 2016, chapter 25; Randall et al. 2016, chapter 26).

These goals have guided the program for a decade. This chapter focuses on the program developments that were intended to enable the research community to address these important questions.

b. Data quality and continuity

From an observational perspective, the conceptual vision of ARM was to define a suite of necessary ground-based measurements to address the ARM science questions, deploy instruments that could make these measurements, and then continuously collect data of known quality. Not surprisingly, translating this vision into reality was far more complicated than most of the ARM community realized at the beginning of the program. In some cases, definition of needed measurements led to a call for new instruments; however, for both existing and new instruments, the difficulties centered on the twin issues of data continuity and quality.

Data continuity was fairly easy to obtain for measurements of standard meteorological quantities such as surface meteorology and broadband radiation. Achieving data continuity for more complex instruments was hampered by the fact that many of these instruments were not commercially available and had never been deployed for continuous measurements or, in some cases, instruments for some required observations did not exist (or existed only in some relatively primitive state). The ARM-related histories of specific instruments (e.g., microwave radiometers, infrared interferometers, Raman lidar, and cloud radars) are discussed elsewhere in this monograph; here we comment only on the broad ARM approach.

Early on, the ARM management recognized the need to have instrument experts available to the program and devised the idea of instrument mentors (Stokes 2016, chapter 2). Each instrument (or instrument class) was intended to have a mentor who would be supported for some fraction of his/her time out of the ARM infrastructure budget to monitor instrument performance and be available to consult with on-site staff when instrument problems arose. This approach seemed particularly well suited to the ARM concept of deploying similar instruments at multiple sites. The instrument mentor approach worked well, except for cost. As the ARM Program moved toward maturity, both the number of different instruments and the number of each type of instrument grew, but the pot of money available for mentors did not. The obvious consequence was that instrument mentoring suffered, producing corresponding problems with data continuity and quality.

The data continuity problem gradually sorted itself out in various ways. Some instruments were simply dropped from the ARM suite because they were deemed too unreliable and/or expensive to maintain. This instrument triage was often painful because it required an explicit statement of ARM research priorities and an implicit rejection of certain types of research through instrument decommissioning (e.g., the decommissioning of the whole sky imager because it never lived up to its goal of routinely providing automated cloud-cover estimates without substantial human interaction). Other instruments moved from prototypes to more reliable production models supported in large part by DOE investments directly through the ARM Program or by Small Business Innovation Research (SBIR) initiatives (e.g., the GVR and GVRP, which are microwave radiometers that observe downwelling radiance around the 183.3-GHz water vapor line). Other instruments (both hardware and software components) went through an evolutionary development based on science and engineering efforts supported within the ARM Program and through external contracts.

The instrument evolutionary development efforts, formalized early in the program as the Instrument Development Program, are among the great successes of the ARM Program. Scientists funded by ARM used portions of their research efforts to diagnose problems and investigate possible solutions. Engineering staff, funded by the ARM infrastructure, consulted freely with the scientists and invested precious time and effort in carrying out specific instrument tests and operational procedures as part of the development. Program management supported both groups in these activities because of the need to solve the instrument problems. We think it fair to say that failure to solve some of these really difficult instrument problems would have produced a far smaller and less useful ARM Program.

ARM management and scientists recognized from the very beginning that data quality and assessment was a critical element of the program but it took some time to find the right approach. In the early days, ARM tried a two-pronged approach that combined examination of data streams by instrument mentors and scientific users (primarily the site scientist teams). This approach had the virtue of relatively low cost but ultimately failed because the process was too uneven. The analogy used at the time (probably based on looking at agricultural areas around the SGP site) was that ARM had a giant field of data that required plowing and some areas were plowed deeply while in other areas the surface was barely scratched. The solution was the establishment of a data quality office at the University of Oklahoma in 2000. The office was charged with implementing data assessment procedures that could identify data quality issues in a timely fashion and then communicating that information to the appropriate operations and engineering personnel. In addition, the office served and serves as a point of contact for data quality issues raised by data users in the scientific community. [See Peppler et al. (2016, chapter 12) for more details about the Data Quality Office.]

The establishment of the Data Quality Office was extremely important for the long-term health of the ARM Program. The Data Quality Office now serves as a crucial link between site operations, the ARM Data Archive, and the user community. Office personnel devise quantitative tests of data quality and incorporate information on data quality in the ARM Data Archive through data flags and metadata. They respond to inquiries from users about data quality and track these questions to resolution whenever possible. By formalizing this process, ARM has created a “uniformly plowed” field. This is not to say that there are no data problems; it is almost impossible to envisage a program the size of ARM that has no data issues. (Today ARM operates over 350 instruments that produce over 1500 unique data streams across six ground-based facilities.) The Data Quality Office, however, has provided ARM with a mechanism to reduce data problems through regular, consistent, and careful examination of the data.
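
The flavor of these automated checks can be illustrated with a minimal sketch in the spirit of the surface-radiation quality tests of Long and Shi (2008). The variable names, limits, and flag values below are hypothetical stand-ins for the configurable, per-instrument tests the Data Quality Office maintains; this is not the operational ARM code.

```python
# Minimal sketch of an automated range/persistence check for a measurement
# time series. All names, limits, and flag conventions are illustrative.
import numpy as np

QC_OK = 0          # hypothetical bitwise flag values
QC_BELOW_MIN = 1   # value below plausible minimum
QC_ABOVE_MAX = 2   # value above plausible maximum
QC_STUCK = 4       # value repeated too many times (possible stuck sensor)

def qc_flags(values, vmin, vmax, max_repeats=10):
    """Return a bitwise QC flag for each sample in a 1D time series."""
    values = np.asarray(values, dtype=float)
    flags = np.full(values.shape, QC_OK, dtype=int)
    flags[values < vmin] |= QC_BELOW_MIN
    flags[values > vmax] |= QC_ABOVE_MAX

    # Flag runs of identical values longer than max_repeats.
    run = 1
    for i in range(1, values.size):
        run = run + 1 if values[i] == values[i - 1] else 1
        if run > max_repeats:
            flags[i] |= QC_STUCK
    return flags

# Example: downwelling shortwave irradiance (W m-2) with illustrative limits.
swdn = np.array([0.0, 5.2, 310.4, 1450.0, -12.0, 287.3])
print(qc_flags(swdn, vmin=-4.0, vmax=1400.0))  # -> [0 0 0 2 1 0]
```

In practice, flags such as these are stored alongside the data as companion quality-control fields and metadata rather than used to delete samples, so users can apply their own tolerance for suspect data.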

c. Data fusion and value-added products

The initial focus of the ARM Program was to obtain simultaneous datasets at a single site from multiple instruments. Within a very short time, the focus broadened to include the idea of data fusion in two distinct ways. The first is that multiple instruments may be measuring the same or very closely related atmospheric quantities, so these measurements should be merged into a single estimate of that atmospheric quantity. For example, water vapor is a critical component of the atmosphere and many investigations require water vapor profiles (e.g., water vapor concentration as a function of height). The ARM data suite includes water vapor measurements made by conventional radiosondes, microwave radiometers, infrared radiometers, Raman lidar, and instruments at the surface and on meteorological towers. Each of these data streams has strengths and weaknesses, and a highly sophisticated user may be aware of them. For most users, however, acquiring that level of understanding takes too much time and is not a productive use of precious research effort. The question then is how to provide a merged (or fused) dataset for users that represents an optimal data product.
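
One common way to think about this first kind of fusion is an uncertainty-weighted combination of overlapping estimates. The sketch below assumes each instrument supplies a value and a 1-sigma uncertainty for the same quantity; it illustrates the general idea of a "best estimate" rather than the algorithm of any specific ARM product, and the instrument values and uncertainties are invented.

```python
# Sketch of merging overlapping estimates of the same quantity by
# inverse-variance weighting. Illustrative only; not an ARM VAP algorithm.
import numpy as np

def merge_estimates(values, sigmas):
    """Combine independent estimates with 1-sigma uncertainties.

    Returns the inverse-variance weighted mean and its uncertainty,
    ignoring NaN entries (e.g., an instrument that reported no data).
    """
    values = np.asarray(values, dtype=float)
    sigmas = np.asarray(sigmas, dtype=float)
    good = ~np.isnan(values) & ~np.isnan(sigmas)
    w = 1.0 / sigmas[good] ** 2                      # weights
    merged = np.sum(w * values[good]) / np.sum(w)    # weighted mean
    merged_sigma = np.sqrt(1.0 / np.sum(w))          # combined uncertainty
    return merged, merged_sigma

# Hypothetical precipitable water vapor (cm) from three instruments, e.g.,
# radiosonde, microwave radiometer, and Raman lidar (lidar missing here).
pwv = [2.10, 2.04, np.nan]
err = [0.10, 0.03, 0.05]
print(merge_estimates(pwv, err))  # dominated by the lowest-uncertainty value
```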

The second type of data fusion combines measurements that provide complementary information about some quantity of interest. For example, the ARM community embraced the use of millimeter-wavelength cloud radar and lidars to study cloud properties. Lidar is a very sensitive probe that responds to very small concentrations of hydrometeors but saturates quickly at large concentrations. Radar lacks the sensitivity to small concentrations but is able to provide information through deep layers and multiple layers of clouds. Thus, one can combine the information from these two quite different sensors to produce continuous profiles of cloud occurrence. A related feature is that multiple measurements from a single instrument or from multiple instruments can be combined in a mathematical retrieval that provides information about some atmospheric quantity that is not readily observed in a direct way. The retrieval process was well understood in atmospheric sciences and had been applied to satellite data beginning in the late 1970s to retrieve profiles of temperature and humidity. There had been very little application, however, to ground-based systems prior to ARM, and little of the satellite experience translated directly to the ARM instruments.5
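
At its simplest, this second kind of fusion amounts to merging per-sensor hydrometeor detection masks on a common time-height grid, as in the sketch below. This mirrors the spirit of the radar-lidar cloud occurrence products discussed next, but it is not the actual ARSCL algorithm, and the toy masks are invented.

```python
# Sketch of combining complementary sensors: a lidar cloud mask (sensitive but
# quickly attenuated) and a radar cloud mask (less sensitive but able to see
# through deep and multiple layers) merged into one cloud-occurrence profile.
import numpy as np

def combine_cloud_masks(lidar_mask, radar_mask):
    """Return a combined mask: cloudy where either sensor detects hydrometeors."""
    lidar_mask = np.asarray(lidar_mask, dtype=bool)
    radar_mask = np.asarray(radar_mask, dtype=bool)
    return lidar_mask | radar_mask

# Toy example: 8 height bins at one time step.
lidar = [1, 1, 0, 0, 0, 0, 0, 0]  # lidar sees the cloud base, then attenuates
radar = [0, 1, 1, 1, 0, 0, 1, 1]  # radar sees the deeper layers and cloud aloft
print(combine_cloud_masks(lidar, radar).astype(int))  # -> [1 1 1 1 0 0 1 1]
```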

The need for creating value-added products or “VAPs” (the generic name that ARM applied to these data fusion activities) was well appreciated from the outset of ARM and is called out specifically in the 1996 ARM Science Plan (U.S. Department of Energy 1996). The only method in place to create them, however, was the activity of members of the Science Team. One of the earliest examples of this effort was the creation of the Active Remote Sensing of Clouds (ARSCL) VAP to provide, initially, profiles of cloud occurrence as a function of time. The science behind this VAP grew out of research at the Pennsylvania State University, funded in large part by DOE, to understand cloud radar data using the Penn State radar (Clothiaux et al. 1995). When ARM put a micropulse lidar and MMCR in place at the SGP site, the Penn State group and colleagues took the logical path of extending their research and creating a product that anyone could use (Clothiaux et al. 1998, 1999, 2000; Kollias et al. 2016, chapter 17).

The initial VAPs were built around science ideas and codes developed by science investigators. The codes were then adapted at one of the participating DOE laboratories for operational conditions and run by the ARM infrastructure. A large number of VAPs are in operation today; a subset of these is given in Table 3-1. While this process was highly efficient in development cost, it was inefficient in operational cost and organization. Scientists tend to write programs that serve their own particular scientific purposes and are relatively unconcerned with issues of generic reliability, code versioning, and traceability.6 As the ARM Program matured, it became apparent that these latter problems were proving to be a significant barrier to VAP production and overall data usage and therefore needed to be addressed.

Table 3-1. Examples of ARM value-added products.

The STEC and ARM management decided to institutionalize a VAP process within the program structure that relied on a combination of science input and testing and engineering development and implementation. The key concept was a “translator,” usually a scientist, who served as the liaison between scientists and software engineers. The basic features of the process were and are the following:

  • Identification of VAPs. The scientific working groups and subgroups are the focal point of the science and therefore are the originators of VAPs. They are in the best position to know which VAPs are possible and most useful to the community, where “possible” implies that an algorithm can be written down and implemented.

  • Prioritization. VAP development requires money and, clearly, the number of VAPs desired by the science community can outstrip the available resources. In the initial implementation of the process, the STEC [later the Science and Infrastructure Steering Committee (SISC)] and program management were and are charged with establishing priorities.

  • Development and implementation. Once a VAP is approved, implementation is assigned to a translator and a software engineer. They, along with the originating scientists, are responsible for establishing VAP criteria, estimating resources, defining a schedule, and implementing the algorithm in a framework that allows it to be run in an autonomous manner.

  • Evaluation. After the VAP is implemented, a test dataset spanning a relatively long time period is generated and made available to sponsoring scientists and the working group that promoted the VAP. This allows a subgroup of the community to analyze the dataset to ensure that the algorithm is working as desired. Frequently, the application of a new algorithm to a longer-term dataset reveals problems that the originating scientist did not anticipate, which requires that the algorithm be moved back into a development/implementation state until the problems are resolved.

  • Operations. Once the VAP is implemented using the computing resources available at the ARM Data Archive and the sponsoring working group gives its approval of the evaluation dataset, the VAP moves to an operational status. It is then the responsibility of the originating scientists and the translator to ensure that it is performing correctly and evaluate the product.

This VAP process began in 2001 with the appointment of four translators, one assigned to each of the four main working groups. As with any new initiative, a bit of time was required to put in place the right combination of translators and software engineers and make the process run smoothly. Within a year or two, the VAP process became integral to the ARM Program and a distinct success. The process created robust codes that provided reliable value-added products, especially for the merged measurement products, or “best estimates,” as they became known. Relying on expert scientists to provide the scientific backbone for these products gave them credibility. Implementing engineered codes gave them reliability. Of course, the codes are always evolving as the science and instrumentation evolve, leading to changes, but ones that are managed and recorded.

Ideas for VAPs rapidly outpaced available program resources, a problem that was exacerbated by the fact that development often took longer than expected due to the complexity of the process, and that most VAPs were evolutionary in principle. In addition, the translators found that some VAPs were simply not ready for implementation because the science was insufficiently developed or the algorithms were too complex to be implemented as robust operational codes. This was a particular problem for retrieval algorithms for cloud properties. These algorithms were mathematically sensitive, causing program crashes and breaks in the automated processing. However, many of these retrievals were of interest to many members of the Science Team. After much discussion, the ARM Program took the unusual step of creating space in the ARM Data Archive for what became known as Principal Investigator (PI) datasets. These datasets are generated by PIs using documented algorithms but are produced by the PI’s research group rather than by ARM infrastructure members. While the ARM Program dictates the data format and metadata for these PI datasets, it makes no statement about quality or robustness. This process has provided a helpful middle ground for the program by providing community access to interesting datasets while reducing the time and cost for the program itself to produce them. The popularity and number of PI-produced datasets grew slowly at first but have continually increased with time; as a consequence, the program has improved its methods for collecting metadata about these datasets and making them more visible in the ARM Data Archive.

d. IOP process and management

A central element of the ARM vision is to provide continuous ground-based remote sensing at multiple sites. In the early days of the program, ARM did not yet have sites in full operation so the program supported participation by ARM scientists in planned field programs such as the NASA FIRE [First International Satellite Cloud Climatology Project (ISCCP) Regional Experiment] campaigns and TOGA COARE (Tropical Ocean and Global Atmosphere Coupled Ocean–Atmosphere Response Experiment). When the ARM Program began to develop its observing sites, not all instruments could be put in place simultaneously, so ARM sponsored intensive observing periods during which investigators brought their own instruments to the ARM sites to supplement existing ARM instrumentation, or ARM supported other agencies by taking ARM instruments to their campaigns [e.g., the Surface Heat Budget of the Arctic Ocean (SHEBA) experiment; see Verlinde et al. (2016, chapter 8)]. Furthermore, as instruments from the Instrument Development Program (IDP; Stokes 2016, chapter 2) were placed at the sites, IOPs were typically used to confirm that the IDP instruments were indeed operating as desired. The conceptual idea during the early ARM days was that IOPs for various purposes would be required, but as the sites became populated with the permanent ARM instrument set, the need for these IOPs would be reduced (Stokes 2016, chapter 2). This proved not to be true. For example, the 1996 Science Plan identifies IOPs as a way to provide special measurements such as aircraft sampling.

IOPs (or campaigns, as they are sometimes called) presented an additional challenge for the ARM Program in terms of management and allocation of resources. On the one hand, the ARM paradigm was based on continuous observations; on the other hand, IOPs were needed to obtain science-defined measurements that were too costly to be obtained routinely or could not yet be made in a quasi-unattended mode. Because cloud and radiation measurements had previously been made almost exclusively in field programs, some part of the science community felt that this was the only useful way in which they could be done. Thus, the ARM science and program management was forced to chart a difficult course between conflicting demands, one that allowed the continuous measurement program to grow and mature while still enabling enhanced measurements during IOPs.

By the end of the 1990s, IOPs had become an integral part of ARM science and operations, and their annual number, complexity, and cost were increasing.7 Some IOPs were simple instrument comparisons lasting a few weeks to a month;8 some IOPs lasted for a year or more; some involved the use of expensive assets such as airplanes and ships; some were ARM-centric and some were joint national and/or international operations. As with other ARM activities, it was clear that some process needed to be put in place to manage IOPs so that allocation of resources fit within program science goals and available funding.

The IOP process established in the mid-1990s and early 2000s was fairly simple. A group of scientists (typically from one of the original scientific working groups) proposed an IOP to ARM through the creation of a short science plan that stated the objectives, an activity plan, and resource requirements. The plan was vetted by the ARM Technical Director regarding resources and cost and then passed to the STEC for evaluation. The STEC then evaluated the various proposals in terms of program science objectives and budget impact and approved those most highly ranked. This process worked well in large part because it provided for an iterative discussion among the proposers, the ARM Program management, and the scientific leadership. In many cases, these discussions sharpened objectives and honed resource requests, including bringing in funding from other participants.

One of the interesting events that occurred during this period was the creation of an ongoing IOP to measure in situ aerosol profiles (McComiskey and Ferrare 2016, chapter 21). In some sense this was an effort to merge the conflicting demands of routine measurements and an IOP. A rugged set of instruments to measure aerosol properties was designed to fit into the passenger seat of a small Cessna aircraft. The Oklahoma company that owned the Cessna was instructed to fly a couple of times a week over the SGP site when permitted by weather and airplane availability. Data were downloaded after each flight and sent to the science group that proposed the IOP [e.g., see Andrews et al. (2004) for early results]. This approach has now been applied to other measurements [e.g., liquid water cloud properties in Vogelmann et al. (2012)], blurring the distinction between continuous measurements and IOPs.

In 2004, the ARM sites were designated as a National Scientific User Facility called the ARM Climate Research Facility, which posed new considerations for the ARM science community. As a User Facility, the ARM sites were now seen as serving the larger international science community, not just the ARM science community. While this created the opportunity for additional funding and resources, it also changed many past processes. The collective ARM site managers (the Infrastructure Management Board) and DOE program managers now assumed responsibility for reviewing and approving the smaller campaigns. For large campaigns, an ARM Science Board, under the direction of DOE program managers, was created and charged with determining the scientific merit of the proposed campaigns. The ARM Science Board is composed of both ARM and non-ARM scientists, to reflect the user base for the facility, and meets once yearly to evaluate the proposals for the larger campaigns. The Science Board reviews and discusses the scientific merits of the proposals and their likely impacts. The final decisions on which of the large campaigns would be supported by the program were (and are) made by DOE program managers.

e. ARM Mobile Facility

As described by Cress and Sisterson (2016, chapter 5), the original ARM vision proposed the establishment of five permanent sites and one movable site. By the late 1990s, ARM had developed three fixed locales but lacked the resources to develop any more. It had become increasingly apparent, however, that there was a scientific need and a programmatic desire to sample cloud and radiation properties at other locations. The primary science driver was to study cloud properties and radiative effects in climate regimes not sampled at the three fixed sites (e.g., marine stratus and orographic clouds). In addition, there was a realization that ARM could benefit by partnering with field campaigns organized by national and international groups. In many cases, these field campaigns were built around aircraft sampling but had limited ground-based capability. By providing the latter, ARM could benefit from the former.

The ARM Chief Scientist resurrected the idea of a deployable facility in 2000, which in turn led to a feasibility study and a white paper and, eventually, a commitment to build an ARM Mobile Facility [AMF; see Miller et al. (2016, chapter 9) for a complete discussion]. The AMF concept and design followed closely from experiences and lessons learned from deploying instruments at the fixed sites of the TWP and NSA. The AMF performed well from its very first deployment in 2005, which can be attributed to the maturity of operations at the fixed sites, including instrument robustness and data integrity, and the ability and skill of the AMF team itself. The first AMF proved to be so popular that a second was built and both are now in essentially continuous operation at sites around the world.

One of the thorniest issues associated with the AMFs (and, to some extent, IOPs) was the need for end-to-end support that included funding for the PIs and the subsequent scientific analysis. In the early days of the ARM Program, IOPs were proposed by ARM Science Team members who had science research grants. Some of these IOPs included deployments of PI instruments (such as millimeter radars and different types of lidars) that were early prototypes of an AMF. In these cases, the IOP data augmented the continuous ARM data and were analyzed and used by the proposers under their existing funding. AMF deployments (and IOPs) fell under the auspices of the ARM Climate Research Facility (because the AMF was established shortly after ARM became a User Facility) and were approved largely independently of Science Team funding. The result was that there was no explicit linkage between IOP and science funding. DOE management recognized that this situation was undesirable and addressed it by providing some direct funding for the lead investigators of an IOP after the IOP had been reviewed by the ARM Science Board and selected for funding by DOE.

f. Science team refocusing

The rapid growth of the ARM Program throughout the 1990s also affected the activities of the Science Team. During the latter half of the decade, ARM science efforts blossomed, attracting more attention and leading to an increase in the number and scientific diversity of proposals. Given that Science Team funding had largely plateaued by the end of this period, the greater diversity of topics implied less depth in some areas. This diversity also made it more difficult to link the broad ARM research community with ongoing infrastructure activities because the connections between science investigations and infrastructure activity were more tenuous in some cases. Priority was given to improving this linkage.

Around 1999, ARM management addressed the Science Team issues in two ways, one focusing on its general culture and the other involving the pruning of certain activities. In the first case, ARM Program management, including the Chief Scientist, mandated a tighter coupling between Science Team research projects and ARM Program goals. While this connection had been considered implicitly, it now became an explicit consideration in proposal reviews. Proposals were expected to have defined scientific goals that were congruent with ARM goals. In addition, principal investigators and coinvestigators were strongly encouraged to see themselves as a resource for the ARM Program in activities such as defining ARM science, consulting with ARM infrastructure personnel on specific science needs and data usage, and participating in data quality and VAP activities. These were not new ideas; they were part of the original vision of ARM, but they had become diffuse as the program grew. Restating and emphasizing them was, in essence, an attempt to refocus the Science Team.

The pruning effort was considerably more painful. The program evaluation discussed above identified several scientific areas in which the program had been very successful. Water vapor profiling and clear-sky radiative transfer (particularly in the thermal infrared) were readily identified as two major ARM successes. ARM had expended significant resources in these two areas (see Turner et al. 2016, chapter 13; Mlawer and Turner 2016, chapter 14), and science papers had demonstrated that it was possible to compute clear-sky infrared radiative transfer to an accuracy of a percent or better [Turner et al. (2004) reported an accuracy of better than 2 W m−2 in downward, spectrally resolved radiance compared to Atmospheric Emitted Radiance Interferometer (AERI) measurements]. Doing so required knowledge of the water vapor profile to a similar level of accuracy, which was also achievable. Thus, as management looked at the broad program needs, a decision was made to deemphasize research in these two areas, which had been key efforts from the beginning of the program. This decision was not universally popular, but it was necessary in order to free resources to attack other problems.

The results of the Science Team refocusing were largely positive. ARM research scientists responded well to the call for greater involvement in ARM programmatic issues. The restructuring of the working groups to allow for smaller, ad hoc projects created a new avenue for participation and brought together small teams of scientists, engineers, and site personnel to work on specific issues. The Clouds of Low Water Optical Depth (CLOWD) project, which started around 2003, provides an excellent example (Turner et al. 2007a). Decisions to shift scientific resources resulted in the loss of some Science Team members but opened the door to participation by other scientists working in areas such as cloud retrievals, 3D radiative transfer, aerosol physics, and high-resolution cloud modeling. The flourishing of ARM science that occurred in the 2000s is due in no small part to these efforts.

g. Improved parameterization development process

Any scientist who has been engaged in cloud parameterization can attest to the extreme difficulty of the parameterization problem. ARM was put in place specifically to attack this problem through measurements, data analysis, physical process modeling, and parameterization development and testing. As ARM progressed in the early 1990s, science efforts were concentrated predominantly in measurement, data analysis, and process modeling, particularly related to cloud processes. This is not to say that there were no efforts in climate model parameterization development, but those efforts were limited in scope. Toward the end of the 1990s, research efforts using global models increased. These included the development of a forecasting system for a global climate model and the use of a multiscale climate model. Details of these efforts are included in the chapters by Krueger et al. (2016, chapter 25) and Randall et al. (2016, chapter 26).

However, there was still a strong feeling that more needed to be done to promote the use of ARM data in the parameterization problem and that ARM science funding could not be stretched much further. One largely unexpected development in the late 1990s was the use of ARM data for the evaluation of weather forecasting models, led particularly by the European Centre for Medium-Range Weather Forecasts (ECMWF). After consultation with leadership at the National Centers for Environmental Prediction (NCEP) and ECMWF, ARM decided to fund a postdoctoral research position, called the ARM Fellow, at each of these institutions. The ARM Fellow was recruited by the forecasting center and approved and directly funded by DOE management. The only constraint placed on the research was that it should involve the application of ARM data to forecast model improvement and validation. This program was particularly successful at ECMWF, where several outstanding young scientists were recruited over time and worked on a variety of parameterization development and model evaluation problems (see Ahlgrimm et al. 2016, chapter 28).

h. Links with other programs

In its very early days, the ARM Program was focused largely on its own internal development, but that perspective changed rapidly. As the SGP site developed into a robust remote sensing site, other programs requested use of the data and the site. Joint field campaigns were held with NCAR, satellite validation campaigns were conducted with NASA, and international collaborations were undertaken under the aegis of the Global Energy and Water Cycle Experiment (GEWEX). The logistical constraints and costs of working at the more remote NSA and TWP sites encouraged collaborative research projects such as SHEBA at the NSA (Uttal et al. 2002) and Nauru99 at the TWP (Westwater et al. 2003). These projects involved national partners such as the NSF and NOAA as well as international partners (e.g., Canada in SHEBA and Japan in Nauru99).

We could generate a long list of these interactions, but it is more useful here to consider their overall impact on the program. ARM profited in many ways from these linkages: it gained credibility in the broader scientific arena, enhanced its science productivity by leveraging resources, and helped drive the agenda of the U.S. Climate Research Program. These interactions in the decade of the 2000s helped lay the foundations for the use of the AMF in international programs like the African Monsoon Multidisciplinary Analysis (AMMA) and COPS (see Miller et al. 2016, chapter 9). Early discussions between ARM scientists and their counterparts in Europe have now led to a formal collaboration between the ARM Climate Research Facility and European scientists to develop shared data portals and algorithms9 (Haeffelin et al. 2016, chapter 29).

3. ARM as a DOE Science User Facility

From the beginning of the ARM Program, data collected from the ARM sites were considered to be a community resource, and the research done with these data was highly collaborative across government agencies, universities, private companies, and international institutions. This open character of ARM was due in part to the distributed nature of the observation facility and the global nature of climate research. At the core of this open architecture was the ARM Data Archive, which has always made ARM data freely available to anyone (McCord and Voyles 2016, chapter 11). In formal recognition of this openness, the infrastructure component of ARM was designated a National Scientific User Facility, the ARM Climate Research Facility, in 2004 (Mather and Voyles 2013). As a User Facility, ARM was to serve as a resource for the broad climate research community. While ARM had already been acting in this capacity to a significant extent, the designation as a User Facility formalized this role. Prior to the designation, the ARM infrastructure was coupled to, but distinct from, the ARM Science Team. Following the designation, the management split between the infrastructure and the science became more distinct; however, the structure of the ARM Science Team remained unchanged. The STEC evolved into the SISC, which served as the link between the Science Team and the User Facility.

The designation as a User Facility required a change in how DOE selected and approved field campaigns at the ARM sites. Previously, that process had been internal to ARM and DOE, and all of the larger field campaigns had their roots in the ARM Science Team. After becoming a User Facility, DOE appointed a Science Board whose primary responsibility was to review proposals for field campaigns and make recommendations to DOE management. The proposal process begins each year with a call for proposals in the first half of the year followed by review by the Science Board over the summer, with the final selections by DOE management in the fall (Voyles and Chapman 2012). This process is followed for so-called facility proposals involving large field campaigns at any of the permanent sites, aircraft operations, or deployment of an ARM Mobile Facility. As a result, scientists outside of the ARM Program have led many of the more recent field campaigns.

Mobile facilities, which can be relocated for short-term deployments, were part of the original ARM plan (U.S. Department of Energy 1990, appendix A therein) but took many years to become a reality (Miller et al. 2016, chapter 9). The first ARM Mobile Facility was completed in 2005. It was designed to be portable and to include the same basic suite of instruments found at the fixed remote-location ARM sites (e.g., Nauru). The first AMF deployment was to the Point Reyes National Seashore near San Francisco in 2005 and was immediately followed by a deployment to Niamey, Niger (Miller and Slingo 2007), the first in a series of international deployments (Fig. 3-1). Since then, AMF proposals have represented the largest portion of the facility requests reviewed each year by the Science Board, although the augmentation of measurements at the fixed sites continues to be important. Miller et al. (2016, chapter 9) provide further discussion of the history, development, and accomplishments of the AMFs, including an overview of each deployment.

Fig. 3-1. Location of the permanent ARM sites, ARM Mobile Facility deployments, and field campaigns as of 2012.

Aircraft operations have always been important for supplementing the routine ground-based observations at the ARM sites. For many years, aircraft support was obtained through a combination of contracted and collaborative activities. The latter included both interagency collaborations (e.g., with NASA for the ARM-FIRE Water Vapor Experiment in 2000; Ferrare et al. 2004) and support from the DOE Unmanned Aerial Vehicles (UAV) program [see further discussion in Schmid et al. (2016, chapter 10)]. In 2006, DOE ended the UAV program and initiated a new, formalized aerial component of ARM. This was initially referred to as the ARM Aerial Vehicle Program but was soon renamed and is now known as the ARM Aerial Facility (AAF). For several years the AAF primarily served a coordinating and contracting role; starting in 2010, however, the AAF began to make use of the Battelle Gulfstream G1 research aircraft.

With the deployment of the first AMF in 2005, the field campaign landscape began to change. In particular, there was a shift in scientific emphasis from field campaigns at the fixed sites to AMF deployments, since the AMF enabled the program to collect data in other climatically important regions of the world. Given that planning and executing a major field campaign takes several years, this transition played out gradually. This does not mean that field campaigns at the fixed ARM sites were no longer considered important or supported; however, there was a shift in emphasis and resources toward the new ARM Mobile Facilities as well as the Aerial Facility.

This change had both positive and negative effects on the science community. For the ARM Science Team, it meant less direct involvement in the planning of field campaigns. This change was inevitable given ARM’s new status as a User Facility and the associated requirement to subject facility proposals to a peer-review process via the ARM Science Board. The other side effect of this change was that field campaign proposals suddenly became equally accessible to anyone, regardless of affiliation. Through this open process, the first AMF was deployed for five campaigns in a row outside of the United States to address key science issues. The PIs for two of those five campaigns had no direct ties to DOE; those campaigns were the Radiative Atmospheric Divergence Using the AMF, Geostationary Earth Radiation Budget (GERB) Data, and AMMA Stations (RADAGAST) campaign in Niamey, Niger, in 2006 (Miller and Slingo 2007) and the COPS deployment to the Black Forest in Germany in 2007 (Wulfmeyer et al. 2011). All of these AMF deployments included significant contributions from ARM’s international colleagues. Of course, scientists connected with the ARM Program continued to be deeply involved in these deployments, often as PIs, but the doors were now open to the international climate science community. One result was a significant broadening of interest in ARM, both in the United States and internationally.

4. Going beyond the soda straw, and other new measurement needs

Part of the challenge in evaluating climate models directly with ARM measurements (as opposed to evaluating them through process studies and the associated improvement of the representation of those processes in models) is that ARM measurements are inherently local, while most global models have horizontal resolutions of about 100 km (although resolution continues to improve). This issue is mitigated to a certain extent at the SGP because that facility includes a network of extended facilities (Sisterson et al. 2016, chapter 6). The measurements at those extended sites, which covered 142 000 km2 circa 2008, were somewhat site dependent but included at least broadband radiative fluxes and surface meteorology. Therefore, information about the spatial variability of some geophysical fields is available at the SGP site, although this information is not as detailed or complete as that available at the central facility.
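To make the scale mismatch concrete, the short calculation below compares the area of a single nominal 100-km model grid column, the SGP extended-facility coverage quoted above, and the footprint of a single vertically pointing instrument. It is only an illustrative sketch: the 100-km grid spacing and the 142 000 km2 network coverage come from the text, while the roughly 1-km-diameter effective footprint assumed for a profiling instrument is our own rough assumption, not an ARM specification.

```python
import math

# Illustrative scale comparison; assumed numbers are noted in the comments.
gcm_dx_km = 100.0                        # nominal global-model grid spacing (from text)
gcm_cell_area_km2 = gcm_dx_km ** 2       # one model grid column covers ~10,000 km^2

sgp_network_area_km2 = 142_000.0         # SGP extended-facility coverage circa 2008 (from text)

# Assumed ~1-km-diameter effective footprint for a zenith-pointing profiler
# (a rough, illustrative number only).
profiler_footprint_km2 = math.pi * (0.5 ** 2)

print(f"Model grid columns spanned by the SGP extended network: "
      f"{sgp_network_area_km2 / gcm_cell_area_km2:.0f}")
print(f"Fraction of one grid column sampled by a single profiler: "
      f"{profiler_footprint_km2 / gcm_cell_area_km2:.1e}")
```

Run as written, the sketch indicates that the extended-facility network spans on the order of a dozen 100-km grid columns, while a single vertically pointing instrument samples less than a ten-thousandth of one column, which is the essence of the soda-straw problem this section addresses.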

During the mid-2000s, following the designation of ARM as a User Facility, topics related to bridging this spatial scale gap were discussed with increasing frequency. Perhaps foremost among these was the idea of using scanning radars (at frequencies ranging from the 35–94-GHz millimeter-wave bands to the 1–10-GHz range appropriate for studying precipitation and cloud dynamics) to study the three-dimensional distribution of clouds and precipitation around the ARM sites. In 2010, one of the ARM 94-GHz cloud radars was modified to permit scanning (Kollias et al. 2016, chapter 17); the radar was first deployed to the ARM Mobile Facility in the Azores.

There were also many discussions during this period about using small Unmanned Aerial Systems (UASs) to obtain better spatial representation on a routine basis. UASs were not new to ARM (Stephens et al. 2000; Schmid et al. 2016, chapter 10), but the idea of using small systems routinely had not yet been implemented because of federal aviation restrictions. A big step toward the goal of flying UASs routinely came in 2004, when the NSA site management team obtained permission for a small (4-nautical-mile diameter) warning area centered on Oliktok, Alaska.

While the goal of all these measurement efforts was to move from a one-dimensional (1D), soda-straw view of the atmosphere to a 3D spatial view, providing a better spatial or volumetric match with models was not the only intended application of this information. Many atmospheric processes, such as convective initiation, are hypothesized to depend on small-scale inhomogeneities in the atmospheric state, and evaluating and improving 3D radiative transfer models requires 3D cloud observations. In addition, this spatial information will provide better information about cloud life cycle processes, which is essential for capturing those processes correctly within models.

Meanwhile, there had also been ongoing discussions about the need for additional information on a variety of geophysical parameters and processes that could not be adequately probed with the instrumentation then available. These included better measurements of ice crystal habit, the partitioning of liquid and ice in mixed-phase clouds, precipitation and cloud properties in the presence of precipitation, boundary layer dynamics, and aerosol properties and processes such as absorption, formation from precursor gases, and the ability to serve as ice nuclei. Together, these discussions precipitated a series of meetings designed to extract from the science community a clear sense of unfulfilled measurement needs.

5. Facility workshops and the 2009 Recovery Act

ARM Program management has tried to be responsive to the needs of the scientific community, especially when those needs are well articulated in reports authored by both ARM scientists and those outside of the ARM family. In 2007 and 2008, DOE convened two workshops designed to assess the state of the ARM User Facility and to identify the needs for addressing critical gaps in climate science. The first workshop, held in the fall of 2007, discussed the priorities for potential future ARM sites and facility needs. Participants at this workshop were split between representatives of the ARM science community and the broader climate research community. One of the main results from the report (U.S. Department of Energy 2007) was the identification of key regions for future measurements; this list included the Azores, Greenland, South Asia, the Amazon, and the Southern Ocean. Additionally, the report recommended that if DOE were to develop a second Mobile Facility, it should be sufficiently compact and flexible to be deployed on a ship or in difficult-to-reach locations.

A second workshop, held a year later, focused on identifying key science issues and missing measurements (U.S. Department of Energy 2008). Recommendations from this workshop included expanding radar capabilities to include features such as multiple frequencies, dual polarization, scanning to improve measurements of microphysics, and longer wavelengths to better characterize precipitation processes. Other recommendations included better measurements of surface and boundary layer properties, aerosol properties, and upper-tropospheric water vapor. DOE management, led by Wanda Ferrell, showed remarkable foresight in holding these workshops, because both proved to be very timely.

In 2008, the baseline funding for the ARM User Facility was increased specifically to implement a second Mobile Facility, partially addressing the recommendations from the 2007 workshop, with particular emphasis on supporting marine deployments. While the first several deployments of this second AMF (AMF2) were land based to help the developing facility mature, in 2012/13 the AMF2 was deployed on a ship that repeatedly transited between Los Angeles and Honolulu to sample boundary layer clouds in that region.

While the AMF2 was under development, the ARM User Facility was awarded funds through the American Recovery and Reinvestment Act of 2009 to significantly enhance the program’s measurement capabilities. The Recovery Act was an economic stimulus package, implemented widely across the federal government, that directed funds to projects with tasks that were ready to begin immediately. The recommendations from the workshop held less than six months earlier, combined with ongoing interactions between the ARM User Facility and the ARM Science Team, put ARM management in a position to react very quickly (as was required) when the Recovery Act opportunity came along. Through the Recovery Act, a broad variety of instruments were added to the facility to improve the measurement of cloud, aerosol, and precipitation properties and to measure surface radiative and heat fluxes. Significant additions included scanning radars at millimeter and centimeter wavelengths; several types of advanced lidars for profiling aerosol extinction, water vapor concentration, and clear-air motion; and aerosol instruments to provide improved measurements of physical and chemical properties (Mather and Voyles 2013). A listing of core ARM instruments, including those added through the Recovery Act, is provided in Table 3-2.

Table 3-2. Recovery Act instruments. The sites where each instrument is deployed are identified with the following key: S = SGP, N = NSA/Barrow, T1 = TWP/Manus, T3 = TWP/Darwin, A1 = AMF1, A2 = AMF2, AF = ARM Aerial Facility, MA = Mobile Aerosol Observing System.

The significant improvement in the instrumentation at the ARM sites via the Recovery Act coincided with a desire by DOE management to more closely link the different climate research programs within DOE’s Climate and Environmental Sciences Division. This led to the merging of the ARM science program with another program focused on aerosol processes and properties to form a new scientific program called Atmospheric System Research (ASR; U.S. Department of Energy 2010; Mather et al. 2016, chapter 4). This new program focuses on better characterizing the myriad processes associated with clouds and aerosols, their interaction with radiation, and their impact on climate.

To support ARM and ASR research efforts, it is very important to develop the expertise, datasets, and software tools needed to make the best use of the new facility measurement capabilities. These measurements have the potential to provide remarkable insights into the processes associated with cloud and aerosol life cycles, but persistent effort will be required to fully realize these benefits. In particular, the scanning cloud radars have not previously been operated on a continuous basis, and there are few examples of their use at all. Optimizing the use of these instruments, including their scanning strategies and derived data products, will require close collaboration between ARM and the science user community.

Another resource for the ASR scientific community will be the data from two recently established sites. In 2013, ARM began operating a new fixed-location site in the Azores, where the AMF had been deployed in 2009–10. The purpose of this site is to continue exploring the life cycle of marine stratus clouds, which remains very uncertain yet is critical in regulating Earth’s energy balance (Bony and Dufresne 2005). To aid in this work, the site will be equipped with additional instrumentation relative to the earlier AMF deployment in the Azores, including an X-band (centimeter wavelength) radar for studying drizzle, a Doppler lidar for characterizing below-cloud vertical air motion, and an Aerosol Observing System (AOS). ARM has also developed a third AMF that will first be deployed for an extended term at Oliktok, Alaska, one of the ancillary sites used during the Mixed-Phase Arctic Cloud Experiment (M-PACE; Verlinde et al. 2007). As noted earlier, a key attribute of Oliktok is the restricted airspace, managed by DOE, centered on that site. That restricted airspace opens the possibility of operating Tethered Balloon Systems and Unmanned Aerial Systems in conjunction with the AMF. Such a combination would allow links to be made between the ground-based observations along the Arctic Ocean coast and the adjacent ocean and sea ice, which was one of the original goals for the NSA site (Stamnes et al. 1999).

Even as ARM undertakes measurements in new locales, it ended operations in the TWP in 2014 after 18 years. The cessation of operations in the tropics is enabling a redistribution of instruments and resources, which is intended to accelerate the application of ARM observations and data processing to the understanding of key atmospheric processes and the representation of these processes in global climate models. This reconfiguration of the ARM User Facility is focused on enhancements of the SGP and NSA sites and has three main facets (U.S. Department of Energy 2014):

  1. enhancing ARM observations and measurement strategies to enable the routine operation of high-resolution models and to optimize the use of ARM data for the evaluation of these models;

  2. undertaking the routine operation of high-resolution models at ARM sites; and

  3. developing data products and analysis tools that enable the evaluation of models using ARM data (a minimal illustration of this kind of model evaluation follows this list).
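The sketch below illustrates, in the most generic terms, the kind of model-evaluation workflow envisioned in the third facet: a coarser model time series is interpolated onto the observation times and scored with simple bias and root-mean-square error statistics. It is only an illustrative sketch using synthetic data; the variable names, sampling intervals, and values are placeholders of our own choosing and do not correspond to any actual ARM data product or model output.

```python
import numpy as np

# Synthetic stand-ins: hourly "model" output and 5-min "observations" of a
# scalar field (e.g., a downwelling flux in W m-2).  All values are made up.
model_time_h = np.arange(0.0, 24.0, 1.0)                       # hourly model times
model_vals = 400.0 * np.sin(np.pi * model_time_h / 24.0)       # smooth diurnal cycle

obs_time_h = np.arange(0.0, 24.0, 5.0 / 60.0)                  # 5-min observation times
rng = np.random.default_rng(0)
obs_vals = (400.0 * np.sin(np.pi * obs_time_h / 24.0)
            + rng.normal(0.0, 20.0, obs_time_h.size))          # add measurement-like noise

# Interpolate the coarser model output to the observation times, then score it.
model_on_obs = np.interp(obs_time_h, model_time_h, model_vals)
bias = float(np.mean(model_on_obs - obs_vals))
rmse = float(np.sqrt(np.mean((model_on_obs - obs_vals) ** 2)))

print(f"bias = {bias:6.2f} W m-2, rmse = {rmse:6.2f} W m-2")
```

The point of the sketch is only the mechanics of aligning and scoring two time series; actual evaluations would draw on ARM data products tailored for model comparison, such as the forcing and best-estimate datasets described elsewhere in this volume (e.g., Xie et al. 2010; Zhang et al. 2016, chapter 24).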

ARM is undergoing some significant changes; however, its mission to provide comprehensive observation datasets for climate research and the improvement of climate models remains unchanged. As the ASR community makes progress in understanding aerosol and cloud life cycles and the interactions among clouds, aerosols, and precipitation, the measurement needs, and thus the areas of emphasis within ARM, will also evolve. But the core emphasis on clouds and aerosols, and their effect on radiation and now precipitation, will continue for some time.

6. Continued evolution and growth of ARM

The ARM Program began with a vision largely unconstrained by prior experience. It grew prolifically in the 1990s, in many different directions, simultaneously and experimentally. When questions arose about what to do, the oft-repeated mantra was to do what made sense. Those of us who were part of the program in that decade remember the enthusiasm, the brainstorming, the successes, and the failures of those early days. By the later years of the 1990s, it was clear that ARM was moving into a new phase. As a maturing program, ARM needed to transition to a longer time horizon for program planning, develop a better sense of process to make sure that all the science voices were heard, set priorities in the face of constrained resources, and increase the interaction between Science Team and infrastructure personnel. The mantra now became how to ensure that we focused on what made sense and weeded out what did not.

The ARM Program made a series of decisions around 2000–01 that put processes in place to further the goals of ARM. These included the data quality office, the VAP process, and an IOP planning process. These changes were implemented to create a situation in which priorities could be set and resources allocated fairly. One of the important and recognized consequences was the pruning of the program to eliminate certain instruments, reduce the resources applied to some science questions, and reduce the footprint of some operational components. Given the fixed resources available to the program during the 2000s, these changes were inevitable in order to make room for an expanded instrument set and new science questions. This process would be repeated as the ARM Program’s observational capability grew substantially with the Recovery Act and as the scientific priorities changed with the transition of ARM science to ASR.

The mature ARM Program of the 2000s was in some sense less exciting than the growing program of the 1990s, but it was considerably more productive in terms of the breadth of its research and publications. Changes in the program infrastructure were put in place to enhance the ability of the Science Team to conduct research, and they largely met that goal. In the case of VAPs, the cost to the program was greater than anticipated, mainly because creating “bullet-proof” code proved more difficult than expected. Overall, the cost of infrastructure activities increased, but much of that increase was the result of increased scope rather than a failure to contain costs. In fact, ARM Program management fought doggedly and successfully throughout this period to find ways to economize.

During this period, the ARM Program also gained international visibility because of its success. It served as an exemplar that promoted the growth of several similar sites in Europe, such as the Chilbolton Observatory in the United Kingdom, Cabauw in the Netherlands, the Lindenberg Observatory in Germany, and Palaiseau in France (see Haeffelin et al. 2016, chapter 29). It forged links with other science programs in the United States, particularly those sponsored by NASA, by serving as an instrument test bed, a key satellite validation site (or sites), and a target point for aircraft observations. These connections can most easily be identified in the list of IOPs mentioned previously but are also noted elsewhere in this volume (Marchand 2016, chapter 30). Modeling groups began to make increased use of ARM data, especially as ARM undertook the creation of datasets targeted specifically toward their needs (see Fig. 3-2 for the overall growth in data usage). Attendance at ARM Science Team Meetings grew into the hundreds, attracting not only those funded directly by ARM but also those using ARM data or sites for their own research. The ARM Program has continued to grow, pushing the observational boundaries with new state-of-the-art instruments, conducting IOPs at both its fixed and AMF sites, and garnering new users.

Fig. 3-2. Cumulative growth in files and megabytes requested from the ARM Data Archive.

When the ARM Program was first proposed toward the latter part of 1989, considerable skepticism was raised about whether the program could be carried out as envisaged and whether the results would be worth the cost. The first decade of ARM was devoted to proving that the program could indeed be carried out. The vision had to be scaled back somewhat, largely because the proposed budget never materialized. The second decade of ARM was devoted to demonstrating that the results were worth the cost. The mature ARM Program produced a record of solid scientific achievement, including significant breakthroughs in our understanding of cloud and aerosol processes and radiative forcing. Perhaps the ARM Program’s most important legacy is the recognition that continuous, ground-based remote sensing is a critically important tool for understanding the complex interactions among clouds, aerosols, atmospheric radiation, weather, and climate.

REFERENCES

  • Ackerman, T. P., and G. M. Stokes, 2003: The Atmospheric Radiation Measurement Program. Phys. Today, 56, 38–44, doi:10.1063/1.1554135.

  • Ahlgrimm, M., R. Forbes, J.-L. Morcrette, and R. Neggers, 2016: ARM’s impact on numerical weather prediction at ECMWF. The Atmospheric Radiation Measurement (ARM) Program: The First 20 Years, Meteor. Monogr., No. 57, Amer. Meteor. Soc., doi:10.1175/AMSMONOGRAPHS-D-15-0032.1.

  • Andrews, E., P. J. Sheridan, J. A. Ogren, and R. Ferrare, 2004: In situ aerosol profiles over the Southern Great Plains cloud and radiation test bed site: 1. Aerosol optical properties. J. Geophys. Res., 109, D06208, doi:10.1029/2003JD004025.

  • Bony, S., and J.-L. Dufresne, 2005: Marine boundary layer clouds at the heart of tropical cloud feedback uncertainties in climate models. Geophys. Res. Lett., 32, L20806, doi:10.1029/2005GL023851.

  • Clothiaux, E. E., M. A. Miller, B. A. Albrecht, T. P. Ackerman, J. Verlinde, D. M. Babb, R. M. Peters, and W. J. Syrett, 1995: An evaluation of a 94-GHz radar for remote sensing of cloud properties. J. Atmos. Oceanic Technol., 12, 201–229, doi:10.1175/1520-0426(1995)012<0201:AEOAGR>2.0.CO;2.

  • Clothiaux, E. E., G. G. Mace, T. P. Ackerman, T. J. Kane, J. D. Spinhirne, and V. S. Scott, 1998: An automated algorithm for detection of hydrometeor returns in micro pulse lidar data. J. Atmos. Oceanic Technol., 15, 1035–1042, doi:10.1175/1520-0426(1998)015<1035:AAAFDO>2.0.CO;2.

  • Clothiaux, E. E., and Coauthors, 1999: The Atmospheric Radiation Measurement program cloud radars: Operational modes. J. Atmos. Oceanic Technol., 16, 819–827, doi:10.1175/1520-0426(1999)016<0819:TARMPC>2.0.CO;2.

  • Clothiaux, E. E., T. P. Ackerman, G. G. Mace, K. P. Moran, R. T. Marchand, M. A. Miller, and B. E. Martner, 2000: Objective determination of cloud heights and radar reflectivities using a combination of active remote sensors at the ARM CART sites. J. Appl. Meteor., 39, 645–665, doi:10.1175/1520-0450(2000)039<0645:ODOCHA>2.0.CO;2.

  • Cress, T. S., and D. L. Sisterson, 2016: Deploying the ARM sites and supporting infrastructure. The Atmospheric Radiation Measurement (ARM) Program: The First 20 Years, Meteor. Monogr., No. 57, Amer. Meteor. Soc., doi:10.1175/AMSMONOGRAPHS-D-15-0049.1.

  • Ellingson, R. G., R. D. Cess, and G. L. Potter, 2016: The Atmospheric Radiation Measurement Program: Prelude. The Atmospheric Radiation Measurement (ARM) Program: The First 20 Years, Meteor. Monogr., No. 57, Amer. Meteor. Soc., doi:10.1175/AMSMONOGRAPHS-D-15-0029.1.

  • Feingold, G., and A. McComiskey, 2016: ARM’s aerosol–cloud–precipitation research (aerosol indirect effects). The Atmospheric Radiation Measurement (ARM) Program: The First 20 Years, Meteor. Monogr., No. 57, Amer. Meteor. Soc., doi:10.1175/AMSMONOGRAPHS-D-15-0022.1.

  • Feltz, W. F., W. L. Smith, H. B. Howell, R. O. Knuteson, H. Woolf, and H. E. Revercomb, 2003: Near-continuous profiling of temperature, moisture, and atmospheric stability using the Atmospheric Emitted Radiance Interferometer (AERI). J. Appl. Meteor., 42, 584–595, doi:10.1175/1520-0450(2003)042<0584:NPOTMA>2.0.CO;2.

  • Ferrare, R. A., and Coauthors, 2004: Characterization of upper tropospheric water vapor measurements during AFWEX using LASE. J. Atmos. Oceanic Technol., 21, 1790–1808, doi:10.1175/JTECH-1652.1.

  • Ghan, S., and J. Penner, 2016: ARM-led improvements in aerosols in climate and climate models. The Atmospheric Radiation Measurement (ARM) Program: The First 20 Years, Meteor. Monogr., No. 57, Amer. Meteor. Soc., doi:10.1175/AMSMONOGRAPHS-D-15-0033.1.

  • Haeffelin, M., and Coauthors, 2016: Parallel developments and formal collaboration between European atmospheric profiling observatories and the U.S. ARM research program. The Atmospheric Radiation Measurement (ARM) Program: The First 20 Years, Meteor. Monogr., No. 57, Amer. Meteor. Soc., doi:10.1175/AMSMONOGRAPHS-D-15-0045.1.

  • Kollias, P., and Coauthors, 2016: Development and applications of ARM millimeter-wavelength cloud radars. The Atmospheric Radiation Measurement (ARM) Program: The First 20 Years, Meteor. Monogr., No. 57, Amer. Meteor. Soc., doi:10.1175/AMSMONOGRAPHS-D-15-0037.1.

  • Krueger, S. K., H. Morrison, and A. M. Fridlind, 2016: Cloud-resolving modeling: ARM and the GCSS story. The Atmospheric Radiation Measurement (ARM) Program: The First 20 Years, Meteor. Monogr., No. 57, Amer. Meteor. Soc., doi:10.1175/AMSMONOGRAPHS-D-15-0047.1.

  • Long, C. N., and Y. Shi, 2008: An automated quality assessment and control algorithm for surface radiation measurements. Open Atmos. Sci. J., 2, 23–37, doi:10.2174/1874282300802010023.

  • Long, C. N., J. H. Mather, and T. P. Ackerman, 2016: The ARM Tropical Western Pacific (TWP) sites. The Atmospheric Radiation Measurement (ARM) Program: The First 20 Years, Meteor. Monogr., No. 57, Amer. Meteor. Soc., doi:10.1175/AMSMONOGRAPHS-D-15-0024.1.

  • Marchand, R., 2016: ARM and satellite cloud validation. The Atmospheric Radiation Measurement (ARM) Program: The First 20 Years, Meteor. Monogr., No. 57, Amer. Meteor. Soc., doi:10.1175/AMSMONOGRAPHS-D-15-0038.1.

  • Mather, J. H., and J. W. Voyles, 2013: The ARM climate research facility: A review of structure and capabilities. Bull. Amer. Meteor. Soc., 94, 377–392, doi:10.1175/BAMS-D-11-00218.1.

  • Mather, J. H., D. D. Turner, and T. P. Ackerman, 2016: Scientific maturation of the ARM Program. The Atmospheric Radiation Measurement (ARM) Program: The First 20 Years, Meteor. Monogr., No. 57, Amer. Meteor. Soc., doi:10.1175/AMSMONOGRAPHS-D-15-0053.1.

  • McComiskey, A., and R. A. Ferrare, 2016: Aerosol physical and optical properties and processes in the ARM Program. The Atmospheric Radiation Measurement (ARM) Program: The First 20 Years, Meteor. Monogr., No. 57, Amer. Meteor. Soc., doi:10.1175/AMSMONOGRAPHS-D-15-0028.1.

  • McCord, R., and J. W. Voyles, 2016: The ARM data system and archive. The Atmospheric Radiation Measurement (ARM) Program: The First 20 Years, Meteor. Monogr., No. 57, Amer. Meteor. Soc., doi:10.1175/AMSMONOGRAPHS-D-15-0043.1.

  • McFarlane, S. A., J. H. Mather, and E. J. Mlawer, 2016: ARM’s progress on improving atmospheric broadband radiative fluxes and heating rates. The Atmospheric Radiation Measurement (ARM) Program: The First 20 Years, Meteor. Monogr., No. 57, Amer. Meteor. Soc., doi:10.1175/AMSMONOGRAPHS-D-15-0046.1.

  • Michalsky, J., F. Denn, C. Flynn, G. Hodges, P. Kiedron, A. Koontz, J. Schemmer, and S. E. Schwartz, 2010: Climatology of aerosol optical depth in north-central Oklahoma: 1992–2008. J. Geophys. Res., 115, D07203, doi:10.1029/2009JD012197.

  • Miller, M. A., and A. Slingo, 2007: The ARM Mobile Facility and its first international deployment. Bull. Amer. Meteor. Soc., 88, 1229–1244, doi:10.1175/BAMS-88-8-1229.

  • Miller, M. A., K. Nitschke, T. P. Ackerman, W. R. Ferrell, N. Hickmon, and M. Ivey, 2016: The ARM mobile facilities. The Atmospheric Radiation Measurement (ARM) Program: The First 20 Years, Meteor. Monogr., No. 57, Amer. Meteor. Soc., doi:10.1175/AMSMONOGRAPHS-D-15-0051.1.

  • Mlawer, E. J., and D. D. Turner, 2016: Spectral radiation measurements and analysis in the ARM Program. The Atmospheric Radiation Measurement (ARM) Program: The First 20 Years, Meteor. Monogr., No. 57, Amer. Meteor. Soc., doi:10.1175/AMSMONOGRAPHS-D-15-0027.1.

  • Peppler, R., K. Kehoe, J. Monroe, A. Theisen, and S. Moore, 2016: The ARM data quality program. The Atmospheric Radiation Measurement (ARM) Program: The First 20 Years, Meteor. Monogr., No. 57, Amer. Meteor. Soc., doi:10.1175/AMSMONOGRAPHS-D-15-0039.1.

  • Randall, D. A., A. D. Del Genio, L. J. Donner, W. D. Collins, and S. A. Klein, 2016: The impact of ARM on climate modeling. The Atmospheric Radiation Measurement (ARM) Program: The First 20 Years, Meteor. Monogr., No. 57, Amer. Meteor. Soc., doi:10.1175/AMSMONOGRAPHS-D-15-0050.1.

  • Schmid, B., R. G. Ellingson, and G. M. McFarquhar, 2016: ARM aircraft measurements. The Atmospheric Radiation Measurement (ARM) Program: The First 20 Years, Meteor. Monogr., No. 57, Amer. Meteor. Soc., doi:10.1175/AMSMONOGRAPHS-D-15-0042.1.

  • Shupe, M. D., J. M. Comstock, D. D. Turner, and G. G. Mace, 2016: Cloud property retrievals in the ARM Program. The Atmospheric Radiation Measurement (ARM) Program: The First 20 Years, Meteor. Monogr., No. 57, Amer. Meteor. Soc., doi:10.1175/AMSMONOGRAPHS-D-15-0030.1.

  • Sisterson, D., R. Peppler, T. S. Cress, P. Lamb, and D. D. Turner, 2016: The ARM Southern Great Plains (SGP) site. The Atmospheric Radiation Measurement (ARM) Program: The First 20 Years, Meteor. Monogr., No. 57, Amer. Meteor. Soc., doi:10.1175/AMSMONOGRAPHS-D-16-0004.1.

  • Stamnes, K., R. G. Ellingson, J. A. Curry, J. E. Walsh, and B. D. Zak, 1999: Review of science issues, deployment strategy, and status for the ARM North Slope of Alaska–Adjacent Arctic Ocean climate research site. J. Climate, 12, 46–63, doi:10.1175/1520-0442-12.1.46.

  • Stephens, G., and Coauthors, 2000: The Department of Energy’s Unmanned Aerospace Vehicle (UAV) Program. Bull. Amer. Meteor. Soc., 81, 2915–2938, doi:10.1175/1520-0477(2000)081<2915:TDOESA>2.3.CO;2.

  • Stokes, G. M., 2016: Original ARM concept and launch. The Atmospheric Radiation Measurement (ARM) Program: The First 20 Years, Meteor. Monogr., No. 57, Amer. Meteor. Soc., doi:10.1175/AMSMONOGRAPHS-D-15-0021.1.

  • Turner, D. D., and Coauthors, 2004: The QME AERI LBLRTM: A closure experiment for downwelling high spectral resolution infrared radiance. J. Atmos. Sci., 61, 2657–2675, doi:10.1175/JAS3300.1.

  • Turner, D. D., and Coauthors, 2007a: Thin liquid water clouds: Their importance and our challenge. Bull. Amer. Meteor. Soc., 88, 177–190, doi:10.1175/BAMS-88-2-177.

  • Turner, D. D., S. A. Clough, J. C. Liljegren, E. E. Clothiaux, K. E. Cady-Pereira, and K. L. Gaustad, 2007b: Retrieving liquid water path and precipitable water vapor from the Atmospheric Radiation Measurement (ARM) microwave radiometers. IEEE Trans. Geosci. Remote Sens., 45, 3680–3689, doi:10.1109/TGRS.2007.903703.

  • Turner, D. D., E. J. Mlawer, and H. E. Revercomb, 2016: Water vapor observations in the ARM Program. The Atmospheric Radiation Measurement (ARM) Program: The First 20 Years, Meteor. Monogr., No. 57, Amer. Meteor. Soc., doi:10.1175/AMSMONOGRAPHS-D-15-0025.1.

  • U.S. Department of Energy, 1990: Atmospheric Radiation Measurement Program Plan. DOE/ER-0441, 121 pp.

  • U.S. Department of Energy, 1996: Science Plan for the Atmospheric Radiation Measurement (ARM) Program. DOE/ER-0670T, 74 pp.

  • U.S. Department of Energy, 2004: Atmospheric Radiation Measurement Program Science Plan. DOE/ER-0402, 62 pp.

  • U.S. Department of Energy, 2007: Report on the ARM Climate Research Facility Expansion Workshop. DOE/SC-ARM-0707, 50 pp.

  • U.S. Department of Energy, 2008: ARM Climate Research Facility Workshop Report. DOE/SC-ARM-0804, 23 pp.

  • U.S. Department of Energy, 2010: Atmospheric System Research (ASR) Science and Program Plan. DOE/SC-ASR-10-001, 77 pp. [Available online at http://science.energy.gov/~/media/ber/pdf/Atmospheric_system_research_science_plan.pdf.]

  • U.S. Department of Energy, 2014: Atmospheric Radiation Measurement Climate Research Facility Decadal Vision. DOE/SC-ARM-14-029, 21 pp.

  • Uttal, T., and Coauthors, 2002: Surface heat budget of the Arctic Ocean. Bull. Amer. Meteor. Soc., 83, 255–275, doi:10.1175/1520-0477(2002)083<0255:SHBOTA>2.3.CO;2.

  • Verlinde, J., and Coauthors, 2007: The Mixed-Phase Arctic Cloud Experiment (M-PACE). Bull. Amer. Meteor. Soc., 88, 205–221, doi:10.1175/BAMS-88-2-205.

  • Verlinde, J., B. Zak, M. D. Shupe, M. Ivey, and K. Stamnes, 2016: The ARM North Slope of Alaska (NSA) sites. The Atmospheric Radiation Measurement (ARM) Program: The First 20 Years, Meteor. Monogr., No. 57, Amer. Meteor. Soc., doi:10.1175/AMSMONOGRAPHS-D-15-0023.1.

  • Vogelmann, A. M., and Coauthors, 2012: RACORO extended-term aircraft observations of boundary layer clouds. Bull. Amer. Meteor. Soc., 93, 861–878, doi:10.1175/BAMS-D-11-00189.1.

  • Voyles, J. W., and L. A. Chapman, 2012: Field campaign guidelines. DOE/SC-ARM-11-003, 20 pp.

  • Westwater, E. R., B. B. Stankov, D. Cimini, Y. Han, J. A. Shaw, B. M. Lesht, and C. N. Long, 2003: Radiosonde humidity soundings and microwave radiometers during Nauru99. J. Atmos. Oceanic Technol., 20, 953–971, doi:10.1175/1520-0426(2003)20<953:RHSAMR>2.0.CO;2.

  • Wulfmeyer, V., and Coauthors, 2011: The Convective and Orographically Induced Precipitation Study (COPS): The scientific strategy, the field phase, and first highlights. Quart. J. Roy. Meteor. Soc., 137, 3–30, doi:10.1002/qj.752.

  • Xie, S., and Coauthors, 2010: Clouds and more: ARM climate modeling best estimate data. Bull. Amer. Meteor. Soc., 91, 13–20, doi:10.1175/2009BAMS2891.1.

  • Zhang, M., and J. L. Lin, 1997: Constrained variational analysis of sounding data based on column-integrated budgets of mass, heat, moisture, and momentum: Approach and application to ARM measurements. J. Atmos. Sci., 54, 1503–1524, doi:10.1175/1520-0469(1997)054<1503:CVAOSD>2.0.CO;2.

  • Zhang, M., R. C. J. Somerville, and S. Xie, 2016: The SCM concept and creation of ARM forcing datasets. The Atmospheric Radiation Measurement (ARM) Program: The First 20 Years, Meteor. Monogr., No. 57, Amer. Meteor. Soc., doi:10.1175/AMSMONOGRAPHS-D-15-0040.1.

1. An excellent summary of the state of the ARM Program in 1998 was produced by W. Ferrell, P. Crowley, T. S. Cress, and G. M. Stokes, “History and Status of the Atmospheric Radiation Measurement Program March 1998” (available at http://www.arm.gov/publications/proceedings/conf08/extended_abs/history.pdf?id=59; accessed July 2013). Additional background information was published by Ackerman and Stokes (2003).

2. The ARM Program had three Chief Scientists: Gerry Stokes from 1990 to 1998, Tom Ackerman from 1999 to 2004, and Warren Wiscombe from 2005 to 2009.

3. “Clear-sky” here refers actually to non-cloudy sky (i.e., aerosol effects are included in the clear sky). We use the term as a simple differentiation from “cloudy” sky.

4. Chapters 13 to 18 in this monograph all are connected with ARM-related instrument development. In addition, the topic shows up in a number of other chapters such as those dealing with the sites, the mobile facility, and the aircraft program. Stokes (2016, chapter 2) describes the Instrument Development Program.

5. One of the successful outcomes of the ARM Program has been much closer contact between satellite and ground-based retrieval groups leading to shared retrieval algorithms. This has been particularly apparent for active sensors (radar and lidar). For example, the NASA CloudSat team and ARM millimeter-wavelength radar scientists have worked closely together on the development of radar products and similar retrieval algorithms have been applied to the NASA A-Train constellation of instruments and the ARM ground-based instruments (Marchand 2016, chapter 30).

6. One of the recurring themes in the development of ARM has been the need to merge science and engineering. Building continuously operating facilities required ARM to develop engineering standards for instrument and site operations. Similarly, VAPs required engineering standards for code and documentation. The science demands, however, were continuously evolving so the engineering standards had to be flexible enough to permit change and evolutionary growth. These paired requirements forged a unique partnership between engineers and scientists on a program-wide basis that was simultaneously challenging and rewarding. One of ARM’s outstanding achievements was the creation of a relatively seamless team of scientists, software engineers, and hardware engineers. Many of the articles in this volume are coauthored by combinations of these groups, along with the management that endorsed and encouraged that collaboration.

7. A timeline summary of ARM IOPs can be found on the ARM website (http://www.arm.gov/campaigns/; scroll to bottom; accessed 4 Sep 2013), as well as a table of IOPs by year (http://www.arm.gov/about/stats/campaigns; accessed 4 Sep 2013).

8. The ARM sites have become perhaps the most heavily instrumented atmospheric research sites in the world, and many IOPs are where investigators bring their new instrument to run side by side with operational ARM instruments to help evaluate their new technology.

9. U.S./European Workshop on Climate Change Challenges and Observations, 6–8 Nov 2012; workshop report available at http://science.energy.gov/~/media/ber/pdf/CESD_EUworkshop_report.pdf; accessed 23 Oct 2013.
