• ARM, 2016a: Appendix B: Executive summary: Science Plan for the Atmospheric Radiation Measurement Program (ARM). The Atmospheric Radiation Measurement (ARM) Program: The First 20 Years, Meteor. Monogr., No. 57, Amer. Meteor. Soc., doi:10.1175/AMSMONOGRAPHS-D-15-0035.1.

  • ARM, 2016b: Appendix C: Executive summary: Atmospheric Radiation Measurement Program Science Plan: Current status and future directions of the ARM Science Program. The Atmospheric Radiation Measurement (ARM) Program: The First 20 Years, Meteor. Monogr., No. 57, Amer. Meteor. Soc., doi:10.1175/AMSMONOGRAPHS-D-15-0034.1.

  • Ackerman, T. P., T. S. Cress, W. R. Ferrell, J. H. Mather, and D. D. Turner, 2016: The programmatic maturation of the ARM Program. The Atmospheric Radiation Measurement (ARM) Program: The First 20 Years, Meteor. Monogr., No. 57, Amer. Meteor. Soc., doi:10.1175/AMSMONOGRAPHS-D-15-0054.1.

• DOE, 1990: Atmospheric Radiation Measurement Program Plan. U.S. DOE Tech. Rep. DOE/ER-0441, 121 pp. [Available at http://www.arm.gov/publications/doe-er-0441.pdf.]

  • DOE, 1991: Identification, recommendation, and justification of potential locales for ARM sites. U.S. DOE Tech. Rep. DOE/ER-0495T, 160 pp.

  • DOE, 1996: Science Plan for the Atmospheric Radiation Measurement Program. U.S. DOE Tech. Rep. DOE/ER-0670T, 174 pp.

  • Ellingson, R. G., and Y. Fouquart, 1991: The intercomparison of radiation codes in climate models: An overview. J. Geophys. Res., 96, 8925–8927, doi:10.1029/90JD01618.

  • Ellingson, R. G., and W. J. Wiscombe, 1996: The Spectral Radiance Experiment (SPECTRE): Project description and sample results. Bull. Amer. Meteor. Soc., 77, 1967–1985, doi:10.1175/1520-0477(1996)077<1967:TSREPD>2.0.CO;2.

  • Ellingson, R. G., R. D. Cess, and G. L. Potter, 2016: The Atmospheric Radiation Measurement Program: Prelude. The Atmospheric Radiation Measurement (ARM) Program: The First 20 Years, Meteor. Monogr., No. 57, Amer. Meteor. Soc., doi:10.1175/AMSMONOGRAPHS-D-15-0029.1.

  • Kollias, P., and Coauthors, 2016: Development and applications of ARM millimeter-wavelength cloud radars. The Atmospheric Radiation Measurement (ARM) Program: The First 20 Years, Meteor. Monogr., No. 57, Amer. Meteor. Soc., doi:10.1175/AMSMONOGRAPHS-D-15-0037.1.

  • Long, C. N., J. H. Mather, and T. P. Ackerman, 2016: The ARM Tropical Western Pacific (TWP) sites. The Atmospheric Radiation Measurement (ARM) Program: The First 20 Years, Meteor. Monogr., No. 57, Amer. Meteor. Soc., doi:10.1175/AMSMONOGRAPHS-D-15-0024.1.

  • Lunn, P., T. S. Cress, and G. M. Stokes, 1995: History and Status of the Atmospheric Radiation Measurement Program—March 1995. Proc. Fifth Atmospheric Radiation Measurement (ARM) Science Team Meeting, San Diego, CA, U.S. DOE, iii–vii. [Available online at https://www.arm.gov/publications/proceedings/conf05/extended_abs/history.pdf.]

  • Luther, F., Ed., 1984: The Intercomparison of Radiation Codes in Climate Models. World Climate Program Rep. WCP-93, 37 pp.

  • McCord, R., and J. W. Voyles, 2016: The ARM data system and archive. The Atmospheric Radiation Measurement (ARM) Program: The First 20 Years, Meteor. Monogr., No. 57, Amer. Meteor. Soc., doi:10.1175/AMSMONOGRAPHS-D-15-0043.1.

  • Michalsky, J. J., and C. N. Long, 2016: ARM solar and infrared broadband and filter radiometry. The Atmospheric Radiation Measurement (ARM) Program: The First 20 Years, Meteor. Monogr., No. 57, Amer. Meteor. Soc., doi:10.1175/AMSMONOGRAPHS-D-15-0031.1.

  • Ohmura, A., and Coauthors, 1998: Baseline Surface Radiation Network (BSRN/WCRP): New precision radiometry for climate research. Bull. Amer. Meteor. Soc., 79, 2115–2136, doi:10.1175/1520-0477(1998)079<2115:BSRNBW>2.0.CO;2.

  • Peppler, R., K. Kehoe, J. Monroe, A. Theisen, and S. Moore, 2016: The ARM data quality program. The Atmospheric Radiation Measurement (ARM) Program: The First 20 Years, Meteor. Monogr., No. 57, Amer. Meteor. Soc., doi:10.1175/AMSMONOGRAPHS-D-15-0039.1.

  • Ramirez, A., Ed., 1990: The federal plan for meteorological services and supporting research: Fiscal year 1990. U.S. Office of the Federal Coordinator for Meteorological Services and Supporting Research Rep. FCM-P1-2015, 290 pp. [Available online at http://www.ofcm.gov/fedplan/FY2016/pdf/FCM-P1-2015.pdf.]

  • Sisterson, D. L., R. Peppler, T. S. Cress, P. Lamb, and D. D. Turner, 2016: The ARM Southern Great Plains (SGP) site. The Atmospheric Radiation Measurement (ARM) Program: The First 20 Years, Meteor. Monogr., No. 57, Amer. Meteor. Soc., doi:10.1175/AMSMONOGRAPHS-D-16-0004.1.

  • Stokes, G. M., 2016: Original ARM concept and launch. The Atmospheric Radiation Measurement (ARM) Program: The First 20 Years, Meteor. Monogr., No. 57, Amer. Meteor. Soc., doi:10.1175/AMSMONOGRAPHS-D-15-0021.1.

  • Stokes, G. M., and S. E. Schwartz, 1994: The Atmospheric Radiation Measurement (ARM) Program: Programmatic background and design of the cloud and radiation test bed. Bull. Amer. Meteor. Soc., 75, 1201–1221, doi:10.1175/1520-0477(1994)075<1201:TARMPP>2.0.CO;2.

  • Turner, D. D., J. E. M. Goldsmith, and R. A. Ferrare, 2016a: Development and applications of the ARM Raman lidar. The Atmospheric Radiation Measurement (ARM) Program: The First 20 Years, Meteor. Monogr., No. 57, Amer. Meteor. Soc., doi:10.1175/AMSMONOGRAPHS-D-15-0026.1.

  • Turner, D. D., E. J. Mlawer, and H. E. Revercomb, 2016b: Water vapor observations in the ARM Program. The Atmospheric Radiation Measurement (ARM) Program: The First 20 Years, Meteor. Monogr., No. 57, Amer. Meteor. Soc., doi:10.1175/AMSMONOGRAPHS-D-15-0025.1.

  • Verlinde, H., B. Zak, M. D. Shupe, M. Ivey, and K. Stamnes, 2016: The ARM North Slope of Alaska (NSA) sites. The Atmospheric Radiation Measurement (ARM) Program: The First 20 Years, Meteor. Monogr., No. 57, Amer. Meteor. Soc., doi:10.1175/AMSMONOGRAPHS-D-15-0023.1.

  • Weber, B. L., and Coauthors, 1990: Preliminary evaluation of the first NOAA demonstration network wind profiler. J. Atmos. Oceanic Technol., 7, 909–918, doi:10.1175/1520-0426(1990)007<0909:PEOTFN>2.0.CO;2.

  • Zhang, M., R. C. J. Somerville, and S. Xie, 2016: The SCM concept and creation of ARM forcing datasets. The Atmospheric Radiation Measurement (ARM) Program: The First 20 Years, Meteor. Monogr., No. 57, Amer. Meteor. Soc., doi:10.1175/AMSMONOGRAPHS-D-15-0040.1.



Deploying the ARM Sites and Supporting Infrastructure

  • 1 Pacific Northwest National Laboratory, Richland, Washington
  • 2 Argonne National Laboratory, Lemont, Illinois

Retired.

Corresponding author address: Ted Cress, 279 Hillwood St., Richland, WA 99352. E-mail: cressts@earthlink.net


1. Background

Ellingson et al. (2016, chapter 1) discusses the scientific stimuli that led to the ARM Program. Stokes (2016, chapter 2) discusses the DOE efforts to develop the program’s concept for a proposal to the Committee on Earth Sciences (CES) of the U.S. Global Change Research Program (USGCRP) in 1990. Two of the most influential scientists driving the proposed program forward were Fred Luther of Lawrence Livermore National Laboratory and Bob Ellingson of Florida State University. As a result of the Intercomparison of Radiation Codes in Climate Models (ICRCCM) activity (Ellingson and Fouquart 1991), Luther (1984) was among those who concluded that “a dedicated field measurement program is recommended for the purpose of obtaining accurate spectral radiances rather than integrated fluxes as a basis for evaluating model performance.”

Ellingson and Wiscombe took it a step further, proposing a field program (Ellingson and Wiscombe 1996; Ellingson et al. 2016, chapter 1) of “real observations” to evaluate radiative modeling results. At the same time, it was realized that observations were needed to evaluate the accuracy of cloud properties that were being predicted by models and that there was much work needed to improve GCMs in this regard. As discussed by Stokes (2016, chapter 2), these influences led the DOE to direct the preparation of a proposal for the incipient USGCRP.

Stokes (2016, chapter 2) discusses the preparation of the initial DOE proposal submitted to the CES and its rejection; it was considered too big and too costly—nearly equaling the then total investment in atmospheric basic research by all departments of the Federal government.1 The CES requested a revised, less expensive proposal (Stokes 2016, chapter 2) that was accepted, which became the ARM Program Plan (DOE 1990; ARM 2016a, appendix A) and the primary guidance for the implementation of the ARM Program. For a field program, that approval was not without its doubters. The proposed program of long-term observations called for the use, in many cases, of instruments that were still experimental or laboratory grade. The proposal also called for data to be acquired from instruments in real time and to be shared in near–real time. The proposed approach was viewed as extremely aggressive, requiring a substantially new approach and potentially new technical capabilities as compared to the historical approach of time-limited campaigns. For the proposed DOE program, the data communications challenges alone appeared nearly insurmountable. As Stokes (2016, chapter 2) observed, the resultant features of the program were “highly controversial.” To the doubters, it appeared to promise a “step too far” with an attendant high potential to fall short.

Indeed, the approved DOE program was aggressive. The Program Plan (ARM 2016a, appendix A) clearly presented objectives and goals for the program but, for deployment purposes, provided no blueprint or roadmap to successfully get there. The proposed duration and complexity were unprecedented. It was clear at the time that the challenges would be many and that a successful deployment would be dependent on effectively recognizing and dealing with issues as they arose. It became a hallmark of the deployment effort that, as new issues or requirements were identified (and there were many), they were, for the most part, effectively addressed, producing robust solutions that typically stood the test of time. It was also clear that the proposed program would not succeed on DOE efforts alone. DOE would need to foster an extraordinary level of interagency collaboration and cooperation, extending from simple acquisition of routine data [e.g., National Weather Service (NWS) weather and radar data] to the acquisition of less accessible data (e.g., raw radar data from NWS or research Doppler radars or satellite and aircraft data from other field programs). These requirements had to be factored into deployment planning for the implementation of a robust data system.

2. Approved Program Plan guidance

In chapter 2, Stokes (2016) described the “intellectual threads” leading to ARM and the evolution of the Program Plan. This section summarizes the key requirements that were identified and thus drove and guided the deployment effort.

Stokes and Schwartz (1994) present the primary ARM Program Plan objectives as the following:

  1. To relate observed radiative fluxes in the atmosphere, spectrally resolved and as a function of position and time, to the atmospheric temperature, composition (specifically including water vapor and clouds), and surface radiative properties.

  2. To develop and test parameterizations that describe atmospheric water vapor, cloud and the surface properties governing atmospheric radiation in terms of relevant prognostic variables, with the objective of incorporating these parameterizations into general circulation and related models.

These objectives were to be pursued with a particular focus on atmospheric radiative properties as a consequence of the presence of clouds and aerosols. Since acquired measurements would be used as input to drive GCM radiation parameterizations and as the validation data for GCM calculations, Ackerman et al. (2016, chapter 3) viewed these objectives in a more direct way:

  1. If we can specify a cloud field, can we compute the radiative fluxes associated with that field?

  2. If we can specify the large-scale atmospheric state variables, can we predict the cloud field properties associated with that state?

This restatement of ARM’s aim to provide the data critical to improve radiation and cloud parameterizations gives an immediate insight into what was required to translate the approved ARM Program Plan into a roadmap leading to a siting strategy, the design and deployment of the measurement sites, and the processing of the data. For GCM applicability, these questions had to be asked for every climate regime.

a. Concept development (fleshing out the skeleton of the Program Plan)

After the approval of the DOE proposal by the CES, DOE requested Gerry Stokes, Pacific Northwest National Laboratory (PNNL), to assume leadership of the effort as ARM’s first chief scientist and to translate the ARM Program Plan into an implementable strategy. In chapter 2, Stokes (2016) discusses the facilitated planning methodology (commonly referred to as WISDM) used to develop the necessary plans and guidance. Facilitated planning workshops (i.e., WISDM sessions) were convened repeatedly over the next few years to prescribe virtually all aspects of the field experiment structure, including site operations, instrument operations, and data processing and management. The workshops identified the programmatic elements and their functions, fleshed out the details of what was needed, and determined how information and data would be captured, distributed, and archived within the program and made available to the ARM Science Team and the general research community. The workshops were not constrained by budget considerations or current technology. The intent was to allow participants to look far enough into the future to deploy a capability that would not become obsolete within a decade. The sentiment was, “If you can imagine it, you can build it.” The results of these workshops are discussed in the following sections.

At the same time, Gerry Stokes was addressing the need to marshal the technical and management manpower with the skills necessary to map out the ARM deployment and to put ARM into the field with the full operational capability envisioned in the ARM Program Plan. To this end, eight DOE laboratories (Argonne National Laboratory, Brookhaven National Laboratory, Oak Ridge National Laboratory, Los Alamos National Laboratory, Sandia National Laboratories, Lawrence Livermore National Laboratory, Pacific Northwest National Laboratory, and the National Renewable Energy Laboratory) were requested to participate and assume responsibilities for various aspects of the program. Management and technical personnel from this family of DOE laboratories constituted the nucleus of the facilitated planning workshops and then effectively collaborated to put workshop results into motion.

It is important to appreciate that this collaboration among the DOE laboratories was a culture shift for the laboratories and for DOE. Although the laboratories certainly had cooperated in field programs before ARM [e.g., the Atmospheric Studies in Complex Terrain (ASCOT) project], they typically had to compete against one another for DOE funding. For the ARM Program, DOE decided to fund the Program through a single management office, which managed and funded the functional efforts of the laboratories. The culture shift was challenging, but the laboratories readily recognized that together they had the required expertise across the range of tasks involved in implementation of the Program, whereas individually they did not. In this respect, the ARM Program was directly a DOE “corporate program,” with the DOE family of laboratories working as a distributed, yet integrated, corporate resource. Programmatic funding did not include funding for the research. Proposed research efforts at those same laboratories were funded directly by DOE via the Science Team solicitation and proposal review process used for all Science Team efforts. But projects like ASCOT did create some spirit of cooperation among many of the scientists at the different laboratories.

b. Requirements (what had to happen?)

The ARM Program Plan provided details, circa 1990, about the types of measurements that could be foreseen as required to support the program’s stated objectives. To be applicable to GCM parameterizations, two measurement approaches were recognized as required:2

  1. A concentration of instruments and support facilities to provide measurements of the vertical column of the atmosphere (commonly referred to as a “soda straw” view of the atmosphere over a point)

  2. An extended network to measure the three-dimensional structure of the atmospheric column on the scale of a GCM grid cell

Combined with the two questions neatly summarized by Ackerman et al. (2016, chapter 3) above and the requirement for real-time data acquisition and quality control, these two approaches allowed a set of measurement requirements important to deployment planning to be specified:

  • Continuous (24/7) measurements

  • Measurements of solar and infrared radiation, both spectrally resolved and broadband, for a range of climatically different meteorological conditions to constrain detailed, line-by-line radiative calculations under clear, cloudy, and overcast conditions for global application

  • Measurements of surface and overlying meteorological variables, including cloud type and distribution, wind, and temperature

  • Measurements of clouds, radiative properties, and atmospheric properties over a wide range of scales

  • Measurements of the microphysical properties of clouds

  • Measurements of atmospheric water vapor, aerosols, and atmospheric trace gases

  • State-of-the-art calibration capabilities

Beyond the measurements themselves, the facility requirements that were identified were just as important for planning and deployment. They included:

  • Sites to be installed in climatologically representative regimes across the globe

  • Sites typically operating for years to acquire statistically significant data for seasonal and annual cycles in the climate system with shorter campaigns in additional areas as deemed necessary

  • Field-hardened instruments

  • Extensibility—the ability to extend and/or adjust measurement facilities to accommodate permanent or temporary adjustments of the operational measurement scheme or the fielding of new instruments, some with potentially high data volume rates

  • A robust data environment

c. The Cloud and Radiation Test Bed

To meet the data and facility requirements summarized above, and consonant with the proposed budget, the Program Plan introduced the concept of the Cloud and Radiation Test Bed (CART) and specified it to have the following features:

  • Four to six permanent field sites, with each site comprising a central facility and an extended network

  • An extended network of 16–25 surface observing sites distributed over a representative GCM grid cell area (additional sites for characterizing the three-dimensional structure of the atmosphere over the central facility important to radiative fluxes being measured at the central facility were added later)

  • An in situ sampling capability

  • A mobile site

  • Capability to host specialized observational campaigns

  • A data environment meeting the requirements of the Program Plan

  • State-of-the-art calibration capability at each site

The term “test bed” was selected deliberately, because the overall goal of the ARM Program was to “develop and test parameterizations of important atmospheric processes, particularly cloud and radiative processes” (Stokes and Schwartz 1994). The goals would be addressed by comparing radiation measurements against calculations and vice versa, in accordance with the questions posed by Ackerman et al. (2016, chapter 3) quoted above.
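Put schematically (an illustrative formulation of these two tests, not notation taken from the Program Plan), the comparisons amount to residuals that should remain small and unbiased across the full range of observed conditions:

```latex
% Question 1 (radiation): given the observed cloud field and atmospheric state,
% how closely do calculated fluxes match the measured ones?
\Delta F(\lambda, t) = F_{\mathrm{obs}}(\lambda, t)
  - F_{\mathrm{calc}}\!\left[\,T(z),\, q(z),\, \mathrm{clouds}(z);\, \lambda, t\,\right]

% Question 2 (clouds): given the observed large-scale state, how closely do
% predicted cloud properties match the observed ones?
\Delta C(t) = C_{\mathrm{obs}}(t) - C_{\mathrm{pred}}\!\left[\,\text{large-scale state};\, t\,\right]
```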

d. Conceptual CART measurement site

The basic conceptual design for a CART measurement site is shown in Fig. 5-1. The idealized CART site, reflective of the listed requirements, included a central facility to acquire data to support the instantaneous radiative flux (IRF) measurement strategy (i.e., the soda straw view overhead) and a surrounding network of sites to meet the requirements for the single-column model (SCM) measurement strategy. The extended surface observing facilities would be distributed over an area comparable to a GCM grid cell. Finally (not shown in the illustration), four to six boundary facilities established at the edges of the CART site would obtain sufficient vertical measurements of the overlying atmosphere to characterize the advective tendencies across the boundaries of the grid cell (Zhang et al. 2016, chapter 24). Also not shown, three to four auxiliary sites would be established around the central facility (approximately 20 km away), equipped to map the three-dimensional structure of the atmosphere over the central facility. The boundary and auxiliary sites were improvements to the original Program Plan scheme and were suggested by the ARM Science Team. The entire site was designed to enable acquisition of sufficient data to represent the full range of processes acting within the grid cell and to permit the desired calculations to evaluate radiative and cloud parameterizations being used in GCMs.

Fig. 5-1.

Conceptual design of ARM experimental network [ARM Program Plan (DOE 1990, Fig. 12)].


e. CART site instrumentation

With the configuration conceptualized, what instrumentation was required to provide the observations needed to address the two questions summarized by Ackerman et al. (2016, chapter 3)? The Program Plan presented a rather comprehensive look at the types of instruments that would likely be required. The selection of instruments actually deployed was influenced heavily by recommendations from the Science Team (Stokes 2016, chapter 2) and through the efforts of the instrument team (see below). The following is a brief list of instrument types envisioned for each site facility.

1) Central facility

  • Spectral radiation: For longwave (infrared) radiation: spectrometers and interferometers for measurements at wavelengths between 3 and 25 µm at high spectral resolution (1 cm−1 or better) using a field-proven, rugged design. For visible radiation (sunlight): spectrophotometers for spectrally resolved measurements.

  • Broadband radiometers: A duplication, to the extent possible, of the pyranometers, pyrgeometers, and pyrheliometers selected by the World Climate Research Program (WCRP) for the Global Baseline Surface Radiation Network (GBSRN) to provide measurements of direct normal shortwave (solar beam), diffuse horizontal shortwave (sky), global horizontal shortwave (total hemispheric), upwelling shortwave (reflected), downwelling longwave (atmospheric infrared), and upwelling longwave (surface infrared) radiation. (A simple consistency check relating the downwelling shortwave components is sketched after this list.)

  • Meteorological instrumentation: Surface-based or tower (i.e., near surface) measurements of meteorological variables [temperature, humidity (i.e., water vapor), pressure, wind speed and direction, precipitation amount, surface fluxes, and cloud cover]. Measurements of aerosols, trace gases, aerosol optical depth, and soil moisture. Vertical profile measurements of meteorological variables associated with radiative transfer. These include in situ measurements from balloonborne sensors or aircraft. The program also considered remote sensors critical, including Raman lidar, differential absorption lidar, and microwave radiometry for water vapor, as well as radio acoustic soundings for winds and temperature, with the idea that these remote sensors would ultimately replace the need to launch radiosondes (Turner et al. 2016a, chapter 18).

  • Calibration instrumentation: A state-of-the-art calibration facility to ensure radiometric calibration with standards referenced to World Radiation Center instruments. A field calibration capability for balloonborne sensors.
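The downwelling shortwave quantities listed for the broadband radiometers are not independent: the global horizontal irradiance should equal the diffuse irradiance plus the direct normal irradiance projected onto the horizontal. A minimal sketch of that consistency check follows; the threshold, variable names, and example values are illustrative and do not represent an ARM algorithm.

```python
import math

def shortwave_component_check(direct_normal, diffuse, global_horizontal,
                              solar_zenith_deg, tolerance=0.08):
    """Compare measured global shortwave with the value rebuilt from its components.

    Irradiances are in W m^-2; solar_zenith_deg is the solar zenith angle in degrees;
    tolerance is the fractional disagreement allowed (illustrative value).
    Returns (reconstructed_global, passes_check).
    """
    mu0 = math.cos(math.radians(solar_zenith_deg))
    reconstructed = diffuse + direct_normal * max(mu0, 0.0)
    if global_horizontal < 50.0:          # skip the ratio test near sunrise/sunset
        return reconstructed, True
    relative_error = abs(reconstructed - global_horizontal) / global_horizontal
    return reconstructed, relative_error <= tolerance

# Example: 800 W m^-2 direct beam, 120 W m^-2 diffuse, sun 30 degrees from zenith.
recon, ok = shortwave_component_check(800.0, 120.0, 815.0, 30.0)
print(f"reconstructed global = {recon:.0f} W m^-2, consistent: {ok}")
```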

2) Extended field sites

  • Broadband radiometric instrumentation identical to that at the central facility, reflective of the WCRP GBSRN (pyranometers, pyrgeometers, and pyrheliometers).

  • Surface meteorological, surface flux, and soil moisture instrumentation as at the central facility.

  • Surface reflectivity measurements.

  • Cloud cover, perhaps with whole-sky imagery.

3) Boundary sites

  • Perhaps collocated with an extended field site.

  • Additional instruments include a balloonborne sounding system for measurements of advective fluxes germane to the SCM experiment.

4) Auxiliary sites

  • Perhaps collocated with an extended field site.

  • Instrumentation suited to map the three-dimensional structure of the atmospheric properties in the vicinity of the soda straw experiment at the central facility.

  • Sites would be 10–20 km from the central facility and instrumented with, perhaps, radio acoustic sounders, scanning radars, and whole-sky cloud imagers.

f. Planning for laboratory-grade or developmental instruments in CART—The IDP

As Stokes (2016, chapter 2) discussed, many effective instruments had been developed and used in short- and midterm observational campaigns, with operation and data quality being dependent on principal investigator interaction and data reduction. Some of these kinds of instruments were identified specifically in the ARM Program Plan as being desirable for CART measurements (DOE 1990). The ARM Instrument Development Program (IDP) (Stokes and Schwartz 1994; DOE 1996; Stokes 2016, chapter 2; Ackerman et al. 2016, chapter 3) was implemented to support continuing engineering development and technical evaluation of selected instruments to develop a capability for unattended, routine 24/7 operations within the CART environment. Several IDP-supported efforts were highly successful, providing instruments for CART that would not otherwise have been available. Some of the successes were the atmospheric emitted radiance interferometer (AERI; Turner et al. 2016b, chapter 13), the multifilter rotating shadowband radiometer (MFRSR; Michalsky and Long 2016, chapter 16), the micropulse lidar, the millimeter cloud radar (MMCR; Kollias et al. 2016, chapter 17), and the Raman lidar (Turner et al. 2016a, chapter 18). Not all instrument development efforts under the IDP were successful in producing instruments for routine, unattended 24/7 operation. Only limited success was achieved for a solar radiance transmission interferometer, an absolute solar transmittance interferometer, and a rotating shadowband spectrometer, for example.

g. Planning for the CART data system

The ARM Program Plan provided a view of what was required for data acquisition, processing, and archival, but the real meat of the CART Data Environment (as the data system was termed in those years) was put on the skeleton during a series of facilitated planning workshops. The skeleton provided by the ARM Program Plan guidance called for the CART Data Environment to have the following characteristics and features (DOE 1990; Stokes and Schwartz 1994; McCord and Voyles 2016, chapter 11):

  • Development to be kept to a minimum, with the system depending on existing data centers for data distribution and archival functions

  • Real-time processing of instrument data streams while executing primary quality control with feedback to site operations for instrument maintenance and operations

  • Data transmission to a central location for conversion to a standard format for distribution to Science Team members and the archive, where it would be available to the general scientific community

  • Data fully documented and archived in raw and processed form to facilitate reprocessing, if required, at later dates

  • Routine acquisition of data from external sources, such as weather and sounding data from NWS, satellite data from NASA and from ongoing field programs, such as the WCRP GBSRN

  • An extensible data system with the capability to grow to accommodate additional instruments and increases in data volume and to support short- or midterm intensive field programs conducted by ARM

The effective implementation and reliable operation of the data system was recognized to constitute a critical element of the program, as clearly stated in the Program Plan: “The design of the ARM project presupposes a well-designed, smoothly functioning, research data management system.”
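As a rough illustration of the real-time processing and operator feedback called for above (a minimal sketch; the calibration coefficients, limits, and field names are hypothetical and do not represent the actual CDE software), each incoming sample would be converted to standard units, calibrated, and screened before being forwarded:

```python
def ingest_sample(raw_counts, cal_slope, cal_offset, valid_min, valid_max):
    """Convert one raw reading to calibrated geophysical units and screen it.

    Returns (value, needs_attention): the value in standard units and a boolean
    telling site operations whether the instrument warrants a look.
    All coefficients and limits here are hypothetical.
    """
    value = cal_slope * raw_counts + cal_offset          # counts -> standard units
    needs_attention = not (valid_min <= value <= valid_max)
    return value, needs_attention

# Example: a barometric pressure channel at 0.002 kPa per count (hypothetical).
pressure, alert = ingest_sample(raw_counts=48550, cal_slope=0.002,
                                cal_offset=0.0, valid_min=85.0, valid_max=108.0)
print(pressure, alert)   # 97.1 kPa, no alert; the sample is forwarded onward
```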

3. Deployment and implementation of CART

As discussed previously, the ARM Program Plan provided a rather detailed high-level view of what the deployed CART would look like, with requirements for operations, instruments, and data processing and archival. However, it did not provide the roadmap to get it done. Facilitated planning workshops were utilized, essentially, to convert the Program Plan into a hierarchical structure of functional requirements for the deployment and operation of CART. Personnel with the required technical backgrounds from the DOE laboratories were tasked to participate as part of the teams addressing each of the functional elements identified earlier. These teams began their efforts in parallel in 1990.

Stokes (2016, chapter 2) points out that the relatively complete description of the relationships between elements of the program provided in the Program Plan enabled the teams’ efforts. The teams were able to take those understood relationships and develop detailed approaches (and therefore plans) for what had to be done to put CART in the field, make it operational, and provide the path to meeting the data needs of the Science Team.

The following sections discuss the deployment activity of those teams focused on site selection, operations, instrumentation, and the data. With respect to the modeling team, the program made the decision not to run its own model and subsequently relied, for the most part, upon the efforts of the Science Team’s SCM working group for modeling activity. The SCM working group led the development of the scheme for estimating advective tendencies using the data from a CART site’s measurement network (Zhang et al. 2016, chapter 24).
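Schematically, the forcing that the SCM working group needed from the boundary and extended facilities is the large-scale advective term in the budget of a gridcell-mean quantity (temperature or moisture, for example). The form below is generic textbook notation, not the specific variational analysis described by Zhang et al. (2016, chapter 24):

```latex
% Generic single-column budget; the bracketed large-scale terms are what the
% boundary-facility soundings were designed to constrain from observations.
\frac{\partial \bar{s}}{\partial t}
  = \underbrace{-\,\overline{\mathbf{V}} \cdot \nabla \bar{s}
     \;-\; \bar{\omega}\, \frac{\partial \bar{s}}{\partial p}}_{\text{large-scale forcing from observations}}
  \;+\; \overline{S}_{s}
```

Here the last term collects the column-internal sources and sinks (radiation, condensation, surface fluxes) that the SCM itself parameterizes.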

a. Site/locale selection

At the outset, one of the highest priority efforts was to identify where CART field sites should be established. Site selection became a two-step process: first identifying climatologically significant locales and then selecting a specific site within each locale. The task of identifying and evaluating climatically significant locales was led by Steve Schwartz from Brookhaven National Laboratory. DOE Technical Report DOE/ER-0495T (DOE 1991) presents the findings, recommendations, and rationale for the locales identified. Specific siting considerations and decisions for the three sites that were ultimately established in the initial deployment are detailed in Sisterson et al. (2016, chapter 6), Long et al. (2016, chapter 7), and Verlinde et al. (2016, chapter 8).

To start the process of locale selection, the team first established a set of principles derived from the purposes of the ARM Program (DOE 1991; Stokes and Schwartz 1994). Briefly stated, these were:

  • The set of locales should stress the radiation models, spanning the domain of radiation-influencing attributes (latitude, altitude, clouds, humidity, aerosols, etc.).

  • Climatological and surface-property attributes should be as homogeneous as possible across the locale (with deliberate exceptions).

  • Establishing a site within the locale must be logistically feasible.

  • The opportunity for collaboration with other programs in a given locale gives additional weight to the significance of that locale.

Based on these principles, nominally 20 locales were identified for further evaluation. Five locales (Fig. 5-2) were recommended and ranked based on operational feasibility and the anticipated budget for establishing the sites as well as the locale’s scientific value (for GCM modeling). The locales and the rationale for each were as follows:

  1. U.S. Southern Great Plains (SGP): logistics; synergism; a wide variety of cloud types; a wide range of temperature and specific humidity; large annual, synoptic, and diurnal variations

  2. Tropical Western Pacific Ocean (TWP): deep tropical convection; cirrus clouds; interannual variability in sea surface temperature; high sea surface temperature; high specific humidity

  3. Eastern North Pacific Ocean (ENP) or eastern North Atlantic Ocean: marine stratus; transition between marine stratus and broken cloud fields; high specific humidity

  4. North Slope of Alaska (NSA): large seasonal variations in surface properties; distinct surface properties from other locales

  5. Gulf Stream off eastern North America, extending eastward: extreme values and variation in surface heat fluxes; marine stratus clouds; altostratus clouds; mature cyclonic storms; genesis region for cumulonimbus and widespread layered clouds associated with large synoptic storms

In addition to the five primary locales for permanent installations, four supplementary locales intended for episodic occupation by the ARM Mobile Facility were identified with the caveat that other sites might be added or substituted based on specific Science Team needs. The four potential ARM mobile facility deployment sites were:
  1. Central Australia or Sonoran Desert: high temperature; low specific humidity; frequent totally clear skies

  2. U.S. Northwest–southwest Canada coast: coastal and orographic inhomogeneity; marine stratus and nimbostratus clouds

  3. Amazon basin or Congo basin: deep convection; large latent heat fluxes; high specific humidity; large seasonal variation in rainfall

  4. Beaufort Sea, Bering Sea, or Greenland Sea: sea ice; sea ice edge; fog and marine stratus clouds

Fig. 5-2.

Geographical distribution of recommended locales circa 1991 (Stokes and Schwartz 1994).


On the basis of the locale recommendations, the ARM management team identified authors to provide locale-specific reports for each of the five primary locales, evaluating the operational issues to be faced. Each of the reports examined the research purposes that each site could address, as well as the specific instrumentation, modeling, implementation, and operational issues that emerged. The authors were free to explore the full set of issues raised by the following operational questions, which established the content of these reports:

  • Why conduct operations within this locale?

  • What measurements must be taken?

  • What are the logistic and operational problems for CART operations at this locale?

  • How can these problems be resolved?

  • What are the logical linkages to other candidate locales and the most appropriate extensions of the primary mission?

  • How would the measurement strategies outlined in the ARM Program Plan (DOE 1990) be implemented at this CART locale?

Significantly, the narrowing of a locale to a site depended on ease of site access, transportation, supplies, and services. Additionally, since the SGP and NSA locales were in the United States, National Environmental Policy Act (NEPA) requirements dictated screening to ensure that the proposed CART site and related activities would not adversely affect any environmentally sensitive areas, such as historic or cultural resources, protected areas such as parks, or other ecologically significant areas. Furthermore, the CART site and activities could not threaten animal species that were considered endangered or threatened or have adverse effects on the designated critical habitats for these species.

Other significant considerations for site placement included restrictions on the use of airspace for aircraft and balloons, operation of lidars, and access to the appropriate radio transmission frequencies and telephone/Internet for operations and communications. Another important discriminant was the opportunity for synergism with other field programs where instruments and data could be shared.

Relatedly, the ARM Program scope was evaluated against budget expectations on an annual basis. Budget limitations, recognized circa 1994–95, resulted in the reduction of planned site deployments from five to three, deferred any development or planning for a mobile facility (possibly permanently), and slowed deployment activity at the primary sites and in the capability growth of the data system (Lunn et al. 1995). The sites to be established were the SGP, TWP, and NSA. The eastern Pacific/Atlantic Ocean and Gulf Stream sites were changed from primary sites to supplementary status. The NSA was moved ahead of the ocean margins (i.e., eastern North Pacific/Atlantic) site because of recent substantial field programs in the eastern North Atlantic and the scarcity of radiation- and/or cloud-related research programs in the Arctic.

b. Site deployment/operations planning

Planning for the physical deployment and operation of the sites was the function of the site operations team led by Sumner Barr of Los Alamos National Laboratory. Like the other teams, the site operations team participated in facilitated planning workshops to flesh out the site deployment and operations management functionality. Included in the defined functionalities for each site were roles for a site manager and a site scientist. Planning the functionality of the sites created the roadmap or blueprint for them, which was something that was missing from the ARM Program Plan.

Based on recommendations from the ARM management team and to kick-start deployment of the first site, DOE named Doug Sisterson of Argonne National Laboratory as the site manager for the SGP. April 1992 was targeted as the start-up date for data to be generated at that site. Setting a target date accomplished two things: first, it removed the fear of delaying site deployment for the development of a perfect plan (Stokes 2016, chapter 2); second, it provided a firm timeline for the infrastructure to map actual deployment milestones. Proposals were requested from the laboratories for nominees to manage the other sites and to propose how these other sites might be configured considering the conceptual CART site design, the functionality being developed in the planning workshops, and the exigencies of the specific locale. Proposals were reviewed by ARM management and recommendations provided to DOE. Consequently, Bill Clements from Los Alamos National Laboratory (LANL) was named site manager for the TWP and Bernie Zak from Sandia National Laboratories (SNL) for the NSA. In addition, Paul Michael from Brookhaven National Laboratory (BNL) was given responsibility for the Gulf Stream site, and Mike Reynolds from BNL for the eastern ocean margins site. Shortly thereafter these two locales were removed from the deployment schedule.

While the first site was to be operational in 1992, the conceptual plan was to activate one site per year, with the five sites operational by 1997. As discussed, budget realities forced the scaling back of the intended project to three sites, deferred/cancelled the Mobile Facility, and dictated a phased deployment of the sites in accordance with what the annual funding allowed. A mantra remained to get all of the sites established and operational as soon as it could be accomplished. It was the site managers who bore the brunt of the effort to meet the timeline, and it was they who saw the task to a successful end despite the plethora of hurdles to overcome. The result was that first measurements at the SGP were started in 1992 (Stokes 2016, chapter 2), but extended facilities were still being deployed two years later. The TWP site achieved initial operational status in 1996 (Long et al. 2016, chapter 7) and the NSA in 1997 (Verlinde et al. 2016, chapter 8).

Site deployment began when the site managers converged on a general blueprint for their sites. Identifying which IRF and SCM requirements could be addressed at each site was the long pole in the planning tent. It was clear that, except for the SGP, neither of the other sites could implement the ideal CART site depicted in Fig. 5-1 with a central facility surrounded by a network of extended sites because of remoteness and geography. Each site had to conceive of a facility and instrument configuration that made the most sense. This challenge alone was major and involved working with ARM management and the Science Team to achieve an optimum and achievable site plan. As usual, however, the devil was in the details, and these details were, at times, major hurdles. A few examples of the types of hurdles that were recognized included NEPA and similar requirements at state and local levels; contracting with foreign governments; and developing local support in the country (i.e., around the U.S. sites) and in foreign lands (not as easy as it sounds). As discussed by Long et al. (2016, chapter 7), the limited availability of scientific and technical talent at the TWP island sites posed a significant challenge and required substantial efforts to reach out to local governments and communities.

Timetables were constructed, and deliverables were determined. Detailed planning went into all aspects of site design, instrumentation, operations, and data quality. The efforts have proven robust, with remarkably few changes being required over years of operation.

c. Concrete poured

No plan survives first contact with the enemy.

—Field Marshal Helmuth Karl Bernhard Graf von Moltke [courtesy of Stokes (2016, chapter 2)]

Moltke’s comment was never so apt as when applied to actually putting sites on the ground in each of the locales. Each of the site locale analysis reports started with the idealized CART site (Fig. 5-1) but then married the ideal with reality. On the basis of synergism with existing measurements from other programs and logistic constraints (e.g., terrain, power, communications, and politics), actual locations for CART sites were identified and the instrumentation and facilities rescoped to fit. The SGP site was the least complicated and closest to the idealized site—about the size of a GCM grid cell (~300 km × 300 km), more-or-less homogeneous across its surface, relatively easily accessible, and there was a road to it. It had the additional benefit of being in the midst of a high density of measurements being made by other programs that would be beneficial to the ARM Science Team.

The TWP and NSA sites, however, were unique. The warm pool in the TWP locale would be best characterized by establishing a central facility–type site (i.e., primarily supporting the IRF, or soda straw, measurement strategy) in the middle of the warm pool with similar facilities at additional sites located closer to the east and west peripheries of the warm pool. These would necessarily be island sites, limiting SCM characterization capability. As an example of the type of out-of-the-box thinking that was required for these remote sites, it was decided that the best deployment model for the TWP would be to build the central facilities using sea containers configured prior to deployment and delivered as units to the selected sites (Long et al. 2016, chapter 7). The NSA site was somewhere in between the SGP and TWP in complexity, being on land but with extremely limited siting options. A central facility focused on the IRF measurement strategy was feasible, but the gridcell characterization would be limited to a few sites at best, supplemented by temporary field programs, probably in collaboration with other programs (Verlinde et al. 2016, chapter 8).

d. Local site operations

The facilitated planning workshops mapped site operations down to an implementable level of detail. This mapping determined the functions of on-site personnel, their qualifications, what information and reports were needed (corrective maintenance, preventive maintenance, calibrations, an operations log book, etc.), and who needed to receive them. The actual staffing of personnel and establishment of operational protocols (e.g., facility and instrument maintenance) were left to the individual site managers. Some debate did arise about operating a site with a permanent scientific staff as opposed to hiring local people and training them. A qualified scientific staff was perceived as desirable because of the complexity and diversity of instrumentation and because ARM was a research program. Hiring locally would require training and routine visits from technical experts but would be more cost-effective and, perhaps, make the site more valued by the surrounding community. The latter scenario was implemented and proved to be very effective.

One of the most critical functionalities identified in the planning process was data acquisition, processing, and communications. While data acquisition and processing will be discussed in a later section, data communication capability was a critical element for site operations, requiring either the provision of sufficient bandwidth for the anticipated data flows [nominally 7 GB day−1 per site (DOE 1990)] or a plan to physically transport storage media (sneakernet)3. The advent of widespread Internet access during this period greatly facilitated data transfer capabilities (McCord and Voyles 2016, chapter 11), with site-specific attributes as discussed in the chapters that follow. For example, early in the TWP’s history, the sneakernet was critical to getting most data back from these tropical sites to the data system computers on the U.S. mainland.
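For a sense of scale, a back-of-envelope conversion (assuming 1 GB = 10^9 bytes and a perfectly steady stream) shows that 7 GB per site per day corresponds to a sustained rate well under 1 Mbit/s, modest by later standards but demanding for early-1990s wide-area links and for remote island or Arctic sites:

```python
gigabytes_per_day = 7.0                       # nominal per-site data volume (DOE 1990)
bits_per_day = gigabytes_per_day * 1e9 * 8    # assuming 1 GB = 10^9 bytes
seconds_per_day = 86400

sustained_bps = bits_per_day / seconds_per_day
print(f"sustained rate ~ {sustained_bps / 1e6:.2f} Mbit/s")   # ~0.65 Mbit/s
```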

e. Site scientists

One of the functional roles identified in the facilitated planning workshops was the need for a site scientist. It was envisioned that a site scientist would be named for each fixed site, ideally a researcher from a university near the site or who was deeply involved in research in the geographic area of the ARM site. The overall responsibility of the site scientists would be to ensure that local site operations did not jeopardize data quality (e.g., by regularly driving diesel trucks past the aerosol intake stack), assist with field campaigns at the site, review on-site changes to physical structures, and oversee the data quality practices for the site. The site scientists would also ensure that ARM efforts did not become insular from the interests of the general atmospheric research community. As the program evolved, the role of the SGP site scientist’s office, in particular, took on a greater role in data quality, as discussed by Ackerman et al. (2016, chapter 3) and Peppler et al. (2016, chapter 12).

A natural tension existed between the site manager and the site scientist, because the ideal support for research objectives was often in conflict with logistical or budgetary reality. The tension ensured that all feasible options were always taken into account in site operations planning.

The selection of the site scientists was made through a request for proposals issued by DOE. DOE and ARM management reviewed all proposals, selecting Dr. Peter Lamb from the University of Oklahoma for the SGP, Dr. Thomas Ackerman from Pennsylvania State University for the TWP, and Dr. Knut Stamnes from the University of Alaska for the NSA as the original site scientists for the three primary sites.

4. Instrumentation

Instrumentation planning for the CART sites followed a path similar to that of the site selection and site operations functional areas. Facilitated planning workshops identified the key functional activities that would be required to acquire, deploy, and operate instruments in the CART site environment. Commercially available and routinely used instruments posed one set of questions for acquisition and implementation, but other desired instruments were not as mature and not as readily incorporated into the 24/7 operation of the CART facility. Additional instruments were anticipated to evolve from the IDP, as discussed earlier, and required a more fundamental approach. The instrument team, under the leadership of Marv Wesely at Argonne National Laboratory, developed a methodology to deploy each class of instrument with primary attention being paid to reliable operation and data quality. Interfacing the instruments to the data system for continuous operation, data quality control, and data distribution will be discussed later. To deal with the spectrum of issues for putting an instrument into the CART environment, an instrument mentor was assigned to each instrument. Site-specific instrument operational issues that required instrument mentor support are discussed in the site-specific chapters.

a. Instrument acquisition

Instrument acquisition within the DOE environment both is and is not a straightforward task. The straightforward part is contracting with a vendor for delivery of an instrument meeting a specified set of requirements (e.g., environmental hardening, performance limits, calibration requirements, and delivery schedule). Critical deliverables for each procurement needed to address finer points of data processing, such as the formats of the data output and available quality control checks, just to name two. The not-so-straightforward issue concerned instrument acquisition funding. The funding to support the ARM Program infrastructure in deploying and maintaining the sites is received from the DOE operations and maintenance (O&M) budget line, where DOE managers have the discretion to determine the funding for individual programs under their oversight. Instruments, however, are considered capital property (if over a modest cost threshold) and must be acquired with capital funds. Capital funding is a different process that requires identification of capital needs, review against other DOE capital needs, and finally a designation of the capital funds that will be allocated to the individual programs. Requestors (programs) will likely not get all the funds requested and, most likely, not in the time frame that they would like to receive them. In this context, capital funding for ARM instruments was not received in time to permit acquisition and delivery before the April 1992 start-up date. To meet that date, the first instruments to arrive at the SGP site were borrowed from NCAR and replaced with ARM property later in the year. It was largely the capital funding foreseen for future years that proved to be a key factor in the decision to reduce the number of sites and phase the instrument deployment schedule, as has been discussed previously.

b. Instrument mentors

As mentioned, ARM recognized that technical experts were needed for each instrument. These experts not only had to know the instrument hardware, but they also needed to know instrument software and, most importantly, had to have used the data from that instrument previously in their own research. ARM instrument mentors were critically important and impactful. Their specific tasks evolved as deployment phased into operation, but new instruments were always on the horizon (or closer), and as such the list of responsibilities remained relatively static during the deployment years:

  • Develop the technical specifications for instruments and spare components to be procured.

  • Develop procedures for instrument operations (e.g., daily rounds, maintenance, and calibration).

  • Assess instrument (and measurement) uncertainty, status, and quality.

  • Manage instrument repairs.

  • Work with site operations to improve instrument performance or upgrade an instrument as appropriate.

  • Work with ARM data system personnel on data product requirements, the specification of appropriate operating ranges, the determination of appropriate data flags when data fall outside the range, and the development of any additional data quality control procedures that are feasible in the course of real-time data acquisition and processing. (A minimal example of such a range check is sketched after this list.)

  • Alert the Data Quality Office, site operations, and science data users to data quality problems.

  • For the balloonborne sounding system (BBSS) and other in situ sampling systems, provide for a continuing in situ sampling program.

  • Participate in intensive operational periods (IOPs)4 as appropriate.
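As a minimal sketch of the range checking implied by the operating-range responsibility above (the bit assignments, limits, and step size are chosen for this example and are not ARM’s actual flag conventions):

```python
def qc_bits(value, previous_value, valid_min, valid_max, max_step):
    """Return a small bit-packed quality flag for one sample.

    Bit 0: value below the mentor-specified minimum
    Bit 1: value above the mentor-specified maximum
    Bit 2: change from the previous sample exceeds max_step (possible spike)
    A flag of 0 means the sample passed all checks.
    """
    flag = 0
    if value < valid_min:
        flag |= 1 << 0
    if value > valid_max:
        flag |= 1 << 1
    if previous_value is not None and abs(value - previous_value) > max_step:
        flag |= 1 << 2
    return flag

# Example: a 2-m air temperature sample in kelvin (limits and step are illustrative).
print(qc_bits(value=302.4, previous_value=301.9,
              valid_min=220.0, valid_max=330.0, max_step=5.0))   # -> 0 (good)
```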

5. The CART Data Environment

The ARM data system (McCord and Voyles 2016, chapter 11), as initially established, was known as the CART Data Environment (CDE). The elemental structure of the CDE and how the data flowed in the system are shown in Fig. 5-3. Recognizing that data was to be made available to the Science Team as soon as instruments were in the field, planning for the data system and development of some elements of the CDE were begun immediately upon approval and funding of the program. One aim of the CDE developmental effort was, to the degree feasible, to use existing facilities and data centers for processing, dissemination, and archiving of the data. Underscoring what proved to be an inadequate ability to foresee the pace of technological advance, the plan also specified that “existing technology in software and hardware” would be used (to control cost). Of course, one could argue over the definition of the word “existing” (i.e., at the time or at a time in the future).

Fig. 5-3.

CART Data Environment circa 1992 showing data flow. The Site Data System is a component of the CDE established at each site’s central facility for the purpose of acquiring data in real time and completing initial processing, which included conversion to standard units, application of calibrations, passing data through quality control checks, and running measurement-related algorithms for evaluating instrument performance. Processed data were forwarded to the Experiment Center, and both raw and processed data were sent to the archive. The Experiment Center was a two-component data center, with a program data center at PNNL and an external data center at BNL. The PNNL facility was responsible for receiving data from the sites, creating higher-order data products, creating data packages tailored to the needs of Science Team members, and “pushing” those packages to them. The BNL facility was responsible for acquiring data from external sources, creating data packages for the Science Team, and pushing the packages to them. The ARM data archive was established at Oak Ridge National Laboratory and was responsible for receiving and archiving all raw and processed data from all field sites and the Experiment Center. It also had reprocessing responsibility and acted as the user interface for the general scientific community (Stokes and Schwartz 1994).

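As a concrete, if simplified, illustration of the initial processing and data flow described in the figure caption, the sketch below converts raw counts to calibrated values, applies a range-based quality check, and hands the results off to stand-ins for the Experiment Center and the archive. The calibration coefficients, limits, and function names are hypothetical and are not the actual CDE code.

```python
# Minimal sketch of the data flow described in the caption of Fig. 5-3:
# convert raw counts to calibrated values in standard units, apply a
# quality-control check, forward processed data to the Experiment Center,
# and send both raw and processed data to the archive. All coefficients,
# limits, and function names are hypothetical.

def calibrate(raw_counts, gain=0.25, offset=-10.0):
    """Linear calibration from raw instrument counts to standard units."""
    return gain * raw_counts + offset

def quality_ok(value, valid=(0.0, 1500.0)):
    """Simple range-based quality-control check."""
    return valid[0] <= value <= valid[1]

def send_to_experiment_center(record):   # stand-in for a network transfer
    print("to Experiment Center:", record)

def send_to_archive(record):             # stand-in for an archive transfer
    print("to archive:", record)

def handle_sample(raw_counts):
    value = calibrate(raw_counts)
    processed = {"value": value, "qc_ok": quality_ok(value)}
    send_to_experiment_center(processed)               # processed data only
    send_to_archive({"raw": raw_counts, **processed})  # raw plus processed

handle_sample(3120)   # calibrated value 770.0 passes the range check
```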

The mantra for the program was that the data would have “known and reasonable quality” (Stokes 2016, chapter 2). Accordingly, the Program Plan discussed various operational requirements for the data system. The following requirements distilled from the Program Plan were central to the design and implementation of the CDE:

  • Real-time quality control (for every instrument data stream)

  • Data availability on site for instrument monitoring

  • “Ready availability” of data to data centers

  • Data to be “well documented”

  • Data (raw and processed) and data documentation (metadata) to be archived for future reference and the possible necessity of reprocessing in the future

  • Data converted to a “standard format” for ease of use by the research community (a minimal illustration follows this list)
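
To illustrate what a well-documented record in a standard format can look like, the following minimal sketch writes a small self-describing file. NetCDF is used here only as one example of such a format; the site, variable names, attributes, and values are hypothetical and do not represent an actual ARM data stream.

```python
# Sketch of "standard format" plus "well documented" data: a self-describing
# file that carries its own metadata. All names and values are hypothetical.
from netCDF4 import Dataset   # third-party package: pip install netCDF4

with Dataset("example_temperature.nc", "w") as ds:
    # Global metadata documenting the file itself
    ds.site = "hypothetical central facility"
    ds.history = "created by a toy ingest script for illustration"

    # Dimensions and variables, each carrying its own documentation
    ds.createDimension("time", None)                 # unlimited dimension
    time = ds.createVariable("time", "f8", ("time",))
    time.units = "seconds since 1992-04-01 00:00:00"
    temp = ds.createVariable("air_temperature", "f4", ("time",))
    temp.units = "degC"
    temp.valid_min = -60.0                           # documented limits
    temp.valid_max = 60.0

    time[:] = [0.0, 60.0, 120.0]
    temp[:] = [21.4, 21.5, 21.3]
```

A file of this kind can be read by any user without reference to external documentation, which is the essence of the “well documented” and “standard format” requirements.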

The CDE schematic implies data moving smoothly through the system, but as has been discussed, adequate bandwidth needed to be established at the SGP site, and data from the remote sites had to be physically transported on storage media. The data flow was not necessarily smooth or without delay, at least at first.

The deployment era brought great challenges, considerable time and travel, and a group of experts who came together to learn the software and hardware needed to design and implement the CDE. For the most part, they used strict software system design protocols to design the various aspects of the system and to develop the required code. To that group, the data management gurus, the data system probably appeared to be simply a very large but tractable design challenge. To the nongurus, it looked like an overwhelming task with innumerable opportunities to fail. The truth, fortunately, lay in between, and a functional and reliable system was developed, with the first elements of the system in place and transmitting data successfully from the SGP site in April 1992.

a. Data and Science Integration Team

The Data and Science Integration Team (DSIT)5 was the evolving group of data system developers and other members of the ARM Program responsible for developing, fielding, and managing the various elements that made up the CDE. For the purposes of this discussion, references to the DSIT and its responsibilities embody the spectrum of this activity from development to deployment to operation, regardless of whether DSIT was the formal name at the time. Like the name, the leadership of this activity changed with time, beginning with Ron Melton and later Jimmy Voyles, both at PNNL. Paul Kanciruk, Paul Singly, and Raymond McCord at Oak Ridge National Laboratory (ORNL), with primary responsibility for the ARM data archive, were, in essence, partners with Melton and Voyles in managing this very large and challenging aspect of the ARM Program. Gerry Stokes, the first ARM chief scientist, felt that the DSIT (and its predecessor, the Experiment Support Team) was important because an observatory had to support ongoing experiments, which required coordination.

The DSIT had many faces. On one hand, it developed the software and computer systems for site operations and the data centers. On the other hand, it was responsible for interacting with members of the Science Team, leading the effort to translate science needs into data needs, with data quality as a fundamental objective. For data acquisition, the DSIT involved the instrument mentors to ensure that the instruments were producing data of the expected precision and accuracy and that a complete record of instrument calibrations and operational history was maintained. In another context, the DSIT was the element of ARM responsible for responding to what might be called special data needs. One example was the development of showcase datasets. Early in the program, there was concern about how to make ARM data more accessible and usable, both by the Science Team and by the general scientific community. One option was to create more or less complete datasets addressing a given question. The DSIT worked with Science Team members to create and organize such datasets, making each available as a single entity.

In essence, in the data world of ARM, the DSIT “carried the water.” It was at the heart of ARM and served a critical function for the Science Team. As deployment was completed, various aspects of the DSIT’s role split off to become parts of other functions, as detailed in part by Ackerman et al. (2016, chapter 3), although they do not call out the DSIT by name.

b. Value-added products

Value-added products (VAPs) were not, as a concept, an identified element of the initial deployment in the early 1990s; they are discussed in substantial detail by Ackerman et al. (2016, chapter 3). Antecedents to VAPs, however, were developed almost immediately upon instrument deployment. To ensure the highest data quality, measurements from different sources were merged to create integrated data products specifically intended to evaluate and improve the measurement capabilities of the deployed instruments. As the project matured, these merged products became more complex and involved but still served their purpose to “make it easier for Science Team members to use ARM data, or to reprocess the original dataset to improve the quality of the data” (DOE 1996; ARM 2016a, appendix B). If the results of a VAP pointed to an instrument or operational improvement, those results were fed back to the DSIT and the instrument mentors for evaluation and, where feasible, implementation of corrective action.
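
A minimal sketch of the merged-product idea behind these early VAP antecedents follows: two hypothetical instrument streams are placed on a common time base so they can be intercompared. The data, sampling intervals, and the simple interpolation used here are illustrative only and do not represent any actual ARM product.

```python
# Minimal sketch of a merged data product: put two independently sampled
# measurement streams on a common time base so they can be intercompared.
# All values and names are hypothetical.
import numpy as np

# Instrument A: regular 1-min sampling; Instrument B: irregular ~3-min sampling.
time_a = np.arange(0.0, 30.0, 1.0)                 # minutes
temp_a = 20.0 + 0.05 * time_a                      # deg C
time_b = np.array([0.0, 3.2, 6.1, 9.4, 12.8, 16.0, 19.5, 22.7, 26.3, 29.0])
temp_b = 20.2 + 0.05 * time_b

# Interpolate instrument B onto instrument A's time grid and difference the
# two streams; a persistent offset would point back to the instrument
# mentors for a calibration check.
temp_b_on_a = np.interp(time_a, time_b, temp_b)
offset = np.mean(temp_a - temp_b_on_a)
print(f"mean A-B difference: {offset:.2f} degC")   # -0.20 degC here
```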

c. External and IOP data

It was recognized, even during the preparation of the Program Plan, that ARM could not unilaterally obtain all of the data that the Science Team would require. Data from programs such as the NOAA Wind Profiler Demonstration Network (e.g., Weber et al. 1990) and the WCRP BSRN sites (e.g., Ohmura et al. 1998), as well as routine surface observations and vertical soundings from the NWS, would have to be acquired and distributed to meet Science Team data requirements. In addition, data would be needed from field programs conducted in collaboration with ARM [e.g., FIRE and the Spectral Radiance Experiment (SPECTRE)] or conducted in the vicinity of ARM sites, such as TOGA COARE near the TWP island sites. The planning and development of the functionality of the ARM data centers were undertaken in full recognition of these realities, and acquiring such external data became part of the routine operation of the program. While ARM was designed primarily to provide continuous routine measurements, those measurements were not always going to be sufficient. The capability was therefore planned into ARM and its data system to ramp up for high-intensity efforts for limited periods of time, acquiring data during IOPs that might be “too expensive or personnel intensive to be conducted continuously” (Stokes and Schwartz 1994).

6. Conclusions

The ARM Program was conceived in the wake of the ICRCCM research effort, which concluded that one of the largest sources of error in the GCMs used for climate modeling lay in the radiation parameterization components of those models. The recommended path to improving the parameterizations was a long-term measurement program involving permanent (more than a decade) ground sites measuring a full spectrum of radiation-influencing parameters of the atmosphere. DOE opted to adopt this issue as the primary focus of the department’s contribution to the USGCRP. To organize and implement the necessary field observational program, DOE tapped the technical strength of its family of laboratories, using the expertise in those laboratories as a corporate resource.

During planning discussions, it was recognized that, while the ICRCCM recommendation on its surface suggested a soda straw experiment (measurements in a column over a point), clouds and their representation in the models were part of the larger picture of atmospheric radiation processes and needed to be addressed as well. The laboratory planning sessions therefore produced a concept involving a highly instrumented permanent facility surrounded by smaller groups of instruments to document the three-dimensional structure of the atmosphere over the soda straw measurement facility at the center. A network of five measurement sites was proposed, with additional sites for short-term data acquisition efforts. Budget considerations limited the project to three permanent sites: the SGP, TWP, and NSA. The first and most comprehensive site was the SGP, which produced its first data in 1992, with deployment continuing into 1996. The TWP reached initial operational capability in 1996, and the NSA began operating continuously in 1997. Together, these sites constituted what was termed the Cloud and Radiation Test Bed.

The data system for the project was designed to acquire data in real time, ensure data quality control by several methods, and transfer the data to the Science Team and to the ARM data archive. The ARM data archive then functioned as the entry point (user facility) for members of the general scientific community who wished to access and use ARM data. These capabilities have evolved since the initial deployment but remain very much in operation today.

REFERENCES

  • ARM, 2016a: Appendix B: Executive summary: Science Plan for the Atmospheric Radiation Measurement Program (ARM). The Atmospheric Radiation Measurement (ARM) Program: The First 20 Years, Meteor. Monogr., No. 57, Amer. Meteor. Soc., doi:10.1175/AMSMONOGRAPHS-D-15-0035.1.

  • ARM, 2016b: Appendix C: Executive summary: Atmospheric Radiation Measurement Program Science Plan: Current status and future directions of the ARM Science Program. The Atmospheric Radiation Measurement (ARM) Program: The First 20 Years, Meteor. Monogr., No. 57, Amer. Meteor. Soc., doi:10.1175/AMSMONOGRAPHS-D-15-0034.1.

  • Ackerman, T. P., T. S. Cress, W. R. Ferrell, J. H. Mather, and D. D. Turner, 2016: The programmatic maturation of the ARM Program. The Atmospheric Radiation Measurement (ARM) Program: The First 20 Years, Meteor. Monogr., No. 57, Amer. Meteor. Soc., doi:10.1175/AMSMONOGRAPHS-D-15-0054.1.

  • DOE, 1990: Atmospheric Radiation Measurement Program Plan. DOE Tech. Rep. DOE/ER-0441, 121 pp. [Available at http://www.arm.gov/publications/doe-er-0441.pdf.]

  • DOE, 1991: Identification, recommendation, and justification of potential locales for ARM sites. U.S. DOE Tech. Rep. DOE/ER 0495T, 160 pp.

  • DOE, 1996: Science Plan for the Atmospheric Radiation Measurement Program. U.S. DOE Tech. Rep. DOE/ER-0670T, 174 pp.

  • Ellingson, R. G., and Y. Fouquart, 1991: The intercomparison of radiation codes in climate models: An overview. J. Geophys. Res., 96, 8925–8927, doi:10.1029/90JD01618.

  • Ellingson, R. G., and W. J. Wiscombe, 1996: The Spectral Radiance Experiment (SPECTRE): Project description and sample results. Bull. Amer. Meteor. Soc., 77, 1967–1985, doi:10.1175/1520-0477(1996)077<1967:TSREPD>2.0.CO;2.

  • Ellingson, R. G., R. D. Cess, and G. L. Potter, 2016: The Atmospheric Radiation Measurement Program: Prelude. The Atmospheric Radiation Measurement (ARM) Program: The First 20 Years, Meteor. Monogr., No. 57, Amer. Meteor. Soc., doi:10.1175/AMSMONOGRAPHS-D-15-0029.1.

  • Kollias, P., and Coauthors, 2016: Development and applications of ARM millimeter-wavelength cloud radars. The Atmospheric Radiation Measurement (ARM) Program: The First 20 Years, Meteor. Monogr., No. 57, Amer. Meteor. Soc., doi:10.1175/AMSMONOGRAPHS-D-15-0037.1.

  • Long, C. N., J. H. Mather, and T. P. Ackerman, 2016: The ARM Tropical Western Pacific (TWP) sites. The Atmospheric Radiation Measurement (ARM) Program: The First 20 Years, Meteor. Monogr., No. 57, Amer. Meteor. Soc., doi:10.1175/AMSMONOGRAPHS-D-15-0024.1.

  • Lunn, P., T. S. Cress, and G. M. Stokes, 1995: History and Status of the Atmospheric Radiation Measurement Program—March 1995. Proc. Fifth Atmospheric Radiation Measurement (ARM) Science Team Meeting, San Diego, CA, U.S. DOE, iii–vii. [Available online at https://www.arm.gov/publications/proceedings/conf05/extended_abs/history.pdf.]

  • Luther, F., Ed., 1984: The Intercomparison of Radiation Codes in Climate Models. World Climate Program Rep. WCP-93, 37 pp.

  • McCord, R., and J. W. Voyles, 2016: The ARM data system and archive. The Atmospheric Radiation Measurement (ARM) Program: The First 20 Years, Meteor. Monogr., No. 57, Amer. Meteor. Soc., doi:10.1175/AMSMONOGRAPHS-D-15-0043.1.

  • Michalsky, J. J., and C. N. Long, 2016: ARM solar and infrared broadband and filter radiometry. The Atmospheric Radiation Measurement (ARM) Program: The First 20 Years, Meteor. Monogr., No. 57, Amer. Meteor. Soc., doi:10.1175/AMSMONOGRAPHS-D-15-0031.1.

  • Ohmura, A., and Coauthors, 1998: Baseline Surface Radiation Network (BSRN/WCRP): New precision radiometry for climate research. Bull. Amer. Meteor. Soc., 79, 2115–2136, doi:10.1175/1520-0477(1998)079<2115:BSRNBW>2.0.CO;2.

  • Peppler, R., K. Kehoe, J. Monroe, A. Theisen, and S. Moore, 2016: The ARM data quality program. The Atmospheric Radiation Measurement (ARM) Program: The First 20 Years, Meteor. Monogr., No. 57, Amer. Meteor. Soc., doi:10.1175/AMSMONOGRAPHS-D-15-0039.1.

  • Ramirez, A., Ed., 1990: The federal plan for meteorological services and supporting research: Fiscal year 1990. U.S. Office of the Federal Coordinator for Meteorological Services and Supporting Research Rep. FCM-P1-2015, 290 pp. [Available online at http://www.ofcm.gov/fedplan/FY2016/pdf/FCM-P1-2015.pdf.]

  • Sisterson, D. L., R. Peppler, T. S. Cress, P. Lamb, and D. D. Turner, 2016: The ARM Southern Great Plains (SGP) site. The Atmospheric Radiation Measurement (ARM) Program: The First 20 Years, Meteor. Monogr., No. 57, Amer. Meteor. Soc., doi:10.1175/AMSMONOGRAPHS-D-16-0004.1.

  • Stokes, G. M., 2016: Original ARM concept and launch. The Atmospheric Radiation Measurement (ARM) Program: The First 20 Years, Meteor. Monogr., No. 57, Amer. Meteor. Soc., doi:10.1175/AMSMONOGRAPHS-D-15-0021.1.

  • Stokes, G. M., and S. E. Schwartz, 1994: The Atmospheric Radiation Measurement (ARM) Program: Programmatic background and design of the cloud and radiation test bed. Bull. Amer. Meteor. Soc., 75, 1201–1221, doi:10.1175/1520-0477(1994)075<1201:TARMPP>2.0.CO;2.

  • Turner, D. D., J. E. M. Goldsmith, and R. A. Ferrare, 2016a: Development and applications of the ARM Raman lidar. The Atmospheric Radiation Measurement (ARM) Program: The First 20 Years, Meteor. Monogr., No. 57, Amer. Meteor. Soc., doi:10.1175/AMSMONOGRAPHS-D-15-0026.1.

  • Turner, D. D., E. J. Mlawer, and H. E. Revercomb, 2016b: Water vapor observations in the ARM Program. The Atmospheric Radiation Measurement (ARM) Program: The First 20 Years, Meteor. Monogr., No. 57, Amer. Meteor. Soc., doi:10.1175/AMSMONOGRAPHS-D-15-0025.1.

  • Verlinde, H., B. Zak, M. D. Shupe, M. Ivey, and K. Stamnes, 2016: The ARM North Slope of Alaska (NSA) sites. The Atmospheric Radiation Measurement (ARM) Program: The First 20 Years, Meteor. Monogr., No. 57, Amer. Meteor. Soc., doi:10.1175/AMSMONOGRAPHS-D-15-0023.1.

  • Weber, B. L., and Coauthors, 1990: Preliminary evaluation of the first NOAA demonstration network wind profiler. J. Atmos. Oceanic Technol., 7, 909–918, doi:10.1175/1520-0426(1990)007<0909:PEOTFN>2.0.CO;2.

  • Zhang, M., R. C. J. Somerville, and S. Xie, 2016: The SCM concept and creation of ARM forcing datasets. The Atmospheric Radiation Measurement (ARM) Program: The First 20 Years, Meteor. Monogr., No. 57, Amer. Meteor. Soc., doi:10.1175/AMSMONOGRAPHS-D-15-0040.1.

1. The initial DOE proposal was about $200 million per year. The total federal budget for “supporting research” in meteorology for fiscal year 1990 was $445 million, of which $210 million was for NASA, mostly for satellite research and development. The remaining $235 million included engineering development efforts as well as basic research (Ramirez 1990).

2. The surrogate science team meetings described by Stokes (2016, chapter 2) termed these efforts the instantaneous radiative flux measurement strategy and the single-column model experiment measurement strategy, respectively.

3. Sneakernet: a term coined to denote the physical recording of data and the transport of that recorded data to a processing center by personnel returning from the remote site.

4. “IOP” has been an ambiguous reference in ARM since the beginning. Some use it to refer to an “intensive operational period,” and some prefer to use it to mean an “intensive observational period.” Both uses are found regularly on the ARM website.

5. Before the DSIT, there was an Experiment Support Team and a Data Management Team. Marv Dickerson and Ric Cederwall were leaders of the former, and Ron Melton and Jimmy Voyles led the latter. These two teams were merged to create the DSIT.
