2012 Unidata Users Workshop: Navigating Earth System Science Data

Steven M. Lazarus, Florida Institute of Technology, Melbourne, Florida

Jennifer M. Collins, University of South Florida, Tampa, Florida

Martin A. Baxter, Central Michigan University, Mount Pleasant, Michigan

Anne Case Hanks, University of Louisiana at Monroe, Monroe, Louisiana

Thomas M. Whittaker, Space Science and Engineering Center/Cooperative Institute for Meteorological Satellite Studies, University of Wisconsin—Madison, Madison, Wisconsin

Kevin R. Tyle, University at Albany, State University of New York, Albany, New York

Stefan F. Cecelski, University of Maryland, College Park, College Park, Maryland

Bart Geerts, University of Wyoming, Laramie, Wyoming

Mohan K. Ramamurthy, Unidata, University Corporation for Atmospheric Research, Boulder, Colorado


CORRESPONDING AUTHOR ADDRESS: Steven M. Lazarus, W. University Blvd., Melbourne, FL 32901, E-mail: slazarus@fit.edu

As part of its mission,1 the Unidata Program Center (UPC) works with the Unidata Users Committee to organize triennial summer workshops2 on topics of interest to the Unidata community. The 2012 workshop theme, "Navigating Earth System Science Data," was chosen in part to address a two-pronged challenge: how can Unidata best serve the data needs of the education and the research communities? One key goal of the workshop was to raise the level of data awareness within the academic geoscience community: this was accomplished through a diverse set of presentations on software, data access/applications, visions of the future, and a student-led poster session. Developed jointly by the UPC and Unidata Users Committee, the workshop goals reflect Unidata's prime directives to provide and support the flow of real-time geoscience data and to facilitate the use of these data in geoscience education. For the 2012 workshop, emphasis was placed on a number of different areas, including Unidata's network Common Data Form (NetCDF) and its associated standards; the Unidata model as it relates to the National Science Foundation's EarthCube initiative; features of data servers developed or supported by Unidata, including the Thematic Realtime Environmental Distributed Data Services (THREDDS) data server (TDS) and the Repository for Archiving, Managing, and Accessing Diverse Data (RAMADDA) server; promotion of Unidata's Integrated Data Viewer (IDV); practical take-home knowledge and skills; and the promotion and dissemination of new data management tools. Participants were encouraged to bring their own laptops and to download specific workshop-related software in advance (e.g., the IDV). At the end of two of the workshop days, there were "meet the developer" sessions designed to provide the participants with an opportunity for one-on-one interaction with the speakers and developers. Finally, all participants were given a USB flash drive that contained a prototype "Unidata in a box" suite of UPC-developed software tools, which run on a virtual machine.

2012 UNIDATA USERS WORKSHOP

What: Unidata staff and community participants from academia, federal agencies, research institutes, and consortia met to raise awareness of data science in the geoscience academic community and share hands-on activities, course materials, and ideas for improving research and education.

When: 9–13 July 2012

Where: Boulder, Colorado

The workshop demographics reflected broad community interest, with approximately half of the 102 attendees coming from academic institutions around the world, representing both small (e.g., Bemidji State) and large universities (e.g., Ohio State). Personnel from federal agencies, research institutes, and consortia were also in attendance—including the Mexican Institute of Water Technology, the University of North Carolina Institute for the Environment, Goddard Space Flight Center, the U. S. Geological Survey (USGS), the Desert Research Institute, the National Space Research and Development Agency, the Consortium of Universities for the Advancement of Hydrologic Science, and the Centers for Disease Control and Prevention. Attendees' individual backgrounds were quite varied; modelers were especially diverse, with interests in ecological, agricultural, environmental, and climate systems. In addition, there were attendees with interests in ocean biogeochemistry, hydrology, marine geology, computer science, and system administration. Student participation was encouraged by providing travel grants and waiving the workshop registration fees, with the expectation that students would present a poster featuring material related to the workshop's theme.

MEETING SUMMARY.

The presentation formats were a mix of conference-style plenary presentations, demonstrations, and hands-on activities falling broadly into three areas: software demonstrations, data access and applications, and a look to the future termed Blue Sky. While a portion of the workshop was devoted to Unidata and other software tools used within the geosciences, many of the presentations were geared toward data awareness—focusing on existing data archives and portals. Unlike previous Unidata triennial workshops, which had daily themes, the 2012 workshop was more open ended, although each day began with a keynote presentation. For the most part, the keynotes gave a big-picture look at the data future. In many cases, individual presentations contained elements of all three areas; the groupings below are not meant to be mutually exclusive.

All workshop presentations are housed on Unidata's RAMADDA server. The web address for this server and websites related to each presenter are provided in appendix B in the order in which they are discussed in the text.

Software demonstrations.

Workshop participants were treated to a number of data display demonstrations throughout the week. Bob Hart of Florida State University extolled the benefits of the Grid Analysis and Display System (GrADS) software package while also noting its shortcomings. As an extensive user of GrADS (he generates over 10,000 images per day), Hart showed various tropical cyclone images (closest approach to landfall and cyclone phase space diagrams), ensemble model time series, and a dynamic animation of the locations of daily record maximum temperatures. Using model output from FSU and observations, Justin Hartnett from the University of South Florida showed how the IDV can be used as a learning tool to examine the vertical structure of warm-core tropical cyclones. Specifically, sea level pressure was used to geolocate Hurricane Irene, and then model soundings (both near the storm center and along the perimeter) were extracted and compared. Stefan Cecelski of the University of Maryland dazzled the audience with his IDV scripting language (ISL) prowess, illustrating the power of the IDV to generate high-quality graphics and animations. Showing the end product first [an image featuring a combination of absolute vorticity, streamlines, and mean sea level pressure (MSLP)], Cecelski presented a three-step approach that included 1) the creation of the image with an embedded colorbar; 2) the generation of an IDV bundle; and 3) information on how to create and run an ISL script that references the bundle and adds a finishing touch to the image.
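
As a rough illustration of the scripting workflow Cecelski described, the sketch below writes a minimal ISL file that loads a saved bundle and renders an image with an embedded colorbar, then launches the IDV against it. The file names, the assumption that the runIDV launcher is on the PATH, and the exact ISL tags and attributes are illustrative only and are not taken from his presentation.

    # Minimal sketch (hypothetical file names): render an image from an IDV bundle via ISL.
    # Assumes the IDV launcher "runIDV" is on the PATH; details vary by IDV version.
    import subprocess
    import textwrap

    isl_script = textwrap.dedent("""\
        <isl>
          <!-- Load a previously saved bundle (e.g., vorticity, streamlines, MSLP) -->
          <bundle file="irene.xidv"/>
          <pause/>
          <!-- Write the image; the colorbar tag embeds a color bar in the output -->
          <image file="irene.png">
            <colorbar/>
          </image>
        </isl>
    """)

    with open("make_image.isl", "w") as f:
        f.write(isl_script)

    # Passing an .isl file on the command line runs the IDV against the script.
    subprocess.run(["runIDV", "make_image.isl"], check=True)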

With the expected deployment sometime in 2013 of the National Centers for Environmental Prediction (NCEP) and National Weather Service (NWS)'s Advanced Weather Interactive Processing System, version 2 (AWIPS II) software package, Michelle Mainelli of NCEP, in tandem with Unidata's Michael James, gave a demo of the Common AWIPS Visualization Environment (CAVE). The CAVE retains many of the positive attributes of the existing National Centers' Advanced Weather Interactive Processing System (N-AWIPS) NMAP graphical user interface (GUI) it will replace, such as the product generation tool, while adding upgrades such as an Extensible Markup Language (XML) editor that will allow the user to customize the user interface.

The workshop took on more of a programming flavor as Daryl Herzmann from Iowa State University gave a demo on the capabilities of the IPython toolkit and dashboard, providing concrete examples of using Python to route data from Unidata's local data manager (LDM) to Twitter and creating stable URLs. In particular, Herzmann suggested that first creating a data archive and then using that URL on a website would improve the stability. The archive (and subsequent HTML link) might be organized by date, data type, and so on. In a related talk, Johnny Lin of North Park University led the participants through a Python-related application executed through the Ultrascale Visualization—Climate Data Analysis Tools (UV-CDAT) GUI. Lin also engaged the workshop attendees with a simple Python data-analysis application that uses Python dictionaries to link names with a variable or function on the fly.
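
A minimal sketch of the dictionary-dispatch idea Lin demonstrated follows; the variable names and functions are illustrative and are not taken from his talk.

    # Minimal sketch: use a Python dictionary to link names to functions "on the fly."
    # The quantities computed here are illustrative, not from Lin's example.
    import numpy as np

    def celsius_to_kelvin(t_c):
        return t_c + 273.15

    def dewpoint_depression(t_c, td_c):
        return t_c - td_c

    # Map string keys to callables; entries can be added or swapped at runtime.
    analyses = {
        "temperature_k": lambda obs: celsius_to_kelvin(obs["t"]),
        "dewpoint_depression": lambda obs: dewpoint_depression(obs["t"], obs["td"]),
    }

    obs = {"t": np.array([20.0, 25.0]), "td": np.array([15.0, 24.0])}
    for name, func in analyses.items():
        print(name, func(obs))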

Data access and applications.

The data floodgates have been opened, presenting a number of challenges to the Unidata community. In addition to not knowing what is out there, data volume can be problematic in a number of ways including bandwidth, processing, storage, metadata, and so on. Using a RAMADDA server to stage and share data, Kevin Tyle [University at Albany–State University of New York (SUNY)] presented an example of mining and processing a subset of the NCEP Climate Forecast System Reanalysis (CFSR). Originally available as individual daily files (4 analyses per day) in General Regularly Distributed Information in Binary format, edition 2 (GRIB2), the data were converted to NetCDF and composited into large yearly files.
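
A rough sketch of this kind of conversion and compositing step is shown below; it is not Tyle's actual workflow, and the file names, directory layout, and the choice of wgrib2 and NCO as the conversion and concatenation tools are assumptions.

    # Rough sketch (assumed file layout and tools): convert daily GRIB2 files to NetCDF
    # with wgrib2, then concatenate them along the time dimension into a yearly file
    # with NCO's ncrcat.
    import glob
    import subprocess

    year = 1989  # placeholder year
    daily_files = []
    for grib in sorted(glob.glob(f"cfsr/{year}/*.grb2")):
        nc = grib.replace(".grb2", ".nc")
        subprocess.run(["wgrib2", grib, "-netcdf", nc], check=True)  # GRIB2 -> NetCDF
        daily_files.append(nc)

    # Record-concatenate the daily files into one large yearly file
    subprocess.run(["ncrcat"] + daily_files + [f"cfsr_{year}.nc"], check=True)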

There were plenty of presentations for data junkies during the course of the week, with Don Murray from the University of Colorado Cooperative Institute for Research in Environmental Sciences (CIRES) serving up a plate full of climate graphics via the National Oceanic and Atmospheric Administration (NOAA) Earth System Research Laboratory/Physical Sciences Division's (ESRL/PSD) map room, which sports a potpourri of graphical products. In addition, Murray discussed his work to improve the climate-related functionality of the IDV through the development of customized plug-ins and gave an overview of PSD's interpreting climate conditions website. In one of a number of talks from the perspective of a data provider, Jerry Robaidek of the University of Wisconsin's Space Science and Engineering Center (SSEC) discussed the geosynchronous/polar-orbiting satellite data archive. Featuring data from 10 satellites, the SSEC archive has an online repository with data extending back to 1978 and totaling 685 terabytes (TB). The SSEC is a top-level provider of Geostationary Operational Environmental Satellite (GOES) data to the Unidata Internet Data Distribution network (IDD), both through an LDM feed and through the Abstract Data Distribution Environment (ADDE) data transfer protocol in both real-time and archival modes. Roland Viger from the USGS broached the question, "How do we get more eyes on the data?" Referring to what he called "GIS chauvinism," Viger discussed the emergence of software standards within the USGS, with a focus on the Environmental Systems Research Institute (ESRI) proprietary visualization tools [ArcGIS, spatial database engine (SDE), and so on]. The ESRI software supports NetCDF version 4 (which can also be used within the IPython notebook interface) and, through the Environmental Data Connector and its multidimensional toolbox, allows users to connect to an Open-Source Project for a Network Data Access Protocol (OPeNDAP) server or TDS and download data without leaving ArcGIS. Participants were also introduced to the USGS Geo Data Portal, which contains a variety of data resources including downscaled climate model forecasts.
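
As a generic illustration of the remote-access pattern these servers support (the URL and variable name below are placeholders, not endpoints from the talks), the netCDF4-python library can open an OPeNDAP or TDS dataset URL directly and pull only a small subset:

    # Minimal sketch: open a remote dataset over OPeNDAP and read a small subset.
    # The URL and variable name are placeholders; requires a netCDF library built
    # with DAP support.
    from netCDF4 import Dataset

    url = "http://example.thredds.server/thredds/dodsC/some/dataset.nc"  # placeholder
    ds = Dataset(url)  # an OPeNDAP URL opens much like a local file

    sst = ds.variables["sst"]          # assumed variable name
    subset = sst[0, 100:120, 200:240]  # first time step, small lat/lon window
    print(subset.shape, float(subset.mean()))
    ds.close()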

Climate data were also at the forefront of an interdisciplinary presentation by Olga Wilhelmi from the National Center for Atmospheric Research (NCAR) Geographic Information Systems (GIS) initiative, who discussed the usability of climate data in the context of integrating the Earth system and social sciences. To be usable, the data must meet the needs of decision makers ranging from natural resource managers to emergency preparedness personnel. Data (such as anomaly fields and other model output) can be accessed either through the NCAR GIS climate change portal or via a TDS.

In a change of pace, participants were treated to a space weather tutorial by Brent Gordon of the Space Weather Prediction Center (SWPC). Discussing two major solar events (the geomagnetic storms of 1859 and 1921), Gordon indicated that their impact on the modern power grid might result in prolonged power outages (on the order of years) and have a huge financial impact. The SWPC, which provides global space weather alerts and warnings, has over 25,000 subscribers to its product services, featuring real-time data feeds from GOES, Polar Operational Environmental Satellites (POES), and ground-based instruments as well as data access to the NASA research satellites Solar and Heliospheric Observatory (SOHO), Advanced Composition Explorer (ACE), and Solar Terrestrial Relations Observatory (STEREO). The SWPC is currently working to port its data to the AWIPS II environment, thereby extending access beyond operations to the research and education communities.

Taking a more industrial approach to data storage, Steve Worley of NCAR's Computational and Information Systems Laboratory (CISL) delivered an overview of NCAR's Research Data Archive (RDA)—a large [more than 200 TB online and 1.4 petabytes (PB) offline] and diverse database populated with observations (meteorological, oceanic, and satellite), analyses, reanalyses, and model output. The RDA supports ASCII-to-NetCDF conversion as well as spatiotemporal subsetting. Discussion of "Big Data" storage issues continued as Glen Rutledge of the National Climatic Data Center (NCDC) provided an overview of the NOAA Operational Model Archive and Distribution System (NOMADS). The NCDC holdings exceed 6,200 TB and are growing at a rate of nearly 800 TB per year, with an annual download rate of 1,650 TB. Rutledge described NCDC's data goals for NOMADS, which include providing access to NOAA's next generation of climate analysis products (e.g., CFSR and the Twentieth-Century Reanalysis). (Both NOAA and NCDC have developed the www.climate.gov website, which features a data and service portal along with a variety of climate literacy products.) In a huge undertaking, NCDC has rescued (i.e., digitized) approximately half of its paper holdings, totaling 15 TB. In addition to its data stewardship, NCDC maintains the National and Regional Climate Reference Networks and a paleoclimate (tree-ring data) network, and it is one of the agencies responsible for the U.S. Drought Monitor. Rutledge concluded with a series of "next steps," in which he addressed NCDC's short-term efforts related to data processing tools including downscaling, format conversion to NetCDF files that comply with the conventions for climate and forecast metadata (CF-compliant NetCDF), and online diagnostic engines such as CDAT.
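
To give a concrete sense of what CF-compliant NetCDF involves (a minimal sketch with made-up values; the CF conventions specify far more metadata than shown here), a small file can be written with netCDF4-python as follows:

    # Minimal sketch of writing a small CF-style NetCDF file (illustrative values only).
    import numpy as np
    from netCDF4 import Dataset

    ds = Dataset("example_cf.nc", "w")
    ds.Conventions = "CF-1.6"                       # declare the CF convention version

    ds.createDimension("time", None)
    ds.createDimension("lat", 3)
    ds.createDimension("lon", 4)

    time = ds.createVariable("time", "f8", ("time",))
    time.units = "hours since 2012-07-09 00:00:00"  # CF-style time coordinate
    time.calendar = "standard"

    lat = ds.createVariable("lat", "f4", ("lat",))
    lat.units = "degrees_north"
    lon = ds.createVariable("lon", "f4", ("lon",))
    lon.units = "degrees_east"

    tas = ds.createVariable("tas", "f4", ("time", "lat", "lon"))
    tas.standard_name = "air_temperature"           # CF standard name
    tas.units = "K"

    time[:] = [0.0]
    lat[:] = [25.0, 30.0, 35.0]
    lon[:] = [-100.0, -95.0, -90.0, -85.0]
    tas[0, :, :] = 290.0 + np.random.rand(3, 4)

    ds.close()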

Antonia Rosati and Seth McGinnis of NCAR described the downscaling and archival efforts of the North American Regional Climate Change Assessment Program (NARCCAP). With a focus on downscaling and high-impact analysis, the NARCCAP is composed of high-resolution regional climate models (RCMs) embedded within different global climate models. The model output (40–60 TB of RCM data in CF-compliant NetCDF) is available through NCAR's Earth System Grid (ESG) portal. The NARCCAP smart software ecosystem includes high-level climate analysis tools such as the Climate Data Operators (CDO), which allow a user to manipulate a NetCDF file to create composite files, such as a seasonal climatology from daily output, and a set of Python-based CDAT tools that facilitate access to and management of large gridded datasets.
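
As a small example of the kind of operation mentioned above (the file names are placeholders), the CDO operator yseasmean computes a multi-year seasonal climatology from a daily CF-compliant NetCDF file and can be driven from Python via subprocess:

    # Minimal sketch (placeholder file names): seasonal climatology from daily output
    # using the CDO command-line tools.
    import subprocess

    # yseasmean averages each season (DJF, MAM, JJA, SON) over all years in the input
    subprocess.run(
        ["cdo", "yseasmean", "rcm_daily_tas.nc", "rcm_seasonal_climatology.nc"],
        check=True,
    )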

As a user of large gridded datasets, Brian Etherton from NOAA–ESRL presented an overview of the Weather Research and Forecasting model (WRF), Advanced Research WRF (ARW), and the Local Analysis and Prediction System (LAPS), including installation, initialization, simulation, and verification. Using asynoptic observations from the 22 May 2008 Windsor, Colorado, tornado, Etherton demonstrated the impact of introducing local nonstandard data into the initialization; the example included a local data bundle that can be pulled from Unidata's RAMADDA server. Using the IDV to illustrate the impact of local data, Etherton then showed a difference field between the first-guess pressure vertical velocity from a 1-h Rapid Update Cycle (RUC) model forecast for a recent day (11 July 2012) and an analysis with supplemental mesonet data, reflecting the influence of the deep convection over the Texas–Louisiana coastal region. In a related talk, Russ Schumacher of Colorado State University (CSU) discussed the CSU real-time WRF ensemble that runs on an iMac. The combination of inexpensive computing resources and ubiquitous easy-to-access data has fueled university-based NWP. The CSU WRF ensemble, which is running at a resolution that can resolve large mesoscale systems, consists of five members with varying physics and initial/lateral boundary conditions from the Global Forecast System (GFS) model, North American Model (NAM), and the WRF variational data assimilation system (WRF–Var). Schumacher presented a case study of the 19–20 June 2012 Duluth, Minnesota, flash flood, comparing the various members and ensemble mean with NCEP's stage IV precipitation analyses. The ensemble output, visualized with the IDV, is regularly featured in CSU weather discussions.
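
A toy sketch of the kind of post-processing such an ensemble supports is given below; the member file names and the variable are assumptions, not CSU's actual configuration.

    # Toy sketch (assumed file and variable names): compute an ensemble-mean field
    # from five WRF member forecasts stored as NetCDF.
    import numpy as np
    from netCDF4 import Dataset

    members = [f"wrf_member{i}.nc" for i in range(1, 6)]  # hypothetical file names
    fields = []
    for path in members:
        with Dataset(path) as ds:
            fields.append(ds.variables["precip_1h"][:])   # assumed variable name

    ens_mean = np.mean(np.stack(fields), axis=0)
    print("ensemble-mean max precip:", float(ens_mean.max()))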

Blue Sky—The future.

A number of presentations addressed big-picture and wish-list related items. On the first day of the workshop, Cliff Jacobs of the National Science Foundation (NSF) discussed the NSF EarthCube initiative. Referring to Unidata as an exemplar, Jacobs championed a more collective or community approach to scientific and data infrastructure. Phrases such as "sea of data" and "transforming Earth science" as well as questions relating to cross-discipline data management formed the backdrop of his presentation. Speaking indirectly to the so-called data friction issue (Edwards 2010), which concerns how readily researchers can obtain and use data so that they can focus more on the science and less on data-related issues, Jacobs discussed overcoming the "business-as-usual" model for data integration and use, posing questions related to NSF's role as a facilitator in the process of redirecting the future of integrative science and data. Citing problems with current cyberinfrastructure and its failure to keep pace with modern science, Jacobs painted a broad vision of integrating data across the geosciences that would, in effect, change the way research is conducted and lead to greater productivity. In a later session, UPC director Mohan Ramamurthy presented Unidata's perspective on data management. Seeking to democratize data access, Unidata's primary philosophy is to "build it, test it, give it away (and support it)." Citing an EarthCube survey that indicated more than 50% of the respondents required data outside of their discipline, Ramamurthy emphasized the interconnected nature of geosciences data and the challenges this poses for Unidata. Referring to Unidata's vision of "geoscience at the speed of thought," Ramamurthy identified five principal data challenges: 1) volume (data explosion); 2) variety (different types); 3) velocity (speed of discovery, access, and analysis); 4) views (data use); and 5) virtual communities (global network community). Within this vision, Ramamurthy discussed GIS integration, the "long tail" (i.e., skewed) data sharing problem, cloud computing, and data/resource citation.

Inspired by the work of Lewis Fry Richardson, one of the pioneers of NWP, NSF Atmospheric and Geospace Sciences Division director Michael Morgan presented his vision of a university-based national NWP ensemble. After talking briefly about data assimilation, Morgan launched his vision of a vast network of WRF ensembles that might tackle currently intractable problems such as tropical cyclone genesis. Given both its ubiquity and modularity, WRF is a natural candidate for desktop ensembles.

In Alexander MacDonald's presentation, the ESRL director led off with a rhetorical question: might there be a few climate surprises in the pipeline? Using this as a springboard, MacDonald gave an overview of ESRL's modeling and visualization efforts. In terms of model development, ESRL has been moving in the direction of finite volume NWP, which lends itself to flux-form equations, mass conservation, and graphics processing unit (GPU) computing. The calculation of dot products is especially amenable to GPU-based computation and represents a possible future computing paradigm, according to MacDonald. He also described an up-and-coming visualization package, TerraViz. As a component of the NOAA environmental information services framework, the software is designed for fast access to Earth system data. Emphasizing the importance of dealing with big data, MacDonald pushed for the continued development of new technology.

Greg Mandt, director of the GOES-R series satellite program, reported on the progress of this next generation of geosynchronous satellites. The GOES-R, scheduled to begin operation in 2016, sports 16 channels, a scan rate of five minutes over the conterminous United States (CONUS), a mesoscale floater at a 30-s rate, and enhanced spatial resolution of 0.5 km for the visible and 1.0 km for the infrared portions of the electromagnetic spectrum. Unlike the current GOES, there will be no hyperspectral sounder, which will limit the vertical resolution of the temperature and moisture profiles. However, the new satellite will have a lightning detector, magnetometer, and space weather instruments including a solar ultraviolet (UV) imager, a solar debris detector, and extreme UV and X-ray irradiance sensors. The new platform features a host of baseline and future imager products, such as cloud-drift winds, aerosol optical depth, fog and volcanic ash detection, and much more. Data access will be available, in real time, on AWIPS II workstations and in near–real time via a data repository (with a 7-day archive) from the product distribution and access (PDA) of the Environmental Satellite Processing and Distribution Service (ESPDS) at NOAA's National Environmental Satellite, Data, and Information Service (NESDIS). Long-term storage will be hosted by the NOAA Comprehensive Large Array-Data Stewardship System (CLASS) server at NCDC.

Matt Mayernik, research data services specialist at the NCAR–University Corporation for Atmospheric Research (UCAR) library, motivated his presentation with a rhetorical question, asking about proper citation of data, software, and services available via the web. Given their unreliable nature, URLs have become passé as citations have evolved to include digital object identifiers (DOIs), which provide a more persistent locator for internet-based resources. In addition to DOIs, the lesser known but similar archival resource keys (ARKs) also resolve to a dataset no matter where it resides on the web. While the importance of recognizing the contributions of data providers, software developers, and support services in the scientific process is widely acknowledged, it has been problematic to establish a best-practices template for our community, and citing data sources is still not a common practice. Mayernik pointed out that the assignment of identifiers is not trivial, especially given the diversity and volume of resources. Although the issue remains unresolved, recommendations for best citation practices are being proposed by a variety of organizations. In a related talk, Ben Domenico of UPC discussed his idea of interactive scientific publishing, which describes a process that enables readers to access, analyze, display, and interpret the data used in a publication. The benefits are many, especially the promotion of open source data, which are not only documented but readily accessible to the entire community. Based in part on software that Unidata has been developing, such as IDV bundles and web-based Java-oriented tools, the publications and modules would be fully dynamic in terms of their data content, including access to the sites where the data are staged. A sea surface temperature example was given in which a reader can dynamically change the coverage area, examine different times, and so on. Domenico advocated for a new architecture with a brokering layer between client and server that can be used to harvest and serve metadata for catalog and discovery systems as well as data access and data processing services.

Poster session.

A student poster session was first introduced to the Unidata Users Workshop in 2009. This year's session featured 11 posters on a wide variety of subjects. There were a number of climate-related posters, including a learning-tool approach using IDV scripting, southwestern U.S. drought, a fire impact case study, a geographical look at aerosol optical depth, and two on soil properties: one dealing with the impact of climate change on the desert tortoise and a second with an agricultural theme concerning grain sorghum. There were several modeling posters that detailed mesoscale modeling, ensembles, nearcasts, and data assimilation. Rounding out the posters was a methodology for estimating surface roughness via land use data. The posters served to underscore the workshop theme—especially the application of Earth system science data to both research and education.

BENEFITS AND OUTCOMES.

For the first time at its triennial workshop, Unidata set up a real-time online evaluation system—allowing participants to provide immediate post-session feedback. The “on the fly” survey was intended to provide a different, more spontaneous perspective compared to a post-mortem survey. Participants were encouraged to provide feedback on each individual workshop session, with provisions for comments on more administrative matters such as the facility (location, comfort, and amenities), technology (e.g., audio/visual and networking), and ideas for future workshops. In addition to receiving feedback relevant to the next triennial workshop, this created an opportunity to respond, in situ, to participant comments—a useful modus operandi for an interactive workshop! For example, one commenter suggested that Unidata add a half day to the front end of the workshop so that participants could work with the Unidata staff to assist with workshop-related software installation. Anonymous survey results can be found online (at www.unidata.ucar.edu/community/surveys/workshop2012/survey_answers.html).

ACKNOWLEDGMENTS

We wish to acknowledge NSF Award 1227949 for providing the support for this workshop as well as the funding for the stipends so graduate students would be able to participate in the workshop. We also would like to recognize the tremendous effort of the Unidata Program Center staff leading up to and including the workshop, both of which were essential in making the entire event a success. In particular, we thank Douglas Dirks, Linda Miller, Tina Campbell, Ginger Emery, and Sean Arms. On behalf of Unidata, we offer our sincerest appreciation to the expert presenters who contributed their time, ideas, and tools to the workshop. Their presentations and tools are available to the larger community through the Unidata RAMADDA server.

APPENDIX A: PREVIOUS UNIDATA USERS WORKSHOPS.

Since 1988, eight previous Unidata Users Workshops have been held on topics specific to classroom instruction:

  1. Synoptic meteorology instruction (Huffman et al. 1989);

  2. Synoptic/mesoscale instruction (Wash et al. 1992);

  3. Mesoscale meteorology instruction in the age of the modernized National Weather Service (Ramamurthy et al. 1995);

  4. Faculty workshop on using instructional technologies and satellite data for college-level education in the atmospheric and Earth sciences (Wetzel et al. 1998);

  5. Shaping the future: Unidata users as leaders (Fulker et al. 2002);

  6. Expanding horizons: Using environmental data and model output for education, prediction, and decision making (Kruger et al. 2005);

  7. Expanding the use of models in the atmospheric and related sciences (Orf et al. 2007); and

  8. Using operational and experimental observations in geoscience education (Etherton et al. 2011).

In addition to providing a forum to enhance teaching in the atmospheric and related sciences, the triennial workshops have been an important venue for the community to share ideas and course materials and engage in-depth discussion on ways to improve student learning.

APPENDIX B: WORKSHOP AND UNIDATA HOMEPAGES

SOFTWARE DEMONSTRATIONS

DATA ACCESS AND APPLICATIONS

BLUE SKY—THE FUTURE

REFERENCES

  • Edwards, P. N., 2010: A Vast Machine: Computer Models, Climate Data, and the Politics of Global Warming. MIT Press, 518 pp.

  • Etherton, B. J., S. C. Arms, L. D. Oolman, G. M. Lackmann, and M. K. Ramamurthy, 2011: Using operational and experimental observations in geoscience education. Bull. Amer. Meteor. Soc., 92, 477–480.

  • Fulker, D. W., C. A. Jacobs, A. A. Rockwood, and D. N. Yarger, 2002: Infrastructure for ideas: Unidata as a catalyst for change in geoscience education and research. Bull. Amer. Meteor. Soc., 83, 25–27.

  • Huffman, G. J., R. Gall, and C. H. Wash, 1989: Results of the synoptic meteorology instruction workshop. Preprints, Fifth Int. Conf. on Interactive Information and Processing Systems for Meteorology, Oceanography, and Hydrology, Anaheim, CA, Amer. Meteor. Soc.

  • Kruger, A., M. Laufersweiler, and M. C. Morgan, 2005: Expanding horizons. Bull. Amer. Meteor. Soc., 86, 167–168.

  • Orf, L., G. Lackmann, C. Herbster, A. Krueger, E. Cutrim, T. Whitaker, J. Steenburgh, and M. Voss, 2007: Models as educational tools. Bull. Amer. Meteor. Soc., 88, 1101–1104.

  • Ramamurthy, M. K., and Coauthors, 1995: Teaching mesoscale meteorology in the age of the modernized National Weather Service: A report on the Unidata/COMET workshop. Bull. Amer. Meteor. Soc., 76, 2463–2473.

  • Wash, C. H., R. L. DeSouza, M. Ramamurthy, A. Anderson, G. Byrd, J. Justus, H. Edmon, and P. Samson, 1992: Teaching interactive computer systems: A report on the Unidata/COMET/STORM workshop on synoptic/mesoscale instruction. Bull. Amer. Meteor. Soc., 73, 1440–1447.

  • Wetzel, M., and Coauthors, 1998: Faculty workshop on using instructional technologies and satellite data for college-level education in the atmospheric and Earth sciences. Bull. Amer. Meteor. Soc., 79, 2153–2160.


1 Unidata's mission is “to transform the geosciences community, research, and education by providing innovative data services and tools.”

2 A list of previous summer workshops is contained in appendix A.
