A Call for the Evaluation of Web-Based Climate Data and Analysis Tools

Kristin VanderMolen, Division of Atmospheric Sciences, Desert Research Institute, Reno, Nevada
Tamara U. Wall, Division of Atmospheric Sciences, Desert Research Institute, Reno, Nevada
Britta Daudert, Division of Hydrologic Sciences, Desert Research Institute, Reno, Nevada

Abstract

Researchers are producing an ever greater number of web-based climate data and analysis tools in support of natural resource research and management. Yet the apparent absence or underreporting of evaluation in the development of these applications has raised questions as to whether, by whom, and for what they are used and, relatedly, whether they fulfill the rationale for their development. This paper joins recent efforts to address these questions by introducing one approach to evaluation (developmental evaluation) and reporting on its use in the evaluation of the Southwest Climate and Environmental Information Collaborative (SCENIC). A web interface under development at the Western Regional Climate Center, SCENIC provides environmental scientists with access to climate data and analysis tools in support of natural resource research and management in the southwestern United States. Evaluation findings reveal subtleties in the improvements necessary to ensure a useful and usable application, subtleties that could not have been ascertained in the absence of end-user feedback. We therefore urge researchers to systematically evaluate web-based climate data and analysis tools in the interest of ensuring their usefulness, usability, and fulfillment of the proposed rationale. In so doing, we recommend that researchers test and apply established evaluation frameworks, thereby engaging end users directly in the process of application development.

© 2019 American Meteorological Society. For information regarding reuse of this content and general copyright information, consult the AMS Copyright Policy (www.ametsoc.org/PUBSReuseLicenses).

CORRESPONDING AUTHOR: Kristin VanderMolen, kristin.vandermolen@dri.edu
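The closing recommendation to test and apply established evaluation frameworks can be made concrete with the System Usability Scale (SUS; Brooke 1996), a standard usability instrument among the works this paper cites. Below is a minimal sketch of conventional SUS scoring in Python; the instrument choice and the responses shown are illustrative assumptions, not the authors' method or data from the SCENIC evaluation.

```python
# Minimal sketch of conventional SUS scoring (Brooke 1996); illustrative only.
# The responses used below are hypothetical, not SCENIC evaluation data.

def sus_score(responses):
    """Compute a 0-100 SUS score from ten Likert responses (1-5).

    Odd-numbered items (positively worded) contribute (response - 1);
    even-numbered items (negatively worded) contribute (5 - response).
    The summed contributions are scaled by 2.5 onto a 0-100 range.
    """
    if len(responses) != 10 or any(r not in (1, 2, 3, 4, 5) for r in responses):
        raise ValueError("SUS requires ten responses on a 1-5 scale")
    contributions = [
        (r - 1) if i % 2 == 0 else (5 - r)  # index 0 is item 1, an odd-numbered item
        for i, r in enumerate(responses)
    ]
    return 2.5 * sum(contributions)

# One hypothetical respondent: agreement with positive items, disagreement with negative ones.
print(sus_score([5, 1, 4, 2, 5, 1, 4, 2, 5, 1]))  # -> 90.0
```

Bangor et al. (2009), also cited in this paper, provide adjective anchors for interpreting scores on this 0-100 range.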
