CONFIDENCE IN CLIMATE SERVICES—PRESENTING UNCERTAINTY WITH CONFIDENCE
What: Twenty-five participants from 10 European Union FP7 and H2020 projects (CLIPC, EUCLEIA, EUPORIAS, FIDUCEO, GAIA-CLIM, IMPACT2C, IMPRESSIONS, QA4ECV, SPECS), the European Space Agency SST CCI project, and two European institutions (C3S, EEA) met to share information about uncertainty in climate science and to discuss how to contribute to establishing confidence in the role of uncertainty in climate services.
When: 15–17 February 2016
Where: Climate Service Center Germany (GERICS), Hamburg, Germany
Enhancing trust in climate services is a fundamental challenge for providers. It is complicated by the question of how best to communicate uncertainty to different sectors, which handle information in different ways depending on their decision-making frameworks. To address this problem, a workshop was held, for the first time, to engage with and understand the different perspectives of European research projects, institutions, and climate service providers.
The workshop targeted European-funded projects (FP7 and H2020; see the appendix for a list of key acronyms and abbreviations used in this summary) that specifically relate to the delivery and/or support of climate data, in particular providers of observational and modeled climate data (e.g., FIDUCEO, SPECS), of climate impact data (e.g., IMPACT2C), and of service delivery (e.g., EUPORIAS). The assessment and communication of uncertainty is critical in developing confidence in climate services. The delegates presented the strategies used in their projects or institutions, followed by in-depth discussions in six breakout groups.
ASSESSING UNCERTAINTY.
One step toward building confidence in the role of uncertainty is to reflect on how uncertainty can be assessed. Previous workshops have focused separately on observational (Matthews et al. 2013) and modeling (Qian et al. 2016) approaches; however, it was felt that by considering them together it might be possible to identify common challenges and opportunities. To facilitate this, three breakout groups discussed the following topics: methods for the quantification of uncertainty, whether temporal and spatial scales matter for quantifying uncertainty, and how to categorize uncertainty.
Methods group.
An overview of methods for assessing uncertainty revealed three frameworks applied by the observation and modeling communities: verification and validation through comparison with a trusted standard reference, evaluation by testing the usefulness of a product to the user, and expert judgment. Within these frameworks there are several methods for quantifying, describing, and propagating uncertainty. As each method has its advantages and limitations, applying a variety of methods was suggested as a way to engender confidence, although the discussion revealed that this is not yet common practice. Climate services could benefit from closer cooperation between the observational and modeling communities. It was suggested that something analogous to the metrological traceability chain documenting the processing steps taken to produce remote sensing datasets (i.e., by QA4ECV) could be attractive for climate service products, ensuring that no uncertainty information is lost in the chain while it is tailored to subsequent users' needs.
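As a concrete illustration of such a traceability chain, the following minimal Python sketch records each processing step together with the uncertainty information it carries forward, so that a user can walk the chain back to the raw input. All class names, fields, and example values are hypothetical and not taken from QA4ECV or any other project mentioned above.

```python
# Hypothetical sketch of a traceability record for a climate service product:
# each step links back to its input and documents how uncertainty was handled.
from dataclasses import dataclass
from typing import List, Optional


@dataclass
class ProcessingStep:
    name: str                        # e.g., "gridding", "bias adjustment"
    method: str                      # short description or pointer to documentation
    uncertainty_note: str            # how uncertainty was quantified or propagated here
    parent: Optional["ProcessingStep"] = None  # link to the previous step in the chain


def trace(step: ProcessingStep) -> List[str]:
    """Walk the chain back to the raw input, so users who elect to follow
    the links can recover every documented step."""
    chain = []
    while step is not None:
        chain.append(f"{step.name}: {step.method} ({step.uncertainty_note})")
        step = step.parent
    return list(reversed(chain))


# Illustrative use with invented steps and values
raw = ProcessingStep("satellite retrieval", "level-2 SST retrieval",
                     "per-pixel uncertainty estimate attached")
gridded = ProcessingStep("gridding", "daily area-weighted average",
                         "random effects reduced by averaging; systematic effects retained",
                         parent=raw)
print("\n".join(trace(gridded)))
```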
Scale group.
Depending on the temporal and spatial scale of the study, different sources of uncertainty dominate; for example, random effects might be averaged out at longer temporal and larger spatial scales, but systematic effects, such as imperfect instrument calibration, will persist. Appropriate methods for propagating observational uncertainty estimates when averaging or accumulating a variable are urgently required. Some projects are attempting to address this problem (e.g., GAIA-CLIM, FIDUCEO, and SST CCI), but more work is needed. As with the methods group above, any solutions intended to build more trust require better interaction between modelers and observational teams. This includes identifying clear specifications of user requirements on different temporal and spatial scales (Fig. 1) in terms of observational uncertainty estimates and applying existing practices to as many observational datasets as possible.
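The distinction between random and systematic effects can be made concrete with a minimal sketch of standard uncertainty propagation for an average, using purely hypothetical values: uncorrelated errors shrink roughly as 1/sqrt(N), while a fully correlated calibration error does not reduce with averaging.

```python
# Minimal sketch (hypothetical values) of propagating uncertainty into a mean:
# random (independent) effects average down, systematic (shared) effects persist.
import numpy as np

n = 365                  # e.g., averaging daily values into an annual mean
u_random = 0.5           # per-value uncertainty from random effects (K), assumed
u_systematic = 0.2       # per-value uncertainty from a shared calibration error (K), assumed

# Standard uncertainty of the mean under the two limiting correlation assumptions
u_mean_random = u_random / np.sqrt(n)   # independent errors
u_mean_systematic = u_systematic        # perfectly correlated errors

# Combined, assuming the two components are independent of each other
u_mean = np.hypot(u_mean_random, u_mean_systematic)
print(f"random: {u_mean_random:.3f} K, systematic: {u_mean_systematic:.3f} K, "
      f"combined: {u_mean:.3f} K")
```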
Simulations are not exempt from problems of uncertainty estimation either, as there is no general agreement on what constitutes an adequate uncertainty estimate. Earth system models are becoming more sophisticated and extensive through the addition of new components and processes. While these advances reflect an increase in knowledge and therefore reduced levels of uncertainty, they do not directly lead to a quantifiable estimate of uncertainty.
Category group.
To disentangle the multiple layers of uncertainty, a mapping exercise was conducted to identify examples across four categories of uncertainty made popular by former U.S. Secretary of Defense Donald Rumsfeld: known knowns, known unknowns, unknown knowns, and unknown unknowns. It is a known known that a certain fraction of the spread of climate projections is irreducible owing to internal variability in the climate system. The emissions scenarios used in climate projections, by contrast, depend on future policy decisions and can therefore be considered known unknowns. Unknown knowns are areas of uncertainty that we can explain or model but whose importance to users we do not recognize. This category could be seen as service providers not fully understanding users’ needs, with the potential to be resolved through dialogue between the different parties. Unknown unknowns reflect areas of uncertainty that may be important to climate change but that have not yet been identified and can therefore only be speculated about. Hindsight has revealed examples, though, such as the depletion of stratospheric ozone as a result of anthropogenic pollutants. From these discussions it was possible to see that there are known components of uncertainty that can be used to outline the knowledge gaps.
COMMUNICATING UNCERTAINTY.
The workshop determined that the communication of uncertainty is critical in developing confidence in climate services and this was explored by addressing three questions: How best to engage with users? What are users’ communication preferences? What role does vocabulary play in understanding uncertainty?
User engagement group.
The importance of user engagement in building trust is widely acknowledged, but rather than a need for more engagement per se, there is an identifiable need for more targeted and efficient forms of engagement. The group discussed a range of successful strategies they had experienced, such as developing ongoing user engagement that creates close working relationships and allows for the efficient management of users’ input. Creating dedicated user engagement programs independent of any one project could support this, along with ensuring consistency of relationships and availability of responses (e.g., in shared databases). Responsive forms of user-led engagement (e.g., online FAQs) have also proved successful in improving usage and allowing the codevelopment of novel approaches. The incidental availability of broad statistics describing the kinds of users (e.g., geographical location, professional affiliation) engaging with available products can also be helpful.
User preferences group.
Using a mapping exercise inspired by Dowell et al. (2013), the chain of providers and users lying between climate data and climate service provision was explored (Fig. 1). While not exhaustive, this exercise highlighted i) multiple points at which uncertainty must be summarized and communicated, ii) that communication between the various “links” need not be unidirectional, and iii) that in the chain of providers and users, end-user preferences are not the only ones that must be considered. Communication challenges across the chain predominantly fell into two interlinked categories of “traceability” and “tailoring.” Traceability was seen as the need to maintain clarity about sources of uncertainty from observation to end user. The chain should not become an avalanche, cascading an unmanageable and unusable compendium of uncertainty details onto an overwhelmed user, but it should provide the links back to all the information for those who elect to follow them. While information about uncertainty may need to be condensed, a traceable chain of documentation is needed to provide full transparency. Tailoring encourages the climate service provider to recognize the differing information requirements of users at different points along “the chain,” as well as end users’ diverse needs. The importance of appropriately tailoring uncertainty information was stressed, as was a need for greater bidirectional communication between providers and subsequent users.
Language group.
It is particularly important to convey uncertainty information to decision-makers at different levels in understandable “language.” Two examples of well-proven practices in communicating confidence were identified: i) for a scientific audience, the Intergovernmental Panel on Climate Change definition of confidence as an amalgamation of level of evidence and degree of agreement (Mastrandrea et al. 2011), and ii) for a broader nonscientific audience, the use of serious gaming to help local policy-makers understand climate hazards and risks (Suarez and Bachofen 2013). As the user preferences group also found, it is essential to maintain and improve interactive communication between service providers and decision-makers (Fig. 1). Where appropriate, providing training on presenting climate and impact information together with the necessary uncertainty information was seen as decisive. This can be strengthened by climate services with a focus on traceability and the development of targeted guidance. The distribution of information through the translation of, for example, policy briefs into different languages needs to be done carefully, as any lack of clarity in the initial description of uncertainty is liable to be amplified in translation.
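To make the evidence-and-agreement idea tangible, the sketch below shows one illustrative way to combine the two dimensions into a qualitative confidence statement. The mapping used here is a simplification for illustration only; the actual guidance (Mastrandrea et al. 2011) deliberately leaves the combination to expert judgment rather than fixing a single lookup.

```python
# Illustrative (not official) sketch of IPCC-style confidence language,
# combining level of evidence with degree of agreement.
EVIDENCE = ["limited", "medium", "robust"]
AGREEMENT = ["low", "medium", "high"]
CONFIDENCE = ["very low", "low", "medium", "high", "very high"]


def confidence(evidence: str, agreement: str) -> str:
    """Map evidence and agreement onto a qualitative confidence level.
    The diagonal scoring is an assumption made for this sketch."""
    score = EVIDENCE.index(evidence) + AGREEMENT.index(agreement)
    return CONFIDENCE[score]


print(confidence("robust", "high"))     # -> "very high"
print(confidence("limited", "medium"))  # -> "low"
```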
BARRIERS AND POTENTIAL SOLUTIONS.
A large part of the discussions centered on the barriers in building confidence in climate services and, where possible, their potential solutions. The ones noted here are far from being exhaustive but represent the key barriers and solutions highlighted at the workshop.
Barrier: Uncertainty is often seen as a barrier to action. Solution: The framing and integration of user needs at early stages of data product design is essential. This both avoids unrealistic expectations among users and adds knowledge about which sources of uncertainty are most relevant.
Barrier: Each community has its own methods for treating uncertainty. Solution: Continued collaboration between communities in their roles as users and providers (Fig. 1) sharing information and learning from each other was recognized as a key for developing best practices.
Barrier: Presenting uncertainty in a clear, user-focused manner is a challenge. Solution: Lessons can be learned from other sectors (e.g., finance or insurance) on how to communicate uncertainty to users, though care needs to be taken when applying their strategies in a new context.
LESSONS LEARNED FOR BEST PRACTICES.
During the workshop, three core lessons emerged from the group discussions that could be considered for best practice:
Transparency: The need to maintain traceability about sources of uncertainty was emphasized across all groups. While information about uncertainty may need to be condensed when it is communicated from provider to subsequent users, a traceable chain of documentation is necessary for full transparency. This assumes documentation of all processing steps (Fig. 1).
Layering: A layered approach allows tailoring the amount of information on uncertainty under different decision frameworks. This can only be achieved by bidirectional communication between providers and users, to ensure that the user’s needs are understood and that appropriate and accurate information is provided and appropriately interpreted (Fig. 1).
Disclosure: A tailored approach is not meant to hide uncertainty but rather aims to detect and document all known components of uncertainty, including knowledge gaps and issues relating to the methodology and processing of data. When communicating uncertainty, it is important to emphasize what we understand and to recognize that, as research improves knowledge, some uncertainty sources may be reduced.
FUTURE CHALLENGES.
During the workshop two main challenges in the role of uncertainty for climate services were identified:
Validation of communication: The discussion of how to communicate uncertainty often centers on how to convey information from providers to users. However, there is a great need for climate services to develop methods for testing the efficacy of communication strategies, to ensure that appropriate and accurate uncertainty information is provided and that it is interpreted correctly.
Guidance: There is a clear need for guidance and standards on methods of uncertainty assessment and communication; these do not yet exist for climate services. This workshop was the first of its kind, and similar workshops, preferably held together with users, can serve as a good basis for sharing information between communities and collecting lessons learned that could be turned into best practices and, in turn, developed into climate service standards.
ACKNOWLEDGMENTS
The three-day workshop was funded and initiated by the FP7 project CLIPC (Grant 607418) and organized jointly with the FP7 projects EUPORIAS, EUCLEIA, and QA4ECV and hosted by GERICS in Hamburg, Germany.
APPENDIX: KEY ACRONYMS AND ABBREVIATIONS USED IN THIS PAPER
C3S: Copernicus Climate Change Service
CCI: Climate Change Initiative
CLIPC: Climate Information Platform for Copernicus
EEA: European Environment Agency
EUCLEIA: European Climate and Weather Events: Interpretation and Attribution
EUPORIAS: European Provision of Regional Impact Assessment on a Seasonal-to-Decadal Timescale
FIDUCEO: Fidelity and Uncertainty in Climate Data Records from Earth Observations
FP7: European Union Seventh Framework Programme for Research
GAIA-CLIM: Gap Analysis for Integrated Atmospheric Essential Climate Variable (ECV) Climate Monitoring
H2020: Horizon 2020 Research and Innovation Programme
IMPACT2C: Quantifying Projected Impacts under 2°C Warming
IMPRESSIONS: Impacts and Risks from High-End Scenarios: Strategies for Innovative Solutions
QA4ECV: Quality Assurance for Essential Climate Variables
SPECS: Seasonal-to-Decadal Climate Prediction for the Improvement of European Climate Services
SST: Sea surface temperature
REFERENCES
Dowell, M., and Coauthors, 2013: Strategy towards an architecture for climate monitoring from space. Committee on Earth Observation Satellites–Coordination Group for Meteorological Satellites–World Meteorological Organization, 39 pp. [Available online at www.wmo.int/pages/prog/sat/documents/ARCH_strategy-climate-architecture-space.pdf.]
Mastrandrea, M. D., K. J. Mach, G.-K. Plattner, O. Edenhofer, T. F. Stocker, C. B. Field, K. L. Ebi, and P. R. Matschoss, 2011: The IPCC AR5 guidance note on consistent treatment of uncertainties: A common approach across the working groups. Climatic Change, 108, 675–691, doi:10.1007/s10584-011-0178-6.
Matthews, J. L., E. Mannshardt, and P. Gremaud, 2013: Uncertainty quantification for climate observations. Bull. Amer. Meteor. Soc., 94, ES21–ES25, doi:10.1175/BAMS-D-12-00042.1.
Qian, Y., and Coauthors, 2016: Uncertainty quantification in climate modeling and projection. Bull. Amer. Meteor. Soc., 97, 821–824, doi:10.1175/BAMS-D-15-00297.1.
Suarez, P., and C. Bachofen, 2013: Using games to experience climate risk: Empowering Africa’s decision makers. CDKN Action Lab Innovation Grant Final Rep., Climate and Development Knowledge Network, 27 pp. [Available online at www.climatecentre.org/downloads/File/Games/CDKNGamesReport.pdf.]