Climate Assessments for Local Action

Christine J. Kirchhoff, University of Connecticut, Storrs, Connecticut; Joseph J. Barsugli, Cooperative Institute for Research in Environmental Sciences, University of Colorado Boulder, and NOAA/ESRL/PSD, Boulder, Colorado; Gillian L. Galford, University of Vermont, Burlington, Vermont; Ambarish V. Karmalkar, Northeast Climate Adaptation Science Center, and University of Massachusetts Amherst, Amherst, Massachusetts; Kelly Lombardo, University of Connecticut, Groton, Connecticut; Scott R. Stephenson, University of Connecticut, Storrs, Connecticut; Mathew Barlow, University of Massachusetts Lowell, Lowell, Massachusetts; Anji Seth, University of Connecticut, Storrs, Connecticut; Guiling Wang, University of Connecticut, Storrs, Connecticut; and Austin Frank, University of Connecticut, Storrs, Connecticut


Abstract

Global and national climate assessments are comprehensive, authoritative sources of information about observed and projected climate changes and their impacts on society. These assessments follow well-known, accepted procedures to create credible, legitimate, salient sources of information for policy- and decision-making, build capacity for action, and educate the public. While there is a great deal of research on assessments at global and national scales, there is little research or guidance for assessment at the U.S. state scale. To address the need for guidance for state climate assessments (SCAs), the authors combined insights from the literature, firsthand experience with four SCAs, and interviews with individuals involved in 10 other SCAs to identify challenges, draw lessons, and point out future research needs to guide SCAs. SCAs are challenged by sparseness of literature and data, insufficient support for ongoing assessment, short time lines, limited funding, and surprisingly, little deliberate effort to address legitimacy as a concern. Lessons learned suggest SCAs should consider credibility, legitimacy, and salience as core criteria; happen at regular intervals; identify assessment scope, resource allocation, and trade-offs between generation of new knowledge, engagement, and communication up front; and leverage boundary organizations. Future research should build on ongoing efforts to advance assessments, examine the effectiveness of different SCA approaches, and seek to inform both broad and specific guidance for SCAs.

© 2019 American Meteorological Society. For information regarding reuse of this content and general copyright information, consult the AMS Copyright Policy (www.ametsoc.org/PUBSReuseLicenses).

CORRESPONDING AUTHOR: Christine J. Kirchhoff, christine.kirchhoff@uconn.edu

A supplement to this article is available online (10.1175/BAMS-D-18-0138.2)


While climate change is a global phenomenon, the associated impacts, such as heat waves, droughts, wildfires, and storms, are devastating to local communities [U.S. Global Change Research Program (USGCRP) 2017]. The frequency and severity of these events have renewed interest in better understanding local impacts and adaptation options (Bierbaum et al. 2013). As states’ and communities’ interest in climate change impacts and adaptation grows, so does their need for usable climate information.

Global and national climate assessments (CAs), like those undertaken under the auspices of the Intergovernmental Panel on Climate Change (IPCC) and the USGCRP, are comprehensive, authoritative sources of information about observed and projected climate changes and their impacts on society, built on the synthesis and evaluation of scientific studies by hundreds or thousands of recognized experts (IPCC 2014; USGCRP 2017). Assessments aim to produce credible, legitimate, policy-relevant, scientific information to inform policy and decisions (Farrell and Jäger 2006; Jacobs and Buizer 2016; Mach and Field 2017; Mitchell et al. 2006). Unfortunately, there remains a disconnect between the information in these global and national CAs and what states and communities need to inform local decision-making (see Fig. 1).

Fig. 1. Scale, purpose, and audience of global, national, and state CAs.

To address this disconnect, more and more states are undertaking CAs (see Table 1; for more information, see supplemental Table ES1), but with little guidance for how to organize and conduct comprehensive, authoritative, and usable assessments at this scale. While 40 years of research on global and national assessments provides a starting point to guide state climate assessment (SCA) efforts to produce credible, legitimate, and policy-relevant information, there are unique aspects of state assessments that fall outside the range of techniques used to develop assessments at larger scales. To address the need for comprehensive SCA guidance, the authors convened a workshop (November 2017) at which participants discussed lessons learned both from the literature on global and national CAs and from their experiences conducting four SCAs: Connecticut, Vermont, Massachusetts, and Colorado. In 2018, the authors conducted interviews to learn from the experiences of 10 additional U.S. SCAs. Drawing on the literature and the firsthand experience of the authors and interview participants, we sought to understand 1) how global and national CAs produce credible, legitimate, and salient scientific information to help guide SCAs; 2) similarities and differences between assessments at state and other scales; 3) unique challenges faced by SCAs; and 4) lessons learned from existing SCAs that may help guide future assessments. In this In Box article, we share highlights from this emerging area of research on SCAs.

Table 1. States with SCAs and the year of their most recent SCA. Interviewees included representatives from all SCAs listed except Maryland and Delaware.

LEARNING FROM GLOBAL AND NATIONAL ASSESSMENTS.

Years of research on global and national CAs offer a number of important lessons. First, research on global assessments (Farrell and Jäger 2006) and the U.S. National CA (NCA) (Mitchell et al. 2006) suggests that involving recognized experts enhances assessment credibility, as does using accepted data, methods and tools, numerical models, and scientific peer review. The credibility of CAs can be undermined when certain types of expertise are excluded, when it is perceived that alarming and controversial findings are excluded, and when political forces try to delegitimize or discredit the assessment (Morgan et al. 2005; Vardy et al. 2017). Second, being transparent, ensuring fair participation, and engaging with policy- and decision-makers and the general public enhance legitimacy, resulting in buy-in and support for CAs (Farrell and Jäger 2006; Mitchell et al. 2006; Mach and Field 2017; Vardy et al. 2017). For example, consistent engagement with national government representatives during the IPCC Fourth Assessment Report (AR4) enhanced legitimacy and created support for the assessment (Vardy et al. 2017), while opportunities for both public and stakeholder input enhanced the legitimacy of the third U.S. NCA (Cloyd et al. 2016). Finally, salience concerns the relevance of assessment information to decision-makers or the public (Cash et al. 2003). Scholars of global and national assessments suggest that achieving salience requires engaging policy- and decision-makers in the production of assessment information and using effective and varied communication strategies (Buizer et al. 2016; Farrell and Jäger 2006; Mach and Field 2017; Mitchell et al. 2006; Pearce et al. 2018).

LEARNING FROM STATE CLIMATE ASSESSMENTS: WORKSHOP AND INTERVIEWS.

The authors convened “Methodologies and Engagement for State Level Climate Assessment” on 6 November 2017 at the University of Connecticut, Storrs, campus. Approximately 25 individuals, including the authors and other faculty, staff, and students from the University of Connecticut, participated in the event. The open portion of the event included five presentations and a panel discussion designed to share first-person lessons learned from conducting three SCAs (Vermont, Massachusetts, and Colorado) and one municipal climate assessment (Boston), as well as a synthesis of lessons learned from 40 years of conducting national and international CAs. The closed afternoon session focused on synthesizing lessons learned from SCAs. Presentation slides and notes from the workshop helped inform development of the interview protocol.

We sought and obtained Institutional Review Board approval for our qualitative research (protocol X18-093), which involved interviews with one to two individuals from each of 14 SCAs, totaling 17 interviewees (Table 1). Interviewees were identified using 1) the participant list from the National Academies Making Climate Assessments Work workshop held in August 2018 [National Academy of Sciences (NAS); NAS 2019], 2) coauthor networks, and 3) snowball sampling (from interviewees) and internet searches for SCAs. Representatives from all 14 SCAs we contacted agreed to an interview, and all interviewees led or played major roles in their SCA. Interviews were conducted by phone from October to December 2018, and each lasted between 43 and 109 min. Interviewees were asked questions about 1) the organization and conduct of the SCA (i.e., motivation, scope, funding, climate data and analysis, review process, stakeholder engagement, outreach, and use of SCA products); 2) stakeholder perceptions of SCA salience, credibility, and legitimacy; and 3) challenges and lessons learned from doing the SCA, building on themes from the literature (Cash et al. 2003; Farrell and Jäger 2006; Mitchell et al. 2006; Vardy et al. 2017). Interviews were transcribed and coded in NVivo 11 (QSR International) using both inductive and deductive qualitative methodologies (Creswell 2007; Galletta 2013; Saldaña 2016). Because anonymity was guaranteed, interviewees are referred to by code only. See the supplemental material for the interview protocol and additional details on research methods.
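To give a concrete sense of the deductive side of this coding, the sketch below is a toy illustration in Python, not the NVivo workflow used in the study; the theme names follow the literature cited above, but the keywords and the helper code_passage are hypothetical.

    import re

    # Hypothetical codebook: deductive themes drawn from the cited
    # literature, each with purely illustrative keywords.
    CODEBOOK = {
        "credibility": ["peer review", "expert", "accepted data"],
        "legitimacy": ["fair", "transparent", "participation"],
        "salience": ["relevant", "decision", "stakeholder"],
    }

    def code_passage(passage):
        """Return the themes whose keywords appear in a passage."""
        text = passage.lower()
        return [theme for theme, keywords in CODEBOOK.items()
                if any(re.search(r"\b" + re.escape(kw), text)
                       for kw in keywords)]

    # Example: the interviewee quote that appears later in this article.
    print(code_passage("It was local experts with local expertise."))
    # -> ['credibility']

In practice, inductive codes that emerge from the transcripts would be added to such a codebook iteratively, which is the hybrid inductive–deductive approach the methodologies cited above describe.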

SIMILARITIES AND DIFFERENCES IN ENSURING CREDIBLE, LEGITIMATE, AND SALIENT ASSESSMENTS.

Our own experience and the experience of our interviewees suggest that SCAs build credibility in ways similar to national and global assessments (see Fig. 2). Credible SCAs rely both on recognized experts to perform the assessment using accepted data and methods, and on scientific peer review of the assessment products (13 of 14 state assessments employed peer review). Where SCAs differ, they often require the production of new knowledge to compensate for the scarcity of available scientific evidence at the state scale. While SCAs typically produce state-scale knowledge using existing data, such as long-term, quality-controlled weather station data to analyze historical climate trends, or statistically [e.g., bias correction and spatial disaggregation (BCSD); Localized Constructed Analogs (LOCA)] and dynamically (e.g., NARCCAP; Mearns et al. 2013) downscaled models for making climate projections, there is little guidance for how to appropriately apply this information at the state scale. For example, in Connecticut, assessment authors sought to use the LOCA (Pierce et al. 2014) database for consistency with the NCA, but verification against local climate observations showed that the Multivariate Adaptive Constructed Analogs (MACA; Abatzoglou and Brown 2012) database better matched historical climate observations, especially for extreme precipitation statistics (Seth et al. 2019).
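The sketch below gives a minimal, hypothetical version of this kind of verification in Python: comparing annual-maximum and upper-tail daily precipitation between station observations and a downscaled historical run at the station location. It is not the workflow of Seth et al. (2019); the file names, column names, and baseline period are invented for illustration.

    import pandas as pd

    # Hypothetical daily precipitation series (mm): station observations
    # and downscaled model output extracted at the station location.
    obs = pd.read_csv("station_daily_precip.csv",
                      index_col="date", parse_dates=True)["precip_mm"]
    mod = pd.read_csv("downscaled_daily_precip.csv",
                      index_col="date", parse_dates=True)["precip_mm"]

    # Restrict both series to a common historical baseline period.
    obs, mod = obs.loc["1976":"2005"], mod.loc["1976":"2005"]

    # Annual maximum 1-day precipitation, a simple extreme statistic.
    obs_amax = obs.groupby(obs.index.year).max()
    mod_amax = mod.groupby(mod.index.year).max()
    print(f"Mean annual-max bias: {(mod_amax - obs_amax).mean():+.1f} mm")

    # Upper-tail quantiles, where downscaled products often diverge
    # most from station records.
    for q in (0.90, 0.99, 0.999):
        print(f"q={q}: obs={obs.quantile(q):.1f} mm, "
              f"mod={mod.quantile(q):.1f} mm")

Repeating such checks for each candidate dataset (e.g., LOCA vs. MACA) against long-term stations is one way an SCA can ground its choice of downscaled product in local observations.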

Fig. 2. Comparison of both common and unique approaches to enhancing credibility, legitimacy, and salience among international, national, and state CAs.

Building support and buy-in through engagement with policy-/decision-makers and other stakeholders was common among the 14 SCAs we reviewed. However, only two assessments considered legitimacy a core concern, focusing on the perceived fairness of who participates and on perceptions about the intended audience. For SCAs, beyond engagement, legitimacy crucially depended on the involvement of local experts. One interviewee explained, “people were really very supportive of what we did because it…wasn’t somebody from another university or an academic somewhere else that was saying this kind of thing to the state. It was local experts with local expertise” (SCA interviewee 4).

Salience depends on working with policy-/decision-makers in the production of assessment information and, similar to assessments at other scales, many of the 14 SCAs we reviewed engaged with state and local policy-/decision-makers and other stakeholders. However, there was considerable variation in who was engaged, how they were engaged, and how often engagement happened. For example, ongoing collaboration with local and state decision-makers in the production of assessment information was a hallmark of the Vermont assessment, whereas Massachusetts used a more consultative approach through periodic workshops and meetings with stakeholders and decision-makers. Colorado fell in between, seeking repeated input from policy-/decision-makers and other stakeholders to direct the assessment scope, refocus efforts along the way, and review the final assessment.

Research shows that interactions between scientists and policy-/decision-makers through boundary organizations—organizations that straddle the science–policy divide, provide trusted information, and establish and maintain relationships with decision-makers—improve the salience and usability of climate information (Agrawala et al. 2001; Bales et al. 2004; Kirchhoff et al. 2013; McNie 2008). For example, Colorado’s assessment relied on a boundary organization, the Western Water Assessment (https://wwa.colorado.edu/about/index.html), to lead the assessment and to facilitate communication and engagement with stakeholders. In addition to Colorado, four other SCAs relied on boundary organizations. Interviews suggest that differences in the quality and level of engagement may affect the salience and usability of assessment information, but it was difficult to separate the influence of engagement from other influential factors (e.g., the motivation for the assessment).

UNIQUE CHALLENGES OF STATE ASSESSMENTS.

National and global CAs are relatively well resourced, with centralized staffing to assist in organization and outreach (Jacobs and Buizer 2016; Jabbour and Flachsland 2017), whereas SCAs often operate with limited human and financial resources and under short time lines. These factors necessitate compromises among developing local scientific products; engaging with the public, policy-makers, and other stakeholders; and implementing effective communication strategies. Our review suggests that SCAs often invest the most time and resources in generating new knowledge; only 2 of the 14 assessments invested heavily in engagement and communication in addition to knowledge creation. Yet, rather than resource limitations, philosophical differences in how to create an SCA (e.g., how much to value engagement and communication) drove differences in this investment. Our review suggests that using boundary organizations can help offset these necessary trade-offs. Boundary organizations can leverage both internal resources and staffing, to help organize and conduct the assessment, and external relationships with stakeholders, to facilitate engagement. Their familiarity with locally credible and salient datasets and models, along with a track record of coproduction, can streamline the production of new knowledge.

SCAs offer the promise of usable climate information at the spatial and temporal scales decision-makers need, yet SCAs are challenged to deliver on this promise because of inadequacies in the availability of finescale, long-term historical climate information, shortcomings in the capability of climate models to project climate at fine spatial and temporal resolutions, and the lack of existing scholarship on climate impacts at the scale of interest. Many interviewees mentioned a mismatch between the availability of long-term historical climate data, on which trend analyses and validation procedures depend, and the fine spatial and temporal scales required by stakeholders. Another mismatch exists with global climate model simulations, whose native spatial resolution (on the order of hundreds of kilometers; Masson and Knutti 2011) is generally much coarser than what stakeholders want for making decisions about local climate impacts. Furthermore, uncertainties in finescale projections are greater (Hawkins and Sutton 2009), contributing to spatially homogeneous projections despite geographical differences (e.g., inland vs. coast, mountain vs. valley). Finally, several interviewees noted challenges with the lack of existing scholarship on areas of interest to state stakeholders. For example, the lack of a robust literature on climate impacts on public health, ecosystems, and agriculture prevented assessment authors in two states from including impacts information on these critical sectors of interest to stakeholders.
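As a concrete illustration of the first mismatch, the sketch below assumes a hypothetical CSV of annual-mean station temperatures (file and column names invented) and shows how the uncertainty of a fitted trend grows as the usable record shortens; sparse or short local records therefore directly limit what an SCA can say about finescale trends.

    import pandas as pd
    from scipy import stats

    # Hypothetical annual-mean station temperature record (°C by year).
    t = pd.read_csv("station_annual_tmean.csv", index_col="year")["tmean_c"]

    # Fit a linear trend to progressively shorter records; the standard
    # error of the slope grows sharply as the record shrinks.
    for n_years in (len(t), 60, 30):
        recent = t.iloc[-n_years:]
        fit = stats.linregress(recent.index.values, recent.values)
        print(f"last {n_years} yr: trend = {10 * fit.slope:+.2f} "
              f"± {10 * fit.stderr:.2f} °C per decade")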

Finally, the IPCC and NCA are mandated to recur at regular intervals, which fosters advancements in CAs and the incorporation of new information. The latest generations of IPCC and NCA assessments are more participatory and cover a broader range of topics and, over time, have generated deeper engagement and support for climate solutions (Jabbour and Flachsland 2017; Jacobs and Buizer 2016; Mach and Field 2017). SCAs may or may not have an official mandate, and few have support for a recurring assessment process. Among the 14 assessments reviewed, only three recur with predictable regularity. In California, the longest-running example, successive assessments have covered a broader range of topics and involved a wider array of expertise and stakeholders over time. For SCAs that lack support for a recurring assessment process, there are fewer opportunities for learning and change, including shifts in engagement quality, capacity building, and support for solutions.

LEARNING FROM EXPERIENCE AND NEXT STEPS FOR STATE ASSESSMENTS.

There are important differences and challenges specific to SCAs that go beyond those of larger-scale assessments, including sparseness of the existing literature for traditional assessment, which necessitates the production of new knowledge; little deliberate focus on fairness in and representativeness of the assessment (a traditional focus of legitimacy); a lack of support for an ongoing assessment process; short time lines and limited funding; and mismatches between the usable information stakeholders want and what state assessments can actually provide. Identifying and learning from these differences and challenges is important for guiding the next generation of SCAs. Lessons learned from our experience and the experience of others in creating SCAs include the following:

  • SCAs should prioritize efforts to address legitimacy and should consider credibility, legitimacy, and salience as core criteria.

  • Ongoing support for SCAs is needed. Lack of support for a recurring assessment process creates fewer opportunities for learning and change and for the associated shifts in engagement quality, capacity building, and support for solutions.

  • SCAs should clearly identify the assessment bounds, where resources will be used, and how trade-offs in the generation of new knowledge, engagement, and communication will be managed.

  • Boundary organizations can help mitigate trade-offs between the production of new knowledge, engagement, and communication with policy-/decision-makers, other stakeholders, and the public.

  • The lack of high-resolution observed and projected data is a key constraint for SCAs.

While these lessons are a useful starting point for future SCAs, more work is needed to fully understand what makes SCAs effective and to inform both broad and specific guidance for SCAs, such as technical guidance on how to apply existing data appropriately at the state scale. In addition, building on existing networks and ongoing efforts to learn from SCAs makes sense (Galford et al. 2016; NAS 2019) as does more carefully examining SCAs to extract benefits and drawbacks of different engagement approaches to improve credibility, legitimacy, salience, and ultimately the usability of information for local policy and decision-making.

ACKNOWLEDGMENTS

Seth, Lombardo, Stephenson, and Wang were supported by the Connecticut Institute for Resilience and Climate Adaptation. Galford was supported by the Norman Foundation and the Rubenstein School of Environment and Natural Resources at the University of Vermont. Karmalkar was supported by the State of Massachusetts and the Northeast Climate Adaptation Science Center at UMass Amherst. Barlow was supported by the National Science Foundation (Award AGS-1623912). The authors would also like to thank the interviewees, who graciously shared their time and experience.

FOR FURTHER READING

  • Abatzoglou, J. T., and T. J. Brown, 2012: A comparison of statistical downscaling methods suited for wildfire applications. Int. J. Climatol., 32, 772–780, https://doi.org/10.1002/joc.2312.
  • Agrawala, S., K. Broad, and D. H. Guston, 2001: Integrating climate forecasts and societal decision making: Challenges to an emergent boundary organization. Sci. Technol. Hum. Values, 26, 454–477, https://doi.org/10.1177/016224390102600404.
  • Bales, R. C., D. M. Liverman, and B. Morehouse, 2004: Integrated assessment as a step toward reducing climate vulnerability in the southwestern United States. Bull. Amer. Meteor. Soc., 85, 1727–1734, https://doi.org/10.1175/BAMS-85-11-1727.
  • Bierbaum, R., and Coauthors, 2013: A comprehensive review of climate adaptation in the United States: More than before, but less than needed. Mitigation Adapt. Strategies Global Change, 18, 361–406, https://doi.org/10.1007/s11027-012-9423-1.
  • Buizer, J. L., and Coauthors, 2016: Building a sustained climate assessment process. Climatic Change, 135, 23–37, https://doi.org/10.1007/s10584-015-1501-4.
  • Cash, D., W. C. Clark, F. Alcock, N. M. Dickson, N. Eckley, and J. Jäger, 2003: Salience, credibility, legitimacy and boundaries: Linking research, assessment and decision making. Harvard University John F. Kennedy School of Government Faculty Working Paper RWP02-046, 25 pp.
  • Cloyd, E., S. C. Moser, E. Maibach, J. Maldonado, and C. Tinqiao, 2016: Engagement in the Third U.S. National Climate Assessment: Commitment, capacity, and communication for impact. Climatic Change, 135, 39–54, https://doi.org/10.1007/s10584-015-1568-y.
  • Creswell, J. W., 2007: Research Design: Qualitative, Quantitative and Mixed Method Approaches. SAGE Publications, 296 pp.
  • Farrell, A. E., and J. Jäger, 2006: Assessments of Regional and Global Environmental Risks: Designing Processes for the Effective Use of Science in Decisionmaking. Resources for the Future Press, 320 pp.
  • Galford, G. L., and Coauthors, 2016: Bridging the climate information gap: A framework for engaging knowledge brokers and decision makers in state climate assessments. Climatic Change, 138, 383–395, https://doi.org/10.1007/s10584-016-1756-4.
  • Galletta, A., 2013: Mastering the Semi-Structured Interview and Beyond: From Research Design to Analysis and Publication. New York University Press, 258 pp.
  • Hawkins, E., and R. Sutton, 2009: The potential to narrow uncertainty in regional climate predictions. Bull. Amer. Meteor. Soc., 90, 1095–1108, https://doi.org/10.1175/2009BAMS2607.1.
  • IPCC, 2014: Climate Change 2014: Synthesis Report. R. K. Pachauri and L. A. Meyer, Eds., IPCC, 151 pp.
  • Jabbour, J., and C. Flachsland, 2017: 40 years of global environmental assessments: A retrospective analysis. Environ. Sci. Policy, 77, 193–202, https://doi.org/10.1016/j.envsci.2017.05.001.
  • Jacobs, K. L., and J. L. Buizer, 2016: Building community, credibility and knowledge: The third US National Climate Assessment. Climatic Change, 135, 9–22, https://doi.org/10.1007/s10584-015-1445-8.
  • Kirchhoff, C. J., M. C. Lemos, and S. Dessai, 2013: Actionable knowledge for environmental decision making. Annu. Rev. Environ. Resour., 38, 393–414, https://doi.org/10.1146/annurev-environ-022112-112828.
  • Mach, K. J., and C. B. Field, 2017: Toward the next generation of assessment. Annu. Rev. Environ. Resour., 42, 569–597, https://doi.org/10.1146/annurev-environ-102016-061007.
  • Masson, D., and R. Knutti, 2011: Spatial-scale dependence of climate model performance in the CMIP3 ensemble. J. Climate, 24, 2680–2692, https://doi.org/10.1175/2011JCLI3513.1.
  • McNie, E. C., 2008: Co-producing useful climate science for policy: Lessons from the RISA program. Ph.D. dissertation, University of Colorado Boulder, 293 pp.
  • Mearns, L. O., and Coauthors, 2013: Climate change projections of the North American Regional Climate Change Assessment Program (NARCCAP). Climatic Change, 120, 965–975, https://doi.org/10.1007/s10584-013-0831-3.
  • Mitchell, R. B., W. C. Clark, D. W. Cash, and N. M. Dickson, 2006: Global Environmental Assessments: Information and Influence. MIT Press, 352 pp.
  • Morgan, M. G., and Coauthors, 2005: Learning from the U.S. National Assessment of Climate Change Impacts. Environ. Sci. Technol., 39, 9023–9032, https://doi.org/10.1021/es050865i.
  • NAS, 2019: Making Climate Assessments Work: Learning from California and Other Subnational Climate Assessments. National Academies Press, 86 pp., https://doi.org/10.17226/25324.
  • Pearce, W., M. Mahony, and S. Raman, 2018: Science advice for global challenges: Learning from trade-offs in the IPCC. Environ. Sci. Policy, 80, 125–131, https://doi.org/10.1016/j.envsci.2017.11.017.
  • Pierce, D. W., D. R. Cayan, and B. L. Thrasher, 2014: Statistical downscaling using localized constructed analogs (LOCA). J. Hydrometeor., 15, 2558–2585, https://doi.org/10.1175/JHM-D-14-0082.1.
  • Saldaña, J., 2016: The Coding Manual for Qualitative Researchers. SAGE Publishing, 328 pp.
  • Seth, A., G. Wang, C. Kirchhoff, K. Lombardo, S. Stephenson, R. Anyah, and J. Wu, 2019: Connecticut Physical Climate Science Assessment Report (PCSAR): Observed trends and projections of temperature and precipitation. Connecticut Institute for Resilience and Climate Adaptation, 68 pp., https://circa.uconn.edu/wp-content/uploads/sites/1618/2019/08/CTPCSAR-Aug2019.pdf.
  • USGCRP, 2017: Climate Science Special Report: Fourth National Climate Assessment. Vol. I, D. J. Wuebbles et al., Eds., U.S. Global Change Research Program, 470 pp., https://doi.org/10.7930/J0J964J6.
  • Vardy, M., M. Oppenheimer, N. K. Dubash, J. O’Reilly, and D. Jamieson, 2017: The Intergovernmental Panel on Climate Change: Challenges and opportunities. Annu. Rev. Environ. Resour., 42, 55–75, https://doi.org/10.1146/annurev-environ-102016-061053.
1 The Rhode Island state climate summary is one of 50 state summaries produced by the North Carolina Institute for Climate Studies (NCICS).
