The Closer, the Better? Untangling Scientist–Practitioner Engagement, Interaction, and Knowledge Use

Maria Carmen Lemos, School for Environment and Sustainability, University of Michigan, Ann Arbor, Michigan (https://orcid.org/0000-0001-6686-730X)

Kimberly S. Wolske, Harris School of Public Policy, University of Chicago, Chicago, Illinois

Laura V. Rasmussen, Department of Forest and Conservation Sciences, University of British Columbia, Vancouver, British Columbia, Canada

James C. Arnott, School for Environment and Sustainability, University of Michigan, Ann Arbor, Michigan, and Aspen Global Change Institute, Basalt, Colorado

Margaret Kalcic, Department of Food, Agricultural and Biological Engineering, The Ohio State University, Columbus, Ohio

Christine J. Kirchhoff, Department of Civil and Environmental Engineering, University of Connecticut, Storrs, Connecticut

Abstract

Scholarship on climate information use has focused significantly on engagement with practitioners as a means to enhance knowledge use. In principle, working with practitioners to incorporate their knowledge and priorities into the research process should improve information uptake by enhancing accessibility and improving users’ perceptions of how well information meets their decision needs, including knowledge credibility, understandability, and fit. Such interactive approaches, however, can entail high costs for participants, especially in terms of financial, human, and time resources. Given the likely need to scale up engagement as demand for climate information increases, it is important to examine whether and to what extent personal interaction is a necessary condition for increasing information use. In this article, we report the results from two experimental studies using students as subjects to assess how three types of interaction (in-person meeting, live webinar, and self-guided instruction) affect different aspects of climate information usability. Our findings show that while in-person interaction is effective in enhancing understanding of climate knowledge, it may not always be necessary, depending on the kinds of information involved and the outcomes desired.

Supplemental information related to this paper is available at the Journals Online website: https://doi.org/10.1175/WCAS-D-18-0075.s1.


© 2019 American Meteorological Society. For information regarding reuse of this content and general copyright information, consult the AMS Copyright Policy (www.ametsoc.org/PUBSReuseLicenses).

Corresponding author: Maria Carmen Lemos, lemos@umich.edu


1. Introduction

Current and future impacts of climate change underscore the need for climate information to support societal responses (Moss et al. 2013). Meeting this societal need for information is nontrivial, as traditional ways of producing and communicating science often fail to yield usable knowledge that meets users’ needs (Kirchhoff et al. 2013). Engagement with practitioners in the process of creating climate information is believed to accelerate the production of usable knowledge. While there have been growing calls for interaction with stakeholders to support climate adaptation (NRC 2010; Williams et al. 2015), there has been relatively little empirical evidence of its impact on actual knowledge use [but see Ford et al. (2013) and Fujitani et al. (2017)]. Given the growing costs and popularity of engagement and interaction among environmental scientists and funding organizations, especially in communicating climate knowledge, there is a critical need to better understand the role of engagement and interaction in increasing knowledge use. On the one hand, we need to design better ways to evaluate and assess the impact of all forms of engagement in increasing knowledge use and supporting societal and ecological well-being (Klenk et al. 2015; Lemos et al. 2014; Meadow et al. 2015; Wall et al. 2017). On the other hand, we need to make better use of the science of understanding knowledge use to inform the practice and design of engagement processes (Lemos et al. 2018). In this study, we use randomized controlled experiments to better understand how interaction between scientists and potential users shapes drivers of knowledge use, such as understanding, credibility, and perceptions of fit (Briley et al. 2015; Cash et al. 2003; Parris et al. 2016).

While there is growing evidence that engagement enhances usability—that is, the likelihood that knowledge will be used—recent scholarship has increasingly called attention to the amount of resources necessary to sustain face-to-face science–practice interactions (Kettle and Trainor 2015; Lemos et al. 2014). These costs include financial and logistical resources for getting scientists and users together, the time spent by producers and users in repeated interaction, and less tangible costs such as the long-term commitment required to build trust and legitimacy, all of which are often mentioned as significant constraints to usability (Pidgeon and Fischhoff 2011). On the one hand, concerns about resource demands for engagement have centered on the resources required of producers. These include the institutional and organizational constraints scientists face in engaging with users (Briley et al. 2015; Lemos and Morehouse 2005), the relatively low number of scientists willing to engage, and a perceived mismatch between the growing need for engagement and the willingness to meet it (McNie 2007). On the other hand, there is concern about the resource demands placed on potential users: engagement efforts tend to focus on a relatively small number of decision-makers involved in climate-related decisions at the local level, leading to “stakeholder fatigue,” and users may face personal risks when their place of employment discourages engagement (Lemos et al. 2018). Moreover, potential users are increasingly reluctant to interact with climate information producers because of the high costs involved in traveling and lost work days (e.g., Kettle and Trainor 2015). Finally, the financial and human resources needed to organize such interactions are often unavailable. Understanding these costs and how to offset them is important both for maximizing existing resources and for scaling up engagement processes across new sectors and communities.

One way to reduce the costs of engagement, particularly the cost and time associated with traveling and hosting in-person meetings, is to explore different ways of communicating and interacting with potential users. With the steady advance of technology, there are now many options for enabling effective remote interaction, perhaps making it a viable alternative to in-person interaction. While the effectiveness of remote interaction for building trust, sustaining effective communication, and exchanging knowledge has been explored in business and other contexts (Alsharo et al. 2017; Henttonen and Blomqvist 2005; Jarvenpaa and Leidner 1999), relatively little work has been done within the context of climate change research and application [but see Kettle and Trainor (2015)]. As such, we know very little about the effectiveness of remote interaction or its viability as an alternative to face-to-face interaction in supporting engagement in this context (Lach and Rayner 2017). This is especially the case for oft-cited factors that influence the usability of climate information: understanding, credibility, and fit (Lemos et al. 2012).

In this article, we report the results of two experimental studies, using University of Michigan students as subjects, to assess how three types of interaction (in-person meeting, live webinar, and self-guided instruction) affect different aspects of climate information usability and uptake. To our knowledge, this is the first effort using an experimental design to explore how different types of interaction—which is at the heart of engagement—influence climate knowledge uptake. Our findings show that while in-person interaction is sometimes effective at enhancing understanding of climate knowledge, in-person interaction may not always be necessary, depending on the kinds of information involved and outcomes expected.

In choosing to carry out the experiments with students, we are aware of the potential limitations of our findings compared with using actual decision-makers as subjects. Our reasons for carrying out the experiments with students were twofold. First, feasibility: the logistics of carrying out randomized field experiments with samples large enough to allow statistical analyses were daunting without a compelling proof of concept that our ideas were viable. Second, while working with actual practitioners would have been ideal, previous research has shown that the benefits of using students, in terms of cost and recruitment efficiency, may outweigh the costs to external validity, as student and nonstudent responses are often largely equivalent (Anderson and Edwards 2015).

In the next sections, we first describe the literature on knowledge use that grounds our experiments and then introduce the two studies that inform our findings. Subsequently, we describe each experiment in detail, including methods, analyses, and findings.

2. Literature review

a. Information use and usability

Questions about the use of information attract broad interest from scholars, policy-makers, practitioners, and funders alike. As an area of social inquiry, these questions motivate research to better understand the conditions under which scientific information and other forms of knowledge get used by people and organizations in the course of decision-making (Gitomer and Crouse 2019). While pioneering work on this topic occurred in the late 1970s and early 1980s (e.g., Caplan 1979; Rich 1981; Weiss 1979), more recent scholarship is emerging in the context of different social problem domains such as education (Tseng 2012), health (Holmes et al. 2012), climate change (Kirchhoff et al. 2013), and sustainable development (Clark et al. 2016).

Across these arenas, the meaning of “use” and what drives the use of information admit a range of definitions and explanations. First, use may refer to direct inputs to decision-making and implementation to support problem-solving. Second, use may refer to shaping how issues or agendas are framed, or to general enlightenment and the rationalizing of preconceived actions, decisions, or value judgments for political or tactical ends (Weiss 1979). Some explanations for why information is used (or not used) examine the quality or form of the information itself or the social or organizational context in which it is used (Landry et al. 2003). A recurring and dominant explanation, examined across time and contexts, focuses on the disconnect—institutional, cultural, even linguistic—between where information is produced and where it is used (Caplan 1979). This disconnect, in turn, hinders access to potentially useful information or leads to the production of information that is not relevant or does not fit decision contexts.

One line of study for understanding how to increase information use in decision-making examines the role of interaction between researchers and practitioners. For example, early research by David Cash and colleagues (Cash et al. 2003) found that environmental assessments would be more likely to be perceived by practitioners as credible, relevant, and legitimate if their production entailed some form of interaction between the providers and users of the assessments. Lemos and Morehouse (2005) argued that iteration between researchers and users was a necessary condition for the coproduction of usable knowledge. Subsequent work further suggests that particular kinds of information, like seasonal climate forecasts (Dilling and Lemos 2011) and downscaled climate projections (Vogel et al. 2016), could be rendered more usable for decision-making when produced through producer and user interactions, especially when addressing the complexities and uncertainties embedded in data-intensive climate information (Briley et al. 2015; Kirchhoff 2013; Kirchhoff et al. 2015b; McNie 2013).

b. Types of engagement and interaction

Much of the research on how interaction enhances climate information use centers on in-person engagements between producers and users, leading many to argue that sustained, in-person interactions increase usability. This is not surprising given that research on scientist–practitioner interaction tends to emphasize the importance of relationship building and trust (Brugger and Crimmins 2015; Dilling and Lemos 2011; Jones et al. 2016; Moss 2016). In particular, personal interaction that builds trust and understanding in the context of coproduction also increases users’ willingness to share the resulting information and learning within their organizations and networks (Kirchhoff et al. 2015a). Yet while scientist–practitioner interaction critically improves usability, doing it “right” is resource intensive, requiring not only financial and logistical resources but also time and long-term commitment from both producers and users to sustain collaboration over time (Pidgeon and Fischhoff 2011).

Mitigating this resource intensiveness and advancing our ability to meet expected demand for climate information requires exploring how different forms of interaction affect information use. First, by better understanding what specific characteristics of in-person interaction enhance different dimensions of usability, we may be able to reduce the costs of interaction by leveraging the capacity for engagement through webinars and other virtual technologies. Second, we may also be able to better evaluate other forms of knowledge sharing such as web-based decision-support tools, which have great potential to scale up use. For example, the proliferation of online decision support tools for climate decision-making (see, e.g., NOAA’s resilience tool kit—https://toolkit.climate.gov/) suggests that careful evaluation of the usability of remote interaction with climate information is overdue.

With the steady advance of technology, there are many more options that potentially enable effective remote interaction. Research in business and related fields has explored different forms of remote interaction and their role in building trust, sustaining effective communication, and exchanging knowledge among virtual teams (Alsharo et al. 2017; Bhappu et al. 2001; Henttonen and Blomqvist 2005; Jarvenpaa and Leidner 1999). The evidence from these studies is mixed. For example, Bhappu et al. (2001) found that computer-mediated communication helped virtual team members with diverse backgrounds acquire and integrate different knowledges more effectively. Alsharo et al. (2017) found that sharing knowledge among virtual teams helped to build trust and collaboration (although they did not find a corresponding increase in team effectiveness). In contrast, Cramton and Orvis (2003) found that social information (e.g., about an individual’s networks, motives, and goals) and contextual information (e.g., about norms, rules, and expectations) are particularly difficult to share in virtual environments, potentially leading to misunderstanding and a breakdown of trust. Similarly, Riopelle et al. (2003) found that remote technologies must be carefully matched to the task and context: for complex tasks in complex contexts, face-to-face communication may be the best way to facilitate understanding and task completion. While we know a great deal about remote interaction in business and related contexts, we know very little about the effectiveness of remote interaction, or its viability as an alternative to face-to-face interaction, in supporting climate information use.

In the area of distance learning, evaluations of in-person versus distance or remote learning have been carried out for many years. Early research suggested that Internet-based learning may yield few differences in outcomes relative to classroom instruction, and perhaps even benefits (e.g., Bernard et al. 2004). Two meta-analyses of such studies suggest that online learners perform better than students in traditional learning environments (Means et al. 2013; Means et al. 2009). It is unclear, however, whether these results can be attributed to the mode of delivery per se, as the instructional methods used in online courses and face-to-face classrooms often differ. Furthermore, some research has found that online learning has significant advantages only when it also includes an element of face-to-face interaction (i.e., a “blended” delivery mode). In the context of training, such as one-time skill development or continuing education, additional studies have found similar or even enhanced learner performance, for example in library instruction and health training (Hemmati et al. 2013; Silk et al. 2015). In the public health arena, online training has become popular enough that studies now focus fully on the efficacy of online efforts (Colleran et al. 2012; Webb et al. 2017), which promise expanded and accelerated health worker training in underserved or under-resourced areas (Rowe et al. 2005).

3. Study experiments: Description and methods

Our studies investigate how three different forms of interaction—in-person meeting, live webinar, and self-guided web-based instruction—influence climate information use for decision-making. For ease of conducting the studies, our focus is on one-time interactions, such as might be used to introduce practitioners to new climate tools or to share new research findings that may affect practitioners’ work. We assume an in-person meeting to be more resource intensive (e.g., logistically and in terms of human and financial resources) than a live webinar. Following the same logic, we assume a live webinar to be more resource intensive than self-guided instruction.

We compare these different forms of interaction through two randomized experiments. In both studies, experienced climate information brokers (scientists who have worked with potential users to help them learn about and potentially use scientific information) interacted with participants in semicontrolled environments for the in-person meeting and live webinar. For purposes of the experiment, we refer to the climate information broker as the “instructor.” The first study (2015) was designed as a “proof of concept” seeking to explore the assumption that “closer” interaction would lead to better understanding and intention to use climate information in a decision context. Study 2, carried out in 2016, sought to further explore and validate the results of study 1 while also examining whether the type of interaction affects decision-making. All study protocols were approved by the Institutional Review Board at the University of Michigan.

In both studies, we examine the effects of interaction on three dimensions of usability: understanding, credibility, and fit. Given prior scholarship, we expected in-person interaction to yield greater levels of understanding, credibility, and perceived fit relative to other forms of scientist–user interaction. We additionally measure uptake of information. In study 1, this takes the form of intentions to use the presented climate information, while in study 2 we ask participants to draw upon the information provided to make a decision within a hypothetical scenario and then reflect on which types of information informed their decision-making. Specifically, we expected the in-person group to be more accepting of uncertain projections from climate models and thus more likely to report using that information.

a. Study 1

In our first study, we tested whether the form of interaction affects understanding of and intention to use information provided in a climate adaptation planning tool.

1) Participants and procedure

To approximate potential users’ expertise in the context of climate-related decision-making, we recruited graduate students (N = 46) at the University of Michigan with either environmental/natural resources or urban planning backgrounds. Students were offered a $35 Amazon gift card in exchange for their participation.

Students interested in participating provided their availability during two 4-hour blocks in May 2015. Those who signed up for a given time block were then randomly assigned to one of three tutorials: in-person meeting, live webinar, or self-guided instruction (i.e., written instructions and recorded videos). This stratified randomization process helped ensure that students with similar characteristics (e.g., motivated students who signed up for the first time slot) would be distributed across the three treatments. The final sample sizes per condition were 11 students in the in-person meeting, 16 in the live webinar, and 19 in the self-directed group.

Students were told that the purpose of the study was to evaluate the Cities Impact and Adaptation Tool (CIAT; http://graham-maps.miserver.it.umich.edu/ciat/home.xhtml), an online resource aimed at helping city planners plan and implement adaptive responses to climate change. All students were asked to complete a tutorial about the tool, which, depending on their assigned treatment, occurred through an in-person meeting, a live webinar, or self-guided instruction on the CIAT website. In all conditions, students were shown how to examine both historical climate data and modeled projections to ascertain whether and how temperatures and precipitation levels within a region might change. At the end of the presentation, students had the opportunity to ask questions of the presenter; students in the in-person condition tended to ask more questions than those in the webinar. Following the tutorial, participants completed a survey about their understanding and perceptions of the data presented.

2) Measures

To test objective understanding of CIAT data, students completed a short quiz with 19 possible correct answers. All other measures on the survey were assessed through five- or seven-point scaled questions (see Table 1). Students separately rated the understandability and credibility of both the observed historical data in the tool as well as the projected climate model data presented. We also measured understanding of the tool itself by asking students to rate their difficulty in learning the tool and whether they wanted additional guidance for using it. To assess fit—that is, the appropriateness of the information for city decision-makers—we asked participants to rate the perceived riskiness of making decisions based on the tool. Finally, as a measure of uptake, we asked respondents about their intentions to use or recommend the tool in the future. Where appropriate, we used principal component analysis with oblimin rotation to reduce the number of items into a smaller set of reliable scales.
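As an illustration of this item-reduction step, the sketch below uses the Python factor_analyzer package to extract principal components with an oblimin rotation. It is a minimal sketch, not the authors’ analysis script: the data file, item names, and number of components are hypothetical placeholders.

```python
# Minimal sketch of scale construction via principal components with an
# oblimin rotation. File name, item names, and n_factors are hypothetical
# placeholders, not the actual study 1 items or solution.
import pandas as pd
from factor_analyzer import FactorAnalyzer

responses = pd.read_csv("study1_survey.csv")  # hypothetical file
items = responses[["cred_1", "cred_2", "cred_3", "underst_1", "underst_2"]]

# method="principal" requests principal component extraction;
# rotation="oblimin" allows the resulting components to correlate.
fa = FactorAnalyzer(n_factors=2, method="principal", rotation="oblimin")
fa.fit(items)

loadings = pd.DataFrame(fa.loadings_, index=items.columns)
print(loadings)  # items loading strongly on a component form one scale
```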

Table 1. Study 1 survey items. Unless otherwise noted, all items were on 7-point scales. Understandability and credibility were rated on semantic differential scales; other items were scaled from 1 = strongly disagree to 7 = strongly agree.

3) Results

Because of the small sample sizes and nonnormal distribution of the data, we initially used Kruskal–Wallis H tests with Dunn’s test for multiple comparisons to identify differences between treatments. These analyses revealed that the in-person and live webinar treatments did not differ significantly in any respect (all p values > 0.3), including in participants’ evaluations of the scientist presenter (referred to as the instructor in the study 1 survey; see Table 1) (U = 73.5, p = 0.481) and the perceived level of interaction during the training (U = 73.5, p = 0.481), both of which were measured only for the in-person and webinar groups. The relationships between each of these two treatments and the self-guided treatment also followed similar trends, with the exception of the results for uptake intentions. We therefore combined the in-person and live webinar treatments in subsequent analyses to enhance statistical power. Combining these treatments also resulted in observations that more closely approximated a normal distribution, thereby allowing us to use independent t tests to compare the scientist-led and self-directed groups.
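To make this analysis sequence concrete, the sketch below shows one way it could be reproduced in Python with SciPy and scikit-posthocs. This is a hedged illustration, not the authors’ code: the data file, column names, and the Holm correction for Dunn’s test are assumptions.

```python
# Minimal sketch (not the authors' code) of the analysis sequence above.
# File name, column names, and the Holm correction are hypothetical.
import pandas as pd
import scikit_posthocs as sp
from scipy import stats

df = pd.read_csv("study1_outcomes.csv")  # columns: treatment, quiz_score

# Omnibus Kruskal-Wallis H test across the three treatments.
groups = [g["quiz_score"].to_numpy() for _, g in df.groupby("treatment")]
H, p = stats.kruskal(*groups)

# Dunn's test for pairwise comparisons, with a multiplicity correction.
pairwise_p = sp.posthoc_dunn(df, val_col="quiz_score",
                             group_col="treatment", p_adjust="holm")

# After pooling in-person and webinar into a "scientist-led" group,
# an independent-samples t test compares it with the self-guided group.
led = df.loc[df["treatment"].isin(["in_person", "webinar"]), "quiz_score"]
self_guided = df.loc[df["treatment"] == "self_guided", "quiz_score"]
t, p_t = stats.ttest_ind(led, self_guided)
print(H, p, p_t)
```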

As shown in Fig. 1, no differences were found between scientist-led (in-person + live webinar) and self-directed trainings in terms of the understandability and credibility of the climate information presented or in the perceived riskiness of using climate models to inform decision-making (i.e., fit). We did observe, however, modest differences in objective knowledge, with the self-guided group performing slightly worse (M = 15.79 correct responses, SE = 0.31) on the quiz than those trained by a scientist [M = 17.07, SE = 0.32, t(44) = 2.75, p = 0.009, d = 0.84]. Self-guided participants had greater difficulty learning the tool (M = 2.74, SE = 0.22) and reported wanting more guidance (M = 5.11, SE = 0.23) on how to use it than those trained by a scientist [difficulty: M = 2.12, SE = 0.19, t(44) = 2.10, p = 0.042, d = 0.65; guidance: M = 4.07, SE = 0.27, t(44) = 2.72, p = 0.009, d = 0.84] (Fig. 1). In terms of uptake intentions, preliminary analyses suggested that the in-person group had higher intentions than the self-guided group (padj = 0.047, r = 0.44), but the effect disappeared when the in-person and webinar treatments were combined (Fig. 1).

Fig. 1. Study 1: Mean ratings with 95% confidence intervals of climate data perceptions for scientist-led vs. self-guided trainings. All measures are on 7-point scales with higher values indicating higher endorsement.

b. Study 2

Study 2 tested whether the form of interaction influences climate information uptake in the context of a risky decision. Unlike study 1, where students learned about a climate tool for which they had no immediate use, study 2 asked participants to play the role of a water utility manager tasked with making a long-term investment decision to deal with harmful algal blooms (HABs). To inform their decision making, we presented information about the potential impacts of climate change on future occurrences of HABs, again manipulating whether this information was delivered through an in-person meeting, live webinar, or self-guided instruction (via a prerecorded webinar).

1) Participants

Participants (N = 156) were undergraduate and graduate students at the University of Michigan with backgrounds in natural resource management, urban planning, and business. Students were offered a $30 Amazon gift card to complete a short reading assignment, attend a presentation, and respond to two short questionnaires. Only participants who completed all parts of the study were included in the dataset. The final sample sizes per condition were 55 students in the in-person group meeting, 50 in the live webinar, and 51 in the self-directed group.

2) Procedure and materials

To participate in the study, students first completed an online form that included questions about their program of study and year in school. They then signed up for one of nine time slots offered over a three-day period in September 2016. We randomly assigned students to each treatment through a two-stage process. In the first stage, we randomized time slots such that four slots were assigned to the in-person treatment, four to the live webinar treatment, and one to the self-guided recorded webinar treatment. Within each of the in-person and webinar time slots, we then stratified participants according to their major and tenure (year in program). From these stratified groups, we randomly selected a set number of students to participate in the self-directed treatment (which was done online during the students’ own time). This process ensured that students with similar experience and backgrounds were evenly distributed across the three treatments.
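As an illustration of this two-stage assignment, the sketch below implements the described logic in Python with pandas. It is a hypothetical reconstruction rather than the authors’ script: the field names, the random seed, and the per-stratum quota are placeholders.

```python
# Hypothetical reconstruction of the two-stage stratified randomization
# described above (not the authors' actual script). Field names, the seed,
# and the per-stratum quota are illustrative placeholders.
import random
import pandas as pd

random.seed(42)
signups = pd.read_csv("signups.csv")  # columns: student_id, slot, major, year

# Stage 1: randomly assign the nine time slots to treatments (4/4/1).
slots = sorted(signups["slot"].unique())
random.shuffle(slots)
treatments = ["in_person"] * 4 + ["webinar"] * 4 + ["self_guided"]
signups["treatment"] = signups["slot"].map(dict(zip(slots, treatments)))

# Stage 2: within each major-by-year stratum, randomly move a set number
# of in-person/webinar students to the self-guided (own-time) condition.
QUOTA = 2  # placeholder for the study's "set number"
for _, stratum in signups.groupby(["major", "year"]):
    movable = stratum[stratum["treatment"] != "self_guided"]
    picked = movable.sample(n=min(QUOTA, len(movable)), random_state=42)
    signups.loc[picked.index, "treatment"] = "self_guided"
```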

Upon signing up to participate, students were directed to an online pretest survey. The survey included a scenario (held constant across all three treatments) in which we asked students to assume the role of a drinking water utility manager for a city on Lake Erie experiencing harmful algal blooms (HABs; see the online supplemental material). The utility manager (i.e., the experiment participant) had five investment options for protecting the city from future HABs. Larger investments would provide greater protection from HABs but would divert funds from other important city programs. Participants had to weigh the risk of future HABs against the risk of wasting city funds, bearing in mind that the occurrence of future HABs was uncertain and dependent on factors such as climate change and regional agricultural practices. After reading the scenario, participants completed the pretest survey by selecting their investment decision.

Students participated in the experimental portion of the study (in-person group seminar, live webinar, or self-guided recorded webinar) four to eight days later. In each condition, an environmental scientist well-versed in topics related to climate information and harmful algal blooms (held constant across all treatments) presented information on how climate change could influence the occurrence and severity of HABs in the future. During the presentation, the scientist explained that changing temperature and precipitation levels may influence future HABs. Of these two factors, the connection between precipitation and HABs was described as being less certain. The presenter further explained that decision-makers have three types of climate data (for either temperature or precipitation) that might be used to make predictions about future HABs: projections from historical data, current observations, and projections from climate models.

To ensure a minimum level of interaction between the scientist and participants, we used two confederate students to ask the same predetermined questions in each of the conditions (including the recorded webinar in the self-directed condition). Students in the in-person meeting and live webinar could ask additional questions. More student-generated questions were observed in the in-person meeting than in the live webinar. Immediately following the presentation, students in the in-person meeting completed the posttest survey in an adjacent computer laboratory while participants in the live webinar were emailed a link to the posttest survey. Participants in the self-guided condition were instructed via e-mail to visit a website where they could watch a recorded webinar before completing the posttest survey.

3) Measures

The investment options presented to students on both the pretest and posttest are provided in the supplemental material. The choices were scaled such that each successive option required a greater upfront investment of money. Students were told that spending more money upfront would reduce the cost of future HAB events but doing so came with the risk of wasting city funds. If the number of future HABs was low, the money—which could have gone to other important city programs—would be wasted. If students underinvested and the number of future HABs was high, the city would have to borrow funds from other programs.

The posttest also included items to assess the overall quality of the tutorial and perceived usability of the information presented (Table 2). Similar to study 1, students rated the level of interaction with the scientist presenter (“instructor” in the study 2 survey; Table 2), the quality of the presentation, and how credible and engaging they found the presenter to be. Additional measures assessed the overall usability of the information presented, using separate items for fit, credibility, and understanding.

Table 2. Study 2 survey items.

We also asked students about the fit and credibility of the different types of climate data presented. We defined fit in terms of how relevant, useful, and informative students found the climate data presented for their decision on how to handle HABs. An initial question asked students to rate how much each type of climate data (current and historical observations, projections from historical data, and climate model predictions), in general, influenced their decision-making. Students then rated the perceived fit and credibility of the different types of temperature and precipitation data presented (i.e., projections from historical temperature data, projections from historical precipitation data, current observations of temperature, current observations of precipitation, projections from climate model temperature data, and projections from climate model precipitation data). To examine differences on these measures between experimental conditions, we ran a series of mixed factorial ANOVAs, treating data type (current observations, historical projections, and climate model projections) as the within-subjects factor and experimental treatment as the between-subjects factor.
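For readers unfamiliar with this design, the sketch below shows one way such a mixed factorial ANOVA could be run in Python with the pingouin package, assuming the ratings are reshaped into long format. The file and column names are hypothetical, not the actual study data.

```python
# Minimal sketch of a mixed factorial ANOVA with pingouin, assuming the
# ratings are in long format (one row per participant x data type).
# File and column names are hypothetical placeholders.
import pandas as pd
import pingouin as pg

long = pd.read_csv("study2_fit_long.csv")
# columns: subject, treatment, data_type, fit_rating

aov = pg.mixed_anova(data=long, dv="fit_rating",
                     within="data_type",    # within-subjects factor
                     between="treatment",   # between-subjects factor
                     subject="subject")
print(aov)  # the "Interaction" row tests data type x treatment
```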

4) Results

As shown in Fig. 2, the experimental manipulation demonstrated that participants perceived three different levels of interaction with the instructor [Welch’s F(2, 98.15) = 49.68, p < 0.001, est. ω2 = 0.38], but otherwise found the quality of the presentation and the credibility of the instructor to be equivalent. Perceptions of how engaging the presenter was also varied across treatments [F(2, 153) = 6.90, p = 0.001, ω2 = 0.07] with in-person participants rating the presenter as more engaging than participants in either the live webinar (p = 0.001, d = 0.38) or prerecorded webinar (p = 0.028, d = 0.50). Despite differences in perceived level of interaction, the treatments did not lead participants to perceive differences in terms of the overall fit, understandability, or credibility of the information presented (Fig. 3).
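A minimal sketch of how such variance-robust comparisons could be computed, again using pingouin under hypothetical file and column names, follows.

```python
# Minimal sketch of a Welch ANOVA with Games-Howell post hoc tests using
# pingouin; both are robust to unequal group variances. File and column
# names are hypothetical placeholders.
import pandas as pd
import pingouin as pg

df = pd.read_csv("study2_presentation.csv")
# columns: treatment, interaction_rating

welch = pg.welch_anova(data=df, dv="interaction_rating", between="treatment")
gh = pg.pairwise_gameshowell(data=df, dv="interaction_rating",
                             between="treatment")
print(welch)
print(gh)  # pairwise mean differences with corrected p values
```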

Fig. 2. Study 2: Mean ratings with 95% confidence intervals of presentation quality across treatment conditions, measured on (a) 5-point scales and (b) 7-point scales. Mean ratings were compared using one-way analysis of variance (ANOVA) with Tukey or Games–Howell post hoc tests, as appropriate.

Fig. 3. Study 2: Mean ratings with 95% confidence intervals of climate data usability (in general). All measures are on 5-point scales with higher values indicating greater endorsement. Mean ratings were compared using one-way ANOVA with Tukey or Games–Howell post hoc tests, as appropriate.

Next, we examined whether perceptions of different types of climate data varied by treatment. No main effects were found for experimental condition on any of the outcome variables, and, with one exception, no interactions were found between data type and treatment condition (see Figs. 4 and 5). Overall, the results suggest that perceptions of different data sources did not differ across treatments. The exception was the perceived fit of climate model precipitation data, for which we observed a significant interaction between data type and experimental condition [F(4, 306) = 3.56, p = 0.008]. As shown in Fig. 5b, participants in the self-directed group perceived the fit of this information as lower and thus, according to the literature reviewed above, might be less likely to use projected precipitation data from climate models than participants in either the in-person group or the live webinar.

Fig. 4. Study 2: Influence of different types of climate data on investment decision by experimental condition. Participants rated how much each type of data influenced their decision for treating HABs on a scale from 1 = not at all to 5 = very much.

Fig. 5. Study 2: Mean ratings of credibility and fit for each of the six types of climate data presented.

Finally, to assess whether treatment condition influenced participants’ investment decisions, we calculated change scores from pretest to posttest. Because most students did not change their investment plan, the data were not normally distributed, and we used a Kruskal–Wallis H test to examine whether there were differences between treatments. No significant differences were found [H(2) = 1.91, p = 0.384].

4. Discussion

Based on our two studies, we find limited support for the hypothesis that in-person interactions yield greater understanding and use of information relative to other forms of scientist–user interaction. While study 1 suggests there may be marginal benefits to disseminating climate information through forms of interaction in which practitioners have direct contact with knowledge producers, we found no differences in perceptions of the overall fit, understandability, or credibility of the information between treatment groups in study 2.

Yet, a few observations deserve attention. In study 1, participants who had a scientist guide them through the CIAT tool found it easier to understand and demonstrated greater understanding of the information presented; however, it did not appear to matter whether that guidance was delivered in person or through a live webinar. While study 2 indicates that both webinar and self-guided instruction may be reasonable alternatives to in-person interaction for enhancing usability (fit, understanding, and credibility of information), we found one exception: the perceived fit of climate precipitation data. Participants in the self-directed group reported lower perceived fit of climate precipitation data than participants in either the in-person group or the live webinar. This suggests that for more uncertain climate change projections, such as precipitation, more interaction is better.

Based on these results, we argue that to improve and potentially scale up climate information uptake, climate scientists and information brokers should weigh the transaction costs associated with in-person interaction against its expected gains. Intensive efforts to interact with practitioners may be best reserved for complex information and for contexts in which there is no substitute for in-person interaction. For example, situations in which information is complex or highly uncertain, such as climate precipitation projections, may require in-person interaction. Similarly, in contexts where local politics or distrust of science may inhibit action, close and meaningful interaction to build legitimacy and trust may be desirable. In contrast, where credibility may not be an issue (e.g., when information is delivered by a well-respected university-based scientist with knowledge-brokering expertise), remote means of interaction could offer tangible advantages in terms of lower human and time costs without forgoing the opportunity for trust building.

Several methodological limitations point to avenues for future research. First, as mentioned before, our studies were conducted with a relatively small sample of students rather than practitioners in the field. While we attempted to recruit students who might reasonably use climate information in their future careers, and our experimental design sought to create realistic stakes in a decision-making process, students may not have been as personally invested in the quality of the tool presented in study 1 or in the tradeoffs associated with the harmful algal bloom scenario described in study 2. Second, our studies speak only to the effects of one-time interactions between knowledge producers and users. Despite these limitations, our findings are consistent with those of scholars who find that virtual interaction can achieve certain goals as effectively as face-to-face instruction (Alsharo et al. 2017; Bhappu et al. 2001; Means et al. 2013). Additional field research is needed to determine how and when these results generalize to different real-world contexts, including how power dynamics and governance contexts among and between different groups of practitioners would influence the role of in-person versus virtual interaction with scientists.

5. Conclusions

Through two randomized experiments, we examined whether different forms of interaction influence knowledge users’ understanding of climate information, as well as their perceptions of its credibility and fit, in utilizing climate tools to support decision-making. Together, the results of studies 1 and 2 show that in the context of one-off efforts to enhance climate information usability, increased interaction between knowledge producers and users may offer few advantages over less resource-intensive approaches. In both studies, the live webinar and in-person meetings led to similar outcomes and, with rare exceptions, offered little advantage over participants who viewed the same materials on their own.

Our study is one of the first attempts to investigate the effects of science–practice interaction on different drivers of climate science usability through a randomized experiment. We believe this experimental approach has the potential to significantly increase understanding of how different forms of remote communication can be used to augment in-person engagement efforts. Rather than challenge the compelling evidence that person-to-person interaction fosters usability, our results suggest that there may be alternative avenues to enhance usability and to aid interaction that complement (rather than replace) well-established best practices documented in the climate science literature. While there are many other aspects of the role of engagement in increasing the usability of scientific knowledge that need to be explored, our findings suggest that climate scientists, information brokers, and practitioners should consider that more face-to-face interaction may not always be better. Given limited resources and the urgency of climate change, strategic investment of time and effort is essential.

Acknowledgments

The research for this article was supported by National Science Foundation (NSF) Grant 1039043 and National Oceanic and Atmospheric Administration (NOAA) Grant NA15OAR4310148. We wish to thank Avik Basu for his advice on study 1, and Dan Brown and Ashley Grace for the presentations of the CIAT tool in study 1. Author contributions: M.C.L. conceptualized the study. K.S.W. designed, executed, and analyzed data from studies 1 and 2. K.S.W., J.C.A., L.V.R., M.K., and C.J.K. designed and executed study 2. All authors contributed to the manuscript. Data: Survey data for studies 1 and 2 can be found at https://osf.io/eb8ck/.

REFERENCES

  • Alsharo, M., D. Gregg, and R. Ramirez, 2017: Virtual team effectiveness: The role of knowledge sharing and trust. Inf. Manage., 54, 479–490, https://doi.org/10.1016/j.im.2016.10.005.
  • Anderson, D. M., and B. C. Edwards, 2015: Unfulfilled promise: Laboratory experiments in public management research. Public Manage. Rev., 17, 1518–1542, https://doi.org/10.1080/14719037.2014.943272.
  • Bernard, R. M., and Coauthors, 2004: How does distance education compare with classroom instruction? A meta-analysis of the empirical literature. Rev. Educ. Res., 74, 379–439, https://doi.org/10.3102/00346543074003379.
  • Bhappu, A. D., M. Zellmer-Bruhn, and V. Anand, 2001: The effects of demographic diversity and virtual work environments on knowledge processing in teams. Virtual Teams, Vol. 8, Advances in Interdisciplinary Studies of Work Teams, M. M. Beyerlein, D. A. Johnson, and S. T. Beyerlein, Eds., Emerald Group Publishing, 149–165.
  • Briley, L., D. Brown, and S. E. Kalafatis, 2015: Overcoming barriers during the co-production of climate information for decision-making. Climate Risk Manage., 9, 41–49, https://doi.org/10.1016/j.crm.2015.04.004.
  • Brugger, J., and M. Crimmins, 2015: Designing institutions to support local-level climate change adaptation: Insights from a case study of the U.S. Cooperative Extension System. Wea. Climate Soc., 7, 18–38, https://doi.org/10.1175/WCAS-D-13-00036.1.
  • Caplan, N., 1979: The two-communities theory and knowledge utilization. Amer. Behav. Sci., 22, 459–470, https://doi.org/10.1177/000276427902200308.
  • Cash, D. W., W. C. Clark, F. Alcock, N. M. Dickson, N. Eckley, D. H. Guston, J. Jäger, and R. B. Mitchell, 2003: Knowledge systems for sustainable development. Proc. Natl. Acad. Sci. USA, 100, 8086–8091, https://doi.org/10.1073/pnas.1231332100.
  • Clark, W. C., L. van Kerkhoff, L. Lebel, and G. C. Gallopin, 2016: Crafting usable knowledge for sustainable development. Proc. Natl. Acad. Sci. USA, 113, 4570–4578, https://doi.org/10.1073/pnas.1601266113.
  • Colleran, K., and Coauthors, 2012: Building capacity to reduce disparities in diabetes: Training community health workers using an integrated distance learning model. Diabetes Educ., 38, 386–396, https://doi.org/10.1177/0145721712441523.
  • Cramton, C. D., and K. L. Orvis, 2003: Overcoming barriers to information sharing in virtual teams. Virtual Teams that Work: Creating Conditions for Virtual Team Effectiveness, C. B. Gibson and S. G. Cohen, Eds., John Wiley & Sons, 214–229.
  • Dilling, L., and M. C. Lemos, 2011: Creating usable science: Opportunities and constraints for climate knowledge use and their implications for science policy. Global Environ. Change, 21, 680–689, https://doi.org/10.1016/j.gloenvcha.2010.11.006.
  • Ford, J. D., M. Knight, and T. Pearce, 2013: Assessing the ‘usability’ of climate change research for decision-making: A case study of the Canadian International Polar Year. Global Environ. Change, 23, 1317–1326, https://doi.org/10.1016/j.gloenvcha.2013.06.001.
  • Fujitani, M., A. McFall, C. Randler, and R. Arlinghaus, 2017: Participatory adaptive management leads to environmental learning outcomes extending beyond the sphere of science. Sci. Adv., 3, e1602516, https://doi.org/10.1126/sciadv.1602516.
  • Gitomer, D. H., and K. Crouse, 2019: Studying the use of research evidence: A review of methods. William T. Grant Foundation Monograph, 90 pp., http://wtgrantfoundation.org/library/uploads/2019/02/A-Review-of-Methods-FINAL003.pdf.
  • Hemmati, N., S. Omrani, and N. Hemmati, 2013: A comparison of internet-based learning and traditional classroom lecture to learn CPR for continuing medical education. Turk. Online J. Distance Educ., 14 (1), 256–265, https://eric.ed.gov/?id=EJ1006264.
  • Henttonen, K., and K. Blomqvist, 2005: Managing distance in a global virtual team: The evolution of trust through technology-mediated relational communication. Strateg. Change, 14, 107–119, https://doi.org/10.1002/jsc.714.
  • Holmes, B., G. Scarrow, and M. Schellenberg, 2012: Translating evidence into practice: The role of health research funders. Implement. Sci., 7, 39, https://doi.org/10.1186/1748-5908-7-39.
  • Jarvenpaa, S. L., and D. E. Leidner, 1999: Communication and trust in global virtual teams. Organ. Sci., 10, 791–815, https://doi.org/10.1287/orsc.10.6.791.
  • Jones, L., C. Champalle, S. Chesterman, L. Cramer, and T. A. Crane, 2016: Constraining and enabling factors to using long-term climate information in decision-making. Climate Policy, 17, 551–572, https://doi.org/10.1080/14693062.2016.1191008.
  • Kettle, N. P., and S. F. Trainor, 2015: The role of remote engagement in supporting boundary chain networks across Alaska. Climate Risk Manage., 9, 6–19, https://doi.org/10.1016/j.crm.2015.06.006.
  • Kirchhoff, C. J., 2013: Understanding and enhancing climate information use in water management. Climatic Change, 119, 495–509, https://doi.org/10.1007/s10584-013-0703-x.
  • Kirchhoff, C. J., M. Lemos, and S. Dessai, 2013: Actionable knowledge for environmental decision making: Broadening the usability of climate science. Annu. Rev. Environ. Resour., 38, 393–414, https://doi.org/10.1146/annurev-environ-022112-112828.
  • Kirchhoff, C. J., R. Esselman, and D. Brown, 2015a: Boundary organizations to boundary chains: Prospects for advancing climate science application. Climate Risk Manage., 9, 20–29, https://doi.org/10.1016/j.crm.2015.04.001.
  • Kirchhoff, C. J., M. C. Lemos, and S. Kalafatis, 2015b: Narrowing the gap between climate science and adaptation action: The role of boundary chains. Climate Risk Manage., 9, 1–5, https://doi.org/10.1016/j.crm.2015.06.002.
  • Klenk, N. L., K. Meehan, S. L. Pinel, F. Mendez, T. Lima, and D. M. Kammen, 2015: Stakeholders in climate science: Beyond lip service? Science, 350, 743–744, https://doi.org/10.1126/science.aab1495.
  • Lach, D., and S. Rayner, 2017: Are forecasts still for wimps? J. Southwest, 59, 245–263, https://doi.org/10.1353/jsw.2017.0013.
  • Landry, R., M. Lamari, and N. Amara, 2003: The extent and determinants of the utilization of university research in government agencies. Public Adm. Rev., 63, 192–205, https://doi.org/10.1111/1540-6210.00279.
  • Lemos, M. C., and B. J. Morehouse, 2005: The co-production of science and policy in integrated climate assessments. Global Environ. Change, 15, 57–68, https://doi.org/10.1016/j.gloenvcha.2004.09.004.
  • Lemos, M. C., C. J. Kirchhoff, and V. Ramprasad, 2012: Narrowing the climate information usability gap. Nat. Climate Change, 2, 789–794, https://doi.org/10.1038/nclimate1614.
  • Lemos, M. C., C. J. Kirchhoff, S. E. Kalafatis, D. Scavia, and R. B. Rood, 2014: Moving climate information off the shelf: Boundary chains and the role of RISAs as adaptive organizations. Wea. Climate Soc., 6, 273–285, https://doi.org/10.1175/WCAS-D-13-00044.1.
  • Lemos, M. C., J. C. Arnott, N. M. Ardoin, K. Baja, A. T. Bednarek, A. Dewulf, and Coauthors, 2018: To co-produce or not to co-produce. Nat. Sustainability, 1, 722–724, https://doi.org/10.1038/s41893-018-0191-0.
  • McNie, E. C., 2007: Reconciling the supply of scientific information with user demands: An analysis of the problem and review of the literature. Environ. Sci. Policy, 10, 17–38, https://doi.org/10.1016/j.envsci.2006.10.004.
  • McNie, E. C., 2013: Delivering climate services: Organizational strategies and approaches for producing useful climate-science information. Wea. Climate Soc., 5, 14–26, https://doi.org/10.1175/WCAS-D-11-00034.1.
  • Meadow, A. M., D. B. Ferguson, Z. Guido, A. Horangic, G. Owen, and T. Wall, 2015: Moving toward the deliberate co-production of climate science knowledge. Wea. Climate Soc., 7, 179–191, https://doi.org/10.1175/WCAS-D-14-00050.1.
  • Means, B., Y. Toyama, R. Murphy, M. Bakia, and K. Jones, 2009: Evaluation of evidence-based practices in online learning. U.S. Department of Education, 93 pp., https://eric.ed.gov/?id=ED505824.
  • Means, B., Y. Toyama, R. Murphy, and M. Baki, 2013: The effectiveness of online and blended learning: A meta-analysis of the empirical literature. Teach. Coll. Rec., 115, 1–47, https://www.sri.com/work/publications/effectiveness-online-and-blended-learning-meta-analysis-empirical-literature.
  • Moss, R. H., 2016: Assessing decision support systems and levels of confidence to narrow the climate information “usability gap.” Climatic Change, 135, 143–155, https://doi.org/10.1007/s10584-015-1549-1.
  • Moss, R. H., and Coauthors, 2013: Hell and high water: Practice-relevant adaptation science. Science, 342, 696–698, https://doi.org/10.1126/science.1239569.
  • NRC, 2010: Informing an Effective Response to Climate Change. America’s Climate Choices, National Research Council, 346 pp., http://www.nap.edu/catalog.php?record_id=12784.
  • Parris, A. S., G. M. Garfin, K. Dow, R. Meyer, and S. L. Close, Eds., 2016: Climate in Context: Science and Society Partnering for Adaptation. Wiley, 304 pp.
  • Pidgeon, N., and B. Fischhoff, 2011: The role of social and decision sciences in communicating uncertain climate risks. Nat. Climate Change, 1, 35–41, https://doi.org/10.1038/nclimate1080.
  • Rich, R., Ed., 1981: The Knowledge Cycle. SAGE Publications, 222 pp.
  • Riopelle, K., and Coauthors, 2003: Context, task, and the evolution of technology use in global virtual teams. Virtual Teams That Work: Creating Conditions for Virtual Team Effectiveness, C. Gibson and S. G. Cohen, Eds., John Wiley & Sons, 239–264.
  • Rowe, A. K., D. de Savigny, C. F. Lanata, and C. G. Victora, 2005: How can we achieve and maintain high-quality performance of health workers in low-resource settings? Lancet, 366, 1026–1035, https://doi.org/10.1016/S0140-6736(05)67028-6.
  • Silk, K. J., E. K. Perrault, S. Ladenson, and S. A. Nazione, 2015: The effectiveness of online versus in-person library instruction on finding empirical communication research. J. Acad. Libr., 41, 149–154, https://doi.org/10.1016/j.acalib.2014.12.007.
  • Tseng, V., 2012: Commentary on the uses of research in policy and practice. Soc. Policy Rep., 26, 1–24, https://doi.org/10.1002/j.2379-3988.2012.tb00071.x.
  • Vogel, J., E. McNie, and D. Behar, 2016: Co-producing actionable science for water utilities. Climate Serv., 2–3, 30–40, https://doi.org/10.1016/j.cliser.2016.06.003.
  • Wall, T. U., A. M. Meadow, and A. Horganic, 2017: Developing evaluation indicators to improve the process of coproducing usable climate science. Wea. Climate Soc., 9, 95–107, https://doi.org/10.1175/WCAS-D-16-0008.1.
  • Webb, J., J. Stockwell, and Y. Chavez-Ugalde, 2017: The reach, adoption, and effectiveness of online training for healthcare professionals. Public Health, 153, 107–110, https://doi.org/10.1016/j.puhe.2017.08.016.
  • Weiss, C. H., 1979: The many meanings of research utilization. Public Adm. Rev., 39, 426–431, https://doi.org/10.2307/3109916.
  • Williams, C., A. Fenton, and S. Huq, 2015: Knowledge and adaptive capacity. Nat. Climate Change, 5, 82–83, https://doi.org/10.1038/nclimate2476.


  • Fig. 1.

    Study 1: Mean ratings with 95% confidence intervals of climate data perceptions for scientist-led vs. self-guided trainings. All measures are on 7-point scales, with higher values indicating greater endorsement.

  • Fig. 2.

    Study 2: Mean ratings with 95% confidence intervals of presentation quality across treatment conditions, measured on (a) 5-point scales and (b) 7-point scales. Mean ratings were compared using one-way analysis of variance (ANOVA) with Tukey or Games–Howell post hoc tests, as appropriate.

  • Fig. 3.

    Study 2: Mean ratings with 95% confidence intervals of climate data usability (in general). All measures are on 5-point scales with higher values indicating greater endorsement. Mean ratings were compared using one-way analysis of variance (ANOVA) with Tukey or Games–Howell post hoc tests, as appropriate.

  • Fig. 4.

    Study 2: Influence of different types of climate data on the investment decision, by experimental condition. Participants rated how much each type of data influenced their decision about treating harmful algal blooms (HABs) on a scale from 1 = not at all to 5 = very much.

  • Fig. 5.

    Study 2: Mean ratings of credibility and fit for each of the six types of climate data presented.
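
The captions for Figs. 1–3 describe the statistical treatment behind the plots: per-condition mean ratings with 95% confidence intervals, compared by one-way ANOVA with Tukey (or Games–Howell) post hoc tests. As an illustration only, the minimal Python sketch below (using SciPy and statsmodels) shows how such a comparison can be run; the ratings, group sizes, and condition labels are fabricated for the example and are not the authors’ data or analysis code.

    # Minimal sketch of the procedure named in the captions for Figs. 1-3:
    # condition means with 95% confidence intervals, an omnibus one-way
    # ANOVA, and Tukey HSD post hoc comparisons. All ratings are fabricated.
    import numpy as np
    from scipy import stats
    from statsmodels.stats.multicomp import pairwise_tukeyhsd

    rng = np.random.default_rng(42)

    # Hypothetical 5-point usability ratings for three interaction
    # conditions (n = 30 each, illustrative only).
    conditions = {
        "in-person": rng.integers(1, 6, size=30).astype(float),
        "webinar": rng.integers(1, 6, size=30).astype(float),
        "self-guided": rng.integers(1, 6, size=30).astype(float),
    }

    # Mean rating and t-based 95% confidence interval per condition,
    # i.e., the quantities plotted in the figures.
    for name, x in conditions.items():
        mean = x.mean()
        half_width = stats.t.ppf(0.975, df=len(x) - 1) * stats.sem(x)
        print(f"{name:>12}: M = {mean:.2f}, "
              f"95% CI [{mean - half_width:.2f}, {mean + half_width:.2f}]")

    # Omnibus one-way ANOVA across the three conditions.
    f_stat, p_value = stats.f_oneway(*conditions.values())
    print(f"ANOVA: F = {f_stat:.2f}, p = {p_value:.3f}")

    # Tukey HSD pairwise comparisons (used when group variances are
    # homogeneous; Games-Howell would replace this step otherwise).
    ratings = np.concatenate(list(conditions.values()))
    groups = np.repeat(list(conditions.keys()), 30)
    print(pairwise_tukeyhsd(ratings, groups))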
