Usable Science? The U.K. Climate Projections 2009 and Decision Support for Adaptation Planning

Samuel Tang King’s Centre for Risk Management, Department of Geography, King’s College London, London, United Kingdom

and
Suraje Dessai Sustainability Research Institute and ESRC Centre for Climate Change Economics and Policy, School of Earth and Environment, University of Leeds, Leeds, United Kingdom, and Climate Change Impacts, Adaptation, and Mitigation Research Group, Faculty of Sciences, University of Lisbon, Lisbon, Portugal


Abstract

With future changes in climate being inevitable, adaptation planning has become a policy priority. A central element in adaptation planning is scientific expertise and knowledge of what the future climate may hold. The U.K. Climate Projections 2009 (UKCP09) provide climate information designed to help those needing to plan how to adapt to a changing climate. This paper attempts to determine how useful and usable UKCP09 is for adaptation decision making. The study used a mixed-methods approach that includes analysis of adaptation reports, a quantitative survey, and semistructured interviews with key adaptation stakeholders working in the science–policy interface, which included decision makers, knowledge producers, and knowledge translators. The knowledge system criteria were used to assess the credibility, legitimacy, and saliency of UKCP09 for each stakeholder group. It emerged that stakeholders perceived UKCP09 to be credible and legitimate because of its sophistication, funding source, and the scientific reputation of organizations involved in UKCP09’s development. However, because of the inherent complexities of decision making and a potentially greater diversity in users, UKCP09’s saliency was found to be dependent upon the scientific competence and familiarity of the user(s) in dealing with climate information. An example of this was the use of Bayesian probabilistic projections, which improved the credibility and legitimacy of UKCP09’s science but reduced the saliency for decision making. This research raises the question of whether the tailoring of climate projections is needed to enhance their salience for decision making, while recognizing that it is difficult to balance the three knowledge criteria in the production of usable science.

Corresponding author address: Samuel Tang, King’s Centre for Risk Management, Department of Geography, King’s College London, London WC2R 2LS, United Kingdom. E-mail: samuel.tang@kcl.ac.uk


1. Introduction

Scientific expertise, knowledge, and progress are perceived to be key reference points in policy-making (Braun and Kropp 2010; Kropp and Wagner 2010), making science a fundamental global commodity. In fact, within the United Kingdom demand for scientific information to support policy and investment decisions has grown rapidly ever since bold commitments were made in the 1999 white paper "Modernizing Government," in which the U.K. government invested significant political currency in evidence-based policy-making (Young et al. 2002; Sutcliffe and Court 2005). Producing and disseminating comprehensive, robust, and trustworthy scientific information to inform policy design is therefore essential (Dilling and Lemos 2011).

An emerging policy priority where scientific information is considered to be particularly important for decision-making is adaptation planning (or governance), which, in contrast to mitigation, aims to deal with the consequences rather than the causes of climate change. Adaptation—“the adjustment in natural or human systems in response to actual or expected climatic stimuli or their effects, which moderates harm or exploits beneficial opportunities” (Parry et al. 2007, p. 6)—aims to reduce the negative impacts (and exploit any benefits) from actual or expected climatic changes (Füssel 2007).

In the United Kingdom, adaptation planning emerged as a policy issue in 1997 when the U.K. Climate Impacts Programme (UKCIP) was established (McKenzie-Hedger et al. 2006) and has since risen to greater prominence, particularly with the passing of the Climate Change Act 2008. The Act provides the Government with special "Adaptation Reporting Powers" to request that "bodies with functions of a public nature" and "statutory undertakers" (e.g., utility companies and harbor authorities) report on the risks and benefits posed by changes in climate and how they plan to adapt to them (Defra 2011a). In addition, the Act requires the Government to undertake a U.K.-wide Climate Change Risk Assessment every five years (the first assessment of its kind was published on 25 January 2012) to provide an evidence base that helps in understanding climate change risks and informs the development of a National Adaptation Programme (to be published in 2013). However, while the Government is keen to encourage adaptation action at all levels of society, informed by the best available scientific information, research has identified various obstacles to its effective use in policy-making (e.g., Demeritt and Langdon 2004; Gawith et al. 2009; Arnell 2011; Reeder and Ranger 2011). Consequently, it is possible to question the practical usability of science being produced to inform policy and decision making.

The United Kingdom has a long history of producing climate change scenarios/projections (see Hulme and Dessai 2008a,b), with the latest disseminated in 2009. Conceived in 2003, these projections, known as UKCP09, were funded with £11 million from the Department for Environment, Food and Rural Affairs (Defra) and the Department of Energy and Climate Change (DECC), with the Met Office (MO) acting as the lead agency (alongside other organizations) to develop a state-of-the-art, freely available set of projections of future changes in U.K. climate (U.K. Climate Projections 2011a). These projections have experienced significant uptake, resulting in their emergence as the "standard benchmark set of climate information in use by the U.K. impacts and adaptation community" (UKCIP 2011a, p. 28). Yet, few observations and assessments have been undertaken to determine the efficacy of that investment and how the information translates into informing decision making. Therefore, given that the Government has requested key infrastructure providers to report on adaptation measures, and in light of the significant financial investment in climate projections, it is timely to consider whether, how, and why U.K. climate information is being used to inform adaptation decision making.

This paper utilizes UKCP09 as a case study to investigate the science–policy interface. It will examine if key stakeholders (decision makers, knowledge producers, and knowledge translators) perceive UKCP09 to be usable for adaptation decision making. The paper consists of the following: Section 2 contextualizes the paper within the science–policy interface literature; section 3 introduces UKCP09; section 4 presents the research methods employed; sections 5 and 6 assess and discuss the findings; and finally section 7 identifies a number of conclusions.

2. The science–policy nexus

a. Modes of science

The traditional method of producing science for policy, herein called mode-1 science (commonly known as the linear model or loading-dock approach), assumes that more science will result in better decision outcomes; for example, that the quantification and reduction of uncertainties will lead to better decision making. Yet, attempts at utilizing mode-1 science for policy have experienced variable success, leading a number of researchers to speculate about a "disconnect" between the science produced ostensibly to inform decision making and actual policy processes (Lemos and Morehouse 2005; McNie 2007; Sarewitz and Pielke 2007; Dilling and Lemos 2011; Meyer 2011). A commonly cited reason for this disconnect is the realization that mode-1 science is now outdated because it makes "a number of unsubstantiated assumptions about the resources, capabilities and motivations of research users" (Eden 2011, p. 12), including that the science produced is expected and presumed to be useful (and usable) in helping intended recipients (and society) address problems they may face (Dilling 2007a).

However, crucially, research has shown that a whole range of contextual and intrinsic factors affects decision-making, including informal and formal institutional barriers, the nature of the decision and policy goals, the spatial and temporal resolution of the information, the level of skill required to utilize the information, and the level of trust, among others (Cash et al. 2003; Lemos and Morehouse 2005; Dilling 2007a; McNie 2007; Sarewitz and Pielke 2007; Hulme and Dessai 2008b; Kirchhoff 2010; Lemos and Rood 2010; Dilling and Lemos 2011; Eden 2011). In essence, therefore, mode-1 science oversimplifies the complexities within the science–policy interface.

Consequently, alternative models and relationships have been suggested that emphasize and recognize the need for stronger linkages between science and society, in order for science to more effectively assist decision making. Although different in their details, "mode-2" (Nowotny et al. 2001; Lemos and Morehouse 2005), "post-normal" (Funtowicz and Ravetz 1993), and "use-inspired" science [Stokes 1997, cited in Dilling (2007b)] all aim to improve the connection between supply and demand by being socially distributive, application-oriented, transdisciplinary, and subject to multiple accountabilities, and by encouraging knowledge producers to consider the social, physical, institutional, and political context of decision makers (Dilling 2007a; McNie 2007; Sarewitz and Pielke 2007). Effective decision support emerges when the information decision makers need is identified and aligned with what is feasible for science to deliver (NRC 2009).

Furthermore, the creation of “boundary organizations” and “boundary objects” helps improve the usability of science by linking science and policy across different levels. This is achieved by facilitating a better exchange between stakeholders creating the science (knowledge producers) and stakeholders writing the policies (decision makers) through enhanced emphasis on iteration and interaction (Guston 1999; Cash 2001; Lemos and Morehouse 2005; Kirchhoff 2010; Dilling and Lemos 2011).

Despite the principles and arguments for mode-2 science, doubt remains over the usability of the information produced, both because of difficulties in addressing the contextual and intrinsic factors that affect decision making and because different actors perceive the usefulness of scientific information differently (Lemos and Rood 2010). In addition, it has been suggested that science has moved beyond the capabilities of societal understanding and implementation (McNie 2007; Tribbia and Moser 2008; Braun and Kropp 2010), since more accurate science does not necessarily make decisions easier. Hence, it has become "a sociological truism today that a greater supply of knowledge will not ensure a greater degree of certainty in decision-making" (Kropp and Wagner 2010, p. 813). Therefore, although the theory implies that science produced in this manner will be more practical and usable for decision makers, in practice it remains hard to distinguish what constitutes better (usable) science.

b. Knowledge system criteria for usable science

A number of researchers have suggested that science for policy needs to be considered holistically as a knowledge system consisting of three quality criteria (Cash et al. 2003; McNie 2007). Specifically, for scientific information to be useful and usable, decision-makers must perceive it "to not only be credible, but also salient and legitimate" (Cash et al. 2003, p. 8086); that is, they simultaneously perceive the information's technical evidence and arguments to be scientifically sound, relevant to their needs, and produced (and distributed) in an unbiased and transparent manner that considered, among other factors, potential opposing views, values, and beliefs (Cash et al. 2003; Hulme and Dessai 2008b; Munang et al. 2011).

In order for scientific information to demonstrate these criteria, each criterion must exhibit various distinctive characteristics that decision makers recognize. For instance, information is likely to be deemed credible if the science is accurate, valid, of high quality, supported by some form of peer review, and funded by one or more recognized or established institutions. To be legitimate, it must have been produced and disseminated in a transparent, open, and observable way that is free from political suasion or bias. To be salient, information must appear context sensitive and specific to the demands of a decision maker across ecological, spatial, temporal, and administrative scales.

However, stakeholders generally have different perceptions of what makes credible, legitimate, and salient information (Cash et al. 2003; Lemos and Morehouse 2005; Lemos and Rood 2010; Dilling and Lemos 2011). As a result, the criteria cannot simply be applied without case-specific consideration of the user(s). Difficulties arise from two complex linkages between the criteria: first, if the science is perceived to be seriously lacking in any of the criteria, its likelihood of producing influential information falls significantly; and second, because of tight trade-offs among the criteria, efforts to enhance one can succeed at the expense of one or more of the others, undermining the information's overall influence (Cash et al. 2003).

In spite of these difficulties, the knowledge system criteria are a good indicator for assessing stakeholders' perspectives of what constitutes usable science because they consider the entire process (from inception to dissemination) of the science in question. Indeed, credibility can be used to assess stakeholders' perceptions of the quality of science underpinning the disseminated information; legitimacy can assess stakeholders' perceptions of the level of transparency and bias of the individuals and institutions involved in its development; and saliency directly assesses stakeholders' perceptions of its relevance to their needs and requirements.

3. U.K. Climate Projections 2009

Climate change projections (or scenarios) are increasingly visible in national and international public policy debates. Based upon peer-reviewed science, projections provide quantitative or semiquantitative descriptions of possible future climates that carry considerable authority. Projections are conditional upon the emission scenario considered.

In the United Kingdom, the first government-funded scenarios were published in 1991. Five generations later, the latest suite of projections, UKCP09 (released in June 2009), represents seven years' work by a consortium of organizations including Defra, UKCIP, and MO. UKCP09 provides projections of future changes in climate compared to a 1961–90 baseline. These projections were "purposefully designed to meet the needs of a wide range of people who will want to assess potential impacts of the projected future climate and explore adaptation options to address those impacts" (U.K. Climate Projections 2011b). To achieve this, UKCP09 delivered a wealth of climate information, including a briefing report, climate change land projections (e.g., variables of temperature and precipitation), marine and coastal projections (e.g., variables of storm surge and sea level changes), observed trends in climate data, a weather generator, an 11-member regional climate model output ensemble (Jenkins et al. 2009; Street et al. 2009; UKCIP 2011a), and more recently (April 2012), spatially coherent projections and a newer version of the weather generator.

Compared to previous projections, UKCP09 offers users much greater detail and complexity. For example, for the first time, climate projections quantify uncertainties explicitly in a probabilistic fashion; the 25-km (instead of 50-km) grid squares provide greater spatial resolution, as do predefined aggregated areas, which offer more specialized climate information for administrative regions, river basins, and some marine regions. In addition, UKCP09's management process encouraged greater input from decision makers through the creation of a user panel to ensure that a wide range of opinions were considered and to produce the most comprehensive package of climate information.

UKCP09 offers users more functionality than ever before. For instance, decision makers can now assign probabilities to different future climate outcomes (conditional on the selected emission scenario) and reflect on the uncertainties of data in more detail; and UKCP09’s User Interface allows data to be visualized and interrogated to produce maps and graphs or be downloaded as numerical outputs, thus providing specific extraction and manipulation of data. However, as with any suite of climate information, various uncertainties exist [e.g., modeling uncertainty, natural climate variability, and emissions uncertainty; for more information, see Jenkins et al. (2009)]. Furthermore, using probabilistic projections is not without controversy, since the type of probability used (i.e., Bayesian) is not necessarily the type decision makers are familiar with or want (Dessai and Hulme 2004; Stainforth et al. 2007). Bayesian projections are often less favored by decision makers because of their difficulty in practical application, which encourages a less robust decision-making approach (Smith et al. 2009; Arnell 2011; Reeder and Ranger 2011).
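
To give a concrete sense of how such probabilistic output is typically read, the sketch below shows one way a user might turn a handful of percentile values into an exceedance probability. The percentile values, the threshold, and the helper function are purely illustrative assumptions and are not actual UKCP09 data or tooling.

```python
import numpy as np

# Hypothetical, illustrative values only (not actual UKCP09 output):
# cumulative probability levels and the corresponding projected change in
# summer mean temperature (degrees C) for one grid square, one time
# period, and one emission scenario.
prob_levels = np.array([0.10, 0.33, 0.50, 0.67, 0.90])
temp_change = np.array([1.1, 1.9, 2.4, 2.9, 3.9])  # assumed values

def prob_of_exceeding(threshold_c):
    """Estimate P(change > threshold) by linearly interpolating the
    cumulative distribution implied by the percentile values above."""
    cdf_at_threshold = np.interp(threshold_c, temp_change, prob_levels,
                                 left=0.0, right=1.0)
    return 1.0 - cdf_at_threshold

# Example: chance that summer warming exceeds 3 degrees C under the
# assumed distribution (still conditional on the chosen emission scenario).
print(f"P(change > 3.0 C) ~= {prob_of_exceeding(3.0):.2f}")
```

Reading the projections this way keeps the conditionality explicit: the probabilities describe the spread of modeled outcomes under one emission scenario rather than a forecast of what will happen.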

4. Methods

To assess the usability of UKCP09, the research focused on the perceptions of three distinct groups of adaptation stakeholders. These were "knowledge producers" involved in developing UKCP09 (or predecessor projections) or conducting academic research with them; "knowledge translators" providing specialist consultancy services to organizations responsible for adaptation planning and policy-making; and "decision makers" within organizations with adaptation duties.

Data collection involved a mixed methods approach combining an online questionnaire, semistructured interviews, and content analysis of 95 adaptation reports, which were produced in response to the adaptation reporting power. These reports were written by a range of stakeholders including benchmark organizations (n = 8; e.g., Environment Agency and Network Rail), water (n = 21), electricity generators (n = 9), electricity distributors and transmitters (n = 8), gas transporters (n = 7), road and rail (n = 4), ports (n = 9), aviation (n = 10), lighthouse authority (n = 1), regulators (n = 7), and public bodies (n = 11) [see Defra (2011b) for a full list of published reports]. Content analysis focused on how UKCP09 was utilized.

The survey used a mixture of open-ended, single and multifixed response, and agreement-scaling questions to explore perceptions of UKCP09 and collect basic demographic data. For example, respondents were asked if they had created an adaptation report, whether they had utilized UKCP09 for that report and why, and if they associated the terms credible, legitimate, and salient with UKCP09.

In the summer of 2011, 130 decision makers were e-mailed (Fig. 1), with follow-up e-mails after three and five weeks and a direct call after week six. The survey universe was compiled in two ways. Eighty were selected from organizations included under the Adaptation Reporting Power (Defra 2011c). An additional 50 were chosen to represent sectors not requested by Defra to produce an adaptation report but whose functions (which have a public interest) are likely to be affected by changes in climate; these were further selected on the basis of the size of the organization and the region they manage.

Fig. 1.

A diagram showing sectors of organizations approached to participate in the questionnaire survey. The survey universe consists of sectors (organizations) that were Defra mandated and those that were not mandated to produce an adaptation report. Sectors underlined and highlighted in bold participated in the study.


The response rate was 25% (n = 33/130). Survey responses were initially entered into a spreadsheet for cross tabulation and further statistical analysis. Nominal and ordinal coding was performed to help quantify responses and identify patterns. Cross tabulation between sectors was performed in order to draw comparisons between sectoral perceptions of UKCP09.
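
As a rough illustration of the coding and cross-tabulation step described above, the sketch below applies an ordinal coding to a made-up extract of responses and tabulates sector against the agreement scale. The column names, example records, and use of pandas are assumptions for illustration and do not reproduce the study's data or analysis.

```python
import pandas as pd

# Hypothetical survey extract (illustrative only; not the study's data).
responses = pd.DataFrame({
    "sector": ["water", "water", "transport", "energy", "environment"],
    "salient": ["Extremely", "Quite a bit", "Moderately",
                "Quite a bit", "Quite a bit"],
})

# Ordinal coding of the agreement scale used in the survey.
scale = {"Not at all": 0, "A little": 1, "Moderately": 2,
         "Quite a bit": 3, "Extremely": 4}
responses["salient_code"] = responses["salient"].map(scale)

# Cross-tabulate sector against the saliency rating as row percentages
# to compare sectoral perceptions.
table = pd.crosstab(responses["sector"], responses["salient"],
                    normalize="index") * 100
print(table.round(0))
```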

A follow-up round of interviews conducted with all three stakeholder groups explored in more detail findings emerging from the questionnaire survey. For example, stakeholders were asked if they were familiar with science like UKCP09, whether they had extensively used UKCP09 (how, why, and what for), if they required expert help to utilize UKCP09, if they were aware of other sources (and had they used them), and whether communicating known sources of uncertainties and some information as Bayesian projections affected the usability of UKCP09.

Whereas decision-maker interviewees were identified through the survey, knowledge producers were identified from published lists of contributors to the development of UKCP09 (i.e., the U.K. Climate Projections 2011c and UKCIP 2011b websites), while knowledge translators were identified from a web-based search (on Google Scholar). All individuals were contacted initially via e-mail, with follow-up e-mails after two and four weeks (no direct follow-up calls were undertaken). Table 1 illustrates our interview sample, including each interviewee's area of expertise, employer sector, and relationship to UKCP09 (self-assessed).

Table 1. Summary of the interviewee participant population.

Interviews were taped and transcribed verbatim. Following transcription, content analysis was applied to identify response themes. The theme categorization used was based on the knowledge system criteria (credibility, legitimacy, and saliency). Stakeholder groups were initially analyzed on their own and then compared to the two other groups.

To ensure individual and group perception consistency, decision makers' survey and interview responses were compared and then additionally cross-referenced against their relevant Adaptation Reports, which were collected from Defra's website (Defra 2011b). Such methodological triangulation helped assure the quality of the research and the robustness of our interpretation of the findings (Olsen 2004; Guion et al. 2012).

5. Results

a. Initial decision-maker perceptions of UKCP09

Of the 33 respondents, 24 had created or were creating an adaptation report, with nine of these employing commercial (e.g., Jan Brooke Consulting and Met Office Consulting) or noncommercial (e.g., UKCIP) consultants and knowledge translators to assist in the preparation of their adaptation reports. Of these 24 decision makers, 21 utilized UKCP09, representing five sectors: water (n = 7), transport (n = 6), local and regional authority (n = 2), environment (n = 3), and energy (n = 3).

These decision makers were asked to select one reason (It was the best option/Recommended to/No alternative/Other) for why they chose to utilize UKCP09 in their adaptation report. Responses indicated that 10 of 21 utilized UKCP09 because “It was the best option,” four were “Recommended to” use it, two felt “No alternative” existed, and five provided alternate reasons that were positive in nature; for example, “UKCP09 is the most up-to-date sophisticated projections” and “UKCP09 supplemented information previously developed.” Among these decision makers, UKCP09 has a positive reputation and is perceived to be an important source of information. Indeed, analysis of published Adaptation Reports indicates that the majority utilized UKCP09 in their report. Analysis also highlighted several additional reasons for why UKCP09 was utilized, including that it represents an updated version of previous projections with advancements in knowledge and information; it provides the tools to undertake quantitative options analysis; it is the most definitive evidence base on the U.K.’s future climate; and it is perceived as a highly reliable dataset.

As for the three nonusers of UKCP09, they unfortunately did not provide direct reasons for not utilizing the projections; however, one respondent noted that they instead used a combination of information sources consisting of the UKCIP Local Climate Impacts Profile (LCLIP), a self-administered media trawl, and various local case studies from local officers.

b. Credibility and legitimacy

Survey and interview responses indicate that UKCP09 is perceived as credible and legitimate. For example, decision makers were asked in the survey to choose how much they agreed (Not at all/A little/Moderately/Quite a bit/Extremely/No opinion) with using the terms “credible” and “legitimate” to describe utilization of UKCP09. Results indicate that primarily UKCP09 is described as “quite a bit” credible (63%) and legitimate (52%), while 26% and 37% chose to describe UKCP09 as “extremely” credible and legitimate, respectively.

It also emerged that stakeholders perceived the two criteria to be overlapping concepts that are difficult, in practice, to distinguish from one another. For example, decision-maker B ran the two concepts together in discussing the open communication of uncertainties:

I think it’s more credible because it’s a realistic and honest approach. (Decision-maker B).

Decision-maker B denotes credibility through the use of "realistic" (which is a synonym for credible) and legitimacy through the use of "honest," which implies they perceived the process to be open due to the explicit discussion of uncertainties. Therefore, while in theory credibility and legitimacy are distinct, in practice they are perceived to be so closely intertwined that the typology is hard to use.

Stakeholder groups provided different reasons for why they judged UKCP09 to be credible and legitimate. Decision makers tended to stress the importance of UKCP09 being government funded and nationally (and internationally) recognized.

It’s essential that it’s a national thing. It’s credible that it’s endorsed by those various different organizations and used uniformly. I think it’s really key. (Decision-maker B).

Decision makers believed other information sources, without government approval, were not as credible and legitimate:

Actually I don’t see much point in getting another tool that doesn’t have the UK Government stamp of approval on it. (Decision-maker A).

This perception of government approval resulted in decision makers considering UKCP09 to represent a common framework for all sectors to utilize when assessing future climate risks. Decision makers perceived that by utilizing something that is nationally accepted (e.g., UKCP09), their results will be accepted by and compliant with the demands of the government regulator, like the Environment Agency:

…let’s say we’re doing some kind of project that requires Environment Agency sign off and approval. If you’re actually using a tool that isn’t actually nationally recognized, then you have to go through this process or persuasion of what you’ve actually got is fit for the job. If you’ve got something that actually is nationally accepted, the results are accepted, processes of using it are accepted, then actually what it means is that from our perspective the processes go a lot smoother. (Decision-maker A).

For this decision maker, it was the credibility of UKCP09 with the regulator that mattered. Its scientific reputation was less important than the promise that the resulting adaptation would meet with regulatory approval from government. That was echoed by others:

Using UKCP09 also allows Defra and anyone else to compare plans across the water industry and other industry’s [sic] plans if required. (Decision-maker J).

This touches on Rothstein et al.'s (2006) argument about institutional risks, namely that failure to utilize science, in this case UKCP09, allows for the creation of blame, accountability, and reputational damage. However, if decision makers do include the science, and the risk still occurs, adapting organizations are at least safeguarded against the most extreme sociopolitical criticisms. Therefore, by using UKCP09 decision makers are minimizing their institutional exposure.

In contrast, credibility and legitimacy for knowledge producers and knowledge translators emerged from the incorporation of Bayesian probabilistic projections, which they perceived as enhancing scientific accuracy and validity. Specifically, they perceived Bayesian projections as encouraging uncertainties to be further explored and/or allowing uncertainties to be accommodated in adaptation planning. We found a belief that using UKCP09 should lead to better decisions (consistent with the linear model of science):

I think it [Bayesian probabilistic projections] enhances credibility. Importantly, it makes people realize the inherent uncertainties and should lead to better planning. (Knowledge producer H).

Significantly, this difference between stakeholder groups' reasons (decision makers versus knowledge producers and knowledge translators) for why they perceive UKCP09 to be credible and legitimate begins to raise wider implications for the knowledge system criteria. In particular, it indicates that stakeholders are likely to consider differently what makes UKCP09 usable for decision-making, an issue that has been raised in previous research (Cash et al. 2003; Lemos and Morehouse 2005; Lemos and Rood 2010; Dilling and Lemos 2011). Furthermore, this points to some important underlying differences in understandings of the applications of climate information and thus of the saliency of UKCP09 for decision making.

c. Saliency

Unlike credibility and legitimacy, perception of saliency is less consistent among stakeholders. Decision makers, in particular, were split in how they described UKCP09’s saliency. When asked in the survey to choose how much they agreed with using the term, 14% chose “a little,” 33% chose “moderately,” 33% chose “quite a bit,” 14% chose “extremely,” and 6% had “no opinion.” In addition, the range indicates that perception of saliency is less positive than credibility and legitimacy, as 47% of saliency responses were positive (33% quite a bit, 14% extremely) whereas 89% of responses were positive for both credibility (63% quite a bit, 26% extremely) and legitimacy (52% quite a bit, 37% extremely). Notably this variation is also shown in a sectoral comparison. Specifically, in terms of modal response, 42% of the water sector felt UKCP09 was “extremely” salient, 67% of energy and 100% of environment perceived it as “quite a bit” salient, and 83% of transport perceived it as “moderately” salient, while local authority responses were split equally between “a little” (50%) and “moderately” (50%).

When pressed further on the issue during interviews, decision makers stressed the complexity of UKCP09 and the difficulties of using its raw outputs in decision-making. The below quotation is typical of the views expressed by four decision makers:

…in terms of creating our adaptation report and adaptation strategy there was less using of UKCP[09]’s outputs and more using of the stuff that is there in the maps that is used for public consumption rather than any sort of raw data that comes from UKCP[09]. (Decision-maker F).

Instead of using the full technical capabilities of UKCP09 that so impressed knowledge producers, decision makers preferred simply to borrow from heavily digested summary reports that were less complex (e.g., 67% used the land projections and only 19% used the spatially coherent projections). This tendency was also demonstrated through analysis of the adaptation reports. For example, Manchester Airports Group (2011) believed the inclusion of certain specific variables of temperature and precipitation data, such as relative humidity and cloud amount, would have introduced unnecessary complexity for their planning. Similarly, as Severn Trent Water Ltd. (2011, p. 48) put it, "the UKCP09 data and tools are so wide ranging it is difficult to know which is the best method/tool/dataset to use."

Additionally, adaptation report analysis highlighted that, in spite of UKCP09 being perceived as invaluable in aiding planning, it did not provide the specific information directly required. A number of reports (National Grid Gas 2010; London Stansted 2011; Port of Sheerness 2011; SP Energy Networks 2012) commented that UKCP09 lacked useful information concerning the frequency and intensity of ice storms, wind (direction and speed), snow storms, lightning storms, heat waves, and droughts. This view was held even in light of the technical notes published by UKCIP in November 2010 (UKCIP 2012a,b), which provide additional advice on these variables, because decision makers perceived that data from these notes were not easy to extract. A few examples are shown below:

  • Severn Trent Water Ltd. (2011, p. 39) stated they could not assess the impact of summer convective storm events on sewer systems because there are limitations in predicting the intensity and frequency of such events while using UKCP09.

  • SP Generation (2011, p. 13) criticized the Weather Generator’s usability, stating it did not constitute “a profound extreme event analysis suitable to assess the change in likelihood of extreme events in the future.”

  • RWE Npower (2011, p. 16) expressed concerns that estimations for the implications of the UKCP09 projections on the “aquatic environment” are not available, resulting in the overreliance on the autonomous (and resource consuming) implementation of supplementary models (such as a rainfall-runoff model).

Besides the lack of salience, some of these statements also point toward a perceived lack of credibility because UKCP09 is seen as weak in certain areas (e.g., summer convective storms). Furthermore, this highlights an apparent contradiction among decision makers, who on the one hand complain about the complexity yet on the other hand state that it leaves out information they require, thus showing the difficulties in appeasing a range and variety of decision makers. Nevertheless, it must also be noted that it is extremely difficult to produce data concerning weather variables such as wind, snow, and lightning storms because these events are fraught with uncertainty. This is a universal shortcoming in what science can currently offer and thus is not unique to UKCP09.

Our findings also suggest that the information UKCP09 provides is one or two steps removed from what decision makers want or need. This is unsurprising, given that UKCP09 is climate information and not the impact information some decision makers would like, an issue directly mentioned by four decision makers and exemplified by the following quotation:

Within our risk assessments the information I need is not climate information, it’s environmental impact information. (Decision-maker D).

Arguably, UKCP09 has a saliency gap in the knowledge it can actually provide for decision making, a finding consistent with emerging research from the sectors, in particular the water and building services industries (see Arnell 2011; Mylona 2012, respectively).

Why UKCP09 has a saliency (but not a credibility or legitimacy) gap can partly be attributed to the incorporation of Bayesian projections, which result in much greater complexity and information richness. Although many interviewed stakeholders (68%) perceive that the inclusion of such information enhances scientific credibility (see section 5b), they perceived the information produced to be difficult to integrate successfully into decision making and to move the individual away from a decision. For example, knowledge producers and knowledge translators, in line with the arguments of Dessai and Hulme (2004), Smith et al. (2009), Arnell (2011), and Reeder and Ranger (2011), believe that decision makers are familiar with a different type of probability that is less complex to interpret and apply. The quotation below is representative of this perception among five knowledge producers and two knowledge translators:

All the probabilistic estimates they did are all very difficult to interpret because they are not probabilities in the way that a decision-making would use probabilities. (Knowledge producer D)

Considering the above quotation and similar responses, there is a perception within the scientific community that Bayesian projections place decision makers into a decision-making arena with which they are somewhat unfamiliar. This demonstrates an ongoing disconnect in the science–policy interface between what scientists produce and what users want or require, creating wider challenges for end users (Shackley and Wynne 1995; Knorr-Cetina 1999). For example, the assessment of climate risk becomes time consuming because thousands of Bayesian projections often serve as an input to impact models (which have their own uncertainties) in order to derive more decision-relevant information (cf. Dessai and Hulme 2007). The challenge is compounded by the fact that whoever undertakes the research is usually not the same individual who makes the decision: typically the actual decision maker is someone from senior management who does not understand the science in great detail (or is not used to dealing with a probabilistic framework) and, given time constraints, wants one answer instead of several possible outcomes to choose from. Therefore, although decision-makers reflected that having a range of outcomes was useful in highlighting uncertainty, in reality they bemoaned how this proliferation tended to complicate decision making.

UKCIP02 gave you a figure, whereas UKCP09 uses this probabilistic approach which I think is a more realistic approach, but in itself trying to write those in a report to your management team is hard. You struggle sometimes with making decisions with that variability, but that’s the reality, they [management] still want to know a figure. (Decision-maker B)

Decision-maker B reaffirms the widespread perception among the sampled stakeholder groups that Bayesian projections reduce the capacity for decision making. In addition, decision-maker B reiterates the view that senior management is unwilling to consider a range of possible outcomes when trying to make cost-effective adaptation strategy decisions. Therefore, although the decisions made are perceived to be more robust and realistic, the actual decision-making process is considered to be harder and less responsive to decision makers' needs.
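
To make concrete why this workflow is demanding, the following sketch mimics the chain described above: a large sample of projected changes feeds a deliberately trivial impact model, and the output handed to a decision maker is a distribution rather than a single figure. The sampling distributions, the coefficients, and the toy demand model are assumptions for illustration only and bear no relation to UKCP09 outputs or any real impact model.

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical sample of projected changes (illustrative only), standing
# in for the thousands of samples a probabilistic projection can supply:
# summer change in temperature (degrees C) and precipitation (%).
d_temp = rng.normal(loc=2.4, scale=0.9, size=10_000)       # assumed spread
d_precip = rng.normal(loc=-15.0, scale=10.0, size=10_000)  # assumed spread

def toy_water_demand_change(dt, dp):
    """Stand-in impact model: demand rises with warming and with drier
    summers (coefficients are illustrative assumptions)."""
    return 3.0 * dt - 0.2 * dp  # percent change in peak demand

impact = toy_water_demand_change(d_temp, d_precip)

# The decision-relevant output is a spread of outcomes, not one number.
p10, p50, p90 = np.percentile(impact, [10, 50, 90])
print(f"Change in peak demand: 10th {p10:.1f}%, median {p50:.1f}%, 90th {p90:.1f}%")
```

Even in this toy setting, the result is a range that someone must then condense for senior management, which is precisely the step interviewees reported finding difficult.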

This highlights wider implications for the science–policy interface. First, effective decision making (for adaptation planning) is limited not only by the science available but also partly by subconscious barriers that organizations have constructed through institutional self-governance. For example, traditional use of, and overreliance on, deterministic information to make decisions has resulted in senior management's reluctance to make decisions that have multiple potential outcomes, because they are used to having to consider only one outcome. Significantly, this finding supports the sentiments of Demeritt and Langdon (2004) and Dilling and Lemos (2011) that the science–policy interface is severely impacted by informal and formal institutional barriers. Second, responses indicate that calls for flexibility in decision making, which would permit adaptation strategies to be scaled up or scaled back as conditions dictate (Lemos and Morehouse 2005; Reeder and Ranger 2011), have yet to be heeded or implemented in practice. This implies that decision-making is still being undertaken through a linear approach, regardless of its negative perception within research spheres and the promotion of alternate approaches (mode-2 science).

This leads us to consider that the science of UKCP09, in particular the use of Bayesian projections, is not solely to blame for the perceived lack of saliency that decision-makers (and other stakeholder groups) feel. An individual’s ability to interpret the data (from the Bayesian projections) and willingness to utilize new methods also affect perceived saliency. A quote from decision-maker D supports this assessment of cognitive capacity gaps among decision makers in utilizing the information:

I think the problem that many people have in terms of decisions-makers [is] they can’t articulate a policy question in a way that makes it easy to interpret that information. … There is a real gap between the way policy questions are framed and the way that scientists and experts need to articulate those questions to use something like [UK]CP09. (Decision-maker D)

Notably, according to this response, who the user is has a major influence on how salient UKCP09 appears. Specifically, we found the user’s familiarity in dealing with climate information and whether they had been scientifically trained affected perceptions of saliency. In fact, when knowledge producers and knowledge translators reflected on their applications of UKCP09 and what made the projections usable to them, the majority (~80% of the combined sample) referred in some way to their scientific training, background, and familiarity. For example, knowledge producer E recognized the value and advantage of being closely involved in its development:

Yeah [it was difficult to interpret the information I used], though I’ve been involved with the background of UKCP09 for the last 5–6 years so I roughly understand what it’s about. … I think it’s virtually impossible for somebody relatively new to pick it up and apply it. (Knowledge producer E).

Subsequently, they naturally perceived that decision-makers who are familiar with climate information and are scientifically trained (e.g., having undergone training from experts or being educated to Ph.D. level) would be able to utilize the projections more effectively.

It’s an enormous amount of information for somebody who is not normally dealing with that sort of thing allied with dealing with issues of understanding probability and all that kind of malarkey, you know it’s quite indigestible if your [sic] coming in cold. (Knowledge translator A)

Significantly, three decision makers acknowledged this perception:

I think if you have a scientific background you are used to using this type of data or the methodologies. If you’re not used to it, then it is harder. (Decision-maker G)

Hence, our findings suggest that the saliency of UKCP09 is enhanced as a user's level of familiarity and scientific competence increases. To a degree this is additionally supported by the survey results, as no midrange decision-makers (who stated that they required medium-detail information) perceived UKCP09 to be "hard" to use, whereas 33% of low-end decision makers (who stated they required low-detail information) did. The range of decision makers able to utilize science effectively for policy is therefore narrow, which has wider implications for the science–policy interface given that increasing numbers of decision makers are using scientific information for purposes other than pure research (UKCIP 2006; Gawith et al. 2009), a trend that is broadening the user community, causing diversity to replace narrowness.

6. Discussion: Interactions of the knowledge system criteria and the implications for the science–policy interface

Stakeholder responses further emphasize the tight tradeoffs observed by Cash et al. (2003), where enhancements in one criterion can negatively affect one or more others. For example, stakeholders perceived that the incorporation of Bayesian-style projections increased the credibility and legitimacy of the science but also perceived that their inclusion reduced the saliency for decisions. With improvements in UKCP09’s credibility apparently coming at the expense of saliency, this raises wider questions for the production of science for policy. For instance, how do you decide which technique to use to satisfy all three criteria? Should more emphasis be placed on one criterion over another? And how do you reconcile the supply and demand of scientific information between knowledge producers and decision-makers?

Tradeoffs are not the only implication to consider. This study additionally highlights that perceived saliency is also largely affected by who the user is. Indeed, for many decision makers the science may be too advanced or not salient enough for them to make sensible decisions (McNie 2007; Sarewitz and Pielke 2007; Tribbia and Moser 2008), a problem recognized by the following quotation, which is representative of four knowledge producers and two knowledge translators:

If there are people who need to know a little bit about what’s going to happen, then I’d say yes definitely use it. If there are people who actually wanted to do some data analysis with it and some modelling work I’d say yes you can use it but use some other sources as well. (Knowledge producer D).

Knowledge producer D affirms the view that, although the dataset is varied, the diversity of users and uses means there is a lack of specific guidance on how to use the data for different types of risks, resulting in reduced usability and the potential misuse of information. This implies that the science–policy interface still lacks the right level of support information that Gawith et al. (2009) called for. Therefore, despite Defra's intention that UKCP09 be developed with a range of uses in mind, in reality its usability is limited.

Arguably this issue is amplified by a mismatch of expectations between what contributing scientists were developing and what Defra intended to receive from its investment. Given how much UKCP09 cost to develop, it is not unreasonable to assume that the Government stressed to Defra that it must make good on the investment. In its "Statutory Guidance to Reporting Authorities," although it is not directly stated, Defra (2009) strongly implies that organizations (many of which were reporting on adaptation measures officially for the first time) should consider utilizing the projections (as a component of the methodology) to help assess the impacts of climate change on their functions. For instance, under the heading "What evidence is available about the future climate?" Defra (2009, p. 8) only explicitly discusses UKCP09, with other pertinent information only briefly mentioned in a supporting capacity. By placing this implicit emphasis on utilizing UKCP09, Defra inadvertently steers decision makers to utilize it when other sources of information may be more relevant. One decision maker, reflecting on others' use of UKCP09, said that

[UK]CP09 is not the first place for them to start, so they need someone to translate that into something more relevant for them. (Decision-maker D)

Another went so far as to say

…I think if we didn’t make any reference to it then you would have to wonder why. I think therefore the reader would wonder why we haven’t made reference to it and would probably think it’s more carelessness on our part than a failing of UKCP09. (Decision-maker F).

These quotations imply that some decision makers use UKCP09 with a degree of wariness or obligation, suggesting that UKCP09 is in danger of becoming a constant or "rite of passage" that must be included when writing adaptation reports. Perhaps inadvertently, the government has created a perception among decision makers that UKCP09 is the only game in town when it comes to adaptation planning. This is also observed elsewhere by Porter and Demeritt (2012), who describe how the Environment Agency's Flood Map acts as an "obligatory passage point" that all decisions for flood planning should be filtered through.

This raises several implications for the science–policy interface. First, as Meyer (2011) noted, expectations between what is wanted as a return from an investment and what can be delivered from that investment need to be managed more closely to ensure the subsequent science is used in the best way possible and is deemed usable. Second, although utilization of the same science allows for national consistency and helps make governance easier, if every decision maker utilizes the same information source the safety net created by diversity in information sources is removed: if the science turns out to be categorically incorrect, then everyone who utilized it will be affected, meaning, in the case of the United Kingdom, that the entire national infrastructure will be particularly vulnerable to changes in climate (cf. Hall 2007). This highlights the dangers of placing too much emphasis on one scientific source of information as a standalone support for policy decisions (Brown 2009), and the need to continually state that other sources must be used in conjunction with specialist information such as UKCP09. These observations are consistent with an emerging literature that emphasizes robust decision making, predicated on identifying strategies that perform well across wide ranges of uncertainty, over a "predict and optimize" approach (Dessai et al. 2009; Lempert and Groves 2010; Wilby and Dessai 2010).

7. Conclusions

Advances in scientific understanding, greater acknowledgment of uncertainty, and greater user input have helped instill credibility and legitimacy in UKCP09. However, this has come at the expense of saliency for decision makers, because saliency is dependent both on their ability to understand and interpret the science and on what information they require. Consequently, although UKCP09 is perceived by decision-makers to represent a common framework for assessing future climate changes because of its credibility and legitimacy, paradoxically it is not actually a common framework for all sectors to utilize, because UKCP09 lacks saliency for some decision makers. This saliency disconnect is in part caused by an increase in users (and range of uses) due to societal pressures and regulatory requirements to plan for a changing climate.

Our findings suggest that we may have reached a limit to the utility of national climate projections. While they have played important roles in the past (pedagogic and motivational, for example; see Hulme and Dessai 2008a,b), they lack salience for adaptation decision making (among many users), which is the primary reason UKCP09 was constructed. This raises the question of whether climate scenarios can truly ever be constructed through mode-2/postnormal science. This study suggests that the large number of users of climate projections now makes this very difficult. Furthermore, it hints at a move from the postnormal science realm to the applied consultancy domain (cf. Funtowicz and Ravetz 1993). This is evident from the important role played by boundary organizations and knowledge brokerage. Hence, one way to enhance the salience of science for adaptation decision making could be through the tailoring of climate and climate impact projections to particular adaptation contexts or problems. One drawback of this approach is that national consistency may be lost, although this could also prove beneficial, because a diversity of approaches may prevent the maladaptation that could follow if only one set of projections were used (and proved incorrect). Attempts at increasing saliency are likely to have impacts on credibility and legitimacy. This study has demonstrated that ultimately the production of usable science requires a careful balancing act between the knowledge system criteria.

One of the limitations of our study is the small number of stakeholders who participated. This makes it difficult to extrapolate wider conclusions for each stakeholder group’s perception. It is likely that with a larger sample, greater variation in perception would emerge. For example, we would expect credibility to erode slightly as we are aware of disagreements among the academic community; for example, one of the reviewers of UKCP09 was concerned that the results were “stretching the ability of current climate science” (Heffernan 2009). Further in-depth, ethnographic work with a wide range of stakeholders is necessary to better understand how climate science is currently informing decision-making and how this process can be improved for greater societal benefits.

Acknowledgments

The authors thank all of the individuals that participated in the study. In addition, thanks should be given to David Demeritt, Megan Gawith, Mike Hulme, James Porter, and Henry Rothstein for commenting on earlier drafts of the paper. Further thanks to Megan Gawith, Sophie Millin, and Anna Steynor-Greenwood (formerly of UKCIP) for sparing their time to discuss this research. Any errors remain our own. Suraje Dessai was supported by the ARCC-Water project funded by EPSRC (EP/G061181/1) and the EQUIP project funded by NERC (NE/H003509/1).

REFERENCES

  • Arnell, N. W., 2011: Incorporating climate change into water resources planning in England and Wales. J. Amer. Water Resour. Assoc., 47, 541–549.
  • Braun, K., and Kropp C., 2010: Beyond speaking truth? Institutional responses to uncertainty in scientific governance. Sci. Technol. Human Values, 35, 771–782.
  • Brown, M., 2009: Science in Democracy: Expertise, Institutions, and Representation. MIT Press, 368 pp.
  • Cash, D. W., 2001: “In order to aid in diffusing useful and practical information”: Agricultural extension and boundary organizations. Sci. Technol. Human Values, 26, 431–453.
  • Cash, D. W., Clark W. C., Alcock F., Dickson N. M., Eckley N., Guston D. H., Jager J., and Mitchell R. B., 2003: Knowledge systems for sustainable development. Proc. Natl. Acad. Sci. USA, 100, 8086–8091.
  • Defra, 2009: Adapting to climate change: Helping key sectors to adapt to climate change: Statutory guidance to reporting authorities 2009. Department for Environment, Food and Rural Affairs, 14 pp.
  • Defra, cited 2011a: What is government doing? [Available online at http://www.defra.gov.uk/environment/climate/government/.]
  • Defra, cited 2011b: Adaptation Reporting Power received reports. [Available online at http://www.defra.gov.uk/environment/climate/sectors/reporting-authorities/reporting-authorities-reports.]
  • Defra, cited 2011c: Advice for reporting authorities. [Available online at http://www.defra.gov.uk/environment/climate/sectors/reporting-authorities/.]
  • Demeritt, D., and Langdon D., 2004: The UK Climate Change Programme and communication with local authorities. Global Environ. Change, 14, 325–336.
  • Dessai, S., and Hulme M., 2004: Does climate adaptation policy need probabilities? Climate Policy, 4, 107–128.
  • Dessai, S., and Hulme M., 2007: Assessing the robustness of adaptation decisions to climate change uncertainties: A case study on water resources management in the east of England. Global Environ. Change, 17, 59–72.
  • Dessai, S., Hulme M., Lempert R., and Pielke R. Jr., 2009: Climate prediction: A limit to adaptation? Adaptation to Climate Change: Thresholds, Values and Governance, W. Adger, I. Lorenzoni, and K. O’Brien, Eds., Cambridge University Press, 64–78.
  • Dilling, L., 2007a: Towards science in support of decision making: Characterizing the supply of carbon cycle science. Environ. Sci. Policy, 10, 48–61.
  • Dilling, L., 2007b: The opportunities and responsibility for carbon cycle science in the U.S. Environ. Sci. Policy, 10, 1–4.
  • Dilling, L., and Lemos M. C., 2011: Creating usable science: Opportunities and constraints for climate knowledge use and their implications for science policy. Global Environ. Change, 21, 680–689.
  • Eden, S., 2011: Lessons on the generation of usable science from an assessment of decision support practices. Environ. Sci. Policy, 14, 11–19.
  • Funtowicz, S. O., and Ravetz J. R., 1993: Science for the post-normal age. Futures, 25, 739–755.
  • Füssel, H.-M., 2007: Adaptation planning for climate change: Concepts, assessment approaches, and key lessons. Sustainability Sci., 2, 265–275.
  • Gawith, M., Street R., Westaway R., and Steynor A., 2009: Application of the UKCIP02 climate change scenarios: Overview and lessons learnt. Global Environ. Change, 19, 113–121.
  • Guion, L. A., Diehl D. C., and McDonald D., cited 2012: Triangulation: Establishing the validity of qualitative studies. [Available online at http://edis.ifas.ufl.edu/fy394.]
  • Guston, D. H., 1999: Stabilizing the boundary between US politics and science: The role of the Office of Technology Transfer as a boundary organization. Soc. Stud. Sci., 29, 87–111.
  • Hall, J., 2007: Probabilistic climate scenarios may misrepresent uncertainty and lead to bad adaptation decisions. Hydrol. Processes, 21, 1127–1129.
  • Heffernan, O., 2009: UK climate effects revealed in finest detail yet. [Available online at http://www.nature.com/news/2009/090619/full/news.2009.586.html.]
  • Hulme, M., and Dessai S., 2008a: Predicting, deciding, learning: Can one evaluate the ‘success’ of national climate scenarios? Environ. Res. Lett., 3, 045013, doi:10.1088/1748-9326/3/4/045013.
  • Hulme, M., and Dessai S., 2008b: Negotiating future climates for public policy: A critical assessment of the development of climate scenarios for the UK. Environ. Sci. Policy, 11, 54–70.
  • Jenkins, G. J., Murphy J. M., Sexton D. S., Lowe J. A., Jones P., and Kilsby C. G., 2009: UK climate projections: Briefing report. Met Office Hadley Centre, 59 pp.
  • Kirchhoff, C. J., 2010: Integrating science and policy: Climate change assessments and water resources management. Ph.D. dissertation, University of Michigan, 293 pp.
  • Knorr-Cetina, K., 1999: Epistemic Cultures: How the Sciences Make Knowledge. Harvard University Press, 340 pp.
  • Kropp, C., and Wagner J., 2010: Knowledge on stage: Scientific policy advice. Sci. Technol. Human Values, 35, 812–838.
  • Lemos, M. C., and Morehouse B., 2005: The co-production of science and policy in integrated climate assessments. Global Environ. Change, 15, 57–68.
  • Lemos, M. C., and Rood R. B., 2010: Climate projections and their impact on policy and practice. WIREs Climate Change, 1, 670–682.
  • Lempert, R. J., and Groves D. G., 2010: Identifying and evaluating robust adaptive policy responses to climate change for water management agencies in the American west. Technol. Forecasting Soc. Change, 77, 960–974.
  • London Stansted, 2011: London Stansted Airport climate change adaptation plan. 144 pp. [Available online at http://archive.defra.gov.uk/environment/climate/documents/adapt-reports/08aviation/stansted-airport.pdf.]
  • Manchester Airports Group, 2011: Climate change adaptation report for East Midlands Airport and Manchester Airport. 27 pp. [Available online at http://archive.defra.gov.uk/environment/climate/documents/adapt-reports/08aviation/manc-airport.pdf.]
  • McKenzie-Hedger, M., Cornell M., and Bramwell P., 2006: Bridging the gap: Empowering decision-making for adaptation through the UK Climate Impacts Programme. Climate Policy, 6, 201–215.
  • McNie, E. C., 2007: Reconciling the supply of scientific information with user demands: An analysis of the problem and review of the literature. Environ. Sci. Policy, 10, 17–38.
  • Meyer, R., 2011: The public values failures of climate science in the US. Minerva, 49, 47–70.
  • Munang, R., Rivington M., Takle E. S., Mackey B., Thiaw I., and Liu J., 2011: Climate information and capacity needs for ecosystem management under a changing climate. Procedia Environ. Sci., 1, 206–227.
  • Mylona, A., 2012: The use of UKCP09 to produce weather files for building simulation. Build. Serv. Eng. Res. Technol., 33, 51–62.
  • National Grid Gas, 2010: Climate change adaptation report. 53 pp. [Available online at http://www.nationalgrid.com/NR/rdonlyres/C456F00F-1063-43CF-ACF6-85313D9D78F0/45161/nationalgridccagasreport100928.pdf.]
  • Nowotny, H., Scott P., and Gibbons M., 2001: Re-Thinking Science: Knowledge and the Public in an Age of Uncertainty. Polity Press, 288 pp.
  • NRC, 2009: Informing decisions in a changing climate. National Research Council, 200 pp.
  • Olsen, W., 2004: Triangulation in social research: Qualitative and quantitative methods can really be mixed. 30 pp. [Available online at http://www.ccsr.ac.uk/staff/Triangulation.pdf.]
  • Parry, M. L., Canziani O. F., Palutikof J. P., van der Linden P. J., and Hanson C. E., Eds., 2007: Climate Change 2007: Impacts, Adaptation and Vulnerability. Cambridge University Press, 976 pp.
  • Porter, J., and Demeritt D., 2012: Flood risk management, mapping and planning: The institutional politics of decision-support in England. Environ. Plann. A, 44, 2359–2378, doi:10.1068/a44660.
  • Port of Sheerness, 2011: Port of Sheerness Ltd Climate Adaptation Assessment: Report to Defra under the Adaptation Reporting Powers. 62 pp. [Available online at http://archive.defra.gov.uk/environment/climate/documents/adapt-reports/07ports/port-sheerness.pdf.]
  • Reeder, T., and Ranger N., 2011: How do you adapt in an uncertain world? Lessons from the Thames Estuary 2100 project. World Resources Report Uncertainty Series, Vol. 1, 16 pp. [Available online at http://www.worldresourcesreport.org/files/wrr/papers/wrr_reeder_and_ranger_uncertainty.pdf.]
  • Rothstein, H., Huber M., and Gaskell G., 2006: A theory of risk colonization: The spiralling regulatory logics of societal and institutional risk. Econ. Soc., 35, 91–112.
  • RWE Npower, 2011: Climate change adaptation report. 117 pp. [Available online at http://archive.defra.gov.uk/environment/climate/documents/adapt-reports/03electric-gen/npower.pdf.]
  • Sarewitz, D., and Pielke R. A. Jr., 2007: The neglected heart of science policy: Reconciling supply and demand for science. Environ. Sci. Policy, 10, 5–16.
  • Severn Trent Water Ltd., 2011: Climate change adaptation report: A response to the Climate Change Act’s adaptation reporting power. 180 pp. [Available online at http://archive.defra.gov.uk/environment/climate/documents/adapt-reports/02water-comp/wc-severn-trent.pdf.]
  • Shackley, S., and Wynne B., 1995: Global climate change: The mutual construction of an emergent science–policy domain. Sci. Public Policy, 22, 218–230.
  • Smith, L., Lopez A., Stainforth D., Ranger N., and Niehoerster F., 2009: Toward decision-relevant probability distributions: Communicating ignorance, uncertainty and model-noise. Center for Climate Change Economics and Policy. [Available online at www.rmets.org/pdf/presentation/20091015-smith.pdf.]
  • SP Energy Networks, 2012: Climate change adaptation report. Rep. ENV-05-015, Issue 1, 136 pp. [Available online at http://archive.defra.gov.uk/environment/climate/documents/adapt-reports/04distribute-trans/sp-energy-networks.pdf.]
  • SP Generation, 2011: SP Gen climate change adaptation report. Scottish Power, 44 pp. [Available online at http://archive.defra.gov.uk/environment/climate/documents/adapt-reports/03electric-gen/scottish-power.pdf.]
  • Stainforth, D. A., Allen M. R., Tredger E. R., and Smith L. A., 2007: Confidence, uncertainty and decision-support relevance in climate predictions. Philos. Trans. Roy. Soc., 365, 2145–2161.
  • Stokes, D. E., 1997: Pasteur’s Quadrant: Basic Science and Technological Innovation. Brookings Institution Press, 196 pp.
  • Street, R., Steynor A., Bowyer P., and Humphrey K., 2009: Delivering and using the UK Climate Projections 2009. Weather, 64, 227–231.
  • Sutcliffe, S., and Court J., 2005: Evidence-based policymaking: What is it? How does it work? What relevance for developing countries? Overseas Development Institute, 45 pp. [Available online at http://www.odi.org.uk/resources/docs/3683.pdf.]
  • Tribbia, J., and Moser S. C., 2008: More than information: What coastal managers need to plan for climate change. Environ. Sci. Policy, 11, 315–328.
  • U.K. Climate Projections, cited 2011a: Reports and analysis. FAQ: How much did UKCP09 cost? [Available online at http://ukclimateprojections.defra.gov.uk/22681.]
  • U.K. Climate Projections, cited 2011b: What is UKCP09? [Available online at http://ukclimateprojections.defra.gov.uk/21678.]
  • U.K. Climate Projections, cited 2011c: About UKCP09: Contributors. [Available online at http://ukclimateprojections.defra.gov.uk/21691.]
  • UKCIP, 2006: Expressed preferences for the next package of UK climate change information: Final report on the user consultation. UK Climate Impacts Programme, 28 pp.
  • UKCIP, 2011a: Making progress: UKCIP and adaptation in the UK. UK Climate Impacts Programme, 99 pp.
  • UKCIP, cited 2011b: Users’ panel: November 2010. UK Climate Impacts Programme. [Available online at http://www.ukcip.org.uk/resources/ukcp09/users-panel/.]
  • UKCIP, 2012a: Interpretation and use of future snow projections from the 11-member Met Office Regional Climate Model ensemble. UKCP09 Tech. Note, 25 pp. [Available online at http://ukclimateprojections.defra.gov.uk/media.jsp?mediaid=87842&filetype=pdf.]
  • UKCIP, 2012b: UKCP09: Probabilistic projections of wind speed. UK Climate Impacts Programme, 16 pp. [Available online at http://ukclimateprojections.defra.gov.uk/media.jsp?mediaid=87876&filetype=pdf.]
  • Wilby, R. L., and Dessai S., 2010: Robust adaptation to climate change. Weather, 65, 180–185.
  • Young, K., Ashby D., Boaz A., and Grayson L., 2002: Social science and the evidence-based policy movement. Soc. Policy Soc., 1, 215–224.
Fig. 1. Sectors of organizations approached to participate in the questionnaire survey. The survey universe consists of sectors (organizations) that were mandated by Defra to produce an adaptation report and those that were not mandated. Sectors underlined and highlighted in bold participated in the study.
