1. Introduction
a. Background
A large body of research has focused on students’ and teachers’ understanding of scientific concepts. Probes used in these studies have included open-ended questionnaires, concept maps, and interviews, as well as closed-ended true–false or agree–disagree questionnaires, five-point Likert-scale questionnaires, and multiple-choice questionnaires. In previous informal research, we used open-ended focus group interviews and closed-ended, multiple-choice “conceptests.” When working with multidisciplinary topics such as climate change, however, we gradually realized that binary-choice questionnaires offer some advantages that multiple-choice ones do not.
The province of Ontario, where we undertook our research, is increasingly focusing on environmental education, and climate change in particular occupies an important place in the science curriculum. A good curriculum does not necessarily result in effective teaching, however, especially if the teachers have never been educated in the topic; an aggressive education program for teachers is needed to complement it. To measure the effectiveness of any such program, we created a 59-item binary-choice questionnaire of key climate change concepts and administered it to 89 preservice elementary teachers at two Ontario institutions. The results have given us a baseline of the current understandings and misconceptions held by Ontario preservice elementary teachers and have validated the statistical method we used to analyze the data.
b. Literature review
Research on the understanding of climate change has been conducted with students in Australia, Canada, Greece, Norway, Sweden, Turkey, the United Kingdom, and the United States.1 Several studies have been reported in the Bulletin of the American Meteorological Society,2 as well as in other journals.3 Table 1 lists misconceptions identified in these studies, classified into four categories. Many surveys have also been conducted with teachers in different countries, including preservice elementary teachers.4 The survey reported here may be the first conducted with preservice elementary teachers in Canada.
Table 1. Summary of misconceptions held by people regarding climate change.
2. Methodology
a. Concepts tested
We relied on several research sources when developing our diagnostic instrument (e.g., Boyes and Stanisstreet 1993; Cordero et al. 2008) but also incorporated climate concepts from relevant curriculum documents for Ontario and Canada,5 international documents such as the IPCC Fourth Assessment Report, and our own experience.6 Climate change surveys often focus on recent changes in Earth’s climate system resulting from human impacts. In our educational work, however, we realized that long-range changes, such as the 100 000-yr cycle, were not well understood, and thus many teachers were unable to differentiate between natural and anthropogenic causes of climate change. Therefore, 15 of the 59 items in our survey addressed long-range concepts related to Earth’s climate. (The questionnaire is included in the supplemental material available at the Journals Online Web site: http://dx.doi.org/10.1175/2011WCAS1100021.s1.) Although many of the 59 items had been addressed before, for 15–20 of them we had no previous data, and even for items addressed previously, researchers often reported significantly different results.
b. Binary-choice format
We initially developed a multiple-choice questionnaire and field-tested it with Ontario teachers. The results were revealing: unlike other multiple-choice diagnostics we had used with Ontario teachers and students, for the climate change diagnostic only two of the four options were chosen for most items. Teachers either selected the correct answer or the same incorrect answer. This suggested that we employ binary-choice items rather than four-option items. In fact, Haladyna (2004), in his well-known book Developing and Validating Multiple-Choice Test Items, had suggested that multiple-choice questions work best with only two distracters. We found we needed only one.
Having chosen binary-choice items, we initially followed the lead of previous researchers on climate change understanding, who used true–false or agree–disagree items, and sometimes a Likert scale response (i.e., strongly agree, somewhat agree, not sure, somewhat disagree, strongly disagree). After further field testing, we had misgivings that a “positive statement bias” might be attached to agree–disagree or true–false statements. Teachers with little science background might err on the “agree” or “true” side when confronted with a statement that sounded scientific. We therefore reworded each item to make it as neutral as possible, as follows:
10. If there was no greenhouse effect
    (a) life on Earth would be drastically different
    (b) life on Earth would be much the same as it is today
c. Subjects
Preservice elementary teachers enrolled in primary–junior (K–6) science methods courses at two Ontario universities in the Toronto area were invited to complete the anonymous online questionnaire on their own time. (These teachers were enrolled in one-year Bachelor of Education programs, consisting of education courses at the universities and practicum sessions in schools.) While some may have looked topics up while answering the questionnaire, there was no obvious motivation for them to do so: their participation was anonymous, they were never told their scores, and they were not studying climate change at the time.
Of approximately 280 preservice teachers enrolled in 8 classes, 114 participated, with 89 completing all 59 items. One-third of the participants were age 30 or older. The classes from which they were drawn were 80%–90% female and represented the cultural diversity of the greater Toronto area. Everyone had a university degree, usually a generalist degree with little if any science at the university level. The smaller institution was a private Christian university, while the larger institution supplying the majority of subjects was a midsize public university. The survey did not ask how many university-level science courses they had taken, since demographic data collected by the universities indicated that very few had studied science at that level. Since the topic of climate change was only introduced into the Ontario secondary school science curriculum in 2007, we knew that few if any had formally studied the topic in science. Furthermore, to our knowledge, the topic of climate change is not covered in any detail in the social science courses offered at nearby universities.
3. Observations and analysis
The questionnaire in the supplement includes item difficulty and discrimination indices for the 59 items, as answered by the 89 preservice elementary teachers. The item difficulty index is the proportion of subjects answering that item correctly. To compute the item discrimination index, we ranked subjects by their total score on the questionnaire, subtracted the number of correct responses to that item in the bottom 27% of subjects from the number in the top 27%, and divided by the size of each group (24 subjects).
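For concreteness, both indices can be computed from a 0/1 response matrix along the following lines. This is a minimal sketch in Python; the matrix responses (one row per subject, one column per item) and the function name are ours, standing in for data and code the paper does not publish.

```python
import numpy as np

def item_indices(responses, group_frac=0.27):
    """Item difficulty and discrimination for a 0/1 response matrix.

    responses: array of shape (n_subjects, n_items); 1 = correct, 0 = incorrect.
    """
    n_subjects, _ = responses.shape

    # Difficulty: the proportion of subjects answering each item correctly.
    difficulty = responses.mean(axis=0)

    # Rank subjects by total questionnaire score and take the top and
    # bottom 27% (24 of the 89 subjects in this study).
    group_size = int(round(group_frac * n_subjects))
    order = np.argsort(responses.sum(axis=1))
    bottom = responses[order[:group_size]]
    top = responses[order[-group_size:]]

    # Discrimination: correct answers in the top group minus those in the
    # bottom group, divided by the group size.
    discrimination = (top.sum(axis=0) - bottom.sum(axis=0)) / group_size
    return difficulty, discrimination
```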
One concern with a closed-ended questionnaire, especially a binary-choice one, is the role of randomness. Since few if any subjects had formally studied climate change, the possibility existed that for many items their answers would be essentially random. This is where Monte Carlo simulations came in: before drawing conclusions from the survey results, we first compared our data with randomly generated data, and only results that differed markedly from the random data are commented on below.
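As an illustration of this approach, the random baseline can be simulated with coin-flip answers. The sketch below (our own, in Python) averages the item-difficulty histogram over 10 trials, as the paper does for Fig. 1; the fractional item counts it returns correspond to the averaged frequencies reported there.

```python
import numpy as np

rng = np.random.default_rng(seed=0)  # fixed seed for reproducibility

def random_difficulty_histogram(n_subjects=89, n_items=59, n_trials=10, n_bins=10):
    """Average the item-difficulty histogram of simulated coin-flip answers.

    Each trial draws an n_subjects x n_items matrix of random 0/1 responses,
    computes the per-item difficulty (fraction correct), and bins it; counts
    are then averaged over trials, which is why bins can hold fractional
    item counts (e.g., the 28.7 items reported for Fig. 1).
    """
    edges = np.linspace(0.0, 1.0, n_bins + 1)
    counts = np.zeros(n_bins)
    for _ in range(n_trials):
        fake = rng.integers(0, 2, size=(n_subjects, n_items))  # random guessing
        difficulty = fake.mean(axis=0)                         # fraction correct per item
        counts += np.histogram(difficulty, bins=edges)[0]
    return counts / n_trials
```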
a. Item difficulty analysis
The average item difficulty index on the survey was 0.58, fairly close to the 0.50 average expected for randomly generated data. Does this suggest that respondents were guessing on most of the questions? The frequency histograms of item difficulty for the real and randomly generated data tell another story. Figure 1 presents the randomly generated data and Fig. 2 the real data. With randomly generated data, 95% of the items (56 of 59) have an item difficulty between 0.400 and 0.600. With real data, only 12% (7 of 59) fall in this range; 58% (34 of 59) have an item difficulty above 0.600 and 30% (18 of 59) below 0.400. The frequency histogram is strongly bimodal. This suggests that for the 34 items with a difficulty index above 0.600, many respondents had a good understanding of the concepts addressed, while for the 18 items with a difficulty index below 0.400, many held misconceptions about those concepts.
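Given a difficulty array like the one in the earlier sketch, the three bands discussed above can be counted directly; the function and cutoffs below simply restate the figures in this paragraph.

```python
import numpy as np

def difficulty_bands(difficulty, lo=0.400, hi=0.600):
    """Count items below, within, and above the near-chance difficulty band."""
    difficulty = np.asarray(difficulty)
    below = int((difficulty < lo).sum())                           # misconceptions: 18 of 59 here
    within = int(((difficulty >= lo) & (difficulty <= hi)).sum())  # near chance: 7 of 59 here
    above = int((difficulty > hi).sum())                           # well understood: 34 of 59 here
    return below, within, above
```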
Fig. 1. Frequency histogram of item difficulty for 59 items with 89 randomly generated responses, averaged over 10 trials. (Each number on the x axis labels the bin spanning the 0.10 interval ending at that value; e.g., on average 28.7 items had an item difficulty greater than 0.50 and less than or equal to 0.60.)
Fig. 2. Frequency histogram of item difficulty for the 59 items with real responses from the 89 teacher candidates.
1) Items answered correctly by over 80% of respondents
Although all items with a difficulty index above 0.600 are noteworthy, those answered correctly by more than 80% of respondents particularly stand out. Respondents overwhelmingly knew that …
weather and climate mean different things (#2; 94%)
Earth’s surface gives off radiation at night (#6; 94%)
carbon dioxide and methane are invisible (#10; 89%)
historically Earth’s climate has varied in long cycles (#14; 92%)
sea levels have varied by 5 to 10 m, not 1 m or less (#15; 84%)
volcanic eruptions cause temporary climate change (#16; 85%)
more atmospheric carbon dioxide increases the greenhouse effect (#27; 90%)
the greenhouse effect is increased by the removal of large forests (#38; 91%)
oceans and forests continually exchange CO2 with the atmosphere (#54; 88%)
2) Items answered correctly by less than 30% of respondents
Respondents had misconceptions regarding these concepts:
solar energy is concentrated in the visible, not infrared part of the spectrum (#4; 8%)7
the most common greenhouse gas is water vapor and not carbon dioxide (#8; 15%)
waste heat from use of fossil fuels does not contribute to global warming (#19; 18%)
when sea ice melts it does not affect the sea level of oceans (#20; 19%)
radioactive waste from nuclear power does not contribute to the greenhouse effect (#28; 29%)
the thinning of the ozone layer has not contributed to the greenhouse effect (#29; 28%)
oceans absorb most of the atmospheric carbon dioxide produced from human activities (#32; 29%)
atmospheric pollutants such as dust and sulfur dioxide cause a decrease (not an increase) in Earth’s average temperature (#37; 25%)
What is interesting in these results is that 82% of respondents thought that the release of waste heat from the use of fossil fuels significantly contributes to global warming (item 19 above). Perhaps this is also why 71% thought that radioactive waste from nuclear power contributes to global warming (item 28 above): not because they believed nuclear reactions produce carbon dioxide, but because nuclear reactions produce waste heat, which they assumed contributes to global warming. (Compared with greenhouse gases, waste heat from chemical and nuclear reactions has no significant effect.) Regarding item 32 above, most of the respondents were unaware that the oceans absorb most of the carbon dioxide produced by humans, something not reported on in previous research. This fact is important because ocean uptake drives the acidification of the oceans and delays the effect of carbon dioxide production on climate change. (Table 2 gives a summary of results.)
Table 2. Climate literacy misconceptions identified in this questionnaire.
b. Item discrimination analysis
As mentioned earlier, the discrimination index we used equaled the proportion of correct responses to an item among the highest-scoring 27% of respondents (based on their overall score) minus the proportion among the lowest-scoring 27%. Since 27% of 89 respondents is approximately 24, we subtracted the item scores of the lowest 24 from those of the highest 24 and divided by 24. The resulting indices are therefore multiples of 1/24 (e.g., 0.333 = 8/24).
Fig. 3. Frequency histogram of item discrimination for 59 items with 89 randomly generated responses, averaged over 10 trials. (Each number on the x axis labels the bin ending at that value; e.g., on average 7.6 items had a discrimination index greater than 0.167 and less than or equal to 0.208 in this Monte Carlo simulation.)
Fig. 4. Frequency histogram of item discrimination for the 59 items with real responses from the 89 teacher candidates.
An item with a high index, above 0.30 or 0.40, discriminates well between respondents who score high overall on the test and those who do not, and is considered a good item. Items with a low index have little discriminating value, because respondents scoring high overall do little better on that item than respondents scoring low overall. This assumes strong coherence among the concepts being tested, such as you would expect in a questionnaire on Newton’s laws of force, for example.8 In this climate change questionnaire, the average item discrimination index was only 0.197. In fact, only 10 items had a discrimination index equal to or greater than 0.333, compared with 4 in the randomly generated data. The frequency histograms for the real data and the randomly generated data are strikingly similar. If climate change science had the coherence of a traditional discipline such as physics, we might worry about these results: perhaps most of the items were not valid. There are other interpretations, however. The low number of items with high discrimination indices may imply that respondents’ knowledge came in bits and pieces (i.e., from newspapers, radio, the Internet, television newscasts, and talking with friends), so that how they did on one item had little correlation with how they did on other items. Gowda et al. (1997) arrived at a similar interpretation in discussing the results of a climate change survey of 99 American high school students. Another interpretation of our data is that climate change science, unlike physics, involves concepts drawn from many fields, such as astronomy, biology, chemistry, earth science, environmental studies, geology, and physics, and people who understand concepts well in one or two fields might not understand them well in the others. These two interpretations were further supported by comparing Pearson correlation coefficients for the 1711 pairs of the 59 items. As with the discrimination indices, there was little if any difference between the matrix of correlation coefficients for the real data and that for the randomly generated data.
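A sketch of that final comparison, under the same assumptions about the response matrices as in the earlier code: np.corrcoef treats each of the 59 item columns as a variable, and the upper triangle of the resulting matrix holds the 1711 distinct item-pair coefficients.

```python
import numpy as np

def item_pair_correlations(responses):
    """Pearson correlations for all distinct item pairs (1711 pairs for 59 items)."""
    corr = np.corrcoef(responses, rowvar=False)  # 59 x 59 item correlation matrix
    upper = np.triu_indices_from(corr, k=1)      # indices above the diagonal
    return corr[upper]

# Comparing the distribution of these coefficients for the real responses
# with that for coin-flip responses probes the coherence of the item set;
# near-identical distributions suggest little correlation among items.
```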
4. Summary and conclusions
A survey of the understanding of climate change concepts was undertaken with Ontario preservice teachers, most of whom had never formally studied climate change or taken university-level science. The survey used closed, binary-choice items and led to statistically interesting results. A comparison of item difficulty indices for the real data against randomly generated data suggested that although many teachers knew a considerable amount about climate change, they also held many misconceptions, some identified here for the first time. A comparison of item discrimination indices for the real data against the randomly generated data implied that the teachers’ knowledge was a “kaleidoscope of understanding” rather than a coherent picture. This conclusion was further supported by comparing the matrices of paired Pearson correlation coefficients for the real and randomly generated data.
This study demonstrated the usefulness of comparing real with randomly generated data when using closed binary-choice items for multidisciplinary topics such as climate change. In future research, we plan to extend this analysis to another multidisciplinary topic in Ontario’s science curriculum, Water Systems. The study also led to tentative conclusions regarding the lack of significant item discrimination indices and correlation coefficients: it may be that the teachers’ understanding of climate change came from unconnected sources, or that climate change science builds on concepts from many different fields of study. This suggests that when incorporating climate change into our science methods courses, we should emphasize the difference between reliable and unreliable sources of information and give careful attention to how we integrate concepts from different scientific fields.
REFERENCES
Andersson, B., and A. Wallin, 2000: Students’ understanding of the greenhouse effect, the societal consequences of reducing CO2 emissions and the problem of ozone layer depletion. J. Res. Sci. Teach., 37, 1096–1111.
Bostrom, A., M. G. Morgan, B. Fischhoff, and D. Read, 1994: What do people know about global climate change? 1. Mental models. Risk Anal., 14, 959–970.
Boyes, E., and M. Stanisstreet, 1992: Students’ perceptions of global warming. Int. J. Environ. Stud., 42, 287–300.
Boyes, E., and M. Stanisstreet, 1993: The ‘greenhouse effect’: Children’s perceptions of causes, consequences and cures. Int. J. Sci. Educ., 15, 531–552.
Boyes, E., and M. Stanisstreet, 2001: Global warming: What do high school students know 10 years on? World Resour. Rev., 13, 221–238.
Cordero, E. C., A. M. Todd, and D. Abellera, 2008: Climate change education and the ecological footprint. Bull. Amer. Meteor. Soc., 89, 865–872.
Dove, J., 1996: Student teacher understanding of the greenhouse effect, ozone layer depletion, and acid rain. Environ. Educ. Res., 2, 89–100.
Ekborg, M., and M. Areskoug, 2006: How student teachers’ understanding of the greenhouse effect develops during a teacher education program. Nordina, 5, 17–29.
Gautier, C., K. Deutsch, and S. Rebich, 2006: Misconceptions about the greenhouse effect. J. Geosci. Educ., 54, 386–395.
Gowda, M., J. Fox, and R. Magelky, 1997: Students’ understanding of climate change: Insights for scientists and educators. Bull. Amer. Meteor. Soc., 78, 2232–2240.
Groves, F., and A. Pugh, 1999: Elementary pre-service teacher perceptions of the greenhouse effect. J. Sci. Educ. Technol., 8, 75–81.
Haladyna, T. M., 2004: Developing and Validating Multiple-Choice Test Items. 3rd ed. Routledge, 320 pp.
Jeffries, H., M. Stanisstreet, and E. Boyes, 2001: Knowledge about the “greenhouse effect”: Have college students improved? Res. Sci. Technol. Educ., 19, 205–221.
Khalid, T., 2003: Pre-service high school teachers’ perceptions of three environmental phenomena. Environ. Educ. Res., 9, 35–50.
Kisoglu, M., H. Gurbuz, M. Erkol, M. S. Akar, and M. Akilli, 2010: Prospective Turkish elementary science teachers’ knowledge level about the greenhouse effect and their views on environmental education in university. Int. Electron. J. Elem. Educ., 2, 217–236.
Koulaidis, V., and V. Christidou, 1999: Models of students’ thinking concerning the greenhouse effect and teaching implications. Sci. Educ., 83, 559–576.
Matkins, J. J., and R. L. Bell, 2007: Awakening the scientist inside: Global climate change and the nature of science in an elementary science methods course. J. Sci. Teach. Educ., 18, 137–163.
Michail, S., A. G. Stamou, and G. P. Stamou, 2007: Greek primary school teachers’ understanding of current environmental issues: An exploration of their environmental knowledge and images of nature. Sci. Educ., 91, 244–259.
Morgan, M., and J. Moran, 1995: Understanding the greenhouse effect and the ozone shield: An index of scientific literacy among university students. Bull. Amer. Meteor. Soc., 76, 1185–1190.
Papadimitriou, V., 2004: Prospective primary teachers’ understanding of climate change, greenhouse effect, and ozone layer depletion. J. Sci. Educ. Technol., 13, 299–307.
Rebich, S., and C. Gautier, 2005: Concept mapping to reveal prior knowledge and conceptual change in a mock summit course on global climate change. J. Geosci. Educ., 53, 355–365.
Reynolds, T. W., A. Bostrom, D. Read, and M. D. Morgan, 2010: Now what do people know about global climate change? Survey studies of educated laypeople. Risk Anal., 30, 1520–1538.
Sterman, J. D., and L. B. Sweeney, 2007: Understanding public complacency about climate change: Adults’ mental models of climate change violate conservation of matter. Climatic Change, 80, 213–238.
Summers, M., C. Kruger, A. Childs, and J. Mant, 2001: Understanding the science of environmental issues: Development of a subject knowledge guide for primary teacher education. Int. J. Sci. Educ., 23, 33–53.
FOOTNOTES
In a longer form of this paper, we have catalogued the results of over 40 research reports on this topic.
For example, Andersson and Wallin 2000; Boyes and Stanisstreet 2001; Reynolds et al. 2010; Sterman and Sweeney 2007.
For example, Dove 1996; Ekborg and Areskoug 2006; Groves and Pugh 1999; Kisoglu et al. 2010; Matkins and Bell 2007.
The 2007 Ontario Curriculum Science and Technology Grades 1–8; The 2008 Ontario Curriculum Science Grades 9–12; The 1996 Pan-Canadian Common Framework of Science Learning Outcomes K to 12.
One of the authors is a recognized climate scientist. The other two are experienced science and physics teachers, as well as science education researchers. Two of the authors were part of a three-author team that wrote the Grade 10 Unit, Climate Change, for a well-known student textbook used across Ontario, called Nelson Science 10.
The authors now realize this is not correct. A little less than half of the sun’s energy is visible light, with about the same amount being infrared radiation and the remainder ultraviolet. See Solar Radiation and Climate Experiment, available online at the National Aeronautics and Space Administration (NASA) Earth Observatory website, accessed 23 April 2011.
The authors did informal research using 35-item multiple-choice diagnostics of force and motion with over one hundred secondary students. The frequency histogram of the item difficulty indices was not bimodal, unlike that for this questionnaire, and 16 of the 35 discrimination indices were above 0.30, in contrast to only 3 of 35 for randomized data.