Over the past few decades, a large body of research has demonstrated that pedagogical strategies that promote collaborative student interactions and concept-focused engagement with content lead to significantly greater student learning (e.g., Hake 1998; Crouch and Mazur 2001; Prince 2004; Knight and Wood 2005; Umbach and Wawrzynski 2005; Fairweather 2009; National Research Council 2012; Freeman et al. 2014). These engagement-focused strategies, collectively known as active learning, benefit students in a variety of ways. Rooted in cognitive theory, active learning requires students to actively participate in the acquisition of knowledge, allowing learners to more effectively integrate new information with their prior knowledge (Faust and Paulson 1998; Yilmaz 2011; Jaeger et al. 2017). As a result, students increase their self-efficacy, feeling more confident in their ability to learn the material and succeed in a course (e.g., Wilke 2003; Fencl and Scheel 2005). Active learning strategies also help to reduce the achievement gap for students who are socioeconomically or educationally disadvantaged as well as traditionally underrepresented within the sciences (e.g., Ernst and Colthorpe 2007; Haak et al. 2011; Eddy and Hogan 2014; Freeman et al. 2014).
The benefits of active learning are well established within the geosciences; both introductory and upper-level courses have demonstrated success in using diverse, interactive pedagogical approaches (e.g., Yuretich et al. 2001; McConnell et al. 2003; Yuretich 2004; Kortz et al. 2008; Goldsmith 2011; Dohaney et al. 2012; McConnell et al. 2017). Most notably, student performance is improved (e.g., McConnell et al. 2006; Mora 2010; Freeman et al. 2014; Davenport 2019), and students are more engaged with and exhibit more interest in the material (e.g., Yuretich et al. 2001; Sawyer et al. 2005; Cutrim et al. 2006; Francek 2006). Additionally, significant evidence demonstrates that students, particularly within the geosciences, prefer interactive and collaborative methods over passive lecture (e.g., Yuretich et al. 2001; McConnell et al. 2003; Dohaney et al. 2012; Yuretich and Kanner 2015; Davenport 2018, 2019). The successes of the broader geoscience discipline have, to a lesser extent, also been replicated within the field of atmospheric science. A variety of active learning approaches have been implemented, including opportunities for student reflection and interaction on tasks (e.g., Cutrim et al. 2006; Steeneveld and Vilà-Guerau de Arellano 2019), authentic real-world experiences both in the classroom and out in the field (Barrett and Woods 2012; Charlton-Perez 2013; Coleman and Mitchell 2014; Croft and Ha 2014; Tanamachi et al. 2020), and guided, collaborative exercises working through problems (Davenport 2019).
Even with the notable successes associated with active learning approaches, there are several challenges to implementing these research-based instructional strategies. Faculty interviews identify several situational barriers, such as the need to cover too much material, student resistance, department norms, and physical limitations of classroom layouts (Henderson and Dancy 2007; Walczyk et al. 2007; Shadle et al. 2017). Some faculty also perceive a potential loss of autonomy over how a course is conducted, an inability to manage meaningful assessments in large courses, and a belief that their current instructional methods are adequate [see a comprehensive list of barriers in Shadle et al. (2017)]. However, even with a strong desire to implement active learning, the most commonly cited barrier to modifying the classroom experience is lack of time (Dancy and Henderson 2010; Shadle et al. 2017; Riihimaki and Viskupic 2020). Time constraints can be a function of a number of factors; however, studies routinely demonstrate that tenure and promotion considerations and the related issue of research efforts being rewarded more than instruction are critical components in how faculty decide to use their time (Michael 2007; Walczyk et al. 2007; Brownell and Tanner 2012; Riihimaki and Viskupic 2020).
Given the barriers listed above, instructors across science, technology, engineering, and mathematics (STEM) disciplines are more likely to use traditional didactic lecture-based teaching methods than active learning or other research-based pedagogical methods (Wieman et al. 2010; National Research Council 2012; Stains et al. 2018). While knowledgeable within their field, instructors are often not afforded opportunities to develop deep pedagogical knowledge (e.g., Auerbach and Andrews 2018). Even with knowledge of different active learning approaches, the act of implementing such changes depends on personal and professional context, as well as experience (e.g., Rogers 2003; Andrews and Lemons 2015).
Within the geosciences, a limited number of studies have been conducted to quantify the various instructional practices used in the classroom. Macdonald et al. (2005) collected self-reported information from over 2,000 geoscience faculty on courses they taught, teaching methods used, activities incorporated into courses, and the types of assessments given to students. However, only 5.6% of the N = 2,094 respondents within Macdonald et al. (2005) were faculty teaching atmospheric science, meteorology, or climate-related courses. While a more recent follow-up survey with N = 2,600 participants increased that rate to 9.5% (Egger et al. 2019), neither study provided breakdowns by subdiscipline with regard to instructional practices. To the authors’ knowledge, the “state” of active learning use and implementation within the atmospheric sciences specifically has not been addressed since Macdonald et al. (2005).
In light of the prior work establishing the benefits of active learning, combined with the need for updated and more comprehensive instructor implementation data specific to atmospheric science, the goal of this study is to provide a baseline regarding the state of active learning within the atmospheric sciences. A survey was developed to 1) identify the types of active learning strategies used within college-level atmospheric science courses and 2) quantify the frequency of use of identified active learning strategies. While the self-report data gathered from the survey limit what conclusions can be drawn from the data (e.g., Fung and Chow 2002), the survey results will serve as the foundation for improving the implementation of active learning, ultimately leading to improvements in atmospheric science students’ educational experience, thus supporting a deeper and richer understanding of the atmosphere. Understanding how instructors are using a particular active learning strategy (i.e., “Does the strategy align with learning goals?”) as well as the reasons behind their decision to use the strategy are important research questions; however, these are outside the scope of this project and will be considered in future work.
The remainder of this paper is organized as follows: the “Methodology” section describes the development of the active learning survey and the demographics of the study population. The “Results” and “Discussion” sections examine the survey data and analyze the key findings. The “Conclusions and future work” section concludes with a recap of key findings as well as possible avenues for future work.
Methodology
Survey design, dissemination, and response rate.
The active learning use survey for this study, created in Qualtrics (see online supplementary material; https://doi.org/10.1175/BAMS-D-20-0239.2; www.qualtrics.com/), consisted of an electronic consent form and two sections. The first section asked participants to specify the number of introductory, upper-level undergraduate, and graduate courses that they had taught between fall 2018 and summer 2019, which was the most recent 12-month academic year at the time this study was conducted. Then, for each category in which a participant taught at least one course, participants were asked to assess the frequency with which they use one or more active learning strategies.1
The active learning strategies included in the survey were selected based on their established efficacy in the geoscience education literature, as well as their (broadly defined) widespread usage (see the review in McConnell et al. 2017, along with Rao et al. 2002; Vázquez-García 2018; Petrunich-Rutherford and Daniel 2019). Specifically, the instructional usefulness and the potential to improve student learning in geoscience classrooms have been analyzed for all of the strategies included within the survey for this study. Participants could view definitions of the active learning strategies by hovering their mouse pointer over each strategy (see the supplemental material for the list of active learning strategy definitions from Q1.2a–Q1.2c of the survey). The second section of the survey asked participants to answer a series of demographics questions.
The survey was emailed to faculty in atmospheric science departments (N = 757). Faculty email contact information was extracted from the institutions hosting atmospheric science programs according to a program listing provided by the National Weather Association (NWA; https://nwas.org/membership/committees/education/colleges-universities/). Participants were also contacted through atmospheric science listservs (e.g., the SUNY Albany "MAP" listserv, the AMS member Open Forum). While a larger potential sample was reached by including the listservs, the sample size estimate from the NWA list is believed to be a better representation of the number of participants invited; faculty who also participate in one or more of the contacted listservs likely received overlapping email invitations. The authors contacted potential participants within the communities and listservs described above at least three times to help maximize the response rate, consistent with suggestions for improving survey response rates outlined in recent research (e.g., Saleh and Bista 2017).
A total of N = 211 participants completed the survey, a response rate of 27.9%. This sample is smaller than the 254–260 participants recommended by Krejcie and Morgan (1970) given the N = 757 faculty contacted (i.e., a 32.5%–33.9% response rate). This may be tied to the challenges associated with online survey response rates due to "burnout" (e.g., Muñoz-Leiva et al. 2010) and survey oversaturation (e.g., Sax et al. 2003; Saleh and Bista 2017).
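For readers who wish to reproduce this check, a minimal sketch of the standard Krejcie and Morgan (1970) calculation is given below (assuming the conventional 95% confidence level, P = 0.5, and a 5% margin of error; the function and variable names are illustrative and no survey data are required):

```python
def krejcie_morgan_sample_size(population, chi2=3.841, p=0.5, margin=0.05):
    """Recommended sample size: s = chi2*N*P*(1-P) / (d^2*(N-1) + chi2*P*(1-P))."""
    return (chi2 * population * p * (1 - p)) / (
        margin ** 2 * (population - 1) + chi2 * p * (1 - p)
    )

contacted = 757   # faculty emailed via the NWA program listing
completed = 211   # completed surveys

print(f"Response rate: {completed / contacted:.1%}")                       # ~27.9%
print(f"Recommended sample: {krejcie_morgan_sample_size(contacted):.0f}")  # ~255
```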
Demographics.
The majority of the participants were employed by "doctoral" universities (75.6%), with nearly all of the remaining participants evenly split between "baccalaureate" and "master's" universities (11.3% each; N = 168). Approximately 65.1% of the survey participants were male, with just over a third identifying as female (34.9%; N = 166). For context on whether this sample is representative, comparison can be made with MacPhee and Canetto (2015),2 who documented the representation of women in U.S. atmospheric sciences doctoral programs via the demographic and professional information found on 34 atmospheric science doctoral program websites in 2009, or approximately 70% of the total number of atmospheric science doctoral programs (NWA 2020). MacPhee and Canetto (2015) determined that, of the 813 tenured and tenure-track faculty members employed in atmospheric science, only 17% were female. In the present study, the fraction identifying as female was 29.2% when considering only tenured or tenure-track faculty (34.9% of all participants). Thus, the results presented herein may be biased toward female faculty, though it is possible that MacPhee and Canetto (2015) undersampled female faculty in 2009.
Just over half of the participants (53.6%) had over 10 years of teaching experience, 19.1% had 5–10 years, 15.5% had 2–5 years, and 11.9% had less than 2 years (N = 168). Most participants (85.2%) use a traditional (in-person) course delivery format, with 8.28% indicating that they teach using a hybrid format and only 2.37% indicating that they teach online. The remaining 4.14% indicated that they use two or more of the course delivery formats.
The average number of courses taught by the participants between fall 2018 and summer 2019 was 3.73 courses, with a range of 1–10 courses taught during the period (N = 174). The most frequent response was that an instructor taught 2 courses during the period. Almost two-thirds of the participants (65.5%) indicated that they taught at least one introductory level course between fall 2018 and summer 2019, with 77.0% teaching at least one upper-level course, and 44.8% teaching at least one graduate-level course during the period (N = 174).
Results
The majority of results shown in this section pertain to Q1.2 of the survey, which asks participants to rate the frequency of use of a variety of in-class active learning teaching strategies within their courses (see supplemental material). The authors define “high-use” frequency as a participant selecting either the “occasionally” or “frequently” rating per active learning strategy.
Most common high-use active learning strategies.
Table 1 shows the percentage of participants within introductory (N = 114), upper-level undergraduate (N = 134), and graduate-level (N = 78) atmospheric science courses that use one or more of the listed active learning strategies at a frequency categorized as high-use. In all three categories, "case studies" exhibit the highest percentage of high use (57.0% for introductory, 66.4% for upper-level undergraduate, and 53.8% for graduate courses). The higher frequency of case study use is not particularly surprising, as such assignments are often used within a number of different disciplines, including atmospheric science, as a way to engage students in data analysis applied to specific weather and climate events of interest (e.g., Herreid 1994; Schultz 2010; Steeneveld and Vilà-Guerau de Arellano 2019). Interestingly, the only other strategies reported as high-use by 50% or more of participants were "think–pair–share" within introductory courses and "peer instruction" within graduate courses; all other active learning strategies were selected by less than 50% of participants.
Percentage of participants teaching courses at the "introductory," "upper-level" undergraduate, and "graduate" levels who report high use of each of the active learning strategies listed in the table, where high use is defined as a participant selecting either "occasionally" or "frequently" for that strategy. Strategies reported as high-use by 50% or more of responding participants are italicized and marked with an asterisk (*). See the supplemental material for a list of responses included as "other" for each of the three levels.
Figure 1 provides insight into how many unique active learning strategies participants use. In general, the number of active learning strategies used ranges from 1 to 9 for all three instructional categories. On average, participants teaching introductory courses use 3.32 active learning strategies, those teaching upper-level undergraduate courses use 3.06 strategies, and those teaching graduate-level courses use 2.54 strategies. Fifty percent or more of participants who teach introductory courses use up to 3 active learning strategies, and the same holds true for those who teach upper-level undergraduate courses; for graduate-level courses, 50% or more of participants use up to 2 strategies. While no study (to the authors' knowledge) quantifies the use of active learning strategies specifically within the atmospheric sciences, the results in Fig. 1 are in line with Macdonald et al. (2005), which showed that 50% of geosciences faculty incorporate interactive teaching strategies [also see Table 1 of Lund and Stains (2015)].
Percentage of participants using at least X number of active learning strategies at a high-use frequency, where X is the value on the x axis. Note that “other” is considered a unique active learning strategy. See text for more details.
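To make the high-use coding and the "at least X strategies" curves in Fig. 1 concrete, a minimal sketch is given below; the response format, ratings, and strategy names are hypothetical stand-ins for the actual Qualtrics export, not the study's data:

```python
from collections import Counter

HIGH_USE = {"occasionally", "frequently"}  # ratings counted as high-use

# Hypothetical responses: one dict (strategy -> frequency rating) per participant.
intro_responses = [
    {"case studies": "frequently", "think-pair-share": "occasionally", "concept maps": "rarely"},
    {"case studies": "occasionally", "peer instruction": "never"},
]

# Number of strategies each participant uses at a high-use frequency.
n_high_use = [sum(r in HIGH_USE for r in resp.values()) for resp in intro_responses]

# Percentage of participants using at least X strategies (the Fig. 1 curves).
counts = Counter(n_high_use)
total = len(n_high_use)
for x in range(1, max(n_high_use) + 1):
    pct = 100 * sum(c for n, c in counts.items() if n >= x) / total
    print(f"at least {x} strategies: {pct:.0f}%")

print(f"mean strategies per participant: {sum(n_high_use) / total:.2f}")
```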
Active learning strategies and teaching experience.
Figure 2a shows the cumulative3 number of unique high-use active learning strategies participants use within their instruction, partitioned by years of teaching experience (N = 167). The cohorts that incorporate a greater variety of active learning strategies relative to the study population overall are newer instructors (i.e., <2 years of teaching experience) and those who are likely posttenure (i.e., 5–10 years of teaching experience). The greatest diversity of high-use active learning activities occurs at the introductory level, with each cohort incorporating one additional unique strategy at the graduate level (Fig. 2b). Furthermore, the specific active learning strategies used by a majority of each subset of instructors reaffirm the popularity of case studies (cf. Table 1).
Depictions of high-use (i.e., at least occasionally) active learning strategies used by a majority (at least 50%) of faculty for different levels of teaching experience. The high-use strategies for the entire study population (labeled as “all”) are shown for comparison. (top) The cumulative number of unique active learning strategies for each cohort, and (bottom) specific activities for each cohort at each course level. Note that the specific strategies for the entire study population are not shown in the bottom panel for ease of viewing but are the same as the “10+ years” teaching cohort.
The results of this study do not directly reveal why particular teaching experience cohorts tend to incorporate more, as well as different, active learning activities more frequently than others. One possibility is that these instructors take advantage of various professional development and pedagogical resources more often, thus gaining ideas and training in how to incorporate more strategies. Figure 3, which shows the percentage of respondents participating in a variety of professional development opportunities (see Q1.4 of the survey in the supplemental material), indicates no consistent differences among the teaching experience subgroups. While newer and posttenure instructors report slightly higher rates of using journal articles, books, websites, and conference presentations relative to instructors within the 2–5-yr-experience cohort, these differences are not significant.
The percentage of respondents within a given teaching experience cohort who used different resources to implement active learning strategies within their classrooms.
Active learning strategy use: Male versus female instructors.
When partitioning the data by male versus female instructors, there is a notable difference in the cumulative number of unique high-use active learning strategies used by males and females (Fig. 4a). While males (N = 72, 86, and 52 for introductory, upper-level undergraduate, and graduate courses, respectively) reported "case studies" as their only high-use strategy, females (N = 35, 38, and 20 for introductory, upper-level undergraduate, and graduate courses, respectively) reported 5 unique high-use active learning strategies (Fig. 4b). In addition to case studies, females use "think–pair–share," "peer instruction," "concept maps," and "lecture tutorials" (the last of which exhibited high use by 50% or more of female participants only for introductory and graduate-level courses).
Depictions of high-use (i.e., at least occasionally) active learning strategies used by a majority (at least 50%) of male and female faculty. (top) The cumulative number of unique active learning strategies for each cohort, and (bottom) the specific activities for each cohort at each course level.
While the male and female groups had approximately the same percentage of respondents in the 2–5 years and 5–10 years of teaching experience cohorts, the male cohort was heavily weighted toward instructors with 10+ years of experience. Approximately 62.6% of the male respondents who answered Q1.2 had 10+ years of experience, compared to only 37.9% of the female respondents (N = 107 and N = 58, respectively). In addition, nearly a quarter of the female respondents for this question (24.1%) had less than 2 years of experience, compared with less than 5% of male respondents. While instructors with less than 2 years of experience exhibited greater use of active learning strategies relative to instructors with 10+ years of experience, differences in the number of strategies used between males and females are evident across all experience categories.
When examining the various professional development and pedagogical resources that instructors take advantage of by gender (Fig. 5), we find that males and females seek out resources related to teaching and learning with similar frequency. In this study, males are slightly more likely than females to seek out resources through informal conversations within their departments as well as through consultation with their institution’s center for teaching and learning. However, females are much more likely than their male counterparts to seek out resources through informal conversations outside their departments and to participate in professional development workshops.
The percentage of respondents (by gender) who used different resources to implement active learning strategies within their classrooms.
Discussion
Case studies.
It is clear that case studies are the most popular high-use active learning strategy within all three levels of atmospheric science instruction surveyed. Case studies are fairly straightforward to implement within the classroom given the vast amount of reanalysis and archived data available to instructors and students on the internet. That being said, the authors of this study are unaware of any research that has directly investigated the efficacy of case studies within the atmospheric sciences. For example, one could question whether case studies are truly effective in maximizing understanding of course material within atmospheric science courses or are used simply because instructors observed their peers implementing such exercises. "Personal practical theories," described as the use of an instructional strategy based on one's own educational experience or observation of one's peers (e.g., Gess-Newsome et al. 2003; Lund and Stains 2015), may factor into the popularity of case studies observed in this study. As noted in the introduction, understanding how instructors use a particular active learning strategy and why they choose it are important research questions, but they are outside the scope of this project and left for future work.
Importantly, it cannot be directly determined from the results of this study whether case studies are implemented in a way that constitutes active learning. For example, an instructor could teach a course that is primarily lecture based and, within a lecture, describe a case study of an atmospheric event; in this scenario, students are passive recipients of the information rather than taking an active role and engaging with the material. Herreid (2011) notes that the manner in which case studies are implemented is vital for retention of information. Case studies used within the framework of discussion assignments, group or individual case study projects, problem-based or team-based learning, or clicker questions within lecture lead to greater student retention of course material than an instructor simply referencing a case study within a lecture [e.g., see Fig. 1 "Cone of Learning" in Herreid (2011), adapted from Dale (1969)]. Determining how case studies are implemented is the first step in understanding their effectiveness within the atmospheric sciences.
In general, an active learning strategy can be classified as requiring students to participate in either "generative" or "active" work (e.g., Andrews et al. 2019). "Generative" work trains students to apply higher-order learning skills (e.g., "analyze," "evaluate," and "create"; see Armstrong 2020) toward developing an understanding of course topics. "Active" work, on the other hand, involves frequent, "low-stakes" activities that assess students' ability to recall important course concepts, requiring lower-order learning skills (e.g., "remember," "understand," and "apply"). This study cannot determine whether case studies engage atmospheric science students more often in generative or in active work. Investigating this (along with how case studies are implemented) would better inform the community on how best to implement case studies within atmospheric science courses.
Influence of teaching experience on implementing active learning strategies.
A key finding of this study is that the implementation of active learning strategies, and the extent to which they are used within the classroom, varies with teaching experience. Namely, relatively new instructors (i.e., <2 years of experience) and likely posttenure faculty (i.e., 5–10 years of experience) incorporate more active learning strategies relative to all instructors (Fig. 2). While the survey in this study did not ask about motivations or reasons for using various active learning strategies, prior research provides insight into why these differences are evident.
First, faculty motivation for including and updating courses with reformed teaching approaches is inherently complex and personal (e.g., Andrews and Lemons 2015; Riihimaki and Viskupic 2020). Rogers (2003) describes the overarching process by which innovative instructional changes are made as the "innovation-decision" process. The first step involves "discovering" a teaching strategy and learning more about it. Given that many new instructors (including graduate students) undergo various trainings as part of their onboarding, it is possible that the "new instructors" cohort is using active learning strategies at a higher rate simply due to exposure to new ideas. However, as shown in Fig. 3, atmospheric science instructors frequently use a variety of sources to learn about active learning strategies, including informal discussions, websites, books, journal articles, and professional development workshops, with relatively little dependence on level of teaching experience. This suggests that simple exposure is an insufficient explanation.
The discovery and learning period is followed by becoming "persuaded" to implement the strategy and then by a stage of testing it within the classroom, ultimately culminating in the adoption of a specific strategy within an instructor's pedagogy (Lund and Stains 2015). Unfortunately, the survey did not collect any data related to these intermediate steps before adoption, so it is unclear how the teaching experience cohorts differ at these stages or why. Nevertheless, these results are consistent with other studies in that simple awareness of reformed teaching methods does not translate neatly into implementation; factors such as departmental culture, disciplinary norms, and personal context are all important (e.g., Lund and Stains 2015; Sturtevant and Wheeler 2019).
One critical component not captured by the innovation-decision process that may shed additional light on differences in the implementation of active learning strategies by teaching experience is the time and effort required to implement a new approach. This has been shown to be the most frequently cited barrier to instructional change (Dancy and Henderson 2010; Riihimaki and Viskupic 2020). Riihimaki and Viskupic (2020) framed this barrier (among others) in terms of expectancy value theory (Walker and Symons 1997), which essentially states that motivation for change depends on weighing the perceived effort required against the value of the expected outcome. For example, "implementing a new teaching method may be labor intensive … but if the learning outcomes are dramatically improved or if the new method is particularly valued by the students, instructor, department, or institution, the instructor may still be motivated to adopt the new method" (Riihimaki and Viskupic 2020).
One of the more significant drivers of how instructors choose to spend their time relates to tenure and the balance and perceived importance of research versus teaching (Tang and Chamberlain 2003; Brownell and Tanner 2012). Given that two-thirds of participants in this study with 5–10 years of instructional experience were tenured associate professors, it is possible that the uptick in more frequent use of active learning strategies is tied to reduced pressure with respect to research productivity, allowing more time to focus on teaching. Even so, the perception that teaching activities carry little weight in promotion and tenure decisions remains strong at colleges and universities (e.g., Tang and Chamberlain 2003; Shapiro 2006; Walczyk et al. 2007; Riihimaki and Viskupic 2020). Implementing initiatives that improve teaching effectiveness is important for fostering pedagogical reform in the classroom (Brownell and Tanner 2012).
Gender.
Another key finding of this study is that female instructors use multiple active learning strategies with high frequency, while the majority of male instructors typically use only case studies (Fig. 4). Although the male cohort was heavily weighted toward instructors with 10+ years of experience, the disparity in the number of high-use active learning strategies between males and females persists. While the survey did not ask instructors what influences their decisions about learning strategies, the findings presented here are generally in line with prior research.
Numerous studies have shown that women are more likely than men to adopt interactive and engaging pedagogies. For example, a survey of 107 U.S. 4-yr colleges and universities revealed that female instructors were more likely to incorporate active learning in their classes than their male counterparts (Kuh et al. 2004, p. 27; Nelson Laird et al. 2011, p. 263; Bennett et al. 2005). While that survey included atmospheric science instructors, a breakdown of instructional practices by discipline was not provided; it was noted, however, that the gender gap in teaching styles can vary significantly by discipline. The results from the previous section therefore offer insight into differences in teaching style by gender within the atmospheric sciences. Kuh et al. (2004) also found, consistent with the present study, that as years of teaching experience increase, the likelihood that a faculty member uses active learning activities decreases.
While a large body of research demonstrates the benefits of active learning, it is important to note that this does not imply that female instructors are more skillful educators than their male counterparts, as research has not shown differences in student outcomes tied to instructor gender. Rather, studies indicate that women may be significantly more open to new pedagogical approaches and more knowledgeable about active learning strategies and, therefore, more likely to adopt them than their male counterparts (Henderson et al. 2012; Williams 2015). Further research into the reasons for this gender gap in atmospheric science is needed, as well as an investigation of any learning differences that arise due to different pedagogies (Nelson Laird et al. 2011).
Conclusions and future work
Even with substantial literature demonstrating the value of active learning within STEM courses, research that has specifically investigated active learning strategy use across U.S. institutions within the atmospheric sciences is limited. The results of this study, based on survey data collected throughout the atmospheric science college and university community, provide a baseline regarding the frequency of active learning strategy use within this field.
The key results that emerged from this study are the following:
- 1) Case studies are the most popular high-use active learning strategy across all levels of instruction, though how they are implemented within the classroom across the atmospheric science community is not clear.
- 2) New atmospheric science instructors (i.e., <2 years of experience) as well as instructors just beyond the typical 5-yr tenure mark (i.e., 5–10 years of experience) exhibit the largest number of high-use active learning strategies.
- 3) The majority of female atmospheric science instructors surveyed use several active learning strategies at a high-use frequency, while the majority of male instructors only use case studies.
The results presented above provide a more comprehensive baseline regarding atmospheric science active learning use and professional development opportunities. While the findings of this study do provide initial insight as to who is using active learning strategies with high frequency, several follow-up questions about active learning use and implementation arise that require further investigation.
The most significant limitation of this study is its limited sample size. Recall that the study response rate was 27.9%, with a maximum of N = 211 participants for any given question. The sample was smaller still for some questions when data were partitioned by type of course instructed, years of instructional experience, and sex/gender. Also, the majority of respondents worked at doctoral institutions, biasing the data toward instructional experiences within typically research-intensive departments. Given that the number of atmospheric science instructors within the United States is small compared to other STEM fields (e.g., biology, chemistry, engineering, physics, mathematics), a higher participation rate is required to ultimately improve overall understanding of active learning use within this field.
As mentioned earlier, several challenges arise with online survey response rate, such as survey “burnout” and optimizing the number of notifications to prospective participants regarding survey completion (e.g., Sax et al. 2003; Muñoz-Leiva et al. 2010; Saleh and Bista 2017; Liu and Wronski 2018). Along with online survey response challenges, it has been shown that, when instructors self-report, they overestimate the frequency with which they use active learning (e.g., Fung and Chow 2002; Ebert-May et al. 2011). Therefore, future work requires consideration of alternative modes of data collection to help alleviate these issues.
A more systematic and reliable (though labor-intensive) alternative to self-reporting is direct classroom observation (Budd et al. 2013). Acknowledging the benefits and drawbacks of each approach, more recent studies have opted to blend instructor self-reporting with classroom observations (Ryker 2014; Teasdale et al. 2017). Given that this study relies on self-reported information, future work will require direct observation of atmospheric science instructors within the classroom to determine whether the results of this study hold with respect to frequency of active learning use. This includes utilizing mixed modes of data collection (e.g., instructor and/or student interviews, collection of student work samples) to develop a more comprehensive understanding of how active learning strategies are implemented and the motivations for using certain strategies. Furthermore, additional classroom observations and follow-up interviews will be needed to directly address the "why" of the observed differences between male and female faculty and among instructors with differing levels of teaching experience.
Research tools exist that can help accomplish the above. One example is the Reformed Teaching Observation Protocol (RTOP), which an observer can use to measure the implementation of reformed teaching methods, including active learning strategies (e.g., Piburn and Sawada 2000). Another is the Classroom Observation Protocol for Undergraduate STEM (COPUS; e.g., Smith et al. 2013; Stains et al. 2018), in which trained observers code student and instructor behaviors in 2-min increments while observing the course live. Either of these observation tools would provide a more comprehensive understanding of active learning application within atmospheric science courses, though they require more time and resources. Such tools will therefore be used in future work by the authors to develop a better understanding of the state of active learning in this field.
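As a rough illustration only of the kind of interval coding COPUS entails, the sketch below bins a hypothetical observation log into 2-min intervals; the behavior labels are placeholders, not the official COPUS codes, and the data structure is an assumption for illustration.

```python
from collections import defaultdict

INTERVAL = 2 * 60  # 2-min coding intervals, in seconds

# Hypothetical observation log: (seconds into class, actor, behavior label).
# The labels below are illustrative placeholders, not official COPUS codes.
observations = [
    (30, "instructor", "lecturing"),
    (95, "students", "listening"),
    (150, "instructor", "posing question"),
    (200, "students", "group work on case study"),
]

coded = defaultdict(set)
for t, actor, behavior in observations:
    coded[(t // INTERVAL, actor)].add(behavior)  # assign each behavior to its 2-min bin

for (interval, actor), behaviors in sorted(coded.items()):
    start = 2 * interval
    print(f"min {start}-{start + 2} | {actor}: {', '.join(sorted(behaviors))}")
```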
Regarding case studies, it is important to know how this strategy is implemented within atmospheric science courses. For example, case studies used within a "generative" active learning framework may be more effective than those used within an "active" framework. In general, atmospheric science courses should strive to incorporate generative work as much as possible to give students the opportunity to develop higher-order thinking skills, which will enhance their understanding of atmospheric science and better prepare them for their careers. Moreover, an instructor simply referencing a case study within a lecture will lead to a different learning experience than one in which students take the lead in applying course material and analyzing data themselves.
Given that this study has shown that case studies are the most popular active learning strategy in the atmospheric sciences, it is vital that they be used as effectively as possible. Research tools that require direct observation of instructor and student activity within the classroom would be an effective way to determine how case studies are implemented across U.S. institutions. Future work will investigate this, as well as whether case studies serve as learning tools guiding generative (rather than active) work.
Last, this paper did not address the results collected regarding the frequency of use of various assessment tools within atmospheric science courses. The type of assessment tools used within a STEM course can impact its effectiveness. For example, Cotner and Ballen (2017) showed that, in introductory biology courses, female students perform better relative to their male peers when mixed modes of assessment are used. This suggests that thoughtful choices in course assessment may play a role in closing the gender gap within a science course. The assessment results from the survey conducted in this study can help characterize the "state" of assessment use in the atmospheric sciences, a first step toward understanding what (if any) reform is necessary to improve atmospheric science course assessment.
1 A similar question asking participants to assess the frequency of use of various types of assessments followed; however, assessment results are not discussed in this study (see the "Conclusions and future work" section).
2 Other studies, such as Gannet Hallar et al. (2015) and Egger et al. (2019), have examined demographics of faculty in the geosciences more broadly, which includes (but does not specifically separate out) the atmospheric sciences.
3 In this study, the cumulative value is calculated by first determining the number of high-use active learning strategies used by 50% or more of participants that teach introductory courses. Next, the same calculation is performed for participants teaching upper-level undergraduate courses; if an active learning strategy not already counted in the introductory results exhibits 50% or more high use, the cumulative total increases, and otherwise the count remains the same. The calculation is then repeated for participants teaching graduate courses.
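The calculation described in this footnote can be expressed as a short sketch (the strategy sets shown are hypothetical, and the 50% majority threshold is assumed to have been applied already at each course level):

```python
def cumulative_unique_strategies(majority_high_use):
    """Cumulative count of unique majority high-use strategies across course levels."""
    seen = set()
    cumulative = []
    for level in ("introductory", "upper-level undergraduate", "graduate"):
        seen |= majority_high_use.get(level, set())  # add only strategies not yet counted
        cumulative.append((level, len(seen)))
    return cumulative

# Hypothetical example for one teaching-experience cohort.
example = {
    "introductory": {"case studies", "think-pair-share"},
    "upper-level undergraduate": {"case studies"},
    "graduate": {"case studies", "peer instruction"},
}
print(cumulative_unique_strategies(example))
# [('introductory', 2), ('upper-level undergraduate', 2), ('graduate', 3)]
```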
Acknowledgments.
The authors thank all participants who completed the survey for this study as well as all anonymous reviewers assigned to this study for their constructive feedback.
Data availability statement.
All data for this study were collected using Qualtrics Survey Software (www.qualtrics.com/). Readers are encouraged to contact the authors if interested in utilizing study data.
References
Andrews, T. C., and P. P. Lemons , 2015: It’s personal: Biology instructors prioritize personal evidence over empirical evidence in teaching decisions. CBE Life Sci. Educ., 14, 7, https://doi.org/10.1187/cbe.14-05-0084.
Andrews, T. C., A. J. J. Auerbach, and E. F. Grant , 2019: Exploring the relationship between teacher knowledge and active-learning implementation in large college biology courses. CBE Life Sci. Educ., 18, 48, https://doi.org/10.1187/cbe.19-01-0010.
Armstrong, P. , 2020: Bloom’s taxonomy. Vanderbilt University Center for Teaching, accessed 21 July 2020, https://cft.vanderbilt.edu/guides-sub-pages/blooms-taxonomy/.
Auerbach, A. J. J., and T. C. Andrews , 2018: Pedagogical knowledge for active-learning instruction in large undergraduate biology courses: A large-scale qualitative investigation of instructor thinking. Int. J. STEM Educ., 5, 19, https://doi.org/10.1186/s40594-018-0112-9.
Barrett, B. S., and J. E. Woods , 2012: Using the amazing atmosphere to foster student learning and interest in meteorology. Bull. Amer. Meteor. Soc., 93, 315– 323, https://doi.org/10.1175/BAMS-D-11-00020.1.
Bennett, D., and Coauthors, 2005: Exploring different dimensions of student engagement. National Survey of Student Engagement Rep., 56 pp.
Brownell, S. E., and K. D. Tanner , 2012: Barriers to faculty pedagogical change: Lack of training, time, incentives, and … tensions with professional identity? CBE Life Sci. Educ., 11, 339– 346, https://doi.org/10.1187/cbe.12-09-0163.
Budd, D. A., K. J. van der Hoeven Kraft, D. A. McConnell, and T. Vislova , 2013: Characterizing teaching in introductory geology courses: Measuring classroom practices. J. Geosci. Educ., 61, 461– 475, https://doi.org/10.5408/12-381.1.
Charlton-Perez, A. , 2013: Problem-based learning approaches in meteorology. J. Geosci. Educ., 61, 12– 19, https://doi.org/10.5408/11-281.1.
Coleman, J., and M. Mitchell , 2014: Active learning in the atmospheric science classroom and beyond through high-altitude ballooning. J. Coll. Sci. Teach., 44, 26– 30, https://doi.org/10.2505/4/jcst14_044_02_26.
Cotner, S., and C. J. Ballen , 2017: Can mixed assessment methods make biology classes more equitable? PLOS ONE, 12, e0189610, https://doi.org/10.1371/journal.pone.0189610.
Croft, P. J., and J. Ha , 2014: The undergraduate “consulting classroom”: Field, research, and practicum experiences. Bull. Amer. Meteor. Soc., 95, 1603– 1612, https://doi.org/10.1175/BAMS-D-13-00045.1.
Crouch, C. H., and E. Mazur , 2001: Peer instruction: Ten years of experience and results. Amer. J. Phys., 69, 970– 977, https://doi.org/10.1119/1.1374249.
Cutrim, E. M., D. Rudge, K. Kits, J. Mitchell, and R. Nogueira , 2006: Changing teaching techniques and adapting new technologies to improve student learning in an introductory meteorology and climate course. Adv. Geosci., 8, 11– 18, https://doi.org/10.5194/adgeo-8-11-2006.
Dale, E. , 1969: Audio-Visual Methods in Teaching. Rinehart and Winston and Dryden Press, 719 pp.
Dancy, M., and C. Henderson , 2010: Pedagogical practices and instructional change of physics faculty. Amer. J. Phys., 78, 1056– 1063, https://doi.org/10.1119/1.3446763.
Davenport, C. E. , 2018: Evolution in student perceptions of a flipped classroom in a computer programming course. J. Coll. Sci. Teach., 47, 30– 35, https://doi.org/10.2505/4/jcst18_047_04_30.
Davenport, C. E. , 2019: Using worked examples to improve student understanding of atmospheric dynamics. Bull. Amer. Meteor. Soc., 100, 1653– 1664, https://doi.org/10.1175/BAMS-D-18-0226.1.
Dohaney, J., E. Brogt, and B. Kennedy , 2012: Successful curriculum development and evaluation of group work in an introductory mineralogy laboratory. J. Geosci. Educ., 60, 21– 33, https://doi.org/10.5408/10-212.1.
Ebert-May, D., T. L. Derting, J. Hodder, J. L. Momsen, T. M. Long, and S. E. Jardeleza , 2011: What we say is not what we do: Effective evaluation of faculty professional development programs. BioScience, 61, 550– 558, https://doi.org/10.1525/bio.2011.61.7.9.
Eddy, S. L., and K. A. Hogan , 2014: Getting under the hood: How and for whom does increasing course structure work? CBE Life Sci. Educ., 13, 453– 468, https://doi.org/10.1187/cbe.14-03-0050.
Egger, A. E., K. Viskupic, and E. R. Iverson , 2019: Results of the National Geoscience Faculty Survey (2004–2016). National Association of Geoscience Teachers Rep., 82 pp.
Ernst, H., and K. Colthorpe , 2007: The efficacy of interactive lecturing for students with diverse science backgrounds. Adv. Physiol. Educ., 31, 41– 44, https://doi.org/10.1152/advan.00107.2006.
Fairweather, J. , 2009: Linking evidence and promising practices in science, technology, engineering, and mathematics (STEM) undergraduate education. NRC Board of Science Education Rep., 31 pp., www.nsf.gov/attachments/117803/public/Xc--Linking_Evidence--Fairweather.pdf.
Faust, J., and D. R. Paulson , 1998: Active learning in the college classroom. J. Excellence Coll. Teach., 9, 3– 24.
Fencl, H., and K. Scheel , 2005: Engaging students: An examination of the effects of teaching strategies on self-efficacy and course climate in a nonmajors physics course. J. Coll. Sci. Teach., 35, 20– 24.
Francek, M. , 2006: Promoting discussion in the science classroom using gallery walks. J. Coll. Sci. Teach., 36, 27– 31.
Freeman, S., S. L. Eddy, M. McDonough, M. K. Smith, N. Okoroafor, H. Jordt, and M. P. Wenderoth , 2014: Active learning increases student performance in science, engineering, and mathematics. Proc. Natl. Acad. Sci. USA, 111, 8410– 8415, https://doi.org/10.1073/pnas.1319030111.
Fung, L., and L. P. Y. Chow , 2002: Congruence of student teachers’ pedagogical images and actual classroom practices. Educ. Res., 44, 313– 321, https://doi.org/10.1080/0013188022000031605.
Gannet Hallar, A., L. Avallone, H. Thiry, and L. M. Edwards , 2015: ASCENT, a discipline-specific model to support the retention and advancement of women in science. Women in the Geosciences: Practical, Positive Practices Toward Parity, M. A. Holmes, S. O’Connell, and K. Dutt , Eds., John Wiley and Sons, Inc., 135– 148.
Gess-Newsome, J., S. A. Southerland, A. Johnston, and S. Woodbury , 2003: Educational reform, personal practical theories, and dissatisfaction: The anatomy of change in college science teaching. Amer. Educ. Res. J., 40, 731– 767, https://doi.org/10.3102/00028312040003731.
Goldsmith, D. W. , 2011: A case-based curriculum for introductory geology. J. Geosci. Educ., 59, 119– 125, https://doi.org/10.5408/1.3604824.
Haak, D. C., J. HilleRisLambers, E. Pitre, and S. Freeman , 2011: Increased structure and active learning reduce the achievement gap in introductory biology. Science, 332, 1213– 1216, https://doi.org/10.1126/science.1204820.
Hake, R. R. , 1998: Interactive-engagement versus traditional methods: A six-thousand-student survey of mechanics test data for introductory physics courses. Amer. J. Phys., 66, 64– 74, https://doi.org/10.1119/1.18809.
Henderson, C., and H. Dancy , 2007: Barriers to the use of research-based instructional strategies: The influence of both individual and situational characteristics. Phys. Rev. Spec. Top. Phys. Educ. Res., 3, 020102, https://doi.org/10.1103/PhysRevSTPER.3.020102.
Henderson, C., H. Dancy, and M. Niewiadomska-Bugaj , 2012: Use of research-based instructional strategies in introductory physics: Where do faculty leave the innovation-decision process? Phys. Rev. Spec. Top. Phys. Educ. Res., 8, 020104, https://doi.org/10.1103/PhysRevSTPER.8.020104.
Herreid, C. F. , 1994: Case studies in science—A novel method of science education. J. Coll. Sci. Teach., 23, 221– 229.
Herreid, C. F. , 2011 : Case study teaching. New Dir. Teach. Learn., 2011, 31– 40, https://doi.org/10.1002/tl.466.
Jaeger, A. J., T. F. Shipley, and S. J. Reynolds , 2017: The roles of working memory and cognitive load in geoscience learning. J. Geosci. Educ., 65, 506– 518, https://doi.org/10.5408/16-209.1.
Knight, J. K., and W. B. Wood , 2005: Teaching more by lecturing less. Cell Biol. Educ., 4, 298– 310, https://doi.org/10.1187/05-06-0082.
Kortz, K. M., J. J. Smay, and D. P. Murray , 2008: Increasing learning in introductory geoscience courses using lecture tutorials. J. Geosci. Educ., 56, 280– 290, https://doi.org/10.5408/1089-9995-56.3.280.
Krejcie, R. V., and D. W. Morgan , 1970: Determining sample size for research activities. Educ. Psychol. Meas., 30, 607– 610, https://doi.org/10.1177/001316447003000308.
Kuh, G. D., T. F. Nelson Laird, and P. D. Umbach , 2004: Aligning faculty and student behavior: Realizing the promise of greater expectations. Lib. Educ., 90, 24– 31.
Liu, M., and L. Wronski , 2018: Examining completion rates in web surveys via over 25,000 real-world surveys. Soc. Sci. Comput. Rev., 36, 116– 124, https://doi.org/10.1177/0894439317695581.
Lund, T. J., and M. Stains , 2015: The importance of context: An exploration of factors influencing the adoption of student-centered teaching among chemistry, biology, and physics faculty. Int. J. STEM Educ., 2, 13, https://doi.org/10.1186/s40594-015-0026-8.
Macdonald, R. H., C. A. Manduca, D. W. Mogk, and B. J. Tewksbury , 2005: Teaching methods in undergraduate geoscience courses: Results of the 2004 On the Cutting Edge survey of U.S. faculty. J. Geosci. Educ., 53, 237– 252, https://doi.org/10.5408/1089-9995-53.3.237.
MacPhee, D., and S. S. Canetto , 2015: Women in academic atmospheric sciences. Bull. Amer. Meteor. Soc., 96, 59– 67, https://doi.org/10.1175/BAMS-D-12-00215.1.
McConnell, D. A., D. N. Steer, and K. D. Owens , 2003: Assessment and active learning strategies for introductory geology courses. J. Geosci. Educ., 51, 205– 216, https://doi.org/10.5408/1089-9995-51.2.205.
McConnell, D. A., and Coauthors, 2006: Using conceptests to assess and improve student understanding in introductory geoscience courses. J. Geosci. Educ., 54, 61– 68, https://doi.org/10.5408/1089-9995-54.1.61.
McConnell, D. A., D. L. Chapman, C. D. Czajka, J. P. Jones, K. D. Ryker, and J. Wiggen , 2017: Instructional utility and learning efficacy of common active learning strategies. J. Geosci. Educ., 65, 604– 625, https://doi.org/10.5408/17-249.1.
Michael, J. , 2007: Faculty perceptions about barriers to active learning. Coll. Teach., 55, 42– 47, https://doi.org/10.3200/CTCH.55.2.42-47.
Mora, G. , 2010: Peer instruction and lecture tutorials equally improve student learning in introductory geology classes. J. Geosci. Educ., 58, 286– 296, https://doi.org/10.5408/1.3559693.
Muñoz-Leiva, F., J. Sánchez-Fernández, F. Montoro-Ríos, and J. Á. Ibáñez-Zapata , 2010: Improving the response rate and quality in web-based surveys through the personalization and frequency of reminder mailings. Qual. Quant., 44, 1037– 1052, https://doi.org/10.1007/s11135-009-9256-5.
National Research Council, 2012: Discipline-Based Education Research: Understanding and Improving Learning in Undergraduate Science and Engineering. National Academies Press, 282 pp., https://doi.org/10.17226/13362.
Nelson Laird, T. F., A. K. Garver, and A. S. Niskode-Dossett , 2011: Gender gaps in collegiate teaching style: Variations by course characteristics. Res. Higher Educ., 52, 261– 277, https://doi.org/10.1007/s11162-010-9193-0.
NWA, 2020: NWA university degree listing degree programs in meteorology or atmospheric science. Accessed 10 December 2020, https://nwas.org/membership/committees/education/colleges-universities/.
Petrunich-Rutherford, M. L., and F. Daniel , 2019: Collaborative quizzes: Impact on student performance and attendance. Teach. Psychol., 46, 115– 120, https://doi.org/10.1177/0098628319834172.
Piburn, M., and D. Sawada , 2000: Reformed teaching observation protocol (RTOP): Reference manual. ACEPT Tech. Rep. IN00-3, 41 pp., www.public.asu.edu/~anton1/AssessArticles/Assessments/Chemistry%20Assessments/RTOP%20Reference%20Manual.pdf.
Prince, M. , 2004: Does active learning work? A review of the research. J. Eng. Educ., 93, 223– 231, https://doi.org/10.1002/j.2168-9830.2004.tb00809.x.
Rao, S. P., H. L. Collins, and S. E. DiCarlo , 2002: Collaborative testing enhances student learning. Adv. Physiol. Educ., 26, 37– 41, https://doi.org/10.1152/advan.00032.2001.
Riihimaki, C. A., and K. Viskupic , 2020: Motivators and inhibitors to change: Why and how geoscience faculty modify their course content and teaching methods. J. Geosci. Educ., 68, 115– 132, https://doi.org/10.1080/10899995.2019.1628590.
Rogers, E. M. , 2003: Diffusion of Innovations. Simon and Schuster, 576 pp.
Ryker, K. , 2014: An evaluation of classroom practices, inquiry and teaching beliefs in introductory geoscience classrooms. Ph.D. dissertation, North Carolina State University, 144 pp.
Saleh, A., and K. Bista , 2017: Examining factors impacting online survey response rates in educational research: Perceptions of graduate students. J. Multidiscip. Eval., 13, 63– 74.
Sawyer, D. S., A. T. Henning, S. Shipp, and R. W. Dunbar , 2005: A data rich exercise for discovering plate boundary processes. J. Geosci. Educ., 53, 65– 74, https://doi.org/10.5408/1089-9995-53.1.65.
Sax, L. J., S. K. Gilmartin, and A. N. Bryant , 2003: Assessing response rates and nonresponse bias in web and paper surveys. Res. Higher Educ., 44, 409– 432, https://doi.org/10.1023/A:1024232915870.
Schultz, D. M. , 2010: How to research and write effective case studies in meteorology. Electron. J. Severe Storms Meteor., 5 ( 2), https://ejssm.org/archives/2010/vol-5-2-2010/.
Shadle, S. E., A. Marker, and B. Earl , 2017: Faculty drivers and barriers: Laying the groundwork for undergraduate STEM education reform in academic departments. Int. J. STEM Educ., 4, 8, https://doi.org/10.1186/s40594-017-0062-7.
Shapiro, H. N. , 2006: Promotion & tenure & the scholarship of teaching & learning. Change, 38, 38– 43, https://doi.org/10.3200/CHNG.38.2.38-43.
Smith, M. K., F. H. M. Jones, S. L. Gilbert, and C. E. Wieman , 2013: The Classroom Observation Protocol for Undergraduate STEM (COPUS): A new instrument to characterize university STEM classroom practices. CBE Life Sci. Educ., 12, 618– 627, https://doi.org/10.1187/cbe.13-08-0154.
Stains, M., and Coauthors, 2018: Anatomy of STEM teaching in North American universities. Science, 359, 1468– 1470, https://doi.org/10.1126/science.aap8892.
Steeneveld, G., and J. Vilà-Guerau de Arellano , 2019: Teaching atmospheric modeling at the graduate level: 15 years of using mesoscale models as educational tools in an active learning environment. Bull. Amer. Meteor. Soc., 100, 2157– 2174, https://doi.org/10.1175/BAMS-D-17-0166.1.
Sturtevant, H., and L. Wheeler , 2019: The STEM Faculty Instructional Barriers and Identity Survey (FIBIS): Development and exploratory results. Int. J. STEM Educ., 6, 35, https://doi.org/10.1186/s40594-019-0185-0.
Tanamachi, R. L., D. T. Dawson, and L. C. Parker , 2020: Students of Purdue Observing Tornadic Thunderstorms for Research (SPOTTR): A severe storms field work course at Purdue University. Bull. Amer. Meteor. Soc., 101, E847– E868, https://doi.org/10.1175/BAMS-D-19-0025.1.
Tang, T. L.-P., and M. Chamberlain , 2003: Effects of rank, tenure, length of service, and institution on faculty attitudes toward research and teaching: The case of regional state universities. J. Educ. Bus., 79, 103– 110, https://doi.org/10.1080/08832320309599097.
Teasdale, R., K. Viskupic, J. K. Bartley, D. McConnell, C. Manduca, M. Bruckner, D. Farthing, and E. Iverson , 2017: A multidimensional assessment of reformed teaching practice in geoscience classrooms. Geosphere, 13, 608– 627, https://doi.org/10.1130/GES01479.1.
Umbach, P. D., and M. R. Wawrzynski , 2005: Faculty do matter: The role of college faculty in student learning and engagement. Res. Higher Educ., 46, 153– 184, https://doi.org/10.1007/s11162-004-1598-1.
Vázquez-García, M. , 2018: Collaborative-group testing improves learning and knowledge retention of human physiology topics in second-year medical students. Adv. Physiol. Educ., 42, 232– 239, https://doi.org/10.1152/advan.00113.2017.
Walczyk, J. J., L. L. Ramsey, and P. Zha , 2007: Obstacles to instructional innovation according to college science and mathematics faculty. J. Res. Sci. Teach., 44, 85– 106, https://doi.org/10.1002/tea.20119.
Walker, C. J., and C. Symons , 1997: The meaning of human motivation. Teaching Well and Liking It: Motivating Faculty to Teach Effectively, J. L. Bess , Ed., Johns Hopkins University Press, 3– 18.
Wieman, C., K. Perkins, and S. Gilbert , 2010: Transforming science education at large research universities: A case study in progress. Change, 42, 6– 14, https://doi.org/10.1080/00091380903563035.
Wilke, R. R. , 2003: The effect of active learning on student characteristics in a human physiology course for nonmajors. Adv. Physiol. Educ., 27, 207– 223, https://doi.org/10.1152/advan.00003.2002.
Williams, C. , 2015: Examining openness to pedagogical change among secondary mathematics teachers: Developing and testing a structural model. Ph.D. dissertation, University of North Dakota, 148 pp., https://commons.und.edu/theses/1981.
Yilmaz, Y. , 2011: Task effects on focus on form in synchronous computer-mediated communication. Mod. Lang. J., 95, 115– 132, https://doi.org/10.1111/j.1540-4781.2010.01143.x.
Yuretich, R. F. , 2004: The effects of course redesign on an upper-level geochemistry course. J. Geosci. Educ., 52, 277– 283, https://doi.org/10.5408/1089-9995-52.3.277.
Yuretich, R. F., and L. C. Kanner , 2015: Examining the effectiveness of team-based learning (TBL) in different classroom settings. J. Geosci. Educ., 63, 147– 156, https://doi.org/10.5408/13-109.1.
Yuretich, R. F., S. A. Khan, R. M. Leckie, and J. J. Clement , 2001: Active-learning methods to improve student performance and scientific interest in a large introductory oceanography course. J. Geosci. Educ., 49, 111– 119, https://doi.org/10.5408/1089-9995-49.2.111.