Atmospheric dynamics can represent a significant hurdle for students; the need to successfully apply concepts from calculus and physics, as well as the sometimes counterintuitive nature of fluid flow, combine to produce frustration and suboptimal learning. Additionally, there is often an emphasis on equation derivations and theory, rather than real-world applications. A new approach for teaching atmospheric dynamics, known as worked examples, is discussed herein. This pedagogy resolves identified challenges in a few ways: 1) reducing the cognitive load of students by explicitly demonstrating (via an expert-constructed guide) how mathematics and physics are applied to the atmosphere; 2) utilizing (as much as appropriate) real-world scenarios to demonstrate how equations explain what we observe; and 3) providing opportunities for students to critically examine the scenario, the relevant math and physics, and the underlying theory via a series of self-explanation prompts throughout the example. This study provides detailed information on the creation and implementation of worked examples in the two-semester atmospheric dynamics course sequence at the University of North Carolina at Charlotte. Comparisons of performance between students who took the traditional lecture- and derivation-based course and those taught with the worked examples pedagogy reveal significant improvements with the new approach, especially for first-semester dynamics. Students also express deep satisfaction with the hands-on, application-based pedagogy.
The worked examples pedagogy guides students through assessments of real-world examples of weather phenomena to demonstrate and improve understanding of key atmospheric dynamics concepts.
The atmospheric dynamics course represents, for many, a challenging hurdle within the meteorology undergraduate major curriculum. The course presents difficulties for students for a number of reasons, including demanding prerequisites in calculus and physics, as well as the counterintuitive nature of dynamical concepts (Persson 2010). Furthermore, the traditional approach to teaching atmospheric dynamics lies in a series of equation derivations, emphasizing the mathematical and physical theory of a concept before getting to any tangible observations or applications. This approach is in stark contrast to meteorology students’ preferred learning style, as exemplified by the following student quotes from Roebber (2005):
We did equations all the time, derivations constantly, so we would think about why we were doing this. Spent five days doing a derivation, all math all the time, and wondering why were we doing that?
Another student further elaborated:
We’re taught equation one and then equation two…we’re just told this is equation one and this is equation two, derived from one. Do you understand? And that’s it. Some teachers directly relate theory and general application and some absolutely don’t and from my perspective, that makes it very difficult. But this is meteorology: it always has a direct application to the weather. Our field is obvious, visual, everyday scenarios, and for that reason, I think that every single thing you talk about has a direct connection.
Indeed, as many studies have shown, it is not just meteorology students, but novice (or inexperienced) learners who more broadly rely on and heavily prefer learning through examples and applications of content (e.g., Cooper and Sweller 1987; Anderson et al. 1997; Atkinson et al. 2000; Roebber 2005).
Additional challenges come into play when instructors ask students to perform complex tasks or learn multifaceted topics that require a blending of multiple skills or ideas; derivation of a dynamical equation, for example, entails the application of various algebraic and calculus concepts, as well as keen knowledge of the relevant physics and underlying meteorological phenomenon. While experts (i.e., instructors) have little difficulty demonstrating and combining such skills, when it comes to performing on their own, students can become easily overwhelmed, not knowing where to start, focusing on irrelevant details, or getting tripped up on small steps of the task (e.g., Ward and Sweller 1990). This excessive mental effort can result in students feeling frustrated, disheartened, and confused, thereby hindering or even preventing learning. Cognitive psychologists call this mental effort used to process information and make decisions cognitive load, consisting of three components: 1) intrinsic load, referring to the inherent difficulty of a topic in relation to the learner’s expertise; 2) extraneous load, related to challenges imposed by the manner in which information is presented; and 3) germane load, referring to effort needed to process and combine ideas into long-term memory (Sweller 1988; Sweller et al. 1998; Paas et al. 2003). While intrinsic load is challenging to reduce, particularly for complex topics with multiple interacting components, extraneous and germane loads can be modified via careful instructional design. Importantly, if extraneous load is reduced, then more effort can be put toward deeper, long-term understanding (i.e., a higher germane load; Sweller et al. 1998; Paas et al. 2003).
One proven method to reduce extraneous cognitive load and enhance student learning is known as worked examples. In essence, worked examples represent an expert-constructed guide that provides in-depth, step-by-step explanations of how to solve a problem or perform a complex task. Note, however, that these examples do not simply tell students what to do; rather, they are carefully constructed guides that leverage a wealth of educational ideas that improve learning. The specific components of a worked example will be described in more detail in the next section, but those that are well-constructed have proven to be effective in enhancing learning and problem solving skills in a variety of scientific disciplines, including mathematics (e.g., Sweller and Cooper 1985; Carroll 1994; Nathan et al. 1994), physics (e.g., Chi and Bassok 1989; Atkinson et al. 2000; van Gog et al. 2011), engineering (e.g., Chandler and Sweller 1991; Moreno et al. 2009; Barnes et al. 2011), chemistry (e.g., Crippen and Brooks 2009; Hesser and Gregory 2015), and statistics (e.g., Paas 1992; Quilici and Mayer 1996).
To the author’s knowledge, there has yet to be a published study demonstrating the effect of implementing a worked examples–centric course pedagogy in the field of atmospheric science. Given the strong student preference to learn from examples (particularly real-world examples; e.g., Roebber 2005) and the need to reduce extraneous cognitive load for novice learners performing complex tasks, the worked example pedagogy is a natural fit for teaching atmospheric dynamics. The goal of this article is to describe the creation and implementation of the worked examples pedagogy at the University of North Carolina at Charlotte, as well as the subsequent impact it had on student performance.
CONSTRUCTING WORKED EXAMPLES.
The specific format of a worked example can vary, yet care must be taken to ensure that their structure and composition work to reduce cognitive load and allow learning to take place. Perhaps one of the most important considerations is to ensure that relevant information is visually combined; examples that split students’ attention between different sources of information, thus requiring more work to mentally integrate them, are much less effective (e.g., Chandler and Sweller 1991; Sweller et al. 2011). Instead, it is recommended to visually integrate problem statements, equations, and diagrams, which has been shown to reduce extraneous cognitive load (e.g., Ward and Sweller 1990; Sweller 1994; Atkinson et al. 2000).
To support the learning process, a worked example must contain a clear structure and the reasoning behind the steps shown, as well as explicit identification of the relevant concept to prevent inaccurate assumptions. A coherent and logical flow to the discussion, along with enumerated or annotated subgoals along the way, is required to reduce extraneous load (e.g., Catrambone 1994; Atkinson et al. 2000). Together, these elements provide an expert-rooted schema for students to follow, rather than allowing them to develop incorrect or unphysical frameworks.
Another key component lies in asking the student to actively engage with the example. Namely, a worked example should be interspersed with self-explanation prompts, whereby students are asked targeted questions throughout the example that are designed to force readers to critically examine the given scenario (e.g., explaining why a particular step was taken or assumption was made), as well as target and correct common misunderstandings and misconceptions. Including questions that provide opportunities for students to complete small steps of the problem also encourages self-explanation. A wealth of previous research has shown that self-explanation results in enhanced problem-solving skills, as well as improvement in overall learning (e.g., Chi and Bassok 1989; Chi et al. 1994; Nathan et al. 1994; Atkinson et al. 2000, 2003; Crippen and Earl 2007).
Exposing students to multiple examples of each concept, particularly those that contain varying degrees of complexity, is further beneficial for students (e.g., Quilici and Mayer 1996; Atkinson et al. 2000). Similarly, pairing each worked example with practice problems for students to solve on their own further enhances their learning. Numerous studies have shown that the use of such an example–problem pair promotes more effective learning (e.g., Sweller and Cooper 1985; Cooper and Sweller 1987; Carroll 1994; Mwangi and Sweller 1998; Atkinson et al. 2003; van Gog et al. 2011).
To illustrate how some of these recommended components are combined, a sample worked example used in this study, demonstrating how to calculate geostrophic wind based on geopotential height data, is shown in Figs. 1–4. While this example is perhaps relatively simple, it does represent the first exposure for students to equations in the pressure coordinate system, thus providing a foundation for deeper and more complex applications (e.g., geostrophic adjustment leading to geostrophic equilibrium; Wampler et al. 1998; Persson 2002) covered elsewhere in the course in follow-up examples, in-class activities, or homework problems. Readers interested in viewing additional course materials are encouraged to contact the author.
Figure 1 illustrates the first page of the sample worked example, which includes a problem statement, relevant figure, and strategy to be employed in solving the problem. Note that this problem leverages real atmospheric data from upper-air radiosondes to make the analysis more authentic and rooted in reality for students; other examples used in this study have analyzed both routine weather data as well as data available from well-known events such as Hurricane Katrina or the Tri-State Tornado of 1925. Importantly, the problem statement is accompanied by an annotated map of the scenario, which should reduce extraneous cognitive load. Next, the stated strategy for this problem is designed to do a few things: 1) explicitly identify the relevant concept; 2) introduce the key equations (without a lengthy derivation); and 3) briefly explain the structure used to solve the problem moving forward. Figures 2 and 3 then explain step by step how the problem is solved and the individual calculations are made. Along the way, in this example, students respond to four self-explanation prompts, each promoting different critical thinking skills. For example, question 1 asks students to flex their vector calculus skills by expanding the relevant equation (such tasks are emphasized throughout the course), while question 2 asks them to explain why a given value is negative (Fig. 2). Students are also asked to perform small calculations and provide physical context for the final answer (Fig. 3). This entire process importantly forces students to slow down and explain to themselves the procedure for solving the problem, rather than immediately trying to plug in numbers in an equation. Following completion of the worked example, students are given the opportunity to apply what they have learned in a related application problem, which also uses real atmospheric data (Fig. 4). Ideally, students would follow the same steps outlined in the worked example to solve the application problem.
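The core calculation in this sample worked example, geostrophic wind speed from the horizontal gradient of geopotential height on a constant-pressure surface, can be sketched in a few lines of code. Note that this is an illustrative sketch only, not the author's course materials: the height change, distance, and latitude below are hypothetical placeholder values, not the radiosonde data used in the actual example.

```python
import math

OMEGA = 7.292e-5   # Earth's rotation rate (s^-1)
G = 9.81           # gravitational acceleration (m s^-2)

def coriolis(lat_deg):
    """Coriolis parameter f = 2 * Omega * sin(latitude)."""
    return 2.0 * OMEGA * math.sin(math.radians(lat_deg))

def geostrophic_speed(dz_m, dn_m, lat_deg):
    """Geostrophic wind speed |V_g| = (g / f) * |dZ/dn| in pressure
    coordinates, where dZ is the geopotential height change (m) over
    the horizontal distance dn (m) normal to the flow."""
    f = coriolis(lat_deg)
    return (G / f) * abs(dz_m) / dn_m

# Hypothetical values: a 60-m height change over 500 km at 45 deg N
speed = geostrophic_speed(60.0, 500e3, 45.0)
print(f"geostrophic speed ~ {speed:.1f} m/s")
```

Estimating the gradient by finite differences between station values, as done here, mirrors the step-by-step structure of the worked example: identify the relevant equation, evaluate each term, then interpret the magnitude physically.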
During the spring and fall semesters of 2017 and 2018 at the University of North Carolina at Charlotte, out of a desire to improve student learning, the atmospheric dynamics course sequence was dramatically modified to be oriented around worked examples; previous offerings of the course sequence in 2014, 2015, and 2016 were oriented around lectures and detailed equation derivations during class. Note that while the worked examples were being written and developed before spring 2017, they were not included in any manner in the lecture-based offerings of the course. Over the course of the study, a total of 28 (25) students were taught using the traditional lecture (worked examples) method for Dynamics I, while in Dynamics II, 22 students were taught with each method.
To implement the worked examples pedagogy, each dynamics concept was converted into one or more worked examples, depending on its complexity or depth. For example, a more straightforward and/or familiar concept (i.e., introduced in prior courses) such as the pressure gradient force was associated with a single worked example, while a very complex concept such as quasigeostrophic theory had several worked examples assigned over multiple class periods. Students were assigned to complete one to two (usually one) worked examples before each class period (including the first day of class, where the pedagogical approach was motivated and discussed with students) and were also expected to read the related textbook section; the goal of this pre-class work was for students to construct a basic understanding of concepts and how they are used, allowing for more in-depth investigation during class (i.e., the “flipped classroom” approach; Lage et al. 2000; Bergmann and Sams 2009; Berrett 2012). This approach allows for more active learning activities in class that develop deeper learning (e.g., Prince 2004). Right before the start of each class period, the instructor checked each student’s example(s) to assign up to 10 points based on completeness; as long as a reasonable attempt was made to answer each question and work through the application problem, credit was given. Incomplete responses or unanswered questions resulted in partial credit. The vast majority of students frequently completed the assigned examples before class; the average worked example score across four semesters was 9.65 out of 10.
Each 75-min class period consisted of three components. First, a brief (3–5 min) summary of the main concept was given by the instructor (i.e., a mini lecture) to lay a proper foundation for the class period and to emphasize key points that students should understand. These key points, along with the relevant equations and any important visuals, were listed on a single sheet of paper handed out to students, called “crib notes.” Since there was no formal, extended lecture during class, these crib notes also provided space for additional notes as desired, and gave something concrete that could be quickly referenced by students as a study tool.
Next, a discussion of the assigned worked example was facilitated by the instructor; the scenario and relevant strategy were briefly summarized, followed by discussion of the steps taken to solve the problem, along with student responses to the self-explanation prompts. This component was typically very interactive, both among the students and between the students and instructor, with many students offering their answers, raising additional questions, drawing pictures or equations on the board (at times prompted or unprompted by the instructor), and debating with one another. Depending on the group dynamics and the specific questions under discussion, the instructor's role during this time could be more active (e.g., encouraging student responses or asking clarifying questions) or less active (e.g., allowing students to debate and struggle with the material). This back and forth among students and the instructor continued until all students felt comfortable with the worked example, including the associated application problem, typically taking 30–45 min. This interactive component is vital to implementing worked examples, as oral discussion, in combination with the visual information in the example, has been shown to further reduce extraneous cognitive load (Mousavi et al. 1995).
The final component, taking up the remainder of the class period, consisted of students working on assigned in-class examples or activities designed to solidify the key concept and provide additional depth and practice. These activities varied widely in format and content, ranging from additional application problems (exposing students to more complex or different types of problems on the same concept) to more theoretical exercises, including derivations, consistent with students' preferred learning style (Roebber 2005). Students regularly worked together during this portion of the class to support peer learning (e.g., Crouch and Mazur 2001), with the instructor walking around the room to answer questions and provide guidance as needed. As time permitted, discussion of the assigned activities was led by the instructor to ensure students felt comfortable with the material and its additional applications.
Implementing worked examples had a marked effect on student performance, as summarized in Table 1. Overall course grades notably improved for students taught via the worked examples pedagogy; on average, these students performed 4–7 percentage points better than students in the lecture sections. Examining the distribution of final course grades further demonstrates this clear shift, with a larger fraction of worked examples students receiving final grades above 80% (Fig. 5). It is also evident that implementing worked examples led to varying degrees of improvement. For example, in Dynamics I, the bulk of students went from receiving grades in the 60s and 70s in the lecture sections to receiving grades in the 70s and 80s in the worked examples sections (Fig. 5a). In Dynamics II, however, the distribution shift is not as clean. Lecture-based students tended to have final grades in the 60s and 70s, while a more bimodal distribution was evident for worked examples students, with a plurality receiving final grades in the 60s even as a larger fraction of students received grades in the 80s and 90s (Fig. 5b). Indeed, a Welch’s t test indicates that worked examples students in Dynamics I performed significantly better (p = 0.014) than the lecture students, while in Dynamics II the smaller average improvement was not significant (p = 0.307).
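For readers unfamiliar with Welch's t test (appropriate here because the lecture and worked examples sections differ in size and may differ in variance), the test statistic and the Welch–Satterthwaite degrees of freedom can be computed as sketched below. The grade samples are hypothetical placeholders, not the study's actual data; the reported p-value would then come from the t distribution with df degrees of freedom (e.g., via a statistics library such as SciPy).

```python
import math
from statistics import mean, variance

def welch_t(sample_a, sample_b):
    """Welch's t statistic and Welch-Satterthwaite degrees of freedom
    for two independent samples with unequal sizes and variances."""
    n1, n2 = len(sample_a), len(sample_b)
    v1, v2 = variance(sample_a), variance(sample_b)  # sample variances
    se2 = v1 / n1 + v2 / n2                          # squared standard error
    t = (mean(sample_a) - mean(sample_b)) / math.sqrt(se2)
    df = se2**2 / ((v1 / n1)**2 / (n1 - 1) + (v2 / n2)**2 / (n2 - 1))
    return t, df

# Hypothetical final-grade samples (NOT the study's data)
worked  = [78.0, 81.0, 74.0, 69.0, 85.0, 80.0]
lecture = [62.0, 71.0, 68.0, 75.0, 66.0, 73.0]
t, df = welch_t(worked, lecture)
print(f"t = {t:.2f}, df = {df:.1f}")
```

Unlike Student's t test, Welch's version does not pool the variances, which makes it the safer default when comparing course sections of different sizes.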
The final course grade, while informative in a bulk sense, encompasses a number of different types of assessments, so it is beneficial to examine the impact of implementing worked examples on various components of the course. Overall, using worked examples almost always resulted in higher average scores in every category of assessment (final exam, midterm exams, homework, and quizzes) in both semesters of dynamics (Table 1). Even so, the significance of these changes varies with the category and with the semester of the course.
Performance on the final exam is perhaps the best measure of the effectiveness of the worked examples pedagogy; the exam is a cumulative assessment of all topics covered in a given semester, and importantly remained unchanged throughout this study (students were not handed back their final exam). Similar to the final course grades, the effect of worked examples was much more clear-cut in the Dynamics I final exam compared to Dynamics II. Indeed, students that learned from worked examples in Dynamics I performed nearly 15 percentage points higher on average on the final exam (statistically significant with p = 0.003; Table 1). However, these students tended to perform worse on the Dynamics II final exam compared to those in the lecture sections, though this difference is not statistically significant (p = 0.280).
The distribution of final exam scores further illustrates the disparate effect of worked examples on the two halves of the course. In Dynamics I, there is a clear upward shift, with a larger fraction of students receiving scores in the 70s, 80s, and 90s and a smaller fraction receiving 40s, 50s, and 60s (Fig. 6a). In Dynamics II, however, a larger fraction of worked examples students received final exam grades in the 40s and 50s, thus contributing to the lower average (Fig. 6b). While there is not an obvious explanation for worked examples leading to poorer performance on the final exam, there are a couple of possibilities. For example, the content covered in Dynamics II is generally more difficult and multifaceted; complex topics (such as quasigeostrophic theory) are more challenging to condense into a limited number of well-constructed, well-explained examples. As a result, students could attain lesser or more incomplete understanding of those topics, thus not receiving the intended benefits. Another possibility is something known as the expertise reversal effect, wherein the supports provided by an instructional intervention technique (such as worked examples) become less effective or produce negative effects; with higher levels of incoming subject expertise, a more experienced learner can have a higher cognitive load trying to reconcile the provided guidance with their existing knowledge (Kalyuga et al. 2003). Notably, the worked examples pedagogy is designed with inexperienced learners in mind, so it may be that students do not need as much explicit support or guidance in the second half of the atmospheric dynamics course, since they have already learned many of the basics in Dynamics I. In either case, the author intends to modify and update the content and structure of the worked examples in the future to increase their efficacy for students.
In addition to the final exam, students were also assessed using two midterm exams. While these exams were modified over the course of this study, the changes were mostly superficial, with topical content and type of problems remaining the same. As shown in Table 1, worked examples students performed on average over 8 points better than the lecture students in Dynamics I; this improvement was statistically significant (p = 0.023). In Dynamics II, performance was quite similar, with no significant difference (Table 1).
A series of homework assignments were given throughout each semester of the course, largely consisting of problem sets of similar format to the worked examples and application problems; however, equation derivations as well as detailed physical explanations were also a regular component of these assignments. Given this increased exposure to solving problems and guidance in using various math skills, students in the worked examples section performed 5–8 points higher on average on their homework assignments (Table 1). The effect was not significant (p = 0.148) in Dynamics I but was significant in Dynamics II (p = 0.018).
The final category of assessment for the course was a number of quizzes interspersed between the midterm and final exams, the format of which was similar to the exams, but much shorter in length. Thus, as in the results comparing exam performance, students subject to the worked examples method performed significantly better (10 points on average; p = 0.003) on quizzes in Dynamics I but only slightly better (1.5 points on average; p = 0.704) in Dynamics II.
It is clear that the introduction of worked examples provided a significant boost for students in enhancing their problem solving skills and overall learning of atmospheric dynamics. While the improvement was not universal across semesters or different categories of assessment, the overall effect was undoubtedly positive and beneficial for students.
Any time a new pedagogical approach is embraced in the classroom, it is necessary to assess student response to the changes to ensure it is viewed in a positive light, allowing for a smoother implementation and continued student buy-in. In the case of worked examples, students overwhelmingly supported its use in atmospheric dynamics. On end of semester evaluations throughout this study, students often mentioned enjoying and benefiting from the collaborative nature of this pedagogical method. As one student stated,
Personally, I enjoy the structure of the class. Being able to work on problems during class more than listening to a lecture is best for learning for me. Then if a question does arise then everyone is able to ask you and get a personal answer.
Another student echoed a similar sentiment, finding that learning comes more easily via worked examples:
I like that we don’t have to sit and watch a teacher write a whole bunch of notes on the board because the class is interactive and we all work on the examples together. I find material easier to learn when we are doing examples than reading notes or the textbook.
However, some students found the constant need to prepare for class, on top of homework assignments and exams, to be a bit burdensome. Even so, they saw the value in their efforts:
Worked examples are a lot…but they are pretty effective at helping you learn the material and I think it works better for this course than lecture style.
While student opinion is not the ultimate measure of the effectiveness of a pedagogical technique, it is reassuring that such a drastic shift in course design did not result in student resistance or other shifts in motivation or participation.
The worked examples pedagogy is an instructional approach that has proven to be effective in reducing cognitive load, thus allowing for more effort to be put into developing deeper long-term memory and understanding. Worked examples were constructed for each concept addressed in the two-semester atmospheric dynamics course sequence at the University of North Carolina at Charlotte, and implemented during the spring and fall semesters in 2017 and 2018. A flipped classroom approach was used, where students were required to complete one to two examples and read the associated section of the textbook before each class period to familiarize themselves with the applications of a particular concept. In-class time was then primarily used to discuss the assigned example(s) and work on additional problems, diving deeper with other applications and theory.
On average, students learning atmospheric dynamics via worked examples performed better than those learning via lecture in nearly every category of assessment in both Dynamics I and II; many of these improvements were statistically significant, particularly for Dynamics I (Table 1). Even so, the complex and multifaceted material associated with the second half of the course was challenging to condense into effective worked examples, potentially contributing to poorer performance on the final exam. Furthermore, increased experience with dynamics may have resulted in the examples being less effective for students in Dynamics II, thus hindering their learning (i.e., the expertise reversal effect). Work is ongoing to make improvements to account for these issues.
Readers should be encouraged that students greatly valued the worked examples approach. They were particularly appreciative of the more interactive nature of the class, strongly preferring it over lecture, as well as being able to see explicit examples of how dynamical equations and processes manifest in the real atmosphere. Finally, while this pedagogical method was applied to teaching atmospheric dynamics, it is hoped that others will be inspired to implement it in other quantitatively intensive courses that they teach. Those interested in seeing additional sample worked examples and course materials are encouraged to contact the author.
The author is indebted to her dynamics students over the years for being willing to try new things and helping her grow as an instructor. Helpful discussions and unwavering support from meteorology faculty at UNC Charlotte are also gratefully recognized. Insightful comments from anonymous reviewers also helped to clarify and improve this manuscript. This work received approval from the Institutional Review Board and was supported by funds provided by the Scholarship of Teaching and Learning Program at the University of North Carolina at Charlotte.