Professional meteorologists gain a great deal of knowledge through formal education, but two factors require ongoing learning throughout a career: professionals must apply their learning to the specific subdiscipline they practice, and the knowledge and technology they rely on become outdated over time. It is thus inherent in professional practice that much of the learning is more or less self-directed. While these principles apply to any aspect of meteorology, this paper applies the concepts to weather and climate forecasting, a subdiscipline for which learning resources range from plentiful to scarce. No matter the subdiscipline, the responsibility for identifying and pursuing opportunities for professional, lifelong learning falls to its members. Thus, it is critical that meteorologists periodically assess their ongoing learning needs and develop the ability to practice reflectively. The construct of self-directed learning, and how it has been implemented in similar professions, provides visions for how individual meteorologists can pursue, and how the profession can facilitate, the ongoing, self-directed learning efforts of meteorologists.
Directing one’s own lifelong learning is an important part of being a professional. The constantly evolving and growing field of meteorology could better facilitate those efforts.
Graduation from formal education is not the end but rather the beginning of an ongoing series of learning efforts that we all undertake to specialize in a subdiscipline and become an expert. Our first task is to overcome the school–job disconnect that Baum (1975) recognized when he argued to the World Meteorological Organization (WMO) that the role of universities was to teach generalized knowledge that professionals and their employers then refine and apply to a specific job. As that task is completed, a new and significant one begins: for a discipline to thrive, its members must develop their knowledge and skills to the expert level, that is, to acquire great skill or knowledge in a particular field (New Oxford American Dictionary, 3rd ed., s.v. “expert”). This second effort is generally considered to require 10,000 or more hours (Ericsson et al. 2006).
Beyond applying knowledge to a specific job and developing expertise, professionals face a third challenge: knowledge itself is changing, and what was once learned can later be forgotten if not used. Meteorology has advanced quickly, with dramatic changes just in the past 20 years (see examples in Table 1). If we do not make efforts to continually learn about the ever-occurring advancements in our field, we will quickly fall behind. Moreover, if we do not have ways to reinforce and practice what we have learned, our knowledge will atrophy and be forgotten. Easley (1937) highlighted this effect, known as the Ebbinghaus forgetting curve, and it was recently revalidated (Murre and Dros 2015). Easley also recognized the importance of deliberate practice in reinforcing knowledge, a theme upon which the present work builds a framework in the context of meteorology.
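The decay behind the forgetting curve is often summarized as an exponential. As a rough, qualitative sketch (the functional form and all parameter values here are assumptions for illustration, not fitted to any meteorologist's data), retention can be written R(t) = exp(-t/S), where S is a stability term that grows when knowledge is reinforced through deliberate practice:

```python
import math

def retention(t_days: float, stability: float) -> float:
    """Fraction of material retained t_days after learning, using the
    exponential form R(t) = exp(-t / stability) commonly used to
    summarize the Ebbinghaus forgetting curve. The stability parameter
    is hypothetical: larger values stand in for knowledge reinforced
    through deliberate practice."""
    return math.exp(-t_days / stability)

# Without reinforcement, a month is enough for most detail to fade...
no_review = retention(30, stability=5.0)

# ...whereas practice (modeled here simply as a larger stability)
# slows the decay substantially.
with_review = retention(30, stability=30.0)

assert no_review < with_review
```

The point of the sketch is qualitative rather than quantitative: the curve falls steeply in the absence of reinforcement, which is why deliberate practice and periodic review matter.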
Because the constructs in this paper may be clearest when applied to a particular subdiscipline, we have chosen to focus this paper on a particular subdomain that we have studied (e.g., LaDue 2011) and work in: weather forecasting. Weather forecasting, while changing in scope to encompass decision support and customized forecasts for private clients, remains one of the better studied and better supported subdisciplines of meteorology. Characteristics of expertise in weather forecasting, for example, have been identified (Pliske et al. 2004; Hoffman et al. 2017, chapter 6). Termed intuitive-based scientists by Pliske et al., those forecasters possessed highly visual representations of weather processes, made dynamic use of mental models of weather systems, demonstrated high-level skill in pattern recognition, and made flexible use of information sources. Only two of the military forecaster participants in Pliske et al.’s study, each of whom had considerable work experience, had reached this level. No other subdomains of meteorology have been studied in the expertise framework, to our knowledge.
The bottom line is clear: from the initial application of formal learning as a new hire to the development of expertise in your subdiscipline, all while keeping up with the latest developments in tools and knowledge, there is much learning to do throughout your career. We hope this paper provides motivation and guidance for any lifelong-learning meteorologist: how to identify your learning needs, how to reflect upon the way you have applied the knowledge you gained, and how to increase your professional mastery. We address two questions: How do meteorologists choose resources, strategies, and subjects to learn, and what can be done to assure meteorologists are learning effectively throughout their careers? This article also poses a third, only partially answered question: What should the role of meteorological professional societies be in setting standards for continued learning? In the absence of studies from within the field of meteorology, we introduce practices from other disciplines that have expended a great deal of resources studying the self-directed, continuing learning of their professionals.
THE SELF-DIRECTED LEARNING OF PROFESSIONALS.
The backbone of our framework is the notion that learning takes on a less formal structure during adulthood, customized to any given individual (Knowles 1975, p. 18):
In its broadest meaning, “self-directed learning” describes a process in which individuals take the initiative, with or without the help of others, in diagnosing their learning needs, formulating learning goals, identifying human and material resources for learning, choosing and implementing appropriate learning strategies, and evaluating learning outcomes.
Hammond and Collins (1991) later modified Knowles’s original definition to include acknowledgment that social and self-awareness often causes us, as adults, to realize our learning needs, and that reflection on and analysis of learning are important aspects of high-quality, effective self-directed learning (see “Critical self-directed learning” sidebar). It is this second definition that is emphasized in the continuing professional development literature in medicine (e.g., Davis et al. 2003), which is in many ways a similar applied science.
Hammond and Collins (1991, p. 13) created the phrase critical self-directed learning to “describe a process in which learners take the initiative, with the support and collaboration of others, for increasing self- and social awareness; critically analyzing and reflecting on their situations; diagnosing their learning needs with specific reference to competencies they have helped identify; formulating socially and personally relevant learning goals; identifying human and material resources for learning; choosing and implementing appropriate learning strategies; and reflecting on and evaluating their learning.”
Because even the nature of forecasting can vary quite a bit depending on geographic location, time scales of minutes to seasons, customer focus, and so on, it is appropriate to look at lifelong learning as something each forecaster will self-direct to some extent. Learning goals are often built after interacting with others, however, and it is unrealistic to expect adults to be consistently self-directing: the demands of modern life are simply too mentally taxing for us to always have the capacity to self-direct in all areas and at all times (Kegan 1994). We may not have the time or capacity to effectively think about our work.
FACILITATING AND CONDUCTING SELF-DIRECTED LEARNING.
Thinking about lifelong learning as a partially self-directed endeavor acknowledges the characteristics of autonomy we desire as professionals while clarifying that there is often an assisting role for others to play. One way to envision and organize the process of professional development is to 1) create an image of change, 2) determine learning needs, 3) identify appropriate human and material resources, 4) engage in learning strategies that are effective and appropriate for the particular subject, and 5) reflect upon and assess learning (Fig. 1). Self-direction is an important personal attribute that is congruent with the notion of a professional taking responsibility for their ongoing development; thus, it is worth knowing how to do it well, whether facilitated or not.
Create an image of change.
For the purpose of this paper, we have chosen to focus on weather forecasting. In the lead author’s study of forecaster learning (LaDue 2011), none of the forecasters felt adequately prepared to immediately begin forecasting. They could observe other, more senior forecasters in their private and public sector workplaces, and some had dedicated mentors. Thus, each could build upon prior notions with real examples to envision a competent state for themselves through observation or mentoring.
Not all notable learning occurs at the outset of a career; thus, we also consider the results from a seminal study of physician learning (Fox et al. 1989; Olson et al. 2005). Fox et al. discovered that significant career growth and change began after some combination of three forces—personal (e.g., desire to do well, changes in home life), professional (e.g., new science, job promotion), and social (e.g., office culture)—resulted in a desired image of change that the physician then pursued. Importantly, the premise of the study had not been “How do we encourage physicians to learn?” but instead “When physicians learn, why and how do they do it?”
Research in social psychology broadly supports this step of forming an image, though with a caveat. Rather than simply thinking positively about the future, several studies suggest that you best invoke self-regulation by mentally contrasting the hoped-for future state with the reality of the present. By doing this, you can identify likely obstacles and the strategies to overcome them; this has been shown to be effective in leading to learning and change (Oettingen 2012; Oettingen and Reininger 2016). These findings complement and confirm decades of research in adult education that show similar effects of large and small gaps between one’s current and desired state.
We note that the World Meteorological Organization has a competency framework for public weather service forecasters (WMO 2015), which is being updated as this paper is written, that can be used to further define and specify any number of areas in which to focus learning. For example, five performance criteria are listed for the task of warning, ranging from assessing the phenomena to understanding how those warnings affect decisions.
Determine learning needs.
In even the earliest days of adult education, Knowles (1975) recognized that making the effort to determine your learning needs can be uncomfortable; adults are not used to revealing their weaknesses. This marks an important and stark departure from formal education, where someone else filled that role for you. If you are in the position of helping someone else determine their learning needs, be aware that the process can have a significant impact on their performance because it affects their internal motivation to learn.
In adulthood, and for professionals in particular, this job should be led by the learners themselves whenever possible, though they may need help. Studies in a variety of disciplines have shown that people have the most accurate sense of their performance in areas where good performance is definitive but tend to grossly overestimate their abilities in areas where competent performance cannot be concretely defined (Dunning et al. 2004). For example, people are more likely to accurately self-rate how punctual they are than how disciplined they are.
Professionals can and should be involved in defining the knowledge, skills, and attitudes that define competent performance. Some aspects of performance can be quantified. For those that are not, you can work through the following exercise alone or with a facilitator or peer to create an assessment tool for any competency you would like to achieve:
Identify an area of job mastery that you would like to improve.
Identify the elements it takes to excel in that area: What knowledge, understanding, skills, attitudes, or values are needed?
Construct a rating scale using qualitative or quantitative measures.
Have peers/coworkers rate your performance to identify areas of strengths and weaknesses.
The results can be used to focus your learning goals and assess how much you have improved. Keep in mind that large gaps can result in high anxiety, whereas small gaps can lead to apathy. When you are helping facilitate someone else’s learning, be sensitive to their feelings. Being willing to show your own learning journey by example can help.
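As a concrete, purely hypothetical illustration of the exercise above, the rating step reduces to simple arithmetic: average the peer ratings for each element of the competency and look for the lowest scores. The competency elements, the 1–5 scale, and the numbers below are all invented for illustration:

```python
from statistics import mean

# Hypothetical elements a forecaster might identify for a competency
# such as severe-weather warning (step 2 of the exercise).
# Peer/coworker ratings: 1 = needs work, 5 = mastery (steps 3 and 4).
ratings = {
    "phenomenon assessment": [4, 5, 4],
    "product wording":       [2, 3, 2],
    "timeliness":            [4, 4, 5],
}

# Average the ratings for each element of the competency.
averages = {element: mean(scores) for element, scores in ratings.items()}

# The lowest-rated element suggests where to focus learning goals.
weakest = min(averages, key=averages.get)
print(weakest, round(averages[weakest], 2))
```

Re-running the same exercise after a period of deliberate learning gives a simple, if coarse, measure of how much the gap has closed.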
Identify human and material resources.
The range of resources to help meet your learning needs will likely include both formal resources and informal strategies created by you and your facilitator or peers. Table 2 provides a range of resources professionals use for learning. Formal resources may include textbooks, some of which have only become available in the last decade, and courses made available online, such as the Cooperative Program for Operational Meteorology, Education, and Training (COMET) Program’s MetEd website, National Weather Service (NWS) Office of the Chief Learning Officer training division courses, massive open online courses (MOOCs), and so on (see Table 3 for learning strategies and “Online resources” sidebar for forecaster-relevant resources). Informal resources include discussion forums, blog posts, conversations with colleagues, local event summaries, forecast debriefings, and learning from customers/users of your meteorological information. Specific employers may have additional methods, such as the NWS’s Weather Event Simulator located in each NWS forecast office.
Strategies created by the learner, with or without a facilitator, can be quite varied. In LaDue’s (2011; or see Hoffman et al. 2017, chapter 3) study of how meteorologists learn to forecast the weather, forecasters early in their careers tended to rely heavily on human resources. They were still learning how to think about the forecast problems and how to sort out and quickly recognize important cues in complex datasets. Those topics were best learned by having a more experienced forecaster explain how they think about the problem and how they identify features in data. Forecasters working in seasonal climate forecasting generally did not have experts to talk with because it was a relatively new type of forecast. Instead, they took extensive notes on their forecast strategy and later conducted careful review and verification of those forecasts to evaluate how well their forecast strategy worked. Sometimes, a missed forecast (of any type) had no clear cause and so was not a matter of forecast expertise or a missed cue in the data. For those situations, forecasters either conducted research or appealed to researchers to study the missed forecast situation with the luxury of time to more thoroughly analyze the event.
A critically important mechanism for learning is facilitating others’ learning. Facilitators, such as National Weather Service Science and Operations Officers, can help by identifying particular resources of the types mentioned above. For example, in LaDue’s (2011) study, experienced forecasters would sometimes initiate conversations with less experienced forecasters about interesting data or cases. Although these incidents occurred at the whim of the experienced forecaster, they were incredibly valuable to the younger forecaster. Where this occurred frequently, learning was fast and much more efficient than relying on formal resources and the slow accumulation of experience. Experience by itself is not necessarily a mechanism for learning, but with deliberate effort it can be extremely effective, as discussed next.
Learning through work.
Learning can take a highly sophisticated form that is relevant to meteorology: reflective practice. Reflection has long been acknowledged as important to learning in the professional education literature (e.g., Olson et al. 2005). Reflective practice was conceptualized and articulated in the seminal writings of Schön (1983, 1987) as the active reflection and retrospective learning that professionals do as they engage in the nonlinear application of their scientific knowledge to their practice (see “Reflective practice” sidebar); he called those who do it reflective practitioners. Experience alone is insufficient as a mechanism for learning: professionals must continually reflect on their experiences in order to deliberately expand their proficiency.
Reflective practice is the active reflection and retrospective learning professionals do to increase their expertise as they engage in surprises encountered in the nonlinear application of scientific knowledge to their practice.
A few authors in meteorology have recognized the importance of reflective behavior more generally, such as Market’s (2006) forecast activity in which students wrote forecast discussions every other week as a means to encourage reflection on the forecast process. He found a statistically significant improvement in certain aspects of students’ forecasts during weeks when they did the reflective activity. Studies of professional meteorologists are rare, but Hoffman et al. (2017) and LaDue (2011) found reflective behaviors to be a critical learning strategy.
It is common in the natural sciences to come across situations where a straightforward application of empirically tested knowledge cannot be done. Schön (1983, 1987) carefully articulates that these are the most interesting problems encountered in practice. One of Schön’s examples is that if an engineer is told where a road will be built and asked for recommendations on how to build it, that is a more or less straightforward application of knowledge. But if that same engineer is instead asked where the best location for the road might be, that engineer must now engage in a nonlinear application of knowledge in order to frame a workable problem. Meteorology researchers similarly encounter both straightforward and complex situations in their work. Some research is simple to conduct, whereas other studies require inventing new research strategies. Weather forecasters engage in this type of thinking when they shift their forecast information from a broad, County Warning Area perspective to providing site-specific decision support that requires particular information and lead times for a small area or point.
The process of reflective practice allows a professional to build upon what might be called their zone of mastery (Fig. 2). That is, at any given time, a professional has a set of problems or tasks he or she is comfortable handling. Each time one encounters something that does not fit current knowledge and skills, the professional can respond in a few ways. One response is to resist the surprise and attempt to apply known solutions to the new problem: for example, the forecaster who fails to investigate the first signals that the weather is deviating from the forecast. Someone who is unskilled at learning through practice may repeat that same reaction to surprise, and the same ineffective application of knowledge, many times over the course of a career. A more productive approach, which Schön described and advocated, is for the professional to recognize they have fallen into what he referred to as the swampy lowland of professional practice. At this point, effective professionals engage in one or more types of rigorous experimentation as they work through the problem, reflecting in action as they go. Alternately, when time is insufficient, a professional might later reflect on action and consider this experimentation in retrospect, such as when forecasters conduct quality case studies on past events (Schultz 2010); in such situations, data archives are enormously helpful in assuring that the learning is of high quality. In either case, if the lessons learned are incorporated into the zone of mastery, the professional has effectively learned through work and increased their expertise.
How this active experimentation manifests will depend on the particular situation. These experiments can take place as thought experiments, often with a more experienced practitioner serving as a coach to help the individual think through chains of implications. The more experienced coach can see implications the less experienced professional may not realize are possible following an initial action or event. Expert forecasters, for example, tend to conceptualize variations in how the weather might evolve and actively look for signals that a solution other than the one forecast is beginning to verify (e.g., Andra et al. 2002). Forecasters also can serve as mentors to one another, with more experienced forecasters laying the groundwork for contextualizing learning opportunities during weather events. Consultants and broadcasters might have opportunities to actively experiment with the effectiveness of their communications with clients and customers, testing the efficacy of each approach.
The lessons learned in the swamp are only effective in building the zone of mastery if they are adequately reflected upon. The lessons must become integrated into one’s knowledge and skill set if they are to expand the professional’s expertise. There is wide consensus that it is the reflection on experience that is key to learning from experience (e.g., Bereiter and Scardamalia 1993). Such postlearning reflection can take many forms, from personal reflection tools like journals to public forms such as articles that go beyond merely reporting on a weather event to include the coaching aspect of how the forecaster worked through forecasting for the event—or would work through it in the future based on what was learned.
The profession of meteorology has a history of providing opportunities for learning, including timely, relevant short courses offered at our professional meetings and a wide array of publicly available educational options ranging from productions of professional learning groups such as COMET and the NWS Office of the Chief Learning Officer (e.g., “Online resources” sidebar) to focused efforts such as the “Severe Thunderstorm Forecasting Video Lecture Series” (Cohen et al. 2018). These efforts are excellent, and the field of meteorology can further enhance the quality of self-directed learning efforts of meteorologists by 1) more deliberately teaching skills related to effective self-directed learning, 2) seeking assistance from supervisors and peers to identify areas for growth of expertise, and 3) engaging in reflective practice.
Many of the following sites have publicly available resources for learning about weather forecasting. For weather forecasters employed outside the NWS, availability varies by site, from fully accessible (e.g., the COMET site) to partially restricted:
NWS Warning Decision Training Division: https://training.weather.gov/wdtd/
NWS Training Center: https://training.weather.gov/CourseListing.php
NWS Training Center YouTube Channel: www.youtube.com/user/NWSTrainingCenter?reload=9
Virtual Institute for Satellite Integration Training (VISIT): http://rammb.cira.colostate.edu/training/visit/
Cooperative Institute for Meteorological Satellite Studies (CIMSS) education and outreach programs: http://cimss.ssec.wisc.edu/education/
Severe Thunderstorm Forecasting Video Lecture Series: www.spc.noaa.gov/exper/spcousom/
The skills for effective self-direction can be encouraged in undergraduate education and workplace culture. Table 4 highlights qualities first identified in Candy’s (1991) review of early adult education literature and further refined by medical education in their most recent comprehensive work identifying the state of the science in continuing professional development (Davis et al. 2003). The resulting list of skills highlights the importance of very active engagement, management, and ownership of the learning process.
The skills in Table 4 can be built through formal education. There are many resources supporting university faculty, such as On the Cutting Edge, a partnership of several organizations that provides resources and workshops for improving undergraduate geoscience teaching (National Association of Geoscience Teachers 2017). The field of meteorology, nested as it is within the broader natural sciences, is fertile ground for building these skills. Two skills in particular, identification of learning needs and the ability to practice reflectively, may be especially relevant to our applied science. Many innovations in undergraduate education have emerged in recent years, such as Croft’s (2006) application of problem-based learning and Bals-Elsholz et al.’s (2017) use of observations to learn theory and generate research questions.
Regardless of whether these skills were nurtured and developed during formal schooling, the profession of meteorology can actively facilitate the development of expertise in any subdomain through the high-quality, self-directed learning of our professionals in the workplace. The construct of self-directed learning, and how it has been implemented in similar professions, provides visions for how individual meteorologists can pursue—and how their colleagues can better facilitate—the ongoing self-directed learning efforts of meteorologists. This work substantiates and illustrates a critical process for self-directed learning: 1) creation of an image of change, 2) recognition of learning needs, 3) identification of appropriate human and material resources, 4) engagement in learning strategies that are effective and appropriate for the particular subject, and 5) reflection upon and assessment of learning.
Identification of learning needs must come from two sources: the needs perceived by those who have them and the actual needs a professional may not be aware of. The profession can help identify both types of needs through development of competence models and monitoring of the development of new knowledge, skills, and tools used in the profession. Individual professionals can develop competence models to evaluate their needs and monitor their progress in learning to overcome weaknesses found.
Reflective practice is an effective and important way in which individuals in similar professions build their expertise. It is the conscious willingness to react to surprise and move into the swampy lowlands of professional practice that allows for the growth of expertise. Such surprising problems often require a nonlinear application of scientific knowledge. Professionals work through the surprise by actively experimenting and evaluating the results. They then increase their expertise by incorporating the lessons learned into their zone of mastery.
A final note of caution: although meteorology is an applied science in which theory remains critically important, young professionals are driven by the immediacy of their job duties. They need practical knowledge and have little time to discover their own ways to apply theory. It should come as no surprise, then, that LaDue (2011), Ramming (1992), and others find that young professionals exhibit a strong preference for, and reliance on, human rather than formal resources for the majority of their learning needs. In other words, the zone of mastery is largely a practical endeavor, deepened through continued reflective learning from experience.
It is inherent in professional practice that each person manifests the profession in his or her own unique way. Much of professional learning, therefore, is self-directed. To cultivate the related skills, and to avoid the risk of eroding our profession, it is critical that professions and the individuals within them identify ongoing learning needs and develop the ability to practice reflectively.
The authors thank Israel Jirak and the anonymous reviewers of the original and revised submission for helpful comments to make this work more applicable to the field. The original paper was written after discussions with students and faculty in the University of Oklahoma’s College of Education, colleagues at the National Severe Storms Laboratory, and fellow American Meteorological Society (AMS) Board of Continuing Professional Development members. The lead author thanks many who provided opportunities to help her contribute in new ways to our profession, including Jim LaDue, Harold Brooks, Kevin Kelleher, Jeff Kimpel, John Snow, Ed Mahoney, Liz Quoetone, Brad Grant, Berrien Moore, Ming Xue, Keith Brewster, Kunihiro Naito, Chris Karstens, James Correia, Jr., Lans Rothfusz, Alan Gerard, Kristin Calhoun, and many others; she also thanks her second author, Ariel Cohen, for assisting in revisions of the original draft to make this version more actionable as reviewers of the original draft had requested. This work was funded by the University of Oklahoma’s Center for Analysis and Prediction of Storms and the NOAA Storm Prediction Center. The scientific results and conclusions, as well as any views or opinions expressed herein, are those of the authors and do not necessarily reflect the views of NOAA or the Department of Commerce.
CURRENT AFFILIATION: Cohen—National Weather Service, Topeka, Kansas