The WxChallenge, a project developed at the University of Oklahoma, brings a state-of-the-art, fun, and exciting forecast contest to participants at colleges and universities across North America. The challenge is to forecast the maximum and minimum temperatures, precipitation, and maximum wind speeds for select locations across the United States over a 24-h prediction period. The WxChallenge is open to all undergraduate and graduate students, as well as higher-education faculty, staff, and alumni. Through the use of World Wide Web interfaces accessible by personal computers, tablet computers, and smartphones, the WxChallenge provides a state-of-the-art portal to aid participants in submitting forecasts and to alleviate many of the administrative issues (e.g., tracking and scoring) faced by local managers and professors.
Since its inception in 2006, 110 universities have participated in the contest, and it has been utilized as part of the curricula of 140 classroom courses at various institutions. The inherently challenging nature of the WxChallenge has encouraged its adoption as an educational tool: as its popularity has grown, professors have seen its utility as a teaching aid, and it has become an instructional resource in many meteorology classes at institutions of higher learning. In addition to evidence of educational impacts, the competition has already begun to leave a cultural and social mark on the meteorological learning experience.
Since its start in 2006, the WxChallenge has become so ingrained in the higher education experience across North America that some participants put results on their resumes and continue competing after graduation.
Forecast competitions serve as an excellent teaching tool in the meteorology and atmospheric science communities. Many students focus narrowly on everyday local conditions when learning meteorology and forecasting, which limits their overall forecast skill (Roebber et al. 1996). By introducing students to various forecast locations across North America, the contest teaches them firsthand the effects of terrain and maritime influences and the consequences of the microscale, mesoscale, and synoptic-scale atmospheric processes that impact the forecast. Additionally, students learn about the effectiveness and biases of numerical weather prediction models.
Collegiate weather forecasting competitions have existed for at least 40 years in the United States. Many universities utilized local forecasting competitions within their academic programs (e.g., Bosart 1975; Sanders 1986; Roebber and Bosart 1996), and the National Collegiate Weather Forecasting Competition (NCWFC; Peyrefitte and Mogil 1968) was a widely used competition on the national scale. Hosting of the NCWFC passed among several universities (including the University of Missouri and the University of Michigan) before finally settling at Penn State University, where it operated until 2006. The NCWFC was utilized in the education of many meteorology students, established the framework for all future forecasting competitions, and was even used in scientific research studying the human forecast process (Roebber et al. 1996).
Beginning in early 2005, colleagues at the University of Oklahoma, Texas Tech University, and San Jose State University formulated a plan to develop a national forecasting contest that would take advantage of significant advances in technology in both consumer electronics and the field of atmospheric sciences. The project, initially called the Weather Challenge (now WxChallenge), was developed to bring a state-of-the-art, fun, and exciting forecast contest to students at colleges and universities across North America.
An advisory board was established consisting of professionals from multiple universities who had experience participating in and/or managing weather forecast contests. The advisory board administers the contest, resolves disputes that might arise during competition, and directs innovations required to maintain the contest as state of the art.
The primary focus of the contest began with identifying 1) forecast variables relevant to the current development and needs of the weather forecast community and 2) methods of verifying forecasts. The advisory board first recommended traditional forecast values including maximum and minimum air temperature. In addition, because of advances in numerical weather prediction associated with quantitative precipitation forecasts (QPF) as well as QPF products provided by the U.S. National Weather Service (NWS) as part of the National Digital Forecast Database (NDFD; Glahn and Ruth 2003), the contest was designed to include numerical forecasts of total precipitation. Also, because wind speed forecasts are widely issued by meteorologists across the public and private sectors and given that wind speed is a variable that has not received specific attention in past competitions, the advisory board determined that similar wind speed forecasts should be included as part of the contest. The WxChallenge is not intended to be pedagogical in and of itself, but it does provide the framework and tools within which to develop and apply forecasting pedagogy.
The advisory board recommended that the forecast period be a 24-h window spanning 0600–0600 UTC, with forecasts to be entered daily no later than 0000 UTC. This forecast period represents roughly a standard day (i.e., roughly midnight to midnight) and has been used in other current and previous forecast competitions (e.g., the NCWFC). Further, each forecast variable used as part of the contest required reliable verification; thus, the advisory board determined that only locations with a daily climatological report issued by the NWS could serve as forecast sites. As such, official verification of the forecast results was designed to utilize a combination of the available fully automatic weather reports (METARs) and the daily climatological report. The final forecast variables, verified within the 0600–0600 UTC window, are maximum temperature (°F), minimum temperature (°F), total liquid-equivalent precipitation (inches), and maximum sustained wind speed (knots).
During the fall of 2005, the WxChallenge began its beta period with seven schools and 221 participants, which allowed for real-time evaluation of forecasts provided by collegiate forecasters. Following a successful beta period, official operations began during the fall of 2006 with 55 participating universities and nearly 1600 participants. Since that time, the WxChallenge has grown to include a maximum of 89 participating universities and over 2100 participants in any single academic year.
GENERAL RULES AND OPERATIONS.
The WxChallenge is open to all undergraduate and graduate students, faculty, staff, and alumni of a higher-education institution. Each participant is grouped into one of five categories: 1) freshman/sophomore, 2) junior/senior, 3) M.S. or Ph.D. student, 4) faculty/staff, or 5) alumni. Thus, forecasters compete against other participants with a similar level of educational background in meteorology as well as against all participants. Additionally, schools and/or forecasters can register personal or specifically tailored numerical models to be compared against commonly used numerical models such as the North American Mesoscale Model (NAM) or Global Forecast System (GFS). The participants are registered through an online process by their local manager, a single representative for each school who acts as a contact point between schools' forecasters and the national manager, and a small entry fee is collected. The registration fee is used to pay for trophies, shipping costs of trophies, computer server hardware, and minor administrative overhead. The WxChallenge is a nonprofit entity and fees and quantity of awards are adjusted as needed to maintain a minimal carryover financial balance.
The contest operates in both the fall and spring academic semesters for 10 weeks. Every two weeks, the forecast location changes so that a variety of synoptic-scale, mesoscale, and even microscale meteorological features can be encountered by the participants. Forecasts are entered Monday through Thursday before 0000 UTC to be verified by National Weather Service Automated Surface Observing Systems (ASOS) observations taken during the following 0600–0600 UTC period (i.e., the next day). Forecasters submit values for the maximum temperature, minimum temperature, maximum sustained wind speed, and total cumulative liquid precipitation within the 24-h forecast period.
Daily scores for each city are the summation of the error points incurred from each of the forecast variables. These include the following:
maximum and minimum temperature: one error point for every degree difference between the forecasted high and low and the verified high and low temperatures;
wind speed: 0.5 error points for every knot difference between the forecasted wind speed and the verified wind speed;
precipitation: 0.4 points for each 0.01 of error in the verification range from 0.00 to 0.10 inclusive,
0.3 points for each 0.01 of error in the verification range from 0.11 to 0.25 inclusive,
0.2 points for each 0.01 of error in the verification range from 0.26 to 0.50 inclusive, and
0.1 points for each 0.01 of error in the verification range over 0.50; and
penalty: 10 penalty points are assessed for every missed or persistence-based forecast, and 5 penalty points are assessed for every numerical guidance forecast [e.g., model output statistics (MOS) values; Glahn and Lowry 1972] beginning on the second nonhuman forecast, where a nonhuman forecast includes a missed, persistence, or guidance forecast. The penalty points serve to systematically prevent forecasters who have dropped out of the competition from negatively impacting their team scores, as well as to ensure that the focus of the contest remains on individual forecasting and not reliance on external forecasts.
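The daily scoring rules above can be sketched in code. This is an illustrative reading of the rules, not the contest's official implementation; the function name, the dictionary keys, and the treatment of the graduated precipitation scale (weighting each 0.01-in. increment between the forecast and verified amounts by the verification range it falls in) are assumptions, and penalties are omitted:

```python
def daily_error_points(forecast, verified):
    """Sum error points for one day's forecast, per the scoring rules above.

    `forecast` and `verified` are dicts with keys 'high', 'low' (deg F),
    'wind' (kt), and 'precip' (inches). Key names are illustrative.
    """
    # Temperature: one point per degree of error for the high and the low.
    points = abs(forecast['high'] - verified['high'])
    points += abs(forecast['low'] - verified['low'])
    # Wind: 0.5 points per knot of error.
    points += 0.5 * abs(forecast['wind'] - verified['wind'])

    # Precipitation: each 0.01 in. of error is weighted by the range it
    # falls in: 0.4 pts up to 0.10 in., 0.3 to 0.25 in., 0.2 to 0.50 in.,
    # and 0.1 beyond 0.50 in. (one plausible reading of the graduated scale).
    lo = min(forecast['precip'], verified['precip'])
    hi = max(forecast['precip'], verified['precip'])
    tiers = [(0.10, 0.4), (0.25, 0.3), (0.50, 0.2), (float('inf'), 0.1)]
    prev = 0.0
    for bound, rate in tiers:
        overlap = max(0.0, min(hi, bound) - max(lo, prev))
        points += rate * (overlap / 0.01)
        prev = bound
    return points
```

For example, a forecast that misses the high by 3°F, the wind by 2 kt, and a 0.10-in. rainfall entirely would accumulate 3 + 1 + 4 = 8 error points under this reading.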
Error totals and relative weights among the forecast categories were developed by aligning temperature forecast errors with prior forecast history and by scaling the weights for precipitation and wind errors so that they were of similar magnitude to the temperature errors.
Because forecasting at some locations may present a larger challenge than at others, some locations inherently produce higher total error scores. Thus, each city's 8-day cumulative scores are normalized to adjust for the varying difficulty of forecasting at each location, allowing forecaster skill to be compared across cities, using a normalization formula.
This formula results in a value of zero for a forecaster score equal to the consensus of all forecasts and a negative value for scores better than the consensus. The final cumulative score assigned to the forecasters, used to rank overall skill in the WxChallenge, is the average of all individual cumulative scores with the worst (i.e., highest) cumulative score from each semester removed, to minimize the impact of personal issues (e.g., illness, heavy school workloads, spring break, or vacations) on scores. The process of normalizing scores throughout the year has been utilized in other current and previous forecasting competitions (e.g., the NCWFC); however, aspects of the raw score (e.g., its calculation and the dropping of the worst score) and the normalized central value (e.g., the WxChallenge uses 0; the NCWFC used 80) may differ.
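The normalization formula itself is not reproduced in the text above. The sketch below assumes one plausible form, percentage deviation from the consensus, which satisfies the stated properties (zero for a score equal to the consensus, negative for scores better than it), together with the drop-worst averaging described; the function names and the exact formula are assumptions, not the contest's published method:

```python
def normalized_city_score(forecaster_total, consensus_total):
    # Hypothetical normalization: percentage deviation from the consensus
    # error total. Zero when the forecaster matches the consensus of all
    # forecasts; negative (better) when the forecaster's error is lower.
    return 100.0 * (forecaster_total - consensus_total) / consensus_total

def cumulative_score(city_scores_by_semester):
    # Average all normalized city scores, dropping the single worst
    # (highest) score from each semester, as described above.
    kept = []
    for semester_scores in city_scores_by_semester:
        kept.extend(sorted(semester_scores)[:-1])
    return sum(kept) / len(kept)
```

Under this sketch, a forecaster with normalized city scores of 0.0, 10.0, and -5.0 in one semester and 2.0 and 4.0 in the next would drop the 10.0 and the 4.0 and average the rest.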
After two weeks (eight total forecasts), the participants with the lowest cumulative scores in each category are acknowledged as the top forecasters for the city. In addition, the participant with the fewest overall error points is acknowledged as the top overall forecaster. Those forecasters who participate for both semesters (i.e., an entire academic year) are eligible for awards based on cumulative forecasting excellence. Further, the team (i.e., academic institution) with the top score at the end of the year receives the WxChallenge team trophy and the individual with the top score at the end of the year receives the WxChallenge individual trophy for display at their respective university. Both trophies contain placards of all past winners.
Immediately following the completion of the 10 forecast periods that comprise the main portion of the WxChallenge season, the top forecasters are invited to participate in a head-to-head, end-of-season tournament, with pairings and seeds determined by cumulative forecast excellence during the year; the tournament rankings and seeds are updated hourly and available to the forecasters on the WxChallenge website. The tournament, which includes the top 64 forecasters, consists of daily forecasts for 2 days. After the 2-day period has ended, the forecaster with the fewest error points in each pairing advances to the next round. With over 2100 forecasters participating in the WxChallenge each year, competition to make the top 64 and qualify for the end-of-year tournament is strong.
The WxChallenge shares many similarities with other current and past competitions, but it also has its differences. Other forecasting competitions utilize different rules, including the number of times per week forecasts are entered, the variables forecast, the number of forecast locations per forecast, and the methodologies for scoring. Some forecast competitions use probabilistic- or parimutuel-based forecasting as their core concept, which provides a different forecasting experience. The WxChallenge and its advisory board continue to test different forecasting concepts to learn how their challenges and educational benefits may serve the goals of the competition.
To provide a central location to submit, evaluate, and display forecasts and results, an interactive website was developed for the contest. From the website, forecasters are given a variety of forecast and verification tools at their fingertips, and each forecaster can log on and submit forecasts directly to the contest. Forecasters have the ability to enter forecasts up to a week in advance to prevent any penalty for missed forecasts during periods they will be away (e.g., vacation, spring break). Additionally, once the forecast submission deadline has passed for a given day (0000 UTC), all forecasts are publicly displayed with histograms to provide immediate feedback concerning the distribution of forecasts entered by the participants. The website also provides rapid feedback concerning forecast verification and results: as each hourly observation is collected at the forecast site, updated results are immediately available to the forecasters. The website can also be accessed via tablet computers and smartphones to allow access to forecast submissions and results when away from a desktop computer. The main benefit of a web-based interface is its inherently platform-independent nature. Removing the need for particular operating systems and/or specialized software to interact with the contest has removed technological barriers to participation. This was a driving impetus in the creation of the WxChallenge, as students faced such technological barriers when participating in the NCWFC at the time.
Every hour, weather observations from the verification site are collected and processed to update the results of the contest. At any point, forecasters can check their standings in their classification, their academic institution, and their enrolled class (if applicable), and also do so for all forecasters participating in the contest. Participants also have the ability to enter hypothetical values on the website to envision how scores may be distributed under various verification scenarios. Additionally, the yearly cumulative standings are updated every hour.
Since officially beginning during the fall of 2006, 110 universities have participated in the contest, and it has been utilized as part of the curricula of 140 classroom courses at various institutions (Table 1). As its popularity has grown, instructors have seen the utility of the WxChallenge as a teaching aid (Grenci et al. 2008), and it has become a staple of many meteorology classes.
A main benefit of the WxChallenge is its use in the classroom as a teaching tool. As part of WxChallenge, course instructors can request a specific course listing that can be used to quickly identify students within a certain class so they can assess the progress of their pupils. At a quick glance, an instructor can determine whether students are struggling with various aspects of forecasting and gain feedback regarding whether any additional teaching material is needed.
In the spring of 2012, a survey was sent to all of the instructors using the WxChallenge as part of their curriculum to determine how it is being utilized at their educational institutions. Nearly 90% of the respondents mentioned that the WxChallenge was used as part of a grade (rather than as extra credit) by evaluating students' participation levels, overall performance, and degree of improvement throughout the semester. Over 90% of the instructors have the students perform map briefings, forecast discussions, and/or written forecast journals as methods of preparing for upcoming WxChallenge forecasts. Instructors have hailed the WxChallenge as providing “a critical link to the ‘real’ atmosphere beyond the whiteboard, notes, and PowerPoint presentations” and teaching the students that they “can't blindly go with the numerical model forecast.”
Another critical benefit of the WxChallenge is its ability to remove additional work from professors instructing forecasting classes and laboratories. While many instructors recognize the benefits of consistent forecasting, much time is required to individually organize a classroom strategy for evaluating student forecasting skill. Designing and implementing forecast evaluation criteria consumes resources (especially time) that could be better spent on course preparation. The WxChallenge removes this burden and allows instructors to focus more time on the true task at hand: instructing students in meteorology and weather forecasting. As such, the WxChallenge is being incorporated into academic courses at universities and colleges across North America. In addition, the WxChallenge has served as a focal point of many forecast discussions and weather briefings at participating institutions. Forecasters gather and openly discuss the past, current, and future conditions at the given city during the competition. Because cities vary throughout the year, participants benefit from detailed group discussions of synoptic-scale, mesoscale, and microscale weather conditions in potentially unfamiliar areas, thus broadening the meteorological knowledge and capability of the forecasters. Additionally, some instructors have students lead particular briefings for WxChallenge cities, providing them opportunities to practice communicating science in public.
The WxChallenge is a project developed at the University of Oklahoma that has brought a state-of-the-art, fun, and exciting forecasting contest to students enrolled at colleges and universities across North America. Its use as an educational tool in the collegiate classroom has increased from 41 classes in 2005/06 to over 60 classes in 2012/13 (Table 1), and continued growth is anticipated. To date, the WxChallenge has been used at 110 universities, in 140 individual academic courses, and by almost 8000 participants in its short 7-yr span.
Even though the history of the WxChallenge is relatively short, it has already begun to leave a mark on the culture of the meteorological learning experience. Feedback from students and faculty to the WxChallenge manager regularly details how each late summer they anxiously await the release of the upcoming schedule. Further, many graduating students from every subdiscipline of meteorology proudly list their standings in the competition on résumés to potential employers or on applications to graduate school. Through the urging of participants, the WxChallenge was presented with an award at the 2010 annual meeting of the American Meteorological Society for “a new paradigm for the nation's weather forecasting enterprise based on a voluntary grassroots effort, with impressive national impact through its use in curricula at scores of universities.”
Further, aside from its educational and cultural impacts, the WxChallenge has become a passive social medium among students and faculty alike. While a strong desire exists to lead one's team to victory, significant intraschool competition is often reported to the WxChallenge manager (e.g., as students compete against their instructors for grades or extra credit), as well as competitions between individuals and between academic institutions. As a measure of this competitive drive and at the request of graduating students, the alumni category was instituted to allow these friendly competitions to continue beyond the collegiate experience.
Those involved in the WxChallenge would like to acknowledge the University of Oklahoma, and in particular the College of Atmospheric and Geographic Sciences, for their technical, financial, and computational support of the WxChallenge. They would also like to acknowledge the NCWFC, the WxChallenge's predecessor, for laying the groundwork for a national forecasting competition. Additionally, the WxChallenge would like to thank the American Meteorological Society for the recognition given through the special award presented at the 2010 AMS Annual Meeting. Finally, the WxChallenge would like to thank the students and faculty at the participating academic institutions for their overwhelming support and positive feedback through the years, and the many local managers who have made the WxChallenge so successful.