Whose Ground Truth Is It? Harvesting Lessons from Missouri’s 2018 Bumper Crop of Drought Observations

Kelly Helm Smith, National Drought Mitigation Center, School of Natural Resources, University of Nebraska–Lincoln, Lincoln, Nebraska

Mark E. Burbach, Conservation and Survey Division, School of Natural Resources, University of Nebraska–Lincoln, Lincoln, Nebraska

Michael J. Hayes, School of Natural Resources, University of Nebraska–Lincoln, Lincoln, Nebraska

Patrick E. Guinan, University of Missouri Extension, Columbia, Missouri

Andrew J. Tyre, School of Natural Resources, University of Nebraska–Lincoln, Lincoln, Nebraska

Brian Fuchs, National Drought Mitigation Center, School of Natural Resources, University of Nebraska–Lincoln, Lincoln, Nebraska

Tonya Haigh, National Drought Mitigation Center, School of Natural Resources, University of Nebraska–Lincoln, Lincoln, Nebraska

Mark D. Svoboda, National Drought Mitigation Center, School of Natural Resources, University of Nebraska–Lincoln, Lincoln, Nebraska

Open access

Abstract

Drought-related decision-making and policy should go beyond numeric hydrometeorological data to incorporate information on how drought affects people, livelihoods, and ecosystems. The effects of drought are nested within environmental and human systems, and relevant data may not exist in readily accessible form. For example, drought may reduce forage growth, compounded by both late-season freezes and management decisions. An effort to gather crowdsourced drought observations in Missouri in 2018 yielded a much higher number of observations than did previous related efforts. Here we examine 1) the interests, circumstances, history, and recruitment messaging that coincided to produce a high number of reports in a short time; 2) whether and how information from volunteer observers was useful to state decision-makers and to U.S. Drought Monitor (USDM) authors; and 3) potential for complementary use of stakeholder and citizen science reports in assessing trustworthiness of volunteer-provided information. State officials and the Cattlemen’s Association made requests for reports, clearly linked to improving the accuracy of the USDM and the related financial benefit. Well-timed requests provided a focus for people’s energy and a reason to invest their time. State officials made use of the dense spatial coverage that observers provided. USDM authors were very cautious about a surge of reports coinciding closely with financial incentives linked to the Livestock Forage Disaster program. An after-the-fact comparison between stakeholder reports and parallel citizen science reports suggests that the two could be complementary, with potential for developing protocols to facilitate real-time use.


© 2021 American Meteorological Society. For information regarding reuse of this content and general copyright information, consult the AMS Copyright Policy (www.ametsoc.org/PUBSReuseLicenses).

Supplemental information related to this paper is available at the Journals Online website: https://doi.org/10.1175/WCAS-D-19-0140.s1.

Corresponding author: Kelly Helm Smith, ksmith2@unl.edu


Like the tree falling in the forest, does drought occur if there is no human to record or experience it? . . . What serves as ‘ground truth?’ What if there are many ground truths to choose from?—Kelly Redmond (Redmond 2002)

1. Why track drought impacts?

No single numeric definition of drought is applicable for all places and circumstances. Measurements of different aspects of the hydrologic cycle may not tell the same story and do not necessarily reflect the full range of circumstances (Svoboda et al. 2002). Most conceptual definitions involve a water balance, factoring in the difference between supply and expectations (Redmond 2002). A physical water shortage, typically understood and described in context of meteorology, agriculture, or hydrology, triggers socioeconomic or ecological drought impacts (Crausbay et al. 2017; Ding et al. 2011; Van Loon et al. 2016a,b). Impacts result from interactions of physical drought, vulnerability/adaptive capacity, social or environmental systems, and more (Kallis 2008). Being able to describe and ideally quantify the impacts of drought—such as reduced crop or pasture yield, an increase in dust-related respiratory problems, fish kills, or more intense wildfires—can focus drought response and mitigation (Lackstrom et al. 2013). As a warming climate brings about more hydrologic extremes, compounded by the drying effect of heat, effective drought response and mitigation can reduce stress on communities, health, livelihoods, and environmental diversity and productive capacity (Reidmiller et al. 2018).

Drought researchers advocate calibrating hydrometeorological indices by comparing them with impacts (Bachmair et al. 2016; Blauhut et al. 2015; Lackstrom et al. 2013; Meadow et al. 2013; Redmond 2002; Van Loon et al. 2016b). Systematic comparison of indices and impacts requires identifying or developing longitudinal data on drought impacts at relevant scales, however, and drought impact data are not as readily available as climate data (Redmond 2002). Defining drought impacts is in itself a challenge, both conceptually and because impacts are relative to expectation (Redmond 2002).

In some cases, particularly retrospectively, drought impacts are implicitly defined as an event, a change relative to normal (Smith et al. 2014). But a risk management approach to drought requires monitoring environmental conditions so that decision-makers can respond in time to mitigate impacts, and hydrometeorological data do not describe the full picture (Meadow et al. 2013). Drought impacts tend to be most frequently connected to decision-making, most thoroughly documented, and most frequently communicated in context of agriculture, water management, and firefighting (Lackstrom et al. 2013). Decision-makers need regional and local information on underlying conditions contributing to drought impacts, such as the effects of coastal salinity, high evapotranspiration, long-term environmental stressors, or the complexities introduced by semiarid mountainous terrain (Lackstrom et al. 2013). Effects of drought such as reduced air quality, dry domestic wells, or loss of habitat for wildlife may occur outside any established data-collection system or may not be obviously connected to drought (Smith et al. 2014; Smith 2018).

Evidence of these impacts may show up in data collected or information shared for a variety of reasons. Some of the worst effects of drought on human health, such as famine and migration to urban areas, are more evident in regions of the globe with populations dependent on subsistence farming (UN Convention to Combat Desertification 2019). But the connection between drought and West Nile virus in the United States has been documented since shortly after the formerly tropical mosquito-borne disease first appeared in North America (Epstein and Defilippo 2001; Shaman et al. 2005). News stories may highlight drought impacts and response and suggest sources of data that could be assembled systematically. For example, news stories may report ranchers thinning or liquidating their herds due to drought, near-real-time narratives that may eventually also emerge in expert interpretation of state and federal agricultural statistics, such as Peel (2013). The state of California began collecting data on dry domestic wells as a means to help local governments respond to drought and has continued the practice (State of California 2015). Ecological researchers suggest it would be valuable to identify and monitor indicators of drought-sensitive ecosystem services (Crausbay et al. 2017).

2. Crowdsourcing drought impacts

As a means to investigate the subjective experience of drought, researchers are experimenting with ways to collect observations from citizen scientists and other volunteers. A well-functioning system for detecting drought impacts or conditions leading to an impact would anticipate worsening conditions in time for decision-makers to respond (Smith et al. 2014; Lackstrom et al. 2017). In 2005 the National Drought Mitigation Center (NDMC) at the University of Nebraska–Lincoln launched the Drought Impact Reporter (DIR) as a comprehensive national archive of drought impacts. The DIR had a “Submit a Report” option that allowed anyone to volunteer a drought impact report, and moderators also read those reports for evidence of drought impacts. Volunteer reporting was fairly sporadic, although it appeared to occur in surges corresponding to livestock producers’ experience with drought. In 2018, the NDMC implemented newer, easier-to-use technology to collect observations and saw a jump in the magnitude of event-driven reporting in Missouri, where several trends converged to produce a bumper crop of reports, in contrast to previous experience, as described below. This article examines the interests, circumstances, history, and recruitment messaging that coincided to produce a high number of reports in a short time; whether and how information from volunteer observers was useful to state decision-makers and to U.S. Drought Monitor (USDM) authors; and potential for complementary use of stakeholder and citizen science reports in assessing trustworthiness of volunteer-provided information.

a. Motivating drought reporting

The DIR’s system for collecting drought reports in 2018 built on previous efforts and assessments. In a 2013 assessment of Arizona’s DroughtWatch project, Meadow et al. (2013) likened the prevalent approach to drought impact collection to the belief repeatedly expressed in the 1989 movie Field of Dreams: “if we build it, people will come.” The de facto model for impacts reporting, they said, was that “if websites and portals are built, people will freely contribute their impact observations” (Meadow et al. 2013, p. 1507). The alternative they proposed was a system based on a “dream team” of local experts who can contribute regular observations. As described by Meadow et al., Arizona DroughtWatch (AZDW) was a web-based system to gather reports about drought impacts, working through the Local Drought Impacts Groups (LDIG) defined in the state’s 2003 drought plan. University of Arizona researchers, Arizona Cooperative Extension, and the state’s Department of Water Resources launched AZDW in 2009. Data collection centered on a survey form asking questions specific to Arizona’s natural resources and related economic activity. The system’s designers anticipated that observers would provide monthly reports about drought-related conditions, including qualitative descriptions, and sound an “all clear” when drought subsided. By late 2009, however, there was little involvement from either LDIG members or the public. Observers were most active during the onset of drought, and reporting declined substantially as conditions improved.

The assessment of AZDW by Meadow et al. found that it should have worked, given that it was scaled to and embedded in regional decision-making needs. Several problems may have contributed to lack of participation, including reliance on well-intended but overcommitted volunteers; asking volunteers to assess the presence or absence of drought; lack of feedback on how decision-makers used reports; and lack of computer skills (Meadow et al. 2013). A better alternative would be a system with a backbone of resource management agency experts who would be more familiar with assessing drought and whose efforts might encourage engagement by a larger group of volunteers (Meadow et al. 2013).

Lackstrom et al. (2013) presented a comprehensive set of considerations for collecting data on drought impacts, including systems that make use of volunteer observers. They identified many missing pieces, noting a lack of connection between decision-makers and drought observers, and that the lack of a clearly stated use for observations could lead to poor participation or to event-driven rather than sustained observations. They cited disincentives to reporting, such as wanting to avoid deterring tourists from water-based recreation, lack of motivation to report impacts such as cracked foundations not covered by homeowners’ insurance, and not reporting improving conditions, in order to increase eligibility for agricultural assistance. Key components of a drought impacts reporting system would include a range of data providers at different scales and from different sectors so that information could inform both local and national decision-making; mechanisms for aggregating data from local to regional or national scales; mechanisms for communicating about drought impacts with different users; and different users at different scales and in different sectors (Lackstrom et al. 2013).

In 2013, the Carolinas Integrated Sciences and Assessments (CISA) team launched a pilot project in North Carolina and South Carolina, based on the recommendations of Lackstrom et al. (2013) and Meadow et al. (2013). Partners included the Community Collaborative Rain, Hail and Snow Network (CoCoRaHS), the NDMC, and the National Integrated Drought Information System. The objective was to assess the feasibility of leveraging the CoCoRaHS citizen science network to support “condition monitoring”: continuous, long-term monitoring of weather and climate (and primarily drought) impacts on local environments (Lackstrom et al. 2017). In contrast to one-off reports, condition-monitoring reports are intended to provide information about drought onset, intensification, and recovery, rather than only information about the most severe impacts and conditions. In October 2016, the Carolinas-focused pilot project moved to the national level when CoCoRaHS introduced a new reporting form to all observers in the CoCoRaHS network. This form asks observers to record their assessment of conditions on a 7-point severely dry-to-severely wet scale. The CoCoRaHS network was selected for the CISA condition-monitoring pilot project because this network of volunteer citizen scientists is widely considered a reliable, trusted, and high-quality source of precipitation data (Reges et al. 2016). CoCoRaHS data are used by a wide range of federal, state, and local agencies, adding greater density of precipitation observations to what automated systems can provide. CoCoRaHS observers primarily provide daily precipitation reports, and not all CoCoRaHS observers participate in condition monitoring, which is an additional step.

In 2018, the NDMC deployed a new form using Esri’s Survey123 platform to collect and display user reports, which are now called condition-monitoring observer reports (CMOR, pronounced “see more”; current version: https://go.unl.edu/CMOR_drought). Survey123 enables users such as the NDMC to design and deploy web-based forms to collect place-based observations, including photographs, and to display observations on a map. Being able to create and deploy a survey and map using off-the-shelf capabilities, without the time and expense of custom programming, enables an iterative approach to devising a system for collecting observations about drought impacts. As of this writing, NDMC is updating the survey form and maps each year, based on experience in the previous year and input from state decision-makers. Using commercially available software also enables NDMC and states using the same software to collaborate and share data.

Deploying the Survey123-based form and map created an opportunity for condition monitoring outside the CoCoRaHS network. It included the same 7-point scale and the recommendation that people submit observations at regular intervals, in wet, dry, and normal conditions. Missouri observers submitted more than 1400 reports in 2018. The previous highs for reports from a single state in a single year were 209 from South Dakota in 2017 and 184 from Georgia in 2016. Other states also saw higher numbers in 2018, with 197 reports from Arkansas and 142 from Texas. The only other times that state totals neared triple digits were Montana, with 94 in 2017, and Missouri, with 93 in 2012. This seemingly event-driven pattern is consistent with the NDMC’s deployment of observation collection infrastructure, available for state and regional use, but without a dedicated recruitment and training program for drought observers. This research assesses how that played out in 2018 in Missouri.

b. Notes on terminology

The NDMC’s DIR (https://droughtreporter.unl.edu), established in 2005, includes both drought impacts culled from media reports and drought impacts based on reports from volunteer observers (Wilhite et al. 2007; Smith et al. 2014). The volunteers submitted reports either through CoCoRaHS condition-monitoring reports or through the DIR’s user report form. In January 2019, NDMC discontinued the practice of moderating CoCoRaHS and user reports to create drought impacts. The higher volume of both types of reports made it impractical, and both types of reports were collected by systems designed for reports to flow automatically onto maps. The reports NDMC has gathered via Survey123 are mapped separately from media-based impacts, and as of 2020, have been renamed CMOR. For the sake of clarity, we apply the CMOR term retroactively. We use the term “volunteer” to refer to reports that people voluntarily submit on their own time, as an unpaid activity. It applies to both CoCoRaHS and CMOR reports, and to the discontinued DIR user reports.

Stakeholders were people whose livelihoods were affected by drought, and who could benefit from livestock disaster relief triggered by the USDM. Stakeholders constituted the bulk of people who submitted CMOR reports through the NDMC’s DIR in Missouri in 2018. Citizen scientists were participants in CoCoRaHS. The division between the two groups is not absolute. Some provided reports via both systems, and some CoCoRaHS observers are also stakeholders.

c. Crowdsourcing and credibility

Drought observations from Missouri in 2018 raise a drought-centered version of a question that has come up in other contexts, namely, the reliability of crowdsourced observations. Platforms that allow anyone to contribute information bypass traditional gatekeeping functions that used to be performed by, for example, news media or professional cartographers. Scholars of voluntary geographic information systems (VGIS) and overlapping, related fields such as public participation GIS (PPGIS) have focused on how motivation and credibility affect crowdsourced mapping. Citizen science projects such as CoCoRaHS, involving place-based data, or environmental monitoring, with each observation connected to a point on a map, are a form of VGIS. “The Internet presents a very different environment—one of information abundance—which makes traditional models of gatekeeper oversight untenable due to the sheer volume of information that would have to be vetted” (Flanagin and Metzger 2008, p. 140). In the traditional gatekeeping model, credibility is a proxy for accuracy, with information presumed to be accurate if it is provided by or has made it past a credible or official gatekeeper (Flanagin and Metzger 2008). In the absence of a gatekeeper, we need ways to assess the credibility of the observer and the accuracy of their information (Flanagin and Metzger 2008). In practice, one of the most often-used ways to assess credibility or reliability of the observer is the frequency of contributions (Coleman et al. 2009). An important distinction is whether volunteers are providing “data,” such as an objective measurement that can be independently verified, or “content,” such as a subjective assessment or commentary (Gómez-Barrón et al. 2016).

Cross-checking volunteers’ observations and checking for inconsistent information may help identify patterns of accurate and inaccurate information (Gollan et al. 2012; Wright et al. 2015). Training volunteers may increase accuracy or scientific reliability of their contributions and is a typical component of traditional citizen science projects (Flanagin and Metzger 2008; Tang and Liu 2016). In the Philippines, an automated process for validating crowdsourced flood reports compared them with nearby reports and with weather station data, identifying reports as correct if they fell within a computed confidence interval (Victorino et al. 2016). Structuring questions so that observations are as objective as possible, asking about data rather than content (e.g., a measurement of weight or size, precipitation, or the number of grass species within a frame; Gollan et al. 2012), could provide opportunities to cross-check accuracy.
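As a purely hypothetical illustration of this kind of cross-checking (not a protocol used by the NDMC, CoCoRaHS, or the USDM), the following Python sketch compares a new report’s dry-to-wet rating with the median rating of other reports submitted nearby in the same week and flags large departures for review; the radius, tolerance, and field names are all assumptions.

```python
# Hypothetical sketch: flag a volunteer report whose dry-to-wet rating departs
# from the median of nearby reports submitted in the same week.
from math import radians, sin, cos, asin, sqrt
from statistics import median

def km_between(lat1, lon1, lat2, lon2):
    """Great-circle distance in kilometers (haversine formula)."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * 6371 * asin(sqrt(a))

def cross_check(report, others, radius_km=40, tolerance=1):
    """Ratings run from -3 (severely dry) to +3 (severely wet)."""
    nearby = [o["rating"] for o in others
              if o["week"] == report["week"]
              and km_between(report["lat"], report["lon"], o["lat"], o["lon"]) <= radius_km]
    if not nearby:
        return "no nearby reports to compare"
    return "consistent" if abs(report["rating"] - median(nearby)) <= tolerance else "flag for review"

# Invented example data: two existing reports and one new report in central Missouri.
others = [{"week": 28, "lat": 38.6, "lon": -92.2, "rating": -3},
          {"week": 28, "lat": 38.7, "lon": -92.3, "rating": -2}]
print(cross_check({"week": 28, "lat": 38.65, "lon": -92.25, "rating": -3}, others))  # "consistent"
```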

Why people contribute to crowdsourced maps—their motivation—is another relevant focus of VGIS research. Budhathoki and Haythornthwaite (2012) situated analysis of motivation within discussion of project structure, distinguishing between lightweight and heavyweight contributions. Lightweight crowdsourcing is centrally organized, places little demand on participants and does not involve participants interacting with one another, and results lend themselves to statistical aggregation. In contrast, they described systems requiring heavyweight contributions as more akin to scholarly disciplines, where participants know one another personally, by reputation and by output, and interact over time to evolve the level of understanding. Another way to differentiate these two ends of the spectrum is “crowd” and “community” (Gómez-Barrón et al. 2016). Although communities require more effort and commitment from volunteers, they may also be capable of a more engaged level of participation and autonomy, as on Arnstein’s ladder (Arnstein 1969; Gómez-Barrón et al. 2016). Self-organizing efforts such as Wikipedia and OpenStreetMap need the full spectrum of participants, and should recognize and tap into differing motivations, with more serious contributors participating in building and shaping community, and more casual participants supplying data within a structure others have established (Budhathoki and Haythornthwaite 2012). A survey of motivations for contributing found that “seeing errors on a map of their local area is a particularly highly motivating factor,” leading both serious and casual mappers to believe that their actions can make a difference (Budhathoki and Haythornthwaite 2012, p. 570). The same survey found that citizen science projects tap into people’s desire to learn, and sometimes to participate in a community, and that the prospect of financial gain also motivated people.

d. USDM process, pressures, and uses

The USDM is a weekly map depicting the extent and severity of drought. The map comes out each Thursday, on the basis of data through the preceding Tuesday, and shows each area of the United States in one of six color-coded categories: none, abnormally dry (D0), and moderate (D1), severe (D2), extreme (D3), and exceptional (D4) drought (https://droughtmonitor.unl.edu/About/WhatistheUSDM.aspx). Each week’s map is the outcome of a process led by one of a rotating team of about a dozen authors. The author integrates many different streams of numeric data, such as precipitation and temperature measurements and various drought and vegetation indices. The author also incorporates input from a listserv of about 450 state and regional experts who provide local interpretation of conditions on the ground. A 2017 survey of USDM listserv participants found that each of them is on average in touch with another five contacts in their area (NDMC 2018). In some cases, this takes the form of a state climatologist such as Missouri’s Pat Guinan asking extension specialists to report on what they observe, and then the climatologist forwards the gathered reports to the USDM listserv. Discussions on the listserv often cite media stories and reports from extension or Farm Service Agency (FSA) for further detail from credible observers (Lackstrom et al. 2013). Another source of information about drought impacts is observations submitted via the DIR or email or phone calls from citizen scientists, agency personnel, extension specialists and others reporting on conditions in specific places. The advent of easy-to-use technology supporting crowdsourced mapping and citizen science is enabling more people to participate in describing drought conditions. This introduces questions of motivation and credibility into the USDM process, particularly because USDM authors have observed that higher numbers of drought-related observations sometimes coincide with call-in campaigns and calls to elected officials. This convergence of drought condition reporting and political pressure complicates the task of establishing the credibility of large numbers of first-time drought observers.

The drought areas depicted on the map are of particular interest to livestock producers because the level of drought for a county shown on the USDM triggers different amounts of drought relief under the Livestock Forage Disaster Program (LFP), including a substantial increase tied to the difference between D2 and D3 (Rippey 2019; Table 1). Although the LFP is tied to drought conditions, circumstances other than drought, such as a late freeze, dramatic temperature swings, or grazing practices, can also contribute to poor forage conditions. The USDM has been written into the U.S. Farm Bill as a trigger for drought relief under the LFP since 2008 (Food, Conservation, and Energy Act of 2008). The Internal Revenue Service (2006) also uses the USDM to define drought for livestock-related provisions. After drought in 2012, the USDM became a fast-track trigger for secretarial disaster declarations from the U.S. Department of Agriculture (USDA).
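For illustration only, the short Python sketch below scans a county’s weekly sequence of USDM categories for the kind of duration criterion cited in news coverage of the LFP (e.g., severe drought for eight consecutive weeks); it is not the FSA’s implementation, and the example series is invented.

```python
# Illustrative only: find the longest run of consecutive weeks at or above a
# given USDM category in a county's weekly time series.
LEVELS = {"None": -1, "D0": 0, "D1": 1, "D2": 2, "D3": 3, "D4": 4}

def longest_run_at_or_above(weekly_categories, threshold="D2"):
    longest = current = 0
    for category in weekly_categories:
        if LEVELS[category] >= LEVELS[threshold]:
            current += 1
            longest = max(longest, current)
        else:
            current = 0
    return longest

# Invented weekly series for a single county during summer.
weeks = ["D1", "D2", "D2", "D2", "D3", "D3", "D2", "D2", "D2", "D1"]
print(longest_run_at_or_above(weeks, "D2"))  # 8 consecutive weeks at D2 or worse
```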

Table 1.

LFP relief levels (from USDA Farm Service Agency 2018a).


e. Authors’ participation in events

Some authors of this article were also participants in the events of 2018. Smith is leading NDMC’s implementation of CMOR reporting, and Fuchs is a USDM author. Guinan worked closely with state agencies on drought response, particularly related to recruiting observers, and gathers observations to make recommendations from Missouri to USDM authors.

3. Missouri drought reporting

Missouri’s unprecedented volume of drought reporting in 2018 is an opportunity to explore what worked to garner a higher rate of participation, and to explore questions related to maintaining credibility as the group of participants expands.

a. Laying the groundwork

By 2018, author Guinan, the Missouri state climatologist, had a well-developed system for gathering drought impact observations, including a network of extension specialists and producers (Smith 2016). Guinan cultivated extension reports by emailing University of Missouri Extension specialists, often individually. He asked the agronomy, horticulture, and livestock specialists for photographs and descriptions of drought impacts in their areas. Guinan then compiled their reports and forwarded them to the USDM authors and listserv. In 2011, Guinan began routinely informing the public about the option to submit reports to the DIR. A press release from his office invited Missourians to contribute information that would be used by USDM authors in their weekly assessment process, and “hopefully provide a more accurate portrayal of drought” (Proctor 2011). The release also said that anyone could contribute and provide local expertise, and that observations become part of a long-term archive. It included the recommendation to report at least monthly. The results of Guinan’s efforts were apparent in 2012. Drought that year was one of the more intense and extensive droughts seen in the central United States. A disproportionate share of the volunteered reports in the DIR, nearly one-third of the total, came from Missouri that year.

b. A triple whammy of adverse weather

In 2018, weather conditions that Guinan called “The Triple Whammy” in a blog post created difficult circumstances for Missouri’s farmers and ranchers (Guinan 2018). Autumn 2017 and the following winter were dry, the driest September–January period in more than 40 years, providing what he called “little opportunity for cool season recharge of the soil profile and surface water supplies.” Then 2018 became known as “the year without a spring,” when temperatures flipped from the second coolest April to the hottest May on record, further stressing pastures and missing a normal window of growth. The state also had its hottest May–June on record, along with precipitation deficits (Guinan 2018).

c. Drought reports as an outlet for pressure

A convergence of events beyond the weather appears to have been responsible for the outpouring of reports from Missouri in 2018: on top of the adverse conditions documented above, the drought reporting network Guinan cultivated, more intuitive technology for drought reporting, and a “media blitz” all contributed. Just under half of Missouri was in drought at the start of 2018, according to the USDM. In January, Missouri news media reported a cold, dry start to the year, and in February, Guinan encouraged Missourians to submit drought condition reports, though none did. Other news stories in February discussed how continuing drought could affect planting decisions and other preparations for a dry year. Winter drought peaked on 20 February, with 63% of the state in D1 (moderate drought) or worse on the USDM. The area in drought declined sharply after that, with news stories in late February reporting on the substantial reduction. A wet March appeared to have eased concerns, with most news stories that mentioned drought providing routine updates on commodities rather than focusing on drought. A few news stories in April noted that it had been Missouri’s 10th driest April on record, that 38 counties were still in D0 (abnormally dry) or D1, and that this time period was a critical window for forage growth. Abnormally dry (D0) or worse reached a low of 28% of the area of Missouri on 17 April, with drought (D1 or worse) at 8%, according to the USDM.

Although news coverage of drought was minimal in May, D2 (severe) drought reappeared on the USDM on 29 May, encroaching on 2% of the state. Late May was also when stakeholders began to submit CMOR reports, although the reports remained relatively sparse through June. Drought expanded and intensified in June, with 37% of the area in D1 or worse and 16% of the area in D2 by 26 June. Media began to cover drought hardship, mentioning the emergency conservation program available through the FSA and effects on livestock producers, and the state Soil and Water Commission releasing lands for grazing due to drought. As dry conditions intensified, Guinan requested that a shorter, less visually daunting version of the CMOR report form be created for use by the University of Missouri Extension. The state also launched a media blitz, employing both press releases and social media, asking people to report on the conditions they were experiencing.

By 5 July, news media were reporting that “In order for cattle producers to receive federal drought assistance, they must be in a D2 drought for eight consecutive weeks, [or] a D3 drought for four consecutive weeks. Farmers can help report drought conditions that will be factored into the drought monitor” at the links for the DIR (KY3.com 2018). CMOR reports picked up dramatically the first week in July, with 53 the week starting 2 July. D3 (extreme drought) reappeared on the USDM map released 12 July, covering 8% of the state, and CMOR reports hit a peak of 509 for the week of 9–15 July. A time line comparing USDM status and the number of news, CoCoRaHS, and CMOR reports (Fig. 1) shows this peak occurring shortly after D3 first appeared on the USDM. CoCoRaHS reports peaked the following week, and media were at their second-highest level. On 18 July, following the recommendations of Missouri’s Climate and Weather Committee, Governor Parson issued an executive order declaring a drought alert for counties in D2–D4 and activating the state’s Drought Assessment Committee, which started meeting 26 July. The state’s Department of Natural Resources created web pages for drought news and information, including a link to the CMOR form. On 23 July, the High Plains Journal ran a story detailing the hardships ranchers were facing, as well as comments from M. Deering, the Missouri Cattlemen’s Association executive vice president. The article reported, “He is concerned the drought monitor isn’t adequately reflecting reality” (Bickel 2018). In the article, Deering specifically describes the need for more spatially detailed reports in the context of reaching D3 on the USDM map, and the need to “give a better assessment.” A second peak of CMOR reports occurred in early August, just before D4 (exceptional) appeared on the USDM. The area in D3 peaked at 25% on the map released 14 August. CMOR reports fell to 89 the week of 6–12 August and trailed off after that, but media reports hit their highest level in mid-August. The depiction on the USDM began to improve after mid-August, as heavy rains drenched some of the most affected areas.

Fig. 1.

Time line with CMOR, CoCoRaHS, news stories, and USDM status. The time line shows changes in the rates of CMOR, CoCoRaHS, and news media reporting across the year, along with the proportion of Missouri in each category of drought. The numbers of CoCoRaHS, CMOR, and news stories are scaled so that each is expressed as a percent of the highest number each achieved in a given month. The actual numbers of CoCoRaHS reports were much lower than CMOR reports, but the line is higher on the chart because CoCoRaHS reports were more evenly distributed across the year. The number of CoCoRaHS reports ranged from 16 in December to 40 in July, with all 12 months represented. The monthly mean for CoCoRaHS reports was 24.5, and the standard deviation was 6.8. CMOR reports were clustered in the summer months. The NDMC received 2 CMOR reports in May, 20 in June, 877 in July, 499 in August, 11 in September, and 5 in October. The monthly mean for May–October was 236, and the standard deviation was 370. An interactive version of the time line (https://go.unl.edu/MO2018_timeline) provides representative phrases when users hover over points so as to convey the gist of the discussion at different times.


d. Missouri drought response

A Missouri Department of Natural Resources (2019) postdrought report provides a summary of state responses: In the second half of August, the state made additional water and hay available through 28 Department of Conservation areas and 5 Department of Natural Resources areas and created a hay lottery for farmers on 900 acres (364 ha) of Missouri state park land. Two water systems received $77,000 in state funding, and $800,000 in emergency funds supported 10 more eligible projects. Missouri’s Soil and Water Districts Commission approved policy variances related to cover crops, grazing systems, pond cleaning, and additional grazing areas. The state’s Department of Transportation relaxed regulations on transporting hay. Extension specialists held more than 40 workshops related to drought and livestock. Federal assistance came through the USDA’s Natural Resources Conservation Service, via the Environmental Quality Incentives Program and Wetland Reserve Easements, and through the FSA (USDA Farm Service Agency 2018a), via the LFP and several other programs. Missouri livestock producers received $76.7 million from the LFP for losses in 2018, which was 16% of all LFP payments in 2018 (USDA Farm Service Agency 2018b).

4. Questions, data, and methods

a. Research questions

As stated above, our research questions are 1) what interests, circumstances, history, and recruitment messaging coincided to produce a high number of reports in a short time; 2) whether and how information from volunteer observers was useful to state decision-makers and to USDM authors; and 3) potential for complementary use of stakeholder and citizen science reports in assessing trustworthiness of volunteer-provided information.

This inquiry incorporates methods from several academic and applied disciplines and subfields including history, statistics, voluntary GIS and citizen science, political communication, issue tracking, and computer-assisted text analysis. More detailed descriptions of methods related to media searches and text analysis for the time line are in the online supplemental material.

b. CMOR reports

NDMC received a total of 1414 CMOR reports from Missouri in 2018, with 1015 submitted via the main form and 399 coming from the University of Missouri Extension form. For purposes of this analysis, we refer to these respectively as “CMOR-main” and “CMOR-MO.” Although some University of Missouri Extension specialists and the Department of Natural Resources linked directly to the CMOR-MO form, the CMOR-main form was available to Missourians via the DIR and may also have been circulated within the state. There were two differences between the forms (available in full in the online supplemental material). First, the CMOR-main form asked observers to describe their perception of conditions on a 7-point dry-to-wet scale, whereas the CMOR-MO form asked them to describe conditions only on the dry end of the same scale, including severely dry, moderately dry, and mildly dry. Second, the main form provided observers with a long checklist of possible impacts (e.g., “pasture condition”) grouped in several sectors: agriculture, environment, water, recreation and tourism, other business and industry, public and community health, fire, and other. Guinan and others were concerned that the CMOR-main form, a first deployment of new technology, was too visually daunting and would deter use. The CMOR-MO form listed the same sectors but no impacts within sectors and asked observers to check a sector and enter a text description. Both forms provided the option to upload a photograph, with a final text field for caption or additional description. Nearly all of the observers using the CMOR-MO form, 93%, provided text in one or more of the fields for free text. In contrast, only 42% of observers using the CMOR-main form entered text. Agriculture was the most frequently chosen sector on both forms, with one or more agriculture impacts checked off by 97% of observers using the CMOR-main form and the agriculture sector checked by 96% using the CMOR-MO form. The reports were similarly distributed in time and space, and indistinguishable by text content. Thus, for the bulk of our analysis, we combined observations from the two CMOR forms into a single larger set of observations and refer to them as CMOR reports, without the “main” or “MO” qualifier.
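Combining the two sets of reports amounts to a simple concatenation of the shared fields; the sketch below assumes each form’s responses were exported to CSV files, with hypothetical file and column names.

```python
# Sketch of combining CMOR-main and CMOR-MO responses into one set, keeping
# only the fields the two forms share and noting which form each report used.
import pandas as pd

main = pd.read_csv("cmor_main_2018.csv")  # hypothetical export of CMOR-main responses
mo = pd.read_csv("cmor_mo_2018.csv")      # hypothetical export of CMOR-MO responses
main["form"] = "CMOR-main"
mo["form"] = "CMOR-MO"

shared_columns = [c for c in main.columns if c in mo.columns]
cmor = pd.concat([main[shared_columns], mo[shared_columns]], ignore_index=True)
print(cmor["form"].value_counts())  # expect 1015 CMOR-main and 399 CMOR-MO
```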

c. CoCoRaHS reports

Twenty-nine different CoCoRaHS observers submitted a total of 294 condition-monitoring reports from Missouri in 2018, 60% of which included agriculture-related observations. These included 163 reports from May through October, the months when we received CMOR reports. CoCoRaHS observers choose one or more drought-related impact categories when they submit condition-monitoring reports. The categories for CoCoRaHS condition-monitoring reports are agriculture; energy; fire; tourism and recreation; plants and wildlife; business and industry; water supply and quality; society and public health; relief, response, and restrictions; and general awareness. (CoCoRaHS categories match those of the DIR, derived primarily from media reports. CMOR categories are currently more fluid, evolving in response to patterns of use and decision-makers’ needs.) CoCoRaHS observers also rate perceived conditions on the seven-point dry-to-wet scale, from severely dry to severely wet. The form is available in the online supplemental material.

d. Temporal and spatial comparison of CMOR and CoCoRaHS reporting patterns

CMOR reports were spatially dense, from 1320 different locations across the state, and clustered within the growing season, mostly within the same several weeks. CoCoRaHS reports were from 29 different locations and more evenly distributed throughout the year. A time line (go.unl.edu/MO2018_timeline; Fig. 1) and map (go.unl.edu/MO_2018; Fig. 2) visualize the distribution of CMOR and CoCoRaHS reports in time and space. The time line provides a means to compare the distributions of CMOR and CoCoRaHS reports across time. It also depicts numbers of news stories and USDM status to provide context about potential motivations to report. On the y axis, the numbers of reports from CMOR observers, from CoCoRaHS observers, and from news media are each scaled as a percent of the maximum for that report type. This created a common scale across raw numbers that would otherwise be difficult to compare meaningfully.
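A minimal sketch of that scaling follows; the CMOR monthly totals are those reported in the Fig. 1 caption, while the CoCoRaHS and news values here are placeholders.

```python
# Express each report type's monthly counts as a percent of that type's maximum
# so the three series can share one y axis.
counts = {
    "CMOR":     [2, 20, 877, 499, 11, 5],  # May-Oct totals from the Fig. 1 caption
    "CoCoRaHS": [28, 40, 35, 30, 25, 20],  # placeholder values except July (40)
    "News":     [10, 25, 60, 80, 30, 15],  # placeholder values
}

def percent_of_max(series):
    peak = max(series)
    return [round(100 * value / peak, 1) for value in series]

scaled = {name: percent_of_max(series) for name, series in counts.items()}
for name, series in scaled.items():
    print(name, series)
```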

Fig. 2.

One week of CMOR and CoCoRaHS reports for Missouri. An interactive map (https://go.unl.edu/MO_2018) that was not available in 2018 enables users to compare the full text of CoCoRaHS and CMOR reports over time and to click on points to see observers’ descriptions and photographs. It uses diamonds for CoCoRaHS reports, circles for CMOR-main, and squares for CMOR-MO, along with a common color scheme associated with the dry-to-wet scale. A time slider enables users to define what interval of time to view. Narrowing it to as little as a week facilitates comparison of CoCoRaHS and CMOR report content. Larger icons represent clusters of reports or more than one from a single location. CMOR reports from Missouri represented 1320 different points on a map, whereas CoCoRaHS reports were from 29 different points, each representing a registered observer. Most of the CMOR points (1252 of the 1320) had one associated report; 52 points had two reports, 10 points had three reports, 2 had four, and 4 had five. For CoCoRaHS, 15 points had one report and the remaining 14 had more than one, with three having 25 or more.


e. News stories

An analysis of news stories from Missouri in 2018 helps to recreate context. News stories provide a preliminary historic record of government actions, as well as a sense of what concerns different people expressed, and what drought impacts they experienced. Quantifying systematic news search results also provides a way to gauge the level of interest over time (Gruszczynski and Wagner 2017; McCombs and Shaw 1972). Our count of news stories in the time line is based on Meltwater search results (see the technical details in the online supplemental material for the full Boolean search query and other information). Meltwater is a subscription service marketed to public relations professionals for issue tracking. Stories are sorted chronologically and binned by week, with an added StoryID field so each story has a unique identifier. Descriptive phrases on the time line provide an overview of the gist of the stories that appeared each week.
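The binning step can be reproduced roughly as follows, assuming the Meltwater results are exported to a CSV file; the file and column names are hypothetical.

```python
# Sort stories by publication date, assign a sequential StoryID, and bin by week.
import pandas as pd

stories = pd.read_csv("meltwater_missouri_drought_2018.csv", parse_dates=["publish_date"])
stories = stories.sort_values("publish_date").reset_index(drop=True)
stories["StoryID"] = stories.index + 1                       # unique identifier per story
stories["week"] = stories["publish_date"].dt.to_period("W")  # weekly bins

weekly_counts = stories.groupby("week")["StoryID"].count()
print(weekly_counts)
```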

f. Time line: Context, content, quantifying interest

In addition to providing a way to visualize variation in rates of CMOR and CoCoRaHS reporting, the interactive time line incorporates news coverage and USDM status, providing general context on the physical and social elements of what people were experiencing (go.unl.edu/MO2018_timeline; Fig. 1). The USDM status provides some physical context for people’s experience and may also be interpreted as a way to visualize the relationship between D2 and the rate of reporting. The number of news stories serves as a gauge of awareness or interest in a topic. CoCoRaHS reports reflect both an ongoing citizen science initiative and people living through the drought. CMOR reports are a response to conditions. The interactive version of the time line displays representative phrases from each type of report for each week when a user hovers over a point in time. CMOR reports predominantly occurred within a short interval of time, and the theme of drought-related hardship was prevalent. Representative phrases are intended to capture both the theme (hardship and intensity of experience) and how it played out in different settings (feeding hay, reduced crop yield, more dust, etc.). CoCoRaHS reports exhibited greater variety, spanning the full range of the year and including more observations about nonagricultural conditions. A CoCoRaHS observer presciently noted in April that the cold, dry March and April inhibited grass growth, and tonnage would be down. More detail on construction of the time line is available in the online supplemental material.

g. Looking for a D2 effect: Analysis by USDM status

Comparison of the distributions of both CMOR and CoCoRaHS reports, May–October, by USDM status with a hypothetical “expected” distribution was a quantitative means to evaluate USDM authors’ impression that observers were more likely to submit reports when they were in D2. All of the 1414 CMOR reports came from May through October. This portion of the analysis used only the 163 CoCoRaHS reports that came during those six months. The expected distribution comes from computing the number of counties in each category of drought each week, assigning a county to the highest category that any proportion of the county had for that week, which is consistent with USDA practice. Expected counts are the numbers of CoCoRaHS or CMOR reports there would have been if each report type’s total had been distributed across USDM categories in proportion to the number of county weeks in each category; actual counts are the numbers of reports observed in each category. There were 115 counties (including the independent city of Saint Louis) and 27 weeks in our analysis, for a total of 3105 county weeks. The USDM depicted 471 county weeks, 15% of 3105, as being in D2. So, if observations were proportionately distributed, we would expect 15% of CoCoRaHS reports and 15% of CMOR reports to fall in D2. A chi-square goodness-of-fit test for CMOR and for CoCoRaHS reports determined whether differences between actual and expected distributions were statistically significant.
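A minimal sketch of this comparison is below. Only the D2 county-week total (471 of 3105) and the CMOR total (1414) come from the text; the other county-week and observed counts are placeholders, and the actual per-category counts appear in Table 3.

```python
# Chi-square goodness-of-fit test of observed report counts against expected
# counts proportional to county weeks in each USDM category.
from scipy.stats import chisquare

county_weeks = {"None": 1200, "D0": 700, "D1": 500, "D2": 471, "D3": 184, "D4": 50}  # D2 from text; others placeholders
total_weeks = sum(county_weeks.values())  # 3105

def expected_counts(total_reports):
    return [total_reports * weeks / total_weeks for weeks in county_weeks.values()]

observed_cmor = [10, 40, 150, 600, 570, 44]  # placeholder counts by category; sum = 1414
chi2, p = chisquare(f_obs=observed_cmor, f_exp=expected_counts(sum(observed_cmor)))
print(round(chi2, 1), p)
```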

h. Exploring motives via a survey of CMOR observers

In spring 2019, we sent a brief survey to the 814 observers from Missouri in 2018 who provided email addresses when they used either the CMOR-main or the CMOR-MO form. The purpose of the first three questions was to learn what motivated people to provide a report, and what or who was most influential in their decision to submit a report. To assess potential for turning event-driven observers into long-term observers, another question asked whether they would be willing to submit reports regularly. A final question provided an opportunity for any other comments. The survey was administered via the Qualtrics online survey software. People on the list received up to three emailed invitations to participate, via a unique link. The survey had a response rate of 29%, with 236 respondents answering one or more of the questions. Exact wording of the questions and possible responses are included in results tables.

i. Preliminary assessment of CMOR use in decision-making

In the autumn of 2018 we surveyed the 12 USDM authors about their use of different sources of information related to drought impacts, including the DIR, CoCoRaHS, and CMOR reports. The survey was administered via Qualtrics. USDM authors received an anonymous link that would ensure that their answers would remain confidential. Seven of the USDM authors responded to the survey. Several questions primarily related to ease of use are not included here. Tables 8 and 9, described in more detail later in the paper, provide the full list of choices and responses from multiple choice questions. Preliminary insight on how the state made use of CMOR reports came from email exchanges and a brief conversation with a Missouri official.

5. Results

a. Temporal and spatial comparisons

Table 2 summarizes comparisons of CMOR and CoCoRaHS reports. Figure 1 and the caption provide detailed comparison of their patterns over time; Fig. 2 and the caption provide a mapped comparison.

Table 2.

Summary comparison of CMOR and CoCoRaHS reports.


b. D2 effect: Comparison by USDM status

We separately analyzed CMOR and CoCoRaHS reports to see whether either of them exhibited a D2 effect, namely, a greater propensity to report in D2 than at other times. CoCoRaHS observations were more evenly distributed across USDM categories than CMOR observations, but still showed a statistically significant pattern (χ2 = 15.07, df = 5, and p < 0.02), with observations in D2 higher than what would be expected if they were evenly distributed. But CMOR observations showed a much more pronounced pattern, with the numbers of observations in D2 and D3 greatly exceeding what would be expected if they were evenly distributed. The difference between expected and actual was highly statistically significant (χ2 = 1324, df = 5, and p < 0.001) (Table 3; Fig. 3).

Table 3.

D2 effect: comparison of actual vs expected number of reports. We calculated an “expected” number of reports on the basis of how many there would have been if the total number of each type of reports had been proportionately distributed across the number of county weeks in each level of drought. The “county weeks” column shows how many county weeks fell into each USDM level, May–October. The “prop” column is the proportion of the total number. “Reports” are how many reports were submitted, by USDM level. “Expected” is how many reports there would have been if the number of reports were proportionate to the number of weeks in each USDM level. “Diff” is the difference between actual and expected numbers of reports. A chi-square statistic was computed for each USDM level by squaring the difference and dividing by the expected number and then summing them for a statistic that applied to the entire group of reports. CoCoRaHS observations were more evenly distributed across USDM categories than CMOR observations but still showed a statistically significant pattern (χ2 = 15.07, df = 5, and p < 0.02), with observations in D2 notably higher than what would be expected if they were evenly distributed. CMOR observations showed a more pronounced pattern, with the numbers of observations in D2 and D3 greatly exceeding what would be expected if they were evenly distributed. The difference between expected and actual was highly statistically significant (χ2 = 1324, df = 5, and p < 0.001).

Fig. 3.

The D2 effect: comparison of expected vs actual reports. The transparent gray bars show how many reports we would expect in each drought category if the 1414 CMOR reports and the 163 CoCoRaHS reports were proportionate to the number of county weeks that the USDM actually depicted in each category of drought for May–October 2018 in Missouri. Both CoCoRaHS and CMOR reports had more reports in D2 than we would expect if reports were proportionately distributed, but the D2 effect was much more pronounced in CMOR reports than in CoCoRaHS reports, and a similar effect also showed up in D3 for CMOR reports.


c. Assessments of dry, normal, and wet conditions

In keeping with the timing of recruitment messaging, CMOR observations were almost entirely concentrated on the dry end of the dry-to-wet scale, with 67% reporting that conditions were “severely dry,” and 23% selecting “moderately dry.” (Note also that many people submitted CMOR reports when the USDM depicted their areas as being in D2, severe drought, so the use of the word “severe,” even on a different scale, may have biased them toward reporting that conditions were severely dry.) In contrast, CoCoRaHS reports were much more evenly distributed across the range of conditions. Although there were more dry than wet conditions reported, the condition most frequently reported was “near normal” (Fig. 4).

Fig. 4.

Comparison of dry, normal, or wet conditions. Both CMOR and CoCoRaHS observers have the opportunity to pick a dry or wet level on a seven-point scale from severely dry to severely wet. Figure 4 shows that nearly all CMOR reports, which generally came during the peak of drought conditions, reported that conditions were severely or moderately dry. In contrast, CoCoRaHS reports, which were more evenly spread across the year, reported the full range of conditions, with the largest single group reporting near-normal conditions. The CoCoRaHS reports analyzed in this chart are only for May–October to be more directly comparable.


d. Comparison of impact categories (sectors) of CMOR and CoCoRaHS reports

Comparison of categories represented in each report type found that agriculture was represented in nearly all CMOR reports, and in 60% of CoCoRaHS reports (Fig. 5). CMOR-main reports had more categories proportionately represented than either of the other sets of reports. CoCoRaHS observers had three additional categories to choose from: energy, general awareness, and relief.

Fig. 5.

Comparison of categories by report form. This figure compares the proportion of observations from each set of reports (CMOR-main, CMOR-MO, and CoCoRaHS) showing which category or sectors of impact were selected. CMOR-main and CMOR-MO reporters cited agriculture at a nearly identical rate, 97% and 96%, whereas 60% of CoCoRaHS reporters described agriculture-related effects. CoCoRaHS observers had options for three categories that were not available to CMOR observers: energy, relief (policy responses), and general. The CMOR-main form included more detailed prompts for various sectors, which is relevant if evaluating differences between CMOR-main and CMOR-MO in the number of categories selected.


e. Results from survey of CMOR observers

Recognizing that CMOR reporters may have heard about the opportunity in various ways, we asked which was most influential. The largest single group of respondents, 27%, ranked hearing from an organization such as the Cattlemen’s Association or Farm Bureau as most influential (n = 161; Table 4). Hearing from government agencies, extension, acquaintances, or social media was in each case slightly less influential, and hearing from traditional news media was notably lower, ranked highest by only 4%. The data supported our impression that at least some individuals were submitting reports explicitly to influence the USDM map. When asked what best described their motivation for submitting a report, 30% of respondents (n = 236; Table 5) chose “to change the U.S. Drought Monitor map.” Similarly, when asked to rank possible effects that they anticipated their report would have, “change the U.S. Drought Monitor map for your location” was the most important reason for 31% of respondents and “increase awareness of drought conditions for U.S. Drought Monitor authors” was the most important reason for 25% of respondents (n = 173; Table 6). More than half of the respondents expressed willingness to submit observations over time (n = 230; Table 7), with 17% saying weekly, 31% monthly, and 19% choosing “weekly or monthly in certain seasons.” Comments in a free-text box indicated that at least one person expected a personal response beyond an automated thanks. Additional comments from the survey are included in the online supplemental material.

Table 4.

How observers learned about reporting opportunity: responses to “How did you hear about the opportunity to submit a report on local drought conditions and impacts in summer 2018? Please check all that apply and rank them in order from most to least influential.”

Table 5.

Motivation for reporting: responses to “Which of the following best describes your motivation for submitting a report on local drought conditions and impacts in summer 2018? Please pick one.”

Table 6.

Anticipated effect of reporting: responses to “What effect did you anticipate that your report would have? Please select all that apply and rank them in order, from most to least important to you.”

Table 7.

Willingness to report regularly: responses to “Would you be willing to submit a report on local conditions, including when it is wet, normal, or dry, on a regular basis? Please pick one.”


f. USDM author use of CMOR reports

When asked, “In general, as an author, which of these options best describes your approach to observer-submitted reports of drought impacts (select one),” three of the seven respondents said their approach was to consult such reports when they needed to fill gaps or reconcile differences in data, and three provided answers in comments (Table 8). Two of the commenters used impacts to identify areas that might need additional attention, and one occasionally consulted observer-submitted reports. Asked, “Have you ever used information from CoCoRaHS in deciding where to depict drought, or in other functions related to U.S. Drought Monitor authoring?” four said yes and three said no. (The question did not distinguish between CoCoRaHS precipitation data and CoCoRaHS condition-monitoring reports.) Asked whether they had used the new CMOR reports in decision-making in 2018, all said no, although four had not authored the map during the relevant time period. In the comment fields, one said they would likely consult CMOR reports in the future. Asked what would make observer reports more useful, six chose “individual observers contributing consistently over time” (Table 9), and one said in comments that the connection with the U.S. Department of Agriculture’s Livestock Forage Disaster Program cast serious doubt on the credibility of volunteer-submitted reports. Authors’ use of CMOR reports appears to be increasing over time. A similar survey in 2020 found that six of the nine authors who responded had used CMOR reports when authoring the map, including three who spent more time digging into the climate data for the area, two who requested more information from locally knowledgeable sources, and one who moved a line on the map.

Table 8.

USDM author survey excerpt, approach to observer reports: responses to “In general, as an author, which of these options best describes your approach to observer-submitted reports of drought impacts (select one).”

Table 9.

USDM author survey excerpt, potential enhancements: responses to “What, if anything, would make user-submitted condition-monitoring reports, sector impact reports, and photos more useful to you? (select all that apply).”


g. State of Missouri use of CMOR reports

Guinan directed our inquiry about state officials’ use of CMOR reports to the state’s Water Resources Center. The director responded as follows:

During this particular drought, the impact reporter was particularly helpful because drought conditions were not widespread like in 2012. Though we ended up with a drought monitor map that showed more widespread extreme and exceptional drought by the middle of August, we were experiencing some pretty severe impacts in localized areas throughout the state beginning as early as May 2018. Without the impact reporter, these localized impacts would likely not have been on our radar—these reports helped us give valuable information to planners throughout the state (J. Hoggatt 2018, personal communication).

Hoggatt elaborated further in a November 2019 group discussion at a Midwest Drought Early Warning System workshop, saying that the state’s approach was “trust but verify.” They assumed that people’s reports were truthful but used state agency and extension contacts to confirm conditions at specific locations if resource allocation decisions were being made. She also noted that before-and-after photographs, contrasting normal and dry conditions at the same location, were useful. Hoggatt said that the state does not use CoCoRaHS reports consistently because it can be difficult to find one from a relevant location at the right time (J. Hoggatt 2020, personal communication). In 2018, Guinan followed his normal pattern of communication regarding CoCoRaHS reporting, sending a letter to welcome new volunteers and publicizing CoCoRaHS reporting along with CMOR reporting in several professional presentations early in the year.

6. Discussion

a. What led to Missouri’s bumper crop of reports in 2018

Going into this inquiry, we knew of several factors that combined to produce a large number of reports from Missouri in 2018, including extreme weather; Guinan’s cultivation of reports over the years, particularly from the University of Missouri Extension; newly implemented, easier-to-use technology for collecting reports; and the state’s “media blitz” in 2018. The combined requests from Guinan, state agencies, and the Cattlemen’s Association appear to have been well timed, eliciting reports as drought and drought impacts were intensifying. The requests for reports came as people were looking for ways to vent their feelings and to take action that could relieve their suffering. The messaging from both the state and the Cattlemen’s Association, focused on helping USDM authors understand local conditions, tapped into the desire to help distant cartographers reflect local conditions accurately, a strong motivation to participate in crowdsourced mapping (Budhathoki and Haythornthwaite 2012). Guinan, the state of Missouri, and the Cattlemen’s Association assured people that submitting drought reports would help the makers of the drought map get it right. This provided a sense of agency, focusing stakeholders’ attention on an opportunity to have a voice in the process, with a clear sense of how observations would be used, all of which motivate action (Budhathoki and Haythornthwaite 2012; Lackstrom et al. 2013; Meadow et al. 2013).

Our survey of Missouri observers revealed that hearing from an advocacy organization such as the Cattlemen’s Association may have been the single most energizing factor in their decision to submit a report. In retrospect, our survey should have better separated questions about who prompted reporting (agencies, organizations) from how observers learned about reporting (news media, social media). The fact that only 4% of respondents said hearing via news media was most influential suggests interesting research possibilities into message transmission and amplification. Research focused on this aspect could help account for the effects of different actors with shared interests using both traditional and social media to bring about action on the part of the public related to a natural disaster (Neuman et al. 2014).

b. The state’s use of information provided by volunteers

The state official’s description of how the reports were used suggests that the spatial density they achieved was of value in assessing the extent of drought-related conditions and impacts. The density resulted from inviting agricultural producers across the state to submit reports, which was easier than in the past because of technological improvements. While this was quite effective, it also went beyond previously established conventions, which had focused more on gathering on-the-ground reports from known extension specialists. A “trust but verify” approach enabled state officials to make use of the reports, at the very least as a suggestion for which areas should be examined for further evidence of hardship.

c. USDM authors’ use of information provided by volunteers

In retrospect, at least two main factors contributed to USDM authors’ not directly using CMOR reports in decision-making. The first is that there is no standard procedure for blending narrative information or a subjective assessment of conditions with numeric information in the USDM process. It would be rare for any impact information to be directly used in the USDM process, other than as a means to set priorities for where to look more closely at data. However, it would be a mistake to equate “not directly used” with “not useful.” Identifying spots that merit more examination is a real part of the process. It is an operational, sequential mixed-methods process (Creswell and Plano Clark 2018), in which qualitative information about areas where people are experiencing hardship triggers closer analysis of quantitative hydrometeorological data.

Second, CMOR reports were new in 2018, and unlikely to gain immediate trust (Coleman et al. 2009; Flanagin and Metzger 2008), especially in a pressured situation. However, it was clear at the time that USDM authors were aware that ranchers and others in Missouri believed that the USDM needed to depict more intense drought over the summer. Grassroots constituencies drew authors’ attention to that area for greater scrutiny, with phone calls and email as well as CMOR reports. A substantial portion of the state did end up in more intense categories of drought.

d. Mediated use of reports

Although USDM authors may not have directly used CMOR reports, and indeed, viewed aspects of the push to garner more reports as a lobbying campaign, the detailed spatial information in the reports reached USDM authors in mediated form. CMOR reports were one of the sources of information that Guinan, a trusted translator of his state’s experience (Lackstrom et al. 2013; Cash 2001), consulted to provide recommendations about Missouri to USDM authors. Guinan is both the state climatologist and part of the University of Missouri Extension. The reports provided Guinan with detailed spatial information that he and state officials could verify and validate, to provide evidence-backed, modulated recommendations to USDM authors. In this case Guinan was working across both types of boundaries identified by Cash (2001): across perspectives, from producers to scientists, and across levels, from local to state and national uses. This suggests that better defining the process, particularly with regard to the role of interpreters such as Guinan, would be beneficial for all involved. The multilevel process may be in contrast to the expectations of agricultural producers who are accustomed to working directly with representatives of federal agencies. In the absence of clarifying information, they may reasonably assume that the CMOR form is a personal opportunity for them to express a preference or receive assistance, rather than a means of contributing data that will be weighed along with other considerations.

e. Accounting for underlying conditions

Observations may also serve as a way to identify underlying conditions that are compounded by drought. There is currently no systematic process for the USDM to account for the effects of underlying conditions. Some of these contributing conditions, such as good or bad soil health resulting from management decisions, depend on human decisions (Van Loon et al. 2016b). The extent to which the USDM reflects drought-related experience, including outcomes based on human decisions, as opposed to being an expert-interpreted synthesis of hydrometeorological indicators, is somewhat ambiguous. While the USDM incorporates a broader range of considerations than any other drought-monitoring tool, including short- and long-term impacts (Svoboda et al. 2002), using the USDM as a trigger for LFP payments suggests a need to ensure that the USDM depicts drought as experienced by livestock producers as accurately as possible. Hence, means to account for poor conditions driven by both natural events, such as a late freeze, and human decisions, such as stocking rates, would be valuable additions to the process.

f. Sustaining interest in drought reporting

Another question deserving further investigation is whether and how to convert crisis-driven first-time observers into reporters committed to submitting observations over the long term—is there an opportunity for the lightweight, casual participants (Budhathoki and Haythornthwaite 2012) to become more serious community members? A significant portion of the CMOR observers indicated willingness to submit reports regularly, not just in drought. Sustaining their involvement over time would require regular communication and outreach. Given the NDMC’s current role, providing infrastructure but leaving outreach in the hands of intermediaries in different states with different institutional capabilities and circumstances, it is not clear who would do the communication and outreach. As Dilling and Lemos (2011) noted, the institutions that would be necessary to facilitate regular communication do not always exist, although extension may well be a logical choice in many states (Cash 2001).

g. Potential for complementary use of stakeholder and trusted observer reports

A backbone of experts can support involvement of other volunteers, putting the first points on the map and providing examples (Budhathoki and Haythornthwaite 2012; Meadow et al. 2013). They can also provide points of comparison or focus for decision-makers who would like a way to gauge reliability of crowdsourced observations. Although CoCoRaHS citizen science observers are not necessarily experts, they benefit from consistent training and guidelines. The juxtaposition of CoCoRaHS and CMOR reports in Missouri in 2018 presents a natural opportunity to explore the possibility of using the two report collection systems to complement one another. The sustained effort to motivate and train CoCoRaHS reporters over time yields, in some cases, consistent sets of observations for a single point. Narrowing the time slider on the interactive map to a week or a few weeks helps identify CoCoRaHS and CMOR reports from the same time that are near one another. Comparing their dry-to-wet ratings and descriptive content is a way of checking whether the newcomers to the discussion—one-time CMOR participants recruited as part of the push—provided information that was consistent with information from longer-term observers. Read a week at a time, neighboring CMOR and CoCoRaHS reports from early July through mid-August were consistently on the dry side and provided similar descriptions of conditions.
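If report locations, dates, and dry-to-wet ratings were exported as tables, this kind of comparison could be automated along the following lines. The sketch is a rough illustration under assumed column names (lat, lon, date, rating), an assumed 25-km pairing radius, and a 7-day window; it is not a protocol used in this study.

```python
# Sketch: pair CMOR reports with nearby, contemporaneous CoCoRaHS reports and
# compare their dry-to-wet ratings (-3 severely dry to +3 severely wet).
import numpy as np
import pandas as pd

def pair_reports(cmor: pd.DataFrame, cocorahs: pd.DataFrame,
                 max_km: float = 25.0, max_days: int = 7) -> pd.DataFrame:
    """Pair each CMOR report with CoCoRaHS reports from the same window and area."""
    pairs = []
    for _, c in cmor.iterrows():
        # CoCoRaHS reports within max_days of the CMOR report date
        near_time = cocorahs[
            (cocorahs["date"] - c["date"]).abs() <= pd.Timedelta(days=max_days)
        ]
        if near_time.empty:
            continue
        # Approximate great-circle (haversine) distance in kilometers
        lat1, lon1 = np.radians(c["lat"]), np.radians(c["lon"])
        lat2, lon2 = np.radians(near_time["lat"]), np.radians(near_time["lon"])
        dlat, dlon = lat2 - lat1, lon2 - lon1
        a = np.sin(dlat / 2) ** 2 + np.cos(lat1) * np.cos(lat2) * np.sin(dlon / 2) ** 2
        dist_km = 2 * 6371.0 * np.arcsin(np.sqrt(a))
        for _, k in near_time[dist_km <= max_km].iterrows():
            pairs.append({
                "cmor_rating": c["rating"],
                "cocorahs_rating": k["rating"],
                "rating_diff": c["rating"] - k["rating"],
            })
    return pd.DataFrame(pairs)

# Example use: mean absolute disagreement on the shared dry-to-wet scale
# agreement = pair_reports(cmor_df, cocorahs_df)["rating_diff"].abs().mean()
```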

CoCoRaHS observers provide observations over time for a single location, whereas the CMOR reports provided observations for many locations within a short span of time. To use a medical analogy, ongoing reports are similar to regularly monitoring vital signs, whereas a concerted effort to get many reports in a short period of time is akin to an X-ray or a scan that provides more detailed information.

h. Expanding networks of trusted observers

Extension and FSA are other potential sources of expert observations (Lackstrom et al. 2013), and some extension agents have made use of CMOR reporting. As of late 2020, however, extension specialists’ use of CMOR reports varied by state. Some submitted reports as observers or on behalf of producers in their counties, while others, such as Guinan, were state climatologists involved in soliciting and interpreting reports for the USDM authoring process. Consistent guidelines for good practices, articulating the process for all involved, would be beneficial. Although states’ institutional arrangements and norms have evolved differently over time, recommendations on best practices have potential to increase uniformity. CMOR reporting has potential to systematize reporting from extension, FSA, and the public at large, and make observations transparent, part of a public record, and available to researchers. It may also be worthwhile to explore the potential of producer organizations such as the Cattlemen’s Association to serve as boundary organizations.

i. Recommendations and next steps

Several areas merit additional research and development. The fact that observers were motivated to spend time submitting a report does not negate the accuracy of the information that they shared. Finding efficient ways to validate or interpret the information would enable USDM authors to tap into a rich source of information about conditions leading or contributing to drought impacts.

Working with all involved to define objectives and processes for each state, with attention to the role of boundary organizations such as extension, may help build trust with USDM authors and provide a means to verify sudden influxes of reports associated with intensifying drought. Making it clear that drought observations will be evaluated as data will also be helpful, countering the impression that submitting a report is a new way to file for drought assistance, or that it is part of a democratic process in which the number of “votes” matters.

Credibility scores could be a productive focus for further research (for both CMOR and CoCoRaHS reports), with consistency of reporting over time as a key metric. Devising a calibration process, based on a developing understanding of how an observer or set of observers behaves over time in relation to different dry and wet conditions, would be possible for CMOR reports if observations occurred more regularly (Coleman et al. 2009). It would provide much more context than clusters of reports submitted mainly when the prospect of federal assistance creates a financial incentive. Comparing different types of observer reports, such as CoCoRaHS and CMOR, from similar spatial areas could also provide a means of calibrating observations (Gollan et al. 2012; Victorino et al. 2016; Wright et al. 2015). The dry-to-wet scale is directly comparable for CMOR and CoCoRaHS reports.
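As one illustration of what such a score might look like, and only under assumed column names and a stand-in reference indicator, observers could be scored by how consistently their dry-to-wet ratings track the USDM category for their county and week. This is a hypothetical sketch, not a method proposed here or used in the USDM process.

```python
# Hypothetical credibility score: how well each observer's ratings track a
# reference indicator. Column names and the minimum-report threshold are
# assumptions for illustration only.
import pandas as pd

def credibility_scores(reports: pd.DataFrame, min_reports: int = 6) -> pd.Series:
    """Score observers by consistency between their ratings and the USDM.

    `reports` needs columns: observer_id, rating (-3 severely dry to +3
    severely wet), usdm_level (0 = no drought to 5 = D4).
    """
    def score(group: pd.DataFrame) -> float:
        if len(group) < min_reports:
            return float("nan")  # too few reports to judge consistency
        # Drier ratings should accompany higher USDM levels, so a strong
        # negative rank correlation indicates consistent reporting.
        corr = group["rating"].corr(group["usdm_level"], method="spearman")
        return -corr  # flip sign so higher = more consistent

    return reports.groupby("observer_id").apply(score)

# Example use: scores = credibility_scores(cmor_df); higher values indicate
# ratings that track drought levels more consistently over time.
```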

The evolving reporting system needs further consideration of how to handle open participation, and of whether to provide options tied to a consistent user ID, such as logging in or voluntarily identifying oneself, that would allow observers to build credibility. Maps could also distinguish reports from extension, FSA, or other trusted observers (Lackstrom et al. 2013).

Balancing privacy and transparency is a recurring theme. Extension and FSA reports for the National Agricultural Statistics Service are in many cases aggregated to the state level to protect the privacy and interests of local agricultural producers. Further assessment of risk to producers should be part of any effort to make reporting more transparent. Risk as traditionally constructed in this context relates to disclosing too much information and losing competitive advantage. The concept of risk may need to be broadened to encompass personal safety or property protection. Training of observers may eventually need to cover what information not to share.

Further research could also focus on ascertaining differences in individual and collective motivation. The surge of reports that occurred in D2 was closely related to the timing of requests in the media from Guinan, the state, and the Cattlemen’s Association. Further research may determine that the timing of the 2018 surge said more about when the collective sense of urgency peaked, and when the state and Cattlemen’s Association could articulate the clearest connection between action and outcome, than about the motivation or experience of individual producers.

7. Conclusions

We investigated what led to a high rate of reporting in Missouri in 2018; the value of reports to state decision-makers and USDM map makers; and the potential for complementary use of CoCoRaHS and CMOR reports. In addition to the more obvious reasons for more reports (a “triple whammy” of unfavorable weather conditions, Guinan’s carefully cultivated impact-reporting network, and a newer, more user-friendly way to collect reports), a survey of observers found that hearing from the Missouri Cattlemen’s Association as part of a statewide push may have been particularly influential in their decision to submit a report. State officials used reports to help determine the spatial extent of drought impacts, verifying information as needed. USDM authors were leery of a barrage of reports that coincided with a financial incentive to intensify drought status from D2 to D3, but much of the state did end up in D3 or D4. The volunteer reports were one of the sources of information used by Guinan, who was the state climatologist and Missouri’s point person for contributions to the USDM process. CoCoRaHS condition-monitoring reports can provide useful validation for CMOR reports. The two reporting systems share a common dry-to-wet categorization but tap into different motivations. CoCoRaHS condition-monitoring observers are citizen scientists who make a commitment to submit regular precipitation measurements over time, and they benefit from being part of a learning community. CMOR observers tend to be event-driven, often responding to drought as agricultural producers whose livelihoods are adversely affected. Comparing the two sets of reports provides a means of verification and could allay concerns about bias from financial motivation, although some CoCoRaHS observers are also stakeholders.

We speculate that another reason for the large number of condition-monitoring reports from Missouri stakeholders in 2018 is a mismatch between the scope of the USDM process and the assistance that is linked to the map. While USDM authors consider on-the-ground descriptions in assessing conditions, in the end they rely on numeric measurements of precipitation, temperature, and other climate and hydrologic products to create the map. But producers may sometimes endure very poor pasture conditions for reasons other than drought, or drought may compound various preexisting conditions. This creates a high-pressure situation, with USDM authors asked to change the map for reasons that are difficult to justify under their primary mandate. Here we begin the conversation on a systematic means of evaluating reports that may result from the gap between numeric drought monitoring and federal LFP assistance.

We have several recommendations for the evolving CMOR reporting system: Explore converting event-driven CMOR observers into long-term reporters, to provide historic context for observations, and so that individual observers can build credibility. Best-suited boundary organizations might vary by state, as extension and other networks have evolved differently. The system needs to strike a careful balance between privacy and transparency. More structure, including better guidance and expectation management for observers, will be helpful, although exactly how to sustain motivation will likely be at least in part a function of what networks and boundary organizations come into play in each state.

Developing a map layer of human-reported drought-related conditions has potential to fill well-identified gaps in tracking drought impacts, including elusive aspects such as expectations, the role of human decision-making, and underlying vulnerability. The immediate challenge is finding the reasons for people across the country to invest their energy over time in describing drought-related conditions. The experience with drought-condition-monitoring reports in Missouri in 2018 suggests that it is possible; identifying boundary organizations that can support reporting networks will be key.

Acknowledgments

This work was partially supported by the Drought Risk Management Research Center, which is a cooperative agreement between the National Oceanic and Atmospheric Administration’s Sectoral Applications Research Program and the National Integrated Drought Information System. We also thank the reviewers for their detailed suggestions.

REFERENCES

  • Arnstein, S. R., 1969: A ladder of citizen participation. J. Amer. Inst. Plann., 35, 216–224, https://doi.org/10.1080/01944366908977225.

  • Bachmair, S., and Coauthors, 2016: Drought indicators revisited: The need for a wider consideration of environment and society. Wiley Interdiscip. Rev.: Water, 3, 516–536, https://doi.org/10.1002/wat2.1154.

  • Bickel, A., 2018: In Missouri, ranchers are praying for rain. High Plains J., 26 Jul 2018, https://www.hpj.com/bickel/in-missouri-ranchers-are-praying-for-rain/article_e63b2c93-50e2-5d9d-86f9-abbb35813c93.html.

  • Blauhut, V., L. Gudmundsson, and K. Stahl, 2015: Towards pan-European drought risk maps: Quantifying the link between drought indices and reported drought impacts. Environ. Res. Lett., 10, 014008, https://doi.org/10.1088/1748-9326/10/1/014008.

  • Budhathoki, N. R., and C. Haythornthwaite, 2012: Motivation for open collaboration: Crowd and community models and the case of OpenStreetMap. Amer. Behav. Sci., 57, 548–575, https://doi.org/10.1177/0002764212469364.

  • Cash, D. W., 2001: “In order to aid in diffusing useful and practical information”: Agricultural extension and boundary organizations. Sci. Technol. Hum. Values, 26, 431–453, https://doi.org/10.1177/016224390102600403.

  • Coleman, D., Y. Georgiadou, and J. Labonte, 2009: Volunteered geographic information: The nature and motivation of produsers. Int. J. Spat. Data Infrastruct. Res., 4, 332–358.

  • Crausbay, S. D., and Coauthors, 2017: Defining ecological drought for the twenty-first century. Bull. Amer. Meteor. Soc., 98, 2543–2550, https://doi.org/10.1175/BAMS-D-16-0292.1.

  • Creswell, J. W., and V. L. Plano Clark, 2018: Designing and Conducting Mixed Methods Research. 3rd ed. SAGE Publications, 520 pp.

  • Dilling, L., and M. C. Lemos, 2011: Creating usable science: Opportunities and constraints for climate knowledge use and their implications for science policy. Global Environ. Change, 21, 680–689, https://doi.org/10.1016/j.gloenvcha.2010.11.006.

  • Ding, Y., M. J. Hayes, and M. Widhalm, 2011: Measuring economic impacts of drought: A review and discussion. Disaster Prev. Manage., 20, 434–446, https://doi.org/10.1108/09653561111161752.

  • Epstein, P. R., and C. Defilippo, 2001: West Nile virus and drought. Global Change Hum. Health, 2, 105–107, https://doi.org/10.1023/A:1015089901425.

  • Flanagin, A. J., and M. J. Metzger, 2008: The credibility of volunteered geographic information. GeoJournal, 72, 137–148, https://doi.org/10.1007/s10708-008-9188-y.

  • Gollan, J., L. L. de Bruyn, N. Reid, and L. Wilkie, 2012: Can volunteers collect data that are comparable to professional scientists? A study of variables used in monitoring the outcomes of ecosystem rehabilitation. Environ. Manage., 50, 969–978, https://doi.org/10.1007/s00267-012-9924-4.

  • Gómez-Barrón, J.-P., M.-Á. Manso-Callejo, R. Alcarria, and T. Iturrioz, 2016: Volunteered geographic information system design: Project and participation guidelines. Int. J. Geo-Inf., 5, 108, https://doi.org/10.3390/ijgi5070108.

  • Gruszczynski, M., and M. W. Wagner, 2017: Information flow in the 21st century: The dynamics of agenda-uptake. Mass Commun. Soc., 20, 378–402, https://doi.org/10.1080/15205436.2016.1255757.

  • Guinan, P. E., 2018: The triple whammy. Missouri Climate Center News, 1 Aug 2018, http://climate.missouri.edu/news/arc/jul2018a.php.

  • Internal Revenue Service, 2006: Extension of replacement period for livestock sold on account of drought. IRS Notice 2006-82, https://www.irs.gov/pub/irs-drop/n-06-82.pdf.

  • Kallis, G., 2008: Droughts. Annu. Rev. Environ. Resour., 33, 85–118, https://doi.org/10.1146/annurev.environ.33.081307.123117.

  • KY3.com, 2018: Drought conditions impact Barry County cattle producers and hurt hay yields. KY3, 5 Jul 2018, https://www.kspr.com/content/news/Drought-conditions-impact-Barry-County-cattle-producers-and-hurt-hay-yields--487445611.html.

  • Lackstrom, K., and Coauthors, 2013: The missing piece: Drought impacts monitoring. Carolinas Integrated Sciences & Assessments Program and the Climate Assessment for the Southwest Workshop Rep., 23 pp., https://www.drought.gov/drought/sites/drought.gov.drought/files/media/resources/workshops/20130305_Drought_Impacts_Monitoring_Tuscon_AZ/Drough_Impacts_Report_June2013_final.pdf.

  • Lackstrom, K., A. Farris, D. Eckhardt, N. Doesken, H. Reges, J. Turner, K. H. Smith, and R. Ward, 2017: CoCoRaHS observers contribute to “condition monitoring” in the Carolinas: A new initiative addresses needs for drought impacts information. Bull. Amer. Meteor. Soc., 98, 2527–2531, https://doi.org/10.1175/BAMS-D-16-0306.1.

  • McCombs, M. E., and D. L. Shaw, 1972: The agenda-setting function of mass media. Public Opin. Quart., 36, 176, https://doi.org/10.1086/267990.

  • Meadow, A. M., M. A. Crimmins, and D. B. Ferguson, 2013: Field of dreams or dream team? Assessing two models for drought impact reporting in the semiarid Southwest. Bull. Amer. Meteor. Soc., 94, 1507–1517, https://doi.org/10.1175/BAMS-D-11-00168.1.

  • Missouri Department of Natural Resources, 2019: 2018 Missouri drought by the numbers. Missouri Drought Rep., 22 pp., https://dnr.mo.gov/documents/2018-Missouri-Drought-Report.pdf.

  • National Drought Mitigation Center, 2018: US Drought Monitor data gathering network reaches throughout communities and across the nation. Accessed 25 July 2020, https://drought.unl.edu/Publications/News.aspx?id=329.

  • Neuman, W. R., L. Guggenheim, S. M. Jang, and S. Y. Bae, 2014: The dynamics of public attention: Agenda-setting theory meets big data. J. Commun., 64, 193–214, https://doi.org/10.1111/jcom.12088.

  • Peel, D. S., 2013: Lack of cattle catches up with beef industry. Progressive Cattle, 23 Sep 2013, https://www.progressivecattle.com/news/market-reports/lack-of-cattle-catches-up-with-beef-industry.

  • Proctor, M., 2011: Missourians encouraged to report drought information online. Missouri Climate Center News, October 2011, http://climate.missouri.edu/news/arc/oct2011b.php.

  • Redmond, K. T., 2002: The depiction of drought: A commentary. Bull. Amer. Meteor. Soc., 83, 1143–1148, https://doi.org/10.1175/1520-0477-83.8.1143.

  • Reges, H. W., N. Doesken, J. Turner, N. Newman, A. Bergantino, and Z. Schwalbe, 2016: CoCoRaHS: The evolution and accomplishments of a volunteer rain gauge network. Bull. Amer. Meteor. Soc., 97, 1831–1846, https://doi.org/10.1175/BAMS-D-14-00213.1.

  • Reidmiller, D. R., C. W. Avery, D. R. Easterling, K. E. Kunkel, K. L. M. Lewis, T. K. Maycock, and B. C. Stewart, Eds., 2018: Impacts, Risks, and Adaptation in the United States. Vol. II, The Fourth National Climate Assessment, U.S. Global Change Research Program, 1515 pp., https://doi.org/10.7930/NCA4.2018.

  • Rippey, B., 2019: 2018 Farm bill and the USDM. U.S. Drought Monitor Forum 2019, Bowling Green, KY, Western Kentucky University, https://drought.unl.edu/archive/Documents/NDMC/Workshops/963/Pres/Rippey-2018%20Farm%20Bill%20and%20the%20USDM.pptx.

  • Shaman, J., J. F. Day, and M. Stieglitz, 2005: Drought-induced amplification and epidemic transmission of West Nile virus in southern Florida. J. Med. Entomol., 42, 134–141, https://doi.org/10.1093/jmedent/42.2.134.

  • Smith, K. H., 2016: Missouri state climatologist cultivates network of drought observers. DroughtScape, University of Nebraska National Drought Mitigation Center, 14–15, https://drought.unl.edu/archive/Documents/NDMC/DroughtScape/DS2016spring.pdf.

  • Smith, K. H., 2018: Impacts in context: Sector-specific experiences of drought in the Pacific Northwest in 2015. University of Nebraska National Drought Mitigation Center Rep., 70 pp.

  • Smith, K. H., M. Svoboda, M. Hayes, H. Reges, N. Doesken, K. Lackstrom, K. Dow, and A. Brennan, 2014: Local observers fill in the details on drought impact reporter maps. Bull. Amer. Meteor. Soc., 95, 1659–1662, https://doi.org/10.1175/1520-0477-95.11.1659.

  • State of California, 2015: Household water supply shortage reporting system. Accessed 10 November 2020, https://mydrywatersupply.water.ca.gov/report/.

  • Svoboda, M., and Coauthors, 2002: The Drought Monitor. Bull. Amer. Meteor. Soc., 83, 1181–1190, https://doi.org/10.1175/1520-0477-83.8.1181.

  • Tang, Z., and T. Liu, 2016: Evaluating internet-based public participation GIS (PPGIS) and volunteered geographic information (VGI) in environmental planning and management. J. Environ. Plann. Manage., 59, 1073–1090, https://doi.org/10.1080/09640568.2015.1054477.

  • UN Convention to Combat Desertification, 2019: The Drought Initiative. United Nations, accessed 1 November 2020, https://www.unccd.int/actions/drought-initiative.

  • USDA Farm Service Agency, 2018a: Livestock Forage Disaster Program. U.S. Department of Agriculture Fact Sheet, 4 pp., https://www.fsa.usda.gov/Assets/USDA-FSA-Public/usdafiles/FactSheets/2018/livestock_forage_disaster_program-july2018.pdf.

  • USDA Farm Service Agency, 2018b: Disaster assistance programs. U.S. Department of Agriculture, accessed 25 October 2019, https://www.fsa.usda.gov/programs-and-services/disaster-assistance-program/index.

  • Van Loon, A. F., and Coauthors, 2016a: Drought in the Anthropocene. Nat. Geosci., 9, 89–91, https://doi.org/10.1038/ngeo2646.

  • Van Loon, A. F., and Coauthors, 2016b: Drought in a human-modified world: Reframing drought definitions, understanding, and analysis approaches. Hydrol. Earth Syst. Sci., 20, 3631–3650, https://doi.org/10.5194/hess-20-3631-2016.

  • Victorino, J. N. C., M. R. J. E. Estuar, and A. M. F. A. Lagmay, 2016: Validating the voice of the crowd during disasters. Social, Cultural, and Behavioral Modeling, K. S. Xu et al., Eds., Lecture Notes in Computer Science, Vol. 9708, Springer, 301–310, https://doi.org/10.1007/978-3-319-39931-7_29.

  • Wilhite, D. A., M. D. Svoboda, and M. J. Hayes, 2007: Understanding the complex impacts of drought: A key to enhancing drought mitigation and preparedness. Water Resour. Manage., 21, 763–774, https://doi.org/10.1007/s11269-006-9076-5.

  • Wright, D. R., L. G. Underhill, M. Keene, and A. T. Knight, 2015: Understanding the motivations and satisfactions of volunteers to improve the effectiveness of citizen science programs. Soc. Nat. Resour., 28, 1013–1029, https://doi.org/10.1080/08941920.2015.1054976.

Supplementary Materials
