
The Oklahoma Meso-Scale Integrated Socio-Geographic Network: A Technical Overview

Hank Jenkins-Smith, Center for Risk and Crisis Management, and National Institute for Risk and Resilience, University of Oklahoma, Norman, Oklahoma

Joe Ripberger, Center for Risk and Crisis Management, and National Institute for Risk and Resilience, University of Oklahoma, Norman, Oklahoma

Carol Silva, Center for Risk and Crisis Management, and National Institute for Risk and Resilience, University of Oklahoma, Norman, Oklahoma

Nina Carlson, Center for Risk and Crisis Management, and National Institute for Risk and Resilience, University of Oklahoma, Norman, Oklahoma

Kuhika Gupta, Center for Risk and Crisis Management, and National Institute for Risk and Resilience, University of Oklahoma, Norman, Oklahoma

Matt Henderson, Center for Risk and Crisis Management, and National Institute for Risk and Resilience, University of Oklahoma, Norman, Oklahoma

Amy Goodin, Public Opinion Learning Laboratory, University of Oklahoma, Norman, Oklahoma

Abstract

Established as a social companion to the Oklahoma Mesonet, the Oklahoma Meso-Scale Integrated Socio-Geographic Network (M-SISNet) is a network of approximately 1500 “social monitoring stations” (geolocated households) across the state of Oklahoma that provide data on household perceptions and responses to signals that are sent from agricultural, hydrological, and meteorological systems. This paper outlines the purpose and nature of the M-SISNet, with specific focus on the sample frame, protocols for recruitment, retention, and survey implementation. It concludes with example survey questions, analyses, and directions for accessing the data.

Denotes content that is immediately available upon publication as open access.

© 2017 American Meteorological Society. For information regarding reuse of this content and general copyright information, consult the AMS Copyright Policy (www.ametsoc.org/PUBSReuseLicenses).

Corresponding author: Joe Ripberger, jtr@ou.edu

1. Introduction

Starting in March of 2014, researchers at the University of Oklahoma (OU) and Oklahoma State University (OSU) initiated a panel survey of a random sample of Oklahoma households that is integrated with both current and historical meteorological data from the Oklahoma Mesonet. Dubbed the Meso-Scale Integrated Socio-Geographic Network (M-SISNet), the primary purpose of the survey is to measure the way that households, as key components of human systems, perceive, process, and respond to patterns of weather and climate variation as “signals” from natural systems. This brief paper outlines the purpose and nature of the M-SISNet, with specific focus on the sample frame, protocols for recruitment, retention, and survey implementation.

2. Integrating environmental and social data

The Oklahoma Mesonet, operated by OU and OSU, is a network of 121 environmental monitoring stations that collect and disseminate research-quality agricultural, hydrological, and meteorological data in near–real time to scientists and decision-makers throughout the state of Oklahoma (Brock et al. 1995; McPherson et al. 2007). As noted in McPherson et al. (2007, p. 301), the mission of mesonet personnel is to “operate a world-class environmental monitoring network; to deliver high-quality observations and timely value-added products to Oklahoma citizens; to support state decision makers; to enhance public safety and education; and to stimulate advances in resource management, agriculture, industry, and research.”

Building upon the success of the Oklahoma Mesonet, the M-SISNet consists of a network of approximately 1500 “social monitoring stations” (geolocated households) across the state of Oklahoma that provide data on household perceptions and responses to signals that are sent from agricultural, hydrological, and meteorological systems.1 This information is critical for decision-makers who are tasked with understanding when and/or why households may (or may not) respond to environmental hazards, including short-term meteorological events (tornadoes, floods, etc.) and long-term trends associated with climate change (e.g., persistent drought). For example, researchers are currently using M-SISNet data to study the following:

  • Household perceptions about tornado risk mitigation practices, including building code enhancements for new homes and retrofitting existing homes with wind-resistant components;

  • Patterns of National Weather Service warning reception during extreme weather events;

  • Impacts of short- and long-term meteorological events on health and economic outcomes across the state; and

  • Household willingness to pay for policies that protect water quality in large watersheds, such as the Fort Cobb Watershed.

In addition to decision support, the M-SISNet, when integrated with data from the mesonet, provides dynamic, research-quality data to scientists who study the complex interactions that connect human systems with agricultural, hydrological, and/or meteorological systems. These data allow scientists to address complex questions about how changes in weather/climate affect patterns of human behavior (e.g., water/energy usage) that, in turn, affect atmospheric conditions and subsequent weather/climate patterns. For example, researchers are using the M-SISNet to study the following:
  • Human perception of climate anomalies and climate change;

  • Influence of values, beliefs, and norms on environmental cognition and learning; and

  • Linkages that connect human decision-making and behavior to short-term land use and long-term ecological change.

3. Overview

An interdisciplinary team of researchers at OU and OSU designed the M-SISNet in 2013. The design team consisted of scientists in meteorology, ecology, political science, sociology, economics, geography, and anthropology. Data collection, which is guided by the M-SISNet management and implementation team, began in 2014 with support from the National Science Foundation. The management and implementation team consists of researchers at the OU Center for Risk and Crisis Management and the OU Public Opinion Learning Laboratory (OU POLL).

M-SISNet data come from a network of approximately 1500 geolocated households across the state of Oklahoma that respond to quarterly surveys about the weather, society, and government. For reference, Fig. 1 displays the geographic location of M-SISNet relative to mesonet stations throughout the state. The surveys, which are administered at the end of each meteorological season (winter, spring, summer, fall), include questions that measure a variety of concepts, including perceptions about extreme weather and climate variability and corresponding behaviors that may be related to these perceptions (e.g., energy, water, and land usage). The surveys also include one-time experimental question sets that are submitted by scientists who are interested in specialized topics, such as fire risk mitigation practices, the health impacts of extreme weather events, or ecosystem services in Oklahoma watersheds.

Fig. 1. Map of mesonet and M-SISNet stations in Oklahoma.

When data collection for a survey wave is complete, the M-SISNet management and implementation team posts the data, along with the survey instrument, on the M-SISNet website (http://crcm.ou.edu/epscor/). At the time of this writing, 13 survey waves have been completed, with the expectation that we will complete at least 17 in the current project. We plan to continue operating the M-SISNet (beyond 17 waves) if future resources are available.

4. Design

The M-SISNet is designed to collect research-quality geolocated survey data from a random sample of households in Oklahoma who consent to participate in quarterly surveys. Consistent with best practices in survey research, we initiated this process with a list of households that was drawn from a random sample of known addresses in Oklahoma. Recruitment was then undertaken by phone or mail, as described in the following sections. During the recruitment process, we asked respondents to participate in four surveys a year over a span of several years. We also told them that each survey would take up to 30 min to complete.

Given the nature of the survey, we expected significant challenges in securing initial recruits and retaining respondents over the course of the survey. To facilitate this process, we encourage respondents to take the survey online, but we allow them to participate by phone if they prefer. We also offer an incentive ($10 gift card) for completing a survey, and we remind respondents to participate via e-mail and phone calls at the beginning, middle, and end of each survey wave. Retention rates are relatively high, though modest attrition occurs when respondents become ineligible (e.g., they move to another state), unable, or unwilling to continue participating in the survey. To make up for lost respondents, we follow the procedures outlined above and below to recruit a small set of new participants between survey waves. This may result in slight changes to the composition of the panel that users should consider when analyzing M-SISNet data.

a. Sample frame

The M-SISNet sample frame is derived using address-based sampling (ABS): a random sample of Oklahoma residences drawn such that all households in the state have a known probability of selection. To ensure a broad geographical distribution of respondents, the sample was stratified by region, as shown in Fig. 2. This was done to increase representation in rural parts of the state rather than concentrating our respondents in the Oklahoma City and Tulsa regions.

Fig. 2. Map of M-SISNet stratification regions in Oklahoma.

Figure 3 provides a schematic representation of the M-SISNet sample frame implementation protocol. The protocol can be characterized by four key components: 1) ABS sampling; 2) address eligibility verification; 3) recruitment to the pool of potential respondents to the Oklahoma Weather, Society and Government Survey; and 4) invitation to participate in the Oklahoma Weather, Society and Government Survey, which is administered on a quarterly basis.

Fig. 3. Schematic overview of the M-SISNet sample frame and survey implementation protocol.

Participants in the pool of potential respondents are recruited from a random sample of addresses drawn from a list of all working Oklahoma addresses in the U.S. Postal Service (USPS) Delivery Sequence File (DSF). The sample is purchased from Survey Sampling International (SSI). Before delivery, SSI matches addresses, where possible, with telephone numbers. The list of sample addresses provided by SSI thus consists of two subgroups: “matched” addresses that include both the full address and the associated telephone number, and “unmatched” addresses for which a phone number could not be obtained. For Oklahoma, samples tend to have a slightly greater proportion of matched addresses than unmatched. Both matched and unmatched addresses are processed to identify potentially valid households for recruitment to the pool of potential survey respondents.

b. Matched address protocol

For the matched list of addresses, phone numbers are used for eligibility verification and recruitment to the pool of potential respondents. The matched list protocol is undertaken in two stages: verification and recruitment. All matched addresses are assigned a unique identification number for tracking.

1) Pool verification stage for matched addresses

In the verification stage, OU POLL uses a computer-assisted telephone interviewing (CATI) survey system developed by VOXCO (a survey software company) to screen out “ineligible” addresses among those households with matched telephone numbers. Ineligible addresses include those that are 1) not eligible residences, for example, businesses or group homes; 2) not inhabited by an eligible and/or available individual, for example, someone 18 years of age or older who speaks English and is healthy enough to respond to a survey; or 3) outside the region of the survey. Contacts that are 4) noncooperative and therefore cannot be screened for eligibility are listed as unscreened refusals and flagged as ineligible as well.

In practice, it is not possible to verify the eligibility of every matched address in the first pass because a large fraction of the calls made by OU POLL go 1) unanswered over the available period; 2) to an answering machine, busy line, cell phone, or fax machine; or 3) to a disconnected or blocked number. Because it is not possible to verify the eligibility of these addresses, they are flagged as indeterminate and future attempts are made to screen for eligibility. Matched addresses (households with telephone numbers) that do not reach a terminal designation of eligible or ineligible remain live in the verification stage for one full year (four survey waves) before being designated as unreachable by phone.2 At that point, the address associated with the phone number will be shifted to the unmatched address list and processed according to the unmatched protocol described below. All matched addresses that are screened as eligible are passed to the pool recruitment stage.

2) Pool recruitment stage for matched addresses

When the verification stage for matched addresses is complete, interviewers at OU POLL invite the person with the most recent birthday in the household to join the pool of potential respondents. If this person agrees, he or she becomes the designated respondent from that household and recruitment to the pool is accomplished by administering a short background survey that measures basic household and respondent characteristics.3 For matched addresses, panel recruitment and administration of the baseline background survey typically occurs by phone.

After the recruitment stage, the disposition of verified (eligible) addresses on the matched list falls into one of four categories: 1) screened refusal—an eligible respondent was invited to join the pool of potential respondents but refused to participate; 2) dropout—an eligible respondent started the background survey but subsequently quit and refused to continue; 3) noncontact—a previously designated eligible respondent could not be reached after multiple attempts and/or skipped appointments; or 4) complete—an eligible respondent completed the background survey and is therefore included as a consenting participant in the pool of potential respondents.

c. Unmatched address protocol

For the unmatched list of addresses, the addresses are used for verification and recruitment to the pool. Like the matched address protocol, the unmatched address protocol is undertaken in two stages: verification and recruitment. Before these stages commence, all unmatched addresses are assigned a unique identification number for tracking.

1) Pool verification stage for unmatched addresses

A letter and a prepaid return postcard are sent to the address in an initial mailing used to screen out ineligible addresses and to recruit participants to the pool of potential respondents. Addresses are flagged as undeliverable and therefore ineligible if the mailings sent to them are returned to the sender. After 2 weeks, a follow-up postcard is sent to the remaining set of potentially eligible addresses. If both mailings are sent and no contact is made, addresses are flagged as “indeterminate” and future attempts are made to screen for eligibility by way of periodic mailings. Unmatched addresses that do not reach a terminal designation of eligible or ineligible remain live in the verification stage for one full year (four survey waves) before being designated as unreachable and removed from the sample.

2) Pool recruitment stage for unmatched addresses

The letter included in the initial mailing sent to respondents on the list of unmatched addresses explains the subject of the survey, the multiyear survey period, and then invites the person in the household who had the most recent birthday and who is 18 years or older to participate in the pool of potential respondents. As with the matched list, if this person agrees, he or she becomes the designated respondent from that household and recruitment to the pool is accomplished by completion of a background recruitment survey that measures basic household and respondent characteristics. The postcard included in the initial mailing offers two options for participation in this background survey: the respondent can 1) indicate interest in participating by e-mailing OU POLL using the e-mail address provided in the postcard or by providing contact information to OU POLL via the prepaid return postcard included in the initial mailing; or 2) complete the background survey on their own by navigating to the website listed on the postcard.

Respondents who indicate an interest in participating but who do not self-administer the recruitment survey (on the web) are contacted by OU POLL and are again offered two options for completing the background recruitment survey: 1) complete the survey by phone or 2) complete the survey on the Internet at a local web-enabled location (e.g., a public library).

After the recruitment stage, the disposition of verified (eligible) addresses on the unmatched list falls into one of the four categories described above: 1) screened refusal, 2) dropout, 3) noncontact, or 4) complete.

5. Recruitment response rates

The response rates reported here are calculated using data from the first two years of recruitment. During this period, OU POLL attempted to contact, verify, and recruit eligible respondents from 36 574 matched addresses and 30 582 unmatched addresses.

Contact was made with 16 437 matched addresses; 7514 of these addresses were verified as eligible and 8923 were deemed ineligible. Most of the ineligible addresses were determined to be ineligible because the phone line was disconnected (7272). It was not possible to verify the eligibility of the remaining 20 137 matched addresses because no contact was made. Many of these indeterminate cases resulted from unanswered calls (11 706) or hang-ups (4279). In total, respondents were successfully recruited from 3288 of the 7514 matched addresses that were verified as eligible to participate in the pool of potential respondents to the Oklahoma Weather, Society and Government Survey.

By comparison, contact was made with 2709 unmatched addresses; 610 of these addresses were verified as eligible and 2099 were not eligible to participate in the survey. Most of the ineligible determinations came from postcards that were marked as undeliverable by the USPS (2077). It was not possible to verify the eligibility of the remaining 27 873 unmatched addresses because no contact was made. For most of these indeterminate cases, postcards were sent and no responses were received (27 808). In all, 536 respondents were successfully recruited from the 610 unmatched addresses that were verified as eligible to participate.

Using these numbers, OU POLL calculates response rates based on the formula for response rate 4 (RR4), as defined by the American Association for Public Opinion Research (Smith 2009; AAPOR 2015). We use the following equation to calculate RR4:
RR4 = S / (S + U + eI),
where S is the number of eligible cases that were successfully recruited to the pool of potential respondents, U is the number of eligible cases that were not successfully recruited, e is the proportion of eligible cases among all cases in the sample for which a definitive determination of eligibility was obtained, and I is the number of indeterminate cases in the sample. In this equation, e is used to estimate the number of eligible cases in the indeterminate set. The estimate is relatively conservative and falls between more extreme estimates, such as minimum or maximum allocation, which simply assume that none (0%) or all (100%) of the indeterminate cases, respectively, are eligible (Smith 2009). For the matched sample, e was 0.457 [7514/(7514 + 8923)], producing an estimate of 9203 (0.457 × 20 137) eligible cases in the indeterminate set. For the unmatched sample, e was 0.225 [610/(610 + 2099)], producing an estimate of 6271 (0.225 × 27 873) eligible cases in the indeterminate set. Using these estimates, we get a total of 16 717 (3288 + 4226 + 9203) eligible households in the matched sample and 6881 (536 + 74 + 6271) eligible cases in the unmatched sample, yielding response rates of 19.7% (3288/16 717) and 7.8% (536/6881) for the matched and unmatched samples, respectively, and 16.2% [(3288 + 536)/(16 717 + 6881)] overall.
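To make the arithmetic above explicit, the following minimal sketch (in Python; illustrative only, not part of the M-SISNet processing code) reproduces the RR4 calculations using the counts reported in this section.

```python
def rr4(S, U, I, eligible, ineligible):
    """AAPOR RR4 as applied above: S / (S + U + e*I), where
    e = eligible / (eligible + ineligible) estimates the proportion of
    eligible cases among the I indeterminate cases."""
    e = eligible / (eligible + ineligible)
    return S / (S + U + e * I)

# Counts reported in the text (first two years of recruitment).
matched = rr4(S=3288, U=4226, I=20137, eligible=7514, ineligible=8923)
unmatched = rr4(S=536, U=74, I=27873, eligible=610, ineligible=2099)

print(f"Matched RR4:   {matched:.1%}")    # ~19.7%
print(f"Unmatched RR4: {unmatched:.1%}")  # ~7.8%
```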

Though modest, these rates should be interpreted in light of the demanding task that was described to potential recruits (transfer to web mode to complete a 30-min survey four times a year for 4–5 years). For reference, Link et al. (2008) used a similar (ABS) sample frame and reported response rates of 20%–37%, depending on the state. Note, however, that this was a single-mode one-time survey rather than a sequential-mode panel survey.

6. Quarterly response and retention rates

As the pool of potential respondents is populated, respondents are selected from the pool to participate in the Oklahoma Weather, Society and Government Survey, which is administered four times per year and will continue for at least 17 survey waves.4 As they are selected, respondents are asked to indicate their preferred mode for completing the survey. The options are 1) the web-based portal, using a unique link that respondents are provided upon selection; or 2) CATI with a survey technician at OU POLL. Reminders are sent via e-mail and by phone (as appropriate) over the period of the quarterly survey administration.

The Oklahoma Weather, Society and Government Survey is administered quarterly, over a 2-month window beginning at the end of each season (winter, spring, summer, and fall) and extending through the midpoint of the following season.5 Each quarter, the survey is completed when the designated number of survey responses is obtained or the survey window is closed, whichever happens first. After the quarterly survey wave is closed, invited respondents are placed in one of four categories (shown at the bottom of Fig. 3): 1) survey refusal—an invited respondent refused to take the survey; 2) no contact—an invited respondent could not be reached after multiple attempts and/or skipped appointments; 3) incomplete survey—an invited respondent started but did not finish the survey; or 4) complete survey—an invited respondent completed the Oklahoma Weather, Society and Government Survey in that quarter. The response rate for each quarter is calculated using RR4. At this point, however, all households in the pool are eligible to participate in the survey, so the I term in RR4 is zero. This reduces RR4 to S (the number of complete surveys by eligible households) divided by S + U, where U is the number of incomplete surveys by eligible households. Figure 4 shows this rate for the first 13 quarters.

Fig. 4. Weather, Society and Government Survey response rates (RR4) by quarter; S is the number of complete surveys by eligible households; U is the number of incomplete surveys by eligible households.

As described above, the Oklahoma Weather, Society and Government Survey is a panel survey that is administered quarterly. For that reason, we start each survey wave by sending invitations to potential respondents who have completed the survey in previous quarters. If those invitations do not generate enough responses, invitations are sent to surplus members from the pool of potential respondents who have yet to complete the Oklahoma Weather, Society and Government Survey. This set of surplus members from the pool of potential respondents consists of 1) addresses that are deemed eligible during the 1-yr rolling verification process for matched and unmatched addresses and 2) addresses that come from supplemental ABS samples that are purchased from SSI and processed according to the recruitment protocol we describe above. To monitor this process, quarterly retention rates are calculated by identifying the percentage of respondents from each quarter that complete the survey in subsequent quarters. Figure 5 shows these percentages for the first 13 quarters.
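To illustrate how the retention rates shown in Fig. 5 can be computed, the following sketch assumes a hypothetical completion table with one row per respondent and one Boolean entry per wave; it illustrates the calculation and is not the actual M-SISNet code.

```python
def retention_rates(completions, n_waves):
    """For each origin wave, compute the percentage of that wave's
    respondents who also completed each subsequent wave."""
    rates = {}
    for w in range(n_waves):
        cohort = [rec for rec in completions.values() if rec[w]]
        if not cohort:
            continue
        rates[w + 1] = [
            100.0 * sum(rec[later] for rec in cohort) / len(cohort)
            for later in range(w + 1, n_waves)
        ]
    return rates

# Toy example: three hypothetical households, four waves.
toy = {
    "hh_001": [True, True, True, True],
    "hh_002": [True, False, True, False],
    "hh_003": [False, True, True, True],
}
print(retention_rates(toy, n_waves=4))
# {1: [50.0, 100.0, 50.0], 2: [100.0, 100.0], 3: [66.66...], 4: []}
```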

Fig. 5. Weather, Society and Government Survey retention rates by quarter. Rates show the percentage of respondents from a given wave who completed a survey in subsequent waves.

7. Demographic representation

The characteristics of the respondents who completed surveys in the first 13 waves are shown in Table 1. As is evident in the table, the respondents overrepresent women and older segments of the target population, and underrepresent nonwhite and Hispanic segments.6 Note also that the sample distribution across the five state regions—north-central, Oklahoma City, southeast, southwest, and Tulsa—reflects oversamples in the less-populated rural regions to permit comparisons across regions and trends within regions over time. Given the regional distribution of respondents and the departures from census-estimated proportions of the adult population, the data will need to be appropriately weighted when used to estimate population averages. We calculate these weights on a project-by-project basis and encourage others to do the same.
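The paper does not prescribe a particular weighting scheme; as one simple illustration, the sketch below computes cell-based poststratification weights (weight = population share divided by sample share). The category shares are placeholders, not census or M-SISNet values.

```python
def poststratification_weights(sample_counts, population_shares):
    """Cell weight = population share / sample share, so that weighted
    cell proportions match the population benchmarks."""
    n = sum(sample_counts.values())
    return {
        cell: population_shares[cell] / (sample_counts[cell] / n)
        for cell in sample_counts
    }

# Placeholder values for illustration only.
sample_counts = {"female": 900, "male": 600}
population_shares = {"female": 0.51, "male": 0.49}
weights = poststratification_weights(sample_counts, population_shares)
# weights["male"] > 1 because men are underrepresented in this toy sample.
```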

Table 1. Demographic representation of respondents by survey wave.

8. Measures

The Oklahoma Weather, Society and Government survey includes three types of questions: 1) a battery of quarterly questions that measure dynamic concepts like perceptions of extreme weather and climate variability, and corresponding behaviors that may be related to these perceptions (e.g., energy, water, and land usage); 2) a set of yearly questions that measure less dynamic concepts, such as land-use decisions and the beliefs, values, and norms that may orient perceptions and behaviors; and 3) one-time experimental question sets that are submitted by decision-makers or scientists who are interested in specialized topics. Table 2 provides a few examples of each type of question.

Table 2. Example survey questions.

9. Analysis and integration

The M-SISNet provides high-quality information to scientists and decision-makers who are tasked with understanding when and/or why households may (or may not) respond to environmental hazards. For example, an experiment in the spring 2015 survey wave provides data on homeowner support for tornado damage mitigation policies, such as building code enhancements.7 The data suggest that support is relatively weak and subject to countervailing forces. On the one hand, Oklahoma residents generally perceive and often experience the risk of tornadoes. These forces encourage support for building code enhancements. On the other hand, the individualistic worldviews and conservative ideologies that are prominent in the state generate opposition to regulation and mandatory mitigation policies, including building codes. This finding provides important information and context to policy makers who are looking for ways to reduce the cost of tornadoes in Oklahoma.

Information from the M-SISNet also informs emergency managers and meteorologists who communicate with Oklahomans before, during, and after extreme weather events. In addition to quarterly questions about how people get information about the weather, multiple survey waves include experiments that measure tornado warning reception, comprehension, and response. For instance, the spring 2016 survey wave included questions about warning reception during high-impact thunderstorms on 29 April and 9 May 2016. As illustrated in Fig. 6, the results indicate that misses are extremely rare (1.4% and 0.8%, respectively)—when tornado warnings are issued, people inside the warning area generally receive them. At the same time, however, false alarms are relatively common (51.9% and 50.9%)—people outside the warning area inadvertently receive tornado warnings that are not issued for their area. Using forecast verification metrics, this pattern indicates a high probability of detection (POD; 0.929 and 0.969) and a moderate to high false alarm ratio (FAR; 0.738 and 0.667) in warning reception.8 This is potentially problematic because false alarms can cause confusion, distrust, complacency, and costly reactions to nonexistent warnings.
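Footnote 8 gives the verification formulas; the short sketch below shows how POD and FAR are computed from respondent-level hit, miss, and false alarm counts. The counts are placeholders, not the survey tallies behind Fig. 6.

```python
def pod(hits, misses):
    """Probability of detection: share of respondents inside a warned area
    who received the warning."""
    return hits / (hits + misses)

def far(hits, false_alarms):
    """False alarm ratio: share of warning receptions that were not issued
    for the respondent's location."""
    return false_alarms / (hits + false_alarms)

# Placeholder counts for illustration.
hits, misses, false_alarms = 65, 5, 180
print(f"POD = {pod(hits, misses):.3f}, FAR = {far(hits, false_alarms):.3f}")
```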

Fig. 6. Tornado warning reception on 29 Apr and 9 May 2016. Red polygons display tornado warnings and points show survey respondents. Hit is when the respondent received a warning that was issued by the NWS, miss is when the respondent did not receive a warning that was issued by the NWS, false alarm is when the respondent received a warning that was not issued by the NWS, and correct negative is when the respondent did not receive a warning and none was issued by the NWS. Probability of detection (POD); false alarm ratio (FAR).

Data from the M-SISNet become even more potent when integrated with data from the Oklahoma Mesonet. For example, integration allows for dynamic analysis of the extent to which individuals perceive feedback—signals of health and distress—from natural systems. Figure 7 provides a glimpse into this complex process. The blue and red lines in the figure indicate precipitation and temperature anomalies and perceptions of those anomalies for each respondent in each season (survey wave); the black lines track sample means by season.9 Data on the climate anomalies come from mesonet daily averages that we interpolate to respondent addresses. Data on perceptions come from the Oklahoma Weather, Society and Government survey, which asks respondents to indicate the amount of precipitation that fell in their area (1 = less; 2 = same; 3 = more) and average temperatures in their area (1 = cooler; 2 = same; 3 = warmer) in the survey season, relative to average. A comparison of the trends shows that individuals generally perceive both forms of feedback, especially when anomalies are relatively extreme. This finding challenges the idea that climate signals are too difficult to see or are subject to cognitive biases that override perception, such as politically motivated reasoning.10
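The interpolation step can be sketched as follows; inverse-distance weighting is used here purely as an illustration, since the paper does not specify its interpolation scheme, and the station values are hypothetical.

```python
import math

def idw_interpolate(stations, target, power=2.0):
    """Inverse-distance-weighted estimate of a station quantity (e.g., a
    seasonal temperature anomaly) at a household location. `stations` is a
    list of (lat, lon, value); `target` is (lat, lon). Degrees are treated
    as planar coordinates, which is adequate for a rough illustration."""
    weights, values = [], []
    for lat, lon, value in stations:
        d = math.hypot(lat - target[0], lon - target[1])
        if d == 0:
            return value  # household co-located with a station
        weights.append(1.0 / d ** power)
        values.append(value)
    return sum(w * v for w, v in zip(weights, values)) / sum(weights)

# Hypothetical station anomalies (deg C) and a household location.
stations = [(35.2, -97.4, 1.3), (35.6, -97.6, 0.9), (36.0, -97.1, 1.1)]
household = (35.5, -97.5)
print(idw_interpolate(stations, household))
```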

Fig. 7. Relationship between seasonal climate anomalies and perceptions. (a) Precipitation (blue) and (b) temperature (red) anomalies, and (c),(d) perceptions of those anomalies for each respondent in each season (survey wave). Black lines track sample means by season.

10. Data management and dissemination

As of this writing, 13 waves of the Oklahoma Weather, Society and Government survey have been completed. The data are managed by researchers at the OU Center for Risk and Crisis Management, who have developed a web application that integrates recruitment, survey administration, and day-to-day maintenance of the sample, which includes tasks such as changing phone numbers, e-mail addresses, or physical addresses (when respondents move). At the end of each wave, the management team implements quality controls and then de-identifies the data for public use. M-SISNet data are available online for download (http://crcm.ou.edu/epscordata/), as are the survey instruments for each wave and a corresponding reference sheet that indicates the questions that are included by wave.
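For prospective users, a minimal sketch of reading a downloaded wave into Python follows; the file name and format are hypothetical, since file layouts are documented on the M-SISNet data page rather than in this paper.

```python
import pandas as pd

# Hypothetical file name; actual files are listed at
# http://crcm.ou.edu/epscordata/ along with the survey instruments.
wave = pd.read_csv("msisnet_wave13.csv")

# Inspect the available variables; the reference sheet on the website maps
# question identifiers to survey waves.
print(wave.shape)
print(wave.columns.tolist()[:10])
```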

11. Summary and conclusions

Agricultural, hydrological, and meteorological systems operate alongside social systems where people perceive and respond to environmental conditions. Understanding when, why, and how people perceive and respond to different conditions requires research-quality data on both systems. The Oklahoma Mesonet and many other resources provide these data for environmental systems in Oklahoma. Data on social systems, by comparison, are lacking. The Oklahoma M-SISNet and the 1500-plus “social monitoring stations” that complete quarterly surveys begin to address this gap by providing dynamic data on perceptions, decisions, and behaviors in households across the state. As we continue to collect, disseminate, and analyze these data, we hope to provide a prototype for future attempts to monitor and study the interaction of human and natural systems around the world.

Acknowledgments

The M-SISNet was created with support from the National Science Foundation’s Program to Stimulate Competitive Research (under Grant IIA-1301789). The authors thank the anonymous peer reviewers for their valuable feedback on this manuscript.

REFERENCES

  • AAPOR, 2015: Standard definitions: Final dispositions of case codes and outcome rates for surveys. 8th ed. American Association for Public Opinion Research, 70 pp.

  • Brock, F. V., K. C. Crawford, R. L. Elliott, G. W. Cuperus, S. J. Stadler, H. L. Johnson, and M. D. Eilts, 1995: The Oklahoma Mesonet: A technical overview. J. Atmos. Oceanic Technol., 12, 5–19, https://doi.org/10.1175/1520-0426(1995)012<0005:TOMATO>2.0.CO;2.

  • Hsiao, C., 2014: Analysis of Panel Data. 3rd ed. Cambridge University Press, 538 pp., https://doi.org/10.1017/CBO9781139839327.

  • Link, M. W., M. P. Battaglia, M. R. Frankel, L. Osborn, and A. H. Mokdad, 2008: A comparison of address-based sampling (ABS) versus random-digit dialing (RDD) for general population surveys. Public Opin. Quart., 72, 6–27, https://doi.org/10.1093/poq/nfn003.

  • McPherson, R. A., C. A. Fiebrich, K. C. Crawford, J. R. Kilby, D. L. Grimsley, J. E. Martinez, and A. D. Melvin, 2007: Statewide monitoring of the mesoscale environment: A technical update on the Oklahoma Mesonet. J. Atmos. Oceanic Technol., 24, 301–321, https://doi.org/10.1175/JTECH1976.1.

  • Ripberger, J. T., H. J. Jenkins-Smith, C. L. Silva, J. Czajkowski, H. Kunreuther, and K. M. Simmons, 2017a: Tornado damage mitigation: Homeowner support for enhanced building codes in Oklahoma. National Institute for Risk and Resilience Working Paper, 41 pp., http://risk.ou.edu/downloads/news/TornadoRiskMitigation-BuildingCodes-Website.pdf.

  • Ripberger, J. T., H. J. Jenkins-Smith, C. L. Silva, D. E. Carlson, K. Gupta, N. Carlson, and R. E. Dunlap, 2017b: Bayesian versus politically motivated reasoning in human perception of climate anomalies. Environ. Res. Lett., 12, 114004, https://doi.org/10.1088/1748-9326/aa8cfc.

  • Smith, T. W., 2009: A revised review of methods to estimate the status of cases with unknown eligibility. University of Chicago National Opinion Research Center, 22 pp.

1. The M-SISNet collects a statewide panel sample—or “base sample”—of approximately 1500 households in each survey wave. In addition, the M-SISNet surveys several hundred households in each of five subregions, including the Oklahoma City region. This paper focuses on the statewide sample. Contact the authors for more information on the subregion samples.

2. A small portion of the indeterminate matched addresses are flagged as unreachable by phone and shifted to the unmatched address list in less than a year. This happens, for example, when the phone number associated with the address is disconnected, blocked, consistently busy, connected to a fax machine, or a cell phone. In these instances, the address is shifted to the unmatched address list as soon as that designation is made.

3. In many instances, verification and recruitment for matched addresses occur during the same phone call. To limit measurement error, the designated respondent from each household completes the recruitment survey and subsequent quarterly surveys.

4. The initial wave of the survey was implemented in February 2014, and the final wave (for the current project) is planned for March 2018. Operation will continue if resources are available.

5. We define seasons according to the meteorological calendar for the Northern Hemisphere.

6. Consistent with the U.S. Census, we treat race and ethnicity (Hispanic origin) as separate and independent categories.

7. See Ripberger et al. (2017a) for more information on this experiment.

8. POD = hits/(hits + misses); FAR = false alarms/(hits + false alarms).

9. We use all responses in this analysis, including responses from panelists who did not complete every survey. This is common in survey research, where unbalanced panels are the norm. Researchers generally agree that using all available data is better than omitting data from respondents who do not complete every wave. For more on this topic, see Hsiao (2014).

10. See Ripberger et al. (2017b) for more on perception and politically motivated reasoning.
