
Meteorological Impacts on Commercial Aviation Delays and Cancellations in the Continental United States

  • 1 The Climate Corporation, St. Louis, Missouri
  • 2 Atmospheric Sciences Department, University of Hawaii at Mānoa, Honolulu, Hawaii

Abstract

Weather creates numerous operational and safety hazards within the National Airspace System (NAS). In 2014, extreme weather events accounted for 4.3% of the total delay minutes recorded by the Bureau of Transportation Statistics. When weather’s contributions to NAS delays and late-arriving-aircraft delays are included, weather was responsible for 32.6% of the total delay minutes recorded. Hourly surface meteorological aviation routine weather reports (METARs) at major airports can provide valuable insight into the likely causes of weather delays at individual airports. When combined with the Federal Aviation Administration’s (FAA’s) Operations Network (OPSNET) delay data, METARs can be used to identify the major causes of delays and to create delay climatologies for a specific airport. Patterns in delays and cancellations over the study period of 2003–15 can also be identified for the individual airports included in this study. These patterns can help operators and airport planners optimize performance in the future.

© 2019 American Meteorological Society. For information regarding reuse of this content and general copyright information, consult the AMS Copyright Policy (www.ametsoc.org/PUBSReuseLicenses).

Corresponding author: Jennifer D. Small Griswold, smalljen@hawaii.edu


1. Introduction

Weather impacts aviation in numerous ways and is a major concern of the aviation community (Kulesa 2003). Pilots need to avoid weather that will negatively impact the safety of a flight and understand how it will impact the performance of the aircraft. Air traffic controllers need to understand where hazardous weather is located so that they can direct aircraft to safety or hold aircraft on the ground (Andrews 1993). Dispatchers need to understand how the environmental temperatures will affect the takeoff and landing distances (Subbotin and Gardner 2013). Passengers need to know whether or not an upcoming weather event will cancel future flights. Airlines want to reduce the number of delayed and cancelled flights and need to take into consideration weather specific to each airport and region (Hansen and Bolic 2001). From this small number of examples one can understand the significant impacts of weather phenomena on the aviation community.

Besides impacting the operations of airlines and other aviation community members, weather affects the economic performance of the aviation community by creating delays and cancellations. The economic costs of air traffic delays to the U.S. economy are vast and far-reaching; in 2007, the total cost of airline delays was $41 billion (Joint Economic Committee 2008). Weather plays a significant role in creating delays and cancellations and may cause up to 70% of the delays in the National Airspace System (NAS) (Kulesa 2003). While forecasts have allowed aviation operators to make some adjustments to operations, weather continues to impact airlines since schedules are created and finalized weeks in advance. By better understanding the expected weather at an airport and how each specific weather type impacts that airport, planners and schedulers can create schedules that mitigate the impacts of expected weather. For example, if an airport is prone to early morning fog or icing conditions in winter (e.g., December and January), an airline scheduler can adjust the schedule to account for low-visibility operations or aircraft deicing. One goal of this work is to identify the weather types and severities most related to flight cancellations and delays at selected airports with continuous weather records and high traffic for our period of interest (2003–15). With this knowledge, schedules can be adjusted in areas where weather has created delays or cancellations in the past.

Here, we use historical hourly weather records and Federal Aviation Administration (FAA) performance datasets to determine the relationship between flight cancellations and delays and weather type (drizzle, rain, snow, fog, mist, and haze), descriptors (freezing and thunderstorms), and weather severity (none, mild, moderate, and severe). The analysis is completed for the 77 airports reporting data to the FAA, with a focus on 10 major U.S. airports with long weather records and high-density flight traffic. While previous works (Robinson 1989; Breslin 2016; NOAA 2018) have analyzed the impacts of weather on airline operations, those analyses covered past periods that may no longer be representative of current weather occurrences or aviation performance statistics, or covered only a short period that is not representative of modern airline operations and concurrent weather patterns. In addition, this analysis utilizes the current delay and cancellation definitions, metrics, and datasets used by the aviation and transportation community. Our results therefore provide the most up-to-date picture of weather impacts on commercial airline schedules and aviation system performance.

The organization of the paper is as follows: Section 2 describes the weather and aviation datasets and methods, section 3 presents the delay and cancellation analyses, and section 4 provides a discussion of the overall impact of weather on U.S. airport delays and cancellations and presents the conclusions and implications.

2. Data and methods

a. Meteorological data—METARs

This work utilizes raw meteorological aviation routine weather reports (METARs) for the period October 2003 to July 2015. METARs are hourly or half-hourly surface weather observations that provide important information for the planning of safe and efficient aviation operations. They contain data for temperature, dewpoint, wind speed and direction, precipitation, cloud cover and heights, visibility, and barometric pressure at a particular location. METARs are provided by human observers or by the Automated Surface Observing Systems (ASOS) in high-resolution real-time (30 min) format. METARs are used in this work rather than postprocessed surface observation datasets for several reasons: 1) many surface observation datasets do not have a long enough history to elucidate long-term trends; 2) many other datasets do not provide hourly or 30-min resolution, without which diurnal and smaller time scale events are missed; 3) some surface observation datasets are not real time and are not available for real-time flight planning and flight operations; and 4) the aviation community uses METARs for surface weather observations (with extensive records available at high temporal resolution), meaning that METARs are most useful for understanding the past impacts of weather on aviation. METARs for this study are obtained through the Iowa Environmental Mesonet (Herzmann et al. 2004) server and provide hourly weather observations for most major and minor airports within the United States [the specific airports used in this study are described in section 2b(4)]. Note that the full data record from each airport varies in length, based on the dates each individual airport started collecting data and instances of missing data. METAR variables that are critically important for aviation, and a brief summary of their impacts on aircraft or airport operations, can be found in Table 1.

Table 1. General METAR categories and typical impacts on aviation operations.

b. Aviation performance datasets

The FAA maintains a database that contains information on the performance of the aviation industry and can be used to determine the impacts of specific weather types on the operational efficiency of the aviation community. These datasets contain information on the operational efficiency of airports, airlines, or airspaces and are available through the FAA Operations and Performance Data database (FAA 2008a). A majority of the datasets contained within the database are freely accessible; however, higher-resolution or more specific information is only accessible with specific clearance or through the use of a Freedom of Information Act (FOIA) request. The datasets used in this study are the Aviation System Performance Metrics (ASPM) (FAA 2008b), Airline Service Quality Performance (ASQP) (FAA 2008c), and the Operations Network (OPSNET) (FAA 2008d). Data from the ASQP reported in this study were acquired using a FOIA request. ASQP is a novel dataset for this type of analysis because it uses aviation performance metrics with inputs from actual airline schedules and airspace efficiency data, unlike previous works and publicly available data sites, which do not currently use these metrics (e.g., Robinson 1989; NOAA 2018).

1) Aviation System Performance Metrics

The ASPM dataset contains performance and efficiency information for 77 U.S. airports and designated ASPM carriers. The 77 ASPM airports are shown in Fig. 1, with those highlighted for detailed analysis in red. To study the impacts of weather and climate on airport operations, individual airports that had both a complete METAR and ASPM record were selected for in-depth analysis. Also, airports that have not seen major changes in traffic due to airline mergers and subsequent dehubbing (such as at Memphis International Airport; Wilson 2015) were used so that average delay and cancellation rates remained relatively constant. These selection criteria are needed because it is difficult to determine the impacts of weather on aviation operations when airport traffic and demand vary over time. Finally, individual airports representative of different regions of the contiguous United States with different geographies and climates were also identified and used. ASPM records based on preliminary data are available on a next-day basis and are then updated over the next 6–8 weeks after the end of each calendar month. For this project, data from the ASPM Data Download module and the ASPM Weather Factors Frequency Report module from October 2003 to July 2015 were used and required a FOIA request to obtain.

Fig. 1.

Locations of the 77 ASPM airports. Airports in red are highlighted in this work for considering regional impacts of specific weather types and events.

Citation: Journal of Applied Meteorology and Climatology 58, 3; 10.1175/JAMC-D-17-0277.1

The ASPM Weather Factors Frequency Report gives pertinent information on the occurrence and impacts of certain weather types/conditions (e.g., low visibility, thunderstorms) at individual ASPM airports for all scheduled operations. The four weather types/conditions that either directly apply to airports or can be calculated using METARs include severity of local weather conditions, visibility in statute miles, ceiling in hundreds of feet, and wind speed in knots. Within each Weather Factors Frequency Report, an impact severity category is assigned to the weather conditions based upon information given by the FAA, with 1 being the lowest severity, 3 being the highest, and plus or minus signs (+ or −) indicating a change to severity impact based on past observations. The visibility, ceiling, and wind speed thresholds vary from airport to airport so that the severity categories correctly match the expected impact on the specific airport as determined by the FAA. These expected impacts are based on the historical relationship between the weather and percent on-time arrivals at that airport. The following procedure is used to determine these thresholds: 1) For each airport and each impacting type/condition, a frequency distribution report displaying the variable value and percent on-time arrivals based on flight plan is developed. 2) The frequency data are used to determine the values at which percent on-time consistently changed to a different level. 3) Using values determined in the previous step, minimum and maximum values are determined for all weather severity categories (none, minor, moderate, and severe). 4) The percentages of scheduled operations in each weather category are examined to determine reasonableness. Note that the severity ratings for binary weather events (such as the presence of snow or drizzle) are the same for all airports.
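For illustration, the four-step threshold procedure can be sketched in a few lines of Python. The visibility bins, on-time percentages, and the 5-point change criterion below are hypothetical placeholders; the FAA derives the actual thresholds from historical performance data.

```python
# Hypothetical sketch of the four-step threshold procedure for one
# airport and one variable (visibility, statute miles). All numbers
# and the `jump` criterion are illustrative, not actual FAA values.

def find_breakpoints(values, on_time_pct, jump=5.0):
    """Step 2: flag the variable values (sorted ascending) at which
    percent on-time rises by more than `jump` points over the prior bin."""
    return [values[i] for i in range(1, len(values))
            if on_time_pct[i] - on_time_pct[i - 1] > jump]

def severity_bounds(breaks, lo=0.0, hi=10.0):
    """Steps 3-4: convert breakpoints into (min, max) bounds for the
    severity categories, from worst conditions (severe) to best (none)."""
    edges = [lo] + sorted(breaks) + [hi]
    labels = ["severe", "moderate", "minor", "none"][:len(edges) - 1]
    return {lab: (edges[i], edges[i + 1]) for i, lab in enumerate(labels)}

# Step 1 would build the frequency distribution from historical flight
# data; here it is supplied directly as illustrative bins.
vis = [0.25, 0.5, 1, 2, 3, 5, 10]    # visibility bins (statute miles)
pct = [55, 62, 75, 78, 88, 91, 92]   # percent on-time arrivals per bin
bounds = severity_bounds(find_breakpoints(vis, pct))
```

Step 4 (a reasonableness check on the share of operations falling in each category) would follow by tabulating observation counts against `bounds`.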

Similar to METAR weather types, the Weather Factors Frequency Report has a report for discrete weather types such as fog and snow. This report assigns severity ratings for each weather type so that severity ratings are consistent among all airports. Note that modifiers/descriptors that are attached to weather types, as described below, also have their own severity rating. Weather types that are considered minor/severity level 1 include drizzle (DZ), haze (HZ), spray (PY), unknown precipitation (UP), and modifiers such as blowing (BL) and low drifting (DR). Weather types that are considered moderate/severity level 2 include rain (RA), mist (BR), smoke (FU), dust (DU), sand (SA), and showers (SH). Weather types that are considered severe/severity level 3 include snow (SN), snow grains (SG), ice crystals (IC), ice pellets (PL), hail (GR), small hail or snow pellets (GS), fog (FG), volcanic ash (VA), dust/sand whirls (PO), squalls (SQ), funnel cloud, tornado, or waterspout (FC, +FC), sandstorm (SS), dust storm (DS), thunderstorm (TS), and freezing (FZ).
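These weather-type severity assignments map naturally onto a lookup table. The sketch below simply encodes the ratings listed above for programmatic use; it is a convenience representation, not an official FAA product.

```python
# Severity ratings for the METAR weather-type codes listed above:
# 1 = minor, 2 = moderate, 3 = severe.
SEVERITY = {
    # minor (severity 1)
    "DZ": 1, "HZ": 1, "PY": 1, "UP": 1, "BL": 1, "DR": 1,
    # moderate (severity 2)
    "RA": 2, "BR": 2, "FU": 2, "DU": 2, "SA": 2, "SH": 2,
    # severe (severity 3)
    "SN": 3, "SG": 3, "IC": 3, "PL": 3, "GR": 3, "GS": 3, "FG": 3,
    "VA": 3, "PO": 3, "SQ": 3, "FC": 3, "SS": 3, "DS": 3, "TS": 3, "FZ": 3,
}
```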

The Data Download module provides hourly information on all flights, on-time flights, delayed flights, airport capacity, and airport efficiency for all domestic flights included within the ASPM. For this work, 17 different metrics, out of the 79 reported in the Data Download module, are identified to present an overall view of aviation operations. Note that these 17 metrics, found in Table 2, do not include cancelled or diverted flights (discussed later in this section). These metrics are combined to calculate overall delay and cancellation statistics that are provided in the ASQP dataset.

Table 2. ASPM Data Download module metrics, available at hourly intervals, used in this study.

2) Airline Service Quality Performance

The Airline Service Quality Performance provides information about airline on-time performance, flight delays, and cancellations. It is based on data filed by airlines each month with the Department of Transportation’s Bureau of Transportation Statistics as described in 14 Code of Federal Regulations (CFR) Part 234 (U.S. Department of Transportation 2012). Airlines with one percent or more of the total domestic passenger revenue are required to report certain flight data for all flights within the contiguous United States; in addition, smaller carriers can voluntarily report. Since the start of the ASQP module in June 2003, the number of carriers reporting has varied from 10 to 19 (not shown). The ASQP provides the time and cause of departure or arrival cancellations for a given airport. An arrival cancellation is a flight that was scheduled to arrive at an airport but was cancelled because of conditions/issues at the arrival airport. The reportable cancellation causes are carrier, weather, NAS, and security. Carrier cancellations are the result of a decision made by the air carrier. Weather cancellations are caused by hazardous or extreme weather that manifests or is forecasted at the point of departure, at the point of arrival, or en route. NAS cancellations are the result of issues within the National Airspace System (such as equipment failures). Security cancellations are caused by security concerns (such as threats against an aircraft) (FAA 2008c). The ASQP also provides the daily total minutes of carrier-reported delays for the following causes: carrier, weather, NAS, security, and late arrival (U.S. Department of Transportation 2012). Carrier, NAS, and security delays are defined in the same way as cancellations. Late arrival delays are due to the late arrival of the same aircraft at a previous airport. Late arrival delays can cause a ripple effect throughout the system; this ripple effect is known as delay propagation (FAA 2008c).

3) Operations Network

The OPSNET provides information on the performance of the NAS for air sectors and select airports as mandated by the Federal Aviation Administration Order 7210.55F (FAA 2017). The OPSNET provides the daily total minutes of NAS delays for the following causes: weather, volume, equipment, runway, and other. Since the ASQP and OPSNET are collected by different groups within the aviation community, differences can exist when comparing delay causes between the two modules.

4) Focus airport characteristics

As mentioned above in section 2b(1), we identified 10 focus airports, highlighted in red in Fig. 1. Brief descriptions of the characteristics of these airports follow.

John F. Kennedy International Airport (JFK) is an airport serving the New York City area and is an international gateway into the United States. In 2014, the airport had 422 415 plane movements and carried 53 254 533 passengers (Port Authority of New York and New Jersey 2017). JFK’s northerly latitude and coastal location provide cold, snowy winters and warm summers. It is also influenced by operations at other airports in the region such as La Guardia (LGA) and Newark (EWR).

Hartsfield–Jackson Atlanta International Airport (ATL) has been the busiest passenger airport (i.e., number of passengers utilizing the airport) in the world since 1998 and the busiest operations (i.e., number of takeoffs and landings) airport in the world since 2003 (City of Atlanta 2017). ATL’s weather is typical for an airport located in the southeast United States with hot summers and mild winters. Afternoon thunderstorms dominate during the summer period with few occurrences of winter precipitation.

Miami International Airport (MIA) is a major international gateway to the Caribbean and Central and South America. It currently ranks second in the United States in number of international travelers. In 2015, it carried 44.3 million passengers and had 409 324 flight operations (Miami International Airport 2017). MIA’s southern and coastal location provides a warm climate that experiences afternoon thunderstorms and impacts from tropical cyclones.

Minneapolis–St. Paul International Airport (MSP) is a large airport serving the northern Midwest United States. In 2015, 36 582 854 passengers traveled through MSP with 404 762 airport operations (Metropolitan Airports Commission 2017). Peak operations occur during daylight hours with some freight and charter operations occurring at night (Metropolitan Airports Commission 2017). MSP’s northerly latitude and central location provide cold, snowy winters and temperate summers.

Chicago O’Hare International Airport (ORD) is an important international gateway and regional hub for the Midwest United States. In 2010, ORD was the world’s third busiest airport in terms of annual passengers (Chicago Department of Aviation 2017). In 2014, ORD replaced ATL as the world’s busiest operations airport (Chicago Department of Aviation 2017). ORD’s weather is characterized by mild summers and cold winters with many occurrences of frozen precipitation and fog.

Dallas Love Field (DAL) is unique in that it was largely restricted to short domestic flights until 13 October 2014, when the Wright Amendment was repealed (West 2013). Originally intended to deter use of DAL and other Metroplex airports and to promote flights out of neighboring Dallas–Fort Worth International Airport (DFW), the Wright Amendment restricted flights to Texas and four neighboring states; it was modified over the years to reduce restrictions before its eventual repeal. Because of the restrictions on flight distances, a majority of the flights captured by the ASPM are shorter-distance flights, which are primarily affected by the local climate of the southern Midwest United States. In 2014, the airport served 4 724 225 passengers (City of Dallas Aviation Department 2017). DAL’s weather is characterized by hot summers, thunderstorms primarily in the spring, and mild winters with few instances of frozen precipitation.

Salt Lake City International Airport (SLC) is a major airport serving the western United States. In 2014, the airport served 21 141 610 passengers and averaged 315 daily scheduled departures (Salt Lake City International Airport 2017). Because of SLC’s high elevation, high-density altitudes are a major concern for airport operations. SLC’s western and central location provides cold, snowy winters and warm, dry summers.

Sky Harbor International Airport (PHX) is a major airport serving the desert Southwest. In 2015, PHX had 44 006 205 passengers carried and 440 411 total operations (City of Phoenix Aviation Department 2017). PHX’s desert location provides a hot and dry climate with few observations of precipitation. High temperatures in summers can result in high-density altitudes that are detrimental to airport operations (Goodman and Small Griswold 2018).

Seattle–Tacoma International Airport (SEA) is a large airport serving the northwestern United States. In 2015, 42 340 537 passengers passed through SEA with a total of 381 408 flights averaging 1045 daily operations (Port of Seattle 2017). SEA’s northwestern location provides a temperate climate.

San Francisco International Airport (SFO) is a major transpacific gateway located in California, averaging 3702 weekly flights and carrying 50 067 094 passengers in 2015 (San Francisco International Airport 2017). SFO is unique in that its runways are placed closely together, thus creating unique separation challenges during periods of low visibility. SFO’s coastal and midlatitude location provides a temperate climate.

c. Analysis methods

After reading in the METAR data, limited quality-control methods were employed, such as removing values that are physically impossible (e.g., relative humidities of 300%) or outside of the National Centers for Environmental Information (NCEI) climate extremes, or that reflect obvious mistakes in transferring the data [such as the appearance of a random code (@#!2) for a temperature value]. Of the approximately 600 000 observations per airport, less than 1% of the values required quality control.
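A minimal quality-control pass of this kind can be sketched as follows; the temperature bounds and the helper name `qc_temperature` are illustrative placeholders, not the NCEI extremes used in the study.

```python
# Hypothetical sketch of the quality control described above: drop
# observations that are garbled or outside plausible bounds. The
# bounds here are illustrative, not the NCEI climate extremes.

def qc_temperature(temps_c, lo=-62.0, hi=57.0):
    """Return (kept, n_rejected) after removing non-numeric or
    out-of-range temperature values."""
    kept, rejected = [], 0
    for t in temps_c:
        try:
            v = float(t)
        except (TypeError, ValueError):  # garbled entries like "@#!2"
            rejected += 1
            continue
        if lo <= v <= hi:
            kept.append(v)
        else:                            # physically implausible value
            rejected += 1
    return kept, rejected
```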

After quality controlling the METAR datasets, the severity binning scheme [i.e., applying a value of minor (1), moderate (2), or severe (3) weather impact] was applied to the local weather codes, visibility, ceiling, and wind speed data. Local weather codes, that is, dichotomous (observed yes or no) weather types defined using the METAR data, are assigned the same severity ratings as described for the ASPM. For each observation, the overall weather severity is the highest severity among all codes present. For example, if a weather code such as FZDR is observed, the two severity values would be 3 (FZ) and 1 (DR), giving an overall severity of 3 due to the presence of FZ. To eliminate overreporting (or overrepresentation) of the local weather codes and ASPM severity bins, multiple observations within a 1-h period were reduced to one. These additional observations per hour result from aviation special weather reports (SPECI), which are METARs issued on a nonroutine basis as dictated by changing meteorological conditions. For each hour, the highest severity score of each category (local weather codes, visibility, ceiling, and wind speed) was recorded as the severity score for that hour, so that the highest severity is always retained after removal of the extra observations. The same method was applied to construct precipitation occurrence and obscuration occurrence arrays.
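The per-hour severity reduction can be sketched as follows, assuming a severity lookup table like the ASPM assignments (only an excerpt is shown); the function names are hypothetical.

```python
# Sketch of the hourly severity reduction described above: split a
# METAR weather group into two-letter codes, score each, and keep the
# maximum score per hour so extra SPECI reports do not over-count.
# The severity table is an excerpt of the full ASPM assignments.

SEVERITY = {"DZ": 1, "DR": 1, "RA": 2, "BR": 2,
            "SN": 3, "FG": 3, "TS": 3, "FZ": 3}

def group_severity(group):
    """Severity of one weather group, e.g. 'FZDR' -> max(3, 1) = 3."""
    codes = [group[i:i + 2] for i in range(0, len(group), 2)]
    return max((SEVERITY.get(c, 0) for c in codes), default=0)

def hourly_severity(obs):
    """obs: list of (hour, weather_group) tuples; returns {hour: max
    severity}, collapsing multiple reports within the same hour."""
    out = {}
    for hour, group in obs:
        out[hour] = max(out.get(hour, 0), group_severity(group))
    return out
```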

1) Monthly METAR and weather codes

Average monthly values, computed using hourly observations, were found for all METAR variables except the binary weather codes (such as fog) and the discrete ASPM severity codes; for those values, the total monthly count was found because the original values are binary rather than continuous. This method was also used to determine monthly hourly patterns (e.g., diurnal cycles) from the daily data. The monthly time series were then linearly detrended to remove trends caused by natural variability or changes in observation techniques. Only values that occur frequently and are important in analyzing aviation weather are shown; for example, funnel clouds are relatively rare and are not analyzed individually, although they are included in the overall weather severity score plots. Finally, the monthly average value was subtracted from the linearly detrended monthly time series to derive monthly anomaly time series (departures from the expected monthly value, e.g., monthly counts of thunderstorms).
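The detrend-and-deseasonalize step can be sketched with NumPy; the function name is illustrative, and the input series would be the monthly averages or counts described above.

```python
# Sketch of the anomaly calculation described above: linearly detrend
# a monthly time series, then subtract each calendar month's mean of
# the detrended values to obtain monthly anomalies.

import numpy as np

def monthly_anomalies(series):
    """series: 1-D array of monthly values starting in January."""
    t = np.arange(len(series), dtype=float)
    slope, intercept = np.polyfit(t, series, 1)    # fit linear trend
    detrended = series - (slope * t + intercept)   # remove trend
    months = t.astype(int) % 12                    # calendar month index
    clim = np.array([detrended[months == m].mean() for m in range(12)])
    return detrended - clim[months]                # monthly anomalies
```

A series consisting only of a linear trend yields anomalies that are numerically zero, as expected.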

2) Weather delay statistics

To determine the average impact of a certain weather phenomenon, such as high winds or thunderstorms, two values are determined. First, the average of the delay statistics (e.g., average airport departure delay, average arrival delay, and airport efficiency score from Table 2) was calculated from the ASPM and OPSNET whenever that weather phenomenon was present. Second, the total number of hourly observations in which that weather phenomenon was present was determined. To accurately define the impact of weather on commercial operations, only weather and delay data observed during active airport hours were included. This was done to avoid skewing the values when little to no airport movement was occurring (e.g., in the middle of the night). Departure and arrival statistics as a function of time of day were used to determine airport activity (more than five flights departing or arriving per hour). All airports become active between 6 a.m. and 9 a.m. local time, with activity continuing through the day, and become inactive between 9 p.m. and midnight local time, remaining inactive until morning (figure not shown). As the overall number of aircraft that operate at an airport or utilize the airspace increases, the risk of airport congestion increases, overall efficiency decreases, and delays increase. Including time periods in which little to no aircraft movement occurs would produce an unrepresentative view of the challenges airlines and other operators face as a result of weather and would skew the apparent impact of weather on airport operations. For example, if only one aircraft takes off at midnight and 80 aircraft take off at noon, the impacts of congestion on airport efficiency will be completely different, and comparing the two time frames may lead to improper conclusions about airport efficiency.
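The conditional averaging can be sketched as follows; the record fields (`ops`, `wx`, `delay`) and the five-flight activity threshold mirror the description above, but the data structure itself is hypothetical.

```python
# Sketch of the weather delay statistics described above: restrict to
# active hours (more than five flights per hour), then average a delay
# metric over hours when a phenomenon was observed, and count them.

def weather_delay_stats(records, phenomenon, min_ops=5):
    """records: list of dicts with keys 'ops' (flights this hour),
    'wx' (set of observed phenomena), 'delay' (avg delay, minutes).
    Returns (mean delay, number of hours with the phenomenon)."""
    hours = [r for r in records
             if r["ops"] > min_ops and phenomenon in r["wx"]]
    if not hours:
        return 0.0, 0
    mean_delay = sum(r["delay"] for r in hours) / len(hours)
    return mean_delay, len(hours)

# Illustrative hourly records; the midnight hour (ops = 1) is excluded.
sample = [
    {"ops": 40, "wx": {"TS"}, "delay": 30.0},
    {"ops": 1,  "wx": {"TS"}, "delay": 90.0},
    {"ops": 50, "wx": {"RA"}, "delay": 10.0},
    {"ops": 60, "wx": {"TS", "RA"}, "delay": 50.0},
]
```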

3) Cancellation statistics

To determine airline cancellations due to weather, the weather phenomena present whenever a flight cancellation occurred during either departure or arrival at the airport in question were recorded. An extreme weather cancellation is caused by extreme or hazardous weather conditions, such as a hurricane or blizzard, that are forecasted or manifest at the point of departure, en route, or at the point of arrival. To better understand the impacts of each weather phenomenon, the average number of flight cancellations per hourly observation during which a given phenomenon (e.g., snow or fog) was present was determined. This can give an operator or dispatcher an idea of the number of cancellations to expect during different types of weather. However, some estimation error may occur because a flight may be cancelled due to weather at the departure airport, weather en route, or weather observed in the hour before or after the expected departure or arrival time, and because any of these effects can propagate throughout the NAS.
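The cancellations-per-observation metric can be sketched similarly; the input layout (per-hour phenomena set and cancellation count) is hypothetical.

```python
# Sketch of the cancellation rate described above: cancellations per
# hourly observation during which a given phenomenon was present.

def cancellations_per_hour(hourly, phenomenon):
    """hourly: list of (phenomena_set, n_cancellations) per hour."""
    present = [n for wx, n in hourly if phenomenon in wx]
    if not present:
        return 0.0
    return sum(present) / len(present)
```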

3. Results

a. Delays and cancellations for the 77 ASPM airports

Figure 2 shows the percentage breakdown of ASQP delay causes, OPSNET delay causes, and ASQP cancellation causes for the 77 ASPM airports over the date range October 2003–July 2015. When looking at the ASQP delay causes, only 5% of delays are caused by extreme weather. However, when considering that carrier, NAS, and late-arrival delays can all be attributed to nonextreme weather, the impact of weather on aviation performance can be considered much higher. When looking at the causes of OPSNET delays, weather is responsible for the largest share of delays, at 82% of the total delay minutes. This shows the overall negative impact of weather on the ability of the NAS to function properly. Knowing that the great majority of NAS delays are attributed to weather, the overall impact of weather on delays recorded by the ASQP is correspondingly greater. When looking at the ASQP cancellation causes, extreme weather is the greatest contributor to airline cancellations, at 42% of the total. This again highlights the overall negative impact weather has on aviation and the need to better understand and predict its effects on the aviation community.

Fig. 2.

ASQP delay, OPSNET delay, and ASQP cancellation causes for ASPM 77 airports. Data range used was from October 2003 to July 2015.

Citation: Journal of Applied Meteorology and Climatology 58, 3; 10.1175/JAMC-D-17-0277.1

b. Delay and cancellation climatology by airport

It is important to look at airports individually when determining the impact of weather on operations. First, climates vary across the United States, causing differences in weather delays and cancellations from one airport to another. For example, airports in the Gulf Coast portion of the United States may be more concerned about thunderstorms or rain instead of snow due to their southerly location. Another reason to look at airports individually is that airports have different layouts, terrain features, proportions of carriers, and airline schedules that cause varying responses to specific weather types. For example, some airports may better handle fog or low visibility operations due to the layout of runways.

As mentioned in section 2, a more focused analysis was completed on 10 airports. The airports selected for in-depth analysis (shown in red in Fig. 1) are ORD, DAL, ATL, MIA, MSP, JFK, PHX, SLC, SFO, and SEA.

1) Average airport delays

There are two ways to look at airport delays and efficiency: either by focusing on the four weather severity categories (none, minor, moderate, and severe) or by focusing on distinct weather types (e.g., drizzle, rain, snow, fog, mist, and haze) and descriptors (freezing and thunderstorm). This allows for a broad view of how weather severity and weather type impact arrival and departure delays and how much the efficiency of each airport is reduced.

Table 3 shows the average delay statistics for all 10 airports, including the percentage of weather in each severity category, the length of arrival delay (in minutes), the length of departure delay (in minutes), and the airport efficiency score (in percent). The airport efficiency score is a metric with one being the best score and zero being the worst score. As discussed in Table 2, many metrics contribute to determining the overall efficiency of the airport. The airport efficiency score is multiplied by 100 and expressed as a percentage, with 100% being a perfect score (FAA 2008b). For all 10 airports, the “none” severity category, in which no significant weather is reported, occurs most often, with an average occurrence across all 10 airports of 87%. MIA and PHX have the highest occurrences in the “none” category, with averages of 92% and 96%, respectively. The second most frequently reported weather severity category, again across all 10 airports, is the “moderate” category, with a 10-airport average of 8.4%. SEA and JFK have the highest occurrences of moderate weather, with averages of 12% and 16%, respectively. The “severe” weather occurrence average across all 10 airports is 3%, with MSP and ORD tied at 6% of their total weather falling into this category. The airports reporting the fewest occurrences of severe weather are PHX and SFO, with 1% of their weather falling into this category. The “minor” category accounts for the smallest percentage of weather events for 8 of the 10 airports, with a 10-airport average of only 1.6%. Notably, for all 10 airports, the largest arrival and departure delays are due to severe weather. Departures are delayed, on average, by 65.8 min and arrivals by 62.9 min. JFK experiences the longest departure delays (80 min) whereas SLC experiences the shortest (50 min). ORD experiences the longest arrival delays (79 min) whereas SLC experiences the shortest (49 min).
Airport efficiency scores, hereafter referred to as airport scores, which indicate the reduction in efficiency of an airport due to the various weather severity categories, are most impacted by severe weather. On average, the 10 airports scored 82% during severe weather, an 18% reduction in airport efficiency. The airport most impacted by severe weather, DAL, has an airport score of 75%, a 25% reduction in airport efficiency. The airport least affected by severe weather, using this metric, is SFO, with a score of 89%, an 11% reduction in efficiency. In summary, Table 3 clearly shows that for all airports, as the severity of the weather increases, so does the overall impact on airport delays (arrivals and departures) and airport efficiencies.

Table 3.

Statistics for weather severity categories (%) with most common severity in bold, average airport departure delays (minutes) with the longest delay in bold, average airport arrival delays (minutes) with the longest delay in bold, and airport efficiency scores (%) for weather impacts with the lowest airport score in bold. The mean across all airports is included as well.

Table 3.

Table 4 shows the average delay statistics, including the percentage of occurrences that fall into each weather type category (drizzle, rain, snow, fog, mist, and haze) and descriptor (freezing and thunderstorm), the lengths of departure and arrival delays (in minutes), and the airport score (in percent). The two weather type categories that occur most often (highest average percentage of occurrence) are rain and mist. For rain the average annual occurrence across all airports is 39%, and 54.5% when considering only airports for which rain was the dominant weather type (MIA, PHX, SEA, and SFO). For mist the average annual occurrence was 32.2%, and 36.8% when considering only airports for which mist was the dominant weather type (JFK, ATL, MSP, ORD, DAL, and SLC). For MIA and PHX, the low or nonoccurrence of snow and freezing weather precludes an assessment of arrival and departure delays and airport score for those categories. When examining the delay statistics, we can identify the types of weather that result in the longest departure and arrival delays.

Table 4.

Statistics for weather type categories (%) from METAR with the most common type in bold, average airport departure delays (minutes) with the longest delays in bold, average airport arrival delays (minutes) with the longest delays in bold, and airport scores (%) for weather types with the lowest airport score in bold. The mean across all airports is included as well.

Table 4.

Average airport departure delays are most impacted by freezing (an average of 83 min across all airports), thunderstorms (74 min), and fog (72 min). The airports most impacted by freezing (JFK, ATL, MSP, SLC, and SEA) have an average departure delay of 96.4 min. The airports most impacted by thunderstorms (JFK, ORD, DAL, and SFO) have an average departure delay of 90.6 min. The airports most impacted by fog (MIA and PHX) have an average departure delay of 98 min.

Similar to departures, across all airports the average airport arrival delays are most impacted by freezing (an average of 67 min), thunderstorms (68 min), and fog (74 min). The airports most impacted by freezing (JFK, ATL, MSP, SLC, and SEA) have an average arrival delay of 74 min. The airports most impacted by thunderstorms (JFK, ORD, DAL, and SFO) have an average arrival delay of 84 min. The airports most impacted by fog (MIA, SLC, and PHX) have an average arrival delay of 90 min.

The combination of weather events in each city results in decreases in airport efficiencies. For JFK, freezing precipitation and thunderstorms have the highest overall impact, resulting in airport scores of 68% and 74%, respectively. For ATL, freezing precipitation, snow, and thunderstorms have the greatest impacts on efficiency (57%, 74%, and 76%, respectively). The largest reduction in efficiency for both MIA and PHX is due to fog, with airport scores of 72% and 66%, respectively. Like JFK and ATL, MSP and SLC are most impacted by freezing events, with airport scores of 72% and 71%, respectively. Thunderstorms impact ORD, DAL, and SFO the most, resulting in scores of 70%, 72%, and 88%, respectively.

Additionally, Table 4 allows for the identification of which weather type will have the largest cumulative impact based on the frequency of occurrence. For example, at JFK the most frequently occurring weather type is mist (40% of recorded weather events), with an average departure delay of 61 min and an average arrival delay of 59 min. Rain is a close second, occurring 33% of the time, with 63-min departure delays and 61-min arrival delays. The aggregate impact of these two weather types (rain and mist) is critical for scheduling and airport operations: they account for 73% of the weather occurrences, with an average departure delay of 62 min and arrival delay of 60 min. However, it is important to note that while these events (mist and rain) occur more frequently, the average airport score when aggregating mist and rain events is 90%, compared to the low airport scores of 68% for freezing events and 74% for thunderstorms [the two weather types with the longest departure delays (109 min) and arrival delays (90 min)].

For ATL, ORD, DAL, and SLC the two most common weather types are mist (40%, 35%, 39%, and 32% respectively) and rain (32%, 26%, 32%, and 27% respectively). Mist and rain account for 72% of ATL weather events with an average departure delay of 53 min and arrival delay of 59 min, with an airport score of 91.5%. ORD experiences average departure delays of 64.5 min, arrival delays of 72 min, and an airport score of 88% when mist and rain occur (61% of weather events). Mist and rain account for 71% of DAL weather events with an average departure delay of 48 min and arrival delay of 40 min, with an airport score of 90%. SLC experiences average departure delays of 43.5 min, arrival delays of 42.5 min, and an airport score of 90% when mist and rain occur (59% of weather events).

For MIA, SEA, and SFO the two most common weather types are rain (58%, 54%, and 40%, respectively) and mist (20%, 32%, and 36%, respectively). Rain and mist account for 78% of MIA weather events, with an average departure delay of 78.5 min and arrival delay of 52.5 min, and an airport score of 88%. SEA experiences average departure delays of 44 min, arrival delays of 42.5 min, and an airport score of 96% when rain and mist occur (86% of weather events). Rain and mist account for 76% of SFO weather events, with an average departure delay of 60 min, an arrival delay of 63.5 min, and an airport score of 93%.

MSP’s weather is diverse and dominated by mist (35%), rain (22%), and snow (22%) accounting for 79% of weather events. The aggregate impact of mist, rain, and snow leads to average departure delays of 48.6 min, average arrival delays of 52 min, and an average airport score of 87.3%. PHX’s most common weather types are rain (66%) and thunderstorms (16%), accounting for 82% of weather events. The aggregate impact of rain and thunderstorm events results in departure delays of 58.5 min, arrival delays of 76.5 min, and an average airport score of 84%. These average delays allow planning and preparation for the most common events and their aggregate impacts.

2) Departure and arrival cancellations

Figure 3 shows normalized cancellations per month for arrivals and departures while Fig. 4 shows normalized cancellations per hour for arrivals and departures at the 10 focus airports for the weather types (drizzle, rain, snow, fog, mist, and haze) and descriptors (freezing and thunderstorm). Because of the variability of the airports, the data were normalized by the largest monthly value for each airport, allowing for easy intercomparison.
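The per-airport normalization described above can be sketched as follows. Only the ATL January peak of 688 cancellations is taken from the Fig. 3 caption; the other monthly values are invented for illustration:

```python
# Hypothetical monthly average cancellation counts for one airport (ATL-like);
# only the January peak of 688 is drawn from the Fig. 3 caption.
monthly_cancellations = {"Dec": 430, "Jan": 688, "Feb": 512, "Jul": 95}

# Normalize by the largest monthly value for this airport so that the peak
# month maps to 1.0 and all other months are fractions of that peak.
peak = max(monthly_cancellations.values())
normalized = {month: count / peak
              for month, count in monthly_cancellations.items()}
```

Because each airport is scaled by its own maximum, airports with very different absolute cancellation counts (e.g., ATL vs SEA) can be compared on the same 0–1 axis.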

Fig. 3.

Average number of departure and arrival cancellations per month for the 10 selected airports considering all weather types normalized to the maximum value for each year for each airport. The maximum values used for normalization of departure cancellations are as follows (airport, average cancellations per hour, month): JFK, 285, February; ATL, 688, January; MIA, 70, September; MSP, 116, February; ORD, 800, February; DAL, 56, February; SLC, 139, January; PHX, 78, January; SEA, 67, January; and SFO, 129, December. The maximum values used for normalization of arrival cancellations are as follows (airport, average cancellations per hour, month): JFK, 286, February; ATL, 715, January; MIA, 70, February; MSP, 116, February; ORD, 814, February; DAL, 24, February; SLC, 139, January; PHX, 86, January; SEA, 68, January; and SFO, 139, December.


Fig. 4.

Normalized average number of departure and arrival cancellations per hour for each weather type/observed phenomena for the 10 selected airports. The maximum values used for normalization of departure cancellations are based on the maximum across all weather types for each airport. The maximum values used for normalization of departure cancellations are as follows (airport, average cancellations per hour, weather type): JFK, 6.1, freezing; ATL, 43.1, freezing; MIA, 0.4, fog; MSP, 1.5, freezing; ORD, 9.9, freezing; DAL, 1.8, snow; SLC, 1.4, freezing; PHX, 0.7, fog; SEA, 2.2, freezing; and SFO, 4.7, fog. The maximum values used for normalization of arrival cancellations are as follows (airport, average cancellations per hour, weather type): JFK, 5.0, freezing; ATL, 40.6, freezing; MIA, 0.5, drizzle; MSP, 1.3, freezing; ORD, 8.3, freezing; DAL, 1.3, snow; SLC, 1.0, fog; PHX, 0.3, mist; SEA, 1.8, freezing; and SFO, 0.5, drizzle.


The most common cause of departure and arrival cancellations was freezing precipitation during the winter months of December–February (JFK, ATL, MSP, ORD, SLC, and SEA). For departures, fog was an important cause of cancellations at MIA, PHX, and SFO. ORD and DAL also experience high occurrences of cancellations due to fog and thunderstorms. Behind winter weather, fog is the most common reason for cancellations at MSP and SLC. For MIA, the most cancellations occur in August–October (Fig. 3) and are likely associated with high numbers of cancellations per hour due to rain, thunderstorms, mist, and moderate and severe wind speeds (not shown). Airports that do not typically experience winter weather (SFO, PHX, and MIA) have cancellations dominated by obscuring phenomena such as fog and haze or by precipitation such as drizzle, rain, or mist. Arrival and departure cancellations occur at similar frequencies for the same weather type. In summary, Figs. 3 and 4 clearly show that for all airports, regional and seasonal weather patterns (e.g., freezing precipitation or thunderstorms) determine the overall impact on airport departure and arrival cancellations as well as airport efficiencies.

4. Discussion and conclusions

An understanding of delay and cancellation statistics and how they relate to weather severity and type is critical for understanding weather impacts on aviation operations. Because of climatological differences between airports in the various parts of the United States, each airport will experience a unique set of weather types/descriptors as reported in the METAR data. These weather events are then categorized in the FAA datasets (ASPM and ASQP) and provide insight into the frequency and impacts of severe weather on delays and cancellations.

For all airports, delays and decreases in efficiency (lower airport scores) increase as the severity increases from “none” to “severe” (Table 3). The variability for delays between airports is a result of the frequency of occurrence of the weather types/descriptors. For example, impacts to ATL increase with freezing precipitation, snow, and thunderstorms and result in the greatest reduction in efficiency (airport score) at ATL (Table 4). The infrequent occurrences of snow and freezing precipitation mean that other weather phenomena, such as fog or thunderstorms, are likely to have higher annual impacts on the operating efficiency of ATL. Unlike ATL, ORD experiences greater delay impacts from thunderstorms (Table 4). Also, the impact from snow is far lower; however, snow observations are more common than in ATL, meaning that the overall annual impact is higher. For DAL thunderstorms and fog cause the greatest reduction in efficiency and result in more delays, unlike snow or freezing precipitation. DAL’s airport configuration or the smaller number of flights compared to ATL or ORD may mean that DAL is able to better handle snow and ice events. For MSP, the largest reductions in efficiency come from freezing precipitation, thunderstorms, and fog. However, these are less common than other phenomena, meaning that snow or rain observations have a greater overall impact. This also suggests that MSP is equipped to handle snow better than other airports, such as ATL, where snow is less common. Freezing precipitation and thunderstorms create the highest overall efficiency impacts for JFK (longest delays and lowest airport score); thunderstorms occur more frequently and therefore have a greater impact on JFK than does freezing precipitation. Freezing precipitation and fog have the highest impacts on SLC and are commonly observed during the winter months. 
SEA is impacted mostly by obscuring phenomena, frozen and nonfrozen precipitation, reduced visibility, and low ceilings (Table 4). For SFO, even though there are low numbers of observations of snow, freezing precipitation, and thunderstorms, their impacts result in delays and lower airport scores, especially for thunderstorms. Other, more common weather types, such as rain, mist, and haze, have a more significant impact on SFO. PHX has very minimal weather impacts when compared to other airports, although the largest delays and lowest airport score are due to fog. MIA is similar to PHX in that its dominant weather type is rain and that the greatest delays and lowest airport scores result from fog (Table 4).

Each airport has its own unique annual cancellation cycle for departures and arrivals (Fig. 3) and its own primary weather types/descriptors associated with cancellations (Fig. 4). ATL experiences the majority of its average weather cancellations, for both arrivals and departures, in January, February, and December (Fig. 3). For ATL, both freezing precipitation and snow vastly outnumber all other weather types in cancellations per hour (Fig. 4), indicating that the vast majority of winter weather cancellations are caused by snow and freezing precipitation. ORD’s weather cancellations occur in winter as a result of overall poor weather conditions and higher occurrences of frozen precipitation and fog. Freezing precipitation, followed by fog, is responsible for the highest number of cancellations per hour (Fig. 4). Snow, thunderstorms, severe visibility, and severe wind are also responsible for a relatively high number of cancellations per hour. For DAL, weather cancellations peak in December, January, and February (with February showing the highest peak) and in September. More observations of snow, freezing precipitation, and fog result in cancellations. It is difficult to pinpoint the cause of weather cancellations for DAL in September given the relatively fair weather during that period. There is a small peak in weather cancellations in May that coincides with the peak in thunderstorms. For MSP, December, January, February, and March have the highest monthly weather cancellations (Fig. 3), owing to poorer weather conditions as a result of frozen precipitation, fog, and reduced visibility. Freezing precipitation, fog, and severe visibility have the highest cancellation rates (Fig. 4). Most weather cancellations at JFK occur in December, January, and February and are most likely a result of winter storms. Freezing precipitation and severe wind speed, in that order, have the two highest cancellation rates (Fig. 4); however, both events are relatively rare throughout the year. Snow has the third highest cancellation rate and is relatively common in the winter months, suggesting that it is responsible for a large share of the weather cancellations. December and January have the highest numbers of weather cancellations for SLC and are most likely the result of poor visibility, low ceilings, frozen precipitation, and obscuring phenomena. Fog and freezing precipitation have the highest cancellation rates, along with the more severe weather conditions (Fig. 4). Conditions with reduced visibility are especially hazardous in mountainous areas such as SLC because of the loss of visual situational awareness relative to terrain; this may be the reason for the high number of cancellations during periods of fog, severe ceiling heights, and severe visibility. Weather-induced cancellations at SEA occur most often in December and January as a result of winter weather. Overall, freezing precipitation and snow are responsible for the highest numbers of hourly cancellations (Fig. 4). At SFO, most weather cancellations occur during December, January, and February and are related to low visibility and low ceiling heights due to events such as drizzle, rain, fog, and mist (Fig. 4). PHX’s general weather patterns favor few monthly cancellations, although cancellations do peak in winter as at other airports. The low numbers also suggest that weather cancellations at PHX are likely a result of weather en route or at other airports. For MIA, the peak in wintertime weather cancellations is likely associated with weather at other airports rather than at MIA itself. However, the August–October weather cancellation peak is most likely associated with hurricanes and tropical storms.

Additionally, by combining the results from Table 3 (weather severity categories) and Table 4 (weather type categories), one can determine the impact specific weather event types have on delays in comparison to nonweather events. In Table 3, the delays calculated for the “none” category can be used as a proxy for nonweather events, which can then be subtracted from the weather-event delays in Table 4 to give an idea of what contributions are truly weather related and not influenced by other nonweather effects. For example, for JFK the average departure delay for the “none” weather severity category is 23 min and the arrival delay is 52 min. Therefore, for freezing events with average departure delays of 109 min, 23 min of that delay are likely due to non-weather-related causes. For arrivals at JFK, the 52-min “none”-related arrival delay would offset more than half of the 90-min arrival delay. From this example, we see that an understanding of both the weather and nonweather impacts is essential for planners.
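The baseline subtraction in the JFK example above works out as follows; the four delay values are the ones cited in the text, and the arithmetic is a simple sketch of the proposed correction:

```python
# JFK delays (min) as cited in the text: the "none" severity baseline from
# Table 3 and the freezing-event delays from Table 4.
none_departure, none_arrival = 23, 52
freezing_departure, freezing_arrival = 109, 90

# Subtracting the "none" baseline isolates the weather-attributable portion.
weather_departure = freezing_departure - none_departure
weather_arrival = freezing_arrival - none_arrival
```

This leaves 86 min of departure delay and 38 min of arrival delay attributable to the freezing weather itself, consistent with the observation that the baseline offsets more than half of the 90-min arrival delay.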

The results presented in this work clearly identify a strong link between a reduction in airport efficiency and weather, as shown in Table 4. The impacts on airport operations are varied and complex, ranging from weather (the focus of this work) to issues with security, which are not accounted for in this analysis. For example, in the instance of precipitation, slippery runway and taxiway conditions may require slower taxiing speeds and greater distance between aircraft on the ground; this may reduce the overall efficiency of the airport since fewer aircraft may be allowed on the movement areas and those aircraft may be taxiing at slower speeds (Oda et al. 2010). Slippery ramp areas may also increase the difficulty of other ground operations, such as pushing back an aircraft or loading cargo, and may result in gate delays or other performance decreases. The negative impacts caused by frozen or freezing precipitation require the implementation of various snow and ice removal techniques. Many of these techniques, such as applications of chemicals, can close runways or taxiways and impede airport operations. Many aircraft have equipment to remove or prevent the buildup of ice on certain areas, but these systems may not be effective on the ground. To ensure that necessary aircraft surfaces are not contaminated with ice, ground deicing must occur before takeoff. This increases the cost and complexity of operations, resulting in decreased economic and operational performance. When visibility is impacted, airports may not be able to operate in optimal conditions, thereby reducing the number of aircraft an airport can handle. For example, San Francisco International Airport cannot use parallel approaches during low visibility conditions, reducing capacity by 50% (San Francisco International Airport 2010).

The most important finding is that every airport’s delay and cancellation profile is different, and it is important to look at each airport individually. For example, while the airports in this study had similar annual cancellation patterns (Fig. 3), the actual number of cancellations varied greatly from airport to airport (e.g., the maximum number of cancellations for ATL in January was 688, whereas SEA had a January maximum of 62). Also, airports all have their own unique climates, layouts, and operating procedures, meaning that all airports will be affected differently by weather. Some airports handle certain weather phenomena, such as freezing precipitation or low visibility, better than others, and a given weather phenomenon may not be as frequent at one airport as at another. For example, a snowstorm at ATL has a larger impact (e.g., delays or cancellations) than at ORD, which is routinely impacted by snow and prepared for severe winter weather. Finally, airports all have different layouts and surrounding terrain that may make operations in certain weather types more challenging than at another airport. In all, generalizing weather impacts across all airports may prove an ineffective way to understand how weather impacts commercial operations. Also, while overall delay and cancellation statistics at a specific airport can be affected by weather events at another airport, there is currently no way to fully remove this type of impact. These methods provide useful information on the impacts specific weather types have on individual airports and provide a more meaningful view than what is provided in many of the OPSNET, ASPM, and ASQP premade reports. Delays and cancellations overall are not caused by one factor alone but rather by multiple interacting factors. Thus, the FAA datasets are limited in their scope and do not include all aspects (i.e., multiple parties) of the causes of delays and cancellations.

This work demonstrates the feasibility and usefulness of using METAR and FAA performance data in assessing the impacts of weather on cancellations and delays at a representative selection of U.S. airports. Important findings include the following:

  1. Differences in airport weather and airline operations impact the efficiencies of airport operations and it is important to assess airports individually.
  2. Understanding the differences in airport weather climatologies allows for an understanding of how inclement weather reduces efficiency.
  3. Weather impacts delays and cancellations in a way consistent with climatological weather patterns.

This analysis provides a stepping stone for developing new delay products and aids in better understanding the issues and problems caused by weather. By looking at more than the departure and arrival rates and the overall airport efficiency scores, specific problem areas can be identified and appropriate mitigation strategies applied.

Acknowledgments

The authors thank NSF for support through NSF CAREER Grant 1255649.

REFERENCES
