HAVE THE NEW SAFETY MANAGEMENT SYSTEM (SMS) POLICIES INSTATED
IN 2014 BY THE AIR TRAFFIC ORGANIZATION (ATO) AFFECTED THE
NUMBER OF AVIATION INCIDENTS?
Analysis Using NASA Aviation Safety Reporting System Data
A Project
Presented to
The Faculty of the Department of Aviation and Technology
San José State University
In Partial Fulfillment
of the Requirements for the Degree
Master of Science in Quality Assurance
by
Nabeel Oqbi
May 2016
© 2016
Nabeel Oqbi
ALL RIGHTS RESERVED
The Designated Project Committee Approves the Project Titled
HAVE THE NEW SAFETY MANAGEMENT SYSTEM (SMS) POLICIES INSTATED
IN 2014 BY THE AIR TRAFFIC ORGANIZATION (ATO) AFFECTED THE
NUMBER OF AVIATION INCIDENTS?
Analysis Using NASA Aviation Safety Reporting System Data
by
NABEEL OQBI
APPROVED FOR THE DEPARTMENT OF AVIATION AND TECHNOLOGY
SAN JOSÉ STATE UNIVERSITY
May 2016
________________________________________________________
Dr. Seth Bates, Department of Aviation and Technology
_________________________________________________________
Professor Daniel Neal, Department of Aviation and Technology
ABSTRACT
HAVE THE NEW SAFETY MANAGEMENT SYSTEM (SMS) POLICIES INSTATED
IN 2014 BY THE AIR TRAFFIC ORGANIZATION (ATO) AFFECTED THE
NUMBER OF AVIATION INCIDENTS?
Analysis Using NASA Aviation Safety Reporting System Data
Prior to 2014, studies indicated that a number of defects existed within the ATO's
SMS policies and protocols, defects that may be factors in unwanted airline accidents or
outcomes. To address these risks, the ATO enacted policy changes aimed at ensuring safer
provision of air traffic services and reducing the number of airline incidents. These new
SMS policies, effective September 1, 2014, addressed safety assurance activities and ATO
personnel roles and responsibilities, and introduced integrated safety management as part
of the SMS in an effort to prevent future incidents. This study examined aircraft incidents
related to the ATO in the Aviation Safety Reporting System database, pre- and post-policy
change, in order to discern whether these policies have affected the number of unwanted
incidents. The 37 Class B airspace airports in the United States were examined, and data
on incidents related to the ATO from the two 12-month pre- and post-periods were
analyzed. Findings indicated that the newly instated 2014 ATO SMS policies resulted in a
significant increase in the number of aviation incidents at the studied airports.
Recommendations for future research suggest that further work be done with special
attention paid to the official ASRS incident reports, the causes of these incidents, and
their correlations to SMS policies.
ACKNOWLEDGEMENTS
I would like to thank my first reader, Professor Daniel Neal, who helped me with
some important initial thoughts in this study. I would also like to thank my second
reader, Dr. Seth Bates, who helped me hone those thoughts into a cohesive work that will
add to the breadth of literature on the topic.
Moreover, a big thank you to my mother for always keeping me in her prayers, as
well as to my wife, who supported and encouraged me throughout this long process.
TABLE OF CONTENTS
CHAPTER I—INTRODUCTION.......................................................................................1
Background..............................................................................................................1
Introduction to Project .............................................................................................3
The Aviation Safety Reporting System (ASRS)......................................................4
Acronyms.................................................................................................................7
Statement of Problem...............................................................................................7
Research Question ...................................................................................................8
Hypothesis................................................................................................................8
Objective..................................................................................................................9
CHAPTER II—LITERATURE REVIEW ........................................................................10
Introduction to Literature Review..........................................................................10
The Federal Aviation Administration (FAA) ........................................................10
Early Aviation in the United States ...........................................................11
Origin of the FAA......................................................................................13
Duties of the FAA...........................................................................15
The Air Traffic Organization (ATO).....................................................................16
The Safety Management System (SMS)................................................................16
SMS Components ......................................................................................18
The Aviation Safety Reporting System (ASRS)....................................................20
ASRS Report Processing ...........................................................................21
CHAPTER III—METHODOLOGY .................................................................................23
Study Sample .........................................................................................................23
Class B Airspace Designations..................................................................23
Top 15 Movement Airports .......................................................................25
Top 15 Airports with Highest Incident Reports.........................................26
ASRS Search Criteria ............................................................................................26
Data Analysis.........................................................................................................27
Limitations.............................................................................................................27
CHAPTER IV—RESULTS AND DISCUSSION ............................................................29
T-test ......................................................................................................................31
T-test Results on Three Examined Groups................................................32
T-test Results on Written Incident Reports................................................42
Discussion..............................................................................................................45
Recommendations for Future Work.......................................................................47
CHAPTER V—CONCLUSION........................................................................................49
REFERENCES ..................................................................................................................52
APPENDICES ...................................................................................................................57
Appendix I .............................................................................................................57
Appendix II............................................................................................................58
LIST OF TABLES
Table 1. U.S. Aviation Key Events and Milestones ..........................................................12
Table 2. ATO SMS Key Events and Milestones ...............................................................17
Table 3. U.S. Class B Airspace Sites (37) .........................................................................24
Table 4. Top 15 Total Aircraft Movement Airports ..........................................................26
Table 5. Statistical Data on Incident Reports, Pre- and Post-2014 Policy Instatement.....29
Table 6. Number of Incidents, Class B Airports................................................................30
Table 7. t-Test on All Class B Airport Data ......................................................................33
Table 8. t-Test on Class B Airports with the Most Movement..........................................34
Table 9. t-Test on Class B Airports with the Highest Reported Incidents.........................36
Table 10. t-Test on 18-Month Incident Reports.................................................................38
Table 11. Number of Incidents, Class B Airports for 18-Month Study Period .................39
Table 12. t-Test on Total Movement at All Class B Airports............................................40
Table 13. t-Test on Total Movement at Class B Airports, with Outliers Removed ..........42
Table 14. Number of Incidents with ATC Noted as a Primary Problem...........................43
Table 15. t-Test on Incidents with ATC Noted as a Primary Problem..............................44
Table A. Total Movements for all Class B Airports..........................................................57
LIST OF FIGURES
Figure 1. t-Test Visualization of All Class B Airport Data ..............................................33
Figure 2. t-Test Visualization on Class B Airports with the Most Movement .................35
Figure 3. t-Test Visualization on Airports with the Highest Reported Incidents .............36
Figure 4. t-Test Visualization on 18-Month Incident Reports..........................................38
Figure 5. t-Test Visualization on Total Movement at All Class B Airports.....................41
Figure 6. t-Test Visualization on Class B Airports, with Outliers Removed ...................42
Figure 7. t-Test Visualization on Incidents with ATC as a Primary Problem..................44
CHAPTER I—INTRODUCTION
Background
Under the U.S. Federal Aviation Administration (FAA), the Air Traffic
Organization (ATO) is tasked with ensuring efficient, safe air navigation in the National
Airspace System (NAS) and in both oceanic and international airspace controlled by the
United States (ATO Policy, 2014). Utilizing the framework of their Safety Management
System (SMS), which was first conceptualized in 2004, the ATO has continued to
improve and hone safety protocol to foster more-secure air navigation in the ever-
evolving air transportation industry (Safety Management, 2012). However, while data
collected under SMS procedures have indicated that employing the system has been a
proactive approach to monitoring the safety of aviation operations overall, a number of
reports in recent years identify hindrances existing within the SMS infrastructure
regarding key areas in aviation safety. A 2003 report by the U.S. Government
Accountability Office (GAO) indicated superfluous safety procedures and inadequate
spending in certain areas (Dillingham & Rhodes, 2003), and a 2013 GAO study, which
analyzed efforts to improve aviation safety, acknowledged many of the challenges the
FAA faces and noted the administration’s continuing endeavors to create a more
comprehensive and accurate way to assess and manage risk through further development
of the SMS (FAA Efforts, 2013).
By continuing to collect and analyze data on all aspects of aviation operations
using the SMS, the ATO should be better able to identify emerging problems with safety
and to anticipate – and, therefore, reduce – potential problems that may result in injury,
death, or significant property damage (FAA Efforts, 2013). In April 2013, the key areas
identified by the GAO as needing more attention by the SMS included: (a) runway and
ramp safety, (b) airborne operational errors, (c) general aviation, and (d) pilot training.
The report indicated that the SMS data collection protocols for runway and ramp safety
were significantly insufficient and “limited to certain types of incidents, notably runway
incursions … which [do] not include runway overruns” (FAA Efforts, 2013). Regarding
airborne operational errors, evidence showed that the SMS metric for operational errors
was “too narrow to account for all potential risk” (FAA Efforts, 2013). The SMS
estimates of annual flight hours used to evaluate risk are also limited to only the general
aviation sector, and do not include commercial and military data, which may render such
estimates unreliable. Additionally, the GAO noted that the SMS did not include sufficient
regulations needed to measure pilot school inspection requirements. The GAO review
concluded by noting that though the FAA and the ATO were actively taking steps toward
resolving these issues, it would likely take several years before enough data could be
gathered to properly assess such risks and put new policies in place.
Less than 17 months later, however – on September 1, 2014 – the ATO introduced a
new version of the SMS with updated policies aimed at alleviating strategic defects and
aviation incidents (ATO Policy, 2014; ATO Safety Report, 2014; SMS Manual Version
4.0, 2014). Though there is no formal indication that these new policies were spurred by
the GAO study, the updated SMS manual includes developments that both cancel former
policies and broaden the scope of the ATO's monitoring of air traffic control and air
navigation services according to official order JO 1000.37A (ATO Policy, 2014).
This order provides a summarized explanation of the SMS policy changes put in
effect in September 2014, which are relevant to the methodology of this study. According
to Order JO 1000.37A, the SMS policy changes that were made included:
(a) Addressing continued ATO SMS management and improvement;
(b) Reinforcing ATO Safety Assurance activities;
(c) Establishing ATO SMS roles and responsibilities at all levels of the
organization;
(d) Establishing that “the ATO COO may permit temporary continued use of an
operation or system with an existing high-risk hazard to allow the responsible
Service Unit to develop and implement a plan that would mitigate the risk or
eliminate the hazard;”
(e) Introducing integrated safety management as part of the SMS; and
(f) Establishing the ATO Safety Manager (ATO Safety and Technical Training
[AJI] Group Manager for Safety Management) and ATO Chief Safety
Engineer positions (ATO Policy, 2014, 1-2).
Introduction to Project
This study examined aviation incident reports documented in the Aviation Safety
Reporting System (ASRS) database. Specifically, data from incidents related to the ATO
that were reported from the year (12 months) preceding the new SMS policies that took
effect on September 1, 2014, and from the year following the order were selected from
the ASRS for this study.
This study’s purpose was to determine the number of flight incidents related to
the ATO before and after the execution of these new SMS policies in order to establish
whether significant statistical fluctuations exist between the two examined incident-
outcome periods. A determination that a significant statistical difference exists in the
number of incidents associated with the ATO between the before-and-after examined
periods may be indicative of the effectiveness of these policy adjustments.
The Aviation Safety Reporting System (ASRS)
Established by the National Aeronautics and Space Administration (NASA), the
ASRS is used to identify issues in the aviation system that may require attention (NASA
ASRS, 2009). Data from occurrences are reported by aviation personnel (including pilots,
mechanics, and air traffic controllers) to ASRS staff, and are then supplied to the FAA
and used to prevent future incidents (ASRS Program Briefing, 2014). This informing and
evaluation process may be limited, as not all incidents are reported and those that are
reported may include bias or misinformation.
The occurrences reported via the ASRS, which are submitted by various aviation
personnel, are coded according to a number of defining signifiers set in place by the
Department of Transportation (DOT) and predetermined elements related to different
types of occurrences. Specifics from the collective data from these reported issues can be
cross-referenced and examined to expose unsafe conditions or protocol that lead to
unwanted outcomes. Therefore, the information gathered by the ASRS can be explored to
uncover patterns leading to potential hazards, and can be used by aviation administrators to
create rules and regulations aimed at avoiding such hazards.
The types of issues that may be reported via the ASRS include (listed in the order in which
they are presented in the official government title): (a) "aircraft accident," (b) "civil
aircraft,” (c) “fatal injury,” (d) “incident,” (e) “operator,” (f) “public aircraft,” (g)
“serious injury,” (h) “substantial damage,” and (i) “unmanned aircraft accident” (Title 49,
2016).
(a) Aircraft accident. This term refers to “an occurrence associated with the
operation of an aircraft which takes place between the time any person boards the aircraft
with the intention of flight and all such persons have disembarked, and in which any
person suffers death or serious injury, or in which the aircraft receives substantial
damage” (Title 49, 2016).
(b) Civil aircraft. This term refers to "any aircraft other than a public aircraft" (Title 49, 2016).
(c) Fatal injury. This term refers to "any injury [that] results in death within 30 days of the accident" (Title 49, 2016).
(d) Incident. This term refers to "an occurrence other than an accident, associated with the operation of an aircraft, which affects or could affect the safety of operations" (Title 49, 2016).
(e) Operator. This term refers to "any person who causes or authorizes the operation of an aircraft, such as the owner, lessee, or bailee of an aircraft" (Title 49, 2016).
(f) Public aircraft. This term refers to “an aircraft used only for the United States
Government, or an aircraft owned and operated (except for commercial purposes) or
exclusively leased for at least 90 continuous days by a government other than the United
States Government” (Title 49, 2016). Additionally, this term does not encompass aircraft
that are owned by the government and are transporting property for commercial purposes,
among other stipulations.
(g) Serious injury. This term refers to any injury that “(i) requires hospitalization
for more than 48 hours, commencing within 7 days from the date the injury was
received; (ii) results in a fracture of any bone (except simple fractures of fingers, toes, or
nose); (iii) causes severe hemorrhages, nerve, muscle, or tendon damage; (iv) involves
any internal organ; or (v) involves second- or third-degree burns, or any burns affecting
more than 5 percent of the body surface” (Title 49, 2016).
(h) Substantial damage. This term refers to “damage or failure which adversely
affects the structural strength, performance, or flight characteristics of the aircraft, and
which would normally require major repair or replacement of the affected component”
(Title 49, 2016).
(i) Unmanned aircraft accident. This term refers to “an occurrence associated
with the operation of any public or civil unmanned aircraft system that takes place
between the time that the system is activated with the purpose of flight and the time that
the system is deactivated at the conclusion of its mission, in which: (i) any person suffers
death or serious injury; or (ii) the aircraft has a maximum gross takeoff weight of 300
pounds or greater and sustains substantial damage” (Title 49, 2016).
The ASRS occurrences examined in the context of this study refer only to the
above-mentioned “incidents;” however, it is important to discern these classifications
from one another in order to have a richer understanding of this study’s implications and
the duties of the ATO SMS.
Acronyms
A4A Airlines for America
ACI Airports Council International
AJI ATO Safety and Technical Training
AOV Air Traffic Safety Oversight Service
ASAP Aviation Safety Action Program
ASRS Aviation Safety Reporting System
ATC Air Traffic Control
ATO Air Traffic Organization
CAA Civil Aeronautics Authority
CAB Civil Aeronautics Board
CASS Continuing Analysis and Surveillance System
C.A.S.E. Coordinating Agency for Supplier Evaluation
COO Chief Operating Officer
DOT Department of Transportation
FAA Federal Aviation Administration
IATA International Air Transport Association
ICAO International Civil Aviation Organization
LOB Line of Business
LOSA Line Operations Safety Assessments
MRLOSA Maintenance and Ramp Line Operations Safety Assessment
NAS National Airspace System
NASA National Aeronautics and Space Administration
NextGen Next Generation Air Transportation System
SDP Service Delivery Point
SMS Safety Management System
SRM Safety Risk Management
SRMGSA Safety Risk Management Guidance for System Acquisition
VSRP Voluntary Safety Reporting Program
Statement of Problem
Many recent studies have shown the need for continued honing of the ATO SMS
to alleviate key safety issues and to improve air transportation outcomes (ATO Safety
Report, 2014; Better Quality, 2013; Dillingham, 2014; FAA Efforts, 2013; FAA Final
Rule, 2015; FAA Requires, 2015). In an attempt to address these concerns, the ATO
revised many of its SMS policies. These revised policies went into effect on September 1,
2014, and it is yet to be seen whether or not they have been effective in inhibiting
unwanted aviation outcomes. It would be beneficial to examine the volume and specifics
of the aviation incidents that occurred both before and after the new policies were
instated in order to discern their effectiveness. Such detailed consideration of this data
may better inform aviation personnel on the advantages and disadvantages of these
policies. If it is found that the new policies correlate with an increasing number of
incidents, it may be appropriate to make further adjustments.
Research Question
Have the new Safety Management System (SMS) policies instated in 2014 by the
Air Traffic Organization (ATO) affected the number of aviation incidents?
Hypothesis
The hypothesis of this study is that the new ATO SMS policies have had a
significant influence on the number of aviation incidents since they took effect on
September 1, 2014; therefore, the number of incidents related to the ATO that occurred after
the new policies went into effect should be significantly different than the number of
incidents that occurred before the new policies went into effect.
H0: µ1 - µ2 = 0
H1: µ1 - µ2 ≠ 0
The value µ is defined as the number of incidents for selected airports per year,
wherein:
µ1 = number of incidents occurring from September 1, 2013 to August 31, 2014;
and
µ2 = number of incidents occurring from September 1, 2014 to August 31, 2015.
Objective
The main objective of this study was to examine whether the new ATO SMS
policies instated in September 2014 had an effect on the number of incidents related to
the ATO reported via the ASRS. If so, this analysis may offer valuable insight as to the
effectiveness of these policies and could provide detailed information to aviation
personnel as to which policies should be maintained as is and which need further scrutiny.
CHAPTER II—LITERATURE REVIEW
Introduction to Literature Review
This chapter provides an outline of the literature that is relevant to the purposes of
this study, including histories of the various coordinating organizations and events
leading up to the contemporaneous aviation safety policies and an overarching review of
the ATO SMS. The first section offers an overview of the FAA, including: preceding
efforts in U.S. aviation safety, formation of the FAA, and the official duties of the FAA.
The second section reviews the NAS. The third section describes the ATO. The fifth
section goes into detail about the SMS, including: the origins of the SMS, components of
the SMS, and recent successes of the SMS. The sixth and final section of this literature
review examines the ASRS, including a brief history and information on report
processing.
The Federal Aviation Administration (FAA)
The U.S. Federal Aviation Administration (FAA) states that their “continuing
mission is to provide the safest, most efficient aerospace system in the world,” and their
organization is responsible for the protocols and policies put in place to do so (About
FAA, 2015). According to the DOT, the FAA is currently the leading authority in the
international aerospace community in this regard, and they are responsible for being
responsive to “the dynamic nature of customer needs, economic conditions, and
environmental concerns” (Planning Glossary, 2012). To better understand the FAA and
their operations, it is necessary to have a cursory exploration of the organization’s
inception, history, and evolution.
Early Aviation Safety in the United States
With the increased popularity and reliance on air travel in the first half of the
twentieth century, aviation personnel came to realize that the mode of transportation
could not reach its full potential without a system of official protocols that would regulate
the standards of safety, maintenance, air traffic, and other allowances (History, 2015).
Before such legislation was enacted, local airport operators provided rudimentary air traffic
control (ATC) and support in the form of controllers who would stand on the airfield and
wave flags to help direct incoming and outgoing flights. However, as the avocation
burgeoned into an industry, a number of federal actions were passed to instate safety and
regulatory standards that evolved with the development of U.S. aviation (see Table 1 for
a summarized list of such legislation and milestone events).
The first of these actions regarding aviation safety was the Air Commerce Act of
1926, which certified aircraft and their safety requirements, established airways, issued
and enforced air traffic rules, licensed pilots, and maintained airspace navigation routes
and communication (Flener, 1968; History, 2015). Additionally, the Department of
Commerce created a new branch for aeronautics to oversee the aviation industry (History,
2015). In 1934, as the growth of aviation and the importance of the industry became more
apparent, this branch was cultivated into its own entity: The Bureau of Air Commerce
(Komons, 1978). This organization established the country’s first ATC centers, in
Newark, New Jersey; Cleveland, Ohio; and Chicago, Illinois. These centers used very
rudimentary methods to track their aircraft initially, and had no direct contact with the pilots
they were guiding. Though these techniques worked to improve safety, eventually a
number of high-profile accidents—including one involving a U.S. senator in 1935—
caused the Department of Commerce to overhaul the program and its responsibilities.
In 1938, President Franklin Roosevelt signed the Civil Aeronautics Act, which
established the Civil Aeronautics Authority (CAA) and created a three-person Air Safety
Board that conducted detailed investigations of aviation accidents in order to make
recommendations for safer procedures (Wilson, 1979). This legislation also "expanded the
government's role in civil aviation by giving CAA power to regulate airline fares and
determine the routes individual carriers served” (History, 2015). Two years later, the
CAA was divided into two different agencies: the CAA and the Civil Aeronautics Board
(CAB; Wilson, 1979). The former maintained regulation over ATC, airways, pilot and
craft certification, and safety; and the latter’s responsibilities included the economic
airline regulation, the investigation of accidents, and safety rulemaking. America’s entry
into WWII prompted the CAA to extend its ATC operations to airport towers, for safety and
efficiency reasons, and these installations became standard at most airports following the
war.
Table 1
U.S. Aviation Key Events and Milestones

Year | Event/Milestone | Significance
1926 | Air Commerce Act signed | Required regulations for pilots and passengers, and also established the nation's first air traffic regulations (Source: Komons, 1979)
1926 | Aeronautics branch created by Dept. of Commerce | Indicated growth in interest of the aviation industry in the United States (Source: Komons, 1979)
1934 | Bureau of Air Commerce formed | Had complete control and authority over the regulation of air safety and the airways (Source: Komons, 1979)
1936 | Formation of first ATC centers | Allowed for coordinated efforts and data sharing between a limited group of aviation experts for increased air safety and cooperation (Source: Komons, 1979)
1938 | Civil Aeronautics Act signed | Established Air Safety Board and Civil Aeronautics Authority (CAA); CAA had power to regulate airline fares (Source: Komons, 1979; Wilson, 1979)
1940 | Development of Civil Aeronautics Board | Separated aviation duties into two entities to ensure greater concentration on safety (Source: History, 2015; Wilson, 1979)
1941 | U.S. entry into WWII | Prompted ATC systems to include air towers at airports (Source: Wilson, 1979)
1956 | Trans World Airlines and United Air Lines collision in Arizona | One hundred twenty-eight fatalities prompted further federal mitigation of U.S. aviation safety, leading to the Federal Aviation Act (Source: Rochester, 1976)
1958 | Federal Aviation Act signed | Responsibilities led to formation of the Federal Aviation Agency (Source: Rochester, 1976; Wilson, 1979)
1967 | Formation of Dept. of Transportation | Federal Aviation Administration formed as a modal organization within the DOT (Source: Kent, 1980)
1970 | Airport and Airway Development Act signed | Increased funding for U.S. aviation substantially (Source: Kent, 1980; Murphy, 1999; Preston, 1987)
1978 | Airline Deregulation Act signed | Removed federal control over fares, market entry for new commercial airline carriers, and routes (Source: History, 2015)
1982 | National Airspace System (NAS) introduced | Placed parameters on airspace classes to better allow safety regulation (Source: History, 2015)
1988 | Aviation Safety Research Act signed | Authorized further federal funding for new aviation technologies and developments (Source: FAA Chronological History, 2011)
2001 | Transportation Security Administration (TSA) formed following 9/11 terrorist attacks | Extended spending to more security-based developments (Source: 1997-2015 Update, 2016)
2004 | Air Traffic Organization (ATO) begins operations | Consolidated FAA's air traffic services to be much more efficient and allowed for a stronger focus on public consideration (Source: 1997-2015 Update, 2016)
2004 | SMS initiated | New rules, procedures, and tools put in place to better monitor aviation safety in the NAS via data collection and analysis (Source: 1997-2015 Update, 2016; ATO Safety Report, 2014)
Origin of the FAA. After WWII, a midair collision in Arizona that killed 128
occupants of two aircraft prompted further federal action on the regulation of air traffic
(History, 2015). Two years after this accident, the Federal Aviation Act was signed in
1958, which “transferred the Civil Aeronautics Authority’s functions to a new
independent Federal Aviation Agency responsible for civil aviation safety” (History,
2015). While the initial years of the agency were a bit jumbled, as the organization was
working to grow alongside a quickly evolving industry that was continually met with new
issues, the 1967 development of the DOT created new purpose for it. It was that year, due
to President Lyndon B. Johnson’s insistence on a more comprehensive transportation
strategy for the entire nation, that the Federal Aviation Agency became the Federal
Aviation Administration (pursuant to DOT Act 49 U.S.C. App. 1651; FAA, 2014).
Duties of the FAA. The FAA’s responsibilities developed and transformed over
time, with continual challenges posing the need for restructuring and innovative thought.
Developing safety concerns—such as in-air accidents and aircraft hijackings—have
caused the FAA to shift focus a number of times, and a number of safety policies and
corresponding organizations have been formed to address these concerns.
However, according to the DOT’s Bureau of Transportation Statistics (FAA, 2014), the
FAA is currently charged with the following duties:
(a) Regulating air commerce in ways that best promote its development and safety and fulfill the requirements of national defense;
(b) Controlling the use of navigable airspace of the United States and regulating both civil and military operations in such airspace in the interest of safety and efficiency;
(c) Promoting, encouraging, and developing civil aeronautics;
(d) Consolidating research and development with respect to air navigation facilities;
(e) Installing and operating air navigation facilities;
(f) Developing and operating a common system of air traffic control and navigation for both civil and military aircraft; and
(g) Developing and implementing programs and regulations to control aircraft noise, sonic boom, and other environmental effects of civil aviation.
Of course, these responsibilities are vast in scope. Therefore, the FAA utilizes a number
of internal and external infrastructures aimed at assisting them in their endeavors. While
these infrastructures—in the forms of various safety systems, protocols, and associated
organizations—are numerous, only those most relevant to this study in particular are
explored in the following sections.
The Air Traffic Organization (ATO)
The ATO’s mission is to provide efficient and safe air navigation services in U.S.
airspaces, including services regarding air traffic management, communications,
navigation, and surveillance (ATO Policy, 2014). Specifically, according to ATO Order
JO 1000.37A, the role of the organization is encompassed by the following two
comprehensive responsibilities:
(1) Establishing and maintaining ATO's safety guidance, policies, and processes
to support mission requirements that:
(a) Are consistent with FAA policy, requirements, and guidance (e.g., current
edition of Order 8040.4, Order 8000.369, and the FAA Acquisition
Management System);
(b) Meet the NAS safety management requirements established by Order
1100.161; and
(c) Are consistent with the basic principles of safety management established by
the ATO SMS Manual; and
(2) Developing minimum requirements for NAS service level availability (ATO
Policy, 2014).
In line with these responsibilities, the ATO sets broad-sweeping initiatives and
goals for each fiscal year, and strives to accomplish them. In their most recent
safety report, FY 2014, such goals included “measuring more accurately the effectiveness
of our hazard mitigation strategies; developing a more dynamic, data-driven training
curriculum; developing tools to improve our ability to analyze safety risks; advancing
Next Generation Air Transportation System safety initiatives; aligning the assumptions
built into our safety simulation models; and addressing safety issues affecting the
organization as a whole” (ATO Safety Report, 2014, p. 1). Additionally, the COO noted
that out of more than 25 million flights monitored by the ATO, the organization fully
complied with 99.994 percent of the FAA’s air traffic operations, and addressed over 100
safety concerns that had been previously uncovered.
According to the ATO’s 2014 internal safety report, the organizations “data
collection, safety assessment, and risk mitigation efforts—all essential to risk-based
decision making—are made possible by the policies, procedures, and tools that compose
our Safety Management System” (ATO Safety Report, 2014).
The Safety Management System (SMS)
The ATO SMS is an integrated set of policies, procedures, and tools used – in part
– to collect data on aviation safety (ATO Safety Report, 2014). The system is a
“multidisciplinary, integrated, and closed-loop framework used to help maintain safe and
efficient air navigation services and infrastructure throughout the NAS and in United
States-controlled international/oceanic airspace” (ATO Policy, 2014, 1-2). Working
directly in support of the FAA’s mission, this formalized approach to safety was first
initiated in 2004, with its policies officially adopted by the ATO three years later (Safety
Management, 2012). By 2008, correlations were found between the SMS and aviation
performance, indicating that the changes made according to data uncovered by SMS
protocol were resulting in fewer unwanted flight outcomes. The SMS procedures are
largely guided by safety, which is the principal consideration of all ATO activities.
According to the SMS Manual (Version 4.0, 2014), safety is defined as “the state in
which the risk of harm to persons or property damage is acceptable” (p. 1).
The policies and procedures used by the SMS have evolved since its inception.
An overview of many of the key events and milestones leading up to the SMS’s current
state is outlined in Table 2.
Table 2
ATO SMS Key Events and Milestones

Date | Event/Milestone | Significance
Dec 7, 2000 | Executive Order 13180 signed | Air Traffic Organization (ATO) formed, with intentions of improving the provision of air traffic operations (Source: Executive Order 13180, 2000)
Nov 23, 2006 | ICAO Annex 6, Part 1 | Directed all member states to require that an aviation operator implement and abide by an SMS, effective Jan 1, 2009 (Source: Safety Management Systems Update, N 8900.133, 2010)
Jul 23, 2009 | Advanced Notice of Proposed Rulemaking published | Solicited public guidance toward rulemaking and development of an SMS (Source: ANPRM, 2009)
Jul 28, 2010 | Bill HR 5900 introduced | Outlined the Airline Safety and Federal Aviation Administration Act of 2010, which required that all Part 121 Air Carriers implement an SMS (Source: H.R. 5900, 2010)
Aug 1, 2010 | Public Law 111-216 | HR 5900 signed into law (Source: Airline Safety and Federal Aviation Administration Extension Act, 2010)
Nov 18, 2010 | ICAO Annex 6, Amendment 33 | Required operators to implement an SMS in accordance with the ICAO SMS Framework, including 4 new components and 12 new elements (Source: SMS Quick Reference Guide, 2015; Swickard, 2010)
Aug 1, 2012 | PL 111-216 final rule modification | Mandated a final rule to be issued that required Part 121 Certificate Holders to implement an SMS (Source: SMS Quick Reference Guide, 2015)
Nov 14, 2013 | ICAO Annex 19 | Consolidated existing safety management provisions (Source: SMS Quick Reference Guide, 2015)
Jun 1, 2014 | Safety Assurance System deployed | "New FAA oversight system to be used for oversight of certificate holders under 14 CFR part 121, 135, and 145. SAS deployment in Flight Standards offices including FSDOs, CMOs and IFOs will be complete in September, 2015" (Source: SMS Quick Reference Guide, 2015, p. 2)
Aug 1, 2014 | SMS Voluntary Program activated | Service providers who are implementing SMS participate voluntarily using new program standards (Source: SMS Quick Reference Guide, 2015)
Sep 1, 2014 | New SMS policies in effect | New policies aimed at further ensuring the safety of air traffic in the NAS are put in order (Source: Air Traffic Organization Policy, Order JO 1000.37A, 2014)
SMS components. Every task and procedure in the SMS is a component of a
greater component that is working toward aviation safety—much as the SMS is part of
the ATO, and the ATO is part of the FAA. The SMS operates under four components in
efforts to create an all-encompassing strategy to manage and ensure safety in U.S.
airspace. These components are: (1) safety policy; (2) safety risk management (SRM); (3)
safety assurance; and (4) safety promotion (SMS Manual, 2014; Safety Management,
2012). Every procedure and process of the SMS is completed with consideration of
these components and in order to foster a positive safety culture amongst ATO personnel.
(1) Safety policy. The first of these components, safety policy, “[e]stablishes
senior management’s commitment to continually improve safety [, and also] defines the
methods, processes, and organizational structure needed to meet the safety goals” (SMS
Quick, 2015, p. 1). The guidance, methods, standards, and rules that the ATO has put
place in this regard serve to not only establish and execute safety policy for the tasks at
hand, but also to improve upon the SMS proactively and to ensure NAS safety (ATO
Policy, 2014).
(2) SRM. The second component, SRM, “[d]etermines the need for, and adequacy
of, new or revised risk controls based on the assessment of acceptable risk” (SMS Quick,
2015, p. 1). The policies that fall under this component are, first and foremost, utilized
“by ATO safety practitioners to identify hazards, analyze and assess their risks, determine
safety performance targets, and implement and track appropriate risk controls for all air
traffic operations, facilities, equipment, and systems in the NAS” (ATO Policy, 2014, p.
1-3).
(3) Safety assurance. Safety assurance “[e]valuates the continued effectiveness of
implemented risk control strategies, supports the identification of new hazards” (SMS
Quick, 2015, p. 1). This component ensures that the policies and procedures enacted by
ATO personnel under the other three components are operated according to the
organization's standards and expectations (ATO Policy, 2014). Safety assurance ensures
that the processes and efforts of ATO personnel are not fruitless, and safeguards these
processes against adverse effects on the organization. If adverse effects are
identified as a result of faulty or superfluous SMS methods, safety assurance is the
component that ensures that the issue is solved accordingly by assessing the risk via data
collection, discovering a safer solution, and implementing the solution throughout the
entire system accordingly. Regular audits and performance reviews may be conducted
under safety assurance to establish the best possible outcomes.
(4) Safety promotion. The final component, safety promotion, “[i]ncludes training,
communication, and other actions to create a positive safety culture within all levels of
the workforce” (SMS Quick, 2015, p. 1). Safety promotion uses SMS data, processes,
and the behaviors and daily interactions of ATO personnel to advocate safety amongst
every facet of the organization’s formalized systems (ATO Policy, 2014). Employee
conduct and adherence to SMS rules and regulations are closely monitored under this
component.
The Aviation Safety Reporting System (ASRS)
The ASRS is a voluntary reporting system that gathers information submitted by
aviation personnel, including air traffic technicians, cabin crews, dispatchers,
maintenance technicians, and pilots (ASRS Program Briefing, 2014). Such reports can be
filed confidentially and add to a collective database of aviation accidents and incidents in
U.S. airspace; the system's focus is particularly concerned with human performance and
human error. The purpose of the ASRS is to improve U.S. aviation safety by identifying
deficiencies and discrepancies, and to provide the submitted data for planning and
improvement of the NAS. The ASRS staff is composed of highly experienced aviation
personnel from a range of professions, and they boast "over 470 cumulative years of
aviation expertise covering the full spectrum of aviation activity: air carrier, corporate,
military, and general aviation; Air Traffic Control in Towers, TRACONs, Centers, and
Military Facilities," and their analysts' "cumulative flight time exceeds 140,000 hours in
over 50 different aircraft" (ASRS Program Briefing, 2014, p. 7).
ASRS report processing. The FAA “considers the filing of a report with NASA
concerning an incident or occurrence involving a violation of 49 U.S.C. subtitle VII or
the 14 CFR to be indicative of a constructive attitude," and completed reports are sent to
Moffett Field in California (ASRS Program Briefing, 2014, p. 9). During
the first days of the ASRS, the program averaged around 400 reports per month, and
report submissions have increased significantly over the years. Recent figures note that
report intakes number, on average, 1,684 per week and 6,736 per month (p. 12). Since the
program’s inception in 1981, a total of 1,140,440 reports were submitted throughout 2013,
with the vast majority of reports coming from air carrier personnel in the last decade (p.
13-14).
Reports that are submitted are completely confidential, and are coded, with
personal information stricken out to maintain anonymity (ASRS Program Briefing, 2014).
The online submission form offers a number of boxes that the volunteer reporter may
check regarding certain criteria, and also provides ample space for the individual to write
out a narrative response describing the event (NASA ASRS, 2009). Additionally, the
report form prompts reporters to offer up suggestions as to what they believe could help
alleviate the issue they are reporting. Once the reports are received, two ASRS
professional analysts screen them within three working days to categorize and assess the
content (ASRS Program Briefing, 2014). If analysts have questions or concerns when
processing the report, they may contact the reporter to inquire for further details or
clarification. Reports are run through a final check for quality assurance, and are then
coded accordingly for analysis.
CHAPTER III—METHODOLOGY
This study analyzed ASRS incidents related to the ATO from two different
yearlong periods: one before and one after the effective date of the new SMS policies and protocols.
These periods, specifically, ranged from September 1, 2013 to August 31, 2014, and from
September 1, 2014 to August 31, 2015. The incidents examined were reported via the
ASRS and were associated specifically with the ATO. Additionally, all incidents
analyzed in this study occurred at airports in Class B airspaces in the United States. It
was hypothesized that the recent adjustment in ATO SMS policies would result in a
significant statistical difference in the number of flight incidents that were reported.
Study Sample
According to the ACI, North America handles more cargo and passengers than
any other region on the planet (ACI – North America, 2015). The data analyzed in this
study was pulled from three different groups, including: (1) all 37 Class B airports in the
United States; (2) the top 15 airports with the most movements (based on the 37 Class B
airport list); and (3) the top 15 airports with the highest number of incidents according to
the ASRS criteria search (based on the 37 Class B airport list).
Class B airspace designations. There are 37 Class B airspace sites in the United
States (Airspace Designations, 2013). Class B airspace areas are some of the busiest air
traffic sites in the country. According to their official FAA designation, Class B airspace
areas are defined as “airspace from the surface to 10,000 feet mean sea level (MSL)
surrounding the nation’s busiest airports in terms of airport operations or passenger
enplanements” (Procedures for Handling Airspace Matters, 2012, 4.14.1.2). Each
individual Class B airspace site is defined and tailored by the FAA, and consists of a
surface area and at least two more layers. Each Class B site is also “designed to contain
all published instrument procedures” (Procedures, 2012, 4.14.1.2). Additionally, in Class
B airspace zones, an “ATC clearance is required for all aircraft to operate in the area, and
all aircraft that are so cleared receive separation services within the airspace” (Procedures,
2012, 4.14.1.2). Incidents from these airport sites were specifically utilized in this study
because they make up a large enough sample to be representative of overall flight
activity in the United States.
The 37 Class B airspace sites examined in this study were listed in the FAA
ATO’s policy statement on “Airspace Designations and Reporting Points” (2013). Each
of these Class B airspaces contained at least one primary airport site, and all aircraft
operators within each site are subject to minimum aircraft equipment requirements,
operation rules, and pilot qualification requirements in accordance with Federal Aviation
Regulation 14 CFR 91.131 (Airspace Designations, 2013). The Class B airspace sites
examined in this study are sorted in Table 3 according to their geographic locations by
state, including their official civilian titles and their International Air Transport
Association (IATA) airport codes.
Table 3
U.S. Class B Airspace Sites (37)
State IATA Airport Code Civilian Title
Arizona PHX Phoenix Sky Harbor International
California LAX Los Angeles International
California NKX Marine Corps Air Station Miramar
California SAN San Diego International
California SFO San Francisco International
Colorado DEN Denver International
Florida MCO Orlando International
Florida MIA Miami International
Florida TPA Tampa International
Georgia ATL Hartsfield-Jackson Atlanta Intl.
Hawaii HNL Honolulu International
Illinois ORD Chicago-O’Hare International
Kentucky CVG Cincinnati/N. Kentucky Intl.
Louisiana MSY Louis Armstrong New Orleans Intl.
Maryland ADW Andrews Air Force Base
Maryland BWI Baltimore/Washington Intl.
Massachusetts BOS Boston-Logan International
Michigan DTW Detroit Metro Wayne County
Minnesota MSP Minneapolis-St. Paul Intl.
Missouri MCI Kansas City International
Missouri STL Lambert-St. Louis International
Nevada LAS Las Vegas-McCarran Intl.
New Jersey EWR Newark Liberty International
New York JFK New York-John F. Kennedy Intl.
New York LGA New York-LaGuardia
North Carolina CLT Charlotte Douglas International
Ohio CLE Cleveland Hopkins International
Pennsylvania PHL Philadelphia International
Pennsylvania PIT Pittsburgh International
Tennessee MEM Memphis International
Texas DFW Dallas-Ft. Worth International
Texas HOU Houston-Hobby (Secondary Class B)
Texas IAH Houston-George Bush Intl.
Utah SLC Salt Lake City International
Virginia DCA Ronald Reagan Washington Natl.
Virginia IAD Washington Dulles International
Washington SEA Seattle-Tacoma International
Source: Airspace Designations and Reporting Points (Order JO 7400.9X). (2013). FAA Air Traffic
Organization.
Top 15 movement airports. The second study group utilized by this study was
the top 15 airports according to movement, which also fall within the 37 Class B airports
listed above. According to 2014 data from the Airports Council International (ACI), total
movements, in this sense, refer to the landings and takeoffs of aircraft. The top 15
airports in this group are listed in Table 4, including their state, IATA airport code, and
their total aircraft movements for 2014.
Table 4
Top 15 Total Aircraft Movement Airports (According to Class B Airspace Classification)
State IATA Airport Code Total Aircraft Movements
Illinois ORD 891,933
Georgia ATL 868,359
California LAX 708,674
Texas DFW 679,820
Colorado DEN 565,525
North Carolina CLT 545,178
Nevada LAS 522,399
Texas HOU 499,802
California SFO 431,633
Arizona PHX 430,461
New York JFK 422,415
Pennsylvania PHL 419,253
Minnesota MSP 412,586
Florida MIA 402,663
New Jersey EWR 395,524
Source: Airports Council International – North America (2015). 2013 North American Airport Traffic
Summary.
Top 15 airports with highest incident reports. The third study group utilized by
this study was the top 15 airports with the highest number of incident reports according
to the ASRS search criteria described below, which also fall among the 37 Class B
airports listed above.
ASRS Search Criteria
The ASRS database was searched using criteria designed to identify incident reports
that were specifically related to the ATO. These selections were made to best assess
fluctuations that may have occurred in accordance with ATO SMS policies. The search
included the following specifications: (a) incidents occurring from September 1, 2013 to
August 31, 2014, and from September 1, 2014 to August 31, 2015; (b) incidents
occurring at Class B airport sites; (c) all available options on the ASRS "Reported
Organization" criteria listing; (d) all available options on the ASRS "Reported Function"
criteria listing; and (e) the ASRS "Contributing Factors" options "ATC Equipments,"
"NAV Facility," and "Building." A sketch of how such a selection could be reproduced
from an ASRS export follows.
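As a purely illustrative companion to these criteria, the sketch below shows one way such a pre/post selection could be reproduced in Python from a CSV export of ASRS query results. The file name asrs_export.csv and the column labels "Date" and "Airport" are hypothetical placeholders rather than actual ASRS field names; the airport codes are those listed in Table 3.

```python
# Minimal sketch (not the author's actual workflow) of reproducing the
# pre/post selection from an ASRS CSV export. Column names are assumptions.
import pandas as pd

CLASS_B = {
    "PHX", "LAX", "NKX", "SAN", "SFO", "DEN", "MCO", "MIA", "TPA", "ATL",
    "HNL", "ORD", "CVG", "MSY", "ADW", "BWI", "BOS", "DTW", "MSP", "MCI",
    "STL", "LAS", "EWR", "JFK", "LGA", "CLT", "CLE", "PHL", "PIT", "MEM",
    "DFW", "HOU", "IAH", "SLC", "DCA", "IAD", "SEA",
}

reports = pd.read_csv("asrs_export.csv", parse_dates=["Date"])
at_class_b = reports["Airport"].isin(CLASS_B)

# The two 12-month windows around the September 1, 2014 policy change.
pre = reports[at_class_b & reports["Date"].between("2013-09-01", "2014-08-31")]
post = reports[at_class_b & reports["Date"].between("2014-09-01", "2015-08-31")]

# Per-airport incident counts, zero-filled for airports with no reports.
airports = sorted(CLASS_B)
pre_counts = pre.groupby("Airport").size().reindex(airports, fill_value=0)
post_counts = post.groupby("Airport").size().reindex(airports, fill_value=0)
print(pre_counts.sum(), post_counts.sum())
```

The resulting per-airport counts feed directly into the t-test described in the next section.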
Data Analysis
A pooled-variance t-test for the difference between means was used to analyze the
per-airport incident counts gathered from the ASRS, as illustrated in the sketch below.
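The following is a minimal sketch, not the study's original computation, of how that pooled-variance two-sample t-test can be run on the per-airport counts reported in Table 6 (the full Class B group); SciPy's ttest_ind with equal_var=True implements the pooled-variance (Student's) test, and the same call would be repeated for the other two groups.

```python
# Sketch of the pooled-variance two-sample t-test applied in this study,
# using the per-airport incident counts from Table 6 (all 37 Class B
# airports, listed in the same order as the table).
import numpy as np
from scipy import stats

pre_2014 = np.array([0, 0, 0, 1, 1, 0, 0, 0, 0, 1, 0, 1, 1, 0, 0, 0, 1, 1, 1,
                     0, 0, 1, 0, 0, 1, 0, 0, 1, 0, 0, 2, 0, 1, 1, 0, 0, 0])
post_2015 = np.array([1, 6, 0, 0, 3, 3, 0, 1, 0, 2, 0, 2, 2, 1, 0, 1, 1, 1, 1,
                      0, 0, 2, 0, 2, 1, 2, 4, 2, 0, 0, 0, 2, 0, 0, 3, 3, 0])

# equal_var=True gives the pooled-variance (Student's) two-sample t-test.
t_stat, p_two_tailed = stats.ttest_ind(pre_2014, post_2015, equal_var=True)

print(f"totals: {pre_2014.sum()} vs {post_2015.sum()}")
print(f"means:  {pre_2014.mean():.4f} vs {post_2015.mean():.4f}")
print(f"t = {t_stat:.3f}, two-tailed p = {p_two_tailed:.4f}")
```

The printed totals and means should match the "All Class B Airports" column of Table 5 (15 vs. 46 incidents; means of roughly 0.4054 and 1.2432).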
Limitations
Though this inclusive study sample group offers general insight pertaining to
the purpose of this study, a number of factors could, in fact, contribute to disparities in
the number of incidents that take place in the pre- and post-periods examined herein. For
example, trends at certain airports show higher volumes of aircraft movement than at
others. While this sample was chosen, in part, because Class B airspace sites tend to
impact the civilian populace to the greatest extent, this must be taken into consideration.
In an ideal scenario, all 37 Class B airports would also rank, in a descending order,
amongst the top 37 airports for (a) total annual commercial passenger counts, (b) top
cargo airports, and (c) total movements. However, that is simply not the case. In fact, the
2014 traffic count for the top five North American airports in those three categories is as follows
(in descending order):
(a) Annual Commercial Passenger Counts: Atlanta, 96,178,899 passengers; Los
Angeles, 70,663,265; Chicago O’Hare, 69,999,010; Dallas/Ft. Worth, 63,554,402; and
Denver, 53,472,514 (ACI – NA, 2015).
(b) Annual Cargo Volume (in Metric Tons): Memphis, 4,258,531; Anchorage,
2,492,754; Louisville, 2,293,231; Miami, 1,998,779; and Los Angeles, 1,816,269 (ACI –
NA, 2015).
(c) Total Aircraft Movements (incl. Landing and Take-Off Flights): Chicago,
881,933; Atlanta, 868,359; Los Angeles, 708,674; Dallas/Ft. Worth, 679,820; and Denver,
565,525 (ACI – NA, 2015).
A more in-depth study that has the resources to analyze a larger sample of airports, and to
cross-examine them from a number of different perspectives would likely yield better and
more-comprehensive reports.
Additionally, as ASRS reports are submitted on a voluntary basis, and it is
unlikely that every aviation incident is properly reported, it is likely that the frequency of
incidents is underrepresented. Because the ASRS report form does not offer strict
guidelines or parameters, and allows for a more qualitative rather than quantitative
responding mechanism, it may also be that different types of incidents are affected to a
different extent, or elements of incidents may be under- or over-reported. Such a
reporting mechanism also allows significant room for bias.
Lastly, the Automatic Dependent Surveillance-Broadcast (ADS-B) was excluded
as a contributing factor, as its compliance date is noted as January 2020 (Automatic
Dependent Surveillance-Broadcast, 2010). While ADS-B is a potential contributing
factor regarding the number of incidents reported and, therefore, examined in this study,
it has not yet reached its mandate date and is not considered here.
This has the potential to skew this study’s results.
CHAPTER IV—RESULTS AND DISCUSSION
Of the three groups examined within this study—all Class B airports, the top 15
Class B airports with the most movement, and the 15 Class B airports with the most
reported incidents—all of them experienced significant increases in their total number of
incidents following the instatement of the new 2014 ATO SMS policies. These truncated
findings are represented in Table 5.
Table 5
Statistical Data on Incident Reports, Pre- and Post-2014 Policy Instatement
Group | Year | Total Incidents | Mean | Standard Deviation
All Class B Airports | 2014 | 15 | 0.4054 | 0.55073
All Class B Airports | 2015 | 46 | 1.2432 | 1.4024
Most Movement Group | 2014 | 8 | 0.53333 | 0.63994
Most Movement Group | 2015 | 29 | 1.93333 | 1.3475
Highest No. of Incidents | 2014 | 7 | 0.46666 | 0.51639
Highest No. of Incidents | 2015 | 39 | 2.6 | 1.1832
It is clear that all three groups that were examined indicate very significant
increases in the number of incidents from 2014 to 2015—following the instatement of the
new ATO SMS policies. The Class B airport group showed a 206% increase in incidents,
the movement group showed a 262.5% increase in incidents, and the highest-incident
group rose 457% from 2014 to 2015. Correspondingly, the mean number of incidents
rose for all groups from 2014 to 2015. However, the standard deviations increased
for all of the examined groups over this timeframe as well.
number of incidents at all of the Class B airports examined.
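These percentage figures follow directly from the totals in Table 5. The following is a minimal Python sketch of the arithmetic, using only the totals reported in that table:

    # Percentage increase in total incidents for each examined group,
    # using the pre- and post-policy totals reported in Table 5.
    totals = {
        "All Class B airports": (15, 46),
        "Most-movement group": (8, 29),
        "Highest-incident group": (7, 39),
    }

    for group, (pre, post) in totals.items():
        pct_increase = (post - pre) / pre * 100
        print(f"{group}: {pre} -> {post} ({pct_increase:.1f}% increase)")
    # Prints increases of about 206.7%, 262.5%, and 457.1%.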
 
  
Table 6
Number of Incidents, Class B Airports
Incidents in 2014 IATA Airport Code Incidents in 2015
0 PHX 1
0 LAX 6
0 NKX 0
1 SAN 0
1 SFO 3
0 DEN 3
0 MCO 0
0 MIA 1
0 TPA 0
1 ATL 2
0 HNL 0
1 ORD 2
1 CVG 2
0 MSY 1
0 ADW 0
0 BWI 1
1 BOS 1
1 DTW 1
1 MSP 1
0 MCI 0
0 STL 0
1 LAS 2
0 EWR 0
0 JFK 2
1 LGA 1
0 CLT 2
0 CLE 4
1 PHL 2
0 PIT 0
0 MEM 0
2 DFW 0
0 HOU 2
1 IAH 0
1 SLC 0
0 DCA 3
0 IAD 3
0 SEA 0
Of the 37 Class B airports examined, only 4 locations saw decreases in the
number of incidents from 2014 to 2015, 15 reported the same number of incidents in
each year, and 18 saw increases, with 9 of those locations reporting at least a 200%
increase in incidents. Those with decreases were DFW (2 to 0), IAH (1 to 0), SAN (1 to
0), and SLC (1 to 0). Those showing the most significant increases in incidents were
CLE (0 to 4), DEN (0 to 3), DCA (0 to 3), IAD (0 to 3), and LAX (0 to 6). It is important
to note that CLE and LAX, due to their more pronounced increases in incidents, may be
considered outliers, their results possibly the product of some anomaly or unrelated
factor.
T-test
To establish whether a statistically significant difference existed between the
number of incidents reported before and after the new 2014 ATO SMS policies were put
into effect, a paired two-sample t-test for the difference between means was performed
on each of the three examined groups; because the same airports appear in both periods,
the two yearly counts form natural pairs. Data from each of the three groups were
examined to determine whether the means differed.
A t-test weighs the difference between means against the variability of the data,
and it identifies critical values that define the region(s) in which the null hypothesis is
rejected. If the computed t statistic falls within the rejection region, the null hypothesis is
rejected; if it does not, the null hypothesis stating that the means are the same cannot be
rejected. The sign of the t statistic indicates the direction of the difference: it is negative
if the first mean is smaller and positive if the first mean is larger.
P-values are another indicator of statistical significance produced by t-tests. A
P-value is a probability (hence the name) that quantifies the strength of the evidence
against the null hypothesis. Given the null distribution of the t statistic and the observed
value of the statistic, the question is whether that value lies far out in the tails of the
distribution or near its middle (and is therefore consistent with the null hypothesis).
Depending on how the hypotheses and the test statistic are defined, extreme values in
one or both tails of the distribution may be considered. The P-value provides a measure
of this extremity: the farther out the observed statistic lies, the smaller the P-value and
the stronger the evidence against the null hypothesis in favor of the alternative.
T-test results on three examined groups. This analysis used a risk, or alpha, level
of 0.05, meaning a result is treated as statistically significant if it would be expected in
5% or fewer of samples under the null hypothesis. Because the test is two-tailed, 0.025
of that probability sits in each tail; therefore, a two-tailed P-value of less than 0.05
indicates a statistically significant difference in the means, and the null hypothesis is
rejected.
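As a brief illustration of this decision rule, the sketch below (Python with SciPy, assuming the t distribution with 36 degrees of freedom used for the all-Class-B comparison) computes the two-tailed critical value and P-value for the t statistic reported for all Class B airports in Table 7, below:

    from scipy import stats

    alpha = 0.05              # two-tailed risk level used in this study
    df = 36                   # degrees of freedom for the all-Class-B test (37 pairs - 1)
    t_stat = -3.316146        # t statistic reported in Table 7

    # Two-tailed critical value: 0.025 of probability in each tail.
    t_crit = stats.t.ppf(1 - alpha / 2, df)            # about 2.028
    p_two_tail = 2 * stats.t.cdf(-abs(t_stat), df)     # about 0.002

    reject_null = (abs(t_stat) > t_crit) or (p_two_tail < alpha)
    print(t_crit, p_two_tail, reject_null)             # 2.028..., 0.002..., True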
The t-test results for all Class B airports can be seen in Table 7, with Figure 1
visually depicting the t statistic of -3.3 and the critical values of -2.02 and 2.02, which
mark the boundaries between the rejection and non-rejection regions. Because the t
statistic lies within the rejection region, the null hypothesis (that the means are the same)
is rejected. Table 7 also indicates a two-tailed P-value of 0.002, meaning the probability
of obtaining a test statistic as extreme as the one observed, if the null hypothesis were
true, is 0.002, or 0.2%. This P-value, well below the 0.05 threshold (0.025 in each tail),
likewise indicates a statistically significant difference in the means. Taken together, the
P-value and the t statistic provide strong evidence against the null hypothesis, which is
therefore rejected.
Table 7
t-Test on All Class B Airport Data
Variable 1 Variable 2
Mean 0.4054054 1.243243
Variance 0.3033033 1.966967
Observations 37 37
Pearson Correlation -0.059291
Hypothesized Mean Difference 0
df 36
t Stat -3.316146
P (T ≤ t) one-tail 0.0010461
t Critical one-tail 1.6882977
P (T ≤ t) two-tail 0.0020922
t Critical two-tail 2.028094
Figure 1
t-Test Visualization of All Class B Airport Data
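Because the output in Table 7 reports a Pearson correlation and n − 1 = 36 degrees of freedom, the test appears to have been run as a paired two-sample comparison. Under that assumption, the reported t statistic can be reconstructed from the summary values in Table 7 alone, as in this sketch:

    import math

    # Summary statistics reported in Table 7 (all 37 Class B airports).
    mean_pre, mean_post = 0.4054054, 1.243243
    var_pre, var_post = 0.3033033, 1.966967
    r = -0.059291          # Pearson correlation between the paired yearly counts
    n = 37

    # Paired-samples t statistic: mean difference divided by its standard error,
    # where var(diff) = var_pre + var_post - 2 * r * sd_pre * sd_post.
    var_diff = var_pre + var_post - 2 * r * math.sqrt(var_pre * var_post)
    t_stat = (mean_pre - mean_post) / math.sqrt(var_diff / n)
    print(round(t_stat, 3))   # about -3.316, matching the t Stat in Table 7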
The t-test results for the airports with the most movement can be seen in Table 8,
with Figure 2 visually depicting the t statistic of -3.1 and the critical values of -2.14 and
2.14, which mark the boundaries between the rejection and non-rejection regions.
Because the t statistic lies within the rejection region, the null hypothesis (that the means
are the same) is rejected. Table 8 also indicates a two-tailed P-value of 0.007, meaning
the probability of obtaining a test statistic as extreme as the one observed, if the null
hypothesis were true, is 0.007, or 0.7%. As with the first group studied, all Class B
airports, this P-value falls well below the 0.05 threshold (0.025 in each tail) and indicates
a statistically significant difference in the means; the null hypothesis is therefore
rejected.
Table 8
t-Test on Class B Airports with the Most Movement
Variable 1 Variable 2
Mean 0.5333333 1.933333
Variance 0.4095238 2.066667
Observations 15 15
Pearson Correlation -0.269159
Hypothesized Mean Difference 0
Df 14
t Stat -3.14551
P (T ≤ t) one-tail 0.0035776
t Critical one-tail 1.7613101
P (T ≤ t) two-tail 0.0071552
t Critical two-tail 2.1447867
Figure 2
t-Test Visualization on Class B Airports with the Most Movement
Data from the third and final group examined, the Class B airports with the
highest number of reported incidents, can be seen in Table 9, with Figure 3 visually
depicting the t statistic of -5.4 and the critical values of -2.14 and 2.14. Because the t
statistic lies within the rejection region, the null hypothesis (that the means are the same)
is rejected. Table 9 also indicates a two-tailed P-value of approximately 0.00008,
meaning the probability of obtaining a test statistic as extreme as the one observed, if the
null hypothesis were true, is roughly 0.00008, or 0.008%. As with the first two groups
examined, this P-value falls well below the 0.05 threshold (0.025 in each tail) and
indicates a statistically significant difference in the means; the null hypothesis is
therefore rejected.
 
  
Table 9
t-Test on Class B Airports with the Highest Reported Incidents
Variable 1 Variable 2
Mean 0.4666667 2.6
Variance 0.2666667 1.4
Observations 15 15
Pearson Correlation -0.49099
Hypothesized Mean Difference 0
df 14
t Stat -5.487955
P (T ≤ t) one-tail 0.0000399
t Critical one-tail 1.7613101
P (T ≤ t) two-tail 0.0000799
t Critical two-tail 2.1447867
Figure 3
t-Test Visualization on Class B Airports with the Highest Reported Incidents
Once the null hypothesis had been rejected by the t-tests of the three examined
groups for the one-year study periods, data collection was extended to 18-month study
periods as a means of cross-checking the data for differences in the results. Each studied
period extended six months beyond the original one-year collection periods, running
from March 1, 2013 to August 31, 2014, and from September 1, 2014 to February 29,
2016, respectively (the latter being the maximum data available up to the date of this
study). T-tests were performed on the information gathered for these time periods, as
seen in Table 10 and Figure 4. Additionally, Table 11 provides a detailed list of the
number of incidents for each of the Class B airports during this lengthened 18-month
time period.
Figure 4 visually depicts the t statistic of -3.21 and the critical values of -2.02 and
2.02. Because the t statistic lies within the rejection region, the null hypothesis (that the
means are the same) is rejected. Table 10 indicates a two-tailed P-value of approximately
0.003, meaning the probability of obtaining a test statistic as extreme as the one
observed, if the null hypothesis were true, is roughly 0.003, or 0.3%. This P-value falls
well below the 0.05 threshold (0.025 in each tail) and indicates a statistically significant
difference in the means; together with the t statistic, it provides strong evidence against
the null hypothesis, which is therefore rejected.
 
  
Table 10
t-Test on 18-Month Incident Reports
Variable 1 Variable 2
Mean 0.7837838 1.648649
Variance 1.1741742 3.123123
Observations (No. of Airports) 37 37
Pearson Correlation 0.4234077
Hypothesized Mean Difference 0
df 36
t Stat -3.216121
P (T ≤ t) one-tail 0.0013727
t Critical one-tail 1.6882977
P (T ≤ t) two-tail 0.0027453
t Critical two-tail 2.028094
Figure 4
t-Test Visualization on 18-Month Incident Reports
Table 11
Number of Incidents, Class B Airports for 18-Month Study Period
Incidents, Pre-Period (Mar 2013 - Aug 2014) IATA Airport Code Incidents, Post-Period (Sep 2014 - Feb 2016)
0 PHX 1
2 LAX 7
0 NKX 0
1 SAN 0
3 SFO 6
1 DEN 4
0 MCO 0
0 MIA 1
0 TPA 0
3 ATL 2
0 HNL 0
4 ORD 3
1 CVG 2
0 MSY 1
0 ADW 0
1 BWI 3
1 BOS 1
2 DTW 1
1 MSP 1
0 MCI 0
0 STL 0
2 LAS 3
0 EWR 0
0 JFK 2
1 LGA 2
0 CLT 3
0 CLE 5
1 PHL 2
0 PIT 0
0 MEM 0
3 DFW 1
0 HOU 2
1 IAH 1
1 SLC 0
0 DCA 3
0 IAD 3
0 SEA 1
As another means of cross-checking these results, and of probing why reported
incidents increased so significantly from 2014 to 2015, t-tests were performed on the
total number of movements at all Class B airports. This check was important because an
increase in movements at any of these locations could, statistically, lead to an increase in
the number of incidents. Data on the number of movements were collected via the
FAA’s Air Traffic Activity System (FAA ATAS, 2016). As indicated in Table 12 and
Figure 5, however, there was no significant difference in the total number of movements
between the two time periods examined. Figure 5 visually depicts the t statistic of -0.382
and the critical values of -2.02 and 2.02. Because the t statistic does not lie within the
rejection region, the null hypothesis (that the means are the same) cannot be rejected.
Table 12 indicates a two-tailed P-value of 0.7, meaning the probability of obtaining a test
statistic as extreme as the one observed, if the null hypothesis were true, is 0.7, or 70%.
This suggests that, even though the movement totals are closely aligned with this study’s
study periods, the total number of movements is not a factor influencing the results of
this study.
Table 12
t-Test on Total Movement at All Class B Airports
Variable 1 Variable 2
Mean 358933.4054 359788.1351
Variance 39,223,177,015 39,461,064,208
Observations (No. of Airports) 37 37
Pearson Correlation 0.997653623
Hypothesized Mean Difference 0
df 36
t Stat -0.382265728
P (T ≤ t) one-tail 0.352255881
t Critical one-tail 1.688297714
P (T ≤ t) two-tail 0.704511761
t Critical two-tail 2.028094001
Figure 5
t-Test Visualization on Total Movement at All Class B Airports
A detailed account of the total movement for all airports for the given study
periods is laid out in Table A in Appendix I.
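This movement cross-check can be reproduced from the per-airport totals in Table A, as in the sketch below, assuming those totals have been exported to a two-column CSV (the file and column names here are illustrative, not part of the study's materials):

    import pandas as pd
    from scipy import stats

    # Hypothetical export of Appendix I, Table A: one row per Class B airport,
    # with total movements for the pre- and post-policy study periods.
    movements = pd.read_csv("class_b_movements.csv")  # assumed columns: airport, mvmts_2014, mvmts_2015

    t_stat, p_two_tail = stats.ttest_rel(movements["mvmts_2014"], movements["mvmts_2015"])
    print(t_stat, p_two_tail)  # the study reports t of about -0.38 and p of about 0.70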
Additionally, another t-test was performed on the incident data, with the two
airport locations identified as outliers (LAX, with 6 incident reports, and CLE, with 4)
removed from the data pool to allow for an analysis free of anomalies. Table 13 and
Figure 6 depict these data. Figure 6 visually depicts the t statistic of -3.03 and the critical
values of -2.03 and 2.03. Because the t statistic lies within the rejection region, the null
hypothesis (that the means are the same) is rejected. Table 13 indicates a two-tailed
P-value of approximately 0.005, meaning the probability of obtaining a test statistic as
extreme as the one observed, if the null hypothesis were true, is roughly 0.005, or 0.5%.
The results are therefore the same even with these two outlier locations removed.
 
  
Table 13
t-Test on Class B Airports, with Outliers Removed
Variable 1 Variable 2
Mean 0.428571429 1.028571429
Variance 0.31092437 1.146218487
Observations (No. of Airports) 35 35
Pearson Correlation 0.077429263
Hypothesized Mean Difference 0
df 34
t Stat -3.038545478
P (T ≤ t) one-tail 0.002273588
t Critical one-tail 1.690924255
P (T ≤ t) two-tail 0.004547176
t Critical two-tail 2.032244509
Figure 6
t-Test Visualization on Class B Airports, with Outliers Removed
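A brief sketch of this outlier-removal re-test, assuming the per-airport incident counts from Table 6 are held in a pandas DataFrame indexed by IATA code (the file and column names are illustrative assumptions):

    import pandas as pd
    from scipy import stats

    # Hypothetical export of Table 6: per-airport incident counts for the two periods.
    incidents = pd.read_csv("class_b_incidents.csv", index_col="airport")  # assumed columns: y2014, y2015

    # Drop the two locations treated as outliers in the text, then repeat the paired test.
    trimmed = incidents.drop(index=["LAX", "CLE"])
    t_stat, p_two_tail = stats.ttest_rel(trimmed["y2014"], trimmed["y2015"])
    print(len(trimmed), t_stat, p_two_tail)  # 35 airports; the study reports t of about -3.04, p of about 0.005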
T-test results on written incident reports. To shed more light on the causes of
the increase in incidents between the two studied time periods, pre- and post-instatement
of the new ATO SMS policies, a number of the written incident reports were collected.
In each of these reports a designated professional noted either a contributing factor or a
primary problem leading to the given incident (as outlined in the literature review).
These contributing factors included ATC equipment, navigation facilities, and buildings,
among others.
After careful scrutiny of the written reports taken from the study periods, those
that indicated the “ATC Equipment / Navigation Facility / Buildings” option on the
report form as the primary problem were identified and collected to create a dataset for
further testing. The researcher then searched this sample for reports specifying ATC
irregularities as the primary problem, to gain a better understanding of whether the new
ATO SMS policies affected this aspect of aviation in particular. Table 14 shows the
locations listing ATC as a primary problem, along with the number of incidents
indicated. Interestingly, this was the case at only 12 of the examined Class B airport
locations.
Table 14
Number of Incidents with ATC Noted as a Primary Problem
Incidents in 2014 IATA Airport Code Incidents in 2015
0 LAX 3
0 SFO 2
0 ATL 2
1 ORD 0
0 CVG 1
0 BWI 1
1 DTW 0
0 CLE 3
1 PHL 1
2 DFW 0
0 HOU 2
0 DCA 1
The t-test on the incident reports specifying ATC irregularities or issues as the
primary problem can be seen in Table 15, with Figure 7 visually depicting the t statistic
of -1.95 and the critical values of -2.2 and 2.2. Table 15 indicates a two-tailed P-value of
0.08, meaning the probability of obtaining a test statistic as extreme as the one observed,
if the null hypothesis were true, is 0.08, or 8.0%. Because the P-value exceeds 0.05 and
the t statistic does not lie within the rejection region, the null hypothesis cannot be
rejected.
Table 15
t-Test on Incidents with ATC Noted as a Primary Problem
Variable 1 Variable 2
Mean 0.41666667 1.333333333
Variance 0.44696967 1.151515152
Observations (No. of Airports) 12 12
Pearson Correlation -0.718060625
Hypothesized Mean Difference 0
df 11
t Stat -1.958503222
P (T ≤ t) one-tail 0.038004087
t Critical one-tail 1.795884819
P (T ≤ t) two-tail 0.076008173
t Critical two-tail 2.20098516
Figure 7
t-Test Visualization on Incidents with ATC Noted as a Primary Problem
For samples of actual incident reports noting ATC as a contributing factor and as
a primary problem of the incident, see Appendix II.
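The filtering step described above can be expressed compactly once the written reports have been tabulated. The sketch below assumes a hypothetical CSV export with one row per report and a column holding the designated primary problem (the file and column names are assumptions, not the ASRS export format):

    import pandas as pd

    # Hypothetical flat export of the collected written reports (one row per report),
    # with the assessment fields seen in Appendix II as columns.
    reports = pd.read_csv("asrs_written_reports.csv")  # assumed columns: airport, period, primary_problem

    # Keep only reports whose designated primary problem is the ATC-related category.
    atc_primary = reports[reports["primary_problem"] == "ATC Equipment / Nav Facility / Buildings"]

    # Count incidents per airport and period, mirroring the layout of Table 14.
    counts = atc_primary.groupby(["airport", "period"]).size().unstack(fill_value=0)
    print(counts)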
Discussion
There are a number of possible reasons why the number of incidents increased
and the null hypothesis for this study was rejected.
Though there is an evident correlation between the new ATO SMS policies and
the increase in the reported number of incidents, the policies were relatively new to
employees. Since these policies took effect on September 1, 2014, and the second study
period ranged from September 1, 2014 to August 31, 2015, it is highly possible that
employee habits still reflected the old ATO policies, and that employees had not yet
become accustomed to the newer policies aimed at increasing efficiency and safety and,
therefore, at lessening the overall number of aviation incidents. Human error and
habitual practices in the workplace must be considered to this end.
Additionally, there may have been a transition period allotted for moving from
the old policies to the new, which could add to confusion among management and
employees trying to adapt to new rules and regulations. Often, adjustments to workplace
infrastructure are introduced little by little to allow for an easier adjustment to new
systems and practices; this is done to reduce the margin for employee error and to help
establish convention in the workplace. However, given that the new ATO SMS policies
were quite detailed in scope and range, and that they affected a number of divisions
within the aviation industry, it would be difficult for all changes to occur without
incident, and naïve to assume that no inaccuracies or miscalculations would arise from
the human component of the system.
Most likely, the number of incidents increased between the pre- and post-policy
periods because the new policies established roles and responsibilities at all levels of the
ATO, rather than just for management. According to ATO Policy Order JO 7200.20
(2011), these new policies also increased utilization of the internal Voluntary Safety
Reporting Program (VSRP) by all ATO personnel, regardless of their level of
responsibility. The order states that VSRP reporting “applies to all ATO personnel
directly engaged in and/or supporting air traffic services and only to events that occur
while acting in that capacity” (ATO Policy Order JO 7200.20, 2011, p. 1). It can be
assumed that this order changed ATO employees’ attitudes toward voluntary reporting,
prompting them to report more to the main voluntary aviation database, the ASRS,
which is the database used for this study. If this is found to be the case, a
cross-examination of both databases could yield deeper results that would allow the
causes of these incident reports to be pinpointed more precisely.
Of course, the study periods examined herein were relatively short in scope, and
may also include abnormalities or aberrations related to factors that are independent of
this study’s focus.
Regardless, it is clear that reported aviation incidents increased, at least at Class
B airports, after the instatement of the new ATO SMS policies in 2014. Whether these
policies are directly responsible for the increase bears further scrutiny.
 
  
Recommendations for Future Work
The researcher recommends further studies of greater duration and depth on this
topic, which could yield more definitive results and valuable information allowing
policymakers within the aviation industry to modify their methods in pursuit of their
safety and efficiency goals.
One way of doing so would be to examine longer study periods, as the data
become available, in order to gather more information for testing. This would allow for a
richer, more cohesive understanding of the problems underlying aviation incidents and
could lead to the pinpointing of specific issues that are systematic in nature and that can
be alleviated with new policies and practices.
Additionally, such examinations may already be possible with the information
that exists within the written incident reports made available by the ASRS. These reports
shed light on the specific irregularities that can lead to aviation incidents, and careful
review of their various aspects may help professionals better understand the effect that
ATO policies have on employees. This study examined only a small portion of the
available data, and not to its fullest extent; future studies may succeed in identifying
consistencies in these professional reports that indicate larger trends leading to these
incidents. If so, SMS personnel could use those findings to better shape the next round of
policies, and could make ATO employees aware of these shortcomings within the
system until such policies are officially instated.
 
  
The ASRS provides a large swath of data from which professionals can draw to
become better informed about many aspects of aviation in the United States. Data from
the database were utilized in this study to determine the number of incidents reported
before and after the 2014 ATO SMS policies were instated, and future studies could
yield valuable findings by continuing to examine these data and seeking correlations
among the specific causes and factors related to these incidents.
 
  
CHAPTER V—CONCLUSION
The ATO SMS strives to address key safety issues and improve air transportation
outcomes by revising its policies as more is learned about the specifics leading to
unwanted incidents (ATO Safety Report, 2014; FAA Efforts, 2013; FAA Final Rule,
2015). In an attempt to address these concerns, the ATO revised many of its SMS
policies in 2014, and the revisions took effect on September 1 of that year.
To assess the effectiveness of these new policies, this study examined incidents
reported in the Aviation Safety Reporting System (ASRS) database both before and after
the 2014 policies were put in place. This included an examination of the number of
incidents reported for a one-year period before the policies and for a one-year period
directly following their adoption. Additionally, the researcher examined the reports of
those incidents that indicated “ATC Equipment / Navigation Facility / Buildings” as
either a contributing factor or the primary problem, to determine whether policies related
to this specific pre-assigned indicator resulted in more or fewer incidents after the
policies were modified.
Data from this study indicate that the 2014 ATO SMS policies have not been
effective in inhibiting unwanted aviation outcomes, and may even have led to an increase
in aviation incident reports. The researcher believes that, due to the mandates of ATO
Policy Order JO 7200.20 (2011) regarding employee duties, the new policies increased
the volume of reports because ATO personnel at all levels were given the authorization
and responsibility to utilize the organization’s Voluntary Safety Reporting Program to
relay adverse occurrences within the system. Because this order extended to all
employees, rather than only to those at the management level or those who work in
higher-risk positions related to air traffic control (such as air traffic controllers), it is fair
to assume that the number of reports would increase drastically. The increased number
of incidents reported may therefore stem from the newly developed habits and
requirements of ATO employees to regularly submit voluntary reports through the ATO
VSRP. Since the ASRS was used to compile the data for this study, the new ATO SMS
policies, by creating a culture more attuned to voluntary reporting, may have affected the
reports that were submitted and gathered herein. Further work should be done to
examine whether the increase in reporting volume correlated with this aspect of the new
policies in particular, in order to affirm or rule out its influence, and a cross-examination
of incident reports from both data systems could yield more complete results.
While aberrations or independent factors unrelated to these new policies may
also have contributed to this significant increase in aviation incidents, further work is
needed to ascertain the specifics of this possibility. It would also be beneficial to
examine the volume and specifics of the aviation incidents that occurred both before and
after the SMS policies were instated in order to discern the policies’ effectiveness in the
various categories available to professionals on the ASRS reports, e.g., “ATC Equipment
/ Navigation Facility / Buildings” and others. Such detailed consideration of these data
may better inform aviation personnel of the advantages and disadvantages of these
policies, and if the new policies are found to correlate with an increasing number of
incidents, it may be appropriate to make further adjustments.
 
  
REFERENCES
Automatic Dependent Surveillance—Broadcast (ADS–B) Out Performance
Requirements to Support Air Traffic Control Service (14 CFR Part 91). (2010).
Federal Register, 75(103): 30160-30195.
1997-2015 Update to FAA Historical Chronology: Civil Aviation and the Federal
Government, 1926-1996. (2016, Jan 5). Federal Aviation Administration online.
Retrieved from https://www.faa.gov/about/history/media/final_1997-
2015_chronology.pdf
Airports Council International – North America (2015). 2013 North American Airport
Traffic Summary. Retrieved from http://www.aci-na.org/content/airport-traffic-
reports
About FAA. (2015, Dec 29). Federal Aviation Administration online. Retrieved from
http://www.faa.gov/about/
Advance Notice of Proposed Rulemaking (ANPRM; 14 CFR Parts 21, 119, 121, 125, 135,
141, 142, and 145). (2009, Jul 23). Proposed Rules. Federal Register, 74(140):
36414-36413. Retrieved from https://www.gpo.gov/fdsys/pkg/FR-2009-07-
23/pdf/E9-17553.pdf
Air Traffic Organization 2014 Safety Report. (2014). FAA Air Traffic Organization.
Retrieved from https://www.faa.gov/about/office_org/headquarters_offices/ato/
service_units/safety/media/ato-2014-safety-report.pdf
Air Traffic Organization Policy (Order JO 1000.37A). (2014). FAA Air Traffic
Organization. Retrieved from http://www.faa.gov/documentLibrary/media/
Order/1000-37A_ATO_Safety_Management_System_508CFINAL.pdf
Air Traffic Organization Policy (Order JO 7200.20). (2011). FAA Air Traffic
Organization. Retrieved from http://www.faa.gov/regulations_policies/orders_
notices/index.cfm/go/document.information/documentID/322841
Airline Safety and Federal Aviation Administration Extension Act of 2010 (PL 111-216).
(2010). Congress.gov. Retrieved from https://www.congress.gov/111/plaws/publ
216/PLAW-111publ216.pdf
Airspace Designations and Reporting Points (Order JO 7400.9X). (2013). FAA Air
Traffic Organization. Retrieved from http://www.faa.gov/documentLibrary/
media/Order/JO_7400.9X.pdf
ASRS Program Briefing. (2014). Aviation Safety Reporting System, NASA. Retrieved
from http://asrs.arc.nasa.gov/docs/ASRS_ProgramBriefing2013.pdf
ASRS: The Case for Confidential Incident Reporting Systems. (n.a.). NASA ASRS, Pub.
60: 1-7. White paper. Retrieved from http://asrs.arc.nasa.gov/docs/rs/60_Case_
for_Confidential_Incident_Reporting.pdf
Better Quality and More Complete Data Could Help FAA Further Improve Safety
Oversight. (2013). GAO Reports, 4-11. Retrieved from
http://web.b.ebscohost.com.libaccess
.sjlibrary.org/ehost/pdfviewer/pdfviewer?sid=af5a1bdd-9ba4-4494-b14b-
74d223452cbf%40sessionmgr111&vid=1&hid=106
Dillingham, G. L., & Rhodes, K. A. (2003). Better Cost Data Could Improve FAA’s
Management of the Standard Terminal Automation Replacement System. U. S.
GAO online. Retrieved from http://www.gao.gov/assets/240/237145.pdf
Executive Order 13180. (2000, Dec 7). Presidential Documents. Federal Register,
65(238): 77493-77494. Retrieved from https://www.gpo.gov/fdsys/pkg/FR-2000-
12-11/pdf/00-31697.pdf
FAA Air Traffic Activity System, Airport Operations (2016). FAA online. Retrieved from
http://aspm.faa.gov/opsnet/sys/Airport.asp
FAA Efforts Have Improved Safety, but Challenges Remain in Key Areas. (2013). GAO
Reports, 1. Retrieved from http://libaccess.sjlibrary.org/login?url=http://search.
ebscohost.com/login.aspx?direct=true&db=bth&AN=87283091&site=ehost-live
FAA Final Rule Requires Safety Management System for Airlines. (2015). Thomas Net
News, 1. Retrieved from http://libaccess.sjlibrary.org/login?url=http://search.
ebscohost.com/login.aspx?direct=true&db=bwh&AN=101295470&site=ehost-
live
FAA Historical Chronology, 1926-1996. (2011, Feb 25). FAA online. Retrieved from
https://www.faa.gov/about/media/b-chron.pdf
FAA Requires Data-driven Safety Management Systems. (2015). Air Transport
World, 52(2), 12. Retrieved from http://libaccess.sjlibrary.org/login?url=http://
search.ebscohost.com/login.aspx?direct=true&db=bft&AN=100963177&site=eho
st-live
Federal Aviation Administration (FAA). (2014, Feb 11). DOT Bureau of Transportation
Statistics online. Retrieved from http://www.rita.dot.gov/bts/node/265521
Flener, W. M. (1968, Aug 28). Aeronautical Beacons and True Lights (AC NO:
170/6850-1). DOT National Transportation Library online. Retrieved from
http://dotlibrary.specialcollection.net/Document?db=DOT-
ADVISORY&query=(select+0+(byhits+(field+DOCUMENT+(phrase+Air+Com
merce+Act))))
H.R. 5900. (2010, Jul 28). 111th Congress, 2nd Session. U.S. Government Printing
Office online. Retrieved from https://www.gpo.gov/fdsys/pkg/BILLS-
111hr5900eh/pdf/BILLS-111hr5900eh.pdf
History. (2015, Feb 19). Federal Aviation Administration online. Retrieved from
http://www.faa.gov/about/history/brief_history/
Johnson, W. (2012). SMS Jargon and Collecting Predictive Data. Airport Business, 26(3),
28-30. Retrieved from http://libaccess.sjlibrary.org/login?url=http://search.
ebscohost.com/login.aspx?direct=true&db=hjh&AN=71797745&site=ehost-live
Kent, R. J. (1980). Safe, Separated, and Soaring: A History of Federal Civil Aviation
Policy, 1961-1972. U.S. DOT: Federal Aviation Administration.
Komons, N. A. (1978). Bonfires to Beacons: Federal Civil Aviation Policy Under the Air
Commerce Act, 1926-1938. U.S. DOT: Federal Aviation Administration.
Larson, G. C. (2010). The Dubious Dawning of SMS. Business & Commercial
Aviation, 106(10), 78. Retrieved from http://libaccess.sjlibrary.org/login?url=
http://search.ebscohost.com/login.aspx?direct=true&db=bth&AN=55602109&site
=ehost-live
Lercel, D. J. (2013). Safety management systems in FAA part 145 repair stations:
Barriers and opportunities (Order No. 3587351). Available from ProQuest
Dissertations & Theses Full Text: The Humanities and Social Sciences Collection.
(1426441281). Retrieved from http://search.proquest.com.libaccess.sjlibrary.org/
docview/1426441281?accountid=10361
McNeely, S. C. (2012). Examining the relationship between organizational safety and
culture and safety management system implementation in aviation (Order No.
3504812). Available from ProQuest Dissertations & Theses Full Text: The
Humanities and Social Sciences Collection; ProQuest Dissertations & Theses Full
Text: The Sciences and Engineering Collection. (1002445201). Retrieved from
http://search.proquest.com.libaccess.sjlibrary.org/docview/1002445201?accountid
=10361
Murphy, R. P. (1999). Whether the Airport and Airway Trust Fund was created solely to
finance aviation infrastructure, Letter to The Honorable Frank R. Wolf Chairman,
Subcommittee on Transportation and Related Agencies. U.S. General Accounting
Office. Retrieved from http://avstop.com/history/needregulations/281779.pdf
NASA Aviation Safety Reporting System, General Form. (2009). NASA.gov. Retrieved
from https://titan-server.arc.nasa.gov/HTML_ERS/general.html
Planning Glossary. (2012, Mar 21). DOT Federal Highway Administration online.
Retrieved from http://www.fhwa.dot.gov/planning/glossary/glossary_listing.
cfm?TitleStart=F
Pierobon, M. (2015). Growing influence. Asian Aviation Magazine, 13(9), 30-32.
Preston, E. (1987). Troubled Passage: The Federal Aviation Administration During the
Nixon-Ford Term, 1973-1977. U.S. DOT: Federal Aviation Administration.
Procedures for Handling Airspace Matters (Order JO 7400.2J). (2012). FAA Air Traffic
Organization. Retrieved from http://www.faa.gov/documentLibrary/media/Order/
AIR.pdf
Rochester, S. I. (1976). Takeoff at Mid-Century: Federal Civil Aviation Policy in the
Eisenhower Years, 1953-1961. U.S. DOT: Federal Aviation Administration.
SMS Quick Reference Guide. (2015, March). FAA online. Retrieved from
http://www.faa.gov/about/office_org/headquarters_offices/avs/offices/afs/afs900/s
ms/media/newsletter/sms_qr_guide.pdf
Safety Management. (2012). FAA Air Traffic Organization. Retrieved from
http://energy.gov/sites/prod/files/2013/12/f5/DeNicuolo.pdf
Safety Management System (SMS), Manual Version 4.0. (2014). FAA Air Traffic
Organization. Retrieved from https://www.faa.gov/air_traffic/publications/media/
faa_ato_SMS_manual_v4_20140901.pdf
Safety Management Systems Update (N 8900.133). (2010). Retrieved from
http://fsims.faa.gov/PICDetail.aspx?docId=F7720BB569E0804A862577930062B
C2E
Swickard, J. (2010). FAA begins safety management system for national airspace.
Business & Commercial Aviation, 106(5), 21. Retrieved from
http://libaccess.sjlibrary.org/login?url=http://search.ebscohost.com/login.aspx?dir
ect=true&db=bth&AN=51672404&site=ehost-live
Title 49: Transportation. (2016). U.S. Government Publishing Office Online (U.S.
Department of Transportation, Code of Federal Regulations, Title
49.B.VIII.830.A §830.2). Retrieved from http://www.ecfr.gov/cgi-bin/text-
idx?SID=8307dfccbf197a2501ae5353856b7c44&mc=true&node=se49.7.830_12
&rgn=div8
Wilson, J. R. M. (1979). Turbulence Aloft: The Civil Aeronautics Administration Amid
Wars and Rumors of Wars, 1938-1953. U.S. DOT: Federal Aviation
Administration.
APPENDICES
Appendix I
Table A
Total Movements for all Class B Airports
Mvmts in 2014 IATA Airport Code Mvmts in 2015
878,366 ATL 875,606
368,288 BOS 370,429
248,249 BWI 244,748
152,665 CLE 116,496
546,747 CLT 545,658
133,507 CVG 133,656
287,731 DCA 295,103
579,335 DEN 554,314
672,891 DFW 684,595
402,405 DTW 381,403
404,435 EWR 413,382
206,291 HOU 199,987
302,634 HNL 315,841
320,554 IAD 300,528
503,529 IAH 508,853
423,613 JFK 441,127
522,588 LAS 520,739
632,485 LAX 645,467
370,132 LGA 369,773
295,120 MCO 309,670
68,796 ADW 65,312
220,989 MEM 219,146
400,700 MIA 407,271
419,789 MSP 405,279
880,061 ORD 881,128
127,299 MCI 121,612
419,831 PHL 413,437
431,416 PHX 437,771
136,322 PIT 139,899
188,670 SAN 195,003
330,809 SEA 368,197
431,710 SFO 429,504
324,878 SLC 317,085
184,142 STL 185,430
184,036 TPA 189,313
153,434 NKX 180,696
126,089 MSY 128,703
Source: FAA Air Traffic Activity System, Airport Operations (2016). FAA online. Retrieved from
http://aspm.faa.gov/opsnet/sys/Airport.asp
Appendix II
Incident Report Example with ATC as a Contributing Factor
Assessments
Contributing Factors / Situations : Weather
Contributing Factors / Situations : Human Factors
Contributing Factors / Situations : Equipment / Tooling
Contributing Factors / Situations : ATC Equipment / Nav Facility / Buildings
Primary Problem : Equipment / Tooling

ACN: 1280574

Time / Day
Date : 201507
Local Time Of Day : 1801-2400

Place
Locale Reference.Airport : LAX.Airport
State Reference : CA
Altitude.MSL.Single Value : 100

Environment
Flight Conditions : IMC
Light : Night

Aircraft
Reference : X
ATC / Advisory.Tower : LAX
Aircraft Operator : Air Carrier
Make Model Name : B737 Undifferentiated or Other Model
Crew Size.Number Of Crew : 2
Operating Under FAR Part : Part 121
Flight Plan : IFR
Mission : Passenger
Flight Phase : Landing

Person
Reference : 1
Location Of Person.Aircraft : X
Location In Aircraft : Flight Deck
Reporter Organization : Air Carrier
Function.Flight Crew : First Officer
Function.Flight Crew : Pilot Flying
Qualification.Flight Crew : Air Transport Pilot (ATP)
Experience.Flight Crew.Last 90 Days : 151
Experience.Flight Crew.Type : 1694
ASRS Report Number.Accession Number : 1280574
Human Factors : Distraction

Events
Anomaly.Inflight Event / Encounter : Other / Unknown
Detector.Person : Flight Crew
When Detected : In-flight
Result.Flight Crew : Returned to Clearance

Assessments
Contributing Factors / Situations : Weather
Contributing Factors / Situations : Human Factors
Contributing Factors / Situations : Equipment / Tooling
Contributing Factors / Situations : ATC Equipment / Nav Facility / Buildings
Primary Problem : Equipment / Tooling

Narrative: 1
During night flight and after break out at 200 feet and 1/2 mile or better, the new LED
runway lights at 25L were so bright that the actual runway could not be seen normally.
To clarify, the approach lights and runway outer marking lights may be bright, but the
newer, high intensity runway centerline and landing threshold LED lights are far too
bright for safe operation, and the high intensity, especially during hazy or foggy
visibility conditions, makes it very difficult to actually see the runway itself.

Synopsis
Pilot reports of LED that are so bright that it was difficult to see the actual runway.

Incident Report Example with ATC as a Primary Problem
Assessments
Contributing Factors / Situations : ATC Equipment / Nav Facility / Buildings
Primary Problem : ATC Equipment / Nav Facility / Buildings

ACN: 1248258

Time / Day
Date : 201503
Local Time Of Day : 1201-1800

Place
Locale Reference.Airport : LAX.Airport
State Reference : CA
Altitude.AGL.Single Value : 0

Aircraft
Reference : X
ATC / Advisory.Tower : LAX
Make Model Name : Light Transport, Low Wing, 2 Turboprop Eng.
Crew Size.Number Of Crew : 2
Flight Plan : IFR
Flight Phase : Taxi
Route In Use : None
Airspace.Class B : LAX

Person
Reference : 1
Location Of Person.Facility : LAX.TOWER
Reporter Organization : Government
Function.Air Traffic Control : Local
Qualification.Air Traffic Control : Fully Certified
Experience.Air Traffic Control.Time Certified In Pos 1 (yrs) : 6
ASRS Report Number.Accession Number : 1248258
Human Factors : Confusion
Human Factors : Human-Machine Interface
Human Factors : Situational Awareness
Human Factors : Troubleshooting
Human Factors : Distraction

Events
Anomaly.ATC Issue : All Types
Anomaly.Deviation - Procedural : Other / Unknown
Detector.Person : Air Traffic Control
When Detected : Taxi

Assessments
Contributing Factors / Situations : ATC Equipment / Nav Facility / Buildings
Primary Problem : ATC Equipment / Nav Facility / Buildings
Thesis

More Related Content

Viewers also liked

Viewers also liked (7)

L'edat mitjana - 6è
L'edat mitjana - 6èL'edat mitjana - 6è
L'edat mitjana - 6è
 
Equity and Differentiation
Equity and Differentiation Equity and Differentiation
Equity and Differentiation
 
Marca Personal PDF
Marca Personal PDFMarca Personal PDF
Marca Personal PDF
 
RECLUTAMIENTO 2.0 EN LINKEDIN CASO DE ÉXITO
RECLUTAMIENTO 2.0 EN LINKEDIN CASO DE ÉXITORECLUTAMIENTO 2.0 EN LINKEDIN CASO DE ÉXITO
RECLUTAMIENTO 2.0 EN LINKEDIN CASO DE ÉXITO
 
Давні слов'яни
Давні слов'яниДавні слов'яни
Давні слов'яни
 
Xamarin forms en el mundo real
Xamarin forms en el mundo realXamarin forms en el mundo real
Xamarin forms en el mundo real
 
Epiglotitis
EpiglotitisEpiglotitis
Epiglotitis
 

Similar to Thesis

Yi-Fan (Tom) Chen's thesis
Yi-Fan (Tom) Chen's thesisYi-Fan (Tom) Chen's thesis
Yi-Fan (Tom) Chen's thesisTom Chen
 
Coughlin - Thesis - Development of a Forecasting Model of Naval Aviator Rete...
Coughlin - Thesis -  Development of a Forecasting Model of Naval Aviator Rete...Coughlin - Thesis -  Development of a Forecasting Model of Naval Aviator Rete...
Coughlin - Thesis - Development of a Forecasting Model of Naval Aviator Rete...Matt Coughlin C.M.
 
Investigating Geographic Information System Technologies A Global Positioning...
Investigating Geographic Information System Technologies A Global Positioning...Investigating Geographic Information System Technologies A Global Positioning...
Investigating Geographic Information System Technologies A Global Positioning...Simon Sweeney
 
450595389ITLS5200INDIVIDUALASSIGNMENTS22015
450595389ITLS5200INDIVIDUALASSIGNMENTS22015450595389ITLS5200INDIVIDUALASSIGNMENTS22015
450595389ITLS5200INDIVIDUALASSIGNMENTS22015Leonard Ong
 
PA DEP Technical Support Document for Marcellus Air Study in SW PA
PA DEP Technical Support Document for Marcellus Air Study in SW PAPA DEP Technical Support Document for Marcellus Air Study in SW PA
PA DEP Technical Support Document for Marcellus Air Study in SW PAMarcellus Drilling News
 
Dissertation_Capital Structure final
Dissertation_Capital Structure finalDissertation_Capital Structure final
Dissertation_Capital Structure finalJasmin Taylor
 
We care-linking unpaid care work and mobile value added extension services i...
We care-linking unpaid care work and mobile value added extension services  i...We care-linking unpaid care work and mobile value added extension services  i...
We care-linking unpaid care work and mobile value added extension services i...Farm Radio Trust Mw
 
INL-EXT-16-39808 CBP_ Design_Guidance
INL-EXT-16-39808 CBP_ Design_GuidanceINL-EXT-16-39808 CBP_ Design_Guidance
INL-EXT-16-39808 CBP_ Design_GuidanceJohanna Oxstrand
 
INFORMS AAS Newsletter Spring 2013 - Copy
INFORMS AAS Newsletter Spring 2013 - CopyINFORMS AAS Newsletter Spring 2013 - Copy
INFORMS AAS Newsletter Spring 2013 - CopyBenjamin Levy
 
EPANET 2 Users Manual
EPANET 2 Users ManualEPANET 2 Users Manual
EPANET 2 Users ManualMawar 99
 
National environmental quality_standards_assignment-5
National environmental quality_standards_assignment-5National environmental quality_standards_assignment-5
National environmental quality_standards_assignment-5fahadkhalil1978
 
Doer biomass emissions & safety regulations
Doer biomass emissions & safety regulationsDoer biomass emissions & safety regulations
Doer biomass emissions & safety regulationsCarlos Mendes
 
FAA Integration of Civil UAS in National Airspace Roadmap 2013
FAA Integration of Civil UAS in National Airspace Roadmap 2013FAA Integration of Civil UAS in National Airspace Roadmap 2013
FAA Integration of Civil UAS in National Airspace Roadmap 2013Tom "Blad" Lindblad
 
ENERGY CONSUMPTION AND REAL GDP IN ASEAN.pdf
ENERGY CONSUMPTION AND REAL GDP IN ASEAN.pdfENERGY CONSUMPTION AND REAL GDP IN ASEAN.pdf
ENERGY CONSUMPTION AND REAL GDP IN ASEAN.pdfHanaTiti
 
Cesar working document 3 urban strategy experiment 2
Cesar working document 3 urban strategy experiment 2Cesar working document 3 urban strategy experiment 2
Cesar working document 3 urban strategy experiment 2Marco
 

Similar to Thesis (20)

Yi-Fan (Tom) Chen's thesis
Yi-Fan (Tom) Chen's thesisYi-Fan (Tom) Chen's thesis
Yi-Fan (Tom) Chen's thesis
 
Coughlin - Thesis - Development of a Forecasting Model of Naval Aviator Rete...
Coughlin - Thesis -  Development of a Forecasting Model of Naval Aviator Rete...Coughlin - Thesis -  Development of a Forecasting Model of Naval Aviator Rete...
Coughlin - Thesis - Development of a Forecasting Model of Naval Aviator Rete...
 
Investigating Geographic Information System Technologies A Global Positioning...
Investigating Geographic Information System Technologies A Global Positioning...Investigating Geographic Information System Technologies A Global Positioning...
Investigating Geographic Information System Technologies A Global Positioning...
 
450595389ITLS5200INDIVIDUALASSIGNMENTS22015
450595389ITLS5200INDIVIDUALASSIGNMENTS22015450595389ITLS5200INDIVIDUALASSIGNMENTS22015
450595389ITLS5200INDIVIDUALASSIGNMENTS22015
 
PA DEP Technical Support Document for Marcellus Air Study in SW PA
PA DEP Technical Support Document for Marcellus Air Study in SW PAPA DEP Technical Support Document for Marcellus Air Study in SW PA
PA DEP Technical Support Document for Marcellus Air Study in SW PA
 
Dissertation_Capital Structure final
Dissertation_Capital Structure finalDissertation_Capital Structure final
Dissertation_Capital Structure final
 
We care-linking unpaid care work and mobile value added extension services i...
We care-linking unpaid care work and mobile value added extension services  i...We care-linking unpaid care work and mobile value added extension services  i...
We care-linking unpaid care work and mobile value added extension services i...
 
INL-EXT-16-39808 CBP_ Design_Guidance
INL-EXT-16-39808 CBP_ Design_GuidanceINL-EXT-16-39808 CBP_ Design_Guidance
INL-EXT-16-39808 CBP_ Design_Guidance
 
Dacota sunflower
Dacota sunflowerDacota sunflower
Dacota sunflower
 
INFORMS AAS Newsletter Spring 2013 - Copy
INFORMS AAS Newsletter Spring 2013 - CopyINFORMS AAS Newsletter Spring 2013 - Copy
INFORMS AAS Newsletter Spring 2013 - Copy
 
EPANET 2 Users Manual
EPANET 2 Users ManualEPANET 2 Users Manual
EPANET 2 Users Manual
 
National environmental quality_standards_assignment-5
National environmental quality_standards_assignment-5National environmental quality_standards_assignment-5
National environmental quality_standards_assignment-5
 
Doer biomass emissions & safety regulations
Doer biomass emissions & safety regulationsDoer biomass emissions & safety regulations
Doer biomass emissions & safety regulations
 
UAS roadmap 2013
UAS roadmap 2013UAS roadmap 2013
UAS roadmap 2013
 
FAA Integration of Civil UAS in National Airspace Roadmap 2013
FAA Integration of Civil UAS in National Airspace Roadmap 2013FAA Integration of Civil UAS in National Airspace Roadmap 2013
FAA Integration of Civil UAS in National Airspace Roadmap 2013
 
ENERGY CONSUMPTION AND REAL GDP IN ASEAN.pdf
ENERGY CONSUMPTION AND REAL GDP IN ASEAN.pdfENERGY CONSUMPTION AND REAL GDP IN ASEAN.pdf
ENERGY CONSUMPTION AND REAL GDP IN ASEAN.pdf
 
Tutorial epanet
Tutorial epanetTutorial epanet
Tutorial epanet
 
P1007 wwu
P1007 wwuP1007 wwu
P1007 wwu
 
Manual epanet
Manual epanetManual epanet
Manual epanet
 
Cesar working document 3 urban strategy experiment 2
Cesar working document 3 urban strategy experiment 2Cesar working document 3 urban strategy experiment 2
Cesar working document 3 urban strategy experiment 2
 

Thesis

  • 1. HAVE THE NEW SAFETY MANAGEMENT SYSTEM (SMS) POLICIES INSTATED IN 2014 BY THE AIR TRAFFIC ORGANIZATION (ATO) AFFECTED THE NUMBER OF AVIATION INCIDENTS? Analysis Using NASA Aviation Safety Reporting System Data A Project Presented to The Faculty of the Department of Aviation and Technology San José State University In Partial Fulfillment of the Requirements for the Degree Master of Science in Quality Assurance by May 2016
  • 2.   ii © 2016 ALL RIGHTS RESERVED The Designated Project Committee Approves the Project Titled
  • 3.   iii HAVE THE NEW SAFETY MANAGEMENT SYSTEM (SMS) POLICIES INSTATED IN 2014 BY THE AIR TRAFFIC ORGANIZATION (ATO) AFFECTED THE NUMBER OF AVIATION INCIDENTS? Analysis Using NASA Aviation Safety Reporting System Data by NABEEL OQBI APPROVED FOR THE DEPARTMENT OF AVIATION AND TECHNOLOGY SAN JOSÉ STATE UNIVERSITY May 2016 ________________________________________________________ Dr. Seth Bates Department of Aviation and Technology _________________________________________________________ Professor. Daniel Neal Department of Aviation and Technology Seth P. Bates Digitally signed by Seth P. Bates DN: cn=Seth P. Bates, o=San Jose State University, ou=Aviation and Technology Department, College of Engineering, email=seth.bates@sjsu.edu, c=US Date: 2016.06.07 22:49:32 -07'00' Seth P. Bates Digitally signed by Seth P. Bates DN: cn=Seth P. Bates, o=San Jose State University, ou=Aviation and Technology Department, College of Engineering, email=seth.bates@sjsu.edu, c=US Date: 2016.06.07 22:51:30 -07'00'
  • 4.   iv ABSTRACT HAVE THE NEW SAFETY MANAGEMENT SYSTEM (SMS) POLICIES INSTATED IN 2014 BY THE AIR TRAFFIC ORGANIZATION (ATO) AFFECTED THE NUMBER OF AVIATION INCIDENTS? Analysis Using NASA Aviation Safety Reporting System Data Prior to 2014, studies have indicated that a number of defects exist amongst the ATO SMS policies and protocol—defects that may be factors in unwanted airline accidents or outcomes. To address these risks, the ATO enacted policy changes aimed at ensuring a safer provision for air traffic and reducing the number of airline incidents. These new SMS policies, which were effective September 1, 2014, addressed safety assurance activities, ATO personnel roles and responsibilities, and introduced safety management as part of the SMS in efforts to alleviate future incidents. This study examined aircraft incidents related to the ATO from the Aviation Safety Reporting System database, pre- and post-policy change, in order to discern whether these policies have effected unwanted incidents. The 37 Class B airspace airports in the United States were examined, and data regarding incidents related to the ATO from the two 12-month pre- and post-periods were analyzed. Findings indicated that the newly instated 2014 ATO SMS policies actually resulted in a significant increase in the number of aviation incidents at the studied airports, and recommendations for future research suggest that further work be done with special attention paid to the official ASRS incident reports as to the causes of these incidents, and their correlations to SMS policies.
  • 5.     v ACKNOWLEDGEMENTS I would like to thank my first reader, Professor Daniel Neal, who helped me with some important initial thoughts in this study. And I would also like to thank my second reader, Dr. Seth Bates, who helped me hone those thoughts into a cohesive work that will all to the breadth of literature on the topic. Moreover, a big thank you to my mother for always keeping me in her prayers, as well as to my wife, who supported and encouraged me throughout this long process.
  • 6.     vi TABLE OF CONTENTS CHAPTER I—INTRODUCTION.......................................................................................1 Background..............................................................................................................1 Introduction to Project .............................................................................................3 The Aviation Safety Reporting System (ASRS)......................................................4 Acronyms.................................................................................................................7 Statement of Problem...............................................................................................7 Research Question ...................................................................................................8 Hypothesis................................................................................................................8 Objective..................................................................................................................9 CHAPTER II—LITERATURE REVIEW ........................................................................10 Introduction to Literature Review..........................................................................10 The Federal Aviation Administration (FAA) ........................................................10 Early Aviation in the United States ...........................................................11 Origin of the FAA......................................................................................13 Duties of the FAA...........................................................................15 The Air Traffic Organization (ATO).....................................................................16 The Safety Management System (SMS)................................................................16 SMS Components ......................................................................................18 The Aviation Safety Reporting System (ASRS)....................................................20 ASRS Report Processing ...........................................................................21 CHAPTER III—METHODOLOGY .................................................................................23 Study Sample .........................................................................................................23 Class B Airspace Designations..................................................................23 Top 15 Movement Airports .......................................................................25 Top 15 Airports with Highest Incident Reports.........................................26 ASRS Search Criteria ............................................................................................26 Data Analysis.........................................................................................................27 Limitations.............................................................................................................27 CHAPTER IV—RESULTS AND DISCUSSION ............................................................29 T-test ......................................................................................................................31 T-test Results on Three Examined Groups................................................32 T-test Results on Written Incident Reports................................................42 
Discussion..............................................................................................................45 Recommendations for Future Work.......................................................................47
  • 7.     vii CHAPTER V—CONCLUSION........................................................................................49 REFERENCES ..................................................................................................................52 APPENDICES ...................................................................................................................57 Appendix I .............................................................................................................57 Appendix II............................................................................................................58
  • 8.     viii LIST OF TABLES Table 1. U.S. Aviation Key Events and Milestones ..........................................................12 Table 2. ATO SMS Key Events and Milestones ...............................................................17 Table 3. U.S. Class B Airspace Sites (37) .........................................................................24 Table 4. Top 15 Total Aircraft Movement Airports ..........................................................26 Table 5. Statistical Data on Incident Reports, Pre- and Post-2014 Policy Instatement.....29 Table 6. Number of Incidents, Class B Airports................................................................30 Table 7. t-Test on All Class B Airport Data ......................................................................33 Table 8. t-Test on Class B Airports with the Most Movement..........................................34 Table 9. t-Test on Class B Airports with the Highest Reported Incidents.........................36 Table 10. t-Test on 18-Month Incident Reports.................................................................38 Table 11. Number of Incidents, Class B Airports for 18-Month Study Period .................39 Table 12. t-Test on Total Movement at All Class B Airports............................................40 Table 13. t-Test on Total Movement at Class B Airports, with Outliers Removed ..........42 Table 14. Number of Incidents with ATC Noted as a Primary Problem...........................43 Table 15. t-Test on Incidents with ATC Noted as a Primary Problem..............................44 Table A. Total Movements for all Class B Airports..........................................................57
  • 9.     ix LIST OF FIGURES Figure 1. t-Test Visualization of All Class B Airport Data ..............................................33 Figure 2. t-Test Visualization on Class B Airports with the Most Movement .................35 Figure 3. t-Test Visualization on Airports with the Highest Reported Incidents .............36 Figure 4. t-Test Visualization on 18-Month Incident Reports..........................................38 Figure 5. t-Test Visualization on Total Movement at All Class B Airports.....................41 Figure 6. t-Test Visualization on Class B Airports, with Outliers Removed ...................42 Figure 7. t-Test Visualization on Incidents with ATC as a Primary Problem..................44
  • 10. CHAPTER I—INTRODUCTION Background Under the U.S. Federal Aviation Administration (FAA), the Air Traffic Organization (ATO) is tasked with ensuring efficient, safe air navigation in the National Airspace System (NAS) and in both oceanic and international airspace controlled by the United States (ATO Policy, 2014). Utilizing the framework of their Safety Management System (SMS), which was first conceptualized in 2004, the ATO has continued to improve and hone safety protocol to foster more-secure air navigation in the ever- evolving air transportation industry (Safety Management, 2012). However, while data collected under SMS procedures have indicated that employing the system has been a proactive approach to monitoring the safety of aviation operations overall, a number of reports in recent years identify hindrances existing within the SMS infrastructure regarding key areas in aviation safety. A 2003 report by the U.S. Government Accountability Office (GAO) indicated superfluous safety procedures and inadequate spending in certain areas (Dillingham & Rhodes, 2003), and a 2013 GAO study, which analyzed efforts to improve aviation safety, acknowledged many of the challenges the FAA faces and noted the administration’s continuing endeavors to create a more comprehensive and accurate way to assess and manage risk through further development of the SMS (FAA Efforts, 2013). By continuing to collect and analyze data on all aspects of aviation operations using the SMS, the ATO should be better able to identify emerging problems with safety and to anticipate – and, therefore, reduce – potential problems that may result in injury,
  • 11.     2 death, or significant property damage (FAA Efforts, 2013). In April 2013, the key areas identified by the GAO as needing more attention by the SMS included: (a) runway and ramp safety, (b) airborne operational errors, (c) general aviation, and (d) pilot training. The report indicated that the SMS data collection protocols for runway and ramp safety were significantly insufficient and “limited to certain types of incidents, notably runway incursions … which [do] not include runway overruns” (FAA Efforts, 2013). Regarding airborne operational errors, evidence showed that the SMS metric for operational errors was “too narrow to account for all potential risk” (FAA Efforts, 2013). The SMS estimates of annual flight hours used to evaluate risk are also limited to the general aviation sector, and do not include commercial and military data, which may render such estimates unreliable. Additionally, the GAO noted that the SMS did not include sufficient regulations needed to measure pilot school inspection requirements. The GAO review concluded by noting that though the FAA and the ATO were actively taking steps toward resolving these issues, it would likely take several years before enough data could be gathered to properly assess such risks and put new policies in place. Less than 17 months later, however – on September 1, 2014 – the ATO introduced a new version of the SMS with updated policies aimed at alleviating strategic defects and aviation incidents (ATO Policy, 2014; ATO Safety Report, 2014; SMS Manual Version 4.0, 2014). Though there is no formal indication that these new policies were spurred by the GAO study, the updated SMS manual includes developments that both cancel former policies and broaden the scope of the ATO’s monitoring of air traffic control and air navigation services according to official order JO 1000.37A (ATO Policy, 2014).
  • 12.     3 This order provides a summarized explanation of the SMS policy changes put in effect in September 2014, which are relevant to the methodology of this study. According to Order JO 1000.37A, the SMS policy changes that were made included: (a) Addressing continued ATO SMS management and improvement; (b) Reinforcing ATO Safety Assurance activities; (c) Establishing ATO SMS roles and responsibilities at all levels of the organization; (d) Establishing that “the ATO COO may permit temporary continued use of an operation or system with an existing high-risk hazard to allow the responsible Service Unit to develop and implement a plan that would mitigate the risk or eliminate the hazard;” (e) Introducing integrated safety management as part of the SMS; and (f) Establishing the ATO Safety Manager (ATO Safety and Technical Training [AJI] Group Manager for Safety Management) and ATO Chief Safety Engineer positions (ATO Policy, 2014, 1-2). Introduction to Project This study examined aviation incident reports documented in the Aviation Safety Reporting System (ASRS) database. Specifically, data from incidents related to the ATO that were reported from the year (12 months) preceding the new SMS policies that took effect on September 1, 2014, and from the year following the order were selected from the ASRS for this study.
  • 13.     4 This study’s purpose was to determine the number of flight incidents related to the ATO before and after the execution of these new SMS policies in order to establish whether significant statistical fluctuations exist between the two examined incident-outcome periods. A determination that a significant statistical difference exists in the number of incidents associated with the ATO between the before-and-after examined periods may be indicative of the effectiveness of these policy adjustments. The Aviation Safety Reporting System (ASRS) Established by the National Aeronautics and Space Administration (NASA), the ASRS is used to identify issues in the aviation system that may require attention (NASA ASRS, 2009). Data from occurrences are reported by aviation personnel (including pilots, mechanics, and air traffic controllers) to ASRS staff, and are then supplied to the FAA and used to prevent future incidents (ASRS Program Briefing, 2014). This informing and evaluation process may be limited, as not all incidents are reported and those that are reported may include bias or misinformation. The occurrences reported via the ASRS, which are submitted by various aviation personnel, are coded according to a number of defining signifiers set in place by the Department of Transportation (DOT) and predetermined elements related to different types of occurrences. Specifics from the collective data on these reported issues can be cross-referenced and examined to expose unsafe conditions or protocols that lead to unwanted outcomes. Therefore, the information gathered by the ASRS can be explored to uncover patterns leading to potential hazards, and can be used by aviation administrators to create rules and regulations aimed at avoiding such hazards.
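The kind of cross-referencing described above can be illustrated with a small sketch. The ASRS data used in this study were retrieved through the ASRS online database search rather than through code; the snippet below is purely illustrative, and the field names (date, airport, factor) are hypothetical simplifications of exported report fields.

    from collections import Counter
    from datetime import date

    # Hypothetical, simplified ASRS-style records; field names are illustrative only.
    reports = [
        {"date": date(2014, 3, 2), "airport": "LAX", "factor": "ATC Equipment"},
        {"date": date(2014, 11, 8), "airport": "LAX", "factor": "ATC Equipment"},
        {"date": date(2015, 1, 15), "airport": "CLE", "factor": "NAV Facility"},
    ]

    # The two 12-month windows bracketing the September 1, 2014 policy change.
    PRE = (date(2013, 9, 1), date(2014, 8, 31))
    POST = (date(2014, 9, 1), date(2015, 8, 31))

    def in_window(report, window):
        # True when the report date falls inside the closed date window.
        return window[0] <= report["date"] <= window[1]

    # Cross-reference: tally reports per contributing factor in each period.
    pre_factors = Counter(r["factor"] for r in reports if in_window(r, PRE))
    post_factors = Counter(r["factor"] for r in reports if in_window(r, POST))
    print(pre_factors, post_factors)

Grouping by airport code in the same way would yield per-airport counts of the sort tabulated later in Chapter IV.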
  • 14.     5 The types of issues that may be reported via the ASRS include (listed in order as they are presented on the official government title): (a) “aircraft accident,” (b) “civil aircraft,” (c) “fatal injury,” (d) “incident,” (e) “operator,” (f) “public aircraft,” (g) “serious injury,” (h) “substantial damage,” and (i) “unmanned aircraft accident” (Title 49, 2016). (a) Aircraft accident. This term refers to “an occurrence associated with the operation of an aircraft which takes place between the time any person boards the aircraft with the intention of flight and all such persons have disembarked, and in which any person suffers death or serious injury, or in which the aircraft receives substantial damage” (Title 49, 2016). (b) Civil aircraft. This term refers to “any aircraft other than a public aircraft” (c) Fatal injury. This term refers to “any injury [that] results in death within 30 days of the accident” (d) Incident. This term refers to “an occurrence other than an accident, associated with the operation of an aircraft, which affects or could affect the safety of operations” (e) Operator. This term refers to “any person who causes or authorizes the operation of an aircraft, such as the owner, lessee, or bailee of an aircraft” (f) Public aircraft. This term refers to “an aircraft used only for the United States Government, or an aircraft owned and operated (except for commercial purposes) or exclusively leased for at least 90 continuous days by a government other than the United States Government” (Title 49, 2016). Additionally, this term does not encompass aircraft
  • 15.     6 that are owned by the government and are transporting property for commercial purposes, as well as other stipulations. (g) Serious injury. This term refers to any injury that “(i) requires hospitalization for more than 48 hours, commencing within 7 days from the date of the injury was received; (ii) results in a fracture of any bone (except simple fractures of fingers, toes, or nose); (iii) causes severe hemorrhages, nerve, muscle, or tendon damage; (iv) involves any internal organ; or (v) involves second- or third-degree burns, or any burns affecting more than 5 percent of the body surface” (Title 49, 2016). (h) Substantial damage. This term refers to “damage or failure which adversely affects the structural strength, performance, or flight characteristics of the aircraft, and which would normally require major repair or replacement of the affected component” (Title 49, 2016). (i) Unmanned aircraft accident. This term refers to “an occurrence associated with the operation of any public or civil unmanned aircraft system that takes place between the time that the system is activated with the purpose of flight and the time that the system is deactivated at the conclusion of its mission, in which: (i) any person suffers death or serious injury; or (ii) the aircraft has a maximum gross takeoff weight of 300 pounds or greater and sustains substantial damage” (Title 49, 2016). The ASRS occurrences examined in the context of this study refer only to the above-mentioned “incidents;” however, it is important to discern these classifications from one another in order to have a richer understanding of this study’s implications and the duties of the ATO SMS.
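These Title 49 distinctions amount to a simple decision rule, which can be sketched as follows. The function and its inputs are illustrative simplifications of the quoted definitions, not an official classification tool.

    def classify_occurrence(death_or_serious_injury, substantial_damage, affects_safety_of_operations):
        # Rough screening rule derived from the Title 49 definitions quoted above.
        if death_or_serious_injury or substantial_damage:
            return "accident"            # meets the accident criteria
        if affects_safety_of_operations:
            return "incident"            # an occurrence other than an accident
        return "other occurrence"

    # Example: an event with no injury or damage that still affected the safety of operations.
    print(classify_occurrence(False, False, True))   # prints "incident"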
  • 16.     7 Acronyms A4A Airlines for America ACI Airports Council International AJI ATO Safety and Technical Training AOV Air Traffic Safety Oversight Service ASAP Aviation Safety Action Program ASRS Aviation Safety Reporting System ATC Air Traffic Control ATO Air Traffic Organization CAA Civil Aeronautics Authority CAB Civil Aeronautics Board CASS Continuing Analysis and Surveillance System C.A.S.E. Coordinating Agency for Supplier Evaluation COO Chief Operating Officer DOT Department of Transportation FAA Federal Aviation Administration IATA International Air Transport Association ICAO International Civil Aviation Organization LOB Line of Business LOSA Line Operations Safety Assessments. MRLOSA Maintenance and Ramp Line Operations Safety Assessment NAS National Airspace System NASA National Aeronautics and Space Administration NextGen Next Generation Air Transportation System SDP Service Delivery Point SMS Safety Management System SRM Safety Risk Management SRMGSA Safety Risk Management Guidance for System Acquisition VSRP Voluntary Safety Reporting Program Statement of Problem Many recent studies have shown the need for continued honing of the ATO SMS to alleviate key safety issues and to improve air transportation outcomes (ATO Safety Report, 2014; Better Quality, 2013; Dillingham, 2014; FAA Efforts, 2013; FAA Final Rule, 2015; FAA Requires, 2015). In an attempt to address these concerns, the ATO
  • 17.     8 revised many of its SMS policies. These revised policies went into effect on September 1, 2014, and it is yet to be seen whether or not they have been effective in inhibiting unwanted aviation outcomes. It would be beneficial to examine the volume and specifics of the aviation incidents that occurred both before and after the new policies were instated in order to discern their effectiveness. Such detailed consideration of this data may better inform aviation personnel on the advantages and disadvantages of these policies. If it is found that the new policies correlate with an increasing number of incidents, it may be appropriate to make further adjustments. Research Question Have the new Safety Management System (SMS) policies instated in 2014 by the Air Traffic Organization (ATO) affected the number of aviation incidents? Hypothesis The hypothesis of this study is that the new ATO SMS policies have had a significant influence on the number of aviation incidents since they took effect on September 1, 2014; therefore, the number of incidents related to the ATO that occurred after the new policies went into effect should be significantly different from the number of incidents that occurred before the new policies went into effect. H0: µ1 - µ2 = 0 H1: µ1 - µ2 ≠ 0 The value µ is defined as the number of incidents for selected airports per year, wherein:
  • 18.     9 µ1 = number of incidents occurring from September 1, 2013 to August 31, 2014; and µ2 = number of incidents occurring from September 1, 2014 to August 31, 2015. Objective The main objective of this study was to examine whether the new ATO SMS policies instated in September 2014 had an effect on the number of incidents related to the ATO reported by the ASRS. If so, this analysis may offer valuable insight as to the effectiveness of these policies and could provide detailed information to aviation personnel as to which policies should be maintained as is and which need further scrutiny.
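The decision rule that Chapter IV later applies to this hypothesis can be sketched as follows. The 0.05 alpha level and two-tailed framing mirror the analysis described in that chapter; the function itself is illustrative only.

    def reject_null(t_stat, t_critical, p_two_tailed, alpha=0.05):
        # Two-tailed rule: reject H0 (no difference in means) when the statistic falls in
        # the rejection region, or equivalently when the two-tailed P-value is below alpha.
        return abs(t_stat) > t_critical or p_two_tailed < alpha

    # Example with rounded values reported later for the all-Class-B-airports comparison.
    print(reject_null(t_stat=-3.3, t_critical=2.02, p_two_tailed=0.002))   # prints True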
  • 19.     10 CHAPTER II—LITERATURE REVIEW Introduction to Literature Review This chapter provides an outline of the literature that is relevant to the purposes of this study, including histories of the various coordinating organizations and events leading up to the contemporaneous aviation safety policies and an overarching review of the ATO SMS. The first section offers an overview of the FAA, including: preceding efforts in U.S. aviation safety, formation of the FAA, and the official duties of the FAA. The second section describes the ATO. The third section goes into detail about the SMS, including: the origins of the SMS, components of the SMS, and recent successes of the SMS. The fourth and final section of this literature review examines the ASRS, including a brief history and information on report processing. The Federal Aviation Administration (FAA) The U.S. Federal Aviation Administration (FAA) states that their “continuing mission is to provide the safest, most efficient aerospace system in the world,” and their organization is responsible for the protocols and policies put in place to do so (About FAA, 2015). According to the DOT, the FAA is currently the leading authority in the international aerospace community in this regard, and they are responsible for being responsive to “the dynamic nature of customer needs, economic conditions, and environmental concerns” (Planning Glossary, 2012). To better understand the FAA and their operations, it is necessary to have a cursory exploration of the organization’s inception, history, and evolution.
  • 20.     11 Early Aviation Safety in the United States With the increased popularity and reliance on air travel in the first half of the twentieth century, aviation personnel came to realize that the mode of transportation could not reach its full potential without a system of official protocols that would regulate the standards of safety, maintenance, air traffic, and other allowances (History, 2015). Before such legislation was officiated, local airport operators provided remedial air traffic control (ATC) and support in the form of controllers who would stand on the airfield and wave flags to help direct incoming and outgoing flights. However, as the avocation burgeoned into an industry, a number of federal actions were passed to instate safety and regulatory standards that evolved with the development of U.S. aviation (see Table 1 for a summarized list of such legislation and milestone events). The first of these actions regarding aviation safety was the Air Commerce Act of 1926, which certified aircraft and their safety requirements, established airways, issued and enforced air traffic rules, licensed pilots, and maintained airspace navigation routes and communication (Flener, 1968; History, 2015). Additionally, the Department of Commerce created a new branch for aeronautics to oversee the aviation industry (History, 2015). In 1934, as the growth of aviation and the importance of the industry became more apparent, this branch was cultivated into its own entity: The Bureau of Air Commerce (Komons, 1978). This organization established the country’s first ATC centers, in Newark, New Jersey; Cleveland, Ohio; and Chicago, Illinois. These centers used very remedial methods to track their aircraft initially, and had no direct contact with the pilots they were guiding. Though these techniques worked to improve safety, eventually a
  • 21.     12 number of high-profile accidents—including one involving a U.S. senator in 1935— caused the Department of Commerce to overhaul the program and its responsibilities. In 1938, President Franklin Roosevelt initiated the Civil Aeronautics Act, which established the Civil Aeronautics Authority (CAA) and created a three-person Air Safety Board that conducted detailed investigations of aviation accidents in order to make recommendations to safer procedures (Wilson, 1979). This legislation also “expanded the government's role in civil aviation by giving CAA power to regulate airline fares and determine the routes individual carriers served” (History, 2015). Two years later, the CAA was divided into two different agencies: the CAA and the Civil Aeronautics Board (CAB; Wilson, 1979). The former maintained regulation over ATC, airways, pilot and craft certification, and safety; and the latter’s responsibilities included the economic airline regulation, the investigation of accidents, and safety rulemaking. America’s entry into WWII prompted CAA’s ATC procedures to build airport towers, for safety and efficiency reasons, and these installations became standard at most airports following the war. Table 1 U.S. Aviation Key Events and Milestones Year Event/Milestone Significance 1926 Air Commerce Act signed Required regulations for pilots and passengers, and also established the nation’s first air traffic regulations (Source: Komons, 1979) 1926 Aeronautics branch created by Dept. of Commerce Indicated growth in interest of the aviation industry in the United States (Source: Komons, 1979) 1934 Bureau of Air Commerce formed Had complete control and authority over the regulation of air safety and the airways (Source: Komons, 1979) 1936 Formation of first ATC centers Allowed for coordinated efforts and data sharing between a limited group of aviation experts for increased air safety and cooperation (Komons:
  • 22.     13 1979) 1938 Civil Aeronautics Act signed Established Air Safety Board and Civil Aeronautics Authority (CAA); CAA had power to regular airfare (Source: Komons, 1979; Wilson, 1979) 1940 Development of Civil Aeronautics Board Separated aviation duties into two entities to ensure greater concentration on safety (Source: History, 2015; Wilson, 1979) 1941 U.S. entry into WWII Prompted ATC systems to include air towers at airport (Source: Wilson, 1979) 1956 Trans World Air and United Air collision in Arizona One-hundred-and-twenty-eight fatalities prompted further federal mitigation of U.S. aviation safety, leading to Federal Aviation Act (Source: Rochester, 1976) 1958 Federal Aviation Act signed Responsibilities led to formation of the Federal Aviation Agency (Source: Rochester, 1976; Wilson, 1976) 1967 Formation of Dept. of Transportation Federal Aviation Administration formed as a modal organization within the DOT (Source: Kent, 1980) 1970 Airport and Airway Development Act signed Increased funding for U.S. aviation substantially (Source: Kent, 1980; Murphy, 1999; Preston, 1987) 1978 Airline Deregulation Act signed Removed federal control over fares, market entry for new commercial airline carriers, and routes (Source: History, 2015) 1982 National Airspace System (NAS) introduced Placed parameters on airspace classes to better allow safety regulation (Source: History, 2015) 1988 Aviation Safety Research Act signed Authorized further federal funding for new aviation technologies and developments (Source: FAA Chronological History, 2011) 2001 Transportation Security Administration (TSA) formed following 9/11 terrorist attacks Extended spending to more security-based developments (Source: 1997-2015 Update, 2016) 2004 Air Traffic Organization (ATO) begins operations Consolidated FAA’s air traffic services to be much more efficient and allowed for a stronger focus on public consideration (Source: 1997-2015 Update, 2016) 2004 SMS initiated New rules, procedures, and tools put in place to better monitor aviation safety in the NAS via data collection and analysis (Source: 1997-2015 Update, 2016; ATO Safety Report, 2014) Origin of the FAA. After WWII, a midair collision in Arizona that killed 128 occupants of two aircraft prompted further federal action on the regulation of air traffic (History, 2015). Two years after this accident, the Federal Aviation Act was signed in 1958, which “transferred the Civil Aeronautics Authority’s functions to a new
  • 23.     14 independent Federal Aviation Agency responsible for civil aviation safety” (History, 2015). While the initial years of the agency were a bit jumbled, as the organization was working to grow alongside a quickly evolving industry that was continually met with new issues, the 1967 development of the DOT created new purpose for it. It was that year, due to President Lyndon B. Johnson’s insistence on a more comprehensive transportation strategy for the entire nation, that the Federal Aviation Agency became the Federal Aviation Administration (pursuant to DOT Act 49 U.S.C. App. 1651; FAA, 2014). Duties of the FAA. The FAA’s responsibilities developed and transformed over time, with continual challenges posing the need for restructuring and innovative thought. Developing safety concerns—such as in-air accidents and aircraft hijackings—have caused the FAA to shift focus a number of times, and a number of safety policies and corresponding organizations have been formed to accommodate safety concerns. However, according to the DOT’s Bureau of Transportation Statistics (FAA, 2014), the FAA is currently charged with the following duties: (a) Regulating air commerce in ways that best promote its development and safety and fulfill the requirements of national defense; (b) Controlling the use of navigable airspace of the United States and regulating both civil and military operations in such airspace in the interest of safety and efficiency; (c) promoting, encouraging, and developing civil aeronautics; (d) Consolidating research and development with respect to air navigation facilities; (e) Installing and operating air navigation facilities;
  • 24.     15 (f) Developing and operating a common system of air traffic control and navigation for both civil and military aircraft; and (g) Developing and implementing programs and regulations to control aircraft noise, sonic boom, and other environmental effects of civil aviation. Of course, these responsibilities are vast in scope. Therefore, the FAA utilizes a number of internal and external infrastructures aimed at assisting them in their endeavors. While these infrastructures—in the forms of various safety systems, protocols, and associated organizations—are numerous, only those most relevant to this study in particular are explored in the following sections. The Air Traffic Organization (ATO) The ATO’s mission is to provide efficient and safe air navigation services in U.S. airspaces, including services regarding air traffic management, communications, navigation, and surveillance (ATO Policy, 2014). Specifically, according to ATO Order JO 1000.37A, the role of the organization is encompassed by the following two comprehensive responsibilities: (1) Establishing and maintaining ATO’s safety guidance, policies, and processes to support mission requirements that: (a) Are consistent with FAA policy, requirements, and guidance (e.g., current edition of Order 8040.4, Order 8000.369, and the FAA Acquisition Management System); (b) Meet the NAS safety management requirements established by Order 1100.161; and
  • 25.     16 (c) Are consistent with the basic principles of safety management established by the ATO SMS Manual;” and (2) Developing minimum requirements for NAS service level availability (ATO Policy, 2014). In line with these responsibilities, the ATO sets broad-sweeping initiatives and goals for each fiscal year, and strives to accomplish them. In their most recent safety report, FY 2014, such goals included “measuring more accurately the effectiveness of our hazard mitigation strategies; developing a more dynamic, data-driven training curriculum; developing tools to improve our ability to analyze safety risks; advancing Next Generation Air Transportation System safety initiatives; aligning the assumptions built into our safety simulation models; and addressing safety issues affecting the organization as a whole” (ATO Safety Report, 2014, p. 1). Additionally, the COO noted that out of more than 25 million flights monitored by the ATO, the organization achieved full compliance in 99.994 percent of the FAA’s air traffic operations, and addressed over 100 safety concerns that had been previously uncovered. According to the ATO’s 2014 internal safety report, the organization’s “data collection, safety assessment, and risk mitigation efforts—all essential to risk-based decision making—are made possible by the policies, procedures, and tools that compose our Safety Management System” (ATO Safety Report, 2014). The Safety Management System (SMS) The ATO SMS is an integrated set of policies, procedures, and tools used – in part – to collect data on aviation safety (ATO Safety Report, 2014). The system is a
  • 26.     17 “multidisciplinary, integrated, and closed-loop framework used to help maintain safe and efficient air navigation services and infrastructure throughout the NAS and in United States-controlled international/oceanic airspace” (ATO Policy, 2014, 1-2). Working directly in support of the FAA’s mission, this formalized approach to safety was first initiated in 2004, with its policies officially adopted by the ATO three years later (Safety Management, 2012). By 2008, correlations were found between the SMS and aviation performance, indicating that the changes made according to data uncovered by SMS protocol were resulting in fewer unwanted flight outcomes. The SMS procedures are largely guided by safety, which is the principle consideration of all ATO activities. According to the SMS Manual (Version 4.0, 2014), safety is defined as “the state in which the risk of harm to persons or property damage is acceptable” (p. 1). The policies and procedures used by the SMS have evolved since its inception. An overview of many of the key events and milestones leading up to the SMS’s current state is outlined in Table 2. Table 2 ATO SMS Key Events and Milestones Date Event/Milestone Significance Dec 7, 2000 Executive Order 13180 signed Air Traffic Organization (ATO) formed, with intentions of improving the provision of air traffic operations (Source: Executive Order 13180, 2000) Nov 23, 2006 ICAO Annex 6, Part 1 Directed all member states to require that an aviation operator implement and abide by an SMS, effective Jan 1, 2009 (Source: Safety Management Systems Update, N 8900.133, 2010) Jul 23, 2009 Advanced Notice of Proposed Rulemaking published Solicited public guidance toward rulemaking and development of a SMS (Source: ANPRM, 2009) Jul 28, 2010 Bill HR 5900 introduced Outlined the Airline Safety and Federal Aviation Administration Act of 2010, which required that all Part 121 Air Carriers implement a SMS (Source: H.R. 5900, 2010)
  • 27.     18 Aug 1, 2010 Public Law 111-216 HR 5900 signed into law (Source: Airline Safety and Federal Aviation Administration Extension Act, 2010). Nov 18, 2010 ICAO Annex 6, Amendment 33 Required operator to implement an SMS in accordance with ICAO SMS Framework, including 4 new components and 12 new elements (Source: SMS Quick Reference Guide, 2015; Swickard, 2010) Aug 1, 2012 PL 111-216 final rule modification Mandated a final rule to be issued that required Part 121 Certificate Holders to implement a SMS (Source: SMS Quick Reference Guide, 2015) Nov 14, 2013 ICAO Annex 19 Consolidated existing safety management provisions (Source: SMS Quick Reference Guide, 2015) Jun 1, 2014 Safety Assurance System deployed “New FAA oversight system to be used for oversight of certificate holders under 14 CFR part 121, 135, and 145. SAS deployment in Flight Standards offices including FSDOs, CMOs and IFOs will be complete in September, 2015” (Source: SMS Quick Reference Guide, 2015, p. 2) Aug 1, 2014 SMS Voluntary Program activated Service providers who are implementing SMS participate voluntarily using new program standards (Source: SMS Quick Reference Guide, 2015) Sep 1, 2014 New SMS policies in effect New policies aimed at further ensuring the safety of air traffic in NAS are put in order (Source: Air Traffic Organization Policy, Order JO 1000.37A, 2014) SMS components. Every task and procedure in the SMS is a component of a greater component that is working toward aviation safety—almost as the SMS is part of the ATO, and the ATO is part of the FAA. The SMS operates under four components in efforts to create an all-encompassing strategy to manage and ensure safety in U.S. airspace. These components are: (1) safety policy; (2) safety risk management (SRM); (3) safety assurance; and (4) safety promotion (SMS Manual, 2014; Safety Management, 2012). Each and every procedure and process of the SMS is completed with consideration to these components and to foster a positive safety culture amongst ATO personnel. (1) Safety policy. The first of these components, safety policy, “[e]stablishes senior management’s commitment to continually improve safety [, and also] defines the
  • 28.     19 methods, processes, and organizational structure needed to meet the safety goals” (SMS Quick, 2015, p. 1). The guidance, methods, standards, and rules that to ATO has put in place in this regard serve to not only establish and execute safety policy for the tasks at hand, but also to improve upon the SMS proactively and to ensure NAS safety (ATO Policy, 2014). (2) SRM. The second component, SRM, “[d]etermines the need for, and adequacy of, new or revised risk controls based on the assessment of acceptable risk” (SMS Quick, 2015, p. 1). The policies that fall beneath this component first and foremost are utilized “by ATO safety practitioners to identify hazards, analyze and assess their risks, determine safety performance targets, and implement and track appropriate risk controls for all air traffic operations, facilities, equipment, and systems in the NAS” (ATO Policy, 2014, p. 1-3). (3) Safety assurance. Safety assurance “[e]valuates the continued effectiveness of implemented risk control strategies, supports the identification of new hazards” (SMS Quick, 2015, p. 1). This component ensures that the policies and procedures enacted by ATO personnel under the other three components are operated according to the organizations standards and expectations (ATO Policy, 2014). Safety assurance ensures that the processes and efforts of ATO personnel are not fruitless, and secures these processes from having adverse effects on the organization. If adverse effects are identified as a result of faulty or superfluous SMS methods, safety assurance is the component that ensures that the issue is solved accordingly by assessing the risk via data collection, discovering a safer solution, and implementing the solution throughout the
  • 29.     20 entire system accordingly. Regular audits and performance reviews may be conducted under safety assurance to establish the best possible outcomes. (4) Safety promotion. The final component, safety promotion, “[i]ncludes training, communication, and other actions to create a positive safety culture within all levels of the workforce” (SMS Quick, 2015, p. 1). Safety promotion uses SMS data, processes, and the behaviors and daily interactions of ATO personnel to advocate safety amongst every facet of the organization’s formalized systems (ATO Policy, 2014). Employee conduct and adherence to SMS rules and regulations are closely monitored under this component. The Aviation Safety Reporting System (ASRS) The ASRS is a voluntary reporting system that gathers information submitted by aviation personnel, including air traffic technicians, cabin crews, dispatchers, maintenance technicians, and pilots (ASRS Program Briefing, 2014). Such reports can be filed confidentially, and add to the collective database of aviation accidents and incidents in U.S. airspace, and the system’s focus is particularly concerned with human performance and human error. The purpose of the ASRS is to improve the U.S. aviation safety by identifying the deficiencies and discrepancies, and to provide the data submitted to the system to be used for planning and improvements on the NAS. The ASRS staff is composed of highly experienced aviation personnel, from a range of professions, and they boast “over 470 cumulative years of aviation expertise covering the full spectrum of aviation activity: air carrier, corporate, military, and general aviation; Air Traffic Control in Towers, TRACONs, Centers, and Military Facilities,” and their analyst’s “cumulative
  • 30.     21 flight time exceeds 140,000 hours in over 50 different aircraft” (ASRS Program Briefing, 2014, p. 7). ASRS report processing. The FAA “considers the filing of a report with NASA concerning an incident or occurrence involving a violation of 49 U.S.C. subtitle VII or the 14 CFR to be indicative of a constructive attitude,” and a filed report is sent to Moffett Field in California (ASRS Program Briefing, 2014, p. 9). During the first days of the ASRS, the program averaged around 400 reports per month, and report submissions have increased significantly over the years. Recent figures note that report intakes number, on average, 1,684 per week and 6,736 per month (p. 12). From the program’s inception in 1981 through 2013, a cumulative total of 1,140,440 reports had been submitted, with the vast majority of reports coming from air carrier personnel in the last decade (p. 13-14). Reports that are submitted are completely confidential, and are coded, with personal information stricken out to maintain anonymity (ASRS Program Briefing, 2014). The online submission form offers a number of boxes that the volunteer reporter may check regarding certain criteria, and also provides ample space for the individual to write out a narrative response describing the event (NASA ASRS, 2009). Additionally, the report form prompts reporters to offer up suggestions as to what they believe could help alleviate the issue they are reporting. Once the reports are received, two ASRS professional analysts screen them within three working days to categorize and assess the content (ASRS Program Briefing, 2014). If analysts have questions or concerns when processing the report, they may contact the reporter to request further details or
  • 31.     22 clarification. Reports are run through a final check for quality assurance, and are then coded accordingly for analysis.
  • 32.     23 CHAPTER III—METHODOLOGY This study analyzed ASRS incidents related to the ATO from two different yearlong periods: before the effective date of new SMS protocol and policies and after. These periods, specifically, ranged from September 1, 2013 to August 31, 2014, and from September 1, 2014 to August 31, 2015. The incidents examined were reported via the ASRS and were associated specifically with the ATO. Additionally, all incidents analyzed in this study occurred at airports in Class B airspaces in the United States. It was hypothesized that the recent adjustment in ATO SMS policies would result in a significant statistical difference in the number of flight incidents that were reported. Study Sample According to the ACI, North America handles more cargo and passengers than any other region on the planet (ACI – North America, 2015). The data analyzed in this study was pulled from three different groups, including: (1) all 37 Class B airports in the United States; (2) the top 15 airports with the most movements (based on the 37 Class B airport list); and (3) the top 15 airports with the highest number of incidents according to the ASRS criteria search (based on the 37 Class B airport list). Class B airspace designations. There are 37 Class B airspace sites in the United States (Airspace Designations, 2013). Class B airspace areas are some of the busiest air traffic sites in the country. According to their official FAA designation, Class B airspace areas are defined as “airspace from the surface to 10,000 feet mean sea level (MSL) surrounding the nation’s busiest airports in terms of airport operations or passenger enplanements” (Procedures for Handling Airspace Matters, 2012, 4.14.1.2). Each
  • 33.     24 individual Class B airspace site is defined and tailored by the FAA, and consists of a surface area and at least two more layers. Each Class B site is also “designed to contain all published instrument procedures” (Procedures, 2012, 4.14.1.2). Additionally, in Class B airspace zones, an “ATC clearance is required for all aircraft to operate in the area, and all aircraft that are so cleared receive separation services within the airspace” (Procedures, 2012, 4.14.1.2). Incidents from these airport sites were specifically utilized in this study because they make up a large enough sample to be representative of the comprehensive flights in the United States. The 37 Class B airspace sites examined in this study were listed in the FAA ATO’s policy statement on “Airspace Designations and Reporting Points” (2013). Each of these Class B airspaces contained at least one primary airport site, and all aircraft operators within each site is subject to minimum aircraft equipment requirements, operation rules, and pilot qualification requirements in accordance with Federal Aviation Regulation 14 CFR 91.131 (Airspace Designations, 2013). The Class B airspace sites examined in this study are sorted in Table 3 according to their geographic locations by state, including their official civilian titles and their International Air Transport Association (IATA) airport codes. Table 3 U.S. Class B Airspace Sites (37) State IATA Airport Code Civilian Title Arizona PHX Phoenix Sky Harbor International California LAX Los Angeles International California NKX Marine Corps Air Station Miramar California SAN San Diego International California SFO San Francisco International Colorado DEN Denver International
  • 34.     25 Florida MCO Orlando International Florida MIA Miami International Florida TPA Tampa International Georgia ATL Hartsfield-Jackson Atlanta Intl. Hawaii HNL Honolulu International Illinois ORD Chicago-O’Hare International Kentucky CVG Cincinnati/N. Kentucky Intl. Louisiana MSY Louis Armstrong New Orleans Intl. Maryland ADW Andrews Air Force Base Maryland BWI Baltimore/Washington Intl. Massachusetts BOS Boston-Logan International Michigan DTW Detroit Metro Wayne County Minnesota MSP Minneapolis-St. Paul Intl. Missouri MCI Kansas City International Missouri STL Lambert-St. Louis International Nevada LAS Las Vegas-McCarran Intl. New Jersey EWR Newark Liberty International New York JFK New York-John F. Kennedy Intl. New York LGA New York-LaGuardia North Carolina CLT Charlotte Douglas International Ohio CLE Cleveland Hopkins International Pennsylvania PHL Philadelphia International Pennsylvania PIT Pittsburgh International Tennessee MEM Memphis International Texas DFW Dallas-Ft. Worth International Texas HOU Houston-Hobby (Secondary Class B) Texas IAH Houston-George Bush Intl. Utah SLC Salt Lake City International Virginia DCA Ronald Reagan Washington Natl. Virginia IAD Washington Dulles International Washington SEA Seattle-Tacoma International Source: Airspace Designations and Reporting Points (Order JO 7400.9X). (2013). FAA Air Traffic Organization. Top 15 movement airports. The second study group utilized by this study will be the top 15 airports according to movement, which also fall within the 37 Class B airports listed above. According to 2014 data from the Airports Council International (ACI), total movements, in this sense, refer to the landing and take off of an aircraft. The top 15 airports in this group to be examined within this study’s subtext are listed in Table 4, including their state, IATA airport code, and their total aircraft movements for 2014.
  • 35.     26 Table 4 Top 15 Total Aircraft Movement Airports (According to Class B Airspace Classification) State IATA Airport Code Total Aircraft Movements Illinois ORD 891,933 Georgia ATL 868,359 California LAX 708,674 Texas DFW 679,820 Colorado DEN 565,525 North Carolina CLT 545,178 Nevada LAS 522,399 Texas HOU 499,802 California SFO 431,633 Arizona PHX 430,461 New York JFK 422,415 Pennsylvania PHL 419,253 Minnesota MSP 412,586 Florida MIA 402,663 New Jersey EWR 395,524 Source: Airports Council International – North America (2015). 2013 North American Airport Traffic Summary. Top 15 airports with highest incident reports. The third study group utilized by this study will be the top 15 airports according to those with the highest incident reports according to the ASRS search criteria to follow, and which also fall within amongst 37 Class B airports listed above. ASRS Search Criteria The ASRS database was searched using criteria to identify incident reports that were specifically related to the ATO. These selections were made to best assess fluctuations that may have occurred in accordance with ATO SMS policies. This criteria search included the following specifications: (a) incidents occurring from September 1, 2013 to August 31, 2014, and from September 1, 2014 to August 31, 2015; (b) incidents occurring at Class B airport sites; (c) all available options on the ASRS “Reported Organization” criteria listing; (d) all available options on the ASRS “Reported Function”
  • 36.     27 criteria listing; and (e) the ASRS “Contributing Factors” options “ATC Equipment,” “NAV Facility,” and “Building.” Data Analysis The t-test data analysis technique was used to analyze the data gathered from the ASRS; a brief computational sketch of this comparison appears at the end of this chapter. Limitations Though this inclusive study sample group will offer general insight pertaining to the purpose of this study, a number of factors could, in fact, contribute to disparities in the number of incidents that take place in the pre- and post-periods examined herein. For example, trends at certain airports show higher volumes of aircraft movement than at others. While this sample was chosen, in part, because Class B airspace sites tend to impact the civilian populace to the greatest extent, this must be taken into consideration. In an ideal scenario, all 37 Class B airports would also rank, in a descending order, amongst the top 37 airports for (a) total annual commercial passenger counts, (b) top cargo airports, and (c) total movements. However, that is simply not the case. In fact, the 2014 traffic count for the top five North American airports in those three categories is as follows (in descending order): (a) Annual Commercial Passenger Counts: Atlanta, 96,178,899 passengers; Los Angeles, 70,663,265; Chicago O’Hare, 69,999,010; Dallas/Ft. Worth, 63,554,402; and Denver, 53,472,514 (ACI – NA, 2015).
  • 37.     28 (b) Annual Cargo Flights (in Metric Tons): Memphis, 4,258,531; Anchorage, 2,492,754; Louisville, 2,293,231; Miami, 1,998,779; and Los Angeles, 1,816,269 (ACI – NA, 2015). (c) Total Aircraft Movements (incl. Landing and Take-Off Flights): Chicago, 881,933; Atlanta, 868,359; Los Angeles, 708,674; Dallas/Ft. Worth, 679,820; and Denver, 565,525 (ACI – NA, 2015). A more in-depth study that has the resources to analyze a larger sample of airports, and to cross-examine them from a number of different perspectives would likely yield better and more-comprehensive reports. Additionally, as ASRS reports are submitted on a voluntary basis, and it is unlikely that every aviation incident is properly reported, it is likely that the frequency of incidents is underrepresented. Because the ASRS report form does not offer strict guidelines or parameters, and allows for a more qualitative rather than quantitative responding mechanism, it may also be that different types of incidents are affected to a different extent, or elements of incidents may be under- or over-reported. Such a reporting mechanism also allows significant room for bias. Lastly, the Automatic Dependent Surveillance—Broadcast (ADS—B) was excluded as a contributing factor, as its compliance date is noted as January 2020 (Automatic Dependent Surveillance—Broadcast, 2010). While the ADS—B is a potential contributing factor regarding the number of incidents reported and, therefore, examined in this study, as it has not yet come to its mandate date, it is not considered in this study. This has the potential to skew this study’s results.
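As noted under Data Analysis, the comparison itself is a t-test on matched pre- and post-policy incident counts. A minimal sketch of how such a comparison could be reproduced is shown below; it assumes SciPy is available, and the counts are placeholders rather than the study data. Because the result tables in the next chapter report a Pearson correlation and n - 1 degrees of freedom, the paired form (ttest_rel) is shown first, with the pooled-variance form (ttest_ind) as the alternative described in the Data Analysis section.

    from scipy import stats

    # Placeholder per-airport incident counts for the same airports, pre- and post-policy.
    pre_counts = [0, 1, 1, 0, 2, 0, 1, 1, 0, 1]
    post_counts = [1, 3, 2, 0, 4, 1, 2, 1, 2, 3]

    # Paired two-sample comparison (consistent with the degrees of freedom reported later).
    t_paired, p_paired = stats.ttest_rel(pre_counts, post_counts)

    # Pooled-variance two-sample comparison, as described under Data Analysis.
    t_pooled, p_pooled = stats.ttest_ind(pre_counts, post_counts, equal_var=True)

    print(round(t_paired, 3), round(p_paired, 4))
    print(round(t_pooled, 3), round(p_pooled, 4))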
  • 38.     29 CHAPTER IV—RESULTS AND DISCUSSION Of the three groups examined within this study—all Class B airports, the top 15 Class B airports with the most movement, and the 15 Class B airports with the most reported incidents—all of them experienced significant increases in their total number of incidents following the instatement of the new 2014 ATO SMS policies. These summarized findings are represented in Table 5. Table 5 Statistical Data on Incident Reports, Pre- and Post-2014 Policy Instatement All Class B Airports Most Movement Group Highest No. of Incidents 2014 2015 2014 2015 2014 2015 Total incidents 15 46 8 29 7 39 Mean 0.4054 1.2432 0.53333 1.93333 0.46666 2.6 Standard Deviation 0.55073 1.4024 0.63994 1.3475 0.51639 1.1832 It is clear that all three groups that were examined indicate very significant increases in the number of incidents from 2014 to 2015—following the instatement of the new ATO SMS policies. The Class B airport group showed a 206% increase in incidents, the movement group showed a 262.5% increase in incidents, and the highest-incident group showed a 457% increase from 2014 to 2015. Correspondingly, the mean number of incidents rose for all groups from 2014 to 2015. However, the standard deviations increased for all of the examined groups in this timeframe as well. Table 6 lists the number of incidents at all of the Class B airports examined.
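Before turning to Table 6, a minimal arithmetic check of the percentage increases quoted above, which follow directly from the totals in Table 5:

    def pct_increase(before, after):
        # Percentage change from the pre-policy total to the post-policy total.
        return (after - before) / before * 100

    print(round(pct_increase(15, 46), 1))   # all Class B airports: 206.7
    print(round(pct_increase(8, 29), 1))    # most-movement group: 262.5
    print(round(pct_increase(7, 39), 1))    # highest-incident group: 457.1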
  • 39.     30 Table 6 Number of Incidents, Class B Airports Incidents in 2014 IATA Airport Code Incidents in 2015 0 PHX 1 0 LAX 6 0 NKX 0 1 SAN 0 1 SFO 3 0 DEN 3 0 MCO 0 0 MIA 1 0 TPA 0 1 ATL 2 0 HNL 0 1 ORD 2 1 CVG 2 0 MSY 1 0 ADW 0 0 BWI 1 1 BOS 1 1 DTW 1 1 MSP 1 0 MCI 0 0 STL 0 1 LAS 2 0 EWR 0 0 JFK 2 1 LGA 1 0 CLT 2 0 CLE 4 1 PHL 2 0 PIT 0 0 MEM 0 2 DFW 0 0 HOU 2 1 IAH 0 1 SLC 0 0 DCA 3 0 IAD 3 0 SEA 0 Of the 37 Class B airports examined, only 4 of the locations saw decreases in the number of incidents from 2014 to 2015, 15 of the locations reported the same number of incidents for each year, and 18 of them saw increases in the number of incidents, with 9
  • 40.     31 of those locations reporting at least 200% increase in incidents. Those with decreases were DFW (2 to 0), IAH (1 to 0), SAN (1 to 0), and SLC (1 to 0). Those showing the most significant increases in incidents were CLE (0 to 4), DEN (0 to 3), DCA (0 to 3), IAD (0 to 3), and LAX (0 to 6). It is important to note that CLE and LAX—due to their more significant increase in incidents—may be considered outliers, their results the product of sort of anomaly or unrelated factor. T-test To establish whether a statistically significant difference existed between the amount of incidents reported before and after the new 2014 ATO SMS policies were put into order, a pooled variance t-test for the difference between means was performed on each of the three examined groups. As t-tests calculate the statistical differences between means, data from each of the three groups was examined to pinpoint any correlations. The t-tests determine distinctions between means and the distribution or variability of the data, and it also identifies critical values that define the area(s) where the null hypothesis is rejected. If the t-test statistic that is deduced falls within the rejection region, then the null hypothesis is rejected; if the statistic does not fall within the rejection region, the null hypothesis stating that the means are the same cannot be rejected. The t-value distinguishes the direction of the difference, and it is negative if the first mean is found to be smaller in value and positive if the first mean is found to be larger. P-values are also indicators of statistical significance found using t-tests. The function of the P-value is that it indicates the probability (hence, P-value) that qualifies
  • 41.     32 the strength of evidence against the null hypothesis. With the null distribution of the t-test statistic and the value of the statistic, it is important to identify whether the statistic is an outlier or in the middle of distribution (or consistent with the null hypothesis). Depending on how the hypotheses and the test statistic are defined, outliers on either side of the distribution may be considered. It is the P-value that helps to provide a measurement for this outlier: the further out the value is, the smaller the P-value and the stronger the substantiation opposing the null hypothesis in favor of the other. T-test results on three examined groups. This analysis utilized a risk or alpha level of 0.05, indicating that it would be statistically significant if the sample result was observed in 5% or fewer than 5% of the samples. Therefore, in the subject case, a P-value less than 0.025 (0.025 on each of the two tails of the data) would indicate a statistically significant difference in the means and the null hypothesis would be rejected. The t-test on all Class B airports can be seen in Table 7, with Figure 1 visually depicting the t-test statistic of -3.3 and critical values of -2.02 and 2.02, respectively, which indicate regions of both rejection and non-rejection. Given that the t-test statistic lies within the region of rejection, the initial assumption is that the null hypothesis—that is, that the means are the same—is rejected. However, Table 7 indicates a P-value of 0.002, which connotes that the probability of getting a test statistic as extreme as the indicated test statistic is 0.002 or 0.2%. The P-value, when considered alongside the 0.025 confidence interval on either left or right outlier of the data, indicated a statistically significant difference in the means. This significant difference concludes that the null
  • 42.     33 hypothesis would be rejected, and that data combined with the data from the t-test statistic are evidence that there is not sufficient evidence supporting a null hypothesis. Table 7 t-Test on All Class B Airport Data Variable 1 Variable 2 Mean 0.4054054 1.243243 Variance 0.3033033 1.966967 Observations 37 37 Pearson Correlation -0.059291 Hypothesized Mean Difference 0 df 36 t Stat -3.316146 P (T ≤ t) one-tail 0.0010461 t Critical one-tail 1.6882977 P (T ≤ t) two-tail 0.0020922 t Critical two-tail 2.028094 Figure 1 t-Test Visualization of All Class B Airport Data The t-test on the airports with the most movement can be seen in Table 8, with Figure 2 visually depicting the t-test statistic of -3.1 and critical values of -2.14 and 2.14,
  • 43.     34 respectively, which indicate regions of both rejection and non-rejection. Given that the t- test statistic lies within the region of rejection, the initial assumption is that the null hypothesis—that is, that the means are the same—is rejected. However, Table 8 indicates a P-value of 0.007, which connotes that the probability of getting a test statistic as extreme as the indicated test statistic is 0.007 or 0.7%. The P-value, when considered alongside the 0.025 confidence interval on either left or right outlier of the data, indicated a statistically significant difference in the means. As with the first studied group, all Class B airports, this significant difference concludes that the null hypothesis would be rejected, and the P-value-confidence interval data combined with the data from the t-test statistic are also evidence that there is not sufficient evidence supporting a null hypothesis. Table 8 t-Test on Class B Airports with the Most Movement Variable 1 Variable 2 Mean 0.5333333 1.933333 Variance 0.4095238 2.066667 Observations 15 15 Pearson Correlation -0.269159 Hypothesized Mean Difference 0 Df 14 t Stat -3.14551 P (T ≤ t) one-tail 0.0035776 t Critical one-tail 1.7613101 P (T ≤ t) two-tail 0.0071552 t Critical two-tail 2.1447867
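The critical values reported in Tables 7 through 10 are two-tailed Student's t cut-offs at the 0.05 alpha level for the relevant degrees of freedom. A minimal sketch, assuming SciPy is available, of how those cut-offs can be reproduced:

    from scipy.stats import t

    alpha = 0.05
    for df in (36, 14):   # 37 paired airports and 15 paired airports, respectively
        critical = t.ppf(1 - alpha / 2, df)   # two-tailed critical value
        print(df, round(critical, 4))
    # df = 36 gives roughly 2.0281 (Tables 7 and 10); df = 14 gives roughly 2.1448 (Tables 8 and 9).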
  • 44.     35 Figure 2 t-Test Visualization on Class B Airports with the Most Movement Data from the third and final group examined, Class B airports with highest number of reported incidents, can be seen in Table 9, with Figure 3 visually depicting the t-test statistic of -5.4 and critical values of -2.14 and 2.14. Given that the t-test statistic lies within the region of rejection, the initial assumption is that the null hypothesis—that is, that the means are the same—is rejected. However, Table 9 indicates a P-value of 0.00007, which connotes that the probability of getting a test statistic as extreme as the indicated test statistic is 0.00007 or 0.007%. The P-value, when considered alongside the 0.025 confidence interval on either left or right outlier of the data, indicated a statistically significant difference in the means. As with the first two groups examined, this significant difference concludes that the null hypothesis would be rejected, and the P- value-confidence interval data combined with the data from the t-test statistic are also evidence that there is not sufficient evidence supporting a null hypothesis.
  • 45.     36 Table 9 t-Test on Class B Airports with the Highest Reported Incidents Variable 1 Variable 2 Mean 0.4666667 2.6 Variance 0.2666667 1.4 Observations 15 15 Pearson Correlation -0.49099 Hypothesized Mean Difference 0 df 14 t Stat -5.487955 P (T ≤ t) one-tail 0.0000399 t Critical one-tail 1.7613101 P (T ≤ t) two-tail 0.0000799 t Critical two-tail 2.1447867 Figure 3 t-Test Visualization on Class B Airports with the Highest Reported Incidents Once the null hypothesis was effectively rejected according to the results from the t-tests of the three examined groups for the one-year study period, an extended period of data collection—for a study period of 18 months—was conducted as a means of cross-
  • 46.     37 checking the data for differences in the results. The studied period extended six months longer than the original one-year planned collection periods—from March 1, 2013 to August 31, 2014, and from September 1, 2014 to February 29, 2016, respectively (the latter of which was the maximum data available up until the date of this study). T-tests were performed with the information gathered for these time periods, as seen in Table 10 and Figure 4. Additionally, Table 11 provides a detailed list of the number of incidents for each of the Class B airports during this lengthened 18-month time period as well. Figure 4 visually depicts the t-test statistic of -3.21 and critical values of -2.02 and 2.02. Given that the t-test statistic lies within the region of rejection, the null hypothesis—that is, that the means are the same—is rejected. Table 10 indicates a P-value of 0.002, which connotes that the probability of getting a test statistic as extreme as the indicated test statistic is 0.002 or 0.2%. The P-value, when considered alongside the 0.025 confidence interval on either left or right outlier of the data, indicated a statistically significant difference in the means. This significant difference concludes that the null hypothesis would be rejected, as the P-value and the t-test statistic together indicate that there is not sufficient evidence to support the null hypothesis.
• 47.     38
Table 10
t-Test on 18-Month Incident Reports
                                 Variable 1    Variable 2
Mean                             0.7837838     1.648649
Variance                         1.1741742     3.123123
Observations (No. of Airports)   37            37
Pearson Correlation              0.4234077
Hypothesized Mean Difference     0
df                               36
t Stat                           -3.216121
P (T ≤ t) one-tail               0.0013727
t Critical one-tail              1.6882977
P (T ≤ t) two-tail               0.0027453
t Critical two-tail              2.028094
Figure 4 t-Test Visualization on 18-Month Incident Reports
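As a rough illustration of how the per-airport counts behind Table 10 and Table 11 could be assembled, the sketch below filters a hypothetical per-incident export of ASRS reports into the two 18-month windows and counts reports per airport. The file name and the "date" and "airport" column names are assumptions for illustration only; they are not the actual ASRS export schema.

import pandas as pd

# Hypothetical export with one row per ASRS report; the file name and column
# names ("date", "airport") are assumptions, not the actual ASRS schema.
reports = pd.read_csv("asrs_class_b_incidents.csv", parse_dates=["date"])

# The two 18-month windows used for the cross-check.
pre = reports[(reports["date"] >= "2013-03-01") & (reports["date"] <= "2014-08-31")]
post = reports[(reports["date"] >= "2014-09-01") & (reports["date"] <= "2016-02-29")]

# Count incidents per airport in each window, keeping airports that had zero reports.
airports = sorted(reports["airport"].unique())
pre_counts = pre.groupby("airport").size().reindex(airports, fill_value=0)
post_counts = post.groupby("airport").size().reindex(airports, fill_value=0)

print(pd.DataFrame({"pre_policy": pre_counts, "post_policy": post_counts}))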
• 48.     39
Table 11
Number of Incidents, Class B Airports for 18-Month Study Period
Incidents in 2014   IATA Airport Code   Incidents in 2015
0    PHX   1
2    LAX   7
0    NKX   0
1    SAN   0
3    SFO   6
1    DEN   4
0    MCO   0
0    MIA   1
0    TPA   0
3    ATL   2
0    HNL   0
4    ORD   3
1    CVG   2
0    MSY   1
0    ADW   0
1    BWI   3
1    BOS   1
2    DTW   1
1    MSP   1
0    MCI   0
0    STL   0
2    LAS   3
0    EWR   0
0    JFK   2
1    LGA   2
0    CLT   3
0    CLE   5
1    PHL   2
0    PIT   0
0    MEM   0
3    DFW   1
0    HOU   2
1    IAH   1
1    SLC   0
0    DCA   3
0    IAD   3
0    SEA   1
As another means of cross-checking these results, and of examining why reported incidents increased significantly from the pre- to the post-policy period, t-tests were performed on the total number of movements at all Class B airports. This was crucial, as an increase in
• 49.     40 movements at any of these given locations could statistically lead to increases in the number of incidents. Data on the number of movements was collected via the FAA's Air Traffic Activity System (FAA ATAS, 2016). However, as indicated in Table 12 and Figure 5, there was no significant difference in total movements between the two time periods examined. Figure 5 visually depicts the t-test statistic of -0.382 and critical values of -2.02 and 2.02. Because the t-test statistic does not fall within the rejection region, the null hypothesis, namely that the two means are equal, cannot be rejected. Table 12 reports a two-tailed P-value of 0.7, meaning there is a 70% probability of obtaining a test statistic at least as extreme as the one observed if the null hypothesis were true. This suggests that, even though the movement totals were drawn from the same study periods, the total number of movements is not a likely factor influencing the results of this study.
Table 12
t-Test on Total Movement at All Class B Airports
                                 Variable 1        Variable 2
Mean                             358933.4054       359788.1351
Variance                         39,223,177,015    39,461,064,208
Observations (No. of Airports)   37                37
Pearson Correlation              0.997653623
Hypothesized Mean Difference     0
df                               36
t Stat                           -0.382265728
P (T ≤ t) one-tail               0.352255881
t Critical one-tail              1.688297714
P (T ≤ t) two-tail               0.704511761
t Critical two-tail              2.028094001
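The movement cross-check amounts to the same paired t-test applied to per-airport movement totals, together with the Pearson correlation reported in Table 12. The sketch below uses the first five paired totals from Table A in Appendix I purely for illustration; the full cross-check would use all 37 airports.

from scipy import stats

# First five paired movement totals from Table A (ATL, BOS, BWI, CLE, CLT);
# the actual cross-check would use the totals for all 37 Class B airports.
movements_pre = [878366, 368288, 248249, 152665, 546747]
movements_post = [875606, 370429, 244748, 116496, 545658]

# A high correlation indicates the same airports carry similar traffic in both periods.
r, _ = stats.pearsonr(movements_pre, movements_post)

# Paired t-test on the movement totals; a large p-value means no significant change.
t_res = stats.ttest_rel(movements_pre, movements_post)

print("Pearson correlation:", round(r, 4))
print("t statistic:", round(t_res.statistic, 4))
print("two-tailed p-value:", round(t_res.pvalue, 4))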
• 50.     41
Figure 5 t-Test Visualization on Total Movement at All Class B Airports
A detailed account of the total movements at all airports for the given study periods is laid out in Table A in Appendix I. Additionally, another t-test was performed on the incident data with two airport locations confirmed to be outliers, LAX (6 incident reports) and CLE (4 incident reports), removed from the data pool to allow for an analysis free of anomalies. Table 13 and Figure 6 depict these data below. Figure 6 visually depicts the t-test statistic of -3.03 and critical values of -2.03 and 2.03. Because the t-test statistic falls within the rejection region, the null hypothesis, namely that the two means are equal, is rejected. Table 13 reports a two-tailed P-value of 0.0045, meaning there is roughly a 0.5% probability of obtaining a test statistic at least as extreme as the one observed if the null hypothesis were true. The results are therefore the same even with these two outlier locations removed.
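To illustrate the outlier check described above, the sketch below drops the two airports flagged as outliers and re-runs the paired t-test. The counts shown are a handful of values taken from Table 11 for illustration only; the study's test used the full set of airports.

import pandas as pd
from scipy import stats

# Illustrative per-airport incident counts indexed by IATA code, taken from a few
# rows of Table 11; the actual analysis used the paired counts for all airports.
pre_counts = pd.Series({"LAX": 2, "CLE": 0, "SFO": 3, "ORD": 4, "ATL": 3, "DEN": 1})
post_counts = pd.Series({"LAX": 7, "CLE": 5, "SFO": 6, "ORD": 3, "ATL": 2, "DEN": 4})

# Drop the two airports identified as outliers, then re-run the paired t-test.
outliers = ["LAX", "CLE"]
pre_trimmed = pre_counts.drop(outliers)
post_trimmed = post_counts.drop(outliers)

result = stats.ttest_rel(pre_trimmed, post_trimmed)
print("n =", len(pre_trimmed), " t =", round(result.statistic, 3), " p =", round(result.pvalue, 4))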
• 51.     42
Table 13
t-Test on Class B Airports, with Outliers Removed
                                 Variable 1     Variable 2
Mean                             0.428571429    1.028571429
Variance                         0.31092437     1.146218487
Observations (No. of Airports)   35             35
Pearson Correlation              0.077429263
Hypothesized Mean Difference     0
df                               34
t Stat                           -3.038545478
P (T ≤ t) one-tail               0.002273588
t Critical one-tail              1.690924255
P (T ≤ t) two-tail               0.004547176
t Critical two-tail              2.032244509
Figure 6 t-Test Visualization on Class B Airports, with Outliers Removed
T-test results on written incident reports. To shed more light on the causes of the increase in incidents between the two studied time periods, before and after the new ATO SMS policies took effect, a number of the written incident reports were collected. Each of
• 52.     43 these reports notes, as assessed by a designated professional (as outlined in the literature review), either a contributing factor or a primary problem leading to the incident. These contributing factors included ATC equipment, navigation facility, and buildings, among others. After careful scrutiny of the written reports taken from the study period, those that indicated the "ATC equipment / Navigation Facility / Buildings" option among the reporting system's selections as the primary problem were identified and collected to create a dataset for further testing. Further, the researcher searched this sample for reports specifying ATC irregularities as the primary problem, to better understand whether the new ATO SMS policies affected this aspect of aviation in particular. Table 14 shows the locations listing ATC as a primary problem, along with the number of incidents indicated. Interestingly, this was the case at only 12 of the examined Class B airport locations.
Table 14
Number of Incidents with ATC Noted as a Primary Problem
Incidents in 2014   IATA Airport Code   Incidents in 2015
0    LAX   3
0    SFO   2
0    ATL   2
1    ORD   0
0    CVG   1
0    BWI   1
1    DTW   0
0    CLE   3
1    PHL   1
2    DFW   0
0    HOU   2
0    DCA   1
• 53.     44 The t-test on the incident reports specifying ATC irregularities or issues as the primary problem can be seen in Table 15, with Figure 7 visually depicting the t-test statistic of -1.95 and critical values of -2.2 and 2.2. Additionally, Table 15 reports a two-tailed P-value of 0.08, meaning there is an 8.0% probability of obtaining a test statistic at least as extreme as the one observed if the null hypothesis were true. Because the P-value exceeds the 0.05 significance level and the t-test statistic does not fall within the rejection region, the null hypothesis cannot be rejected.
Table 15
t-Test on Incidents with ATC Noted as a Primary Problem
                                 Variable 1      Variable 2
Mean                             0.41666667      1.333333333
Variance                         0.44696967      1.151515152
Observations (No. of Airports)   12              12
Pearson Correlation              -0.718060625
Hypothesized Mean Difference     0
df                               11
t Stat                           -1.958503222
P (T ≤ t) one-tail               0.038004087
t Critical one-tail              1.795884819
P (T ≤ t) two-tail               0.076008173
t Critical two-tail              2.20098516
Figure 7 t-Test Visualization on Incidents with ATC Noted as a Primary Problem
• 54.     45 For samples of actual incident reports noting ATC as both a contributing factor and a primary problem, see Appendix II.
Discussion
There are a number of possible reasons why the number of incidents increased and the null hypothesis for this study was rejected. Though there is an evident correlation between the new ATO SMS policies and the increase in the reported number of incidents, the policies were still relatively new to employees. Since these new policies took effect on September 1, 2014, and the second study period ranged from September 1, 2014 to August 31, 2015, it is highly possible that employee habits were still aligned with the old ATO policies and that employees had not yet become accustomed to the newer policies aimed at increasing efficiency and safety and, therefore, reducing the overall number of aviation incidents. Human error and habitual practices in the workplace must be considered to this end. Additionally, there may have been a transition period from the old policies to the new, which could add to confusion among management and employees trying to adapt to new rules and regulations. Often, adjustments to workplace infrastructure are introduced little by little to allow for an easy adjustment to new systems and practices, to decrease the margin for employee error, and to help establish convention in the workplace. However, given that the new ATO SMS policies were quite detailed in scope and range, and that they affected a number of divisions within the aviation industry, it would be
• 55.     46 difficult for all changes to occur without incident, and naïve to assume that no inaccuracies or miscalculations would occur due to the human component of the system. More likely, the number of incidents increased from the pre- to the post-policy period because the new policies established roles and responsibilities at all levels of the ATO, rather than just for management. According to ATO Policy Order JO 7200.20 (2011), the new policies also increased utilization of the internal Voluntary Safety Reporting Program (VSRP) by all ATO personnel, regardless of their level of responsibility. The order states that the VSRP "applies to all ATO personnel directly engaged in and/or supporting air traffic services and only to events that occur while acting in that capacity" (ATO Policy Order JO 7200.20, 2011, p. 1). It can be assumed that this order changed ATO employees' attitudes toward voluntary reporting, causing them to report more to the main voluntary aviation database, the ASRS, which is the database used for this study. If this is found to be the case, a cross-examination of both databases could yield deeper results and help pinpoint the causes of these incident reports. Of course, the study periods examined herein were relatively short, and may also include abnormalities or aberrations related to factors independent of this study's focus. Regardless, it is clear that aviation incidents increased, at least at Class B airports, after the new ATO SMS policies took effect in 2014. Whether these policies are directly responsible for this increase bears further scrutiny.
• 56.     47
Recommendations for Future Work
The researcher recommends further studies of greater duration and depth on this topic, which could yield more definite results and valuable information that would allow policymakers within the aviation industry to refine their methods in pursuit of safety and efficiency. One way of doing so would be to examine longer study periods, as the data become available, in order to gather more information for testing. This would allow for a more cohesive and richer understanding of the problems underlying aviation incidents and could lead to the pinpointing of specific issues that are systematic in nature and can be alleviated with new policies and practices. Additionally, such examinations may already be possible with the information that exists within the written incident reports made available by the ASRS. These reports shed light on the specific irregularities that can lead to aviation incidents, and careful review of their various aspects may help professionals better understand the effect that ATO policies have on employees. This study examined only a portion of the available data, and not to its fullest extent; however, future studies may succeed in identifying consistencies in these professional reports that indicate larger trends leading to these incidents. If so, SMS personnel could use their findings to better shape the next round of policies, and could make ATO employees aware of these shortcomings within the system until such policies are officially instated.
• 57.     48 The ASRS provides a large swath of data from which professionals can draw to become better informed about many aspects of aviation in the United States. Data from the database were utilized in this study to determine the number of incidents that occurred before and after the 2014 ATO SMS policies were instated, and future studies could yield valuable findings by continuing to examine these data and identifying correlations among the specific causes and factors related to these incidents.
• 58.     49
CHAPTER V—CONCLUSION
The ATO SMS strives to address key safety issues and improve air transportation outcomes by revising its policies as more is learned about the specifics leading to unwanted incidents (ATO Safety Report, 2014; FAA Efforts, 2013; FAA Final Rule, 2015). In an attempt to address these concerns, the ATO revised many of its SMS policies in 2014, and they were put into effect on September 1 of that year. To assess the effectiveness of these new policies, this study examined incidents reported in the Aviation Safety Reporting System (ASRS) database, both before and after the 2014 policies were put in place. This included an examination of the number of incidents reported for a one-year period before the policies took effect and for a one-year period directly following their adoption. Additionally, the researcher examined the reports of those incidents that indicated "ATC equipment / Navigation Facility / Buildings" as either a contributing factor or the primary problem in order to determine whether policies related to this specific pre-assigned indicator resulted in more or fewer incidents after the policies were modified. The data from this study indicate that the 2014 ATO SMS policies have not been effective in inhibiting unwanted aviation outcomes, and may even have led to an increase in aviation incident reports. The researcher believes that, due to the mandates of ATO Policy Order JO 7200.20 (2011) regarding employee duties, these new policies increased the volume of reports because ATO personnel at all levels were given authorization and responsibility to utilize the organization's Voluntary Safety Reporting Program to relay adverse instances within the system. Because this order extended to all
• 59.     50 employees, rather than only those at the management level or those who work in higher-risk positions related to air traffic control (such as air traffic controllers), it is fair to assume that the number of reports would increase drastically. It could be that the increased number of incidents reported is due to the newly developed habits and requirements for ATO employees to regularly submit voluntary reports through the ATO VSRP. Since the ASRS was used to compile the data for this study, the new ATO SMS policies, which have created a culture more attuned to voluntary reporting systems, may have affected the reports from which this study's data were drawn. However, further work should be done to examine whether the increase in incident volume correlated with this aspect of the new policies in particular, in order to rule out or affirm its influence, and a cross-examination of incident reports from both data systems could yield more complete results. While aberrations or independent factors unrelated to these new policies may also have contributed to this significant increase in aviation incidents, further work is needed to ascertain the specifics of this possibility. It would also be beneficial to examine the volume and specifics of the aviation incidents that occurred both before and after the SMS policies were instated in order to discern their effectiveness in the various areas available to professionals on the ASRS reports, e.g., "ATC equipment / Navigation Facility / Buildings" and others. Such detailed consideration of these data may better inform aviation personnel of the advantages and disadvantages of these policies, and if the new policies are found to correlate with an increasing number of incidents, it may be appropriate to make further adjustments.
• 60.     51
REFERENCES
Automatic Dependent Surveillance—Broadcast (ADS–B) Out Performance Requirements to Support Air Traffic Control Service (14 CFR Part 91). (2010). Federal Register, 75(103): 30160-30195.
1997-2015 Update to FAA Historical Chronology: Civil Aviation and the Federal Government, 1926-1996. (2016, Jan 5). Federal Aviation Administration online. Retrieved from https://www.faa.gov/about/history/media/final_1997-2015_chronology.pdf
Airports Council International – North America. (2015). 2013 North American Airport Traffic Summary. Retrieved from http://www.aci-na.org/content/airport-traffic-reports
About FAA. (2015, Dec 29). Federal Aviation Administration online. Retrieved from http://www.faa.gov/about/
Advance Notice of Proposed Rulemaking (ANPRM; 14 CFR Parts 21, 119, 121, 125, 135, 141, 142, and 145). (2009, Jul 23). Proposed Rules. Federal Register, 74(140): 36414-36413. Retrieved from https://www.gpo.gov/fdsys/pkg/FR-2009-07-23/pdf/E9-17553.pdf
Air Traffic Organization 2014 Safety Report. (2014). FAA Air Traffic Organization. Retrieved from https://www.faa.gov/about/office_org/headquarters_offices/ato/service_units/safety/media/ato-2014-safety-report.pdf
Air Traffic Organization Policy (Order JO 1000.37A). (2014). FAA Air Traffic Organization. Retrieved from http://www.faa.gov/documentLibrary/media/Order/1000-37A_ATO_Safety_Management_System_508CFINAL.pdf
Air Traffic Organization Policy (Order JO 7200.20). (2011). FAA Air Traffic Organization. Retrieved from http://www.faa.gov/regulations_policies/orders_notices/index.cfm/go/document.information/documentID/322841
Airline Safety and Federal Aviation Administration Extension Act of 2010 (PL 111-216). (2010). Congress.gov. Retrieved from https://www.congress.gov/111/plaws/publ216/PLAW-111publ216.pdf
Airspace Designations and Reporting Points (Order JO 7400.9X). (2013). FAA Air Traffic Organization. Retrieved from http://www.faa.gov/documentLibrary/media/Order/JO_7400.9X.pdf
• 61.     52
ASRS Program Briefing. (2014). Aviation Safety Reporting System, NASA. Retrieved from http://asrs.arc.nasa.gov/docs/ASRS_ProgramBriefing2013.pdf
ASRS: The Case for Confidential Incident Reporting Systems. (n.d.). NASA ASRS, Pub. 60: 1-7. White paper. Retrieved from http://asrs.arc.nasa.gov/docs/rs/60_Case_for_Confidential_Incident_Reporting.pdf
Automatic Dependent Surveillance Broadcast (ADS—B), 14 CFR Part 91. (2010). Federal Register, 75(103): 30160-30195.
Better Quality and More Complete Data Could Help FAA Further Improve Safety Oversight. (2013). GAO Reports, 4-11. Retrieved from http://web.b.ebscohost.com.libaccess.sjlibrary.org/ehost/pdfviewer/pdfviewer?sid=af5a1bdd-9ba4-4494-b14b-74d223452cbf%40sessionmgr111&vid=1&hid=106
Dillingham, G. L., & Rhodes, K. A. (2003). Better Cost Data Could Improve FAA's Management of the Standard Terminal Automation Replacement System. U.S. GAO online. Retrieved from http://www.gao.gov/assets/240/237145.pdf
Executive Order 13180. (2000, Dec 7). Presidential Documents. Federal Register, 65(238): 77493-77494. Retrieved from https://www.gpo.gov/fdsys/pkg/FR-2000-12-11/pdf/00-31697.pdf
FAA Air Traffic Activity System, Airport Operations. (2016). FAA online. Retrieved from http://aspm.faa.gov/opsnet/sys/Airport.asp
FAA Efforts Have Improved Safety, but Challenges Remain in Key Areas. (2013). GAO Reports, 1. Retrieved from http://libaccess.sjlibrary.org/login?url=http://search.ebscohost.com/login.aspx?direct=true&db=bth&AN=87283091&site=ehost-live
FAA Final Rule Requires Safety Management System for Airlines. (2015). Thomas Net News, 1. Retrieved from http://libaccess.sjlibrary.org/login?url=http://search.ebscohost.com/login.aspx?direct=true&db=bwh&AN=101295470&site=ehost-live
FAA Historical Chronology, 1926-1996. (2011, Feb 25). FAA online. Retrieved from https://www.faa.gov/about/media/b-chron.pdf
FAA Requires Data-driven Safety Management Systems. (2015). Air Transport World, 52(2), 12. Retrieved from http://libaccess.sjlibrary.org/login?url=http://search.ebscohost.com/login.aspx?direct=true&db=bft&AN=100963177&site=ehost-live
• 62.     53
Federal Aviation Administration (FAA). (2014, Feb 11). DOT Bureau of Transportation Statistics online. Retrieved from http://www.rita.dot.gov/bts/node/265521
Flener, W. M. (1968, Aug 28). Aeronautical Beacons and True Lights (AC NO: 170/6850-1). DOT National Transportation Library online. Retrieved from http://dotlibrary.specialcollection.net/Document?db=DOT-ADVISORY&query=(select+0+(byhits+(field+DOCUMENT+(phrase+Air+Commerce+Act))))
H.R. 5900. (2010, Jul 28). 111th Congress, 2nd Session. U.S. Government Printing Office online. Retrieved from https://www.gpo.gov/fdsys/pkg/BILLS-111hr5900eh/pdf/BILLS-111hr5900eh.pdf
History. (2015, Feb 19). Federal Aviation Administration online. Retrieved from http://www.faa.gov/about/history/brief_history/
Johnson, W. (2012). SMS Jargon and Collecting Predictive Data. Airport Business, 26(3), 28-30. Retrieved from http://libaccess.sjlibrary.org/login?url=http://search.ebscohost.com/login.aspx?direct=true&db=hjh&AN=71797745&site=ehost-live
Kent, R. J. (1980). Safe, Separated, and Soaring: A History of Federal Civil Aviation Policy, 1961-1972. U.S. DOT: Federal Aviation Administration.
Komons, N. A. (1978). Bonfires to Beacons: Federal Civil Aviation Policy Under the Air Commerce Act, 1926-1938. U.S. DOT: Federal Aviation Administration.
Larson, G. C. (2010). The Dubious Dawning of SMS. Business & Commercial Aviation, 106(10), 78. Retrieved from http://libaccess.sjlibrary.org/login?url=http://search.ebscohost.com/login.aspx?direct=true&db=bth&AN=55602109&site=ehost-live
Lercel, D. J. (2013). Safety management systems in FAA part 145 repair stations: Barriers and opportunities (Order No. 3587351). Available from ProQuest Dissertations & Theses Full Text: The Humanities and Social Sciences Collection. (1426441281). Retrieved from http://search.proquest.com.libaccess.sjlibrary.org/docview/1426441281?accountid=10361
McNeely, S. C. (2012). Examining the relationship between organizational safety and culture and safety management system implementation in aviation (Order No. 3504812). Available from ProQuest Dissertations & Theses Full Text: The Humanities and Social Sciences Collection; ProQuest Dissertations & Theses Full Text: The Sciences and Engineering Collection. (1002445201). Retrieved from http://search.proquest.com.libaccess.sjlibrary.org/docview/1002445201?accountid=10361
• 63.     54
Murphy, R. P. (1999). Whether the Airport and Airway Trust Fund was created solely to finance aviation infrastructure, Letter to The Honorable Frank R. Wolf, Chairman, Subcommittee on Transportation and Related Agencies. U.S. General Accounting Office. Retrieved from http://avstop.com/history/needregulations/281779.pdf
NASA Aviation Safety Reporting System, General Form. (2009). NASA.gov. Retrieved from https://titan-server.arc.nasa.gov/HTML_ERS/general.html
Planning Glossary. (2012, Mar 21). DOT Federal Highway Administration online. Retrieved from http://www.fhwa.dot.gov/planning/glossary/glossary_listing.cfm?TitleStart=F
Pierobon, M. (2015). Growing influence. Asian Aviation Magazine, 13(9), 30-32.
Preston, E. (1987). Troubled Passage: The Federal Aviation Administration During the Nixon-Ford Term, 1973-1977. U.S. DOT: Federal Aviation Administration.
Procedures for Handling Airspace Matters (Order JO 7400.2J). (2012). FAA Air Traffic Organization. Retrieved from http://www.faa.gov/documentLibrary/media/Order/AIR.pdf
Rochester, S. I. (1976). Takeoff at Mid-Century: Federal Civil Aviation Policy in the Eisenhower Years, 1953-1961. U.S. DOT: Federal Aviation Administration.
SMS Quick Reference Guide. (2015, March). FAA online. Retrieved from http://www.faa.gov/about/office_org/headquarters_offices/avs/offices/afs/afs900/sms/media/newsletter/sms_qr_guide.pdf
Safety Management. (2012). FAA Air Traffic Organization. Retrieved from http://energy.gov/sites/prod/files/2013/12/f5/DeNicuolo.pdf
Safety Management System (SMS), Manual Version 4.0. (2014). FAA Air Traffic Organization. Retrieved from https://www.faa.gov/air_traffic/publications/media/faa_ato_SMS_manual_v4_20140901.pdf
Safety Management Systems Update (N 8900.133). (2010). Retrieved from http://fsims.faa.gov/PICDetail.aspx?docId=F7720BB569E0804A862577930062BC2E
Swickard, J. (2010). FAA begins safety management system for national airspace. Business & Commercial Aviation, 106(5), 21. Retrieved from http://libaccess.sjlibrary.org/login?url=http://search.ebscohost.com/login.aspx?direct=true&db=bth&AN=51672404&site=ehost-live
• 64.     55
Title 49: Transportation. (2016). U.S. Government Publishing Office Online (U.S. Department of Transportation, Code of Federal Regulations, Title 49.B.VIII.830.A §830.2). Retrieved from http://www.ecfr.gov/cgi-bin/text-idx?SID=8307dfccbf197a2501ae5353856b7c44&mc=true&node=se49.7.830_12&rgn=div8
Wilson, J. R. M. (1979). Turbulence Aloft: The Civil Aeronautics Administration Amid Wars and Rumors of Wars, 1938-1953. U.S. DOT: Federal Aviation Administration.
• 65.     56
APPENDICES
Appendix I
Table A
Total Movements for All Class B Airports
Mvmts in 2014   IATA Airport Code   Mvmts in 2015
878,366   ATL   875,606
368,288   BOS   370,429
248,249   BWI   244,748
152,665   CLE   116,496
546,747   CLT   545,658
133,507   CVG   133,656
287,731   DCA   295,103
579,335   DEN   554,314
672,891   DFW   684,595
402,405   DTW   381,403
404,435   EWR   413,382
206,291   HOU   199,987
302,634   HNL   315,841
320,554   IAD   300,528
503,529   IAH   508,853
423,613   JFK   441,127
522,588   LAS   520,739
632,485   LAX   645,467
370,132   LGA   369,773
295,120   MCO   309,670
68,796    ADW   65,312
220,989   MEM   219,146
400,700   MIA   407,271
419,789   MSP   405,279
880,061   ORD   881,128
127,299   MCI   121,612
419,831   PHL   413,437
431,416   PHX   437,771
136,322   PIT   139,899
188,670   SAN   195,003
330,809   SEA   368,197
431,710   SFO   429,504
324,878   SLC   317,085
184,142   STL   185,430
184,036   TPA   189,313
153,434   NKX   180,696
126,089   MSY   128,703
Source: FAA Air Traffic Activity System, Airport Operations (2016). FAA online. Retrieved from http://aspm.faa.gov/opsnet/sys/Airport.asp
• 66.     57
Appendix II
Incident Report Example with ATC as a Contributing Factor
Assessments
Contributing Factors / Situations : Weather
Contributing Factors / Situations : Human Factors
Contributing Factors / Situations : Equipment / Tooling
Contributing Factors / Situations : ATC Equipment / Nav Facility / Buildings
Primary Problem : Equipment / Tooling
ACN: 1280574
Time / Day
Date : 201507
Local Time Of Day : 1801-2400
Place
Locale Reference.Airport : LAX.Airport
State Reference : CA
Altitude.MSL.Single Value : 100
Environment
Flight Conditions : IMC
Light : Night
Aircraft
Reference : X
ATC / Advisory.Tower : LAX
Aircraft Operator : Air Carrier
Make Model Name : B737 Undifferentiated or Other Model
Crew Size.Number Of Crew : 2
Operating Under FAR Part : Part 121
Flight Plan : IFR
Mission : Passenger
Flight Phase : Landing
Person
Reference : 1
Location Of Person.Aircraft : X
Location In Aircraft : Flight Deck
Reporter Organization : Air Carrier
Function.Flight Crew : First Officer
• 67.     58
Function.Flight Crew : Pilot Flying
Qualification.Flight Crew : Air Transport Pilot (ATP)
Experience.Flight Crew.Last 90 Days : 151
Experience.Flight Crew.Type : 1694
ASRS Report Number.Accession Number : 1280574
Human Factors : Distraction
Events
Anomaly.Inflight Event / Encounter : Other / Unknown
Detector.Person : Flight Crew
When Detected : In-flight
Result.Flight Crew : Returned to Clearance
Assessments
Contributing Factors / Situations : Weather
Contributing Factors / Situations : Human Factors
Contributing Factors / Situations : Equipment / Tooling
Contributing Factors / Situations : ATC Equipment / Nav Facility / Buildings
Primary Problem : Equipment / Tooling
Narrative: 1
During night flight and after break out at 200 feet and 1/2 mile or better, the new LED runway lights at 25L were so bright that the actual runway could not be seen normally. To clarify, the approach lights and runway outer marking lights may be bright, but the newer, high intensity runway centerline and landing threshold LED lights are far too bright for safe operation, and the high intensity, especially during hazy or foggy visibility conditions, makes it very difficult to actually see the runway itself.
Synopsis
Pilot reports of LED that are so bright that it was difficult to see the actual runway.
Incident Report Example with ATC as a Primary Problem
Assessments
Contributing Factors / Situations : ATC Equipment / Nav Facility / Buildings
Primary Problem : ATC Equipment / Nav Facility / Buildings
• 68.     59
ACN: 1248258
Time / Day
Date : 201503
Local Time Of Day : 1201-1800
Place
Locale Reference.Airport : LAX.Airport
State Reference : CA
Altitude.AGL.Single Value : 0
Aircraft
Reference : X
ATC / Advisory.Tower : LAX
Make Model Name : Light Transport, Low Wing, 2 Turboprop Eng.
Crew Size.Number Of Crew : 2
Flight Plan : IFR
Flight Phase : Taxi
Route In Use : None
Airspace.Class B : LAX
Person
Reference : 1
Location Of Person.Facility : LAX.TOWER
Reporter Organization : Government
Function.Air Traffic Control : Local
Qualification.Air Traffic Control : Fully Certified
Experience.Air Traffic Control.Time Certified In Pos 1 (yrs) : 6
ASRS Report Number.Accession Number : 1248258
Human Factors : Confusion
Human Factors : Human-Machine Interface
Human Factors : Situational Awareness
Human Factors : Troubleshooting
Human Factors : Distraction
Events
Anomaly.ATC Issue : All Types
Anomaly.Deviation - Procedural : Other / Unknown
Detector.Person : Air Traffic Control
When Detected : Taxi
Assessments
Contributing Factors / Situations : ATC Equipment / Nav Facility / Buildings
Primary Problem : ATC Equipment / Nav Facility / Buildings