ASRS Directline

Issue Number 8: June 1996

ASRS Database Statistics, 1994

ASRS codes descriptive characteristics of every report it receives and places that information in a computerized database. We code the function of the person who submitted the report; the place and time of the reported incident; and the descriptive nature of the occurrence. Following are 13 pages of graphs and statistics portraying these and other data.

Time Frame

The data presented are for two specific time periods--a 1-year period from January 1994 through December 1994, and a 7-year period from January 1988 through December 1994. The reader will see that, with few exceptions, the 1-year and 7-year percentage distributions are remarkably similar.

Relationship of ASRS Data to All Aviation Incidents

ASRS reports are voluntarily submitted and are not obtained through a statistically valid sampling process. Thus, the ASRS cannot specify the relationship between its database and the total volume of aviation safety incidents that occur, nor can it say with certainty that this relationship has remained fixed over time. This is known as the self-reporting bias problem.

However, the ASRS can say with certainty that its database provides definitive lower-bound estimates of the frequencies at which various types of aviation safety events actually occur. For example, 34,404 altitude overshoots were reported to the ASRS from January 1988 through December 1994. It can be confidently concluded that at least this number of overshoots occurred during the 1988-94 period--and probably many more. Often, such lower-bound estimates are all that decision makers need to determine that a problem exists and requires attention.

Known Biases

We are aware of two prominent factors that bias ASRS statistical data. The first is the relatively high number of reports received from pilots (currently about 96 percent of ASRS report intake) versus controllers (roughly 3 percent). This imbalance causes the ASRS database to have many more records describing pilot errors (altitude deviations, runway transgressions, etc.) than controller errors (operational errors, coordination failures, etc.).

The second biasing factor is the computerized error detection capabilities at FAA Air Route Traffic Control Centers (ARTCCs). These are very effective at capturing altitude and track deviations that result in a loss of aircraft separation. Thus, the ASRS receives disproportionately large numbers of reports describing these kinds of events, mostly from pilots.

Number of Reports vs. Number of Incidents

Many incidents are reported by more than one individual. For example, an incident may be reported by a pilot and a controller, several pilots and several controllers, the entire flight crew of a given aircraft, and pilots of more than one aircraft. In 1994, ASRS received 32,272 reports describing 26,413 unique incidents; thus, 5,859 reports were "secondary," in that they described incidents which had already been reported to the ASRS.
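The report-versus-incident distinction amounts to deduplicating reports on a shared incident identifier. A minimal sketch (the incident IDs and counts below are invented for illustration, not ASRS data):

```python
# Illustrative sketch only: incident IDs are invented, not ASRS data.
# Each report references the incident it describes; several reports
# may describe the same incident.
reports = ["I-001", "I-002", "I-002", "I-003", "I-001"]

unique_incidents = len(set(reports))                  # distinct incidents described
secondary_reports = len(reports) - unique_incidents   # duplicates of earlier reports

print(unique_incidents, secondary_reports)
```

Applied to the 1994 figures above, the same arithmetic gives 32,272 reports minus 26,413 unique incidents, or 5,859 secondary reports.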

Total and Percent Distributions

Multiple entries are permitted in many of the data fields coded by ASRS analysts. For example, an altitude bust that resulted in a loss of standard separation would be coded in the Anomaly field as an altitude deviation, an airborne conflict, and an ATC clearance violation. While this is the most accurate way of coding events, it means that incidents do not fall into neat, mutually exclusive categories that always add up to 100 percent. Moreover, it is not unusual for selected data fields to be left blank during coding, either because needed information is not available, or because the field is not deemed relevant to a particular report. This presents an added complication when incidents are totaled and percent distributions are calculated.

The first chart in the following pages shows the number of unique incidents reported to the ASRS over the past 7 years. This provides a baseline for interpreting data in succeeding charts which characterize the time, location, and other aspects of the reported incidents. The data in these latter tables are presented in a consistent format that provides for unknown or inapplicable data, and for cases in which more than one category applies. An example is shown below in the hypothetical table.

In this example, incident records are categorized as A, B, or C. Any incident may be placed in one, two, or even three of these categories. If categories A, B, and C are simply added together, incidents that are recorded in more than one category will be double-counted in the "Total Row." Since double-counting is usually unwanted in summations, the totals have been adjusted to eliminate double-counted events. The results are presented in the row entitled Total Unique Incidents.

Thus, in the Hypothetical Example Table, a total of 165 incidents were reported during the current time period. This is the Incident Base for that period. Out of the Incident Base, 127 unique events fell into categories A, B, or C, or some combination of these categories. The remaining 38 incidents did not fit any of the categories, or there were insufficient data to classify them. These are shown in the Inapplicable or Unknown row.
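The double-counting adjustment can be sketched by tracking each category's membership as a set of incident IDs, so that the union counts every incident exactly once. All IDs, memberships, and the incident base below are invented for illustration:

```python
# Hypothetical sketch of the double-counting adjustment; category
# memberships and the incident base are invented, not ASRS data.
categories = {
    "A": {1, 2, 3, 4},
    "B": {3, 4, 5},
    "C": {4, 6},
}
incident_base = 8  # all incidents reported in the period (IDs 1 through 8)

# A simple sum double-counts incidents 3 and 4, which fall into
# more than one category.
simple_sum = sum(len(ids) for ids in categories.values())   # 9

# The union counts each incident once: Total Unique Incidents.
total_unique = len(set().union(*categories.values()))       # 6

# Incidents fitting no category, or lacking the data to classify.
inapplicable_or_unknown = incident_base - total_unique      # 2
```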

Because the number of Total Unique Incidents varies from table to table, we decided to use the Incident Base to calculate percent distributions for all data sets. By calculating the percentages in this manner, we created a common yardstick which can be used to compare the data presented in the various charts.

Finally, all of the percentages shown were rounded to whole numbers. In those cases where the number of relevant incidents is very small (less than one-half of one percent of the Incident Base), the percentages round down to, and are presented as, zero percent. Similarly, in those cases where the number of reports in a category exceeds 99.5 percent of the Incident Base, the result was rounded up to, and is presented as, 100 percent.
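A minimal sketch of this rounding convention (the helper name and sample counts are assumptions, not from the article): each category count is divided by the Incident Base and rounded to a whole percent, so very small counts display as 0 and near-complete counts as 100.

```python
def percent_of_base(count, incident_base):
    # Whole-percent share of the Incident Base, following the
    # rounding convention described in the article.
    return round(100 * count / incident_base)

# Illustrative values only, not ASRS data.
print(percent_of_base(1, 400))    # well under 0.5% of the base
print(percent_of_base(399, 400))  # over 99.5% of the base
print(percent_of_base(38, 165))   # about 23.03% of the base
```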

Hypothetical Example Table

Index To Statistical Charts

Figure Chart
1 Year of Occurrence
2 Reporting Sources
3 Month of Occurrence
4 Weekday of Occurrence
5 Time of Day of Occurrence
6 Involved Facilities
7 Involved Airspaces
8 Anomalies (Top-Level Categorizations)
9 Airborne Spatial Deviations & Conflicts
10 Ground Incidents
11 Non-Adherence to Rules & Requirements
12 Other Aircraft Anomalies
13 ATC Handling Anomalies


Reproduction and redistribution of ASRS Directline articles are not only permitted--they are encouraged. We ask that you give attribution to ASRS Directline, to the Aviation Safety Reporting System (ASRS), and of course, to the authors of each article.