
6 The Redacted Data and Their Limitations

Task 4 of the charge to the committee asks for “an analysis of the NAOMS project survey data provided by NASA to determine its potential utility.” This chapter focuses on the limitations of the redacted data that have been released to the public.

To maintain the confidentiality of the survey participants, NASA released only redacted versions of the survey data. Two versions of these redacted data are currently available: Phase 1 was released on December 31, 2007 (with an update, Phase 1a, on February 6, 2008); Phase 2 was released on September 30, 2008. NASA has not officially released any analyses based on the NAOMS survey data.

Only data from the air carrier (AC) survey are discussed here, because the survey of general aviation (GA) pilots lasted for only 9 months. Nevertheless, a number of the conclusions made here, such as those on the limitations of the redacted data and data quality, will also apply to the survey results from GA pilots.

6.1 PHASE 1 AND 1a REDACTIONS

NASA characterized the initial release of data in Phase 1 as “conservative to ensure the responses do not contain confidential commercial information or information that could compromise the anonymity of individual pilots.”1 The strategies for redaction included the reordering, generalization, disaggregation, deletion, and/or editing of the survey responses. In particular, the recall period and recall date were removed, and the legs flown were disaggregated from the majority of responses. The structure of the released data is discussed below.

Because Phase 1a was a relatively minor update of the release in Phase 1, it is not discussed here. (The main point of this update was the reclassification of 407 AC survey responses that had been filed with the GA responses. These respondents were contacted during the screening for the GA survey, but they actually were air carrier pilots and so were given the AC questionnaire instead of the GA questionnaire. Phase 1a also included an additional 701 GA rotorcraft responses.)

A detailed description of Phase 1 data and the modifications follows:

1. NASA, National Aviation Operations Monitoring Service (NAOMS) Information Release, available at http://www.nasa.gov/news/reports/NAOMS.html, accessed April 14, 2008; see section headed “December 31, 2007.”

  • Section A—Pilot background questions and relevant exposure information. The original responses from each pilot are categorized into the following groups:

    1. Time of Interview by Year (4 levels): 2001, 2002, 2003, 2004.

    2. Flight Hours (5 levels): Less than 51, 51-90, 91-130, 131-170, Greater than 170 (hours)

    3. Aircraft size (4 levels): Small or Other, Medium, Large, and Wide-body, with only the aircraft that was flown the most being reported

    4. Propulsion (2 levels): Turbofan and Turboprop or Other

    5. Flight Type (3 levels): Domestic, International, and Unknown

    6. Crew Role (2 levels): Captain and First Officer or Other

    7. Amount of Pilot Experience (3 levels): Low, Medium, and High

    8. Mission type (2 levels): Passenger and Cargo or Other

Phase 1 provides pilot responses on flight hours (item 2) but not on flight legs (see Section 6.2).

  • Section B—Counts for safety events. The responses are provided in their original raw form except for the following modifications:

    1. Responses that were considered to be rare (occurring in less than 0.1 percent of the surveys) were removed and were given in separate tables without linking them to the pilot’s responses for other events.

    2. “High unique” response values were replaced with the next closest numerical value in that field. “High unique response” denotes cases where a pilot’s response for an event count was unusually high in comparison to the responses of the other survey participants.

  • Section C—Responses to special questions concerning baseline performance measures and “in-close” changes to approach and landing. High unique and rare counts are redacted in the same manner as in Section B. Free-text responses are aggregated into a separate file.

  • Section D—Pilot feedback on the questionnaire. The numerical values are reported with free-text responses again aggregated into a separate file.

In addition, separate files provide partial raw responses for aircraft type flown, the complete set of hours and legs, and career hours flown.

Other features of Phase 1 redaction include the following:

  • Individual pilot responses in Sections A and B are linked by means of a uniquely assigned Random Identification Number. This permits the examination of all the responses from a particular pilot in the analysis (see the sketch following this list).

  • The main release was cleaned, with those rows having missing data or outliers reported in separate files.2
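
To make the linkage concrete, the following is a minimal sketch in Python (using pandas) of how an analyst might join the Phase 1 Section A and Section B tables on the Random Identification Number. The file names and column labels (section_a.csv, section_b.csv, RANDOM_ID, AIRCRAFT_SIZE, AC1_BIRD_STRIKES) are hypothetical placeholders, not the labels used in NASA's release.

```python
import pandas as pd

# Hypothetical file and column names; NASA's Phase 1 spreadsheets use their own labels.
section_a = pd.read_csv("section_a.csv")   # pilot background and exposure categories
section_b = pd.read_csv("section_b.csv")   # safety-event counts (AC1, AC2, ...)

# Join all of a pilot's responses through the uniquely assigned Random Identification Number.
linked = section_a.merge(section_b, on="RANDOM_ID", how="inner", validate="one_to_one")

# The linked table lets event counts be examined against Section A categories,
# e.g., the distribution of reported bird-strike counts by aircraft-size category.
print(linked.groupby("AIRCRAFT_SIZE")["AC1_BIRD_STRIKES"].describe())
```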

It appears that the redaction strategies were developed primarily with confidentiality issues in mind. The following comments discuss the advantages and disadvantages from the viewpoint of information value in the redacted data.

2. “Prior to the recent redaction steps taken, NAOMS air carrier survey responses were evaluated by Battelle at two stages. During initial processing, Battelle refined the set of survey responses using a technique called the Chebyshev process and related criteria to remove 322 responses of doubtful quality to avoid contaminating analyses of the responses. Battelle cites the following specific reasons for their removal: 1) number of flight hours too small; 2) unreasonable ratio of hours-to-legs; 3) unreasonable responses to multiple questions; and/or 4) Section B not completed. These responses are provided below and are identified as the ‘Outlier Survey Responses.’ A second refinement was then made by Battelle during subsequent tabulation activities when the NAOMS project team sought a set of responses with all explanatory flight activity variables present (no null values in Section A fields). This was done to ensure that all tabulations on the responses based on flight activity fields had a consistent total. This resulted in an additional 335 responses being removed. These responses are provided below, identified as the ‘Survey Responses with Unknowns in Flight Activity Fields.’” NASA, NAOMS Air Carrier Survey Responses, available at http://erc.ivv.nasa.gov/news/reports/NAOMS_air_carrier_survey_data.html, accessed June 11, 2008.


There are several good features in the Phase 1 redacted data:

  • The ability to link responses across sections so that all the numerical responses can be traced to a particular (anonymous) pilot during analysis;

  • Post-processing of data to remove outliers before release (although the committee cannot comment on the validity of the method); and

  • A separate file with actual (not categorized) flight hours, flight legs, and primary aircraft type for all respondents.

However, there are also several deficiencies in the Phase 1 redacted data:

  • The grouping of the time of response into years is too coarse to permit sensitive analyses of trends over time.

  • Grouping the data on “number of hours flown” reduces the ability to calculate event rates by the different explanatory variables, such as type of aircraft, pilot experience, and so on.

  • The lack of information on the number of flight legs flown is a serious limitation of this release. Flight legs are the right exposure variable for several event types, and one cannot compute event rates for those events without this information.

  • There is no way to judge the effects of the modification that was used to adjust for rare or high unique events.

  • No information was provided on whether the NAOMS team made any attempts to identify causes for “outliers” that were removed. Removing outliers without assignable causes can lead to biased analyses that depend on the thresholds used to identify the outliers.
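
A toy sketch (with made-up numbers, not NAOMS responses) of why the choice of trimming threshold matters when outlier responses are removed without an assignable cause:

```python
# Toy illustration (made-up numbers) of threshold-dependent outlier removal.
# Each pair is (reported event count, hours flown in the recall period).
responses = [(0, 100), (1, 95), (0, 110), (2, 90), (0, 105), (0, 100), (3, 98), (40, 102)]

for threshold in (5, 50):                       # two candidate cutoffs for "implausibly high"
    kept = [(c, h) for c, h in responses if c <= threshold]
    rate = sum(c for c, _ in kept) / sum(h for _, h in kept)
    print(threshold, round(rate, 4))            # 0.0086 vs. 0.0575 events per hour
```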

These issues and their consequences are discussed in more detail in Section 6.3.

6.2 PHASE 2 REDACTION

As of the release of this report, Phase 2 is the most recent release of redacted NAOMS survey data and documents. The intent of this version was to release “the maximum amount of survey information”3 without compromising (or at least while minimizing the threat of compromising) the anonymity of the pilots or releasing confidential commercial information. Data on all 26,168 records were released. This release took a very different form from that of Phase 1: rather than relatively few spreadsheets, it comprised more than 100 separate files. The tables included responses that had been deleted from or separated out of the Phase 1 release, such as incomplete survey responses, rare events, and high unique events.

The structure of the Phase 2 data is as follows. For each event in Section B of both questionnaires (for example, AC1 = number of bird strikes), a file with the following information is provided:

  • Column 1: ID number: 1, 2, … , 26,168 (in random order that varies across the files for different events and hence cannot be linked).

  • Column 2: Time of survey grouped into 4 years: 2001, 2002, 2003, 2004.

  • Column 3: Number of hours flown in recall period, grouped into six categories: less than 46, 46-70, 71-100, 101-120, 121-150, greater than 150 (this is a different grouping from the one used in Phase 1).

  • Column 4: Number of flight legs flown in recall period, grouped into five categories: less than 14, 14-22, 23-36, 37-60, greater than 60 (this information was not provided in Phase 1).

  • Column 5: Aircraft type, grouped into 34 categories (more categories than in Phase 1).

  • Column 6: Number of events (say, bird strikes) reported by the pilot as having occurred during recall period (data were not modified for rare and unique events as in Phase 1).

3. Ron Colantonio, NASA Glenn Research Center, “National Aviation Operations Monitoring Service (NAOMS) 2008 Information Release Project,” presentation to the NRC Committee on NASA’s NAOMS Project, October 13, 2008, slide 7.


A separate file with the actual number of hours and number of flight legs flown by year for each respondent is provided, so the joint distribution of these two responses is available.
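
To illustrate how this structure constrains analysis, the sketch below (Python with pandas, using hypothetical file and column names such as event_AC1.csv, YEAR, LEGS_CAT, and EVENT_COUNT) loads a single per-event file and tabulates reported counts by year and flight-legs category; because the ID numbers are randomized independently in each file, no analogous merge across event files is possible.

```python
import pandas as pd

# Hypothetical layout of one Phase 2 per-event file (say, bird strikes, question AC1).
# Columns assumed: ID, YEAR, HOURS_CAT, LEGS_CAT, AIRCRAFT_TYPE, EVENT_COUNT.
ac1 = pd.read_csv("event_AC1.csv")

# Total reported events by survey year and flight-legs category.
events_by_cell = (
    ac1.groupby(["YEAR", "LEGS_CAT"])["EVENT_COUNT"]
       .sum()
       .unstack("LEGS_CAT")
)
print(events_by_cell)

# The ID column is randomized separately in every event file, so there is no key
# on which to merge event_AC1.csv with, say, event_AC2.csv for the same pilot.
```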

The advantages of Phase 2 redaction over that of Phase 1, from the viewpoint of information value, include the following:

  • No redaction of Section B events;

  • Availability of data for both exposure variables—hours (six categories) and flight legs (five categories)—for each safety event, as well as primary aircraft type; and

  • Grouping of aircraft type into finer levels (34 categories).

However, there are also many problems, some of which are in common with Phase 1:

  • The time of the survey was grouped into years rather than on a finer scale; the problem with this coarse grouping was discussed previously for the Phase 1 data.

  • Exposure (in flight legs and hours) is coarsely categorized; the categories for hours do not correspond to those for Phase 1, which was apparently deliberate.

  • Information for each safety event is given in separate files that cannot be linked. This was done to address the privacy/confidentiality concerns, but it does not allow the user to link information across multiple safety events—for example, if a single pilot reported multiple event types. This information would be especially useful in detecting aberrant data points.

  • Some of the aircraft do not fit clearly into one of the 34 categories. (Even the pilots on the committee could not map all of the aircraft into the 34 categories consistently.) For example, B707 is not a category, but the aircraft does not really fit into any of the existing 34 categories.

  • Respondents who “refused to answer” or “did not know” and fields with “missing information” are coded as (999, 998, 997), (99, 98, 97), or (9, 8, 7), depending on the question. Because 9, 8, and 7 are also possible response values, a missing-data code of 9, 8, or 7 cannot be distinguished from a reported count of 9, 8, or 7 events. The survey questionnaires include the codes used for each question,4 but cross-referencing each response in the data set against each question in the survey is impractical.
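
A small sketch of the ambiguity follows; the question labels and code sets in it are hypothetical, and the point is simply that recoding must be driven by a per-question code table taken from the questionnaire rather than by the values alone.

```python
# Hypothetical illustration of the coding ambiguity; the values are not NAOMS data.
raw_responses = [0, 2, 7, 9, 12]

# Naive rule: treat 7, 8, and 9 as "don't know / refused / missing" everywhere.
MISSING_CODES = {7, 8, 9}
naive = [None if r in MISSING_CODES else r for r in raw_responses]
print(naive)   # [0, 2, None, None, 12] -- but 7 and 9 may be genuine event counts

# The safer approach is a per-question lookup built from the questionnaire, which
# specifies whether a question uses (999, 998, 997), (99, 98, 97), or (9, 8, 7).
codes_by_question = {"AC1": {999, 998, 997}, "AC7": {9, 8, 7}}   # hypothetical entries

def recode(question, value):
    return None if value in codes_by_question.get(question, set()) else value

print(recode("AC7", 9), recode("AC1", 9))   # None 9
```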

6.3 LIMITATIONS OF THE REDACTED DATA

6.3.1 Effect of Grouping Time of Survey into Years

The categorizing of the time of the survey response by years in the redacted data of the NAOMS survey is too coarse to be very useful. The problem is exacerbated by the fact that pilots are recalling events that occurred within a recall period ranging from 30 to 90 days. So a non-negligible proportion of the events reported in one year (say, 2003) could actually have occurred in the previous year (2002): with a 60-day recall period, roughly 1/6 of the responses assigned to a year are based on recall windows that reach back into the preceding year. Such a level of ambiguity with respect to the time of the events makes it difficult to analyze the redacted data to achieve the objectives of NAOMS, which include (1) determining the effect of changes to airline safety procedures from the data and (2) tracking changes over time. For example, a seasonal trend in an event type (such as more bird strikes in the summer) cannot be captured from yearly data.
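
The 1/6 figure can be checked with a back-of-the-envelope calculation, under the simplifying assumptions that interview dates are spread uniformly over a 365-day year and that the recall window is 60 days ending on the interview date:

```python
# Back-of-the-envelope check of the "about 1/6" figure quoted above.
# Assumptions (illustrative only): interview dates uniform over a 365-day year,
# with a 60-day recall window ending on the interview date.
RECALL_DAYS = 60
YEAR_DAYS = 365

# A recall window reaches back into the previous calendar year whenever the
# interview falls within the first RECALL_DAYS of the year.
share_spanning_boundary = RECALL_DAYS / YEAR_DAYS
print(f"{share_spanning_boundary:.3f}")   # 0.164, i.e., roughly 1/6 of responses
```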

Obviously, there is a trade-off between retaining information for data analysis and maintaining the confidentiality of pilots, and there has been considerable work in this area.5 Since the committee does not have access to the original data, it cannot comment on the appropriate degree of categorization to achieve the best trade-off. From a data analysis perspective, however, grouping by years is much too coarse and substantially undermines the goal of providing public data for independent analysis.

4. NASA, National Aviation Operations Monitoring Service (NAOMS) Phase 2 Information Release Survey Response Disclaimer, Washington, D.C., September 30, 2008, p. 2, available at http://www.nasa.gov/pdf/279939main_Phase%202%20Release%20Summary%20092408_Final_2(508).pdf, accessed July 20, 2009.

5. See American Statistical Association, Privacy and Confidentiality, Alexandria, Va., 2003, and references therein, available at http://www.amstat.org/committees/cmtepc/index.cfm?fuseaction=main, accessed July 15, 2009.



6.3.2 Effect of Grouping the Number of Hours and Number of Flight Legs Flown

The number of hours flown and the number of flight legs flown are the primary measures of exposure for calculating event rates. Depending on the type of safety event, one or the other is the right denominator for calculating event rates. In the absence of additional information on the categorization, one would typically use a surrogate value (such as the midpoint of the interval) as a proxy for all of the values in that interval. However, the redacted data provide the raw numbers of hours and flight legs flown in a separate file. This allows one to compute the total numbers of hours and flight legs flown, and hence the event rates, for each category exactly: the total number of events in a category divided by the total number of hours or flight legs flown in that category.
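
A minimal sketch of this exact calculation, assuming hypothetical file and column names (event_AC1.csv, raw_hours_legs.csv, HOURS, HOURS_CAT, EVENT_COUNT) and that the category labels are coded consistently across files:

```python
import pandas as pd

# Illustrative sketch; file names, column names, and category labels are placeholders.
events = pd.read_csv("event_AC1.csv")         # per-event file: HOURS_CAT, EVENT_COUNT, ...
exposure = pd.read_csv("raw_hours_legs.csv")  # separate file with actual HOURS per respondent

# Reconstruct the Phase 2 hours categories from the raw values and total the hours per category.
bins = [-1, 45, 70, 100, 120, 150, float("inf")]
labels = ["<46", "46-70", "71-100", "101-120", "121-150", ">150"]   # must match the release coding
exposure["HOURS_CAT"] = pd.cut(exposure["HOURS"], bins=bins, labels=labels)
hours_per_cat = exposure.groupby("HOURS_CAT", observed=True)["HOURS"].sum()

# Total reported events per hours category, then the exact per-hour event rate by category.
events_per_cat = events.groupby("HOURS_CAT")["EVENT_COUNT"].sum()
print(events_per_cat / hours_per_cat)
```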

However, it is not possible to compute the total numbers of hours or flight legs flown by various subpopulations of interest from the redacted data. Consider the two subpopulations of AC pilots corresponding to domestic flights and to international flights. The redacted data in Phase 2 do not provide the total numbers of hours and flight legs flown for these two subpopulations, so one cannot compute the event rates. Various types of approximations are possible: for example, use a surrogate such as the midpoint of a category for the actual but unknown values in that category. However, all of these approximations will introduce additional uncertainty into the estimates. There are other subpopulations, such as the experience level of pilots and aircraft type, for which event rates would also be of interest but cannot be computed from the redacted data.
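
The midpoint-surrogate approximation can be sketched as follows; the midpoints and column names are illustrative, and the open-ended top category has no true midpoint, so any value assigned to it is an assumption and one source of the additional uncertainty noted above.

```python
import pandas as pd

# Midpoint surrogates for the Phase 2 hours categories (illustrative values);
# the open-ended top category has no true midpoint, so 175 is purely an assumption.
HOURS_MIDPOINT = {"<46": 23.0, "46-70": 58.0, "71-100": 85.5,
                  "101-120": 110.5, "121-150": 135.5, ">150": 175.0}

def approx_rate(records: pd.DataFrame) -> float:
    """Approximate events per flight hour for a subset of records, using category midpoints."""
    approx_hours = records["HOURS_CAT"].map(HOURS_MIDPOINT).sum()
    return records["EVENT_COUNT"].sum() / approx_hours

# Toy records (not NAOMS data) standing in for a subpopulation that can be identified in a
# given release, e.g., domestic flights in Phase 1; Phase 2 per-event files carry no such flag.
toy = pd.DataFrame({"HOURS_CAT": ["46-70", ">150", "71-100"], "EVENT_COUNT": [1, 0, 2]})
print(approx_rate(toy))   # events per (approximate) flight hour
```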

Phase 2 provides the total numbers of hours and flight legs flown for one particular subpopulation—aircraft make and model. Similar tables could easily have been provided for other, selected subpopulations without any danger of sacrificing respondents’ confidentiality. It appears that the data redaction efforts did not fully take into consideration how the data would be analyzed and used.

6.3.3 Information on Size of Aircraft Flown

The redacted survey data provide the size of only one (primary) aircraft for each respondent: the aircraft that the pilot reported as having flown the most, grouped into four categories in Phase 1. This raises the possibility that the events reported may not be traceable to the size of the aircraft on which they occurred. However, Table 6.1 shows that about 75 percent of the pilots reported flying 100 percent of the time in one aircraft. Even for the remaining 25 percent, it is likely that some fraction of them would have flown aircraft of the same size. Thus, the redaction does not appear to pose a serious problem for estimating event rates by aircraft size.

6.3.4 Grouping of Other Variables

Some other variables also exhibited the “most often” limitation. For example, a pilot is reported as flying “domestic” if that is what that pilot mainly flies. If most pilots fly either domestic or international but rarely both, the effect of this simplification (to just “domestic” or “international”) would not be serious.

TABLE 6.1 Percentage of Time That Pilots Reported Flying in a Primary Aircraft and Corresponding Proportion of Respondents

  Percentage of Time in Primary Aircraft     Proportion of Respondents
  100%                                       0.743
  90% to less than 100%                      0.054
  75% to less than 90%                       0.070
  50% to less than 75%                       0.131
  Less than 50%                              0.002


6.3.5 Modification of Counts of Rare and High Unique Events in Phase 1

Many of the event types in the NAOMS survey are rare (that is, pilots rarely reported their occurrence). Modifications to the numbers of rare and high unique events in the redaction can therefore severely alter the information in the data. There is no way to know whether the reported non-zero counts referred to genuine events that should not have been removed or were outliers that should be ignored (for example, typographical errors). Replacing the large values with the next closest numerical value leads to an underestimation of the event rates. It is difficult to assess the extent of this problem from the Phase 1 redacted data.
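
A toy illustration (with made-up counts, not NAOMS responses) of how the next-closest-value replacement can bias a rate downward:

```python
# Toy illustration of the Phase 1 "high unique" redaction; the counts are made up.
reported_counts = [0, 0, 1, 0, 2, 0, 0, 15]    # one unusually high report of 15
total_hours = 800.0                            # made-up total exposure for these pilots

# Redaction rule: replace the high unique value with the next closest value in the field (2).
redacted_counts = [0, 0, 1, 0, 2, 0, 0, 2]

rate_original = sum(reported_counts) / total_hours   # 0.0225 events per hour
rate_redacted = sum(redacted_counts) / total_hours   # 0.00625 events per hour
print(rate_original, rate_redacted)

# Whether the redacted figure is a distortion or a correction depends on whether the value
# 15 was a genuine count or an error, which cannot be determined from the redacted data alone.
```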

6.4 DATA ANOMALIES

The data values released in Phase 2 were not cleaned or modified for other unusual features, as they were in Phase 1. An examination of these raw data shows that an unusually high proportion of the numerical values are implausible, both for event counts (numerators) and number of legs/hours flown (denominators). This problem is discussed in more detail in the next chapter.

If the complete record for each respondent were available, it might be possible to detect whether a particular record stands out as an outlier. Specifically, if a respondent consistently reported implausible values for many questions, one might have some justification for removing the record. However, the complete set of responses from each survey is not available in Phase 2; rather it was split into separate files for each event.


Finding: The redacted data from the NAOMS survey have several limitations that further constrain the ability to analyze the data to meet the study objectives. The nature of the redaction differs in its two phases, so the type and severity of the limitations vary. The time of survey response is grouped into years (this feature is common to both phases), so estimates of event rates can be computed only by years. This limits the ability to (1) track the changes in event rates over shorter timescales, (2) determine the effects of changes in the aviation system on event rates, and (3) assess seasonal and similar types of effects. Grouping the exposure data (number of hours/legs flown) into categories increases the uncertainty of the estimates of event rates broken down by key characteristics, such as pilot experience. The separation of records into different files also constrains one’s ability to detect anomalous records and thereby apply methods that could improve data quality.


NASA's release of redacted NAOMS data resulted from a Freedom of Information Act (FOIA) request and from the publicity that followed the denial of that request, a denial made to protect the anonymity and confidentiality of the respondents (which had been promised during the survey). The 1998 briefing to the ASRS Advisory Subcommittee asserted that “Participant Confidentiality is assured…. It will have no means of tracing a survey response to the individual who provided it; neither FOIA nor discovery actions will pose a confidentiality risk to NAOMS.”6 It appears that the NAOMS management team anticipated neither the need to eventually release the survey data to the public nor the consequent problems that would develop. (As some of the project team members noted in presentations to the committee7 and as NASA noted in its response to the GAO report on NAOMS,8 a possible reason is that NAOMS was viewed, at least by some, as primarily a research study for developing a methodology. However, the project’s submission for clearance from the Office of Management and Budget, dated June 12, 2000, clearly characterizes the project as a data-collection effort, not just a research study.9)

It is clear that data analysis, reporting, and other post-survey activities were not adequately planned or properly anticipated. As a result, NASA appears to have rushed into developing after-the-fact redaction strategies for releasing the data. The committee does not know the specific trade-offs that were considered and ultimately made in arriving at the redaction strategies. It can comment only on the severe negative impact of the chosen redaction strategy on the data analysis.

6. NASA, “NAOMS Development and Proof of Concept,” presentation by NAOMS management team to ASRS Advisory Subcommittee, November 13, 1998, slide 15, available at http://www.nasa.gov/pdf/209893main_1998-11-13%20ASRS%20Advisory%20Subcommittee.pdf, accessed July 20, 2009.

7. Connors, presentation to NRC Committee on NAOMS, 2009, p. 1.

8. Government Accountability Office, Aviation Safety, 2009, p. 90.

9. NASA, Request to Conduct Federal Agency Survey, National Aviation Operations Monitoring Service (NAOMS), OMB Number 2700-0102, Washington, D.C., June 12, 2000.



The issue of preserving the privacy and confidentiality of survey participants is not a new problem. It has been studied extensively, and considerable literature was published on the topic even prior to 2000.10 In fact, many federal agencies have faced this problem regularly over the years. The NAOMS study would have benefited considerably if it had anticipated the problem in the planning stage. Consultation with other federal agencies (for example, the U.S. Census Bureau) would have avoided many of the problems, both in releasing the data and in the loss of information content in the released data.


Finding: The issues associated with preserving respondents’ anonymity and confidentiality and with the public release of data should have been anticipated and addressed at the design stage of the NAOMS project. There is considerable expertise in this area in both the research literature and among practitioners in the federal agencies. Such advance planning would have avoided the need for after-the-fact, ad hoc redaction methods and the resulting loss of information.

10. See American Statistical Association, Privacy and Confidentiality, 2003.
