
How Airports Measure Customer Service Performance (2013)

Chapter Three: Data Collection


Airports use a variety of methods to collect feedback from customers. Responding to complaints, compliments, and comments is a high priority for most airports and is a central element of customer service. For some airports, listening and responding is the main method of communication with customers. Comments or complaints are received and sent on to the responsible party. For complaints, airport staff follow through to make sure the problem is resolved and the customer is notified. If time and resources allow, the airport will track the types of comments, the location of the incident, and the time needed to respond to and resolve the issue (a simple illustrative tracking record is sketched below, following the discussion of volunteer ambassadors). Table 4 lists the various ways that airports obtain feedback from their internal and external customers.

Figure 8 shows responses from the online survey to the question: “Which methods of customer feedback were most helpful to improve your customer service program?” The five top responses included customer feedback delivered through the airport website, ambassadors, employees, or social media. The Airports Council International (ACI) Airport Service Quality (ASQ) Survey was also cited by the nine airports that use this service. Methods at the bottom of the list were offered as free responses and do not represent the opinions of all airports surveyed.

VOLUNTEER AMBASSADORS

Many airports use volunteer ambassadors as an integral part of customer service. Volunteers operate help desks and welcome centers and are stationed at key areas of the terminal to answer questions. At airports such as Colorado Springs (COS), ambassadors serve as a primary point of contact for customers transitioning through the terminal and are trained to help special customer segments such as military personnel or athletes going to the Olympic Training Center. At Hartsfield–Jackson Atlanta International (ATL), volunteers distribute airport wings to children and “Be Our Guest” coupons (with a value of $10) to spend in the airport. At Nashville (Tennessee) International (BNA), ambassadors are assigned to help direct traffic on the baggage claim level and at entrances to security checkpoints. At Port Columbus International (CMH), ambassadors operate the information desks and provide interactive tours upon request. Volunteers also staff the eight Travelers Assistance desks at MSP and provide customer feedback for a complaint/compliment database. San Diego ambassadors drive the courtesy carts, which are the only ones available in the SAN terminal. At San Francisco International (SFO), volunteers work at help desks on the departure decks, but paid employees staff the help desks on arrival decks.

Airports reported a few management strategies to train volunteers and acknowledge their importance:

• Stable volunteer coordinators. Volunteers build strong relationships with their coordinators, especially those who serve in the position a long time.
• Recognition awards and appreciation dinners. Recognition of volunteers is exceedingly important. Airports schedule appreciation events quarterly and/or annually. Some give volunteers gas cards or gift cards from stores in the airport.
• Special lounges for volunteers. As part of the recognition package, some airports set up dedicated areas where food, tables, and comfortable chairs are available to volunteers.
• Training. A number of airports are introducing customer service training for employees, volunteers, and business partners. Halifax’s Stanfield Way Program and MSP’s MSP Nice are two notable training programs.
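Where airports do track comments as described at the start of this chapter, the underlying record is simple: the comment type, the location, and timestamps from receipt through resolution. The sketch below is purely illustrative; the field names are assumptions made for the example, not any airport's actual system.

```python
from dataclasses import dataclass
from datetime import datetime
from typing import Optional

# Hypothetical comment record; field names are illustrative only.
@dataclass
class CustomerComment:
    category: str                     # e.g., "complaint", "compliment", "comment"
    topic: str                        # e.g., "restroom cleanliness"
    location: str                     # e.g., "Concourse B, Gate B12"
    received: datetime
    responded: Optional[datetime] = None
    resolved: Optional[datetime] = None

    def hours_to_resolve(self) -> Optional[float]:
        """Elapsed hours from receipt to resolution, if resolved."""
        if self.resolved is None:
            return None
        return (self.resolved - self.received).total_seconds() / 3600.0

# Example: a complaint received, forwarded to the responsible party, and closed out.
c = CustomerComment(
    category="complaint",
    topic="broken escalator",
    location="Terminal 1, baggage claim",
    received=datetime(2012, 6, 1, 9, 30),
    responded=datetime(2012, 6, 1, 10, 15),
    resolved=datetime(2012, 6, 2, 16, 0),
)
print(f"Resolved in {c.hours_to_resolve():.1f} hours")
```

Summarizing such records by comment type and location is what allows an airport to report response and resolution times rather than handle each complaint in isolation.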
CUSTOMER FEEDBACK

Airports use a variety of ways to communicate with their customers. The most traditional are comment cards, customer hotlines, letters, and airport websites.

TABLE 4 METHODS TO LISTEN AND ENGAGE AIRPORT CUSTOMERS (a matrix matching each method—Assistance/Information Desks, Ambassadors, Web Comments, Comment Cards, Customer Hot Line, Social Media Listening, Idea Collaboration, Surveys, Focus Groups, Community/Stakeholder Meetings, Mystery Shoppers, Tenant Meetings, Station Manager Meetings, Business Partner Meetings, Customer Experience Committees, Executive Contact, and Quality Assurance Audits—to the external customer segments it reaches (passengers, meeters and greeters, general aviation, other airport visitors, community) and the internal customer segments it reaches (airlines, business partners, airport tenants, government agencies, employees)). Adapted by KRAMER aerotek inc. from Metropolitan Nashville Airport Authority: 2012 Baldrige Award Application.

FIGURE 8 Methods to obtain customer feedback, ranked by the number of survey responses citing each method (responses named social media, employee feedback, the ACI-ASQ programme, mystery shoppers, comment cards, passenger surveys, spot inspections, customer hotlines, consultant custom surveys, help/information desks, idea campaigns, focus groups, airport tenant surveys, website customer feedback, airport ambassadors feedback, passenger processing times, and the Canmark Airport Satisfaction Survey). Source: Synthesis Airport Survey (2012).

Airport customers use these channels effectively, although airports interviewed reported that comment cards were less frequently used than other methods. The Tucson Airport Authority (TAA) also relies on 115 volunteer members who report on community sentiment and other customer service issues they hear about the airport.

Airports are also using Facebook, Twitter, Pinterest, and YouTube to communicate with their customers, monitor sentiment, and make announcements about events, construction, and changes in service. Figure 9 shows San Francisco International’s Facebook page.

FIGURE 9 San Francisco International Airport Facebook page. Source: https://www.facebook.com/flySFO?ref=ts&fref=ts (2012).

A new element of customer feedback is sentiment analysis. This linguistic analysis technique is used to monitor and characterize the overall feeling or mood of the airport’s customers as reflected in social media. Though sentiment analysis predates Twitter, Facebook, and other social media, its use has accelerated with the development of computational capacity to analyze large unstructured textual data sets. Sentiment analysis can provide rapid feedback to an airport. A number of airports monitor sentiment on Twitter and Facebook. BNA and MSP are developing in-house capabilities to analyze key indicators using word analysis.
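As a purely illustrative sketch of the word-analysis idea, and not a description of BNA’s or MSP’s actual tooling, a minimal lexicon-based scorer might classify social media posts as follows (production systems use much larger lexicons or trained models to handle negation, slang, and sarcasm):

```python
# Minimal lexicon-based sentiment scoring sketch (illustrative only).
POSITIVE = {"great", "love", "friendly", "clean", "fast", "helpful", "easy"}
NEGATIVE = {"delay", "dirty", "rude", "lost", "slow", "crowded", "worst"}

def score(post: str) -> int:
    """Return a crude sentiment score: positive words minus negative words."""
    words = post.lower().split()
    return sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)

posts = [
    "Security line was fast and the staff were friendly",
    "Baggage claim was slow and the restrooms were dirty",
]
for p in posts:
    label = "positive" if score(p) > 0 else "negative" if score(p) < 0 else "neutral"
    print(f"{label:>8}: {p}")
```

Aggregating such scores by day, terminal, or topic yields the kind of rapid feedback described above.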

FOCUS GROUPS

DFW uses focus groups to find out more about a problem area at the airport or to speak with a particular group of airport users. Four focus groups are held quarterly, facilitated by a marketing contractor.

HELP/INFORMATION DESKS

Most airports have help or information desks. These may be staffed by paid employees, volunteers, or a combination. A number of airports contract this function out to local Chambers of Commerce or economic development groups. Aspen/Pitkin County Airport contracts with the Aspen Chamber Resort Association. The Ohio State University Airport (OSU) uses its own employees to staff the FBO concierge desk. The Airport Foundation at MSP operates Travelers Assistance. At Tucson International, the Convention and Visitors Bureau trains and certifies airport ambassadors and employees to be knowledgeable about the Tucson area when helping passengers. Help desks have telephones that connect directly to the airport authority reception desk or dispatch area. Ambassadors are stationed around the airport.

Airports are also experimenting with interactive information kiosks. Jacksonville (Florida) International implemented an interactive touch screen kiosk. In July 2012, PANYNJ began installing customer service avatars at JFK, LaGuardia, and Newark airports: virtual assistants programmed to answer basic questions frequently asked by passengers, though not yet interactive.

IDEA COLLABORATION

Ottawa International Airport (YOW) has been on the cutting edge in creating an online environment for airport users to make suggestions, discuss and evolve the ideas, and rank them. The airport launched the Ottawa Airport Ideas Campaign, which was aimed at “improving the airport passenger experience through ideas generated from crowd sourcing.” The idea of crowd sourcing is to tap into the collective intelligence of the public to gain insight into what customers really want. The crowd of airport users was invited to log onto a website and become participants. The airport was striving for a transparent mechanism to engage the community in a discussion about how to improve the airport experience. The objective was to incorporate ideas into operational and capital plans for future years.

The campaign was set up in French and English and lasted eight weeks. YOW advertised the campaign extensively on signs and digital media in the airport and also advertised in print and on billboards in the community. The website had thousands of visitors and 697 registered users. There were 136 improvements suggested and 84 unique ideas. Several passenger “pain points” were revealed, while others that airport staff expected never materialized. Registered users are now part of an airport database; however, the airport plans to communicate with this group sparingly, so as to keep it engaged on the most important issues.

The top-rated ideas emerging from the campaign were:

1. A cell phone lot
2. A lounge in U.S. departure areas
3. Better food service at U.S. gates
4. Speedier baggage handling
5. Added chairs at security
6. Healthy food improvements
7. Improvements to access the BizPark
8. Better passenger flow through U.S. Customs
9. Remove ashtrays from parking garages and walkway entrances
10. O-Train (light rail) to YOW.

For each of these suggestions, airport administration generated an action plan. Many unexpected ideas surfaced during the campaign, and for this reason the airport is considering additional idea campaigns that target specific airport users.

MEETINGS

Customer service regularly appears on meeting agendas with employees, airline station managers, business partners, airport tenants, and the community. Some airports have standing committees that address customer service. MSP’s Customer Service Action Council (CSAC) is one of the longest standing groups. Membership consists of representatives from all MAC divisions and a broad cross-section of airport tenants. Halifax formed an Ambiance Group of 15 participants including legal counsel, the Canadian Border Services Agency (CBSA), customer relations, engineering, and the cleaning contractor.
The group meets as needed to discuss improvements in the terminal, such as cleanliness, and the scheduling of art exhibits, performing groups, and special events. It was the Ambiance Group that recommended introducing Adirondack and rocking chairs into the terminal.

DFW has a Customer Service Task Force of seven vice president-level participants. The terminal management and marketing vice presidents are permanent members of this group; the remaining five members rotate each year. Port Columbus has the CMH Customer Experience Partnership, hosted by the CRAA Customer Service team. The group meets six times per year to discuss customer service issues and effective practices and includes all in-terminal business partners.

QUALITY ASSURANCE AUDITS

Most airports perform quality assurance (QA) audits of their facilities, employees, and third-party contractors such as concessionaires, rental car companies, parking operators, and other private service providers to measure and track their performance. These audits are completed on a scheduled basis, as spot inspections, and/or through mystery shoppers.

PANYNJ has an extensive QA audit program that is documented in its Customer Care Airport Standards Manual (ASM). The audit program covers four airports and 15 terminals, 13 of which are managed by private entities. The QA monitoring program is comprehensive and includes four components:

1. A customer satisfaction survey is conducted annually in May and June at JFK, Newark, LaGuardia, and Stewart airports. This survey asks passengers detailed questions about their experience upon arrival at, or prior to departure from, one of PANYNJ’s airports. A total of 10,400 arriving and departing passengers are surveyed each year; 2012 was the 11th year. Twice as many departing passengers are sampled as arriving passengers.
2. Mystery shopping is conducted twice per month. Mystery shoppers evaluate the performance and quality of service of employees at various concessions in the terminals on the basis of ASM standards, focusing on employee attitude, appearance, awareness, and knowledge.
3. QA facility audits are also conducted annually, in April before the passenger surveys. Every facility is inspected for cleanliness, condition, and functionality in accordance with the standards published in the ASM. Deficiencies are considered either “routine” (quick fixes such as cleaning and management issues) or “high priority” deficiencies requiring repair. After the audits, PANYNJ issues evaluation reports to each business partner and posts them on the customer care website.
4. Processing evaluations are performed on an as-needed basis. PANYNJ and its partners have looked at queuing and delivery issues at baggage claim, check-in, taxi dispatch, parking lot exits, security checkpoints, U.S. entry points, and truck waiting times at cargo facilities (a minimal wait-time calculation is sketched at the end of this section).

Many airports engage in some or all of the quality assurance activities that PANYNJ undertakes. DFW deploys mystery shoppers daily in different parts of the airport. Mystery shoppers also call the various customer service phone lines. ATL has a comprehensive employee and business partner training program based on the Disney Institute model. Its mystery shopper program evaluates employee encounters according to the standards shown in Figure 10.

FIGURE 10 Hartsfield–Jackson Atlanta International Mystery Shopper Scorecard. Source: Hartsfield–Jackson Atlanta International Airport (2012).
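The processing evaluations described in item 4 above come down to measuring elapsed time through a queue. The snippet below is a generic sketch of that calculation from timestamped observations; it is not PANYNJ’s documented methodology, and the timestamps are invented for the example.

```python
from datetime import datetime

# Generic wait-time summary from timestamped queue observations
# (illustrative only; not any airport's actual processing-evaluation method).
observations = [
    # (joined the queue, reached the counter/checkpoint)
    (datetime(2012, 5, 14, 7, 2), datetime(2012, 5, 14, 7, 19)),
    (datetime(2012, 5, 14, 7, 5), datetime(2012, 5, 14, 7, 21)),
    (datetime(2012, 5, 14, 7, 9), datetime(2012, 5, 14, 7, 30)),
]

waits_min = [(end - start).total_seconds() / 60.0 for start, end in observations]
average = sum(waits_min) / len(waits_min)
longest = max(waits_min)
print(f"Average wait: {average:.1f} min; longest observed: {longest:.1f} min")
```

Tracked over time and by location, such summaries let an airport distinguish routine conditions from the periods that drive complaints.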

SURVEYS

Passenger surveys are the primary way that airports collect information about customer experience. A variety of third-party surveys are in wide use to monitor the performance of facilities and services. ACI offers its ASQ Survey by subscription, and other industry ranking systems are offered by SKYTRAX, Canmark, and J.D. Power. Many airports use customized passenger surveys to track specific segments of their passenger base. The next sections summarize some of the surveys that are available.

ACI-ASQ Survey

Background

ASQ is an international initiative run by ACI, which was established in 1991 to represent airport interests with governments and international organizations. There are five worldwide regions served by ACI, the largest being North America. The mission of ACI World is “to advance the interests of airports and to promote professional excellence in airport management and operations.”

ASQ Service Management (http://www.airportservicequality.aero) offers four distinct product lines: (1) the ASQ Survey; (2) the ASQ Performance Program; (3) ASQ Assured Certification; and (4) ASQ Retail. The most widely used product is the ASQ Survey. As of March 2012, more than 224 airports worldwide participated in the ASQ Survey, 44 of them in North America. Subscribing airports are able to track airport performance over time and compare results with other participating airports, although all ASQ results are confidential. Data collection procedures are standardized and means-tested for each airport. Minimum sample sizes are required to ensure statistical validity.

ASQ Survey

ASQ relies on a comprehensive survey to collect information from passengers. Passengers are surveyed at the gate prior to departure. Airports have the choice to have ASQ contractors perform the survey, conduct the survey using airport staff, or subcontract to third-party surveyors. Figure 11 shows the questions in the ASQ Survey that pertain to customer satisfaction. The entire survey is reproduced in Appendix E. The survey presents departing passengers with 34 categories on which they rank the airport. Data are collected on a quarterly basis; airports with fewer than 2 million passengers can choose to survey two or four times per year. Survey data are collected on paper surveys. ACI determines the sample size and the exact flights/gates to be surveyed.

ASQ provides participating airports with six deliverables after each survey period:

1. Management summary. The summary is an interactive file allowing airports to create customized reports showing ASQ results for selected airports. The management summary is available three weeks after the end of each quarter.
2. Core report. This report includes graphs and tables covering all service items and demographics for each participating airport. In addition, each airport selects its own benchmark panel of up to 24 airports and receives a customized report of rankings just for that group of airports.
3. ASQ Data Centre. This custom-designed analysis tool allows users to run their own analysis of the service items, looking at performance over time, split by any of the available dimensions, and for any participating airport. It also allows direct benchmark comparisons and analysis of customer groups.
4. Tables in Excel. The cross-tab tables prepared for the core report are available in Excel format. This is useful for creating graphs linked to the tables or analyzing the data directly from the spreadsheet.
5. Raw data in SPSS/database formats. This is the full set of raw data in SPSS format, which includes the data of all participating airports. Using SPSS, airports can prepare custom tables and comparisons. (Several airports interviewed found the raw data the most useful ASQ deliverable.)
6. Prioritization analysis. This analysis shows the relative importance of each factor and indicates the relative urgency of issues.

ASQ offers other custom services including airport-specific surveys, a detailed analysis of individual airport processes and services, focus groups and interviews, root cause analysis, and success measurement and review. In addition, the ASQ Certification program involves a rigorous assessment of service quality management processes and an on-site verification audit. Recertification is required each year.

J.D. Power and Associates—North America Airport Satisfaction Survey

J.D. Power and Associates is a global consumer satisfaction and marketing services firm based in California. The firm positions itself as a trusted, independent evaluator and rates hundreds of products and services each year in a wide variety of industries ranging from finance to electronics to healthcare. J.D. Power and Associates develops surveys and indexes for its rankings, then solicits customer feedback, aggregates the data, and releases its findings. Typically the company releases an abridged version of its findings through a press release and sells the complete set of findings to companies or organizations, or sells the rights to use the results.

J.D. Power’s methodology reflects its market position as an independent evaluator. For the 2010 North America Airport Satisfaction Study, J.D. Power identified 81 large, medium, and small airports across the United States in major markets. Airports were not permitted to opt in or opt out of the study.

FIGURE 11 Sample questions from the ASQ Survey (departing passengers rate service items covering airport access, check-in, passport/personal ID control, security, finding your way, airport facilities, airport environment, and arrivals services on a five-point scale from Poor to Excellent, identify the items most important to them, and describe their best and worst experiences). Source: ACI (2012).

For all of its syndicated studies, J.D. Power hires panelists from nationally represented panel companies to conduct the research. Panelists contact survey participants to determine their eligibility. In the case of the airport study, participants needed to be 18 years of age or older and to have flown out of one of the airports in the study in the past 30 days. J.D. Power required at least 100 responses between January and December 2009 in order for an airport to be eligible for the study. Once determined eligible, participants received an online survey from a panelist. Of the 81 pre-selected airports, panelists collected enough responses for 42.

A total of 24,000 passengers took the North America Airport Satisfaction Survey in 2009. J.D. Power ranked airports on the following six attributes:

• Airport accessibility
• Baggage claim
• Check-in/baggage check process
• Terminal facilities
• Security check-in
• Food and retail services.

Passengers answered three different types of questions: (1) basic demographic questions; (2) rating questions; and (3) diagnostic queries (e.g., “How long did it take you to collect your baggage?”). J.D. Power’s analysts aggregated the responses from every airport and created an index model using a series of hierarchical regressions that weighted the importance of each of the six attributes.

J.D. Power published the North America Airport Satisfaction Study from 2001 to 2010, but the survey has been discontinued. The company reported that airports were not finding value in purchasing the study or advertising rights, and as of May 2012 it had no plans to replace the survey with a new product.

SKYTRAX

SKYTRAX was founded in 1989 and operates as a nonprofit, independent purveyor of airport customer experience rankings. SKYTRAX markets itself as a research advisor for airlines, airline alliances, airports, and related air transport product and service suppliers (www.skytraxreasearch.com). The company is based in London with an office in Beijing.

SKYTRAX offers four airport products: (1) World Airport Audit; (2) Quality Certification; (3) Airport Star Rankings; and (4) Service Benchmarking. All assessments of airports are completed in-house. SKYTRAX staff looks at more than 800 key performance indicators per airport to prepare “star” rankings of airports. The ranking system is based on an internally developed, internationally standardized methodology. In 2012, the company compiled data in 39 different airport service and product areas from over 12 million passengers in 160 different countries at 388 different airports. Customer experience is evaluated for each airport for departures, arrivals, and transit through the terminal to the boarding gate. SKYTRAX data collection relies primarily on an online survey questionnaire that passengers complete. SKYTRAX does not charge airports to participate, nor does it pay customers to take the survey. In addition, SKYTRAX seeks audience input through four other methods: business research groups or travel panel interviews; corporate travel questionnaires and interviews; telephone interviews; and selective passenger interviews.

Individual Airport Custom Surveys and Samples

Most airports conduct surveys to explore specific aspects of customer experience. Table 5 summarizes data collection tools that airports participating in this synthesis reported using.
TABLE 5 OTHER PERFORMANCE MANAGEMENT TOOLS USED BY AIRPORTS

Aspen/Pitkin County: Visitor/Resident Survey
Dallas/Fort Worth International: ASQ; Monthly Special Focus Follow-up Surveys; Mystery Shopper
Halifax Stanfield International: ASQ; Passenger Processing
Hartsfield–Jackson Atlanta Int’l.: ASQ; Mystery Shopper
Jacksonville International: ASQ; Online Website Passenger Survey; Airport Survey.com
Minneapolis/St. Paul Int’l.: ASQ; Mystery Shopper
Nashville International: ASQ; Meeters Survey in June; Leisure Passenger Survey during Spring Break; Passenger Survey in Fall; Mystery Shopper
Ottawa International: ASQ; Mystery Shoppers; Baggage Delivery Time; Concession Satisfaction Survey
Port Columbus International: ASQ; Secret Shoppers
San Diego International: ASQ; Additional 200-Passenger Survey each Quarter; In-Terminal Tenant Survey
San Francisco International: ASQ; Annual Passenger Survey in May (3,800 passengers); Mystery Shoppers; Wait Time Analysis
The Ohio State University Airport: Customer Survey for GA Users
The Port Authority of NY and NJ: Customer Satisfaction Survey; Mystery Shopper; Facility Quality Audit
Tucson International: Survey of Business Travelers

Source: Synthesis Airport Interviews (2012).

A few of the different types of surveys are highlighted in the next sections. Appendix E presents examples of custom surveys that airports use.

Aspen/Pitkin County (ASE)

Aspen is a destination resort and ski area in the Colorado Rocky Mountains. In 2007, the Pitkin County Commissioners requested a survey of four groups that were either airport users or stakeholders, or both: county residents, second-home owners, visitors using the airport, and residents in the region. The purpose of the survey was to:

• Obtain sentiment about the airport, use of the facilities, and expectations;
• Gather suggestions about what could be done to improve passenger experience and increase use of the airport; and
• Develop a basis for community outreach.

The survey was distributed to residents by mail, the Internet, and telephone calls. There were also face-to-face interviews. Survey results were instrumental in assisting with air service development, in-terminal improvements, and design specifications for a new terminal.

Jacksonville International (JAX)

Jacksonville International is an origin and destination airport with more than 2.7 million annual enplaned passengers. It serves as a regional transportation center in northeast Florida and a destination airport for many visitors. The airport is very proactive with respect to customer service and measurement of customer service performance. The Airport Authority subscribes to quarterly ASQ passenger surveys. In addition, the Authority invites passengers to fill out a “How Are We Doing?” survey on the airport website. The online survey parallels ASQ in content and gives the customer service manager immediate feedback on any customer service issues that arise. The survey also provides space for free response comments and invites participants to leave their contact information. A full copy of the survey is reproduced in Appendix E.

Nashville International (BNA)

Nashville is also a medium hub, with approximately 4.7 million annual enplanements. The Metropolitan Nashville Airport Authority has employed both the Performance Excellence (Baldrige) management framework and the Lean Six Sigma approach to continuous improvement, and customer service is a focus of both methodologies. These efforts include a Voice of the Customer program with three customer surveys each year, administered by BNA staff. A passenger survey targeting leisure travelers is conducted during spring break. In the early summer, another survey targets concert-goers. (Both the Country Music Association Fan Fest and the Bonnaroo Music and Arts Festival are held in June.) In the fall, the same survey is administered during what the airport considers a normal travel period.

Also in June, BNA conducts intercept surveys of meeters and greeters and of passengers waiting to check in, in the waiting areas outside the security checkpoints for Concourses A/B and Concourse C. The meeter survey asks why people wait outside security and, if they are meeting someone, what their relationship is to that passenger. The survey also asks how the person got to the airport and why he or she chose to wait inside the terminal rather than go to the cell phone waiting area or 10-minute parking area. The passenger survey asks how the person checked in, information about the travel party, trips per year, frequent destinations, and airport access. Full copies of the surveys are reproduced in Appendix E. Both surveys have rating questions about airport services and features and overall airport experience. These questions seek to pinpoint the most important features and services that drive customer experience. (Live music in the terminal ranks very high!)
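One common way to identify such drivers, though not necessarily the approach BNA uses, is to correlate each item’s rating with the overall-experience rating across respondents; items with the strongest correlation are treated as the most influential. A minimal sketch with invented ratings:

```python
# Illustrative key-driver analysis: correlate item ratings with overall satisfaction.
# The ratings below are invented for the example; real input would be survey records.
overall = [5, 4, 3, 5, 2, 4, 3, 5]
items = {
    "live music in terminal": [5, 4, 3, 5, 2, 4, 2, 5],
    "walking distance":       [3, 4, 4, 2, 3, 3, 4, 3],
}

def pearson(x, y):
    """Plain Pearson correlation coefficient."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

for name, ratings in sorted(items.items(), key=lambda kv: -pearson(kv[1], overall)):
    print(f"{name:<24} r = {pearson(ratings, overall):+.2f}")
```

Ranking items this way, rather than by their average rating alone, highlights the services whose quality moves overall satisfaction the most.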
The Ohio State University Airport (OSU)

The Ohio State University Airport is the fourth busiest airport in Ohio and is the only GA airport to participate in this synthesis. The university owns and operates the airport and the FBO. For GA users, the customer experience is directly related to line operations, concierge services, aircraft maintenance, the restaurant, and aircraft storage (community hangars, T-hangars, and tie-downs). In the past, airport management has surveyed its customers; this short survey is shown in Figure 12.

FIGURE 12 OSU Airport Customer Survey Card. Source: The Ohio State University Airport.

The Port Authority of NY and NJ (PANYNJ)

PANYNJ oversees customer service for more than 100 million passengers using JFK International, Newark Liberty International, LaGuardia, and Stewart International airports. The Port Authority uses its own arriving and departing passenger surveys, which measure overall customer experience and examine the passenger experience in detail at key airport touch-points:

1. For departing passengers, the survey investigates their experience with:
   a. Getting to the terminal
   b. Check-in process
   c. Terminal facilities
   d. Security check
   e. Food and beverage
   f. Retail
   g. Gate area.


2. For arriving passengers, the survey investigates their experience with:
   a. Gate area
   b. U.S. entry
   c. Terminal facilities
   d. Baggage claim
   e. Leaving the terminal.

Each point is examined in detail. Surveys are conducted for two weeks in May. The sampling distribution among PANYNJ airports is listed in Table 6.

TABLE 6 SAMPLE SIZES BY AIRPORT FOR PANYNJ PASSENGER SURVEYS

            JFK     Newark   LaGuardia   Stewart   Total
Arriving    1,607   1,068    928         100       3,703
Departing   3,260   1,612    1,672       203       6,747
Total       4,867   2,680    2,600       303       10,450

Source: PANYNJ (2012).

PANYNJ has conducted this annual survey for more than 10 years and has accumulated an extensive database of information about its passengers and their airport experience. Results of surveys, audits, and benchmarking are posted on a website that is maintained by the Port Authority for its Airport Customer Care Performance Measurement and Research Program and is available to PANYNJ’s business partners. Figure 13 reproduces a portion of PANYNJ’s departing passenger survey to provide a glimpse of the level of detail probed.

FIGURE 13 Sample questions for Departing Passenger Survey. Source: PANYNJ (2012).
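Sample sizes such as those in Table 6 determine how precisely results can be reported. As a generic illustration, not PANYNJ’s or ACI’s published methodology, the worst-case margin of error for a proportion at 95% confidence is about 1.96 times the square root of 0.25/n:

```python
from math import sqrt

def margin_of_error(n: int, confidence_z: float = 1.96) -> float:
    """Worst-case (p = 0.5) margin of error for a proportion, as a fraction."""
    return confidence_z * sqrt(0.25 / n)

# Departing-passenger sample sizes from Table 6 (PANYNJ, 2012).
samples = {"JFK": 3260, "Newark": 1612, "LaGuardia": 1672, "Stewart": 203}
for airport, n in samples.items():
    print(f"{airport:<10} n = {n:>5}  margin of error ~ +/-{100 * margin_of_error(n):.1f}%")
```

By this rough measure, the smaller Stewart sample supports only airport-level summaries, while the JFK sample can sustain finer breakdowns by terminal or time of day.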

Tucson International (TUS)

Tucson is a small-hub airport with 1.8 million enplanements. The area has a large retirement population and many business travelers. With limited resources for customer service, Tucson targets its efforts carefully. In 2012, the Airport Authority sponsored a 500-response business traveler survey, which measured business use of the airport and was designed to create a conversation about air service among the airlines and the business community. The survey covered the following areas:

• Booking practices for business travel
• Priorities for purchase
• Airport services that appeal to business travelers
• Frequent destinations and originating airports
• Connecting travel patterns.

VISITS TO OTHER AIRPORTS

Airports in the United States frequently compare themselves to other airports of similar size or to competing airports in the region. This type of data collection is often qualitative and can be an idea generator. At the Metropolitan Airports Commission (MAC), which operates MSP, staff undertakes a semi-annual airport benchmarking tour to visit airports that offer innovative or competitive services and facilities. In 2010, site visits were made to:

• Boston, Terminals A & C
• JFK Terminal 5
• Halifax, Canada
• Detroit, McNamara and North Terminals
• Portland, Oregon
• San Francisco International
• DFW and Dallas Love Field.

These trips produce good visual comparisons and stimulate ideas about how MSP can immediately improve airport operations, customer service, signage, concessions, and facilities and move MSP from “good to great.” Figure 14 shows the different data inputs used by MSP.

FIGURE 14 Measuring customer service at MSP. Source: Metropolitan Airports Commission (MAC).
