
Use of Market Research Panels in Transit (2013)

Chapter Four – Case Examples: A Variety of Panel Survey Applications

Suggested Citation: "Chapter Four – Case Examples: A Variety of Panel Survey Applications." National Academies of Sciences, Engineering, and Medicine. 2013. Use of Market Research Panels in Transit. Washington, DC: The National Academies Press. doi: 10.17226/22563.


CHAPTER FOUR: CASE EXAMPLES – A VARIETY OF PANEL SURVEY APPLICATIONS

Four agencies were selected for full case examples to illustrate the broad range of ways in which panel survey research is being implemented. The case examples highlight the following aspects of panel research and the agencies' different techniques for recruiting panel members:

• RTD recruits panel members using a link on the agency website, a non-probabilistic sampling technique.
• MnDOT hired a consultant to recruit panel members online using pop-up messages inviting them to join a research panel, a non-probabilistic sampling technique, but recruited respondents to represent the population.
• MTA used a consultant to recruit panel members by random-digit-dial telephoning, a probabilistic sampling technique.
• WSF has posters on board vehicles and at terminals with the web address, encouraging passengers to sign up for the panel. The agency has also experimented with QR codes that let passengers vote on the value of service and then be linked to the website to join the panel. Both are non-probabilistic sampling methods.

The size of the panels:

• RTD recruits 16 panel members, who are replaced every year.
• MnDOT recruits 600 panel members, 300 from the Minneapolis region and 300 from the rest of the state.
• MTA recruited 1,500 panel members by zip code and borough to represent the general population.
• WSF does not limit panel membership. There are currently approximately 6,500 panel members; the agency's goal is 18,000.

Varying levels of in-house versus consultant use:

• RTD conducts the panel research completely in-house.
• MnDOT and MTA use a mix of staff and consultants.
• WSF contracts out all panel research activities.

Different methods of surveying the panel members, and frequency of interaction:

• RTD has a qualitative panel, with focus group-style panel meetings every three months but no interaction between meetings.
• MnDOT conducts online surveys weekly through an interactive online research community, a mix of qualitative and quantitative studies.
• MTA conducted quantitative surveys of 12–15 panel members by telephone each day of the week, year-round, with the exception of major holidays.
• WSF conducts quantitative surveys with panelists online, notifying them by e-mail when a new survey is available.

Each of the case examples covers six topic areas: (1) research purpose; (2) panel sampling, recruitment, and maintenance; (3) implementation, analysis, and reporting; (4) benefits, cost, and concerns; (5) legal, ethical, and privacy issues; and (6) lessons learned/elements of success.

REGIONAL TRANSPORTATION DISTRICT CASE EXAMPLE

Overview

RTD (Denver) has conducted rider panel research involving its bus and light rail passengers for more than 20 years. The fixed-route service panel of 16 members is recruited annually through a link on the agency website and meets quarterly for one year, after which a new panel is recruited. Although most panel research is survey-based, RTD's panel meets in a tightly structured focus group setting, where topics are covered in depth. The panel meetings are not appropriate when statistical accuracy is required, but are used to test and refine concepts and ideas before they are applied to larger, statistically accurate rider surveys. The market research staff works with the requesting departments to match the research technique to the research question and ensure that the panel is appropriate for the topic. Examples of research topics include testing parking payment systems, RTD existing and potential branding and marketing efforts, mobile applications for the RTD website, and long-range planning activities. All panel members are riders; therefore, the panel does not provide perspectives from the nonriding population.

In 2012, a second rider panel was added to address issues of the disabled community on both fixed-route and ADA demand-responsive services. The program is conducted entirely in-house, from the online panel recruitment through final analysis and reporting, making it a very cost-effective research program.

Panel Sampling, Recruitment, and Maintenance

Panel Sampling and Recruitment

At the inception of the panel research program, more than 20 years ago, two fixed-route panels were recruited, one for bus riders and one for light rail riders. Today, most riders have regular experience with both bus and light rail, eliminating the need for separate panels. Currently, there is one fixed-route customer panel representing all of the non-ADA service modes operated by the agency.

Panel members were initially recruited through RTD's "Read and Ride" customer newsletter available on board vehicles. The penetration of the Internet into everyday life has changed the recruitment strategy, which is now exclusively online and conducted during a four-week period each November. This is publicized through the news and an online alert on the agency website that invites riders to "Please apply for customer panel" (see textbox) and includes a link to an online screening application. The application gathers information on RTD services used, frequency of usage, availability of a car, usual park-and-ride lot (if any), previous service on an RTD panel, and demographics; it also poses two open-ended questions: "Why do you want to be on the panel?" and "What general area or topic most interests or most concerns you about RTD?" These questions are used to gauge the respondents' communication skills. It is not necessary to be fluent in English, but the respondent does need to demonstrate a willingness to participate and an ability to share thoughts and ideas with the agency.

There are typically 200–300 applications received each year. The high-value incentive of a monthly pass after each meeting, and a regional annual pass if all meetings are attended, is seen as a critical element for attracting quality applicants. The applications are reviewed by RTD research staff.
The 16 panel members selected (15 plus an alternate in case of attrition) represent each of the 15 RTD service districts; they are also selected to represent a cross-section of RTD customers based on age, race, gender, and transit dependence (availability of an automobile). Because of the college campus population served by RTD, the agency attempts to recruit at least one college student each year. The customer profile is obtained separately through a regular on-board customer satisfaction survey, handed out by bus drivers and distributed at rail stations according to a sampling plan designed to achieve a representative sample of RTD riders.

Although panel members are selected to reflect the riding population, they differ from the customer profile in one area: the selection methodology requires that all panelists have at least some college education. Experience has shown that those who have been in a college classroom understand the process of group discussion: listening, speaking, and the give-and-take of ideas. Those without college classroom experience are less likely to understand the research and discovery process and can hinder the panel discussions.

After the potential panelists are selected, they receive an e-mail outlining the program, the schedule, what to expect at the panel meetings, what is expected of them as panelists, and the rules for earning their incentives. Exact dates in February, May, August, and November are set in advance, so that potential panelists can plan accordingly. On rare occasions, potential panel members have indicated that they would be unable to attend all of the meetings or meet the requirements of the program, and were immediately replaced. (One person was getting married on a meeting day; another was scheduled to be traveling for business.) Those who are not selected, or who decline to participate, receive two free one-way tickets in appreciation for applying.
There have been some riders who clearly applied only to get the free tickets, but their number is small and does not cause a problem for the agency or the research program.

Share your ideas on the RTD customer panel:

We are looking for enthusiastic, interested RTD customers who are team players and want to make their ideas and opinions heard by actively participating on the RTD customer panel! If you ride RTD buses or light rail and want to share your ideas to help improve RTD, please complete our panel application.

The RTD customer panel is a year-long discussion group on RTD policies, procedures, products, and services. The 15-member panel meets four times a year for two hours on Wednesday evenings starting at 5:15 p.m. at the RTD Administration Building at 1600 Blake Street in Downtown Denver.

In return for their participation, panel members receive a FREE buffet dinner at each meeting and a FREE Regional monthly pass for each month they attend a meeting. Panel members who attend all four meetings receive a FREE annual Eco Pass for the following year.

You can apply to become a member of the RTD Customer Panel by completing this short online application by December 17, 2010. RTD employees and their family members cannot be considered for this panel. If you are selected as a candidate, a copy of the Panel Meeting Guidelines will be sent to you by January 5, 2011, to review before you make a final decision to become a panel member.
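As a purely illustrative aside, the selection step described above (one eligible panelist per service district, plus an alternate) can be sketched in a few lines of Python. Every field name, the mock applicant pool, and the "take the first match" rule are assumptions for the sketch, not RTD's actual procedure, which relies on staff judgment to balance age, race, gender, and transit dependence:

```python
import random
from collections import Counter

random.seed(0)  # deterministic mock data

# Hypothetical applicant pool; field names and values are illustrative only.
applicants = [
    {"name": f"A{i}",
     "district": i % 15 + 1,                 # 15 RTD service districts
     "some_college": random.random() < 0.8}  # selection prerequisite
    for i in range(250)                      # roughly 200-300 applications/year
]

def select_panel(pool, n_districts=15):
    """Pick one eligible applicant per district, plus one alternate (16 total)."""
    eligible = [a for a in pool if a["some_college"]]
    panel = []
    for d in range(1, n_districts + 1):
        in_district = [a for a in eligible if a["district"] == d]
        if in_district:
            # Staff would also balance demographics here;
            # this sketch just takes the first eligible match.
            panel.append(in_district[0])
    chosen = {a["name"] for a in panel}
    alternate = next(a for a in eligible if a["name"] not in chosen)
    panel.append(alternate)  # the 16th member, held in case of attrition
    return panel

panel = select_panel(applicants)
print(len(panel))  # 16
print(Counter(a["district"] for a in panel).most_common(1))
```

The sketch only shows the mechanical part of the process; the open-ended application questions and staff review described above have no code analogue.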

In addition to the regular service customer panel, a second panel was added in 2012 to focus on issues relevant to persons with disabilities. The panel will consist of about eight persons who use fixed-route service and eight who use ADA demand-responsive service. Recruitment is being conducted by RTD market research staff through contacts at the Colorado Federation of the Blind and other agencies that serve persons with neurological, visual, hearing, and physical impairments. Recruiting for the panel for persons with disabilities is occurring at the same time as the regular service panel, and its panel meeting focus groups are scheduled for one week after the regular service panel. The initial topic is how the agency can move riders from ADA service to fixed-route service, reducing reliance on the demand-responsive service and increasing riders' options through improvements that would allow use of the more flexible fixed-route service.

Panel Maintenance

Panel members are informed of the incentives for participation as part of the information packet they receive when invited to join the panel. The incentives include a buffet dinner the night of each meeting, a regional bus pass for the month following each panel meeting they attend, and an annual regional bus pass for the following year if they attend all four panel meetings. However, no panel member can be more than 15 minutes late to a meeting or leave early. If a panelist misses the first meeting in February, he or she is immediately replaced with a new panelist; after that, however, no new panel members are added for the remainder of the year. Because any new member would not have attended all four panel meetings, he or she would not be eligible for the annual pass, substantially reducing the incentive to participate in the remaining sessions.

There is no contact with panel members between the quarterly meetings. Attrition is not an issue, because the panel is notified of meeting dates at the outset of the program and because the regional transit pass is a valuable incentive.

Implementation, Analysis, and Reporting

Implementation

The research work is developed and implemented entirely in-house. The panel research program is administered by the market research department, which recruits and manages the panel, determines the research topics, establishes the meeting agenda and research questions, oversees the development of the materials to be used at the meeting, and writes and presents the final results. The panel meetings occur four times a year, so there can be competition for research topics to be conducted on a specific date. The research staff determines whether the time can be split between two topics, or whether a topic needs to be moved to another meeting or addressed through another format, such as an online survey on the agency website.

Panel members are sent reminder notices of the meetings. All meetings are held from 5:00 p.m. to 7:30 p.m., usually at the RTD administrative offices, although they are sometimes held at another RTD location (the light rail maintenance facility, computer lab, etc.), and are moderated by RTD research staff. The topic of the meeting is not made known to members in advance.

Agency employees who have requested the research or have an interest in the outcome are invited to attend the meetings as observers and, if needed, technical experts. Typically at least 14 panel members attend any one meeting. The meeting starts with a buffet dinner and social conversation. The remainder of the meeting is highly structured to ensure that the participants stay busy and engaged. At the end of the meeting, staff is available to answer any concerns raised by a panel member or to set up a time to contact him or her.

Any testing of tools or materials is done on-site. For testing a prototype mobile website, participants were given smartphones to use at the meeting. When the topic was a new method of paying a parking fee, a parking ticket vending machine was brought into the meeting for panel members to use, along with laptop computers so they could evaluate the online information (see the parking payment example project text box).

Customer panel members were supplied with a packet containing survey materials to be filled out during the course of the session. Panel members were also given two parking payment scenarios, each containing a valid license plate number, a length of stay, and identification as either "in-district" or "out-of-district," to be used during the website evaluation portion of the activity. Panel members were asked to "act out" several activities during the course of the session:

• Pre-test: Used to determine RTD parking payment use and knowledge.
• Payment attempt #1: Following an initial attempt at paying for parking, panel members were asked to evaluate their experience.
• Payment attempt #2: Following another attempt to pay for parking using a different scenario card, panel members were again asked to evaluate their experience.
• Website evaluation: Panel members were asked to obtain several pieces of parking information using the RTD website and evaluate their experiences.
• Evaluation of payment alternatives: Panel members were provided descriptions of payment alternatives and asked to compare the likelihood of their use of each of the potential solutions.
• Demographics: Panel members were asked to provide several pieces of demographic information.
• Observations: Time to completion and other observations from RTD staff members were recorded to provide additional details on panel member performance.

Analysis and Reporting

The research question is clearly identified as the panel meeting discussion guide is developed. The research staff summarizes the findings of the meeting into a PowerPoint presentation and, if requested, an executive summary. All stakeholders who participate in the panel meeting receive a copy of the presentation and executive summary, and market research staff will present the results to staff, board members, or management on request. The requesting department distributes the findings to its stakeholders, and the staffers who attended the meeting can address specific questions about the proceedings. Panel members do not routinely receive the final report. On rare occasions a panel member requests the final report, which is then provided as a matter of open-records law.

Benefits, Cost, and Concerns

Benefits

The benefits of the panel survey mirror those of a traditional focus group. There is an opportunity to gain an in-depth understanding of the research topic and of how the rider perceives and processes information regarding the transit system. Panel members become familiar with the process after the first meeting and are then able to focus more quickly on the tasks required, because they know the routine. The group discussion produces data and insights that would be less accessible without the interaction of a group setting. Listening to others' experiences stimulates thoughts and ideas in each of the panel members that might otherwise not have been brought to light.

Often research is conducted by telephone or in a format in which the client has little participation until a final report is produced. The panel meetings allow the stakeholders to be present, so they see and hear findings directly from the customer. This first-hand experience helps each staff member gain a better appreciation of customer needs, creating a customer orientation throughout the agency.

Because the panels are replaced annually, recruitment costs are reduced and members are less likely to become too sensitized to transit issues; the creation of a new panel every year also ensures that fresh perspectives are brought to the table.

Cost

This focus-group-style panel research provides a rich source of information, and because everything is done internally, the program is very affordable. The cost is mostly limited to the buffet dinners provided at the meetings, along with the foregone revenue from the monthly and annual transit passes. Online recruiting reduces the staff time needed to establish the panel, estimated at approximately one week. Developing the meeting materials, setting up the meetings, and reporting afterward require one to two weeks of staff time per meeting, for a total of 0.1 to 0.15 FTE to administer the program.

Concerns

A drawback of the panel approach is its small sample size, which does not provide the statistical accuracy needed for some research topics. Additional research techniques are needed to address all of the various questions posed by agency staff. A second concern is that the panel is made up of engaged transit riders who do not necessarily reflect the views of those unfamiliar with RTD services. The panel members can be too focused on a specific issue, or too knowledgeable about the transit services, and thus not open to certain approaches, ideas, or topics.

Legal, Ethical, and Privacy Issues

RTD has not had any legal, ethical, or privacy concerns related to its panel research program. Panel members use only their first names, and the meetings are structured so that they do not encourage personal interaction between panel members. Personal information is collected on a secure server and is available only to market research staff. The information is not shared with any other departments or with entities outside of RTD.

Lessons Learned/Elements for Success

Experience has shown that persons who regularly call the RTD complaint line tend to come to meetings with a specific agenda in mind and distract from the meeting topics without adding to the discussion. All potential panel members are therefore screened against the RTD complaint database to identify habitual callers, who would not make good participants.

Meetings need to be tightly structured around the research question, with a full schedule of both discussion and interactive exercises. This keeps panel members engaged and moving forward, so there is less likelihood of their straying off topic. If the discussion does stray, the moderator steps in immediately to refocus it. Off-topic comments can be acknowledged but noted as something to bring up with RTD staff after the meeting.

RTD management and staff have been pleasantly surprised over the years at the quality of the panel members, their involvement in RTD issues, and the thoughtful, considered input that they provide. The program is enjoyable for everyone, with very positive feedback from all involved, panel members and RTD staff alike.

MINNESOTA DEPARTMENT OF TRANSPORTATION CASE EXAMPLE

Overview

The MnDOT research department uses a variety of techniques to understand the needs of its customers, the taxpayers of Minnesota. Extensive ad hoc telephone and annual tracking surveys have been supplemented with focus groups that address the qualitative "why?" questions that are sometimes difficult to capture through quantitative studies (see Figure 6). The dramatic decline in the number of households that have and use a landline telephone, and the rise of social media as a communication tool not only between individuals but also between individuals and organizations, have made telephone surveying more difficult and have resulted in survey samples that do not mirror the general population. These changes led MnDOT to introduce a new research strategy building on the use of the Internet and social media communities to supplement its existing research program.

The online panel community approach provides an opportunity to combine the sample sizes of quantitative methods with the depth of discussion provided by focus groups, through ongoing and iterative surveying and discussion. The results of one test might spark changes to the program, which could be re-tested with the community, allowing experimentation and refinement of ideas and improvements to the final product.
A vendor with extensive experience in establishing online communities was selected to manage the program. To preserve the integrity of the program, the community is not used for public outreach or public relations.

Panel Sampling, Recruitment, and Maintenance

Panel Sampling and Recruitment

Several recruitment strategies were used by the vendor to determine the best method of obtaining a representative sample. In the end, online recruitment was the easiest and least expensive method for obtaining a valid sample of the general population with Internet access. The panel consists of 600 members, approximately 300 from the Minneapolis/St. Paul urban area and surrounding suburbs and 300 from the remainder of the state. The membership is designed to mirror the population based on geography, gender, age, income, and education. There is a known bias in that the panel is an online community, so members must have Internet access to participate.

A transportation profile is captured at the time of panel recruitment. This includes whether members commute during the peak period and their mode of travel (for example, single-occupant vehicle, carpool, transit, bike, train, or walking).

FIGURE 6 Market research techniques employed at MnDOT. [The figure shows a spectrum from quantitative methods offering higher precision (omnibus surveys) to qualitative methods offering deeper understanding (focus groups), with the hybrid online customer community in between.]
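A quota design like the one described above, mirroring the population on geography, gender, age, income, and education, is usually tracked as target counts per cell during recruitment. The sketch below shows the idea for two of the dimensions; the cell counts and field names are invented for illustration, and the real targets would come from census data:

```python
from collections import Counter

# Illustrative recruitment quotas for a 600-member panel; these cell
# counts are made up, not MnDOT's actual targets.
TARGETS = {
    "region": {"metro": 300, "outstate": 300},
    "gender": {"female": 305, "male": 295},
}

def quota_shortfalls(panel):
    """For each quota dimension, report how many members each cell still needs."""
    gaps = {}
    for dim, cells in TARGETS.items():
        counts = Counter(member[dim] for member in panel)
        gaps[dim] = {cell: max(0, target - counts.get(cell, 0))
                     for cell, target in cells.items()}
    return gaps

# A tiny example panel; each member record carries the quota dimensions.
panel = [
    {"region": "metro", "gender": "female"},
    {"region": "outstate", "gender": "male"},
]
print(quota_shortfalls(panel)["region"])  # {'metro': 299, 'outstate': 299}
```

Recruitment then continues until every cell's shortfall reaches zero, which is what keeps the finished panel aligned with the population profile.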

In addition, panel members are asked to provide a personal profile that includes self-reporting on whether they describe themselves as having a disability.

MnDOT was specific in wanting to develop an online panel community for the purpose of research, not public relations. The community is not marketed or publicized. Persons who hear about the online community and volunteer to participate are politely refused.

Panel Maintenance

A key element of panel maintenance is ongoing engagement among panel members. The community is not just surveys; it is also a community discussion opportunity. High activity generates high engagement and response. Every week, something new is posed to the community. Some weeks it is a strategic survey to inform program development; other weeks there are broader topics that invite open discussion. Both are important and add value to transportation planning while keeping the panel engaged and reducing attrition.

Panel members receive a $10 gift card for each month in which they fully participate. This amount was vetted by members, who indicated that they believed it was the right level of incentive: they deserved a small token of appreciation for weekly feedback over an extended period, but any greater amount would be perceived as too much of a taxpayer expense.

The panel is refreshed once a year to keep members engaged and to avoid their becoming too sensitized to MnDOT issues. If needed, the option exists to refresh twice a year. Approximately 30% of the panel is replaced each year, based on the level of member involvement. Those who have not participated regularly are thanked for their past input, then removed from the panel and replaced with new members recruited to maintain the target demographic profile of the community.

Implementation, Analysis, and Reporting

Implementation

The online community is open only to research panel members. The community is intended to be useful for MnDOT while being interesting to participants.
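The annual refresh rule described above, replacing roughly 30% of members selected by participation level, can be sketched as follows. The record layout, the activity score, and the assumption that recruits are pre-screened to match the demographic profile are all illustrative, not MnDOT's actual tooling:

```python
def refresh_panel(panel, activity, recruits, drop_share=0.30):
    """Replace the least-active share of the panel with fresh recruits.

    `activity` maps member id -> activities completed; `recruits` is assumed
    to be pre-screened to match the target demographic profile.
    """
    n_drop = int(len(panel) * drop_share)
    # Rank members by participation, least active first.
    ranked = sorted(panel, key=lambda m: activity.get(m["id"], 0))
    dropped, kept = ranked[:n_drop], ranked[n_drop:]
    return kept + recruits[:n_drop], dropped

# Ten members whose activity score equals their id, so id 0 is least active.
members = [{"id": i} for i in range(10)]
scores = {i: i for i in range(10)}
pool = [{"id": 100 + i} for i in range(5)]

new_panel, dropped = refresh_panel(members, scores, pool)
print(len(new_panel), sorted(m["id"] for m in dropped))  # 10 [0, 1, 2]
```

Keeping panel size constant while swapping out only the low-participation tail is what lets the community stay fresh without disturbing the quota profile.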
There are a variety of online tools with which researchers interact with the panel members, including surveys, brainstorms, discussions, live chats, image galleries, video clips, and attachments. The community's home page, Mn/DOT Talk, features the week's survey, discussion questions, and an informational section that describes what is happening with the online community that week. The home page and sign-in screen are shown in Figure 7.

To earn their monthly incentives, panel members need to sign in each week and participate in all surveys and community interactions. MnDOT staff identifies the weekly issue to be addressed: for example, if there was a major snowstorm, panelists might be asked whether they traveled to work during the snow and how MnDOT performed on snow removal. The research topics are gathered from MnDOT leadership, district/regional offices, and other transportation stakeholders, and agency staff meets with the vendor each week to coordinate on survey objectives and topics for that week and to draft the survey questions. Although the weekly content is typically developed by MnDOT, the vendor is available to assist with developing community engagement activities and other tasks as needed.

MnDOT came into the process believing that monthly surveys would be better than weekly activities, which it feared could burn out the panel members. However, the experience of the private sector has been that frequent communication on a variety of levels actually increases response rates and the quality of participation, and the vendor recommended hosting weekly activities. Response rates are good, panel members are interested and engaged, and the content of the feedback is of high value.

MnDOT panel members may send comments to MnDOT staff at any time through the community.
MnDOT also encourages informal exchanges among members addressing transportation-related issues, and community members can even create their own transportation discussions and surveys to distribute to all members. It is believed that through these three-way interactions (MnDOT to members, members to MnDOT, and members to members) new learning occurs.

One example is the use of roundabouts, a traffic control device used in Minnesota. A panel member raised questions about how to safely navigate a roundabout. This generated an active discussion among members, signaling to MnDOT that there might be misperceptions about roundabouts and a need for more education. Surveys on the topic of roundabouts were conducted, and information was added to the online community site. The information was also made available to the general public through other MnDOT communication channels.

Analysis and Reporting

The goal was to obtain close to 300 responses (out of 600 members) per week. The outgoing sample yields a close representation of the state's population, but the returning data are not weighted to adjust for non-response. The online community results are not expected to have the highest level of precision; other research tools provide that information to the agency. A comparison of the unweighted results to the statewide profile has shown that the results, although not perfect, are adequate for the agency's needs and intended use. The results provided are usually simple cross-tabulations of the data, for example, comparing commuters to non-commuters, or Metro area residents to outer-state

36 Minnesotans. MnDOT recognizes the limitations of the online community and is careful when interpreting and projecting results from surveys and discussions. Results of surveys are not directly provided to the panel on the community site. Instead, the community is shown how the data were used and is provided advance notice of any campaign or changes to the system as a result of its input. MnDOT has also created a video in which State Transportation Commissioner Tom Sorel expresses his appreciation for the work of the community, its value to the agency, and how, spe- cifically, its feedback has been used. The video demonstrates top management’s support and appreciation of the participants, and puts a public face on a large state agency. Key to the internal success of this program is closing the loop with the rest of the agency. A website was created on the MnDOT intranet dedicated to the information collected from the online community (see Figure 8). This site is intended to be directly accessible, providing regular summaries of the survey and discussions that offer insight into MnDOT’s customer base. As a result, overall understanding of customer needs has increased and is regularly discussed in decision- making circles. Benefits, Cost, and Concerns Benefits The online community is a nimble, cost-effective resource. Previously, the agency relied primarily on large telephone studies to provide quantitative data. That type of study is still used for annual tracking, but is not helpful in responding to the agency’s immediate and dynamic needs for customer input. Focus groups also have drawbacks. Organizing groups of 10 to 12 persons to explore a topic through in-depth study is FIGURE 7 Home page for the “Mn/DOT Talk” online customer community.

37 labor-intensive, and yields limited information from residents of only one geographic area. The online panel community approach has the ability to respond quickly to statewide ques- tions with a larger, more representative sample, while at the same time offering an iterative, interactive learning venue without additional cost. An example of how the community works addressed the topic of a “zipper merge,” where cars entering a freeway needed to merge into one lane of traffic (see Figure 9 for an example of a zipper merge). Rather than use the entire length of the merge lane and merge near the end of the zone, drivers would typically attempt to merge as soon as possible, thereby significantly extending backups. Initially, MnDOT staff thought the lack of proper zipper merging was the result of ineffective signage. However, after posing survey questions, testing new signs with several hundred panel members, and instituting online discussions, the agency realized that the problem was not direction, but a “Minnesota nice” issue. In Minnesota, merging late was perceived as being rude, and no amount of signage would change that perception. Few people wanted to risk being viewed by other drivers as “that guy,” meaning the one who is “budging in line.” As a result, an infor- mation campaign was launched to educate drivers on how the zipper merge is intended to work (see Figure 10). Without the repeated testing and conversations with the online com- munity, it is unlikely that staff would have realized that they were dealing with a cultural issue. This ability of the community to provide a quick response to emerging needs has resulted in staff from all over the agency bringing forward new topics to engage the customers. After two years of continuous research, and with the iHUB internal site making customer research accessible to all staff, employ- ees are developing a deeper understanding and appreciation of customer feedback in decision-making. 
Another advantage that became clear after the community was established was the value of using the customers’ own language. The online community provides a rich source of words and terminology that can be used to create clear communications to the public.

FIGURE 8 MnDOT’s internal webpage “iHub” for the online customer community.

FIGURE 9 Example of a zipper merge.

Cost

The online community program at MnDOT is entering its third year. One full-time employee is needed to maintain the program, coordinate the research questions, develop the questionnaires, report on results, and work closely with the full-service vendor. The vendor implements the program, recruits and maintains the panel, runs the community site, assists with the panel engagement activities, collects and summarizes survey data, and performs other tasks as needed. Total cost for the vendor services in 2012 was $260,000, including respondent incentives. The panel size is being reduced to 400 members, 200 from Minneapolis–St. Paul and 200 from the remainder of the state, in an effort to contain costs while still keeping the community active.

Concerns

There was initial concern that some panel members might use inappropriate language or otherwise be disrespectful during online discussions, because all comments are accessible to the community; a strategy to handle inappropriate activity was therefore developed even before the community was launched. Panel members are warned that communication is open and that all postings must be respectful. Some panel members have used strong language and had to be reminded of the rules of engagement, but this has not been a serious problem: in the first two years of the program, only one person had to be removed for continuing inappropriate communication after receiving a warning.

Legal, Ethical, and Privacy Issues

Before proceeding with the online community, MnDOT research staff conferred with general counsel to discuss risks and limitations of the program.
Minnesota has legislation referred to as the Tennessen warning: whenever a government entity collects private or confidential data from an individual about that individual, the agency must give him/her a Tennessen warning notice (see Minnesota Statutes, section 13.04, subdivision 2). The purpose of the notice is to enable an individual to make an informed decision about whether to provide the government entity with that data. A government agency may not collect data on individuals unless the collection is necessary for the agency to carry out its duties under a program that is authorized by law (see Minnesota Statutes, section 13.05, subdivision 3). Following is an example of how this warning is used in MnDOT market research, including the online community:

   Hello, this is [YOUR NAME] from [Market Research Supplier Name] and we are calling ON BEHALF OF the Minnesota Department of Transportation. Mn/DOT (pronounced “mindot”) is interested in your opinions about your driving experiences on Minnesota’s freeways and state highways. We are not selling anything; this is for research purposes only. You are not obligated to do this survey but your responses will help to inform Mn/DOT of public attitudes when making decisions. All your responses will be combined with others in the study and your name is never made known to Mn/DOT or the rest of the public.

The information provided to potential panel members in the screening questionnaire includes the Tennessen warning that any information members provide is strictly voluntary and that they can refuse to participate or opt out of the program at any time. The individual’s name and contact information are collected and maintained by the vendor, providing a wall between the panel members and the state.
Identification on the community website is only by first name and last initial; however, the community is a social media-style interactive website, and panel members are able to include a photo with their online information if they so choose.

Lessons Learned/Elements for Success

MnDOT recognized the value of the online research community approach and did not want its foray into this new research strategy to fail. The agency selected a vendor based on a number of factors: cost of services, depth of experience in developing online communities, and the agency’s desire for three-way communications (MnDOT to the customer, customer to customer, and customer to MnDOT). The vendor provided significant experience and counsel as the agency was charting this new ground; consequently, MnDOT was able to avoid possible challenges and problems as a result of the partnership.

FIGURE 10 Signage developed from the online community discussion.

Weekly discussions and activities designed to interest panel members have proven to be the key to continued engagement. Response rates are good and exchanges are substantive, which results in high-quality information from the online community.

Market researchers are used to developing formal questions for telephone, in-person intercept, and written surveys. An online community needs a more conversational style of questions. The depth of experience provided by the vendor helped MnDOT staff transition from traditional research and question formats to the world of social media.

Even as a research tool, the community has public relations value. Panel members are fascinated and impressed that MnDOT is using this technique and asking for feedback from the public. Occasional videos of the transportation commissioner addressing the community provide a “face” for what might otherwise be an impersonal public agency. Panel members “expect it from government, but are surprised it actually happens!”

METROPOLITAN TRANSPORTATION AUTHORITY CASE EXAMPLE

Overview

The MTA in New York City initiated its panel research program in 1995 to monitor operation of the system through the perspective of the average New Yorker. The study was designed as a “transportation” survey and did not specifically identify itself as an MTA project. This helped keep the focus on New York and travel within the city, and reduced the potential for the panel to become a forum about public transit.

The panel consisted of 1,500 members recruited by telephone to mirror the general population of New York City. Every day, the vendor surveyed 12 to 15 panel members by telephone, providing a continuous stream of data that could be used at a moment’s notice to answer operational questions.
Typical uses of the data included tracking ratings of satisfaction with safety and security in the days immediately following an incident, and adding questions to the survey for a period of time to gauge reaction to a new advertising campaign. The panel survey was utilized by staff from all over the agency to identify issues of concern to the public; results were, in effect, used as a report card for customer service and attitudes about MTA service measured over time. The panel research program ended in 2010 after 15 years.

Panel Sampling, Recruitment, and Maintenance

Panel Sampling and Recruitment

MTA employed an outside vendor to recruit and maintain the panel, and to administer the surveys. The panel was recruited through random-digit-dial calls to match census data based on location (zip code of residence and borough) and demographics (age, income, ethnicity/race, etc.). MTA staff had no direct contact with the panelists and no access to panel names or contact information.

Panel Maintenance

To sustain panel interest, MTA offered three levels of incentives: (1) every time the vendor successfully contacted a panel member, the panel member received a $10 incentive, even if the survey was not completed; (2) all panelists were mailed a professionally produced quarterly newsletter from the vendor that focused on life in New York City; and (3) every quarter, there were raffles for savings bonds in denominations up to $1,000. All incentives were managed and distributed by the vendor.

The vendor completed 400 surveys each month, a total of 4,800 each year. With a pool of 1,500 members, the vendor needed to contact each panel member only about every four months. Panel members who were successfully contacted three times within 18 months were replaced to keep the panel fresh. Panel members were also replaced if they could not be reached after multiple attempts.
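The rotation arithmetic implied by these figures can be checked directly. The following sketch uses only the numbers quoted in the text; the variable names are illustrative, not from any MTA system:

```python
# Figures quoted in the text for the MTA panel.
PANEL_SIZE = 1500          # panel members
SURVEYS_PER_MONTH = 400    # completed telephone interviews per month

surveys_per_year = SURVEYS_PER_MONTH * 12                  # 4,800 per year, as stated
months_between_contacts = PANEL_SIZE / SURVEYS_PER_MONTH   # 3.75, i.e., "about every four months"

# Replacement rule: three successful contacts within 18 months retires a member.
# At one contact roughly every 3.75 months, a member accumulates three contacts
# in about 11 months, comfortably inside the 18-month window, so the rule
# produces steady, predictable panel turnover.
months_to_three_contacts = 3 * months_between_contacts

print(surveys_per_year, months_between_contacts, months_to_three_contacts)  # 4800 3.75 11.25
```

The arithmetic shows why the replacement rule, rather than natural attrition, drove the panel’s turnover.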
Replacements were also recruited by telephone to match the demographics of the person being replaced, maintaining the overall demographic and geographic profile of the region. Replacing panel members avoided concerns with creating “professional” respondents and ensured that a wide variety of viewpoints was being obtained. For example, New York City has a large immigrant population, with many more arriving every year. The study recruitment would naturally reach both new immigrants and newly arrived residents. As they adjusted to life in the city, however, their perceptions changed. Continuous refreshment with panel members new to New York maintained the “newcomer” perspective.

Implementation, Analysis, and Reporting

The survey was developed by MTA staff in coordination with the vendor. It collected travel behavior data, attitudes about travel in New York, satisfaction with service attributes, specifically focused questions (for example, tracking advertising or promotional campaigns), and demographics. The telephone call typically lasted 22–23 minutes, including the time required to connect with the correct person in the household and update contact information; the heart of the questionnaire took approximately 16 minutes.

Surveying was conducted seven days a week, with the exception of major holidays. Four hundred interviews were conducted each month (12–15 per day), providing a continuous stream of data. This allowed the surveys to be modified at very short notice. For example, when there was a need to understand what the public believed about cellular service in the subway, questions could be inserted and left in the panel survey for a specified period of time or until the desired sample size was reached (a sample size of 100 could be obtained in about a week). Questions related to advertising and promotional activities were typically fielded for six months to ensure that a sufficient portion of the population had seen the advertising.

The vendor provided data to MTA staff on a monthly basis. The data were weighted to mirror the population of New York City both demographically and geographically. (Weights were applied based on the profile of the entire panel, not the profile of the respondents from that particular quarter’s surveys.) Use of public transportation was not a weighting criterion.

MTA staff prepared the ad hoc and quarterly reports using Statistical Package for the Social Sciences (SPSS) software. Quarterly reports on public satisfaction with surface or subway service gave the operational departments a checklist of which areas were improving and which needed more attention. Ad hoc reports using monthly data provided an early warning system for new or growing problems in the system, such as concerns with station personnel or smells in the subway. Daily survey results, combined with the monthly data, provided a measurement of public response to specific issues. For example, if there was a significant security incident, public satisfaction with service attributes in the days before and immediately following the event could be tracked to better understand how public perceptions changed in response to the incident.

Benefits, Cost, and Concerns

Benefits

The primary benefit of the panel was the daily surveying, which provided a nimble, dynamic research tool. Because surveying was daily, information on public reactions was captured in real time, not by asking people to “think back to when . . . ” and then report on what they thought they believed at that time.
Operations managers for both the subway and bus systems also relied on this continuous approach and systematic feedback to alert them to developing issues and areas needing immediate attention.

The fluidity of this technique also allowed the addition of new questions as they arose from operations and management. A question could be crafted and implemented within days, and the quick turnaround of results helped inform management decision making.

Over the 15 years of the panel survey, a vast and in-depth database of information was developed, providing an in-depth understanding of the MTA customer and the image of MTA in the minds of the typical New Yorker. These longitudinal data could be mined for years after they had been collected.

Cost

The main drawback to the study was the cost. By 2010, when the program was ended, the average vendor cost was $250,000–300,000 per year to maintain the panel (including incentives and refreshment of the panel) and to field the surveys. The MTA project manager had daily contact with the vendor, managed the questionnaire, analyzed the data, and wrote the ad hoc and quarterly reports. Staff time needed was approximately 0.5 FTE.

Concerns

There were no concerns with this panel survey program other than the ongoing cost.

Legal, Ethical, and Privacy Issues

MTA did not have any legal, ethical, or privacy issues with its panel. The panel was developed and maintained by the vendor, with all contact funneled through the vendor. On occasion, a call from the public would be forwarded internally through MTA to the project manager, who would then pass the contact to the vendor to respond to the public inquiry. This provided a wall between the agency and the panel, protecting the integrity of the research and assuring anonymity for the panel members. In addition, it eliminated the concern that panel identities could be released through public records laws.
Lessons Learned/Elements for Success

MTA had 15 years of experience with ongoing panel research. The key element of success was replenishing the panel to keep it dynamic and fresh. Rules were put in place and strictly adhered to throughout the life of the panel effort. This kept participants from becoming “professional survey takers” and ensured a rotating panel with fresh ideas.

A second element deemed critical for the long-term success of the panel was having it address broad travel patterns and travel modes, not New York City public transit, to avoid sensitizing participants to subway and bus issues. For example, the survey would ask if the panel member traveled to the airport and, if so, what mode of travel he/she used to get there. The popular quarterly newsletter focused on travel and life in New York City, reinforcing that the survey was about urban culture, not public transportation.

To manage costs, MTA recommends a long-term contract with the vendor. The initial cost is in the set-up of the project. Agreeing on a long-term contract allows the agency to budget for a consistent expected expense, and the vendor may offer a better price in exchange for having a steady revenue stream for several years at a time.

WASHINGTON STATE TRANSPORTATION COMMISSION CASE EXAMPLE

Overview

In 2006, the Washington state legislature passed a law requiring the WSTC to conduct surveys of ferry riders every two years to help inform level of service, operational, pricing, planning, and investment decisions. (See Appendix D for relevant sections of the code.) WSTC conducted comprehensive, on-board surveys of ferry riders’ attitudes and opinions regarding ferry service. While these paper surveys provided excellent information from the riding public, they did not allow researchers an opportunity to follow up with the same riders over time to track trends, or to conduct additional research such as conjoint studies to support fare elasticity models. In addition, it was apparent that passengers were eager to share their opinions about the Washington State Ferries (WSF) system, incentive-free.

As a result, WSTC decided to create and maintain a panel of riders to communicate regarding service, fares, planning, and investments in the ferry system. The online panel, called the Ferry Riders’ Opinion Group (FROG), provides a mechanism to conduct the required ferry rider surveys and other topical surveys with quick turnaround times and timely feedback on issues of importance to the WSTC, the state legislature, and the WSF.

The program was developed by a market research consultant who reports directly to the WSTC. Together, they hired a team of market research vendors to implement the program, providing the software backbone of the project, panel management, survey invitations, and questionnaire programming software. The FROG online panel had more than 6,500 participants in 2012, and is open to anyone who wishes to join, providing both market research and public relations benefits for the state ferry system.
Panel Sampling, Recruitment, and Maintenance

Panel Sampling and Recruitment

Initial recruitment for the panel was conducted in the winter of 2010 using in-person intercepts to distribute a paper recruitment form. Passengers were approached either while waiting on the dock or while travelling on board the ferry and asked if they would be interested in being part of a panel of riders that could give the state feedback on ferry fares, service quality, and system operation issues. Those who were interested were asked to provide their names and e-mail addresses, which were entered into a database and then used to send invitations to join the panel. A secondary recruitment process was conducted the following summer to attract the more casual and recreational riders of WSF to the survey panel.

In subsequent years, riders were recruited to the FROG panel using a variety of traditional methods, including mailers, press releases in the print and TV media, and WSTC and WSDOT/WSF web postings. The WSTC has also experimented with recruiting panelists using smartphone-based QR codes, which are provided on posters on the ferries and at terminals. Passengers scan the QR code with a smartphone to vote on whether ferry service is a good or poor value for the fare paid, and are then connected to the FROG website where they can join the panel (see Figure 11). About 120 new panel members have joined as a result of following the QR link. The posters also provide the FROG website address so passengers can sign up directly without using the QR codes (an unknown number did so). Although the number of new sign-ups is relatively small in this case, the promotion also provides a visual reminder that the WSTC and WSF are interested in customer feedback, and it was used in conjunction with the start of a system-wide survey on fare media utilizing the FROG panel members.
Gathering input from occasional and/or recreational ferry riders, who are much less likely to become FROG panel members, is more difficult, yet their input is required under Washington state law. The WSTC’s primary survey efforts have been conducted twice over the two-year budget biennium: once in the winter and once in the summer. Summer is the prime time to gather input from the recreational riders in particular, but because they are unlikely to be panel members, the WSTC has had to rely more on traditional on-board surveying of this customer segment.

FIGURE 11 Poster to recruit membership in the FROG panel.

However, the plans for the summer 2012 surveying shifted toward a more technology-driven platform. Instead of handing out paper surveys, surveyors were to use notepad technology to conduct on-board surveys and input the data directly into the notepad device. Passengers were to be asked whether or not they are FROG members. Current FROG members would be encouraged to take the survey by means of the link in the e-mail sent to them. Non-FROG members would be asked if they would be interested in joining the FROG panel; if so, their e-mail addresses would be collected and they would receive an e-mail invitation to take the survey online. Passengers not interested in being part of the panel would be asked an abbreviated set of service quality questions. One of the main purposes of this process is to capture the casual and recreational riders who might be making their only trips on WSF, but who collectively make up a significant segment of WSF’s customers.

The move to technology-based data collection was made for three reasons: (1) collecting data on board the ferries in a fast and efficient paperless manner increases the probability of gathering input from riders who otherwise may not have participated in the survey; (2) with the notepad device, on-board interviewers are not approaching passengers with a clipboard, so riders are less likely to try to avoid the surveyor; and (3) the technology and QR codes may appeal to younger riders less likely to participate in a traditional paper-based survey. The move to technology has the added benefit of demonstrating that the WSTC is innovative and open to the input of riders in a variety of forms, thus raising public trust and interest in critical issues.
One of the benefits of a survey panel is that information collected on static questions can be saved in FROG members’ individual profiles and reused on future surveys. This reduces the burden on the respondent because subsequent surveys can omit redundant questions, making the survey shorter and more issue-focused without losing the static demographic information for data analysis. This information can also be appended to future surveys to provide a longitudinal view of the data; registrants are asked to establish a unique password that allows them to modify their panel profile submissions at a later point.

When registering to participate in the FROG panel, riders are required to provide contact information (e-mail address, phone number, home address, zip code, etc.), demographic information (gender, birth year, family size, education, household income, etc.), and baseline attitudinal information (ferry usage, trip purpose, perceptions of value, etc.). Individuals under the age of 18 are barred from registering based upon their birth date, but their contact information is retained, and future surveys are sent once they reach 18. The profile data are designed both to meet legislative requirements and to ensure specific rider segments (casual, vacation, and commuter) are correctly captured.

Membership in the panel is open to anyone who wishes to join, and members remain until they remove themselves from the e-mail list. The current pool of more than 6,500 ferry riders can provide reliable data at the system-wide, ferry route, county, and legislative district levels. The value of FROG research has been acknowledged by both WSF management and the legislature, resulting in greater utilization of survey findings and results. The WSTC’s goal is to continue to expand the panel so that it can provide reliable data at even more specific levels, such as sailing time or ferry terminal.
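The link between panel size and the reliability of fine-grained estimates can be illustrated with a standard margin-of-error calculation. This is a generic statistical sketch, not the WSTC’s actual methodology, and it assumes simple random sampling, which a volunteer panel does not satisfy; it shows only how precision scales with the number of responses behind each estimate:

```python
import math

def margin_of_error(n, p=0.5, z=1.96):
    """Approximate 95% margin of error for an estimated proportion p
    from n responses, under simple random sampling."""
    return z * math.sqrt(p * (1 - p) / n)

# A system-wide question answered by ~2,000 panelists is quite precise...
systemwide = margin_of_error(2000)   # ~0.022, about +/-2.2 points

# ...but a single sailing time might yield only ~100 responses.
one_sailing = margin_of_error(100)   # 0.098, about +/-9.8 points

# Reaching +/-5 points for such a subgroup takes roughly 385 responses,
# which is why finer-grained reporting requires a much larger panel.
target = margin_of_error(385)        # ~0.050
```

Because only a fraction of members respond to any one survey, and each sailing time or terminal sees only a slice of the panel, subgroup precision at this level requires a membership many times larger than the subgroup sample sizes themselves.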
It is anticipated that FROG will need to grow to about 18,000 members to provide data at that level of detail. This will require additional advertising through posters, announcements, and on-board surveys as the agency looks at new recruitment enticements.

Panel Maintenance

Panel membership has grown with the announcement of every study since the program’s inception in 2010. Because members are removed only at their own request, attrition has been limited. Maintaining the panel’s viability requires comparing the profile information of new members against that of those who drop out, so that recruitment efforts are scaled and targeted effectively.

Frequent interaction is essential to keeping panel members engaged. In a given two-year biennial state budget cycle, two large (35–50 questions) surveys are conducted, along with two to four smaller (15–25 questions) studies. To keep the panel members engaged between the larger surveys, simple “quick poll” surveys of one to three questions are conducted. The larger surveys are used to gather ongoing customer service-related data and test a variety of issues and ideas. In contrast, the smaller studies are typically focused on a given operational matter, such as riders’ ability to shift modes of transportation, capital funding approaches, and ferry fare strategies, as well as other key issues.

The quick-poll surveys, while limited in nature, keep the panel members interested and provide valuable data for the WSTC and WSF to consider. One of the benefits of the quick polls is that when panel members complete one, they are instantly shown how their responses compare with those of all other respondents. This feedback provides a non-monetary incentive to participate, allowing members to see how they line up with other ferry riders and how the system as a whole is responding to a given question.

No financial incentives for completing a survey have been provided to panel members up to this point.
This is a state-funded effort, and the project manager believes that using taxpayer funds to reward FROG participants is questionable and potentially controversial. In a separate study, the WSTC conducted a survey on statewide transportation funding and issues. Participation rates were somewhat low in certain parts of the state. After careful review, the agency decided to offer the opportunity to win airline tickets as an incentive to stimulate participation in target markets that were under-represented. To date, the agency has received no complaints regarding the use of this incentive, opening the door to exploring future incentives as a last resort. Another incentive concept that may be explored is creating a game for participants on the FROG panel, or “gamification”: the more a member participates, the more points he/she earns to advance in the game at play, similar to a simplified “Farmville”-type concept.

Implementation, Analysis, and Reporting

Implementation

The WSTC hired an expert market research professional to act as the commission’s project manager and oversee its survey efforts. The WSTC Executive Director and the contracted project manager together hired a market research company to handle questionnaire design, analysis, reporting, and presentations to legislative bodies and decision makers.

Surveys are deployed to the panel members by e-mail. Panel members can also log onto the FROG website at any time to see if new surveys are available or to review reports on past surveys. While surveys are being conducted, the online program checks the IP address of each computer being used to fill out a survey, so that the same computer cannot be used to submit duplicate surveys should panel members attempt to sway the results.

The panel is used to gather information regarding possible operational changes, pricing, and investment/funding approaches. Recent survey topics include capital funding, fare policies and fare levels, mode shift opportunities, elasticity of demand, and options for reservations.
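The duplicate-submission check described above can be sketched as a simple first-response-wins filter keyed on survey and IP address. The actual platform’s implementation is not documented in this report, and the field names below are invented for illustration:

```python
def drop_duplicates(responses):
    """Keep only the first response per (survey_id, ip) pair.

    responses: iterable of dicts, each with at least 'survey_id' and 'ip'.
    Later submissions from the same address to the same survey are dropped.
    """
    seen = set()
    kept = []
    for r in responses:
        key = (r["survey_id"], r["ip"])
        if key not in seen:
            seen.add(key)
            kept.append(r)
    return kept

submissions = [
    {"survey_id": "fares-2012", "ip": "203.0.113.5",  "answer": "support"},
    {"survey_id": "fares-2012", "ip": "203.0.113.5",  "answer": "support"},  # repeat attempt
    {"survey_id": "fares-2012", "ip": "198.51.100.7", "answer": "oppose"},
]
clean = drop_duplicates(submissions)
print(len(clean))  # 2
```

IP screening is a blunt instrument: household members or riders behind a shared network can legitimately share an address, so it is best combined with per-member credentials such as the password-protected profiles FROG registrants create.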
The completion rate from FROG panel members has typically been 20%–30%, regardless of the length or topic of the survey. The required customer satisfaction studies are conducted each biennium in winter (to capture input from commuters) and summer (to capture input from recreational riders). On-board intercept surveys are still conducted in the summer because many recreational riders are visitors to the area and unlikely to be FROG members.

Analysis and Reporting

Because panel membership is open, it is not representative of the ferry ridership unless panel data are adjusted against actual rider data. To ensure each survey is statistically reliable and projectable, results are weighted to match total ridership by ferry route and time of day during the period of the survey. Spot checks are conducted to make sure that responses by mode (vehicle driver, vehicle passenger, pedestrian, etc.) are also in line. As the total number of panel members grows, the ability to weight by mode within a route is enhanced.

The ferry survey research is typically analyzed at two levels: (1) the personal or rider level (one ferry rider = one vote), or (2) the trip level (one ferry ride = one vote). Some analyses are more appropriate at the rider level, such as opinions on how ferry improvements can be funded, whereas other analyses are more appropriate at the trip level, such as the percentage of ferry trips taken for recreation purposes. Using information on the frequency of ferry ridership, individual rider responses are weighted to represent trip-level data. Reports are typically presented from both the rider and total-trips perspectives.

The quick polls are designed to keep the panel engaged and provide data for future surveys. Results of all research conducted each biennium are provided to the legislature and posted on the WSTC website.
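The two weighting steps described above can be sketched as a small exercise. The route shares, respondent records, and field names below are invented for illustration; the WSTC’s actual scheme is more detailed (route by time of day, with spot checks by mode):

```python
from collections import Counter

# Hypothetical actual ridership shares during the survey period.
actual_share = {"Route A": 0.60, "Route B": 0.40}

# Hypothetical panel respondents, with self-reported ferry trips per month.
respondents = [
    {"route": "Route A", "trips_per_month": 40, "opinion": "support"},  # commuter
    {"route": "Route A", "trips_per_month": 2,  "opinion": "oppose"},   # occasional rider
    {"route": "Route B", "trips_per_month": 8,  "opinion": "support"},
]

# Rider-level weight: post-stratify so each route's respondents carry the
# route's true share of ridership (one ferry rider = one vote).
counts = Counter(r["route"] for r in respondents)
n = len(respondents)
for r in respondents:
    r["rider_weight"] = actual_share[r["route"]] / (counts[r["route"]] / n)
    # Trip-level weight: one ferry ride = one vote, so each rider's answer
    # is additionally scaled by how often he or she actually rides.
    r["trip_weight"] = r["rider_weight"] * r["trips_per_month"]

print([round(r["rider_weight"], 2) for r in respondents])  # [0.9, 0.9, 1.2]
print([round(r["trip_weight"], 1) for r in respondents])   # [36.0, 1.8, 9.6]
```

Rider-level tabulations (e.g., funding opinions) would use `rider_weight`; trip-level tabulations (e.g., share of trips taken for recreation) would use `trip_weight`, normalized so the weighted total matches estimated trips. The commuter’s answers dominate the trip-level view, exactly as one ride = one vote implies.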
As a new feature of the panel program, PowerPoint presentations and full reports given to the legislature are being posted on the FROG website so that panel members can easily access the information and see how their opinions were presented to representatives.

Benefits, Costs, and Concerns

Benefits

Panel surveys provide valuable longitudinal data that make it easier to understand and track what is happening with ferry riders and how those changes may be affecting ferry ridership. The panel has proven to be a good way to establish an ongoing dialogue between decision makers and the riding public. Most people appear to join the panel not to complain but to be part of the conversation and help shape the future of the ferry system. The result is a win/win outcome: decision makers hear from a vast number of riders, and the riders play an active role in setting policies that directly affect their lives.

FROG's success has reaped many direct benefits for the state legislature. Regulations have been drafted based on the results of a given survey and enacted after legislators saw how their constituents felt on the subject. This is because the surveys are conducted by an independent body, the WSTC, which has no political party connection, and by the WSTC's contracted market research firms, which serve as technical experts and also have no political affiliation. As a result, the data are highly credible, and both sides of the legislative aisle are able to use them to identify solutions that have bipartisan support.

As an example, one FROG survey tested support for paying a per-ticket fee of varying amounts if the revenues were dedicated to a capital fund for future ferry system improvements. The overwhelmingly positive response resulted in the

legislature adopting a 25¢ per-ticket fee and citing the FROG survey. An active conversation has begun in which the legislature has requested that questions be asked of the panel to shape future ferry-related legislation.

The success of the FROG panel has resulted in the expansion of the panel technique to other statewide transportation issues. A recent statewide survey on transportation needs and sources of funding for new projects resulted in the development of a statewide citizens' panel called VOWS (Voice of Washington Survey). The results from an initial survey in fall 2011 helped shape the passage of an electric vehicle tax with overwhelming support from the public and legislators.

Cost

The Washington legislature appropriated $350,000 per biennium for the FROG program. The funding paid for the project manager, various market research firms, and the vendor that provides the software platform. The WSTC and WSF have limited staff allocated to the program.

Concerns

There is a concern that the FROG panel could become skewed toward persons who have a high vested interest in the ferries. This has resulted in a desire to recruit more general and casual riders, not just the "enthusiastic" commuters. At present, the agency hopes that increasing the total panel membership will provide broader representation across all markets. This will require more public relations and advertising efforts to alert riders to the panel and help them understand the value of participating.

The value and viability of the panel data are directly tied to the size of the panel. The smaller it gets, the less the WSTC can do in terms of drilling down into detailed subsets of the data, such as slicing the data by time of day, sailing direction, etc. Again, keeping the overall panel size stable and growing will require advertising and public relations campaigns to elicit interest and participation.
Legal, Ethical, and Privacy Issues

There have been no overriding legal or ethical concerns regarding the implementation of the FROG panel. The only concern is with protecting panel participants' privacy and information that might be obtained from WSF through freedom of information petitions. State safeguards are in place to protect individuals' privacy, but it is unclear how much and what type of information would need to be provided if requested. This is an issue that would need to be resolved by the Washington State Attorney General's office.

The FROG website has a very detailed privacy policy informing panel members of the legal aspects of joining the FROG panel as a state-government-hosted activity. The legal information is provided on the Privacy Policy link at http://www.ferryridersopiniongroup.com.

In addition to the privacy policy, the screening questionnaire for joining the panel states:

Participation in the research is voluntary. Individual survey responses will be kept confidential and will only be used for statistical purposes. The Commission would like to encourage all customers to join the group and play a role in WSF's future.

Lessons Learned/Elements for Success

Electronic data collection is more effective and efficient than using paper surveys. Communicating with survey panel members and gathering data through the online FROG website can be done very quickly compared with traditional approaches.

There was an initial expectation that ferry riders would not want to participate in the FROG panel and complete surveys online; however, WSTC's experience with FROG demonstrates that an agency should not be reluctant to create online panels and solicit customer feedback actively and electronically. People are very willing to participate in the decisions that will be made regarding their services and how the government is spending their money.
The WSTC has received very few complaints out of all its interactions with customers. Initially, research results were not shared directly with the panel members, but there appears to be greater interest and value in sharing them. Survey results sent to the legislature have been posted on the WSTC website. More recently, the WSTC began posting the resulting survey reports on the FROG website so panel members can easily access them via the "My Reports" page on their FROG member account.

The results may not always be rosy from the agency's point of view, but publishing them on the panel website adds credibility to the process and demonstrates that the agency is listening to the customer. This builds trust and respect from the panel members, provides transparency to the process, and creates public relations value beyond the results of the survey.

TRB’s Transit Cooperative Research Program (TCRP) Synthesis 105: Use of Market Research Panels in Transit describes the various types of market research panels, identifies issues that researchers should be aware of when engaging in market research and panel surveys, and provides examples of successful market research panel programs.

The report also provides information about common pitfalls to be avoided and successful techniques that may help maximize research dollars without jeopardizing the quality of the data or validity of the results.
