
Review of the Marine Recreational Information Program (2017)

Chapter: 3 Sampling and Statistical Estimation for the Fishing Effort Survey

Suggested Citation:"3 Sampling and Statistical Estimation for the Fishing Effort Survey." National Academies of Sciences, Engineering, and Medicine. 2017. Review of the Marine Recreational Information Program. Washington, DC: The National Academies Press. doi: 10.17226/24640.

3

Sampling and Statistical Estimation for the Fishing Effort Survey

INTRODUCTION

Fishing effort, a key component required for the estimation of fishery removals, had historically been estimated with data collected from a random-digit-dialing (RDD) landline telephone survey within the Marine Recreational Fisheries Statistics Survey. The 2006 National Research Council (NRC) report cited a growing number of biases affecting the accuracy and precision of estimates under this study design. These included, for example, decreasing coverage of the angler population as the proportion of cell phone–only households grew, decreasing participation in telephone surveys in general, and increasing inefficiencies because of the inability to target households with one or more anglers. In response to these challenges, the National Marine Fisheries Service (NMFS) developed an innovative mail survey design built on an enhanced sampling frame to improve the effectiveness and appropriateness of fishing effort estimation for the Marine Recreational Information Program (MRIP).

This chapter discusses NMFS’s initiatives to research and address the 2006 recommendations, along with the present committee’s evaluation of those initiatives, recommendations for future pilot studies, and areas of focus to guide continuing improvements for the MRIP.

DATA COLLECTION AND SAMPLING FRAMES

The 2006 NRC report included several recommendations for improving the estimation of fishing effort, including a call for research to identify a “comprehensive, universal sampling frame with national coverage” and to address ever-decreasing response rates that limit the utility of the data. NMFS accepted this challenge and conducted a series of informative pilot studies in consultation with independent survey statisticians and survey methodologists. The committee briefly summarizes the findings presented in Andrews et al. (2014) below, beginning with relevant background on the original effort survey.

The Coastal Household Telephone Survey

The original fishing effort study, the Coastal Household Telephone Survey (CHTS), was a telephone survey conducted on a targeted random sample drawn from a list-assisted, landline RDD sampling frame. The intended population was all residents living in coastal county households identified by prespecified telephone area codes and exchanges associated with the geographic areas. The specific goal was to collect information from anglers regarding their fishing activities during the previous 2-month period (referred to as a wave). The 2006 NRC report pointed to potentially low data quality because of problems such as undercoverage bias from a growing proportion of households without a landline phone (Boyle et al., 2009; Blumberg and Luke, 2015; McCarthy, 2015), as well as already low response rates, which were projected to further decrease over time (e.g., Curtin et al., 2005; Keeter et al., 2006). Based on the experiences of several states with licenses or registries, the 2006 committee suggested that a national angler registry could provide considerable efficiencies for sampling and data collection and improved data quality over the RDD design.

National Saltwater Angler Registry

The National Oceanic and Atmospheric Administration (NOAA) established the National Saltwater Angler Registry (NSAR) on January 1, 2010 (NOAA, 2009). States with saltwater license registries were allowed to sign a memorandum of agreement (MOA) whereby their existing license frames could serve to meet the federal requirements. In accordance with the MOA, states agree to share data regarding their license holders or registrants, and in return, NMFS does not require anglers who fish in those states to register federally.1 These states, however, had (and still have) varying exemptions from having a license. As a result, coverage of the angler population is not consistent throughout the NSAR. Table 3.1 provides a summary of coverage issues for the NSAR related to exemptions. To address the viability of using the NSAR as a sampling frame, NMFS conducted a targeted pilot study summarized below.

Several steps can be taken to address the issues of undercoverage in the sampling frame (and nonresponse to the mail survey). For example, people with license exemptions can be interviewed at access sites, and a correction factor can be applied to the Fishing Effort Survey (FES). Also, survey personnel can try to contact people who do not return a mail survey by email or telephone.

___________________

1 50 CFR §600, Subpart P.

TABLE 3.1 Sources of Undercoverage due to the Registry Requirements of the NSAR by State^a

| State | Basic License Requirement | Exception for Fishing on Licensed For-Hire Vessel? | Exception for Anglers 16 and Under? | Other Major Exceptions? |
|---|---|---|---|---|
| Alabama | Salt Water License | Y | Y | N. License-exempt residents over age 64 and anglers fishing on state-licensed piers must obtain a state saltwater angling registration. |
| Alaska | Fishing License | N | Y | N. Residents over age 60 are exempt, but must obtain a Senior Alaska Resident card. |
| California | Fishing License | N | Y | Fishing on a public fishing pier. |
| Connecticut | Salt Water License | Y | Y | N |
| Delaware | Fishing License + FIN # registration; anglers declare intent to fish in salt water on FIN registration | Y | Y | N (persons exempted from the license requirement must still have a FIN #) |
| Florida | Salt Water License | Y | Y | Residents over age 65. Anglers fishing from state-licensed piers. |
| Georgia | Fishing License + Saltwater Information Program (SIP) registration | Y | Y | N |
| Hawaii | None | - | - | - |
| Louisiana | Salt Water License | N | Y | Persons who turned 60 yrs of age before 6/1/00 are exempt. Persons who turned 60 after 6/1/00 must have a senior license. |
| Maine | Saltwater fishing registration OR Freshwater License + statement that they fished in saltwater | Y | Y | N |
| Maryland | Bay and Coastal Fishing License OR Bay Boat License OR Bay and Coastal Registration if exempt (e.g., unlicensed angler on licensed boat) | Y | Y | Fishing from commercial pier. Piers provide list of users to registry. Persons otherwise exempted from license must obtain registration. |
| Massachusetts | Salt Water Permit | Y | Y | N |
| Mississippi | Salt Water License | N | Y | N |
| National Saltwater Angler Registry^b | Registration | Y | Y | Any person currently licensed by, or a resident exempted from the state’s license requirements by, an Exempted State; persons angling for non-anadromous species in state waters. |
| New Hampshire | Salt Water License | Y | Y | N |
| New Jersey | Salt Water Registration | Y | Y | N |
| New York | Salt Water Registration | Y | Y | N |
| North Carolina | Coastal Recreational Fishing License | Y | Y | Grandfathered lifetime license holders as of 1/1/2006; anglers fishing on licensed piers |
| Oregon | Fishing License | N | Y (under 12) | N |
| Pennsylvania | Fishing License + Pennsylvania Saltwater Angler Registration (lower Delaware River only) | N | Y | N |
| Rhode Island | Salt Water License | Y | Y | N |
| South Carolina | Salt Water License; Senior Fishing License | Y | Y | Anglers fishing on licensed piers. |
| Texas | Fishing License + a Saltwater Stamp | N | Y (under 17) | Persons born before 1/1/1930 are exempt. |
| Virginia | Tidal Waters Fishing License OR Boat License OR Fishing Information Program (FIP) Registration if exempt (e.g., fishing on licensed boat) | Y | Y | N. Anglers exempt from licensing must get FIP registration. |
| Washington | Fishing License | N | Y (under 15) | N |

^a https://www.countmyfish.noaa.gov/register.

^b Per www.countmyfish.noaa.gov/register: “Starting January 1, 2011, if you have a saltwater recreational fishing license or registration from any state or U.S. territory EXCEPT Hawaii, Puerto Rico, or the U.S. Virgin Islands, you are AUTOMATICALLY registered and do not need to take further action.” (NMFS Communication to 2016 committee, 22 June 2016.)

SOURCE: Modified from NMFS.


Pilot Study—Angler License Directory Telephone Survey

NMFS conducted pilot studies in several states with established angler registries, including the Angler License Directory Survey (ALDS). The data-collection methodology for the ALDS was similar to that of the CHTS, as both were telephone surveys, but its sampling methodology was different. CHTS samples were drawn from randomly generated telephone numbers in designated coastal area codes and exchanges, but no household information (e.g., name, angler license status) was available. ALDS samples, by contrast, were randomly selected from the licensure database with associated contact information that was sometimes incomplete or out of date. Thus, the pilot study afforded a direct comparison of the two sampling frames—landline RDD for the CHTS and the angler registry for the ALDS. Although the ALDS response rates were only “marginally higher” than those for the CHTS, the new sampling frame resulted in significant data-collection efficiencies through an increased number of interviews from the target population of saltwater anglers (Andrews et al., 2014). This research, however, also suggested that sizable coverage issues existed with the registries, related to errors in contact information (e.g., old/incorrect telephone numbers), state-specific exemptions, and anglers who should have had a license but did not; these coverage issues ranged as high as 70 percent in some states.

Pilot Study—Dual-Frame Telephone Survey

In addition to the angler registry, the 2006 report also suggested a general dual-frame approach to increase coverage of the target population for estimation of fishing effort (Lohr, 2007; Brick et al., 2011) and to increase data-collection efficiencies with already identified anglers, regardless of the chosen data-collection mode (e.g., telephone, mail). Building on the results of the ALDS pilot, NMFS examined the combination of the angler registry with a landline RDD survey. Acknowledging that anglers could be listed in either or both sampling frames, NMFS selected independent samples from each frame (CHTS RDD and ALDS registry) and then weighted the results from each frame to produce a series of unified estimates (e.g., Lohr and Rao, 2000). Response rates from the pilot survey were low and similar in magnitude to those for the CHTS; undercoverage concerns noted for the ALDS design remained. In addition, there was insufficient information to determine whether a sampled household/angler was listed in both frames to construct efficient weighting adjustments to lower nonresponse bias and coverage bias associated with those not present in either frame. Consequently, NMFS abandoned this alternative design.
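
To make the compositing idea concrete, the following minimal sketch (in Python, with made-up inputs; it is not NMFS code, and the compositing factor is purely illustrative) shows one common form of a dual-frame estimator in the spirit of Lohr and Rao (2000): domain totals are estimated separately for units found only on one frame, and the overlap domain is estimated from both frames and combined so it is not double counted.

```python
# Illustrative dual-frame composite estimator (Hartley-style). Frame A is the
# RDD frame, frame B the angler registry; the overlap domain contains units
# listed on both frames. All inputs are hypothetical.

def dual_frame_total(y_a, w_a, overlap_a, y_b, w_b, overlap_b, theta=0.5):
    """Estimate a population total (e.g., trips) from two overlapping frames.

    y_*: reported values per respondent; w_*: design weights (inverse selection
    probabilities); overlap_*: True if the respondent belongs to both frames;
    theta: compositing factor (between 0 and 1) for the overlap domain.
    """
    a_only = sum(w * y for y, w, ov in zip(y_a, w_a, overlap_a) if not ov)
    b_only = sum(w * y for y, w, ov in zip(y_b, w_b, overlap_b) if not ov)
    ab_from_a = sum(w * y for y, w, ov in zip(y_a, w_a, overlap_a) if ov)
    ab_from_b = sum(w * y for y, w, ov in zip(y_b, w_b, overlap_b) if ov)
    # Composite the two overlap estimates so the domain is counted only once
    return a_only + b_only + theta * ab_from_a + (1 - theta) * ab_from_b
```

The difficulty NMFS encountered is visible in the overlap flags: without reliable information on whether a sampled household appears on both frames, these indicators cannot be set correctly and the estimator inherits the resulting classification errors.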

NMFS evaluated neither a landline/cellular dual-frame RDD design for the estimation of fishing effort nor a telephone survey using cell phone numbers alone. Instead, it assessed prior research on telephone surveys for characteristics relevant to the needs of the MRIP. For example, since May 2004, residents may “port” their landline (or wired) numbers to a cellular (wireless) carrier and device (FCC, 2016). In addition, cell phone numbers can travel, meaning that the number assigned upon activation need not change when a person moves to another area in the United States. Thus, ported-landline and cell phone numbers introduce inefficiencies in data collection because they are not necessarily linked to the geographic areas targeted by the MRIP (e.g., Keeter et al., 2015).

Fishing Effort Survey

The ALDS research uncovered sampling and data-collection efficiencies in using the NSAR as a sampling frame (as suggested in the 2006 report), but NMFS noted that the remaining undercoverage could limit the quality of the fishing effort estimates. Additionally, the general, ongoing decline of response rates to telephone surveys was a growing challenge. For example, Brick et al. (2011) discovered that the coverage of the CHTS was only about 50 percent in the aggregate of Florida, Massachusetts, New York, and North Carolina, and that the aggregate response rate was around 10 percent, while a test mail survey resulted in a response rate of greater than 30 percent. Consequently, NMFS evaluated the feasibility of a mail survey.

Address-based sampling (ABS) frames have been available to the public since the early 1990s (Iannacchione, 2011). These frames are developed from commercially available versions of the U.S. Postal Service’s Computerized Delivery Sequence (CDS) file, the route taken by postal carriers to deliver mail. The CDS, like the NSAR, alone is not a complete list and is therefore subject to undercoverage. The CDS may be supplemented with information to produce a more complete sampling frame. Supplemental files include, for example, the No-Stat file, a file containing more than seven million primarily rural mailing addresses not listed on the CDS (Shook-Sa et al., 2013), and ancillary data from public and private sources related to population demographics and other characteristics (AAPOR, 2016). With augmentation of the No-Stat file alone, ABS frames provide near-complete coverage of the U.S. household population (Iannacchione, 2011).

NMFS then tested a new list that incorporated the coverage benefits of ABS and a state-specific licensure database (NSAR) for the new FES. All ABS addresses in relevant East and Gulf Coast states were retained, excluding group quarters without individual unit addresses (e.g., correctional and nursing facilities; Reist, 2012) and known businesses. Additional records found on the NSAR that did not match information on the ABS address list were also retained, including those with addresses outside the coastal state. Addresses on the new FES sampling frame were then stratified (grouped) within state to allow for differential sampling by (1) coastal counties (within a specified distance from the coast) versus noncoastal counties and (2) NSAR exact match (address and/or telephone number, if available) versus no match (Andrews et al., 2014). The sampling literature refers to this design as a single-stage stratified design (e.g., Valliant et al., 2013).
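
The mechanics of such a single-stage stratified design are straightforward; the sketch below (illustrative Python, with invented stratum labels, frame sizes, and sampling rates rather than the actual FES specifications) shows how differential sampling rates by stratum translate into base weights equal to the inverse of the selection probability.

```python
# Minimal sketch of a single-stage stratified sample with differential rates.
# Stratum definitions, frame sizes, and rates are hypothetical.
import random

frame = {  # stratum label -> list of addresses on the frame (placeholders)
    ("coastal", "nsar_match"): [f"addr_{i}" for i in range(10_000)],
    ("coastal", "no_match"): [f"addr_{i}" for i in range(90_000)],
    ("noncoastal", "nsar_match"): [f"addr_{i}" for i in range(5_000)],
    ("noncoastal", "no_match"): [f"addr_{i}" for i in range(200_000)],
}
rates = {  # oversample the NSAR-matched strata
    ("coastal", "nsar_match"): 0.02, ("coastal", "no_match"): 0.005,
    ("noncoastal", "nsar_match"): 0.02, ("noncoastal", "no_match"): 0.001,
}

sample = []
for stratum, addresses in frame.items():
    n_h = max(1, round(rates[stratum] * len(addresses)))
    base_weight = len(addresses) / n_h  # inverse probability of selection
    for addr in random.sample(addresses, n_h):
        sample.append({"stratum": stratum, "address": addr, "weight": base_weight})
```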

The FES pilot test using the new address frame resulted in impressive improvements over its predecessor survey, the CHTS (Andrews et al., 2014). The augmented-ABS frame enabled a direct link to coastal households through geolocation information; this provided efficient sampling and data-collection methods to target angler households. In addition, this new approach provided a new level of stratification for sampling associated with licensure status (Yes versus No/Unknown). Samples in the matched strata were then drawn at a higher rate to gain efficiency under the assumption that these strata have higher rates of saltwater anglers. Finally, a subsample of nonrespondents was contacted to assess nonresponse bias; data collected did not show any detectable levels of nonresponse bias, suggesting high-quality data. Many studies include nonresponse follow-up components in their study designs to measure and adjust for nonresponse bias, following guidance provided in Standards and Guidelines for Statistical Surveys from the U.S. Office of Management and Budget (OMB, 2006).

FES documentation to date is not clear on the extent to which the sampling frame is augmented beyond the NSAR, such as with the No-Stat file. This suggests an area of future research toward ensuring maximal coverage of the coastal-state household population, especially for those with private boat docks. Additional augmentation of the FES frame could afford further targeted sampling and associated data-collection efficiencies if information from, for example, market research vendors proves fruitful.

The 2006 report recommended that a dual-frame survey (i.e., using more than one sampling frame to draw a probability sample) “should be used wherever possible to reduce sample bias” associated with undercoverage noted for the single sampling frame design (see Chapter 2 discussion). Although the FES is not a true dual-frame design as suggested by the 2006 report, NMFS correctly argues that the advantage of its approach is that it avoids biases in the dual-frame estimator that result from errors in identifying households listed on multiple frames (Andrews et al., 2014). These records can be identified either prior to sample selection through frame matching or after data collection through respondent-provided information, such as whether the respondent holds a saltwater fishing license. Here, frame-matching errors create a modest loss of efficiency but do not create bias, because the weights remain the inverse of the probability of selection and all households are covered by the frame.

Results from the FES pilot study were striking. The new study design produced a 1.6-fold increase in the likelihood of surveying a household with at least one angler over the other pilot designs evaluated. There was also a threefold increase in the response rate, along with a 4.1-fold increase in “the mail survey estimate of total fishing effort” relative to the CHTS (Andrews et al., 2014). The sizable increase in estimated effort from this pilot does not necessarily suggest higher-quality data (e.g., lower nonresponse bias), but it could indicate a true change over time; without a “gold standard” (from another survey or source) against which to compare, the reason for the change is only speculative.

NMFS officially launched the FES in January 2015 in tandem with the CHTS for states in the Atlantic and Gulf Coast regions (NMFS, 2016a). Administering the two surveys simultaneously is part of the 3-year plan to transition from the CHTS to FES while gathering needed information to recalibrate historical CHTS estimates to adjust the data series. The FES achieved an overall response rate of 35.1 percent (ranging from 32.3 percent to 44.7 percent), almost 28 percentage points higher than the CHTS (7.3 percent, ranging from 4.6 percent to 11.2 percent). As with the pilot study, fishing effort estimated from the FES was 4.7 times larger than the value tabulated from the CHTS responses.

Other Data-Collection Research

The 2006 NRC report also mentions the need to investigate other modes such as electronic data collection. Although much has been accomplished, to our knowledge NMFS has yet to investigate the role of electronic data collection (e.g., a web-based survey) either alone or in combination with an initial mode of data collection. This investigation, however, should proceed with caution. For example, providing participating households with access to a web instrument (in lieu of completing the mail questionnaire) may provide cost savings and reduce the time needed to key and process the data. However, the digital divide (Horrigan, 2015) may result in coverage bias by excluding lower-income households, which would suggest that this option is not viable for full implementation. Data collection via a smartphone application (app) or text messaging may supplement the web option in a mixed-mode survey as long as the questionnaire remains relatively short (Link et al., 2014); coverage bias is less likely a concern here because an estimated 92 percent of adults in the United States have a cell phone (Anderson, 2015).

However, mixed-mode surveys, that is, those with multiple ways for a respondent to provide information (e.g., mail a hardcopy questionnaire or use the web), present several advantages and disadvantages (Dillman and Messer, 2010). Advantages include reduction of errors associated with non-negligible nonresponse and possibly reduction of survey costs. Disadvantages include mode effects (differential patterns of reported information associated with the data-collection methods), lower data quality (Sakshaug et al., 2010), and possibly lower participation rates (Medway and Fulton, 2012).


Fishing Effort Estimates from the For-Hire Surveys

The CHTS includes only households with a landline area code and exchange linked to coastal counties of the United States. The FES sampling frame is much larger, consisting of all addresses in the East and Gulf Coast states. Unlike the pilot study, the FES sampling frame excludes anglers identified from licensure databases who live outside these coastal states. Consequently, undercoverage in the FES frame will remain if this list does not adequately cover noncoastal state anglers. Because the MRIP’s scope covers recreational angler fishing effort regardless of where the person lives, both the CHTS and FES may include some level of undercoverage in the fishing effort estimates if the adjustment for noncoastal anglers estimated from the Access Point Angler Intercept Survey (APAIS) is somehow insufficient (Fisheries Statistics Division, 2016). The committee welcomes ongoing analyses of FES coverage, both before and after the APAIS adjustment is applied, along with the direct evaluations of the APAIS noncoastal adjustments.

The For-Hire Survey (FHS) was designed to collect information on “fishing effort and catch by marine recreational anglers fishing on professionally licensed for-hire vessels (including charter, guide, and large party boats)” simultaneously (Sauls et al., 2008). The FHS was initially “developed to resolve undercoverage of charter and headboat angler effort” inherent in the CHTS for the Atlantic and Gulf Coasts (NMFS, 2014b). The committee presumes that the FHS may also provide an undercoverage adjustment for the FES to either confirm or supplement the APAIS adjustment. NMFS states in the MRIP Data Users Handbook that most anglers who take these types of boat trips do not live in coastal states (NMFS, 2014a).

Unlike the CHTS and FES, the FHS includes samples of for-hire vessels selected from a “comprehensive directory of for-hire boats” stratified by vessel type, state, and week within the data-collection wave. To date, the committee is unaware of studies to assess and address the coverage properties of the FHS sampling frame and agrees with a consultants’ report that stresses the need for a comprehensive list (Chromy et al., 2009). The handbook notes that an adjustment factor from the APAIS is applied to the FHS effort estimates to account for angler trips on for-hire vessels not on the sampling frame. Details of the undercoverage adjustment and other survey weight components are found in Sauls et al. (2008). Evaluative studies along with documentation on sampling, frame coverage, and other measurement issues for the FHS would benefit the MRIP and provide needed information to the public.
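
The handbook’s description implies a simple ratio-style coverage adjustment; its generic form is sketched below (a hypothetical illustration, not the documented FHS procedure, which is detailed in Sauls et al., 2008).

```python
# Generic frame-coverage ratio adjustment: inflate an effort estimate based on
# listed vessels to account for trips taken on vessels missing from the frame.
def coverage_adjusted_effort(frame_effort, share_on_frame_vessels):
    """share_on_frame_vessels: proportion of intercepted for-hire angler trips
    (from APAIS) taken on vessels that appear on the FHS sampling frame."""
    return frame_effort / share_on_frame_vessels

# Hypothetical example: if 90 percent of intercepted for-hire trips occurred on
# frame vessels, a frame-based estimate of 45,000 trips becomes 50,000.
adjusted = coverage_adjusted_effort(45_000, 0.90)
```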

A vessel representative is contacted by telephone to relay details of the fishing trips that occurred during the prior week, including the number of customers who fished for a particular period. The committee noted that the survey does not ask respondents to identify the number of anglers living outside the coastal areas, or whether those anglers have their own fishing licenses (and hence are captured on the NSAR). Thus, the committee cannot confirm or refine the statement that most anglers on Atlantic/Gulf Coast for-hire vessels are from noncoastal areas. Gathering such information may be feasible from a cost and burden perspective if NMFS collects electronic logbook information from the vessel captains as recommended by Chromy et al. (2009). Additional information on the FHS is discussed in Chapter 4.

SURVEY MATERIALS

The 2006 report did not provide key recommendations for the study questionnaire used to estimate effort. However, noting that interviewer-assisted questionnaires are different from self-administered ones (Groves et al., 2009; Dillman et al., 2014), both in form and in content, NMFS set out to develop and test a new questionnaire (also called the survey instrument) for the FES.

NMFS used cognitive testing (Groves et al., 2009) to evaluate changes in the short CHTS instrument to improve, for example, the angler’s ability to report on saltwater fishing sites to the exclusion of freshwater sites. NMFS also focused on telescoping errors where respondents inadvertently include or exclude fishing trips from the designated 2-month reporting period (Gaskell et al., 2000). Pilot studies conducted by NMFS suggest that these challenges have been reduced, although they recognize that some “residual reporting errors” may still exist.

As with the CHTS questionnaire, the FES questionnaire is relatively short, covering both sides of one page. The FES questionnaire contains 10 questions on weather information, whether anyone in the household has been fresh- or saltwater fishing in the past 12 months, the type of telephone service, household tenure (e.g., rent, own), length of stay, and household size. Nonfishing questions are included in the FES questionnaire based on research that shows such items increase participation from non-angler households (NMFS, 2014a). Six questions are asked of, at most, the five oldest members of the household: demographics, whether they saltwater fished from shore/boat, and the number of days fished by location in the designated 2-month period and within the past 12 months. Because of the undercoverage of private-access anglers in the intercept survey, an additional question to determine whether respondents used public or private access would provide valuable information to the MRIP.

Materials included with the mail questionnaire are a cover letter providing details about informed consent to participate in the FES, along with frequently asked questions, a prepaid return envelope, and a small cash incentive. Noting the challenges respondents face in distinguishing fresh- from saltwater locations, NMFS could evaluate the utility of including a state map with identified saltwater access points. This may also improve angler recall.

Another item of note is the 2-month recall period common to both the CHTS and FES. Limited documentation is available on the historical decision to set 2 months as the recall period (Groves et al., 2009; see also Chapter 1) for the CHTS, other than methodological studies conducted in the 1970s that suggested a recall period longer than 2 months would result in unreliable estimates (NMFS, 2014a). Presentations from NMFS to the committee in open meetings suggest that reducing the size of the data-collection wave and the recall period could increase the FES sample size and consequently the cost of the study. NMFS will soon report on results from a recent pilot experiment comparing a 1-month recall period against the standard 2-month period. Any design changes to address non-negligible recall bias should be made in light of sample size (cost) implications, along with effects on when estimates are made available to the public.

Noting problems associated with recall bias, research has been conducted using prospective data-collection techniques. For example, the Migratory Bird Hunter Survey requests sample members to maintain a prospective diary to record hunting trips during the season (USFWS, 2016). However, some research suggests that prospective diaries could increase participant burden and lower response rates, and should be evaluated through a pilot study (Fricker and Tourangeau, 2010).

Prospective electronic data submission by a household respondent, perhaps through smartphone or tablet apps, may ease these concerns and could be a focus for future research. The ability to “capture data in the moment” may reduce recall bias, an issue raised about the CHTS and FES, provided that the participation burden does not affect participation rates (Link et al., 2014). Therefore, NMFS is encouraged to consider a prospective design with electronic data collection as a future pilot study. As discussed in Chapter 8, NMFS should consider implications on the data series when evaluating the pros and cons and introducing enhancements to the FES.

SAMPLE DESIGN

The CHTS sample design is described as a stratified simple random sample of RDD landline telephone numbers associated with targeted coastal areas and subareas. The study telephone numbers are randomly selected from banks of 100 numbers with at least one working residential landline phone (1+ banks), excluding those designated as a business (Link et al., 2008). Biases associated with undercoverage (excluding, for example, cell phone–only households) and cognitive burden in recalling fishing trips in the past 2 months during a brief telephone interview are a few challenges noted in the 2006 report and by NMFS (Andrews et al., 2014).

The FES sample design, by contrast, is a stratified simple random sample selected bimonthly from an ABS frame of addresses in Atlantic and Gulf Coast states. Mutually exclusive strata (groups of addresses) are defined by the interaction of county proximity to the coast and NSAR match (yes/no) within each state under the FES purview, all important characteristics to the estimation of the annual fishing effort. NMFS uses differential sampling rates to target strata with a higher likelihood of interviewing anglers without sacrificing coverage (Andrews et al., 2014). Just prior to data collection, NMFS augments the sample with state-specific license registry data linked to the NSAR to ensure current contact information such as telephone number (Fisheries Statistics Division, 2016). Documentation on the augmentation could benefit from additional clarity regarding the point at which the frame is updated with critical information such as the NSAR match (yes/no; see also Chapter 7 recommendation regarding enhanced documentation).

The FES is designed to produce cross-sectional (i.e., yearly) fishing effort estimates by state. As noted in documentation provided to this committee by NMFS, the state-level annual estimates are expected to be precise, with a target coefficient of variation (defined as 100 × the standard error of the fishing effort estimate divided by the estimate) of no greater than 20 percent, assuming historical response rates (Fisheries Statistics Division, 2016). An optimal allocation methodology determines the distribution of cases across strata within each state. Requiring the FES to produce precise estimates for in-season estimation is not feasible given time and funding constraints. Doing so would require specialized surveys for this purpose—consider, for example, the red snapper survey field tests being conducted in Alabama, Florida, Mississippi, and Texas in collaboration with NMFS (Sharpe, 2016)—and/or specialized statistical methodology.
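
The precision target and the allocation step can be illustrated with a short sketch (Python, using the CV definition given above and a standard Neyman-type allocation; the stratum sizes and standard deviations are invented and are not the FES inputs).

```python
# Sketch of the precision measure and an optimal (Neyman) allocation of the
# state sample across strata. All numeric inputs are hypothetical.

def coefficient_of_variation(estimate, standard_error):
    """CV = 100 * SE / estimate; the stated target is no greater than 20."""
    return 100.0 * standard_error / estimate

def neyman_allocation(total_n, strata):
    """Allocate total_n addresses in proportion to N_h * S_h for each stratum h."""
    shares = {h: n_h * s_h for h, (n_h, s_h) in strata.items()}
    denom = sum(shares.values())
    return {h: max(1, round(total_n * share / denom)) for h, share in shares.items()}

strata = {  # stratum -> (frame size N_h, std. dev. of trips per household S_h)
    "coastal_match": (10_000, 4.0),
    "coastal_nomatch": (90_000, 1.5),
    "noncoastal_nomatch": (200_000, 0.4),
}
print(neyman_allocation(2_000, strata))
print(coefficient_of_variation(estimate=1_200_000, standard_error=180_000))  # 15.0
```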

The 2006 report recommended the evaluation of panel designs for estimation of effort, noting both pros and cons of this alternative design. With survey panel designs, all or a portion of the sample is interviewed across multiple data-collection periods (e.g., Lavrakas, 2008). The 2006 committee focused specifically on the benefits of a rotating panel design, whereby change between 2 years of the study can be estimated along with the cross-sectional estimates as currently implemented. Also mentioned in the 2006 report were the potential benefits of such a design (multiple panels of sample members are brought in and removed from the study at a designated frequency) for maintaining or even increasing response rates (e.g., Lavrakas, 2008). To date, panel pilot studies have been conducted in Texas to evaluate the utility of the iSnapper app for prospective collection of catch data on red snapper (Stunz et al., 2014; NOAA, 2016a) and in North Carolina and Florida to assess the feasibility of collecting catch and effort simultaneously (NOAA, 2016a).

NMFS is cognizant of bias associated with nonresponse and has included design components to mitigate this challenge. All sampled FES households that do not respond within a specified time period receive a reminder postcard. The third and final mailing to remaining nonrespondents includes a nonresponse conversion letter, a second questionnaire, and a postage-paid return envelope, delivered together via first-class mail. In addition, the pilot studies that informed the development of the current FES design included an evaluation of nonresponse bias (Groves and Couper, 1998). NMFS conducted a small nonresponse follow-up study on a random subsample of nonrespondents (see, for example, Valliant et al., 2013, for a discussion of nonresponse follow-up studies). This subsample was contacted again using priority mail and an additional cash incentive. Comparison of the “fishing prevalence” estimates did not uncover substantial differences between the initial and follow-up respondents. However, the documentation does not discuss differences in fishing effort. NMFS should evaluate the utility of including a nonresponse follow-up as part of the standard FES design as an ongoing evaluation of nonresponse bias.

WEIGHTING

Historically, fishing effort was estimated with the CHTS as the weighted number of saltwater fishing trips made by coastal-area residents inflated to include an estimate for noncoastal area residents tabulated from the APAIS (Chapter 4). With the FES, nonresident anglers—those on the NSAR and without a corresponding address on the ABS frame—contribute data for the nonresident estimates, while those on the ABS frame provide the core effort estimate. These surveys are referred to as the FES Nonresident Angler Survey (NAS) and the FES Resident Angler Survey (RAS) in some documentation (NMFS, 2013).

Regardless of the survey, the base weights (inverse selection probabilities) are adjusted for nonresponse to mitigate biases potentially present in the respondent data if those who decline to participate have differing levels of effort (e.g., Lohr, 2010; Valliant et al., 2013). NMFS uses a nonresponse weighting class adjustment with classes formed from information available for all sampled households, namely, the interaction of state and coastal/noncoastal area and, additionally for the FES, match/nonmatch with the NSAR and the presence of a telephone number linked to the sampled ABS address. Survey weights are generated independently for the RAS and NAS. With this methodology, respondents and nonrespondents ideally respond similarly for key study questions within groups formed by the weighting classes (Valliant et al., 2013; Haziza and Lesage, 2016). This is a strong assumption made for all surveys using this approach.
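
A weighting-class adjustment of this kind has a simple form: within each class, respondents’ base weights are inflated by the ratio of the class’s total weight to its respondent weight, so that respondents also represent the nonrespondents in their class. The sketch below is illustrative Python with hypothetical field names, not the FES production code.

```python
# Weighting-class nonresponse adjustment: inflate respondent base weights by
# (sum of weights for all sampled units in the class) / (sum for respondents).
from collections import defaultdict

def nonresponse_adjust(sample):
    """sample: dicts with 'nr_class', 'weight', and 'responded' keys."""
    total_w = defaultdict(float)
    resp_w = defaultdict(float)
    for unit in sample:
        total_w[unit["nr_class"]] += unit["weight"]
        if unit["responded"]:
            resp_w[unit["nr_class"]] += unit["weight"]
    adjusted = []
    for unit in sample:
        if unit["responded"]:
            factor = total_w[unit["nr_class"]] / resp_w[unit["nr_class"]]
            adjusted.append({**unit, "nr_weight": unit["weight"] * factor})
    return adjusted
```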

As noted previously, the FES ABS frame appears to contain only information appended from the NSAR and no other source. NMFS may find that a nonresponse model enhanced with NSAR information could prove of benefit for the matched sample if item nonresponse and data quality were sufficient to warrant investigation. Additional model covariates may be obtained through supplemental information provided on the ABS frame (e.g., indicator for a seasonal home) or market research vendors (AAPOR, 2016).

In the final step, NMFS calibrates the nonresponse-adjusted weights for the study respondents to the estimated number of households by substate sampling strata from the American Community Survey (ACS; Fisheries Statistics Division, 2016). Not only does this procedure align the estimated number of households with the ACS, but weight calibration has also been shown to lower both sampling and nonsampling errors if relevant variables are available for respondents and from the population (Kott and Chang, 2010; Kott, 2016). Data obtained through the FES questionnaire, such as household tenure, may prove advantageous for enhancing the calibration model. In addition, as noted in the 2006 report, a rotating-panel survey could afford detailed variables for nonresponse adjustment for panel members who participated in the first year of the study but not the second.
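
The calibration step, in its simplest poststratification form, rescales the nonresponse-adjusted weights so that estimated household counts match the external (ACS) control totals; the sketch below is a hypothetical illustration of that ratio adjustment, not the calibration model NMFS actually fits.

```python
# Simple poststratification: scale nonresponse-adjusted weights so the weighted
# household count in each cell equals an external control total (e.g., ACS).
def poststratify(respondents, control_totals):
    """respondents: dicts with 'cell' and 'nr_weight'; control_totals: cell -> ACS count."""
    cell_sums = {}
    for r in respondents:
        cell_sums[r["cell"]] = cell_sums.get(r["cell"], 0.0) + r["nr_weight"]
    out = []
    for r in respondents:
        factor = control_totals[r["cell"]] / cell_sums[r["cell"]]
        out.append({**r, "final_weight": r["nr_weight"] * factor})
    return out
```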

In summary, FES weighting methodology includes three key components: inverse probability of selection, an adjustment for nonresponse, and poststratification. Documentation to date does not suggest any treatment for mail packets returned as undeliverable or weight adjustments for ineligibility (e.g., vacant households). NMFS could consider a separate unknown-eligibility adjustment especially if the proportion of the sample with no contact is large. Otherwise, the fishing effort estimates could be overinflated because the weighting methodology must assume the same rates of recreational fishing for unoccupied households as calculated from responding households. We assume that the population control totals do not include unoccupied households and therefore address this issue. However, enhanced documentation on the weighting methodology would benefit NMFS now and in the future, as well as provide additional information for the public at large.

Additionally, if NMFS further expands the bimonthly design to include a nonresponse follow-up, then further research is needed to evaluate the weighting methodology in light of a two-phase design, where phase 1 is the current FES design and phase 2 is the nonresponse follow-up. For example, correlates of nonresponse may differ by phase, suggesting a different nonresponse adjustment for the follow-up study. Enhancing the complexity of the design and/or the weighting methodology must be carefully evaluated to determine relative gains in efficiency and data quality without delaying release of the estimates or affecting continuity of the data series (see Chapter 8).

NMFS also could use the FES to estimate the number of households with at least one angler in U.S. coastal states. If these FES estimates do not align with the population, then estimated effort could be severely biased low or high. Consider this generic example: Unbeknownst to the research team, the FES sampling frame had 25 percent undercoverage of the angler population, a conservative estimate given the 70 percent result cited in Andrews et al. (2014). A higher proportion of sample addresses was drawn from the NSAR-matched cases, consistent with the current design. In keeping with leverage-saliency theory—where people who are interested in the survey topic are more likely to participate in the survey (Groves et al., 2000)—the response rate from angler households was 2.5 times higher than from non-angler households. For convenience and simplicity, we ignore the effect of measurement error in the data provided by the participating households. Instead, we focus on the final weight calibration step noted above. If the base weights are adjusted to the frame totals, then the estimated number of angler trips (effort) could be underestimated because of the undercoverage bias. Conversely, if the base weights are adjusted to the ACS totals, then the estimated number of angler trips could be overestimated because of nonresponse bias.


As demonstrated through this simple example, comparison of the estimated number of households from the FES with other sources is very important. However, to our knowledge, such information is nonexistent. Consequently, NMFS should consider collaborating with other federal agencies to include, on their surveys (such as the American Time Use Survey), items to estimate the number of recreational saltwater anglers or households with at least one angler. These external estimates could be used to verify the FES estimates or as covariates in an FES weight calibration model to reduce nonsampling biases (Dever and Valliant, 2010).

DATA QUALITY AND MISSING DATA

Many components define data quality, including coverage, nonresponse (both item and unit), questionnaire content, data entry, and sampling error. Biemer and Lyberg (2003) provide a framework for assessing quality through the lens of total survey error (see also Chapter 2 discussion). NMFS has made great strides in redesigning the effort survey to lower bias and improve data quality. However, the assessment of data quality must be ongoing, such as by including a nonresponse follow-up as a standard component of the FES design.

Other potential issues include respondent compensation, respondent perceptions, and the validity of retrospective data. The wave follow-up methods in use (e.g., reminder postcard) appear adequate and fit within the framework of standard mail-out survey methodologies. Based on the findings of an MRIP-sponsored pilot study, NMFS determined the optimal compensation to surveyed households to be $2.00 (Andrews et al., 2014). The findings of the study appear reasonable, and the choice of $2.00 reflects a careful consideration of the tradeoffs between nonresponse reduction and survey cost. Another potential problem is respondent perceptions. Respondents’ perceptions of government, the value of the MRIP survey, and the effectiveness of management efforts at various levels may vary by state. Variations in response rates across states, considered when determining the final sample size for each state, should be monitored and assessed on a regular basis. For states with particularly low response rates, efforts should be made to research underlying reasons (e.g., insufficient incentive), perhaps through a nonresponse follow-up, and to develop appropriate strategies to mitigate the problem.

Although discussed earlier, concerns about the validity of retrospective data certainly require further scrutiny. At least one pilot project is reviewing the measurement error and validity of the 2-month reference period for estimating effort in the mail-out surveys. Potential measurement error is certainly one problem. However, the problem is complicated by the fact that one person from each FES household likely reports on the fishing effort of the other members of the household, similar to the CHTS design. There is potential for measurement error when respondents report their own effort, as well as when they report other household members’ effort, for the same 2-month time period. Thus, measurement error in this case can take various forms.

One area of data quality not covered in the current FES documentation is item nonresponse, that is, missing responses to some questions from an otherwise completed questionnaire (Haziza, 2009). Item nonresponse can be addressed with imputation, where the missing value is replaced with a valid response using a defined model. Conversely, missing values can be excluded from the household-level estimates; in the case of fishing effort, this assumes no effort (fishing trips) for one or more household members. NMFS is encouraged to report on the level of item nonresponse and to identify procedures to address the incomplete information, because it has direct implications for estimation. Methods could include weighted hot-deck imputation (Cox, 1980) with predefined classes for quick implementation, or more advanced techniques for questions with high item nonresponse or an increased likelihood of rounding bias (e.g., Huttenlocher et al., 1990).
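
Weighted hot-deck imputation is simple to sketch: a record with a missing trip count receives the value of a donor drawn from the same imputation class, with donors selected with probability proportional to their survey weights. The Python below is a minimal illustration with hypothetical field names and class definitions, not a production imputation system.

```python
# Weighted hot-deck imputation (in the spirit of Cox, 1980): donors are drawn
# within imputation classes with probability proportional to their weights.
import random
from collections import defaultdict

def weighted_hot_deck(records, value_key="trips", class_key="imp_class",
                      weight_key="weight", seed=2017):
    rng = random.Random(seed)
    donors = defaultdict(list)
    for r in records:
        if r[value_key] is not None:
            donors[r[class_key]].append(r)
    out = []
    for r in records:
        if r[value_key] is None and donors[r[class_key]]:
            pool = donors[r[class_key]]
            donor = rng.choices(pool, weights=[d[weight_key] for d in pool], k=1)[0]
            out.append({**r, value_key: donor[value_key], "imputed": True})
        else:
            out.append({**r, "imputed": False})
    return out
```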

VARIANCE ESTIMATION

A standard error for estimated effort is calculated through Taylor expansion (linearization) procedures, per information provided to the committee by NMFS (Wolter, 2007; Fisheries Statistics Division, 2016). Unlike other methods such as replication variance estimation (e.g., Valliant, 2004), the Taylor expansion approach does not account for the nonresponse weight adjustments. If the sampling fraction (i.e., the proportion of households selected for the study out of those on the FES sampling frame) is small, the so-called reverse approach of Shao and Steel (1999) is another option (see also Haziza, 2009; Kim and Rao, 2009). Software is available to analyze data with either full-sample or replicate weights. However, the generation of replicate weights requires both additional time and additional research to determine how many replicate weights to generate (e.g., Wolter, 2007; Valliant et al., 2008).
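
The practical difference between the approaches is that a replication method can rerun the weight adjustments within each replicate, so the extra variability those adjustments introduce is reflected in the variance estimate. A minimal delete-a-group jackknife sketch is shown below (illustrative Python; the grouping, the estimation function, and the weighting steps it reruns are assumptions, not the FES procedures).

```python
# Delete-a-group jackknife: each replicate drops one random group, inflates the
# remaining base weights, reruns the full weighting chain (nonresponse
# adjustment, calibration), and re-estimates; the spread of replicate estimates
# yields the variance.

def jackknife_variance(sample, n_groups, estimate_fn):
    """sample: units each carrying a 'group' in 0..n_groups-1 and a 'weight';
    estimate_fn: reruns the weighting steps and returns the point estimate."""
    full_estimate = estimate_fn(sample)
    replicate_estimates = []
    for g in range(n_groups):
        factor = n_groups / (n_groups - 1)  # compensate for the dropped group
        replicate = [{**u, "weight": u["weight"] * factor}
                     for u in sample if u["group"] != g]
        replicate_estimates.append(estimate_fn(replicate))
    variance = ((n_groups - 1) / n_groups) * sum(
        (rep - full_estimate) ** 2 for rep in replicate_estimates)
    return full_estimate, variance
```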

CONCLUSIONS AND RECOMMENDATIONS

Conclusion: The methodologies associated with the current Fishing Effort Survey, including the address-based sampling mail survey design, are major improvements over the original Coastal Household Telephone Survey that employed random-digit dialing to contact anglers. This is a reflection of an immense amount of effort on the part of NMFS staff, contractors, and consultants.

Conclusion: The 2-month recall period for the Coastal Household Telephone Survey (CHTS) was set to be consistent with the seasonal time periods captured by the onsite intercept surveys, such as the Access Point Angler Intercept Survey. This same recall period was chosen for the Fishing Effort Survey to match the CHTS. Several factors, however, are related to the quality of angler recollections, including the total number of fishing trips and the frequency of trips around the beginning/end of the data-collection wave.

Recommendation: NMFS should continue to evaluate the cognitive properties of a 2-month recall period to confirm or update the research on this topic conducted in the 1970s.

Recommendation: NMFS should consider evaluating a prospective data-collection methodology, such as asking people in advance to document fishing trips planned over the next 2 months, to reduce concerns about angler recall.

Conclusion: Survey material initially sent to the sampled household includes a small cash incentive in appreciation of the adult respondent’s time to complete the questionnaire. Incentives have been shown to be effective in reducing nonresponse. Nonresponse, however, will be an ongoing challenge for all surveys and can lower data quality and precision.

Recommendation: NMFS should consider conducting targeted annual nonresponse studies as a standard component of the MRIP. The purpose of these studies would be to continually monitor correlates of nonresponse and nonresponse bias to control their damaging effects on data quality.

Conclusion: Maintaining comparability across the years is important for evaluating trends in fishing effort. Changes in fishing effort can result from actual change over time. They can also result from measurement errors such as nonresponse bias, from procedural changes such as new survey questions, or from ineffective adjustments to the survey weights. Without data on respondents who are repeatedly surveyed over the time period of interest, it can be difficult to determine the extent to which a change is real or resulting from these other sources.

Recommendation: As recommended in the 2006 report, NMFS is encouraged to continue research on survey panels, where a portion of the sampled households is retained for one or more interviews, for the Fishing Effort Survey alone or for an effort-catch combined study. The purpose of the survey panel would be to assess trends and any anomalies in those trends, to assess any improvements in data-collection efficiency through increased participation, and possibly to lower measurement error associated with, for example, trip recall with a more engaged sample of anglers.

Recommendation: NMFS should evaluate the benefits of collaborating with another federal survey (e.g., the American Time Use Survey) to include items related to fishing effort. These external estimates could provide corroboration of the fishing effort estimates, as well as useful variables for an enhanced Fishing Effort Survey weight calibration model to address sampling and nonsampling biases.

Conclusion: Collecting data for fishing effort estimates through electronic modes (e.g., web questionnaire, smartphone app) may reduce study costs associated with keying and processing the questionnaires. In addition, these modes may be a viable option for more timely release of fishing effort estimates, with data that can be evaluated in real time.

Recommendation: As recommended in the 2006 report, electronic data collection should be further evaluated as an option for the Fishing Effort Survey, including smartphone apps, electronic diaries for prospective data collection, and a web option for all or just panel members.

Conclusion: Weight adjustments have proven effective in lowering biases in survey estimates such as those associated with nonresponse and frame coverage errors. The effectiveness is only as great as the association of the adjustment covariates with nonresponse and with important survey measures. The Fishing Effort Survey weighting methodology borrows strength from the new sampling design by including, for example, an indicator for at least one licensed angler in the household. Consequently, the use of additional variables that are associated with fishing effort and/or survey participation might prove beneficial for the weight adjustment models.

Recommendation: Current or augmented variables on the address-based sampling frame should be evaluated to improve the efficiency of the Fishing Effort Survey weighting methodology.

Conclusion: Variance estimation is a critical component to any survey. Methods that do not account for all components of the sampling design and weight adjustments will typically underestimate the sampling variance. This is especially important for surveys without a high level of response such as the Fishing Effort Survey (~40 percent).

Recommendation: Other variance estimation methods should be evaluated for fishing effort estimates to account for weight adjustments, especially those associated with nonresponse. These include replication methods and the so-called reverse approach.




