Overall, the panel found that the data standards set out by the Marine Recreational Information Program (MRIP) are based on well-accepted principles of data quality, are compatible with the objectives of the Office of Management and Budget’s (OMB’s) Standards and Guidelines for Statistical Surveys, and are consistent with the standards followed by federal statistical agencies. MRIP is to be lauded for developing these standards. The various experts who spoke to the panel were complimentary about the standards in general while offering suggestions for improvement. For the sake of conciseness, this report does not dwell on those components of the standards that, by general agreement, work well and instead focuses on what improvements might be made. The improvements recommended in this report largely consist of making the standards more complete by adding considerations such as timeliness and coordination with the states, and by allowing for other types of data collection, processing, and analysis consistent with modern developments.
As discussed in Chapter 1, one standard that provoked complaints was the requirement within Standard 7 that data not be published if the percent standard errors (PSEs) were greater than 50 percent. MRIP chose not to fully implement this requirement; the issue is revisited in the discussion of Standard 7 in Chapter 3. Other issues that experts raised include the need for timeliness of data and for comparability of data across surveys; each of these and related issues is reviewed in the discussion of the relevant standards. Without criticizing the standards themselves, representatives of state agencies making presentations to the panel also commented that they often had difficulty in meeting the standards because of limited financial resources and limited statistical expertise.
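For reference, the percent standard error that figures in the Standard 7 threshold is conventionally defined as the estimated standard error expressed as a percentage of the estimate itself. The notation below is generic and is offered only as a reminder, not as a quotation of the MRIP standard.

```latex
% Percent standard error (PSE) of an estimate \hat{Y}, in generic notation
\mathrm{PSE}(\hat{Y}) \;=\; 100 \times \frac{\widehat{\mathrm{SE}}(\hat{Y})}{\hat{Y}}
% Under the draft suppression rule, an estimate would be withheld when
% \mathrm{PSE}(\hat{Y}) > 50, i.e., when \widehat{\mathrm{SE}}(\hat{Y}) > 0.5\,\hat{Y}.
```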
Conclusion 2-1: The panel concludes that the draft Marine Recreational Information Program data standards largely follow best practices in statistical survey methodology and are compatible with the objectives of the Office of Management and Budget’s Standards and Guidelines for Statistical Surveys. Except for concerns over early plans to suppress data with high percent standard errors, data collectors and users were largely satisfied with the current standards.
When reviewing the standards, the panel considered several key sources. The primary source for all federal agencies is OMB’s Standards and Guidelines for Statistical Surveys (OMB, 2006). The various federal statistical agencies have produced their own standards based on OMB’s document, and these also were important sources for the panel. The panel notes, too, that because MRIP data quality standards require surveys to obtain OMB clearance, the OMB standards are implicitly adopted even when they are not fully fleshed out in the MRIP standards; depending on the methodological issue, the MRIP standards are sometimes more detailed than the OMB standards, but the OMB standards are more comprehensive. Other standards and guidelines that were useful in this review are the standards developed by the Federal Committee on Statistical Methodology1 and the set of best practices for surveys developed by the American Association for Public Opinion Research.2
A major condition underlying the creation of the MRIP data quality standards is the fact that fisheries management is a responsibility of both the federal and state governments, each managing a different set of waters. Federal and state fisheries management activities also intersect, since some fish travel across both federal- and state-managed waters. The challenge is to coordinate data collection in a way that meets the needs of all parties. If the federal government simply imposed a set of standards on the states, it might both fail to support the states’ needs and get poor cooperation from the states. On the other hand, if each state individually set its own data collection and reporting approach, then integrating the data from multiple states and the federal government could be difficult. Setting common standards helps to promote data integration, while allowing flexibility within the standards lets the states customize data collection and reporting to meet their own needs. MRIP uses the certification process (with the potential for some federal funding as an incentive) to promote states’ participation in a common set of standards.
Conclusion 2-2: The certification approach provides a mechanism for allowing variations to meet local needs while still satisfying national requirements.
This system has major implications across multiple MRIP data quality standards, since it underlies how surveys are designed and implemented. For this reason, the topic is discussed here prior to a review of the individual standards.
A potential issue is that MRIP’s data quality standards may not fully satisfy states’ needs. States generally need estimates at a higher temporal frequency or a finer spatial resolution than national needs warrant (e.g., the embayment-level estimates produced by Louisiana’s LA Creel). In such cases, while there may be no problem with the standards from the perspective of statistical quality, there may still be a pragmatic problem in obtaining state cooperation to maintain the federal data system.
One such need is for data that are sufficiently timely to support in-season fisheries management. If the goal is only to set annual targets, then annual data may suffice, but those actively involved in managing fisheries need to know how close the fishery is to reaching its annual target. Thus, timeliness has a very different meaning to those actively involved in fishery management. For example, LA Creel produces weekly estimates, available within two weeks of sample completion.3
Two aspects of timeliness are important. First, even at the temporal resolution of two-month waves, analysts and fishery managers sometimes wait multiple months before estimates for the preceding wave become available. This can be corrected through improvements in the efficiency of data processing. Second, even two-month waves are not always sufficiently responsive to management needs, as evidenced by LA Creel’s decision to generate weekly estimates. Those weekly estimates have allowed LA Creel to manage red snapper recreational seasons (private and non-federal charter) based primarily on actual harvest estimates rather than long-term forecasts. This management need for more frequent estimates, which cannot be satisfied simply through greater processing efficiency, varies with the species and fishing seasons involved. Satisfying states’ need for timeliness is also important as a way of encouraging their cooperation, even when the federal government does not share that need.
___________________
3 Harry Blanchet, presentation to the panel on June 26, 2025.
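As a purely illustrative sketch of the kind of in-season monitoring described above, the following fragment accumulates hypothetical weekly harvest estimates against an assumed annual quota and flags when a closure review might be triggered. The quota, the estimates, the 90-percent trigger, and the data layout are all assumptions made for illustration; they are not LA Creel’s actual data or procedures.

```python
# Hypothetical in-season quota tracking from weekly harvest estimates.
# The quota, estimates, and the 90-percent review trigger are illustrative
# assumptions, not actual LA Creel or MRIP values.

ANNUAL_QUOTA_LB = 800_000        # assumed annual recreational quota (pounds)
CLOSURE_REVIEW_FRACTION = 0.9    # assumed fraction of quota that triggers a review

# (week number, estimated harvest in pounds, percent standard error)
weekly_estimates = [
    (1, 42_000, 18.0),
    (2, 55_500, 15.5),
    (3, 61_250, 14.0),
]

cumulative = 0.0
for week, harvest, pse in weekly_estimates:
    cumulative += harvest
    fraction_used = cumulative / ANNUAL_QUOTA_LB
    print(f"Week {week}: cumulative {cumulative:,.0f} lb "
          f"({fraction_used:.1%} of quota, PSE {pse:.1f}%)")
    if fraction_used >= CLOSURE_REVIEW_FRACTION:
        print("Cumulative harvest is approaching the quota; closure review triggered.")
        break
```

The point of the sketch is simply that this kind of tracking is only possible when estimates arrive at a frequency matched to the length of the season, which is the need that two-month waves cannot always meet.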
MRIP standards are broadly written and can accommodate both end-of-year assessments and in-season monitoring. This gives the states and regional agencies the flexibility they need to design surveys that meet their own needs for fishery management. However, when agencies seek to use federal data or to integrate data across multiple sources, they may find that those data do not meet their need for information that is relatively current and that corresponds to their fishing seasons. MRIP’s structure of interstate councils and commissions helps states coordinate on a regional basis, where they have shared needs. Because of widespread variation across the states, it is difficult to design a single system that works well for all of them.
Conclusion 2-3: There may be value in promoting greater coordination across the states and federal government to better meet the states’ diverse needs.
A second need is driven by the fact that the data required for fishery management vary by geographic location. Because of differences in climate, local ecosystems, and the distribution of specific species, states have different fishing seasons, fish species, and embayments to monitor (e.g., Louisiana and Texas use estuary-specific goals, such as for Barataria Bay or for Corpus Christi Bay vs. Galveston Bay).
Just as there are variations across states, there are also variations across data users (which include various agencies within the states). Table 2-1 shows how data uses and data needs vary across different types of users. These purposes and data uses sometimes overlap, but the different orientations lead to differences in data needs, particularly with respect to timing, which in turn lead to differences in data collection.
MRIP’s data quality standards give states the freedom to customize surveys to meet their specific needs, but an issue to consider is the degree to which the standards should address how well the various surveys integrate with one another. That is, although ensuring a minimum level of data quality is certainly important, the standards largely treat each survey as a stand-alone effort. Standard 7 lists certain types of data that must be collected, but as Table 2-2 shows, there are still important variations across states. To the degree that different surveys use different measures or cover overlapping geographic areas, combining the data becomes more difficult.
Although Standard 7 addresses this in part, MRIP might consider how the standards could be designed to treat survey design as part of a larger system. A system-based design could have at least two components.
TABLE 2-1 Variations in Data Uses and Data Needs Across User Groups
| User Group | Purpose | Timeframe |
|---|---|---|
| Stock analysts | Stock assessments | Annual or multi-year |
| Fisheries managers | Quota monitoring/management effectiveness | Season-year (by waves) |
| Regional and state directors | Quota monitoring | Monthly subwaves |
| Fishermen | Access to remaining quota | Week (subwave) |
Even within each state or region, there is value in considering a survey not simply alone but within a larger context. For example, one might ask whether there is a system for comprehensively considering all important types of anglers. When looking across states and regions, there is value in considering how well the data integrate with data from other states and regions.
Speaking to the panel, Catherine Bruger (manager of Fish Conservation at the Ocean Conservancy) and Michael Drexler (fisheries scientist at the Ocean Conservancy) discussed the difficulties that arise when each state carries out its own survey methodology.4 They said the primary challenge in the Gulf is that most state surveys do not adhere to the existing MRIP data standards. Before calibrations were implemented (Standard 4), NOAA Fisheries estimated that 2019 landings exceeded the Gulf-wide catch limit, resulting in overfishing in that year.5 Most state surveys do not produce annual reports (Standard 5.2) or publicly accessible databases (Standard 7), though the states are working with the Gulf States Marine Fisheries Commission to meet these standards.
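To make the calibration issue concrete, the sketch below shows one simplified, ratio-style adjustment in which a state survey’s estimate is scaled by the ratio of MRIP to state estimates observed during a side-by-side benchmarking period. The numbers and the function are hypothetical, and this is a generic illustration rather than the specific calibration models used by MRIP or the Gulf states.

```python
# Simplified, ratio-style calibration of a state harvest estimate into the
# MRIP "common currency." All values are assumed for illustration; actual
# MRIP calibrations involve more sophisticated statistical adjustments.

# Side-by-side benchmarking period in which both surveys covered the same fishery.
mrip_benchmark_estimate = 120_000   # fish (assumed)
state_benchmark_estimate = 150_000  # fish (assumed)

calibration_ratio = mrip_benchmark_estimate / state_benchmark_estimate

def calibrate(state_estimate: float) -> float:
    """Convert a state-survey estimate into MRIP-equivalent units."""
    return state_estimate * calibration_ratio

# Express a later state estimate in MRIP-equivalent terms.
state_estimate_later = 175_000
print(f"Calibrated estimate: {calibrate(state_estimate_later):,.0f} fish "
      f"(ratio = {calibration_ratio:.3f})")
```

Whatever form the adjustment takes, the underlying point is the same: when surveys measure catch in different ways, their estimates must be converted to a common basis before they can be compared with a shared catch limit.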
States also differ in their resources, both relative to those available to the federal government and from one state to another. Several state agencies commented that they have difficulty obtaining or retaining survey statisticians to support their work.6 For example, California depends on a single statistician who is currently nearing retirement.7
Conclusion 2-4: State and regional systems sometimes have priorities that differ from those of the Marine Recreational Information Program (MRIP), leading them to create their own surveys rather than incorporate MRIP surveys.
___________________
4 Catherine Bruger and Michael Drexler, presentation to the panel on May 22, 2025. Also see State-Survey-Calibration-Handout_Final-1.pdf.
5 Gulf of Mexico Fishery Management Council (GMFMC), September 2020 meeting minutes. https://gulf-councilmedia.s3.amazonaws.com/uploads/2025/02/GMFMC-Full-Council-September-2020.pdf#page=[50]
6 June 26, 2025, presentations by Harry Blanchet, Lorna Wargo, Justin Hansen and Gway Rogers-Kirchner, and by Ryan Denton and Laura Riley.
7 Ryan Denton and Laura Riley, presentation to the panel on June 26, 2025.
TABLE 2-2 Summary of Differences in Recreational Fishing Surveys
| Survey | Effort Unit | Catch Unit (by species) | Discards (by species) | Notes |
|---|---|---|---|---|
| Marine Recreational Information Program (MRIP) Access Point Angler Intercept Survey | Not measured (used with effort data from other sources) | Number of fish; where possible the length and weight of harvested fish | Reported by anglers; number released alive; no measure of size or condition | Dockside intercept survey for catch per trip (harvest may be either observed or reported) |
| MRIP Fishing Effort Survey | Number of days fishing from shore and from private or rental boats (household-reported) | Not collected | Not applicable | Mail survey estimating effort over previous time period for private and shore modes |
| State Creel Surveys | Angler-hours or angler-trips | Number of fish; sometimes weight | Varies by state; e.g., Washington Department of Fish and Wildlife measures the total steelhead catch, combining harvest with catch-and-release; California separately collects angler self-reports on kept-observed, kept-unobserved, released-alive, and released-dead, and treats discard rates differently depending on whether the fish were groundfish | Often region-specific; may differ by state |
| Highly Migratory Species Angling Logbook | Number of trips, gear effort | Number of fish kept | Released fish by species | Targets species like tuna and billfish; logbook-based |
| MRIP For-Hire Survey | Number of for-hire trips, angler counts | Targeted species for each trip; no statistics on the number of fish caught | Not applicable; effort survey only | Focused on charter and party boats |
| National Survey of Fishing, Hunting, and Wildlife-Associated Recreation (Census Survey) | Days fished per angler, fishing expenditures | Not collected | Not applicable | Tracks participation and economic data (not catch) |
SOURCES: NOAA Fisheries, Recreational Fishing Surveys, https://www.fisheries.noaa.gov/recreational-fishing-data/recreational-fishing-surveys; Cisco Werner, Certification of Marine Recreational Information Program (MRIP) For-Hire Survey Methods, 2023, https://www.fisheries.noaa.gov/s3/2023-03/FHS_Certification_Package_2023.pdf; California Department of Fish and Wildlife, California Recreational Fisheries Survey Methods, 2020, https://nrm.dfg.ca.gov/FileHandler.ashx?DocumentID=36136&inline=; Kale Bentley, Evaluation of Creel Survey Methodology for Steelhead Fisheries on the Quillayute and Hoh Rivers (Washington Department of Fish and Wildlife), July 2017, https://wdfw.wa.gov/sites/default/files/publications/01918/wdfw01918.pdf; U.S. National Marine Fisheries Service, U.S. Pacific Highly Migratory Species Hook and Line Logbook, 2018, https://www.fisheries.noaa.gov/s3/2023-06/hook-and-line-lb-ext-to-06-30-2026.pdf; NORC, Data Files for the 2022 National Survey of Fishing, Hunting, and Wildlife-Associated Recreation, 2025, https://www.norc.org/research/projects/survey-of-fishing-hunting-and-wildlife-recreation.html
Conclusion 2-5: The two-month intervals in the main Marine Recreational Information Program system provide only a limited capacity for in-season monitoring, particularly for species with short fishing seasons.
Conclusion 2-6: The lack of consistent measures and definitions across surveys (e.g., differing definitions of “angler landings” between the Marine Recreational Information Program and state programs) creates the need to calibrate the data, which can result in misaligned allocations, overall overfishing, and disparities in what is permitted.
Conclusion 2-7: State and regional systems often have limited resources, particularly with regard to statistical expertise.
Conclusion 2-8: Issues not addressed clearly in the standards include the following: timeliness of data availability; recognition of differences in timing depending on the length of various fishing seasons; and differences in the measures used across surveys.
Recommendation 2-1: Marine Recreational Information Program should evaluate ways to promote greater consistency in how catches are enumerated, including in data collection approaches, data processing, and calibration.
Recommendation 2-2: Marine Recreational Information Program should make its standards more manageable for states and agencies that lack statistical experts and should consider expanding the technical expertise on surveys and statistics that it provides to state and regional agencies.
Recommendation 2-3: Marine Recreational Information Program standards should be designed to meet the needs of both MRIP and those who are actively managing fishing activity. Where practical, these needs include in-season monitoring and the availability of timely data.
Recommendation 2-4: Marine Recreational Information Program should add “timeliness” to its data quality standards to better meet the needs of those actively managing fisheries.
Recommendation 2-5: Marine Recreational Information Program should evaluate how to promote greater consistency in how catches are enumerated.