
Accurate and precise estimation of discards is a major objective of data collection programmes throughout the world. Discard reduction is also a major topic of the new Common Fisheries Policy (CFP) and the future Data Collection Multi-Annual Programme (DC-MAP). Using data from the Portuguese onboard observer programme, which samples two otter-trawl fisheries in ICES Division IXa, we compare two approaches to estimating the sampling effort required to attain "assessment-grade" discard estimates: a model-based approach (exponential-decay models) and a probability-based approach (classic sampling theory). We show that both approaches yield comparable sample-size estimates and that the sample size required to meet precision objectives varies across species and fisheries, being likely influenced by discard motives. We demonstrate that sampling levels at least twofold higher than present levels would be required to attain the precision targets set in the current Data Collection Framework (DCF). We discuss the implications of these results for the future ability of European onboard sampling programmes to detect, e.g., progressive reductions in discard levels.
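The probability-based approach mentioned above can be illustrated with the standard sample-size formula from classic sampling theory. The sketch below is not the authors' implementation; it assumes simple random sampling of trips, a target relative precision (CV) for a mean discard rate, and illustrative input values, with an optional finite-population correction:

```python
import math

def required_trips(mean_discard, sd_discard, cv_target, n_population=None):
    """Trips needed for the estimated mean discard rate to attain cv_target.

    Classic sampling-theory result: n0 = (s / (cv * x_bar))^2, optionally
    adjusted with a finite-population correction when the total number of
    trips in the fishery (n_population) is known.
    """
    n0 = (sd_discard / (cv_target * mean_discard)) ** 2
    if n_population is not None:
        # Finite-population correction: n = n0 / (1 + n0 / N)
        n0 = n0 / (1 + n0 / n_population)
    return math.ceil(n0)

# Illustrative values only (not from the study):
# mean discard rate 10 kg/trip, between-trip SD 15 kg, target CV 30 %.
print(required_trips(10.0, 15.0, 0.3))        # 25 trips
print(required_trips(10.0, 15.0, 0.3, 200))   # 23 trips with N = 200
```

Such calculations make explicit why required sample sizes differ between species and fisheries: the more variable the discards between trips relative to their mean, the more trips are needed for the same target CV.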