Science Inventory

An approach for evaluating the repeatability of rapid wetland assessment methods: The effects of training and experience

Citation:

Herlihy, A. T., J. Sifneos, C. Bacon, A. D. Jacobs, M. E. Kentula, and M. Fennessy. An approach for evaluating the repeatability of rapid wetland assessment methods: The effects of training and experience. Environmental Management. Springer-Verlag, New York, NY, 44:369-377, (2009).

Impact/Purpose:

To quantify observer-to-observer repeatability of rapid wetland condition assessment using the Delaware Rapid Assessment Protocol (DERAP), and to evaluate how observer training and experience affect that repeatability, based on surveys of 92 wetlands in four basins across the United States.

Description:

We sampled 92 wetlands from four different basins in the United States to quantify observer repeatability in rapid wetland condition assessment using the Delaware Rapid Assessment Protocol (DERAP). In the Inland Bays basin of Delaware, 58 wetland sites were sampled by multiple observers with varying levels of experience (novice to expert) following a thorough training workshop. In the Nanticoke (Delaware/Maryland), Cuyahoga (Ohio), and John Day (Oregon) basins, 34 wetlands were sampled by two expert teams of observers with minimal protocol training. The variance in observer-to-observer scoring at each site was used to calculate pooled standard deviations (SDpool), coefficients of variation, and signal-to-noise ratios for each survey. The results showed that the experience level of the observer had little impact on the repeatability of the final rapid assessment score. Training, however, had a large impact on observer-to-observer repeatability. The SDpool in the Inland Bays survey with training (2.2 points out of a 0–30 score) was about half that observed in the other three basins, where observers had minimal training (SDpool = 4.2 points). Using the results from the survey with training, we would expect that two sites assessed by different, trained observers who obtain DERAP scores differing by more than 4 points are highly likely to differ in ecological condition, whereas sites with scores differing by 2 or fewer points are within the variability attributable to observer differences.
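The repeatability statistics named above can be sketched as follows. This is a minimal illustration with made-up DERAP scores, assuming SDpool is the degrees-of-freedom-weighted pooled within-site standard deviation and signal-to-noise is among-site variance over within-site variance; the paper's exact computation may differ in detail.

```python
# Illustrative repeatability metrics for multi-observer site scores.
# Scores are hypothetical, not from the study.
from statistics import mean, variance
import math

# Hypothetical DERAP scores (0-30 scale): site -> one score per observer
scores = {
    "site_a": [22, 24, 23],
    "site_b": [15, 18, 16],
    "site_c": [28, 27, 29],
}

# Pooled within-site variance: df-weighted average of per-site variances
num = sum((len(v) - 1) * variance(v) for v in scores.values())
den = sum(len(v) - 1 for v in scores.values())
within_var = num / den
sd_pool = math.sqrt(within_var)

# Coefficient of variation relative to the grand mean score
grand_mean = mean(s for v in scores.values() for s in v)
cv = sd_pool / grand_mean

# Signal-to-noise: variance among site means over within-site (observer) variance
among_var = variance([mean(v) for v in scores.values()])
snr = among_var / within_var

print(f"SDpool = {sd_pool:.2f} points, CV = {cv:.1%}, S/N = {snr:.1f}")
```

With such metrics, a low SDpool and high signal-to-noise ratio indicate that differences between sites dominate differences between observers, which is the sense in which the trained Inland Bays survey outperformed the minimally trained surveys.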

URLs/Downloads:

Springer

Record Details:

Record Type: DOCUMENT (JOURNAL/PEER REVIEWED JOURNAL)
Product Published Date: 08/01/2009
Record Last Revised: 11/24/2009
OMB Category: Other
Record ID: 199529