Science Inventory

Deep Lake Explorer: A web application for crowdsourcing the classification of benthic underwater video from the Laurentian Great Lakes

Citation:

Wick, M., T. Angradi, M. Pawlowski, D. Bolgrien, R. Debbout, J. Launspach, and M. Nord. Deep Lake Explorer: A web application for crowdsourcing the classification of benthic underwater video from the Laurentian Great Lakes. Journal of Great Lakes Research. International Association for Great Lakes Research, Ann Arbor, MI, 46(5):1469-1478, (2020). https://doi.org/10.1016/j.jglr.2020.07.009

Impact/Purpose:

This article describes the development of Deep Lake Explorer, a web application for crowdsourcing the analysis of underwater video in the Great Lakes. The article provides details on how the project was developed and compares the accuracy of crowdsourced analysis with expert analysis of the same videos. The paper will inform other scientists who collect underwater video about how crowdsourcing can be adopted for video analysis.

Description:

Underwater video is increasingly used to assess aspects of the Great Lakes benthos, such as the abundance of round goby and dreissenid mussels, that have heretofore gone at least partially unassessed in lakewide benthic assessments. Video can sample hard-bottom sites where grab samplers cannot. Efficient use of underwater video data requires affordable and accurate analysis tools. Deep Lake Explorer (DLE) is a web application developed to support crowdsourced analysis of underwater video collected in the Great Lakes. In this study, volunteers (the crowd) used DLE to analyze 199 videos collected in the Niagara River, Lake Huron, and Lake Ontario to determine whether round goby, Dreissena, or aquatic vegetation were present, and to determine the dominant substrate type. We compared DLE results to an expert analysis of the same videos to evaluate accuracy. Depending on the attribute, DLE had 77 to 90% agreement with expert analysis of videos, and detection rates (the number of videos with the attribute detected by both DLE and the expert, divided by the number where the expert detected the attribute) of 62 to 95%. Many factors affected accuracy, including video image quality in the application, video resolution, video pre-processing, abundance of the species of interest, and volunteer experience and training. Crowdsourcing projects such as DLE can increase timeliness and decrease project costs, but may come with a tradeoff of slightly lower accuracy and reduced complexity in the analysis.
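The agreement and detection-rate metrics above reduce to simple ratios over paired presence/absence classifications. The following minimal Python sketch illustrates the calculation; the function names, data layout, and example values are illustrative assumptions, not the DLE implementation or the study's data.

def percent_agreement(crowd, expert):
    # Share of videos where the crowd and expert classifications match.
    matches = sum(c == e for c, e in zip(crowd, expert))
    return 100.0 * matches / len(expert)

def detection_rate(crowd, expert):
    # Videos where both the crowd and the expert detected the attribute,
    # divided by the number of videos where the expert detected it.
    both = sum(c and e for c, e in zip(crowd, expert))
    return 100.0 * both / sum(expert)

# Hypothetical example: presence/absence of round goby in five videos.
crowd  = [True, False, True, True, False]
expert = [True, True,  True, False, False]
print(percent_agreement(crowd, expert))  # 60.0
print(detection_rate(crowd, expert))     # 66.66... (2 of 3 expert detections)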

Record Details:

Record Type: DOCUMENT (JOURNAL/PEER REVIEWED JOURNAL)
Product Published Date: 10/01/2020
Record Last Revised: 11/10/2020
OMB Category: Other
Record ID: 350128