Open Access
Using iNaturalist in a Coverboard Protocol to Measure Data Quality: Suggestions for Project Design
Author(s) -
Julie Wittmann,
Derek J. Girman,
Daniel E. Crocker
Publication year - 2019
Publication title -
Citizen Science: Theory and Practice
Language(s) - English
Resource type - Journals
ISSN - 2057-4991
DOI - 10.5334/cstp.131
Subject(s) - citizen science, crowdsourcing, protocol (science), data collection, data quality, sampling, salamander, herpetofauna, smartphone data upload, ecology, biodiversity monitoring
We evaluated the accuracy of data records generated by citizen scientist participants using iNaturalist in a coverboard sampling scheme, a common method for detecting herpetofauna, involving 17 species of amphibians and reptiles. We trained and observed 10 volunteers working over an eight-month period at a study site in Sonoma County, California, USA. A total of 1,169 observations were successfully uploaded to the iNaturalist database by volunteers, including a new locality for the Black Salamander. Volunteers were generally successful in collecting and uploading data for verification but had more difficulty detecting small, camouflaged salamanders and photographing quick-moving lizards. Errors associated with uploading smartphone data declined with volunteer experience. We evaluated all observations and determined that 82% could be verified to the species level based on the photograph. Forty-five percent of the observations reached iNaturalist “research grade” status through the platform’s crowdsourcing tools. We independently evaluated the accuracy of the crowdsourced identifications for research-grade observations and found them to be 100% accurate to the species level. A variety of factors (herpetofaunal group, species, photograph quality) influenced whether observations were elevated to research grade. Volunteer screening and training protocols emphasizing smartphones with adequate battery and data-storage capacity eliminated some issues. Our results suggest that attention to these factors can help scientists and resource managers improve participant experience and data quality in citizen science biodiversity programs.
