Open Access
A Rubric to Evaluate Citizen-Science Programs for Long-Term Ecological Monitoring
Author(s) -
Catherine A. Tredick,
Rebecca L. Lewison,
Douglas H. Deutschman,
Timothy Ann Hunt,
Karen L. Gordon,
Phoenix Von Hendy
Publication year - 2017
Publication title - BioScience
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 2.761
H-Index - 209
eISSN - 1525-3244
pISSN - 0006-3568
DOI - 10.1093/biosci/bix090
Subject(s) - rubric , computer science , assertion , process (computing) , quality (philosophy) , process management , data science , management science , psychology , engineering , mathematics education , philosophy , epistemology , programming language , operating system
Citizen‐science (CS) programs provide a cost‐effective way to collect monitoring data over large temporal and spatial scales. Despite the recent proliferation of these programs, some in the conservation and management community remain skeptical about the quality of the information generated, in part because of the lack of a rigorous framework for program evaluation. Drawing from the CS literature, we developed a structured rubric to guide the evaluation of CS programs. We tested the utility of the rubric by conducting an internal and external review of a case‐study CS program. The case study demonstrates the importance of the evaluation process and the effectiveness of the rubric in identifying program elements that need improvement. Our results support the assertion that program evaluation using a structured rubric can help CS programs meet their objectives, promote the use of CS data in conservation and management, and maximize the return on investment in CS.
