Bayesian inference in camera trapping studies for a class of spatial capture–recapture models
Author(s) - Royle J. Andrew, Karanth K. Ullas, Gopalaswamy Arjun M., Kumar N. Samba
Publication year - 2009
Publication title - Ecology
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 2.144
H-Index - 294
eISSN - 1939-9170
pISSN - 0012-9658
DOI - 10.1890/08-1481.1
Subject(s) - mark and recapture, range (aeronautics), inference, camera trap, bayesian inference, home range, bayesian probability, computer science, artificial intelligence, class (philosophy), geography, statistics, ecology, mathematics, habitat, biology, materials science, population, demography, sociology, composite material
We develop a class of models for inference about abundance or density using spatial capture–recapture data from studies based on camera trapping and related methods. The model is hierarchical, composed of two components: a point process model describing the distribution of individuals in space (or their home range centers) and a model describing the observation of individuals in traps. We suppose that trap‐ and individual‐specific capture probabilities are a function of the distance between individual home range centers and trap locations. We show that the models can be regarded as generalized linear mixed models, where the individual home range centers are random effects. We adopt a Bayesian framework for inference under these models using a formulation based on data augmentation. We apply the models to camera trapping data on tigers from the Nagarahole Reserve, India, collected over 48 nights in 2006. For this study, 120 camera locations were used, but cameras were operational at only 30 locations during any given sample occasion. Movement of traps is common in many camera‐trapping studies and represents an important feature of the observation model that we address explicitly in our application.
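
To make the hierarchy concrete, the LaTeX sketch below writes out one standard spatial capture–recapture specification with Bernoulli encounters, a half-normal distance function, and data augmentation. The half-normal form and the symbols p0, sigma, psi, and M are illustrative assumptions for this sketch; the paper develops a more general class of detection models.

% Illustrative SCR hierarchy (requires amsmath).
% Notation here is assumed, not necessarily the paper's exact parameterization.
\begin{align*}
s_i &\sim \text{Uniform}(\mathcal{S})
  && \text{home range centers: point process over state space } \mathcal{S} \\
z_i &\sim \text{Bernoulli}(\psi), \quad i = 1, \ldots, M
  && \text{data augmentation: membership in an augmented list of size } M \\
p_{ij} &= p_0 \exp\!\bigl(-\lVert s_i - x_j \rVert^2 / (2\sigma^2)\bigr)
  && \text{capture probability decays with distance to trap } x_j \\
y_{ijk} \mid z_i &\sim \text{Bernoulli}(z_i \, p_{ij})
  && \text{encounter on occasion } k \text{ (scored only at operational traps)} \\
N &= \textstyle\sum_{i=1}^{M} z_i
  && \text{realized abundance; density } D = N / |\mathcal{S}|
\end{align*}

Under this sketch, \(\log p_{ij} = \log p_0 - \lVert s_i - x_j \rVert^2 / (2\sigma^2)\) is linear in the squared distance, with the latent centers \(s_i\) entering as random effects; this is the generalized linear mixed model view noted in the abstract, and the indicators \(z_i\) implement the data-augmentation formulation used for Bayesian fitting.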
