Influence of Depth on Detection Distance of Low‐Frequency Radio Transmitters in the Ohio River
Author(s) - Freund Jason G., Hartman Kyle J.
Publication year - 2002
Publication title - North American Journal of Fisheries Management
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 0.587
H-Index - 72
eISSN - 1548-8675
pISSN - 0275-5947
DOI - 10.1577/1548-8675(2002)022<1301:iododd>2.0.co;2
Subject(s) - biotelemetry , transmitter , telemetry , attenuation , fish (actinopterygii) , environmental science , habitat , signal (programming language) , hydrology (agriculture) , water column , remote sensing , geology , fishery , oceanography , ecology , telecommunications , biology , physics , computer science , geotechnical engineering , optics , programming language , channel (broadcasting)
Radio telemetry is commonly utilized in large, deep bodies of water to assess fish movement and habitat use. A commonly neglected factor in these studies is the influence of signal attenuation on the results and conclusions. Signal attenuation is related to many factors but most importantly to the depth of the transmitter in the water column and water conductivity. While conducting a biotelemetry study within the Ohio River, several fish that had not been detected in prior search periods were detected in later searches. Consequently, we hypothesized that telemetered fish in deep water may not be detected. We conducted an experiment to measure the influence of depth on the maximum distance at which a transmitter could be detected and found that an exponential decay model (distance = 0.9890 × e^(−0.2005 × depth)) best explained these data. Our results imply that radio telemetry studies may underestimate use of deepwater habitats by fishes.
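For illustration, the sketch below evaluates the reported decay curve at a few depths in Python. It assumes the exponent is negative (as "exponential decay" implies) and that depth and distance are expressed in the units used by the authors, which the abstract does not state; the function name detection_distance is chosen here for clarity and is not from the paper.

import math

# Fitted model reported in the abstract: detection distance decays
# exponentially with transmitter depth. The negative exponent and the
# units of depth/distance are assumptions, since the abstract omits them.
def detection_distance(depth):
    """Predicted maximum detection distance at the given transmitter depth."""
    return 0.9890 * math.exp(-0.2005 * depth)

if __name__ == "__main__":
    for depth in (0, 2, 5, 10):
        d = detection_distance(depth)
        print(f"depth {depth:>2}: predicted detection distance {d:.3f}")

Evaluated this way, the predicted detection distance shrinks by roughly 18% per unit of depth, which is consistent with the authors' conclusion that fish in deepwater habitats may go undetected.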
