Open Access
Shannon revisited: Information in terms of uncertainty
Author(s) - Cole, Charles
Publication year - 1993
Publication title - Journal of the American Society for Information Science
Language(s) - English
Resource type - Journals
eISSN - 1097-4571
pISSN - 0002-8231
DOI - 10.1002/(sici)1097-4571(199305)44:4<204::aid-asi3>3.0.co;2-4
Subject(s) - communication source, information theory, uncertainty reduction theory, entropic uncertainty, entropy (arrow of time), information transmission, computer science, Shannon's source coding theorem, point (geometry), information diagram, communication theory, mutual information, set (abstract data type), mathematical economics, mathematics, uncertainty principle, statistics, artificial intelligence, sociology, principle of maximum entropy, telecommunications, communication, physics, computer network, geometry, quantum mechanics, maximum entropy thermodynamics, binary entropy function, quantum, programming language
Shannon's theory of communication is discussed from the point of view of his concept of uncertainty. It is suggested that there are two information concepts in Shannon, two different uncertainties, and at least two different entropy concepts. Information science focuses on the uncertainty associated with the transmission of the signal rather than the uncertainty associated with the selection of a message from a set of possible messages. The author believes the latter information concept, which is from the sender's point of view, has more to say to information science about what information is than the former, which is from the receiver's point of view and is mainly concerned with “noise” reduction. © 1993 John Wiley & Sons, Inc.
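For reference, the two uncertainties the abstract contrasts can be written in standard Shannon notation. This is a minimal sketch using conventional symbols (X for the message source, Y for the received signal, p_i for message probabilities); the notation is not taken from the article itself.

H(X) = -\sum_{i} p_i \log_2 p_i
(selection uncertainty at the sender: the entropy of the message source, in bits per message)

H(X \mid Y) = -\sum_{i,j} p(x_i, y_j) \log_2 p(x_i \mid y_j)
(transmission uncertainty at the receiver: the equivocation that "noise" leaves unresolved)

I(X;Y) = H(X) - H(X \mid Y)
(information actually transmitted: the reduction of the sender's uncertainty by the received signal)

Roughly, the abstract's contrast is between H(X), the sender-side uncertainty of selecting a message, and H(X|Y), the receiver-side uncertainty that noise reduction addresses; the author argues the former has more to tell information science about what information is.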
