Four Limits in Probability and Their Roles in Source Coding
Author(s) -
Hiroki Koga
Publication year - 2011
Publication title -
IEICE Transactions on Fundamentals of Electronics, Communications and Computer Sciences
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 0.188
H-Index - 52
eISSN - 1745-1337
pISSN - 0916-8508
DOI - 10.1587/transfun.e94.a.2073
Subject(s) - converse , probabilistic logic , entropy (information theory) , mathematics , kullback–leibler divergence , information theory , upper and lower bounds , coding (information theory) , limit (mathematics) , discrete mathematics , computer science , statistical physics , statistics , physics , mathematical analysis
In the information-spectrum methods proposed by Han and Verdú, quantities defined by using the limit superior (or inferior) in probability play crucial roles in many problems in information theory. In this paper, we introduce two nonconventional quantities defined in probabilistic ways. After clarifying basic properties of these quantities, we show that the two quantities have operational meaning in the ε-coding problem of a general source in the ordinary and optimistic senses. The two quantities can be used not only for obtaining variations of the strong converse theorem but also for establishing upper and lower bounds on the width of the entropy spectrum. We also show that the two quantities are expressed in terms of the smooth Rényi entropy of order zero.
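As a hedged illustration of the limit superior and inferior in probability that underlie the abstract (this sketch is not taken from the paper itself): for a memoryless source, the normalized self-information (1/n) log2(1/P(X^n)) concentrates around the entropy H, so both probabilistic limits coincide at H. The Bernoulli parameter, sample size, and trial count below are illustrative choices.

```python
import math
import random

def self_information_rate(x, p):
    # (1/n) * log2(1 / P(x^n)) for an i.i.d. Bernoulli(p) sequence x.
    n = len(x)
    log_prob = sum(math.log2(p) if b else math.log2(1 - p) for b in x)
    return -log_prob / n

random.seed(0)
p = 0.3
# Binary entropy H(p) of the source, the common value of both limits here.
H = -(p * math.log2(p) + (1 - p) * math.log2(1 - p))

n = 10_000     # block length (illustrative)
trials = 200   # number of sample blocks (illustrative)
rates = [
    self_information_rate([1 if random.random() < p else 0 for _ in range(n)], p)
    for _ in range(trials)
]

# The empirical entropy spectrum is narrow: every observed rate is close
# to H, consistent with limsup-in-probability = liminf-in-probability = H
# for a memoryless source (a zero-width entropy spectrum).
print(min(rates), max(rates), H)
```

For a general (non-ergodic) source the spectrum need not collapse to a point, which is exactly the regime where bounds on its width, as studied in the paper, become informative.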