Interactive Information Complexity
Author(s) - Mark Braverman
Publication year - 2015
Publication title - SIAM Journal on Computing
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 1.533
H-Index - 122
eISSN - 1095-7111
pISSN - 0097-5397
DOI - 10.1137/130938517
Subject(s) - communication complexity , information complexity , Alice and Bob , combinatorics , discrete mathematics , computational complexity theory , mathematics , computer science , algorithm
The primary goal of this paper is to define and study the interactive information complexity of functions. Let $f(x,y)$ be a function, and suppose Alice is given $x$ and Bob is given $y$. Informally, the interactive information complexity $\mathsf{IC}(f)$ of $f$ is the least amount of information Alice and Bob need to reveal to each other to compute $f$. Previously, information complexity has been defined with respect to a prior distribution on the input pairs $(x,y)$. Our first goal is to give a definition that is independent of the prior distribution. We show that several possible definitions are essentially equivalent. We establish some basic properties of the interactive information complexity $\mathsf{IC}(f)$. In particular, we show that $\mathsf{IC}(f)$ is equal to the amortized (randomized) communication complexity of $f$. We also show a direct sum theorem for $\mathsf{IC}(f)$ and give the first general connection between information complexity and (nonamortized) communication complexity. This conn...
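For orientation, the prior-dependent quantity that the paper makes prior-free can be sketched as follows. This is the standard (internal) information cost from the literature, and the prior-free form shown is one natural reading of the abstract's definition, not necessarily the paper's exact formulation:

```latex
% Internal information cost of a protocol \pi on inputs (X,Y) ~ \mu,
% with \Pi denoting the protocol transcript:
%   what \Pi reveals to Bob about X, plus what it reveals to Alice about Y.
\mathsf{IC}_\mu(\pi) \;=\; I(\Pi ; X \mid Y) \;+\; I(\Pi ; Y \mid X)

% A prior-free information complexity (a sketch; the order of the
% infimum and maximum follows standard treatments and is an assumption):
\mathsf{IC}(f) \;=\; \inf_{\pi \text{ computing } f} \;\max_{\mu} \; \mathsf{IC}_\mu(\pi)
```

Under this reading, the abstract's equality with amortized communication says that computing $n$ independent copies of $f$ costs roughly $n \cdot \mathsf{IC}(f)$ bits of communication as $n \to \infty$.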