TY - JOUR
T1 - Mutual Information Bounds via Adjacency Events
AU - Han, Yanjun
AU - Ordentlich, Or
AU - Shayevitz, Ofer
N1 - Funding Information:
Y. Han was supported by the NSF Center for Science of Information under Grant CCF-0939370. O. Ordentlich was supported by the MIT-Technion Postdoctoral Fellowship. O. Shayevitz was supported in part by the European Research Council (ERC) under Grant 639573 and by the Israel Science Foundation (ISF) under Grant 1367/14.
Publisher Copyright:
© 2016 IEEE.
PY - 2016/11
Y1 - 2016/11
AB - The mutual information between two jointly distributed random variables X and Y is a functional of the joint distribution P_{XY}, which is sometimes difficult to handle or estimate. A coarser description of the statistical behavior of (X, Y) is given by the marginal distributions P_X, P_Y and the adjacency relation induced by the joint distribution, where x and y are adjacent if P(x, y) > 0. We derive a lower bound on the mutual information in terms of these entities. The bound is obtained by viewing the channel from X to Y as a probability distribution over a set of possible actions, where an action determines the output for any possible input and is drawn independently of the input. We also provide an alternative proof, based on convex optimization, that yields a generally tighter bound. Finally, we derive an upper bound on the mutual information in terms of adjacency events between the action and the pair (X, Y), where in this case an action a and a pair (x, y) are adjacent if y = a(x). As an example, we apply our bounds to the binary deletion channel and show that, for the special case of an independent and identically distributed (i.i.d.) input distribution and a range of deletion probabilities, both our lower and upper bounds outperform the best known bounds on the mutual information.
KW - Mutual information bounds
KW - alternating minimization
KW - deletion channel
KW - functional representation
UR - http://www.scopus.com/inward/record.url?scp=85027352553&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85027352553&partnerID=8YFLogxK
DO - 10.1109/TIT.2016.2609390
M3 - Article
AN - SCOPUS:85027352553
SN - 0018-9448
VL - 62
SP - 6068
EP - 6080
JO - IEEE Transactions on Information Theory
JF - IEEE Transactions on Information Theory
IS - 11
M1 - 7567587
ER -
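
Illustrative note appended after the record (not part of the Scopus export): a minimal Python sketch of the two objects the abstract defines — the adjacency relation induced by a joint distribution, and the view of a channel as a distribution over actions, with Y = A(X) and the action A drawn independently of the input. The toy joint distribution, the binary-symmetric-channel example, and all names below are illustrative assumptions; the paper's actual bounds are not reproduced here.

# Minimal sketch (illustrative assumptions throughout; not the paper's construction).
import numpy as np

# Part 1: the adjacency relation induced by a toy joint distribution P_{XY},
# and the exact mutual information I(X;Y) that the paper's bounds target.
P = np.array([[0.3, 0.2, 0.0],   # zero entries mark the non-adjacent
              [0.0, 0.1, 0.4]])  # (x, y) pairs
Px, Py = P.sum(axis=1), P.sum(axis=0)  # marginals P_X, P_Y
adjacent = [(x, y) for x in range(P.shape[0])
            for y in range(P.shape[1]) if P[x, y] > 0]
I = sum(P[x, y] * np.log2(P[x, y] / (Px[x] * Py[y])) for (x, y) in adjacent)
print("adjacent pairs:", adjacent)      # [(0, 0), (0, 1), (1, 1), (1, 2)]
print("I(X;Y) = %.4f bits" % I)

# Part 2: a channel as a distribution over actions. For a binary symmetric
# channel with crossover p, the two actions are the identity map (prob. 1-p)
# and the bit-flip map (prob. p); the action is drawn independently of the
# input and Y = A(X). An action a and a pair (x, y) are adjacent iff y == a(x).
p = 0.1
rng = np.random.default_rng(0)
x = rng.integers(0, 2, size=100_000)    # i.i.d. uniform binary input
flip = rng.random(x.size) < p           # True where the flip action is drawn
y = np.where(flip, 1 - x, x)            # apply the drawn action: Y = A(X)
print("empirical crossover:", (x != y).mean())  # close to p

Both parts only restate definitions from the abstract; the lower and upper bounds themselves are derived in the paper.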