
Pre-c^3 Tutorial

Concentration of Measure Inequalities in Information Theory, Communications and Coding
Maxim Raginsky

Maxim Raginsky received the B.S. and M.S. degrees in 2000 and the Ph.D. degree in 2002 from Northwestern University, all in Electrical Engineering. He has held research positions with Northwestern, the University of Illinois at Urbana-Champaign (where he was a Beckman Foundation Fellow from 2004 to 2007), and Duke University. In 2012, he returned to UIUC, where he is currently an Assistant Professor with the Department of Electrical and Computer Engineering and the Coordinated Science Laboratory.
 
Abstract: 
This monograph is focused on some of the key modern mathematical tools that are used for the derivation of concentration inequalities, on their links to information theory, and on their various applications to communications and coding. Although it is a survey, it also includes various recent results derived by the authors. The first part introduces some classical concentration inequalities for martingales and derives some recent refinements of these inequalities. The power and versatility of the martingale approach are exemplified in the context of codes defined on graphs and iterative decoding algorithms, as well as some other aspects related to wireless communications and coding. The second part introduces the entropy method for deriving concentration inequalities for functions of many independent random variables, and it exhibits its multiple connections to information theory. The basic ingredients of the entropy method are discussed first in conjunction with the closely related topic of logarithmic Sobolev inequalities. This discussion is complemented by a related viewpoint based on probability in metric spaces, which centers on the so-called transportation-cost inequalities, whose roots are in information theory. Some representative results on concentration for dependent random variables are briefly summarized, with emphasis on their connections to the entropy method. Finally, the monograph addresses several applications of the entropy method and related information-theoretic tools to problems in communications and coding. These include strong converses for several source and channel coding problems, empirical distributions of good channel codes with non-vanishing error probability, and an information-theoretic converse for concentration of measure.
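As a representative example of the classical martingale concentration inequalities mentioned in the abstract, one may keep in mind the Azuma-Hoeffding inequality; the standard formulation below is given only for orientation and is not quoted from the monograph itself.

% Azuma-Hoeffding inequality (standard formulation, included here as an
% illustrative example): a martingale (X_k) with bounded increments
% concentrates around its initial value X_0.
\[
  |X_k - X_{k-1}| \le d_k \ \text{for all } k
  \quad\Longrightarrow\quad
  \Pr\bigl(|X_n - X_0| \ge t\bigr)
  \le 2\exp\!\left(-\frac{t^2}{2\sum_{k=1}^{n} d_k^2}\right),
  \qquad t > 0.
\]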
The tutorial will be followed by the 2nd Workshop on Cognition and Control.
Date: January 14, 2014
Time: 3:30-5:00pm
Room: 409 NEB
