In this tutorial talk, I revisit the teaching of the fundamental limits of lossless compression in information theory. Traditionally, the emphasis is on the relationship between entropy and the minimal compression rate achievable under two different paradigms: 1) variable-length symbol-by-symbol prefix lossless compression; and 2) fixed-to-fixed almost-lossless compression. I will discuss a third and more fundamental approach, which leads to a natural measure of information. The action is on the bridge between that measure and the probabilistic measure of information, whose average is the entropy.
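As background for the first paradigm mentioned above, the following minimal sketch (not part of the talk itself) illustrates the classical relationship between entropy and optimal prefix-code length: for an optimal (Huffman) code, the expected codeword length L satisfies H <= L < H + 1, with equality when the probabilities are powers of 1/2. The distribution chosen here is an illustrative example, not one from the abstract.

```python
import heapq
from math import log2

def entropy(probs):
    # Shannon entropy in bits: H = -sum p * log2(p)
    return -sum(p * log2(p) for p in probs if p > 0)

def huffman_lengths(probs):
    # Build a Huffman tree; return the codeword length of each symbol.
    # Heap entries: (probability, tiebreak counter, symbol indices in subtree)
    heap = [(p, i, [i]) for i, p in enumerate(probs)]
    heapq.heapify(heap)
    lengths = [0] * len(probs)
    counter = len(probs)
    while len(heap) > 1:
        # Merge the two least-probable subtrees; every symbol inside
        # them moves one level deeper, so its codeword grows by one bit.
        p1, _, s1 = heapq.heappop(heap)
        p2, _, s2 = heapq.heappop(heap)
        for s in s1 + s2:
            lengths[s] += 1
        heapq.heappush(heap, (p1 + p2, counter, s1 + s2))
        counter += 1
    return lengths

probs = [0.5, 0.25, 0.125, 0.125]   # dyadic, so the bound is tight
H = entropy(probs)
L = sum(p * l for p, l in zip(probs, huffman_lengths(probs)))
print(H, L)  # both equal 1.75 bits for this dyadic distribution
```

For non-dyadic distributions the average length exceeds the entropy by less than one bit per symbol, which is one reason the symbol-by-symbol paradigm only characterizes the compression limit up to that slack.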
Currently on leave at MIT, Sergio Verdú is the Eugene Higgins Professor of Electrical Engineering at Princeton University. A member of the National Academy of Engineering, he is the recipient of the 2007 Claude E. Shannon Award and the 2008 IEEE Richard W. Hamming Medal.