Reception to follow.
F. Brooks argued in his 2003 JACM paper on the challenges of computer science that there is "no theory that gives us a metric for the information embodied in structure". C. Shannon himself noted this fifty years earlier in his 1953 paper. More generally, we lack an information theory of data structures (e.g., graphs, sets, social networks, chemical structures, biological networks). In this talk, we present some recent research results on structural information. We first establish fundamental limits on the information content of a wide range of data structures with correlated labels, and then propose asymptotically optimal lossless compression algorithms that achieve these limits for unlabeled graphs. We then turn to Markov fields and try to understand structural properties of large systems with local mutual dependencies and interactions. In particular, we focus on enumerating Markov fields and universal types. We tackle most of these problems by complex analysis methods, thus within the realm of analytic information theory.
Wojciech Szpankowski is the Saul Rosen Professor of Computer Science and Electrical and Computer Engineering at Purdue University. He received his M.S. and Ph.D. degrees in Electrical and Computer Engineering from Gdansk University of Technology. He has held several Visiting Professor/Scholar positions, including at Stanford, Hewlett-Packard Labs, INRIA, Ecole Polytechnique, the Newton Institute (Cambridge, UK), and ETH Zurich. He is a Fellow of the IEEE, an Erskine Fellow, and the 2010 recipient of the Humboldt Research Award. His research interests cover analysis of algorithms, information theory, bioinformatics, analytic combinatorics, and stability problems of distributed systems. He published the book "Average Case Analysis of Algorithms on Sequences" (John Wiley & Sons, 2001). Szpankowski has been a guest editor and an editor of several technical journals, including ACM Transactions on Algorithms, Algorithmica, IEEE Transactions on Information Theory, and Combinatorics, Probability and Computing. In 2008 he launched the interdisciplinary Institute for Science of Information, whose mission is to extend classical information theory to modern settings, including knowledge discovery and information extraction from massive datasets. He is the Director of the NSF Science and Technology Center for Science of Information.