Abstract: Emerging problems in data analysis and management create a demand for new methods of data storage, aggregation, and transmission. Systems as diverse as data centers and wireless networks have put forward new requirements for the encoding and representation of digital information. In this talk, we discuss three problems arising in this modern coding-theoretic context: (i) We present coding schemes that minimize communication costs in distributed storage and learning systems. Our schemes were evaluated on Amazon EC2 clusters and shown to improve upon state-of-the-art solutions. (ii) We address long-standing coding-theoretic problems concerning Reed-Muller codes. We present advances toward the capacity-achieving conjecture as well as a new decoder for Reed-Muller codes that significantly outperforms the state-of-the-art codes adopted in the 5G standard. (iii) We discuss a new method of using neural networks with certain permutation-invariant properties to decode codes with symmetric structures, which suggests a way to construct, and efficiently decode, codes with the best possible finite-length performance.
Bio: Min Ye received his B.S. in Electrical Engineering from Peking University, Beijing, China in 2012, and his Ph.D. from the Department of Electrical and Computer Engineering at the University of Maryland, College Park in 2017. He is currently a postdoctoral researcher at Princeton University. His research interests include coding theory, information theory, differential privacy, and machine learning.
Host: Yury Polyanskiy