On the Information Bottleneck Theory of Deep Learning
Date:
This is a presentation of other people's papers for UNL CSCE 990: Deep Learning Seminar, taught by Prof. Stephen Scott and Prof. Vinodchandran N. Variyam in Fall 2018.
Related Papers
- A. M. Saxe, Y. Bansal, J. Dapello, M. Advani, A. Kolchinsky, B. D. Tracey, and D. D. Cox, "On the Information Bottleneck Theory of Deep Learning", ICLR 2018. [Online] https://openreview.net/forum?id=ry_WPG-A-
- R. Shwartz-Ziv and N. Tishby, "Opening the Black Box of Deep Neural Networks via Information", arXiv preprint, 2017. [Online] https://arxiv.org/abs/1703.00810
- M. Moshkovitz and N. Tishby, "Mixing Complexity and its Applications to Neural Networks", arXiv preprint, 2017. [Online] https://arxiv.org/abs/1703.00729
- N. Tishby and N. Zaslavsky, "Deep Learning and the Information Bottleneck Principle", in IEEE Information Theory Workshop (ITW), pp. 1-5, 2015.
- N. Tishby, F. C. Pereira, and W. Bialek, "The Information Bottleneck Method", in Proceedings of the 37th Annual Allerton Conference on Communication, Control and Computing, 1999.