On the Information Bottleneck Theory of Deep Learning

This is a presentation of other people's papers for UNL CSCE 990: Deep Learning Seminar, taught by Prof. Stephen Scott and Prof. Vinodchandran N. Variyam in Fall 2018.


Download Slides Here

Related Papers

  1. A. M. Saxe, Y. Bansal, J. Dapello, M. Advani, A. Kolchinsky, B. D. Tracey, and D. D. Cox, "On the Information Bottleneck Theory of Deep Learning," ICLR 2018. [Online] https://openreview.net/forum?id=ry_WPG-A-
  2. R. Shwartz-Ziv and N. Tishby, "Opening the Black Box of Deep Neural Networks via Information," arXiv preprint, 2017. [Online] https://arxiv.org/abs/1703.00810
  3. M. Moshkovitz and N. Tishby, "Mixing Complexity and its Applications to Neural Networks," 2017. [Online] https://arxiv.org/abs/1703.00729
  4. N. Tishby and N. Zaslavsky, "Deep Learning and the Information Bottleneck Principle," in Information Theory Workshop (ITW), 2015 IEEE, pages 1-5. IEEE, 2015.
  5. N. Tishby, F. C. Pereira, and W. Bialek, "The Information Bottleneck Method," in Proceedings of the 37th Annual Allerton Conference on Communication, Control and Computing, 1999.