NIPS 2003 workshop on
Information Theory and Learning:

The Bottleneck and Distortion Approach

The interaction between Shannon’s information theory and learning has a long and fascinating history. Beyond the well-studied relationship between compression, short description length, and statistical modeling (e.g., in Rissanen’s MDL principle), there is a deeper interpretation that stems from the duality of source and channel coding. This duality, related also to the notion of coding with “side information” (Wyner 1975), leads to an elegant variational principle known as the “Information Bottleneck” (Tishby, Pereira, and Bialek 1999). The method has drawn growing attention in recent years, mainly as an unsupervised, non-parametric data-organization technique, and has been applied successfully to document clustering, data mining, and neural coding problems. Related approaches include the Information Distortion method, used in neural coding analysis, and Deterministic Annealing, used in clustering, compression, regression, and other optimization problems. The method is also related to various well-known techniques and principles such as Canonical Correlation Analysis (CCA), IMAX (Becker & Hinton), and Infomax (Linsker).
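To make the variational principle concrete: given a joint distribution p(x, y), the Information Bottleneck seeks a compressed representation T of X that trades off compression I(T;X) against relevance I(T;Y), controlled by a parameter β. The sketch below iterates the self-consistent equations from Tishby, Pereira & Bialek (1999). It is a minimal illustration written for this page, not code from the workshop; the function name, initialization, and stopping rule are our own choices.

```python
import numpy as np

def information_bottleneck(p_xy, n_clusters, beta, n_iter=200, seed=0):
    """Iterative IB: alternate the self-consistent equations for
    p(t|x), p(t), and p(y|t) given a joint distribution p(x, y)."""
    rng = np.random.default_rng(seed)
    p_x = p_xy.sum(axis=1)                      # marginal p(x)
    p_y_given_x = p_xy / p_x[:, None]           # rows: p(y|x)
    n_x = p_xy.shape[0]

    # random soft assignment p(t|x); each row sums to 1
    p_t_given_x = rng.random((n_x, n_clusters))
    p_t_given_x /= p_t_given_x.sum(axis=1, keepdims=True)

    for _ in range(n_iter):
        p_t = p_x @ p_t_given_x                                # p(t)
        # p(y|t) = sum_x p(y|x) p(x|t), with p(x|t) = p(t|x) p(x) / p(t)
        p_y_given_t = (p_t_given_x * p_x[:, None]).T @ p_y_given_x
        p_y_given_t /= p_t[:, None]
        # KL[p(y|x) || p(y|t)] for every (x, t) pair
        log_ratio = (np.log(p_y_given_x[:, None, :] + 1e-12)
                     - np.log(p_y_given_t[None, :, :] + 1e-12))
        kl = (p_y_given_x[:, None, :] * log_ratio).sum(axis=2)
        # self-consistent update: p(t|x) proportional to p(t) exp(-beta * KL)
        p_t_given_x = p_t[None, :] * np.exp(-beta * kl)
        p_t_given_x /= p_t_given_x.sum(axis=1, keepdims=True)

    return p_t_given_x
```

As β grows the assignments p(t|x) harden, recovering a deterministic clustering of X by the similarity of the conditionals p(y|x); annealing β upward is the usual practical strategy, which is where the connection to Deterministic Annealing enters.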

The workshop provided an overview of these methods and their recent extensions, and served as a forum for exchanging ideas among the various groups that use these techniques. By bringing together theorists and practitioners, we hoped to expose and discuss theoretical developments and improvements in the various algorithms and information-theoretic ideas.

 

Workshop Program (Saturday, December 13)

Morning session

Afternoon session

Format

This one-day workshop provided an excellent forum for discussing both the theory and the algorithms related to the IB and Information Distortion approaches.

 

We plan to publish a book with the workshop proceedings and beyond – stay tuned!

 

Organizers:

Naftali Tishby, School of Computer Science and Engineering and Center for Neural Computation, The Hebrew University, Jerusalem 91904, Israel. tishby@cs.huji.ac.il, Fax: +972-2-6757330 

Thomas Gedeon, Department of Mathematics, Montana State University, Bozeman, MT 59715. gedeon@math.montana.edu, Phone: (406)-994-5359, Fax: (406)-994-1789

Keywords

Unsupervised Learning, Complexity–Accuracy Tradeoff, Coding with Side Information, Rate-Distortion Theory, Matched Source and Channel, Contingency Table Analysis, Neural Coding, Hierarchical Models, Deterministic Annealing, Canonical Correlation Analysis, Sufficient Dimensionality Reduction, Learning with Context, Information-Maximization Algorithms (Infomax, IMAX)

 
