[1503.02406] Deep Learning and the Information Bottleneck Principle (2015)
Tags:
> Deep Neural Networks (DNNs) are analyzed via the theoretical framework of the information bottleneck (IB) principle. We first show that any DNN can be quantified by the mutual information between the layers and the input and output variables. Using this representation we can calculate the optimal information theoretic limits of the DNN.
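
The abstract's central quantities are the mutual information values between a hidden layer T and the input X and label Y, i.e. I(X;T) and I(T;Y). A minimal sketch of how such quantities are commonly estimated in practice is below; the binning-based plug-in estimator and all function names here are illustrative assumptions, not something specified in the paper itself.

```python
import numpy as np

def discretize(activations, n_bins=30):
    """Bin continuous layer activations and map each sample's binned vector to one symbol."""
    edges = np.linspace(activations.min(), activations.max(), n_bins + 1)
    binned = np.digitize(activations, edges)
    # Collapse each row (one sample's binned activation vector) into a single discrete id.
    _, symbols = np.unique(binned, axis=0, return_inverse=True)
    return symbols

def mutual_information(a, b):
    """Plug-in (empirical) estimate of I(A;B) in bits for two discrete arrays of equal length."""
    n = len(a)
    joint, pa, pb = {}, {}, {}
    for ai, bi in zip(a, b):
        joint[(ai, bi)] = joint.get((ai, bi), 0) + 1
        pa[ai] = pa.get(ai, 0) + 1
        pb[bi] = pb.get(bi, 0) + 1
    mi = 0.0
    for (ai, bi), c in joint.items():
        p_ab = c / n
        mi += p_ab * np.log2(p_ab / ((pa[ai] / n) * (pb[bi] / n)))
    return mi

# Hypothetical usage (names assumed, not from the paper):
#   T = discretize(layer_activations)       # discretized hidden representation
#   I_XT = mutual_information(X_ids, T)     # information the layer retains about the input
#   I_TY = mutual_information(T, Y)         # information the layer retains about the label
```

Plotting I(X;T) against I(T;Y) for each layer gives the "information plane" view the IB framework uses to compare layers against the optimal information-theoretic limit.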