[1605.07427] Hierarchical Memory Networks - Chandar et al. (2016)(About) > a hybrid between hard and soft attention memory networks. The memory is organized in a hierarchical structure such that reading from it is done with less computation than soft attention over a flat memory, while also being easier to train than hard attention over a flat memory
How do RBMs work? - Quora(About) > You can think of it a little bit like you think about Principal Components Analysis, in that it is trained by unsupervised learning so as to capture the leading variations in the data, and it yields a new representation of the data
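A minimal sketch of that idea: an RBM trained with one-step contrastive divergence (CD-1) on toy binary data, where the hidden-unit probabilities become the new representation, much as PCA scores do. All weights, sizes, and the toy data here are illustrative assumptions, not from the answer.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Toy binary data: two repeated 6-bit patterns (illustrative assumption)
data = np.array([[1, 1, 1, 0, 0, 0],
                 [0, 0, 0, 1, 1, 1]] * 50, dtype=float)

n_visible, n_hidden = 6, 2
W = rng.normal(0, 0.1, (n_visible, n_hidden))
b_v = np.zeros(n_visible)  # visible biases
b_h = np.zeros(n_hidden)   # hidden biases

lr = 0.1
for epoch in range(200):
    v0 = data
    # Positive phase: hidden probabilities given the data
    h0 = sigmoid(v0 @ W + b_h)
    # CD-1: sample hiddens, reconstruct visibles, re-infer hiddens
    h_sample = (rng.random(h0.shape) < h0).astype(float)
    v1 = sigmoid(h_sample @ W.T + b_v)
    h1 = sigmoid(v1 @ W + b_h)
    # Move toward data statistics, away from reconstruction statistics
    W += lr * (v0.T @ h0 - v1.T @ h1) / len(data)
    b_v += lr * (v0 - v1).mean(axis=0)
    b_h += lr * (h0 - h1).mean(axis=0)

# The hidden probabilities are the learned new representation of the data
codes = sigmoid(data @ W + b_h)
```

Like PCA, training is unsupervised and yields a compact code capturing the leading variations; unlike PCA, the code is a nonlinear, stochastic function of the input.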
How does one apply deep learning to time series forecasting? - Quora(About) > I would use the state-of-the-art [recurrent nets](/tag/recurrent_neural_network.html) (using gated units and multiple layers) to make predictions at each time step for some future horizon of interest. The RNN is then updated with the next observation to be ready for making the next prediction
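The predict-then-update loop described above can be sketched as below. The RNN cell here uses fixed random weights purely to show the walk-forward structure; in practice the weights would be gated units (GRU/LSTM, possibly stacked) trained with backpropagation through time, and the sine-wave series is an assumed toy input.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy univariate series: a noisy sine wave (illustrative assumption)
t = np.arange(200)
series = np.sin(0.1 * t) + 0.05 * rng.standard_normal(200)

# Untrained vanilla RNN cell, standing in for a learned gated RNN
n_hidden = 8
W_xh = rng.normal(0, 0.3, (1, n_hidden))
W_hh = rng.normal(0, 0.3, (n_hidden, n_hidden))
W_hy = rng.normal(0, 0.3, (n_hidden, 1))

h = np.zeros(n_hidden)
predictions = []
for x in series[:-1]:
    # Update the recurrent state with the latest observation...
    h = np.tanh(x * W_xh[0] + h @ W_hh)
    # ...then emit a prediction for the next time step
    predictions.append(float(h @ W_hy[:, 0]))

predictions = np.array(predictions)  # predictions[i] targets series[i + 1]
```

The same loop extends to multi-step horizons by having the output layer emit a vector of future values instead of a scalar.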
The Consciousness Prior(About) "consciousness seen as the formation of a low-dimensional combination of a few concepts constituting a conscious thought, i.e., consciousness as awareness at a particular time instant": the conscious state as a projection of a big vector (everything conscious and unconscious in the brain). Attention: the additional mechanism describing what the mind chooses to focus on.
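One way to picture that projection, as a sketch: hard top-k attention selects a few elements of a high-dimensional state vector to form the low-dimensional conscious state. The vector size, the value of k, and the random scores are all assumptions for illustration; in the paper the attention would be produced by a learned network.

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical high-dimensional full state h (conscious + unconscious content)
h = rng.standard_normal(512)

# Attention scores decide what the "mind" focuses on; random here for illustration
scores = rng.standard_normal(512)

k = 4  # only a handful of concepts enter the conscious thought
focus = np.argsort(scores)[-k:]  # indices of the attended concepts
c = np.zeros_like(h)
c[focus] = h[focus]              # sparse, low-dimensional conscious state
```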