Parents: Wikipedia — Recurrent neural network
The natural architecture for neural networks that deal with sequences: a network where **connections between units form a directed cycle**. This creates an **internal state of the network**, which allows it to exhibit **dynamic temporal behavior**. Unlike feedforward neural networks, RNNs can use their internal memory to process input sequences of arbitrary length, which makes them applicable to tasks such as unsegmented connected handwriting recognition or speech recognition.

Two broad classes: **finite impulse** and **infinite impulse** (a finite impulse RNN can be unrolled and replaced with a strictly feedforward neural network).

RNNs suffer from the **vanishing gradient problem**, which prevents them from learning long-range dependencies. [#LSTMs](/tag/lstm_networks) improve on this with a gating mechanism that allows explicit memory deletes and updates (see the gating sketch at the end of this note).

RNN in NLP:
- Goal: representing a sequence of words as dense vectors
- Input: a sequence of words (or characters)
- Output: a sequence of hidden states, each a representation of the sequence from the beginning up to that position (see the forward-pass sketch after this list)
- Advantage: encodes sequential relationships and dependencies among words
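A minimal sketch of this sequence-to-hidden-states mapping, written as a vanilla (Elman) RNN forward pass in NumPy. The function name, weight names, and dimensions are illustrative assumptions, not taken from the note:

```python
# Minimal vanilla RNN forward pass -- an illustrative sketch only.
# All names and sizes here are assumptions for demonstration.
import numpy as np

def rnn_forward(x_seq, W_xh, W_hh, b_h):
    """Map a sequence of input vectors to a sequence of hidden states.

    x_seq : (T, input_dim)  -- e.g. word embeddings for T tokens
    Returns (T, hidden_dim) -- h_t represents the sequence up to position t.
    """
    T = x_seq.shape[0]
    hidden_dim = W_hh.shape[0]
    h = np.zeros(hidden_dim)           # internal state, carried across steps
    states = np.zeros((T, hidden_dim))
    for t in range(T):
        # The recurrence: the same weights are reused at every position,
        # and h feeds back into itself (the "directed cycle", unrolled).
        h = np.tanh(W_xh @ x_seq[t] + W_hh @ h + b_h)
        states[t] = h
    return states

# Toy usage: 5 tokens with 8-dim embeddings, 16-dim hidden state.
rng = np.random.default_rng(0)
x = rng.normal(size=(5, 8))
states = rnn_forward(x,
                     W_xh=rng.normal(size=(16, 8)) * 0.1,
                     W_hh=rng.normal(size=(16, 16)) * 0.1,
                     b_h=np.zeros(16))
print(states.shape)  # (5, 16): one hidden state per position
```

Unrolling this loop over a fixed-length input is exactly what turns a finite impulse RNN into a strictly feedforward network.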
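The gating mechanism mentioned above, as a single LSTM step in the same NumPy style. This is a hypothetical sketch to make "explicit memory deletes and updates" concrete; the weight layout and names are assumptions:

```python
# Minimal LSTM cell step -- an illustrative sketch of the gating idea.
# Weight layout (4 stacked gates in one matrix) is an assumed convention.
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def lstm_step(x, h, c, W, b):
    """One LSTM step. W : (4*hidden, input+hidden), b : (4*hidden,).

    The forget gate f scales down (deletes) old cell memory, the input
    gate i writes new candidate content g, and the output gate o controls
    what is exposed as the hidden state. The largely additive update of c
    is what helps gradients survive over long ranges.
    """
    z = W @ np.concatenate([x, h]) + b
    f, i, o, g = np.split(z, 4)
    f, i, o = sigmoid(f), sigmoid(i), sigmoid(o)
    c = f * c + i * np.tanh(g)   # explicit delete (f*c) and update (i*tanh(g))
    h = o * np.tanh(c)
    return h, c
```

Compared with the vanilla cell, the state `c` is rewritten multiplicatively only through the learned gates rather than being squashed through `tanh` at every step, which is why gradients vanish less readily.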