Recurrent neural network (from Wikipedia)
A neural network where **connections between units form a directed cycle**. This creates an **internal state of the network** which allows it to exhibit **dynamic temporal behavior**. Unlike feedforward neural networks, RNNs can use their internal memory to process arbitrary sequences of inputs, which makes them applicable to tasks such as unsegmented connected handwriting recognition or speech recognition.

Two broad classes: finite impulse and infinite impulse (a finite impulse RNN can be unrolled and replaced with a strictly feedforward neural network).

RNN in NLP:
- Goal: representing a sequence of words as dense vectors
- Input: a sequence of words (or characters)
- Output: a sequence of hidden states, each a representation of the sequence from the beginning up to a specific position
- Advantages: encoding sequential relationships and dependencies among words
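The recurrence behind those hidden states can be sketched in a few lines. This is a minimal, dependency-free vanilla RNN forward pass (all weight values and sizes below are illustrative, not from the source): each hidden state h_t = tanh(W_xh x_t + W_hh h_{t-1} + b_h) summarizes the sequence up to position t.

```python
import math

def rnn_forward(x_seq, W_xh, W_hh, b_h):
    """Vanilla RNN forward pass.

    x_seq: list of input vectors (one per position)
    W_xh:  hidden x input weight matrix (list of rows)
    W_hh:  hidden x hidden recurrent weight matrix
    b_h:   hidden bias vector
    Returns one hidden-state vector per input position.
    """
    hidden = len(b_h)
    h = [0.0] * hidden          # initial state: zeros
    states = []
    for x in x_seq:
        new_h = []
        for i in range(hidden):
            # h_t[i] = tanh( W_xh[i]·x_t + W_hh[i]·h_{t-1} + b_h[i] )
            s = b_h[i]
            s += sum(W_xh[i][j] * x[j] for j in range(len(x)))
            s += sum(W_hh[i][j] * h[j] for j in range(hidden))
            new_h.append(math.tanh(s))
        h = new_h               # the directed cycle: state feeds back in
        states.append(h)
    return states

# Toy run: 4 input vectors of size 3, hidden size 2 (weights chosen arbitrarily)
x_seq = [[1.0, 0.0, 0.0], [0.0, 1.0, 0.0], [0.0, 0.0, 1.0], [1.0, 1.0, 0.0]]
W_xh = [[0.5, -0.2, 0.1], [0.3, 0.8, -0.5]]
W_hh = [[0.1, 0.4], [-0.3, 0.2]]
b_h = [0.0, 0.0]
states = rnn_forward(x_seq, W_xh, W_hh, b_h)
print(len(states))  # one hidden state per input position
```

Because h feeds back into the next step, the last element of `states` depends on every input seen so far, which is exactly the "representation of the sequence from the beginning up to a position" described above.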