Self-Attention
Self-attention is an attention mechanism that relates different positions of a single sequence in order to compute a representation of that same sequence. It is useful in tasks such as machine reading, abstractive summarization, and image description generation.
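As a concrete illustration, below is a minimal single-head scaled dot-product self-attention sketch in NumPy. The shapes, weight matrices, and function names are illustrative assumptions, not part of this entry.

```python
# Minimal sketch of single-head scaled dot-product self-attention.
# All dimensions and weights here are illustrative assumptions.
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax along the given axis.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def self_attention(x, w_q, w_k, w_v):
    """Self-attention over a sequence x of shape (seq_len, d_model).

    Queries, keys, and values are all projections of the same
    sequence, so every position attends to every other position.
    """
    q = x @ w_q  # queries: (seq_len, d_k)
    k = x @ w_k  # keys:    (seq_len, d_k)
    v = x @ w_v  # values:  (seq_len, d_v)
    d_k = q.shape[-1]
    scores = q @ k.T / np.sqrt(d_k)     # (seq_len, seq_len) similarity
    weights = softmax(scores, axis=-1)  # each row sums to 1
    return weights @ v                  # (seq_len, d_v) representation

# Usage: a sequence of 4 positions with model width 8.
rng = np.random.default_rng(0)
x = rng.normal(size=(4, 8))
w_q, w_k, w_v = (rng.normal(size=(8, 8)) for _ in range(3))
print(self_attention(x, w_q, w_k, w_v).shape)  # (4, 8)
```

Each output position is a weighted average of the value vectors of all positions in the same sequence, with weights derived from query-key similarity.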