Location-based Attention is an attention mechanism in which the alignment scores are computed from solely the target hidden state $\mathbf{h}_{t}$ as follows:
$$ \mathbf{a}_{t} = \text{softmax}(\mathbf{W}_{a}\mathbf{h}_{t}) $$
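Because the scores depend only on $\mathbf{h}_{t}$ (not on the source hidden states), the mechanism reduces to a linear projection followed by a softmax. A minimal NumPy sketch, where the hidden size and source length are illustrative assumptions:

```python
import numpy as np

def location_based_attention(h_t, W_a):
    """Alignment weights computed solely from the target hidden state h_t.

    h_t : (hidden_size,) target hidden state
    W_a : (source_len, hidden_size) learned projection
    returns a_t : (source_len,) attention weights summing to 1
    """
    scores = W_a @ h_t                      # one score per source position
    exp = np.exp(scores - scores.max())     # numerically stable softmax
    return exp / exp.sum()

# hypothetical shapes: hidden size 4, source length 3
rng = np.random.default_rng(0)
h_t = rng.standard_normal(4)
W_a = rng.standard_normal((3, 4))
a_t = location_based_attention(h_t, W_a)
```

Note that each softmax output position corresponds to an absolute source location, so `source_len` is fixed by the shape of $\mathbf{W}_{a}$.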
Source: *Effective Approaches to Attention-based Neural Machine Translation*
Task | Papers | Share
---|---|---
Question Answering | 6 | 8.57%
Retrieval | 5 | 7.14%
Translation | 5 | 7.14%
Decoder | 4 | 5.71%
Machine Translation | 4 | 5.71%
Sentence | 3 | 4.29%
Language Modelling | 3 | 4.29%
Text Generation | 2 | 2.86%
Automatic Speech Recognition (ASR) | 2 | 2.86%