Content-based attention is an attention mechanism whose alignment score is the cosine similarity between a decoder state $\textbf{s}_{j}$ and an encoder hidden state $\textbf{h}_{i}$:
$$f_{att}\left(\textbf{h}_{i}, \textbf{s}_{j}\right) = \cos\left[\textbf{h}_{i};\textbf{s}_{j}\right] $$
It was utilised in Neural Turing Machines as part of the Addressing Mechanism.
We produce a normalized attention weighting by taking a softmax over these attention alignment scores.
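The two steps above (cosine-similarity scoring, then a softmax) can be sketched in NumPy as follows; the function name and the toy query/key vectors are illustrative, not from the original paper:

```python
import numpy as np

def content_based_attention(query, keys):
    """Attention weights from cosine similarity, as in NTM content addressing.

    query: (d,) vector, e.g. a decoder state s_j.
    keys:  (n, d) matrix of vectors to attend over, e.g. encoder states h_i.
    """
    # Cosine similarity between the query and every key row.
    sims = keys @ query / (
        np.linalg.norm(keys, axis=1) * np.linalg.norm(query) + 1e-8
    )
    # Softmax over the alignment scores gives a normalized weighting.
    e = np.exp(sims - sims.max())  # subtract max for numerical stability
    return e / e.sum()

# Toy example: the first key points in the same direction as the query,
# so it should receive the largest attention weight.
query = np.array([1.0, 0.0])
keys = np.array([[1.0, 0.0],
                 [0.0, 1.0],
                 [0.7, 0.7]])
weights = content_based_attention(query, keys)
```

Because cosine similarity ignores vector magnitude, the weighting depends only on direction; the Neural Turing Machine additionally sharpens these weights with a learned key-strength scalar before the softmax.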
Source: Neural Turing Machines
| Task | Papers | Share |
|---|---|---|
| Decoder | 5 | 9.26% |
| Question Answering | 5 | 9.26% |
| Speech Recognition | 3 | 5.56% |
| Machine Translation | 3 | 5.56% |
| Translation | 3 | 5.56% |
| Automatic Speech Recognition (ASR) | 2 | 3.70% |
| Retrieval | 2 | 3.70% |
| Image Classification | 2 | 3.70% |
| Sentence | 2 | 3.70% |