Single-Headed Attention is the single-headed attention module used in the SHA-RNN language model. The principal design reasons for single-headedness were simplicity (avoiding running out of memory) and scepticism about the benefits of using multiple heads.
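For illustration, below is a minimal sketch of single-headed scaled dot-product attention in PyTorch. The module name `SingleHeadAttention` and the parameter names are illustrative only; the actual SHA-RNN attention block (see the linked code) differs in its details, which are omitted here.

```python
import math
import torch
import torch.nn as nn
import torch.nn.functional as F

class SingleHeadAttention(nn.Module):
    """Minimal single-headed scaled dot-product attention (a sketch,
    not the exact SHA-RNN implementation)."""

    def __init__(self, d_model: int):
        super().__init__()
        # One query/key/value projection each over the full model
        # dimension: a single head instead of h smaller heads.
        self.q_proj = nn.Linear(d_model, d_model)
        self.k_proj = nn.Linear(d_model, d_model)
        self.v_proj = nn.Linear(d_model, d_model)
        self.scale = 1.0 / math.sqrt(d_model)

    def forward(self, x, attn_mask=None):
        # x: (batch, seq_len, d_model)
        q, k, v = self.q_proj(x), self.k_proj(x), self.v_proj(x)
        # Single (seq_len x seq_len) attention map per layer.
        scores = torch.bmm(q, k.transpose(1, 2)) * self.scale
        if attn_mask is not None:
            # attn_mask: bool tensor, True where attention is disallowed
            # (e.g. future positions in a causal language model).
            scores = scores.masked_fill(attn_mask, float("-inf"))
        weights = F.softmax(scores, dim=-1)
        return torch.bmm(weights, v)
```

One consequence of this design is that each layer materializes a single (seq_len x seq_len) attention map rather than one per head, which is consistent with the memory motivation described above.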
Source: Single Headed Attention RNN: Stop Thinking With Your Head
| Task | Papers | Share |
| --- | --- | --- |
| Sentence | 2 | 25.00% |
| Arabic Text Diacritization | 1 | 12.50% |
| Object Detection | 1 | 12.50% |
| Semantic Part Detection | 1 | 12.50% |
| Language Modelling | 1 | 12.50% |
| Aspect-Based Sentiment Analysis (ABSA) | 1 | 12.50% |
| Sentiment Analysis | 1 | 12.50% |