BLOOM is a decoder-only Transformer language model that was trained on the ROOTS corpus, a dataset comprising hundreds of sources in 46 natural and 13 programming languages (59 in total).
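Because BLOOM is decoder-only, each position attends only to itself and earlier positions via a causal (lower-triangular) attention mask. A minimal NumPy sketch of that mask, illustrating the standard technique rather than BLOOM's actual implementation:

```python
import numpy as np

def causal_mask(seq_len: int) -> np.ndarray:
    # True where attention is allowed: position i may attend to j <= i.
    return np.tril(np.ones((seq_len, seq_len), dtype=bool))

mask = causal_mask(4)
# Row i has i + 1 allowed positions, so a 4-token mask allows 10 pairs.
```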
Source: *BLOOM: A 176B-Parameter Open-Access Multilingual Language Model*
| Task | Papers | Share |
|---|---|---|
| Language Modelling | 15 | 12.10% |
| Question Answering | 8 | 6.45% |
| Machine Translation | 5 | 4.03% |
| Quantization | 5 | 4.03% |
| Translation | 5 | 4.03% |
| Text Generation | 4 | 3.23% |
| Large Language Model | 4 | 3.23% |
| Benchmarking | 3 | 2.42% |
| Mathematical Reasoning | 2 | 1.61% |
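The Share column is each task's paper count divided by the total number of indexed papers. That total is not stated on the page, but the listed percentages imply roughly 124 papers (e.g. 15 / 0.1210 ≈ 124) — an inference, not a figure from the source. A short sketch reproducing the column under that assumption:

```python
# Assumed total inferred from the listed shares (15 / 0.1210 ≈ 124);
# not stated explicitly on the page.
TOTAL_PAPERS = 124

tasks = {
    "Language Modelling": 15,
    "Question Answering": 8,
    "Machine Translation": 5,
    "Quantization": 5,
    "Translation": 5,
    "Text Generation": 4,
    "Large Language Model": 4,
    "Benchmarking": 3,
    "Mathematical Reasoning": 2,
}

# Recompute each share as a percentage, rounded to two decimals.
shares = {task: round(100 * n / TOTAL_PAPERS, 2) for task, n in tasks.items()}
```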
| Component | Type |
|---|---|
| *(none listed)* | |