GPT-4 is a transformer-based model pre-trained to predict the next token in a document.
Source: GPT-4 Technical Report
| Task | Papers | Share |
|---|---|---|
| Language Modelling | 80 | 11.96% |
| Large Language Model | 48 | 7.17% |
| Question Answering | 39 | 5.83% |
| Retrieval | 26 | 3.89% |
| In-Context Learning | 25 | 3.74% |
| Code Generation | 18 | 2.69% |
| Benchmarking | 16 | 2.39% |
| Sentence | 14 | 2.09% |
| Decision Making | 13 | 1.94% |
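The next-token pre-training objective mentioned at the top can be sketched as an average cross-entropy loss over shifted positions. This is a minimal pure-Python illustration of the objective, not GPT-4's actual implementation; the function name and toy inputs are hypothetical.

```python
import math

def next_token_loss(logits, targets):
    """Average cross-entropy of predicting the next token at each position.

    logits: per-position lists of raw scores over the vocabulary
    targets: the token id that actually follows each position
    """
    total = 0.0
    for scores, target in zip(logits, targets):
        # Log-softmax computed stably by subtracting the max score.
        m = max(scores)
        log_sum = math.log(sum(math.exp(s - m) for s in scores))
        log_prob = (scores[target] - m) - log_sum
        total += -log_prob  # negative log-likelihood of the true next token
    return total / len(targets)

# Toy example: vocabulary of 3 tokens, two positions in a document.
logits = [[2.0, 0.5, 0.1], [0.2, 1.5, 0.3]]
targets = [0, 1]  # the token that actually appears next at each position
print(next_token_loss(logits, targets))
```

Training drives this loss down, which is equivalent to maximizing the likelihood the model assigns to each document's actual continuation.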