CodeGen is an autoregressive transformer trained with a next-token-prediction language-modeling objective on a natural-language corpus and programming-language data curated from GitHub.
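The next-token-prediction objective can be sketched as follows. This is a toy illustration of the causal language-modeling loss, not the actual CodeGen implementation: given per-position logits over a small vocabulary, each position is scored against the token that follows it, and the average negative log-likelihood is the training loss.

```python
import math

def next_token_nll(logits, token_ids):
    """Average negative log-likelihood of each token given its prefix.

    logits[t] scores the token at position t+1, so the logit rows are
    paired with the target sequence shifted left by one (the standard
    causal language-modeling loss).
    """
    targets = token_ids[1:]  # predict token t+1 from the prefix 0..t
    total = 0.0
    for step_logits, target in zip(logits, targets):
        # log-softmax over the vocabulary, then pick the target's log-prob
        m = max(step_logits)
        log_z = m + math.log(sum(math.exp(x - m) for x in step_logits))
        total += log_z - step_logits[target]
    return total / len(targets)

# Toy example: vocabulary of 4 tokens, sequence [2, 0, 3]
logits = [
    [0.1, 0.2, 0.0, 2.0],  # scores for the token after prefix [2]
    [0.0, 0.0, 0.0, 3.0],  # scores for the token after prefix [2, 0]
]
loss = next_token_nll(logits, [2, 0, 3])
```

Minimizing this quantity over the GitHub-curated corpus is what drives the model toward completing code left to right.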
Source: CodeGen: An Open Large Language Model for Code with Multi-Turn Program Synthesis
| Task | Papers | Share |
|---|---|---|
| Code Generation | 17 | 42.50% |
| Language Modelling | 5 | 12.50% |
| Program Synthesis | 4 | 10.00% |
| Memorization | 2 | 5.00% |
| In-Context Learning | 2 | 5.00% |
| Large Language Model | 2 | 5.00% |
| Benchmarking | 2 | 5.00% |
| Prompt Engineering | 1 | 2.50% |
| Question Answering | 1 | 2.50% |