In-Context Learning

441 papers with code • 0 benchmarks • 0 datasets

In-context learning (ICL) is the ability of a language model to pick up a new task from a few input/output examples supplied in its prompt, adapting its behavior at inference time without any parameter updates.
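
A minimal sketch of the idea, using Hugging Face transformers with GPT-2 as a stand-in model (the prompt format and labels are illustrative assumptions, not tied to any paper below):

```python
from transformers import pipeline

# In-context learning: the "training set" lives entirely in the prompt.
# No gradient updates happen; the model conditions on the demonstrations.
generator = pipeline("text-generation", model="gpt2")

prompt = (
    "Review: The plot was dull and predictable. Sentiment: negative\n"
    "Review: A joyful, beautifully shot film. Sentiment: positive\n"
    "Review: I loved every minute of it. Sentiment:"
)

# max_new_tokens=1 so the model emits only the predicted label token.
out = generator(prompt, max_new_tokens=1, do_sample=False)
print(out[0]["generated_text"])
```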

Libraries

Use these libraries to find In-Context Learning models and implementations
See all 7 libraries.

Most implemented papers

TabPFN: A Transformer That Solves Small Tabular Classification Problems in a Second

automl/tabpfn 5 Jul 2022

We present TabPFN, a trained Transformer that performs supervised classification on small tabular datasets in under a second, requires no hyperparameter tuning, and is competitive with state-of-the-art classification methods.
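
The released package exposes a scikit-learn style interface; a quick sketch (the dataset is illustrative, and exact constructor options vary across tabpfn versions):

```python
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import train_test_split
from tabpfn import TabPFNClassifier  # pip install tabpfn

X, y = load_breast_cancer(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# TabPFN is a pretrained Transformer: fit() essentially stores the training
# set, and predict() runs a forward pass conditioned on it (in-context).
clf = TabPFNClassifier(device="cpu")
clf.fit(X_train, y_train)
print("accuracy:", clf.score(X_test, y_test))
```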

Neural Codec Language Models are Zero-Shot Text to Speech Synthesizers

microsoft/unilm 5 Jan 2023

In addition, we find that VALL-E can preserve the speaker's emotion and the acoustic environment of the acoustic prompt in synthesis.

From system models to class models: An in-context learning paradigm

forgi86/sysid-neural-transformers 25 Aug 2023

Is it possible to understand the intricacies of a dynamical system not solely from its input/output pattern, but also by observing the behavior of other systems within the same class?
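
One way to read the question concretely: meta-train on trajectories from many randomly sampled systems of the same class, so a model must infer a new system's dynamics from context alone. A sketch of that data-generation step in numpy (the system class and sizes are assumptions, not the paper's exact setup):

```python
import numpy as np

rng = np.random.default_rng(0)

def random_stable_lti(n_states=3):
    """Sample a random discrete-time LTI system with spectral radius < 1."""
    A = rng.normal(size=(n_states, n_states))
    A *= 0.95 / max(abs(np.linalg.eigvals(A)))  # rescale to ensure stability
    B = rng.normal(size=(n_states, 1))
    C = rng.normal(size=(1, n_states))
    return A, B, C

def simulate(A, B, C, T=100):
    """Roll out one input/output trajectory under white-noise input."""
    x = np.zeros((A.shape[0], 1))
    u = rng.normal(size=(T, 1))
    y = np.empty(T)
    for t in range(T):
        y[t] = float(C @ x)
        x = A @ x + B * u[t]
    return u, y

# Each meta-training example comes from a fresh system in the class; a
# Transformer trained across many such systems learns the class, not one model.
batch = [simulate(*random_stable_lti()) for _ in range(8)]
```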

PanGu-$\alpha$: Large-scale Autoregressive Pretrained Chinese Language Models with Auto-parallel Computation

mindspore-ai/models 26 Apr 2021

To enhance the generalization ability of PanGu-$\alpha$, we collect 1.1TB of high-quality Chinese data from a wide range of domains to pretrain the model.

Data Distributional Properties Drive Emergent In-Context Learning in Transformers

deepmind/emergent_in_context_learning 22 Apr 2022

In further experiments, we found that naturalistic data distributions elicited in-context learning only in transformers, not in recurrent models.
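
Two of the data properties the paper highlights are burstiness (items recurring within a sequence) and a skewed, Zipfian marginal over many classes. A small sampler sketch with illustrative sizes:

```python
import numpy as np

rng = np.random.default_rng(0)
num_classes, seq_len, zipf_exponent = 1000, 16, 1.0

# Zipfian marginal: a few classes are common, most are rare.
ranks = np.arange(1, num_classes + 1)
p = ranks ** -zipf_exponent
p /= p.sum()

def bursty_sequence():
    """Draw a handful of classes from the skewed marginal, then let them
    repeat within the sequence (burstiness)."""
    classes = rng.choice(num_classes, size=4, replace=False, p=p)
    return rng.choice(classes, size=seq_len)

print(bursty_sequence())
```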

OpenICL: An Open-Source Framework for In-context Learning

shark-nlp/openicl 6 Mar 2023

However, implementing ICL is intricate due to the diverse retrieval and inference methods involved, as well as the varying pre-processing requirements of different models, datasets, and tasks.
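
A usage sketch adapted from the project's README (dataset, template strings, and model name follow its SST-2 example; exact signatures may differ across OpenICL versions):

```python
from openicl import DatasetReader, PromptTemplate, TopkRetriever, PPLInferencer

# Wrap a Hugging Face dataset, declaring input and output columns.
data = DatasetReader("gpt3mix/sst2", input_columns=["text"], output_column="label")

# One template per label; </E> marks where retrieved demonstrations go.
template = PromptTemplate(
    {0: "</E>Positive Movie Review: </text>",
     1: "</E>Negative Movie Review: </text>"},
    {"text": "</text>"},
    ice_token="</E>",
)

retriever = TopkRetriever(data, ice_num=8)        # retrieval strategy
inferencer = PPLInferencer(model_name="gpt2-xl")  # inference strategy
predictions = inferencer.inference(retriever, ice_template=template)
```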

What needs to go right for an induction head? A mechanistic study of in-context learning circuits and their formation

aadityasingh/icl-dynamics 10 Apr 2024

By clamping subsets of activations throughout training, we then identify three underlying subcircuits that interact to drive IH formation, yielding the phase change.
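
"Clamping" here means pinning chosen activations to fixed values while the rest of the network runs normally. A generic PyTorch sketch of the mechanism (the module, indices, and clamp value are illustrative; the paper's experiments use its own transformer setup):

```python
import torch

def make_clamp_hook(indices, value=0.0):
    """Forward hook that overwrites selected activation channels."""
    def hook(module, inputs, output):
        clamped = output.clone()
        clamped[..., indices] = value   # pin a subset of units
        return clamped                  # returned tensor replaces the output
    return hook

layer = torch.nn.Linear(16, 16)         # stand-in for an attention sublayer
handle = layer.register_forward_hook(make_clamp_hook(indices=[0, 3, 7]))

x = torch.randn(2, 16)
y = layer(x)                            # gradients still flow through the rest
handle.remove()                         # detach the hook when done
```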

What Changes Can Large-scale Language Models Bring? Intensive Study on HyperCLOVA: Billions-scale Korean Generative Pretrained Transformers

kakaobrain/kogpt EMNLP 2021

GPT-3 shows the remarkable in-context learning ability of large-scale language models (LMs) trained on hundreds of billions of tokens.

MetaICL: Learning to Learn In Context

facebookresearch/metaicl NAACL 2022

We introduce MetaICL (Meta-training for In-Context Learning), a new meta-training framework for few-shot learning where a pretrained language model is tuned to do in-context learning on a large set of training tasks.
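
The meta-training objective is simply the language-modeling loss on the target, conditioned on k demonstrations from the same task. A condensed sketch with Hugging Face transformers (the task sampling and formatting are simplified assumptions; see facebookresearch/metaicl for the real pipeline):

```python
import random, torch
from transformers import AutoModelForCausalLM, AutoTokenizer

tok = AutoTokenizer.from_pretrained("gpt2")
model = AutoModelForCausalLM.from_pretrained("gpt2")
opt = torch.optim.AdamW(model.parameters(), lr=1e-5)

# tasks: {task_name: [(input_text, output_text), ...]} -- placeholder data.
tasks = {"copy": [("say hi", "hi"), ("say bye", "bye"), ("say ok", "ok")]}

for step in range(10):
    examples = random.sample(tasks["copy"], k=3)   # one task, k shots
    *demos, (x, y) = examples                      # last pair is the query
    prompt = "".join(f"{a}\n{b}\n" for a, b in demos) + f"{x}\n"
    ids = tok(prompt + y, return_tensors="pt").input_ids
    labels = ids.clone()
    labels[:, : tok(prompt, return_tensors="pt").input_ids.shape[1]] = -100
    loss = model(ids, labels=labels).loss          # loss only on the target
    loss.backward(); opt.step(); opt.zero_grad()
```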

Learning To Retrieve Prompts for In-Context Learning

ohadrubin/epr NAACL 2022

In-context learning is a recent paradigm in natural language understanding: a large pre-trained language model (LM) observes a test instance and a few training examples as its input, then directly decodes the output without any update to its parameters.
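
EPR's core signal is to rank candidate demonstrations by how much they raise the LM's likelihood of the target output. A simplified one-shot scoring sketch (the toy candidates and query are illustrative; the paper uses these scores to train a dense retriever):

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

tok = AutoTokenizer.from_pretrained("gpt2")
model = AutoModelForCausalLM.from_pretrained("gpt2").eval()

def target_logprob(demo, x, y):
    """Mean log-likelihood of target y given one candidate demonstration."""
    prompt = f"{demo}\n{x}\n"
    ids = tok(prompt + y, return_tensors="pt").input_ids
    labels = ids.clone()
    labels[:, : tok(prompt, return_tensors="pt").input_ids.shape[1]] = -100
    with torch.no_grad():
        return -model(ids, labels=labels).loss.item()

candidates = ["say hi\nhi", "2+2\n4", "capital of France\nParis"]
x, y = "capital of Italy", "Rome"
best = max(candidates, key=lambda d: target_logprob(d, x, y))
print("best demonstration:", best)
```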