Dropout

Dropout is a regularization technique for neural networks that randomly drops units (along with their connections) at training time: each unit is retained with probability $p$ and dropped with probability $1-p$ (a common value is $p=0.5$). At test time, all units are present, but their outgoing weights are scaled by $p$ (i.e. $w$ becomes $pw$), so the expected input to each unit matches its training-time average.
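To make the train/test behaviour concrete, here is a minimal NumPy sketch of a dropout forward pass. The function name `dropout_forward` and its signature are illustrative, not taken from the paper:

```python
import numpy as np

def dropout_forward(x, p=0.5, training=True, rng=None):
    """Dropout as described above: each unit is retained with
    probability p at training time; at test time all units are
    kept and activations are scaled by p (w -> pw).
    Hypothetical helper for illustration only.
    """
    if rng is None:
        rng = np.random.default_rng(0)
    if training:
        # Sample a binary mask: 1 (keep) with probability p, 0 (drop) otherwise.
        mask = (rng.random(x.shape) < p).astype(x.dtype)
        return x * mask
    # Test time: every unit present, scaled by p so the expected
    # pre-activation matches its training-time average.
    return p * x

# Example: p = 0.5 zeroes roughly half the activations during training.
x = np.ones(8)
print(dropout_forward(x, p=0.5, training=True))   # e.g. [1. 0. 1. 1. 0. ...]
print(dropout_forward(x, p=0.5, training=False))  # [0.5 0.5 ... 0.5]
```

Note that modern frameworks commonly implement the equivalent "inverted dropout" instead, dividing the retained activations by $p$ at training time so that the test-time pass needs no scaling at all.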

The idea is to prevent co-adaptation, where units become overly reliant on particular other units, a pattern that is symptomatic of overfitting. Intuitively, dropout can be thought of as training an implicit ensemble of thinned sub-networks that share weights, whose predictions are approximately averaged at test time.

Source: Srivastava et al., "Dropout: A Simple Way to Prevent Neural Networks from Overfitting", JMLR 2014

Tasks


Task                    Papers   Share
Language Modelling          50   6.36%
Retrieval                   30   3.82%
Question Answering          28   3.56%
Large Language Model        25   3.18%
Semantic Segmentation       23   2.93%
In-Context Learning         15   1.91%
Object Detection            14   1.78%
Sentence                    12   1.53%
Image Segmentation          10   1.27%
