Skip Connection Blocks

Residual Block

Introduced by He et al. in Deep Residual Learning for Image Recognition

Residual Blocks are skip-connection blocks that learn residual functions with reference to the layer inputs, instead of learning unreferenced functions. They were introduced as part of the ResNet architecture.

Formally, denoting the desired underlying mapping as $\mathcal{H}(x)$, we let the stacked nonlinear layers fit the mapping $\mathcal{F}(x) := \mathcal{H}(x) - x$. The original mapping is then recast as $\mathcal{F}(x) + x$, which can be realized with a shortcut (identity) connection and element-wise addition. $\mathcal{F}(x)$ acts as the residual, hence the name 'residual block'.
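As a concrete illustration, the following is a minimal sketch of a basic residual block in PyTorch. The two 3x3 convolutions with batch normalization mirror the basic ResNet block, but the channel count and input shape are illustrative assumptions rather than part of the original description.

```python
import torch
import torch.nn as nn


class BasicResidualBlock(nn.Module):
    """Minimal residual block: output = F(x) + x, where F is two stacked 3x3 conv layers."""

    def __init__(self, channels: int):
        super().__init__()
        # F(x): the residual function fitted by the stacked nonlinear layers
        self.conv1 = nn.Conv2d(channels, channels, kernel_size=3, padding=1, bias=False)
        self.bn1 = nn.BatchNorm2d(channels)
        self.conv2 = nn.Conv2d(channels, channels, kernel_size=3, padding=1, bias=False)
        self.bn2 = nn.BatchNorm2d(channels)
        self.relu = nn.ReLU(inplace=True)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        identity = x                          # the skip (identity) connection
        out = self.relu(self.bn1(self.conv1(x)))
        out = self.bn2(self.conv2(out))
        out = out + identity                  # F(x) + x
        return self.relu(out)


# Example usage (shapes are illustrative): the block preserves the input shape,
# so the identity shortcut can be added directly.
block = BasicResidualBlock(channels=64)
y = block(torch.randn(1, 64, 32, 32))        # -> torch.Size([1, 64, 32, 32])
```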

The intuition is that it is easier to optimize the residual mapping than the original, unreferenced mapping. In the extreme case, if an identity mapping were optimal, it would be easier to push the residual to zero than to fit an identity mapping with a stack of nonlinear layers. Skip connections therefore allow the network to learn identity-like mappings more easily.

Note that in practice, Bottleneck Residual Blocks are used for deeper ResNets such as ResNet-50 and ResNet-101: their 1x1 convolutions reduce and then restore the channel dimension around a 3x3 convolution, making the block less computationally expensive than stacking 3x3 convolutions at full width. A sketch follows below.
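For illustration, here is a hedged sketch of a bottleneck block in the same style. The 4x channel expansion and the use of a 1x1 projection shortcut when dimensions change follow the common ResNet-50 convention, but the specific sizes are assumptions made for this example.

```python
import torch
import torch.nn as nn


class BottleneckResidualBlock(nn.Module):
    """Sketch of a bottleneck block: 1x1 reduce -> 3x3 -> 1x1 expand, plus the shortcut."""

    expansion = 4  # output width = bottleneck width * expansion (ResNet-50 convention, assumed)

    def __init__(self, in_channels: int, bottleneck_channels: int):
        super().__init__()
        out_channels = bottleneck_channels * self.expansion
        self.reduce = nn.Conv2d(in_channels, bottleneck_channels, kernel_size=1, bias=False)
        self.bn1 = nn.BatchNorm2d(bottleneck_channels)
        self.conv = nn.Conv2d(bottleneck_channels, bottleneck_channels, kernel_size=3,
                              padding=1, bias=False)
        self.bn2 = nn.BatchNorm2d(bottleneck_channels)
        self.expand = nn.Conv2d(bottleneck_channels, out_channels, kernel_size=1, bias=False)
        self.bn3 = nn.BatchNorm2d(out_channels)
        self.relu = nn.ReLU(inplace=True)
        # Projection shortcut (1x1 conv) when the input and output widths differ,
        # otherwise a plain identity skip connection.
        if in_channels == out_channels:
            self.shortcut = nn.Identity()
        else:
            self.shortcut = nn.Sequential(
                nn.Conv2d(in_channels, out_channels, kernel_size=1, bias=False),
                nn.BatchNorm2d(out_channels),
            )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        out = self.relu(self.bn1(self.reduce(x)))
        out = self.relu(self.bn2(self.conv(out)))
        out = self.bn3(self.expand(out))
        return self.relu(out + self.shortcut(x))


# Example usage (shapes are illustrative).
block = BottleneckResidualBlock(in_channels=64, bottleneck_channels=64)
y = block(torch.randn(1, 64, 32, 32))  # -> torch.Size([1, 256, 32, 32])
```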

Source: Deep Residual Learning for Image Recognition

Tasks


| Task | Papers | Share |
|---|---:|---:|
| Image Classification | 40 | 5.86% |
| Self-Supervised Learning | 36 | 5.27% |
| Classification | 27 | 3.95% |
| Image Generation | 24 | 3.51% |
| Semantic Segmentation | 19 | 2.78% |
| Image-to-Image Translation | 16 | 2.34% |
| Super-Resolution | 15 | 2.20% |
| Translation | 14 | 2.05% |
| Denoising | 12 | 1.76% |
