The Sigmoid Activation is a type of activation function for neural networks:
$$f(x) = \frac{1}{1+\exp(-x)}$$
Some drawbacks of this activation that have been noted in the literature are: sharply damped gradients during backpropagation from the deeper hidden layers back toward the inputs, gradient saturation, and slow convergence.
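These drawbacks follow from the derivative $f'(x) = f(x)\left(1-f(x)\right)$, which peaks at $0.25$ when $x=0$ and decays toward zero as $|x|$ grows, so gradients shrink geometrically when multiplied through many sigmoid layers. A minimal NumPy sketch (function names are illustrative, not from any particular library) that makes the saturation visible:

```python
import numpy as np

def sigmoid(x):
    """Sigmoid activation: f(x) = 1 / (1 + exp(-x))."""
    return 1.0 / (1.0 + np.exp(-x))

def sigmoid_grad(x):
    """Derivative f'(x) = f(x) * (1 - f(x)); at most 0.25, attained at x = 0."""
    s = sigmoid(x)
    return s * (1.0 - s)

# Gradient saturation: away from zero the derivative vanishes, so
# backpropagating through many sigmoid layers damps the gradient.
for x in [0.0, 2.0, 5.0, 10.0]:
    print(f"x={x:5.1f}  f(x)={sigmoid(x):.5f}  f'(x)={sigmoid_grad(x):.3e}")
```

Running this shows $f'(10) \approx 4.5\times10^{-5}$, illustrating why inputs far from zero contribute almost no gradient signal.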
Task | Papers | Share
---|---|---
Classification | 20 | 2.78%
Language Modelling | 20 | 2.78%
Sentence | 18 | 2.50%
Image Classification | 17 | 2.36%
Decision Making | 16 | 2.22%
Management | 16 | 2.22%
Time Series Forecasting | 16 | 2.22%
Image-to-Image Translation | 15 | 2.08%
Image Generation | 14 | 1.94%