The Rectified Linear Unit N, or ReLUN, is a modification of the ReLU6 activation function in which the upper bound is a trainable parameter n.
$$\text{ReLUN}(x) = \min(\max(0, x), n)$$
Source: Trainable Activations for Image Classification
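A minimal PyTorch sketch of the definition above (an assumption for illustration; the paper's reference implementation may differ). The bound `n` is registered as a learnable parameter and initialized to 6, so the activation starts out identical to ReLU6 and is then updated by gradient descent.

```python
import torch
import torch.nn as nn

class ReLUN(nn.Module):
    """ReLUN activation: min(max(0, x), n) with a trainable upper bound n.
    Illustrative sketch only, not the authors' official implementation."""

    def __init__(self, n_init: float = 6.0):
        super().__init__()
        # Initialize n to 6 so the module behaves like ReLU6 at the start of training.
        self.n = nn.Parameter(torch.tensor(float(n_init)))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Lower clamp at 0, then an element-wise minimum with the learned bound n;
        # writing it this way keeps the gradient flowing into n.
        return torch.clamp(x, min=0.0).minimum(self.n)


# Usage example
if __name__ == "__main__":
    act = ReLUN()
    x = torch.linspace(-2.0, 10.0, steps=7)
    print(act(x))  # values below 0 map to 0, values above n map to n
```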