DELU is an activation function with a trainable parameter: it applies SiLU in the negative domain and a combination of linear and exponential terms in the positive domain.
$$\mathrm{DELU}(x) = \begin{cases} \mathrm{SiLU}(x), & x \leqslant 0 \\ (n + 0.5)x + |e^{-x} - 1|, & x > 0 \end{cases}$$
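The definition above can be sketched numerically as follows. This is a minimal NumPy illustration, not the paper's implementation: in a training framework `n` would be a learnable parameter, and the initial value of 1.0 here is an assumption.

```python
import numpy as np

def silu(x):
    # SiLU (swish): x * sigmoid(x)
    return x / (1.0 + np.exp(-x))

def delu(x, n=1.0):
    """DELU: SiLU(x) for x <= 0, (n + 0.5)*x + |e^(-x) - 1| for x > 0.
    `n` is the trainable parameter; the init value 1.0 is an assumption."""
    x = np.asarray(x, dtype=float)
    return np.where(x <= 0, silu(x), (n + 0.5) * x + np.abs(np.exp(-x) - 1.0))
```

Note that both branches equal 0 at `x = 0`, so the function is continuous there regardless of `n`.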
Source: Trainable Activations for Image Classification