The Shifted Rectified Linear Unit, or ShiLU, is a modification of the ReLU activation function that adds trainable parameters.
$$\text{ShiLU}(x) = \alpha\,\text{ReLU}(x) + \beta$$

where $\alpha$ and $\beta$ are trainable parameters that scale and shift the ReLU output.
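A minimal PyTorch sketch of this definition might look like the following. The choice of scalar (per-layer) parameters and the initialization $\alpha = 1$, $\beta = 0$ (so the module starts out as plain ReLU) are assumptions, not specified by the formula above:

```python
import torch
import torch.nn as nn


class ShiLU(nn.Module):
    """ShiLU activation: alpha * ReLU(x) + beta, with trainable alpha and beta."""

    def __init__(self, alpha: float = 1.0, beta: float = 0.0):
        super().__init__()
        # Trainable scale and shift; defaults make ShiLU start as plain ReLU.
        # Scalar (per-layer) parameters are an assumption here; per-channel
        # parameters would be a straightforward variation.
        self.alpha = nn.Parameter(torch.tensor(alpha))
        self.beta = nn.Parameter(torch.tensor(beta))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.alpha * torch.relu(x) + self.beta
```

Used as a drop-in replacement for `nn.ReLU`, e.g. `y = ShiLU()(torch.randn(4))`; because $\alpha$ and $\beta$ are `nn.Parameter`s, they are updated by the optimizer along with the rest of the network's weights.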