The leaky ReLU function is a modified version of the rectified linear function that allows a small, positive gradient when the input is negative. The leaky ReLU function differs from the parametric ReLU function in that the coefficient of the negative slope is fixed (typically at 0.01) rather than learned.
The leaky ReLU function helps to mitigate the "dying ReLU" problem, in which units that only receive negative inputs get zero gradient under the rectified linear function and stop updating.
The leaky ReLU function can be defined using the following piecewise function:

$$
f(x) = \begin{cases} x & \text{if } x > 0 \\ 0.01x & \text{otherwise} \end{cases}
$$
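This piecewise definition can be sketched in a few lines of NumPy (the function name, `alpha` parameter, and use of `np.where` are my own choices, not from the original):

```python
import numpy as np

def leaky_relu(x, alpha=0.01):
    # alpha is the fixed negative-slope coefficient (0.01 by default);
    # positive inputs pass through unchanged, negative inputs are scaled
    return np.where(x > 0, x, alpha * x)

print(leaky_relu(np.array([-2.0, 0.0, 3.0])))  # → [-0.02  0.    3.  ]
```

Unlike the plain rectified linear function, negative inputs map to a small nonzero value rather than to zero.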
Derivative

The derivative of the leaky ReLU function is:

$$
f'(x) = \begin{cases} 1 & \text{if } x > 0 \\ 0.01 & \text{if } x < 0 \end{cases}
$$

where $0.01$ is the fixed negative-slope coefficient. This derivative is technically undefined at $x = 0$; in practice, implementations assign one of the one-sided values (commonly $0.01$ or $1$) at that point.