The parametric ReLU (PReLU) is a variant of the rectified linear function that introduces a learnable parameter to control the slope of the negative part of the function.
This allows the network to learn the optimal slope for negative inputs, rather than using a fixed slope as in the leaky ReLU.
The PReLU function can be defined as the following piecewise function:

$$
f(x) = \begin{cases} x & \text{if } x > 0 \\ a x & \text{if } x \le 0 \end{cases}
$$

where $a$ is the learnable parameter controlling the slope for negative inputs.
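As a minimal sketch, the forward pass can be written with NumPy; the default value `a=0.25` is an illustrative choice (in a real network, `a` would be initialized and then learned, often per channel):

```python
import numpy as np

def prelu(x, a=0.25):
    """Parametric ReLU: identity for positive inputs, slope a otherwise.

    a is the learnable parameter; here it is passed in as a plain number
    for illustration rather than tracked by an optimizer.
    """
    return np.where(x > 0, x, a * x)

# Example: negative inputs are scaled by a, positive inputs pass through.
print(prelu(np.array([-2.0, 0.0, 3.0]), a=0.1))  # [-0.2  0.   3. ]
```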
Derivative

The derivative of PReLU with respect to its input is:

$$
f'(x) = \begin{cases} 1 & \text{if } x > 0 \\ a & \text{if } x < 0 \end{cases}
$$

This derivative is technically undefined at $x = 0$, since the left and right slopes differ whenever $a \neq 1$; in practice, implementations simply adopt a convention at that point (using either $a$ or $1$).
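A sketch of that derivative in NumPy, adopting the convention of using the negative-side slope $a$ at $x = 0$ (an assumption for this example; frameworks may choose differently):

```python
import numpy as np

def prelu_grad(x, a=0.25):
    """Derivative of PReLU w.r.t. the input: 1 for x > 0, a for x < 0.

    At x == 0 the derivative is undefined; this sketch assigns the
    negative-side slope a there by convention.
    """
    return np.where(x > 0, 1.0, a)

# Example: gradient is a on the negative side, 1 on the positive side.
print(prelu_grad(np.array([-1.0, 0.0, 2.0]), a=0.1))  # [0.1 0.1 1. ]
```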