Eroxl's Notes
Parametric ReLU Function

The parametric ReLU (PReLU) is a variant of the rectified linear function that introduces a learnable parameter to control the slope of the negative part of the function.

This allows the network to learn the optimal slope for the negative values of the input, rather than using a fixed slope as in the leaky ReLU function.

The PReLU function can be defined using the following piecewise function:

$$\operatorname{PReLU}(x) = \begin{cases} x & \text{if } x > 0 \\ \alpha x & \text{if } x \le 0 \end{cases}$$

where $\alpha$ is a learnable parameter that can be updated during training.
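The piecewise definition above can be sketched in a few lines of NumPy; the default `alpha=0.25` here is just an illustrative value, not one prescribed by these notes:

```python
import numpy as np

def prelu(x, alpha=0.25):
    """PReLU: identity for positive inputs, alpha * x for negative inputs."""
    # np.where picks x where x > 0, and alpha * x elsewhere (i.e. x <= 0)
    return np.where(x > 0, x, alpha * x)

# Example: with alpha = 0.25, negative inputs are scaled by 0.25
print(prelu(np.array([-2.0, 0.0, 3.0]), alpha=0.25))  # → [-0.5  0.   3. ]
```

In a real network, `alpha` would be a trainable tensor (often one value per channel) updated by backpropagation alongside the weights, rather than a fixed Python argument.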

Plot: PReLU on the interval [-2, 2] for α = 0.01, α = 0.25, and α = 0.50.

Derivative

The derivative of PReLU is:

$$\frac{d}{dx}\operatorname{PReLU}(x) = \begin{cases} 1 & \text{if } x > 0 \\ \alpha & \text{if } x < 0 \end{cases}$$

This derivative is technically undefined at $x = 0$, so in practice it is usually set to either $\alpha$ or $1$ at $x = 0$.
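A minimal sketch of this derivative, assuming the convention that the gradient at $x = 0$ is set to $\alpha$ (the name `prelu_grad` and the default `alpha=0.25` are illustrative choices, not part of the notes):

```python
import numpy as np

def prelu_grad(x, alpha=0.25):
    """Derivative of PReLU: 1 for x > 0, alpha for x < 0.

    At x == 0 the derivative is undefined; this sketch follows the
    convention of returning alpha there (returning 1 is equally valid).
    """
    return np.where(x > 0, 1.0, alpha)

# Example: gradient is alpha on the negative side, 1 on the positive side
print(prelu_grad(np.array([-1.0, 0.0, 2.0]), alpha=0.5))  # → [0.5 0.5 1. ]
```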