LeakyReLU

LeakyReLU is a variation of the ReLU activation function that allows negative inputs to pass through, scaled by a small factor. This can help mitigate the so-called 'dying ReLU problem', which occurs when a unit's inputs from the prior layer are consistently negative: standard ReLU maps them all to zero, so the unit receives no gradient and stops learning. Formally: LeakyReLU(x) = max(0, x) + c * min(0, x), for a fixed small positive constant c.
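
A minimal sketch of this definition in NumPy (the function name, the slope value c = 0.01, and the choice of NumPy are illustrative assumptions, not part of the original entry):

```python
import numpy as np

def leaky_relu(x, c=0.01):
    """LeakyReLU(x) = max(0, x) + c * min(0, x) for a small positive constant c."""
    return np.maximum(0, x) + c * np.minimum(0, x)

# Example: negative inputs pass through scaled by c instead of being zeroed.
x = np.array([-2.0, -0.5, 0.0, 1.5])
print(leaky_relu(x))  # [-0.02  -0.005  0.     1.5  ]
```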
Related concepts:
ReLUActivation