Leaky ReLU function
Leaky rectified linear unit activation function
Similar to ReLU, but it allows a small negative slope for negative input values instead of outputting zero
This helps alleviate the dying ReLU problem, where neurons whose inputs stay negative receive zero gradient and stop learning
import numpy as np

def leaky_ReLU(val, alpha=0.1):
    # Pass positive values through unchanged; scale negative values by alpha
    return np.where(val > 0, val, alpha * val)
y = \max(\alpha x, x), \quad \alpha = 0.1
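A minimal usage sketch, assuming the leaky_ReLU definition above and import numpy as np; the sample input array is illustrative only:

x = np.array([-2.0, -0.5, 0.0, 1.0, 3.0])
# Negative entries are scaled by alpha rather than clipped to zero
print(leaky_ReLU(x))        # prints roughly [-0.2 -0.05 0. 1. 3.]
print(leaky_ReLU(x, 0.01))  # prints roughly [-0.02 -0.005 0. 1. 3.]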