Softmax function

An activation function that ensures all output values fall within the range $[0, 1]$ and sum to 1.

$$p_i = \frac{e^{y_i}}{\sum_j e^{y_j}}$$
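
For instance, applying the formula to a hypothetical input vector $Y = (1, 2, 3)$:

$$p_1 = \frac{e^{1}}{e^{1} + e^{2} + e^{3}} \approx \frac{2.718}{30.193} \approx 0.090$$

and likewise $p_2 \approx 0.245$ and $p_3 \approx 0.665$, which together sum to 1.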

Alternatively,

$$p_i = s(y_i, \mathbf{y}_{\setminus i}) = \frac{e^{y_i}}{\sum_{k=0}^{K} e^{y_k}}$$

The $\mathbf{y}_{\setminus i}$ notation denotes every element of vector $Y$ except $y_i$:

$$\mathbf{y}_{\setminus i} = (y_0, \ldots, y_{i-1}, y_{i+1}, \ldots, y_K)$$

This results in a transformation of the vector of values from

$$Y = (y_0, y_1, \ldots, y_K)$$

into:

$$P = (p_0, p_1, \ldots, p_K)$$

where $\forall i$, $p_i \geq 0$ and $\sum_{i=0}^{K} p_i = 1$.

```python
import numpy as np

def softmax(x):
    # Subtract the maximum value from each element of the input vector x
    # to avoid numerical overflow; this is optional and does not change
    # the result, since softmax is invariant to shifting all inputs
    x = x - np.max(x)
    # Exponentiate each element
    exp_x = np.exp(x)
    # Normalize the exponentiated values by their sum so they total 1
    return exp_x / np.sum(exp_x)
```

When using softmax for multi-class classification, we take the index of the highest value in the output vector $P$ as the predicted class.
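
As a quick sanity check, here is a minimal usage sketch; the logits below are made-up values:

```python
logits = np.array([2.0, 1.0, 0.1])  # hypothetical raw model outputs
probs = softmax(logits)

print(probs)             # [0.659 0.242 0.099]
print(np.sum(probs))     # 1.0 (within floating-point precision)
print(np.argmax(probs))  # 0 -> index of the predicted class
```

Subtracting the maximum inside `softmax` keeps `np.exp` from overflowing for large logits while leaving the output ratios unchanged.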