Log-likelihood (cross-entropy) function
Defined as (for a single example with true label y and predicted probability ŷ):

    L(y, ŷ) = y · log(ŷ) + (1 − y) · log(1 − ŷ)
Used as the loss function for logistic regression.
When y = 1 it reduces to log(ŷ), and when y = 0 it reduces to log(1 − ŷ). Since the log of a probability is always negative, and more negative the further the prediction is from the true label, we would like to maximise this quantity (bringing it closer to zero). But since by convention we prefer to minimise loss functions, we multiply it by −1 and minimise the negative log-likelihood instead.
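A quick numeric check of this behaviour (probability values chosen purely for illustration):

```python
import numpy as np

# Single example with true label y = 1: the log-likelihood
# y*log(p) + (1-y)*log(1-p) reduces to log(p), which is negative
# and closer to zero the better the prediction.
confident_correct = np.log(0.9)  # about -0.105
confident_wrong = np.log(0.1)    # about -2.303

# Multiplying by -1 turns these into losses of ~0.105 and ~2.303,
# so minimising the negated quantity rewards correct predictions.
print(-confident_correct, -confident_wrong)
```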
import numpy as np

def log_likelihood_loss(a, b, x, y):
    # Negative log-likelihood (binary cross-entropy), averaged over samples
    pred = logistic_regression(x, a, b)
    return -1 * np.mean(y * np.log(pred) + (1 - y) * np.log(1 - pred))
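The `logistic_regression` helper is not defined in this snippet; here is a minimal self-contained sketch (repeating the loss for completeness), assuming it is the sigmoid of a linear model a·x + b, matching the `(x, a, b)` signature used above:

```python
import numpy as np

def logistic_regression(x, a, b):
    # Assumed form: linear model a*x + b squashed through the
    # sigmoid to give P(y = 1 | x) in the open interval (0, 1).
    return 1 / (1 + np.exp(-(a * x + b)))

def log_likelihood_loss(a, b, x, y):
    # Negative log-likelihood (binary cross-entropy), averaged over samples
    pred = logistic_regression(x, a, b)
    return -1 * np.mean(y * np.log(pred) + (1 - y) * np.log(1 - pred))

x = np.array([-2.0, -1.0, 1.0, 2.0])
y = np.array([0, 0, 1, 1])
good = log_likelihood_loss(2.0, 0.0, x, y)   # predictions agree with labels
bad = log_likelihood_loss(-2.0, 0.0, x, y)   # predictions oppose labels
# A well-fitting (a, b) gives a loss near 0; a poorly fitting one a large loss.
print(good, bad)
```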