negative log-likelihood loss

Jul 30, 2025

The negative log-likelihood loss (also known as surprisal, or categorical cross-entropy loss):

$$L_i = -\log(p_{y_i})$$

where $p_{y_i}$ is the predicted probability the model assigns to the true class label $y_i$ for the $i$-th example.

This is a special case of cross-entropy loss: with one-hot targets, the full cross-entropy sum collapses to the single term above.
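
For concreteness, here is a minimal NumPy sketch of the loss (assuming raw logits and integer class labels; `nll_loss` is an illustrative name, not a library function):

```python
import numpy as np

def nll_loss(logits, targets):
    """Mean negative log-likelihood loss over a batch.

    logits:  (n_examples, n_classes) raw, unnormalized scores
    targets: (n_examples,) integer indices of the true classes
    """
    # Softmax with the max-subtraction trick for numerical stability
    shifted = logits - logits.max(axis=1, keepdims=True)
    exp = np.exp(shifted)
    probs = exp / exp.sum(axis=1, keepdims=True)
    # L_i = -log(p_{y_i}): the probability assigned to each true class
    per_example = -np.log(probs[np.arange(len(targets)), targets])
    return per_example.mean()

logits = np.array([[2.0, 1.0, 0.1],
                   [0.5, 2.5, 0.3]])
targets = np.array([0, 1])
print(nll_loss(logits, targets))  # mean L_i over the two examples
```

The higher the predicted probability of the true class, the lower the loss; a confident wrong prediction is penalized heavily, since $-\log p \to \infty$ as $p \to 0$.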


References

Understanding softmax and the negative log-likelihood



