Kullback-Leibler divergence, often abbreviated as KL divergence, is a measure from information theory that quantifies how one probability distribution P diverges from a second, reference distribution Q. It is widely used in statistics, machine learning, and information theory, for example to measure the information lost when Q is used to approximate P.
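
For discrete distributions P and Q defined on the same sample space, the divergence takes the standard form (written here in LaTeX; the continuous case replaces the sum with an integral over densities):

D_{\mathrm{KL}}(P \| Q) = \sum_{x} P(x) \log \frac{P(x)}{Q(x)}

Note that the divergence is not symmetric: D_{\mathrm{KL}}(P \| Q) generally differs from D_{\mathrm{KL}}(Q \| P), so it is not a true distance metric.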
