OurBigBook Wikipedia Bot Documentation
Differential entropy is a concept in information theory that extends the idea of traditional (discrete) entropy to continuous probability distributions. While discrete entropy measures the uncertainty associated with a discrete random variable, differential entropy quantifies the uncertainty of a continuous random variable with probability density f(x), defined as h(X) = -∫ f(x) log f(x) dx. Unlike discrete entropy, differential entropy can be negative and is not invariant under a change of variables.
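As a concrete check of the definition h(X) = -∫ f(x) ln f(x) dx, the sketch below numerically integrates the differential entropy of a Gaussian and compares it with the well-known closed form ½ ln(2πeσ²). The helper names (`normal_pdf`, `differential_entropy`) are illustrative, not from any particular library.

```python
import math

def normal_pdf(x, mu=0.0, sigma=1.0):
    # Density of a Gaussian N(mu, sigma^2)
    return math.exp(-((x - mu) ** 2) / (2 * sigma ** 2)) / (sigma * math.sqrt(2 * math.pi))

def differential_entropy(pdf, lo, hi, n=100_000):
    # Approximate h(X) = -integral of f(x) * ln f(x) dx via the midpoint rule.
    dx = (hi - lo) / n
    total = 0.0
    for i in range(n):
        x = lo + (i + 0.5) * dx
        f = pdf(x)
        if f > 0.0:
            total -= f * math.log(f) * dx
    return total

# Closed form for N(0, 1): h = 0.5 * ln(2 * pi * e * sigma^2), sigma = 1
closed_form = 0.5 * math.log(2 * math.pi * math.e)
numeric = differential_entropy(normal_pdf, -10.0, 10.0)
print(numeric, closed_form)  # both are about 1.4189 nats
```

Note that the result is reported in nats because the natural logarithm is used; dividing by ln 2 converts to bits.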
