Joint entropy is a concept in information theory that quantifies the amount of uncertainty (or entropy) associated with a pair of random variables.
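For discrete random variables $X$ and $Y$ with joint distribution $p(x, y)$, the joint entropy is $H(X, Y) = -\sum_{x, y} p(x, y) \log_2 p(x, y)$ (in bits). A minimal sketch in Python (the function name and probability table are illustrative, not from the source):

```python
import math

def joint_entropy(p_xy):
    """Joint entropy H(X, Y) in bits for a joint probability table.

    p_xy is a 2D list where p_xy[i][j] = P(X = x_i, Y = y_j).
    Zero-probability cells are skipped, since p log p -> 0 as p -> 0.
    """
    return -sum(p * math.log2(p)
                for row in p_xy
                for p in row
                if p > 0)

# Example: two independent fair coin flips.
# Each outcome pair has probability 1/4, so H(X, Y) = 2 bits.
p = [[0.25, 0.25],
     [0.25, 0.25]]
print(joint_entropy(p))  # → 2.0
```

For independent variables the example reduces to $H(X, Y) = H(X) + H(Y)$; in general $H(X, Y) \le H(X) + H(Y)$, with equality exactly when $X$ and $Y$ are independent.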
