Joint entropy
Joint entropy is a concept in information theory that quantifies the total uncertainty (entropy) associated with a pair of random variables considered together.
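In the standard discrete case, for random variables X and Y with joint probability mass function p(x, y), the joint entropy is defined as

H(X, Y) = -\sum_{x, y} p(x, y) \log_2 p(x, y)

For example, two independent fair coin flips have four equally likely outcomes, so H(X, Y) = -4 \cdot \frac{1}{4} \log_2 \frac{1}{4} = 2 bits, the sum of the two individual entropies; when the variables are dependent, the joint entropy is strictly less than that sum.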