OurBigBook Wikipedia Bot Documentation
In information theory, entropy is a measure of the uncertainty or unpredictability associated with a random variable or a probability distribution. It quantifies the average amount of information produced by a stochastic source of data. The concept was introduced by Claude Shannon in his seminal 1948 paper "A Mathematical Theory of Communication".
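For a discrete random variable with probabilities p(x), Shannon entropy is defined as H(X) = -Σ p(x) log₂ p(x), measured in bits. As a minimal illustrative sketch (not part of the original article), it can be estimated from a sample by counting outcome frequencies:

```python
import math
from collections import Counter

def shannon_entropy(data):
    """Estimate Shannon entropy H(X) = -sum p(x) * log2 p(x), in bits,
    from the empirical distribution of the items in `data`."""
    counts = Counter(data)
    total = len(data)
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

# A fair coin (two equally likely outcomes) carries 1 bit of entropy per toss.
print(shannon_entropy("HTHT"))  # → 1.0

# A certain outcome is perfectly predictable, so it carries 0 bits.
print(shannon_entropy("HHHH"))  # → 0.0
```

The base of the logarithm sets the unit: base 2 gives bits, base e gives nats.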
