entropy
(noun)
A measure of the expected value of the information contained in a message.
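For a discrete random variable $X$ taking values $x$ with probability $p(x)$, this expected value is the standard Shannon entropy (written here for reference):

$$ H(X) = \mathbb{E}\left[-\log_2 p(X)\right] = -\sum_{x} p(x)\,\log_2 p(x). $$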
Examples of entropy in the following topics:
- The Uniform Distribution
  - It is the maximum entropy probability distribution for a random variate $X$ under no constraint other than that it is contained in the distribution's support.
- Probability Histograms and the Normal Curve
  - More generally, velocities of the particles in any system in thermodynamic equilibrium will have a normal distribution, due to the maximum entropy principle.
- The Normal Distribution
  - It is also the continuous distribution with the maximum entropy for a given mean and variance; the sketch after this list illustrates this and the uniform-distribution claim above.
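Both maximum-entropy claims can be checked directly. The sketch below (the comparison distributions and numbers are illustrative choices, not taken from the topics above) computes the Shannon entropy of a uniform versus a non-uniform distribution on the same support, and compares the differential entropy of a normal distribution with that of a Laplace distribution having the same variance:

```python
import numpy as np

def shannon_entropy(p):
    """Shannon entropy, in bits, of a discrete probability vector p."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]                       # convention: 0 * log(0) = 0
    return -np.sum(p * np.log2(p))

# Discrete case: on a fixed support, the uniform distribution maximizes entropy.
uniform = np.full(6, 1 / 6)                            # fair six-sided die
skewed = np.array([0.5, 0.2, 0.1, 0.1, 0.05, 0.05])    # same support, unequal weights
print(shannon_entropy(uniform))   # log2(6) ~ 2.585 bits -- the maximum
print(shannon_entropy(skewed))    # ~ 2.061 bits -- strictly smaller

# Continuous case: for a fixed variance sigma^2, the normal distribution
# maximizes differential entropy, which equals 0.5 * ln(2 * pi * e * sigma^2).
sigma2 = 1.0
normal_entropy = 0.5 * np.log(2 * np.pi * np.e * sigma2)
# A Laplace distribution with the same variance (variance = 2 * b**2)
# has differential entropy 1 + ln(2 * b), which is smaller.
b = np.sqrt(sigma2 / 2)
laplace_entropy = 1 + np.log(2 * b)
print(normal_entropy, laplace_entropy)   # ~ 1.419 nats > ~ 1.347 nats
```

In both comparisons the maximum-entropy distribution (uniform in the discrete case, normal in the continuous case) has the larger entropy, consistent with the statements above.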