Fr.: entropie de l'information
The measure of information content, usually expressed as the average number of bits needed for storage or communication; equivalently, the degree to which the values of a → random variable X are dispersed. If X takes its values with probability distribution P(x), the entropy is defined by: H(X) = -Σ P(x) log P(x), where the sum runs over all possible values x (for a continuous variable, with → probability density function p(x), the sum is replaced by an integral). Also called → Shannon entropy.
→ information; → entropy.
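As an illustration beyond the entry itself, the defining sum can be evaluated directly for a discrete distribution; the helper name `shannon_entropy` below is hypothetical, and the base-2 logarithm is chosen so the result is in bits:

```python
import math

def shannon_entropy(probs):
    """Shannon entropy H(X) = -sum of P(x) * log2 P(x), in bits.

    Zero-probability outcomes contribute nothing, since
    p * log p -> 0 as p -> 0.
    """
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin is maximally uncertain: 1 bit per toss.
print(shannon_entropy([0.5, 0.5]))  # 1.0
# A biased coin is more predictable, so its entropy is lower.
print(shannon_entropy([0.9, 0.1]))
```

A uniform distribution over n outcomes attains the maximum value log2(n); a certain outcome (probability 1) has entropy 0.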