information entropy dargâšt-e azdâyeš Fr.: entropie de l'information The measure of information, usually expressed
as the average number of bits needed for storage or communication.
In other words, the degree of uncertainty in the values of a
→ random variable X. See also: → information; → entropy.
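The "average number of bits" in the definition above is Shannon's entropy, H(X) = −Σ p(x) log₂ p(x). As a minimal illustration (the function name and examples here are illustrative, not part of the entry):

```python
from math import log2

def shannon_entropy(probabilities):
    """Shannon entropy H(X) = -sum(p * log2(p)), in bits.

    Terms with p == 0 contribute nothing and are skipped.
    """
    return -sum(p * log2(p) for p in probabilities if p > 0)

# A fair coin is maximally uncertain: 1 bit per outcome.
print(shannon_entropy([0.5, 0.5]))  # 1.0

# A biased coin is more predictable, so it carries less
# information per outcome (between 0 and 1 bit).
print(shannon_entropy([0.9, 0.1]))
```

A deterministic variable (one outcome with probability 1) has entropy 0: its value is fully predictable, so no bits are needed.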