Boltzmann's entropy formula
disul-e dargâšt-e Boltzmann
Fr.: formule d'entropie de Boltzmann
In → statistical thermodynamics, an equation relating the → entropy S of an → ideal gas to the quantity Ω, the number of → microstates corresponding to a given → macrostate: S = k ln Ω, where k is the → Boltzmann constant. Same as → Boltzmann's relation.
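The formula can be evaluated directly; the snippet below is a minimal illustration (the function name is our choice, and the value of k is the exact SI value of the Boltzmann constant):

```python
import math

# Boltzmann constant in J/K (exact in the 2019 SI redefinition).
k_B = 1.380649e-23

def boltzmann_entropy(omega: float) -> float:
    """Entropy S = k ln(Omega) of a macrostate realized by omega microstates."""
    if omega < 1:
        raise ValueError("the number of microstates must be >= 1")
    return k_B * math.log(omega)

# A macrostate with a single microstate has zero entropy,
# and entropy grows only logarithmically with Omega.
print(boltzmann_entropy(1))    # -> 0.0
print(boltzmann_entropy(1e23))
```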
1) A measure of the energy that is not available for work during a → thermodynamic process. It is defined by dS = dQ/T, where dS is the differential change in entropy, dQ is the differential amount of heat introduced to the system in a → reversible process, and T the → absolute temperature of the system. Entropy remains constant during → reversible processes and increases during → irreversible processes, without ever decreasing. According to the → second law of thermodynamics, an → isolated system evolves toward a state of maximum entropy. See also → Maxwell's demon.
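As a worked instance of dS = dQ/T, consider the reversible isothermal expansion of an ideal gas, for which dQ = nRT dV/V and the integral gives ΔS = nR ln(V2/V1). A minimal numerical sketch (the function name is illustrative):

```python
import math

R = 8.314462618  # molar gas constant, J/(mol K)

def delta_S_isothermal(n_mol: float, V1: float, V2: float) -> float:
    """Entropy change (J/K) for a reversible isothermal expansion of an
    ideal gas: integrating dS = dQ/T with dQ = n R T dV/V gives
    Delta S = n R ln(V2/V1)."""
    return n_mol * R * math.log(V2 / V1)

# Doubling the volume of one mole: Delta S = R ln 2, about 5.76 J/K.
print(delta_S_isothermal(1.0, 1.0, 2.0))
```

Note that compression (V2 < V1) gives a negative ΔS for the gas itself; the second law is not violated because the surroundings, which absorb the heat, gain at least as much entropy.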
From Ger. Entropie, coined 1865 by physicist Rudolf Clausius (1822-1888) from Gk. entropia "a turning toward," from en- "in" + trope "a turning, change," related to tropos "a turn, way, manner," from trepein "to turn," from PIE base *trep- "to turn" (cf. L. trepit "he turns").
Dargâšt, from dar "in" + gâšt present stem of gâštan "to cause to revolve, to turn," transitive of gaštan, variant gardidan "to turn, to change" (Mid.Pers. vartitan; Av. varət- "to turn, revolve;" cf. Skt. vartati; L. vertere; O.H.G. werden "to become;" PIE base *wer- "to turn, bend").
information entropy
Fr.: entropie de l'information
The measure of information content, usually expressed as the average number of bits needed for storage or communication; in other words, the degree to which the values of a → random variable X are dispersed. If the → probability density function of X is P(x), the entropy is defined by: H(X) = -Σ P(x) log P(x). Also called → Shannon entropy.
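The sum H(X) = -Σ P(x) log P(x) is straightforward to compute for a discrete distribution; a minimal sketch (function name and the base-2 default, which yields bits, are our choices):

```python
import math

def shannon_entropy(probs, base=2):
    """H(X) = -sum p log p over the distribution; by convention 0 log 0 = 0.
    With base=2 the result is in bits."""
    if abs(sum(probs) - 1.0) > 1e-9:
        raise ValueError("probabilities must sum to 1")
    return -sum(p * math.log(p, base) for p in probs if p > 0)

print(shannon_entropy([0.5, 0.5]))   # -> 1.0  (a fair coin carries one bit)
print(shannon_entropy([1.0]))        # -> 0.0  (no uncertainty, no information)
print(shannon_entropy([0.25] * 4))   # -> 2.0  (four equiprobable outcomes)
```

The uniform distribution maximizes H for a fixed number of outcomes, mirroring the thermodynamic statement that entropy is largest when the microstates are equally probable.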
maximum entropy method (MEM)
raveš-e dargâšt-e bišiné
Fr.: méthode d'entropie maximum
A deconvolution algorithm that, among all images consistent with the data, selects the one of maximum entropy, which acts as a measure of smoothness. The MEM thus extracts as much information from a measurement as is justified by the data's signal-to-noise ratio, and no more.
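The idea can be sketched in one dimension: among non-negative images x consistent with the blurred data d = A x, prefer the one of maximum entropy H(x) = -Σ x_i ln x_i. The toy below minimizes J(x) = ||A x - d||² - λ H(x) by projected gradient descent; the blur kernel, step size, and regularization weight λ are illustrative assumptions, not part of any standard MEM implementation:

```python
import math

def blur(x):
    """3-point moving-average blur with reflecting edges (a symmetric A)."""
    n = len(x)
    return [(x[max(i - 1, 0)] + x[i] + x[min(i + 1, n - 1)]) / 3.0
            for i in range(n)]

def mem_deconvolve(d, lam=0.01, step=0.05, iters=4000):
    x = [max(di, 1e-6) for di in d]                   # positive starting image
    for _ in range(iters):
        r = [bi - di for bi, di in zip(blur(x), d)]   # residual A x - d
        g_fit = blur(r)                               # A^T r (A is symmetric)
        # -dH/dx_i = ln x_i + 1, so the gradient of J is:
        g = [2.0 * gf + lam * (math.log(xi) + 1.0)
             for gf, xi in zip(g_fit, x)]
        x = [max(xi - step * gi, 1e-9) for xi, gi in zip(x, g)]  # keep x > 0
    return x

# A sharp spike is smeared into the "observed" data; the MEM estimate
# re-concentrates the flux while remaining consistent with the data.
truth = [0.0, 0.0, 1.0, 0.0, 0.0]
data = blur(truth)
est = mem_deconvolve(data)
```

The entropy term both enforces positivity (its gradient diverges as any pixel approaches zero) and penalizes structure the data do not demand, which is why MEM is popular for reconstructing sparsely sampled interferometric images.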
Shannon entropy
Fr.: entropie de Shannon