DY: Division Dynamics and Statistical Physics (Fachverband Dynamik und Statistische Physik)
DY 47: Statistical Physics: General I
DY 47.11: Talk
Thursday, March 12, 2026, 12:15–12:30, ZEU/0114
Temperature as joules per bit — Charles Alexandre Bédard1, •Sophie Berthelette2, Xavier Coiteux-Roy3, and Stefan Wolf2,4 — 1École de technologie supérieure, Montreal, Canada — 2Università della Svizzera italiana, Lugano, Switzerland — 3University of Calgary, Calgary, Canada — 4Facoltà indipendente di Gandria, Gandria, Switzerland
In statistical mechanics, entropy is defined as a fundamental quantity. However, its unit, J/K, involves that of temperature, a quantity that is itself defined only subsequently, and defined in terms of entropy. This circularity arises from the introduction of Boltzmann's constant into the very expression of entropy. The J/K carried by the constant prevents entropy from finding a unit of its own while simultaneously obscuring its informational nature. Following the precepts of information theory, we argue that entropy is well measured in bits and coincides with information capacity at thermodynamic equilibrium. Consequently, the temperature of a system in equilibrium is not only expressed in J/bit but also acquires a clear meaning: it is the cost in energy to increase the system's information capacity by 1 bit. Viewing temperature as joules per bit uncovers the strong duality, exhibited by Gibbs long ago, between available capacity and free energy. It also simplifies Landauer's cost and clarifies that it is a cost of displacement, not of erasure. Replacing the kelvin with the bit as an SI unit would remove Boltzmann's constant from the seven defining constants.
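The conversion the abstract proposes can be made concrete with a short numerical sketch. The factor relating kelvin to joules per bit is k_B ln 2, which is also the familiar Landauer bound k_B T ln 2 per bit; the function name below is illustrative, not taken from the abstract.

```python
from math import log

# Boltzmann's constant in J/K (exact value fixed by the 2019 SI revision)
K_B = 1.380649e-23

def temperature_in_joules_per_bit(t_kelvin: float) -> float:
    """Convert a temperature from kelvin to joules per bit.

    If entropy is measured in bits rather than J/K, the conversion
    factor is k_B * ln(2): the energy required to raise a system's
    information capacity by one bit at temperature T.  Numerically,
    this coincides with the Landauer bound k_B * T * ln(2).
    """
    return K_B * log(2) * t_kelvin

# At room temperature (300 K), one bit of capacity costs ~2.87e-21 J.
room_temperature = temperature_in_joules_per_bit(300.0)
```

On this view, quoting a temperature of roughly 2.87 zJ/bit instead of 300 K states directly what the environment charges, in energy, per bit of capacity.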
Keywords: International system of units (SI); Entropy; Temperature; Information theory; Landauer's principle