Macroscopically, the ideal gas law states that, for an ideal gas, the product of pressure p and volume V is proportional to the product of amount of substance n and absolute temperature T: pV = nRT, where R is the molar gas constant (8.31446261815324 J⋅K⁻¹⋅mol⁻¹). Introducing the Boltzmann constant k = R/N_\mathrm{A} as the gas constant per molecule (N_\mathrm{A} being the Avogadro constant) transforms the ideal gas law into an alternative form: p V = N k T, where N is the number of molecules of gas.
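The equivalence of the two forms can be checked numerically, since R = N_A k. The function names below are illustrative only, a minimal sketch rather than any standard library API:

```python
# Sketch: the two forms of the ideal gas law, pV = nRT and pV = NkT,
# give identical pressures because R = N_A * k.

K_B = 1.380649e-23    # Boltzmann constant, J/K (exact, 2019 SI)
N_A = 6.02214076e23   # Avogadro constant, 1/mol (exact, 2019 SI)
R = K_B * N_A         # molar gas constant, J/(mol*K)

def pressure_molar(n_mol, T, V):
    """Pressure from the macroscopic form p = nRT/V."""
    return n_mol * R * T / V

def pressure_molecular(N_molecules, T, V):
    """Pressure from the per-molecule form p = NkT/V."""
    return N_molecules * K_B * T / V

# Example: 1 mol of gas at 273.15 K in 22.4 L (roughly 1 atm)
n, T, V = 1.0, 273.15, 22.4e-3
p1 = pressure_molar(n, T, V)
p2 = pressure_molecular(n * N_A, T, V)
print(f"p (molar form):     {p1:.0f} Pa")
print(f"p (molecular form): {p2:.0f} Pa")
```

Both calls return the same pressure, close to standard atmospheric pressure, which is the numerical content of the identity R = N_A k.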
== Role in the equipartition of energy ==
Given a thermodynamic system at an absolute temperature T, the average thermal energy carried by each microscopic degree of freedom in the system is \tfrac{1}{2} k T (i.e., about 2.07 \times 10^{-21}\ \mathrm{J}, or 0.013 eV, at room temperature). This is generally true only for classical systems with a large number of particles.

In classical statistical mechanics, this average is predicted to hold exactly for homogeneous ideal gases. Monatomic ideal gases (the six noble gases) possess three degrees of freedom per atom, corresponding to the three spatial directions. According to the equipartition of energy this means that there is a thermal energy of \tfrac{3}{2} k T per atom. This corresponds very well with experimental data. The thermal energy can be used to calculate the root-mean-square speed of the atoms, which turns out to be inversely proportional to the square root of the atomic mass. The root-mean-square speeds found at room temperature accurately reflect this, ranging from about 1370 m/s for helium down to about 240 m/s for xenon.
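The quoted speeds follow directly from equipartition: setting \tfrac{1}{2} m \overline{v^2} = \tfrac{3}{2} k T gives v_\mathrm{rms} = \sqrt{3kT/m}. A short sketch (the atomic masses are approximate values in unified atomic mass units):

```python
# Sketch: root-mean-square speed from equipartition, v_rms = sqrt(3kT/m).
import math

K_B = 1.380649e-23     # Boltzmann constant, J/K
U = 1.66053906660e-27  # atomic mass unit, kg

def v_rms(mass_u, T=300.0):
    """RMS speed (m/s) of an atom of mass mass_u (in u) at temperature T (K)."""
    return math.sqrt(3 * K_B * T / (mass_u * U))

for name, m in [("helium", 4.0026), ("xenon", 131.293)]:
    print(f"{name}: v_rms ~ {v_rms(m):.0f} m/s")
```

The inverse-square-root dependence on mass is visible here: xenon is about 33 times heavier than helium, so its rms speed is lower by a factor of roughly √33 ≈ 5.7.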
Kinetic theory gives the average pressure p for an ideal gas as p = \frac{1}{3}\frac{N}{V} m \overline{v^2}. Combination with the ideal gas law p V = N k T shows that the average translational kinetic energy is \tfrac{1}{2}m \overline{v^2} = \tfrac{3}{2} k T. Considering that the translational motion velocity vector has three degrees of freedom (one for each dimension) gives the average energy per degree of freedom equal to one third of that, i.e. \tfrac{1}{2} k T.

The ideal gas equation is also obeyed closely by molecular gases; but the form for the heat capacity is more complicated, because the molecules possess additional internal degrees of freedom, as well as the three degrees of freedom for movement of the molecule as a whole. Diatomic gases, for example, possess a total of six degrees of simple freedom per molecule that are related to atomic motion (three translational, two rotational, and one vibrational). At lower temperatures, not all these degrees of freedom may fully participate in the gas heat capacity, due to quantum mechanical limits on the availability of excited states at the relevant thermal energy per molecule.
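Since each active degree of freedom contributes \tfrac{1}{2} k T of thermal energy, the molar heat capacity at constant volume is C_V = \tfrac{f}{2} R for f active degrees of freedom. The sketch below illustrates this; treating exactly the translational and rotational modes as active for a room-temperature diatomic gas (with vibration frozen out) is an assumption made here for illustration:

```python
# Sketch: equipartition gives each active degree of freedom kT/2 of thermal
# energy per molecule, so the molar heat capacity is C_V = (f/2) R.
# Which modes are "active" is an assumption: at room temperature the
# vibrational mode of a typical diatomic gas is largely frozen out.

R = 8.31446261815324  # molar gas constant, J/(mol*K)

def c_v_molar(f_active):
    """Molar heat capacity at constant volume, J/(mol*K), for f active DOF."""
    return f_active / 2 * R

print(f"monatomic (f=3):              C_V ~ {c_v_molar(3):.2f} J/(mol*K)")
print(f"diatomic, vibration frozen (f=5): C_V ~ {c_v_molar(5):.2f} J/(mol*K)")
```

The f=5 value of about 20.8 J/(mol⋅K) is close to the measured room-temperature heat capacity of common diatomic gases such as nitrogen, consistent with the quantum freeze-out described above.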
== Role in Boltzmann factors ==
More generally, systems in equilibrium at temperature T have probability P_i of occupying a state i with energy E weighted by the corresponding Boltzmann factor: P_i \propto \frac{\exp\left(-\frac{E}{k T}\right)}{Z}, where Z is the partition function. Again, it is the energy-like quantity k T that takes central importance. Consequences of this include (in addition to the results for ideal gases above) the Arrhenius equation in chemical kinetics.
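As a minimal sketch of Boltzmann weighting, consider a hypothetical two-level system (the 0.1 eV level spacing is an assumption chosen for illustration):

```python
# Sketch: occupation probabilities from Boltzmann factors,
# P_i = exp(-E_i / kT) / Z, where Z is the partition function.
import math

K_B = 1.380649e-23    # Boltzmann constant, J/K
EV = 1.602176634e-19  # 1 eV in joules

def boltzmann_probs(energies_ev, T):
    """Normalized Boltzmann probabilities for states with energies in eV."""
    weights = [math.exp(-E * EV / (K_B * T)) for E in energies_ev]
    Z = sum(weights)  # partition function (normalization)
    return [w / Z for w in weights]

# Hypothetical two-level system: ground state at 0 eV,
# excited state 0.1 eV above it, at room temperature.
p0, p1 = boltzmann_probs([0.0, 0.1], 300.0)
print(f"P(ground) ~ {p0:.4f}, P(excited) ~ {p1:.4f}")
```

Because 0.1 eV is several times kT at room temperature (about 0.026 eV), the excited state is only sparsely occupied, which is the same exponential suppression that underlies the Arrhenius equation.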
== Role in the statistical definition of entropy ==
In statistical mechanics, the entropy S of an isolated system at thermodynamic equilibrium is defined as proportional to the natural logarithm of W, the number of distinct microscopic states available to the system given the macroscopic constraints (such as a fixed total energy E): S = k \,\ln W. This equation, which relates the microscopic details, or microstates, of the system (via W) to its macroscopic state (via the entropy S), is the central idea of statistical mechanics. Such is its importance that it is inscribed on Boltzmann's tombstone. The constant of proportionality k serves to make the statistical mechanical entropy equal to the classical thermodynamic entropy of Clausius: \Delta S = \int \frac{\mathrm{d}Q}{T}. One could choose instead a rescaled dimensionless entropy in microscopic terms such that {S' = \ln W}, \quad \Delta S' = \int \frac{\mathrm{d}Q}{k T}. This is a more natural form, and this rescaled entropy corresponds exactly to Shannon's information entropy. The characteristic energy k T is thus the energy required to increase the rescaled entropy by one nat.
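The relation between the two entropies is just the factor k. A short sketch, using a toy system of N independent two-state spins (an assumption for illustration, giving W = 2^N microstates):

```python
# Sketch: statistical entropy S = k ln W and the rescaled, dimensionless
# entropy S' = ln W (measured in nats). The spin system is a toy example.
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K

def entropy(W):
    """Thermodynamic entropy S = k ln W, in J/K."""
    return K_B * math.log(W)

def rescaled_entropy(W):
    """Dimensionless entropy S' = ln W, in nats."""
    return math.log(W)

N = 100          # number of independent two-state spins (toy example)
W = 2 ** N       # number of microstates
print(f"S  = {entropy(W):.3e} J/K")
print(f"S' = {rescaled_entropy(W):.2f} nats (= {N} * ln 2)")
```

Each spin contributes ln 2 ≈ 0.693 nats to S', illustrating that the nat and the more familiar bit differ only by a factor of ln 2, just as S and S' differ only by the factor k.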
== Thermal voltage ==
In semiconductors, the Shockley diode equation—the relationship between the flow of electric current and the electrostatic potential across a p–n junction—depends on a characteristic voltage called the thermal voltage, denoted V_\mathrm{T}. The thermal voltage depends on absolute temperature T as V_\mathrm{T} = { k T \over q } = { R T \over F }, where q is the magnitude of the electrical charge on the electron, with a value of 1.602176634 \times 10^{-19}\ \mathrm{C}, and F = N_\mathrm{A} q is the Faraday constant. Equivalently, { V_\mathrm{T} \over T } = { k \over q } \approx 8.617333262 \times 10^{-5}\ \mathrm{V/K}. At room temperature (300 K), V_\mathrm{T} is approximately 25.85 mV, which can be derived by plugging in the values as follows: V_\mathrm{T}={kT \over q} =\frac{1.38\times 10^{-23}\ \mathrm{J{\cdot}K^{-1}} \times 300\ \mathrm{K}}{1.6 \times 10^{-19}\ \mathrm{C}} \simeq 25.85\ \mathrm{mV}. At the standard state temperature of 298.15 K (25 °C), it is approximately 25.69 mV. The thermal voltage is also important in plasmas and electrolyte solutions (e.g. the Nernst equation); in both cases it provides a measure of how much the spatial distribution of electrons or ions is affected by a boundary held at a fixed voltage.

== History ==