Fig. 1: The size of helium atoms relative to their spacing is shown to scale under 1950 atmospheres of pressure. These room-temperature atoms have a certain average speed (slowed down here two trillion-fold). At any given instant, however, a particular helium atom may be moving much faster than average while another may be nearly motionless. Five atoms are colored red to facilitate following their motions. This animation illustrates
statistical mechanics, which is the science of how the group behavior of a large collection of microscopic objects emerges from the kinetic properties of each individual object.
Nature of kinetic energy, translational motion, and temperature
The thermodynamic temperature of any bulk quantity of a substance (a statistically significant quantity of particles) is directly proportional to the average kinetic energy of a specific kind of particle motion known as translational motion. These simple movements along the x-, y-, and z-axes of space mean the particles move in the three spatial
degrees of freedom. This particular form of kinetic energy is sometimes referred to as
kinetic temperature. Translational motion is but one form of heat energy and is what gives gases not only their temperature, but also their pressure and the vast majority of their volume. This relationship between the temperature, pressure, and volume of gases is established by the
ideal gas law's formula and is embodied in the
gas laws. Though the kinetic energy borne exclusively in the three translational degrees of freedom comprises the thermodynamic temperature of a substance, molecules, as can be seen in
Fig. 3, can have other degrees of freedom, all of which fall under three categories: bond length, bond angle, and rotational. All three additional categories are not necessarily available to all molecules, and even for molecules that
can experience all three, some can be "frozen out" below a certain temperature. Nonetheless, all those degrees of freedom that are available to the molecules under a particular set of conditions contribute to the
specific heat capacity of a substance, which is to say, they increase the amount of heat (kinetic energy) required to raise the temperature of a given amount of the substance by one kelvin or one degree Celsius. The relationship of kinetic energy, mass, and velocity is given by the formula E_\text{k} = \frac{1}{2}mv^2. Accordingly, particles with one unit of mass moving at one unit of velocity have precisely the same kinetic energy, and precisely the same temperature, as those with four times the mass but half the velocity. The Boltzmann constant relates the thermodynamic temperature of a gas to the mean kinetic energy of a particle's translational motion:

\tilde{E} = \frac{3}{2} k_\text{B} T

where:
• \tilde{E} is the mean kinetic energy for an individual particle
• k_\text{B} is the Boltzmann constant
• T is the thermodynamic temperature of the bulk quantity of the substance

While the Boltzmann constant is useful for finding the mean kinetic energy in a sample of particles, it is important to note that even when a substance is isolated and in
thermodynamic equilibrium (all parts are at a uniform temperature and no heat is going into or out of it), the translational motions of individual atoms and molecules occur across a wide range of speeds (see animation in
Fig. 1 above). At any one instant, the proportion of particles moving at a given speed within this range is determined by probability as described by the
Maxwell–Boltzmann distribution. The graph shown here in
Fig. 2 shows the speed distribution of 5500 K helium atoms. They have a
most probable speed of 4.780 km/s (0.2092 s/km). However, a certain proportion of atoms at any given instant are moving faster while others are moving relatively slowly; some are momentarily at a virtual standstill (off the x-axis to the right). This graph uses inverse speed for its x-axis so the shape of the curve can easily be compared to the curves in
Fig. 5 below. In both graphs, zero on the x-axis represents infinite temperature. Additionally, the x- and y-axes on both graphs are scaled proportionally.
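The two quantitative claims above can be checked numerically: the mean translational kinetic energy follows \tilde{E} = \frac{3}{2} k_\text{B} T, and the most probable speed in a Maxwell–Boltzmann distribution is v_p = \sqrt{2 k_\text{B} T / m}. The sketch below (using standard CODATA constant values) reproduces the 4.780 km/s figure quoted for 5500 K helium.

```python
# Mean translational kinetic energy and most probable speed of helium
# atoms at 5500 K, from E = (3/2) k_B T and v_p = sqrt(2 k_B T / m).
import math

K_B = 1.380649e-23          # Boltzmann constant, J/K (exact SI value)
AMU = 1.66053906660e-27     # atomic mass unit, kg
M_HE = 4.002602 * AMU       # mass of a helium-4 atom, kg

T = 5500.0                  # temperature, K

mean_ke = 1.5 * K_B * T                 # mean kinetic energy per atom, J
v_p = math.sqrt(2 * K_B * T / M_HE)     # most probable speed, m/s

print(f"mean KE = {mean_ke:.3e} J")
print(f"v_p     = {v_p / 1000:.3f} km/s")   # ~4.780 km/s, matching Fig. 2
```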
High speeds of translational motion
Although very specialized laboratory equipment is required to directly detect translational motions, the resultant collisions by atoms or molecules with small particles suspended in a
fluid produces
Brownian motion that can be seen with an ordinary microscope. The translational motions of elementary particles are
very fast and temperatures close to
absolute zero are required to directly observe them. For instance, when scientists at the
NIST achieved a record-setting low temperature of 700 nK (billionths of a kelvin) in 1994, they used
optical lattice laser equipment to
adiabatically cool
cesium atoms. They then turned off the entrapment lasers and directly measured atom velocities of 7 mm per second in order to calculate their temperature. Formulas for calculating the velocity and speed of translational motion are given in the following footnote. It is neither difficult to imagine atomic motions due to kinetic temperature, nor to distinguish between such motions and those due to zero-point energy. Consider the following thought experiment, as illustrated in
Fig. 2.5 at left, with an atom that is exceedingly close to absolute zero. Imagine peering through a common optical microscope set to 400 power, which is about the maximum practical magnification for optical microscopes. Such microscopes generally provide fields of view a bit over 0.4 mm in diameter. At the center of the field of view is a single levitated argon atom (argon comprises about 0.93% of air) that is illuminated and glowing against a dark backdrop. If this argon atom were at a beyond-record-setting
one-trillionth of a kelvin above absolute zero, and was moving perpendicular to the field of view towards the right, it would require 13.9 seconds to move from the center of the image to the 200 μm tick mark. As the argon atom slowly moved, the positional jitter due to zero-point energy would be much less than the 200 nm (0.0002 mm) resolution of an optical microscope. Importantly, the atom's translational velocity of 14.43 μm/s constitutes all its retained kinetic energy due to not being precisely at absolute zero. Were the atom
precisely at absolute zero, imperceptible jostling due to zero-point energy would cause it to very slightly wander, but the atom would perpetually be located, on average, at the same spot within the field of view. This is analogous to a boat that has had its motor turned off and is now bobbing slightly in relatively calm and windless ocean waters; even though the boat randomly drifts to and fro, it stays in the same spot in the long term and makes no headway through the water. Accordingly, an atom that was precisely at absolute zero would not be "motionless", and yet, a statistically significant collection of such atoms would have zero net kinetic energy available to transfer to any other collection of atoms. This is because regardless of the kinetic temperature of the second collection of atoms, they too experience the effects of zero-point energy. Such are the consequences of
statistical mechanics and the nature of thermodynamics.
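The figures in the thought experiment can be checked with a short calculation. A speed of 14.43 μm/s for argon at one-trillionth of a kelvin corresponds to the one-dimensional rms velocity \sqrt{k_\text{B} T / m} (an assumption about which velocity measure the text intends), and the 200 μm crossing then takes about 13.9 seconds:

```python
# Check of the thought-experiment numbers: the one-dimensional rms
# velocity of an atom is sqrt(k_B T / m). For argon at 1e-12 K this
# gives ~14.43 um/s, so crossing 200 um takes ~13.9 s.
import math

K_B = 1.380649e-23          # Boltzmann constant, J/K
AMU = 1.66053906660e-27     # atomic mass unit, kg
M_AR = 39.948 * AMU         # mass of an argon atom, kg

T = 1e-12                   # one-trillionth of a kelvin

v = math.sqrt(K_B * T / M_AR)   # 1-D rms velocity, m/s
t = 200e-6 / v                  # time to cross 200 um, s

print(f"v = {v * 1e6:.2f} um/s, t = {t:.1f} s")
```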
Internal motions of molecules and internal energy
As mentioned above, there are other ways molecules can jiggle besides the three translational degrees of freedom that imbue substances with their kinetic temperature. As can be seen in the animation at right,
molecules are complex objects; they are a population of atoms and thermal agitation can strain their internal
chemical bonds in three different ways: via rotation, bond length, and bond angle movements; these are all types of
internal degrees of freedom. This makes molecules distinct from
monatomic substances (consisting of individual atoms) like the
noble gases
helium and
argon, which have only the three translational degrees of freedom (the x-, y-, and z-axes). Kinetic energy is stored in molecules' internal degrees of freedom, which gives them an
internal temperature. Even though these motions are called "internal", the external portions of molecules still move—rather like the jiggling of a stationary
water balloon. This permits the two-way exchange of kinetic energy between internal motions and translational motions with each molecular collision. Accordingly, as internal energy is removed from molecules, both their kinetic temperature (the kinetic energy of translational motion) and their internal temperature simultaneously diminish in equal proportions. This phenomenon is described by the
equipartition theorem, which states that for any bulk quantity of a substance in equilibrium, the kinetic energy of particle motion is evenly distributed among all the active degrees of freedom available to the particles. Since the internal temperature of molecules is usually equal to their kinetic temperature, the distinction is usually of interest only in the detailed study of non-local thermodynamic equilibrium (LTE) phenomena such as
combustion, the
sublimation of solids, and the
diffusion of hot gases in a partial vacuum. The kinetic energy stored internally in molecules causes substances to contain more heat energy at any given temperature and to absorb additional internal energy for a given temperature increase. This is because any kinetic energy that is, at a given instant, bound in internal motions is not contributing to the molecules' translational motions at that same instant. This extra kinetic energy simply increases the amount of internal energy that the substance absorbs for a given temperature rise. This property is known as a substance's
specific heat capacity. Different molecules absorb different amounts of internal energy for each incremental increase in temperature; that is, they have different specific heat capacities. High specific heat capacity arises, in part, because certain substances' molecules possess more internal degrees of freedom than others do. For instance, room-temperature
nitrogen, which is a
diatomic molecule, has
five active degrees of freedom: the three comprising translational motion plus two internal rotational degrees of freedom. Not surprisingly, in accordance with the equipartition theorem, nitrogen has five-thirds the specific heat capacity per mole (a specific number of molecules) of the monatomic gases. Another example is
gasoline (see
table showing its specific heat capacity). Gasoline can absorb a large amount of heat energy per mole with only a modest temperature change because each molecule comprises an average of 21 atoms and therefore has many internal degrees of freedom. Even larger, more complex molecules can have dozens of internal degrees of freedom.
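The five-thirds ratio follows directly from the equipartition theorem: each active degree of freedom contributes \frac{1}{2} R per mole to the constant-volume heat capacity. A minimal sketch:

```python
# Equipartition sketch: each active degree of freedom contributes
# (1/2) R per mole to the constant-volume molar heat capacity.
# A monatomic gas has 3 (translational) active degrees of freedom;
# room-temperature N2 has 5 (3 translational + 2 rotational).
R = 8.314462618              # molar gas constant, J/(mol K)

def cv_molar(dof):
    """Constant-volume molar heat capacity for `dof` active degrees of freedom."""
    return 0.5 * dof * R

cv_mono = cv_molar(3)        # ~12.47 J/(mol K), e.g. helium or argon
cv_n2 = cv_molar(5)          # ~20.79 J/(mol K), diatomic nitrogen

print(cv_n2 / cv_mono)       # 5/3, as the equipartition theorem predicts
```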
Diffusion of thermal energy: entropy, phonons, and mobile conduction electrons
Fig. 4: Phonons with identical amplitudes but with wavelengths ranging from 2 to 12 average inter-molecule separations (a).
Heat conduction is the diffusion of thermal energy from hot parts of a system to cold parts. A system can be either a single bulk entity or a plurality of discrete bulk entities. The term
bulk in this context means a statistically significant quantity of particles (which can be a microscopic amount). Whenever thermal energy diffuses within an isolated system, temperature differences within the system decrease (and
entropy increases). One particular heat conduction mechanism occurs when translational motion, the particle motion underlying temperature, transfers
momentum from particle to particle in collisions. In gases, these translational motions are of the nature shown above in
Fig. 1. As can be seen in that animation, not only does momentum (heat) diffuse throughout the volume of the gas through serial collisions, but entire molecules or atoms can move forward into new territory, bringing their kinetic energy with them. Consequently, temperature differences equalize throughout gases very quickly—especially for light atoms or molecules;
convection speeds this process even more. Translational motion in
solids, however, takes the form of
phonons (see
Fig. 4 at right). Phonons are constrained, quantized wave packets that travel at the speed of sound of a given substance. The manner in which phonons interact within a solid determines a variety of its properties, including its thermal conductivity. In electrically insulating solids, phonon-based heat conduction is
usually inefficient and such solids are considered
thermal insulators (such as glass, plastic, rubber, ceramic, and rock). This is because in solids, atoms and molecules are locked into place relative to their neighbors and are not free to roam.
Metals, however, are not restricted to only phonon-based heat conduction. Thermal energy conducts through metals extraordinarily quickly because instead of direct molecule-to-molecule collisions, the vast majority of thermal energy is mediated via very light, mobile
conduction electrons. This is why there is a near-perfect correlation between metals'
thermal conductivity and their
electrical conductivity. Conduction electrons imbue metals with their extraordinary conductivity because they are
delocalized (i.e., not tied to a specific atom) and behave rather like a sort of quantum gas due to the effects of
zero-point energy (for more on ZPE, see
Note 1 below). Furthermore, electrons are relatively light, with a rest mass only 1⁄1836 that of a proton. As Newton's third law of motion states, forces occur in equal and opposite pairs. However, a bullet accelerates faster than a rifle given an equal force: since kinetic energy increases as the square of velocity, nearly all the kinetic energy goes into the bullet, not the rifle, even though both experience the same force from the expanding propellant gases. In the same manner, because conduction electrons are so much less massive than atoms, they readily bear thermal energy. Additionally, because they are delocalized and very fast, kinetic thermal energy conducts extremely quickly through metals with abundant conduction electrons.
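The bullet-and-rifle reasoning follows from momentum conservation: both bodies receive equal and opposite momentum p, and kinetic energy p²/(2m) is inversely proportional to mass, so the light body carries nearly all of it. A minimal sketch with illustrative (assumed) masses:

```python
# Recoil sketch: bullet and rifle receive equal and opposite momentum p,
# but kinetic energy p^2 / (2m) is inversely proportional to mass, so
# the light body carries nearly all of it. Masses are illustrative only.
m_bullet = 0.010             # kg (assumed 10 g bullet)
m_rifle = 4.0                # kg (assumed 4 kg rifle)
p = 4.0                      # kg*m/s, shared momentum magnitude (arbitrary)

ke_bullet = p**2 / (2 * m_bullet)
ke_rifle = p**2 / (2 * m_rifle)

frac = ke_bullet / (ke_bullet + ke_rifle)
print(f"bullet carries {frac:.4f} of the kinetic energy")   # ~0.9975
```

The energy fraction depends only on the mass ratio (here 400:1), not on the chosen momentum, which is why light conduction electrons similarly end up bearing most of the exchanged kinetic energy.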
Diffusion of thermal energy: black-body radiation
Thermal radiation is a byproduct of the collisions arising from various vibrational motions of atoms. These collisions cause the electrons of the atoms to emit thermal
photons (known as
black-body radiation). Photons are emitted anytime an electric charge is accelerated (as happens when electron clouds of two atoms collide). Even
individual molecules with internal temperatures greater than absolute zero emit black-body radiation from their atoms. In any bulk quantity of a substance at equilibrium, black-body photons are emitted across a range of
wavelengths in a spectrum that has a bell curve-like shape called a
Planck curve (see graph in
Fig. 5 at right). The top of a Planck curve (
the peak emittance wavelength) is located in a particular part of the
electromagnetic spectrum depending on the temperature of the black-body. Substances at extreme
cryogenic temperatures emit at long radio wavelengths whereas extremely hot temperatures produce short
gamma rays (see the table of thermodynamic temperatures below). Black-body radiation diffuses thermal energy throughout a substance as the photons are absorbed by neighboring atoms, transferring momentum in the process. Black-body photons also easily escape from a substance and can be absorbed by the ambient environment, carrying kinetic energy away with them. As established by the
Stefan–Boltzmann law, the intensity of black-body radiation increases as the fourth power of absolute temperature. Thus, a black-body at 824 K (just short of glowing dull red) emits 60 times as much radiant power as it does at 296 K (room temperature). This is why one can so easily feel the radiant heat from hot objects at a distance. At higher temperatures, such as those found in an
incandescent lamp, black-body radiation can be the principal mechanism by which thermal energy escapes a system.
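The factor of 60 quoted above follows directly from the fourth-power law; a short sketch using the Stefan–Boltzmann constant:

```python
# Stefan–Boltzmann law: radiant power per unit area scales as T^4.
# Ratio of black-body emission at 824 K versus 296 K, as cited above.
SIGMA = 5.670374419e-8       # Stefan–Boltzmann constant, W/(m^2 K^4)

def radiant_flux(t_kelvin):
    """Black-body radiant flux (W/m^2) at absolute temperature t_kelvin."""
    return SIGMA * t_kelvin**4

ratio = radiant_flux(824.0) / radiant_flux(296.0)
print(f"{ratio:.1f}")        # ~60: just short of glowing dull red
```

Note that the constant cancels in the ratio; only the (824/296)^4 factor matters.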
Table of thermodynamic temperatures
The table below shows various points on the thermodynamic scale, in order of increasing temperature.
Heat of phase changes
The kinetic energy of particle motion is just one contributor to the total thermal energy in a substance; another is
phase transitions, which are the
potential energy of molecular bonds that can form in a substance as it cools (such as during
condensing and
freezing). The thermal energy required for a phase transition is called
latent heat. This phenomenon may more easily be grasped by considering it in the reverse direction: latent heat is the energy required to
break chemical bonds (such as during
evaporation and
melting). Almost everyone is familiar with the effects of phase transitions; for instance,
steam at 100 °C can cause severe burns much faster than the 100 °C air from a
hair dryer. This occurs because a large amount of latent heat is liberated as steam condenses into liquid water on the skin. Even though thermal energy is liberated or absorbed during phase transitions, pure
chemical elements,
compounds, and
eutectic alloys exhibit no temperature change whatsoever while they undergo them (see
Fig. 7, below right). Consider one particular type of phase transition: melting. When a solid is melting,
crystal lattice chemical bonds are being broken apart; the substance is transitioning from what is known as a
more ordered state to a
less ordered state. In
Fig. 7, the melting of ice is shown within the lower left box heading from blue to green. At one specific thermodynamic point, the
melting point (which is 0 °C across a wide pressure range in the case of water), all the atoms or molecules are, on average, at the maximum energy threshold their chemical bonds can withstand without breaking away from the lattice. Chemical bonds are all-or-nothing forces: they either hold fast or break; there is no in-between state. Consequently, when a substance is at its melting point, every joule of added thermal energy breaks the bonds of only a specific quantity of its atoms or molecules, converting them into a liquid at precisely the same temperature; no kinetic energy is added to translational motion (which is what gives substances their temperature). The effect is rather like
popcorn: at a certain temperature, additional thermal energy cannot make the kernels any hotter until the transition (popping) is complete. If the process is reversed (as in the freezing of a liquid), thermal energy must be removed from a substance. As stated above, the thermal energy required for a phase transition is called
latent heat. In the specific cases of melting and freezing, it is called
enthalpy of fusion or
heat of fusion. If the molecular bonds in a crystal lattice are strong, the heat of fusion can be relatively great, typically in the range of 6 to 30 kJ per mole for water and most of the metallic elements. If the substance is one of the monatomic gases (which have little tendency to form molecular bonds) the heat of fusion is more modest, ranging from 0.021 to 2.3 kJ per mole. Relatively speaking, phase transitions can be truly energetic events. To completely melt ice at 0 °C into water at 0 °C, one must add roughly 80 times the thermal energy as is required to increase the temperature of the same mass of liquid water by one degree Celsius. The metals' ratios are even greater, typically in the range of 400 to 1200 times. The phase transition of
boiling is much more energetic than freezing. For instance, the energy required to completely boil or vaporize water (what is known as
enthalpy of vaporization) is roughly 540 times that required for a one-degree increase. Water's sizable enthalpy of vaporization is why one's skin can be burned so quickly as steam condenses on it (heading from red to green in
Fig. 7 above); water vapor (gas phase) liquefies on the skin, releasing a large amount of energy (enthalpy) to the environment, including the skin, and causing skin damage. In the opposite direction, this is why one's skin feels cool as liquid water on it evaporates (a process that occurs at a sub-ambient
wet-bulb temperature that is dependent on
relative humidity); evaporating water draws a large amount of energy from the environment, including the skin, reducing the skin's temperature. Water's highly energetic enthalpy of vaporization is also an important factor underlying why solar pool covers (floating, insulated blankets that cover swimming pools when the pools are not in use) are so effective at reducing heating costs: they prevent evaporation. (In other words, they limit the energy that evaporation removes from the water.) For instance, the evaporation of just 20 mm of water from a 1.29 m-deep pool chills its water appreciably.
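The "80 times" and "roughly 540 times" figures quoted above can be reproduced from standard handbook values for water (values below are the commonly tabulated ones, an assumption of this sketch):

```python
# Latent-heat ratios for water, using standard handbook values:
# enthalpy of fusion ~333.55 J/g, enthalpy of vaporization ~2257 J/g,
# specific heat of liquid water ~4.18 J/(g K).
H_FUS = 333.55      # enthalpy of fusion of ice, J/g
H_VAP = 2257.0      # enthalpy of vaporization of water, J/g
C_WATER = 4.18      # specific heat of liquid water, J/(g K)

print(H_FUS / C_WATER)   # ~80: melting ice vs. warming water 1 degree C
print(H_VAP / C_WATER)   # ~540: boiling water vs. warming it 1 degree C
```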
Internal energy
The total energy of all translational and internal particle motions, including that of conduction electrons, plus the potential energy of phase changes, plus zero-point energy, constitutes a substance's internal energy. When particles of a substance are as close as possible to complete rest and retain only ZPE (zero-point energy)-induced quantum mechanical motion, the substance is at the temperature of absolute zero (T = 0). Whereas absolute zero is the point of zero thermodynamic temperature and is also the point at which the particle constituents of matter have minimal motion, absolute zero is not necessarily the point at which a substance contains zero internal energy; one must be very precise with what one means by
internal energy. Often, all the phase changes that
can occur in a substance,
will have occurred by the time it reaches absolute zero. However, this is not always the case. Notably, T = 0 helium remains liquid at room pressure (Fig. 9 at right) and must be under a pressure of at least 25 bar (c. 25 atmospheres) to crystallize. This is because helium's heat of fusion (the energy required to melt helium ice) is so low (only 21 joules per mole) that the motion-inducing effect of zero-point energy is sufficient to prevent it from freezing at lower pressures. A further complication is that many solids change their crystal structure to more compact arrangements at extremely high pressures (up to millions of bars, or hundreds of gigapascals). These are known as
solid–solid phase transitions wherein latent heat is liberated as a crystal lattice changes to a more thermodynamically favorable, compact one. The above complexities make for rather cumbersome blanket statements regarding the internal energy of T = 0 substances. Regardless of pressure though, what
can be said is that at absolute zero, all solids with a lowest-energy crystal lattice, such as those with a
closest-packed arrangement (see
Fig. 8, above left) contain minimal internal energy, retaining only that due to the ever-present background of zero-point energy. One can also say that for a given substance at constant pressure, absolute zero is the point of lowest
enthalpy (a measure of work potential that takes internal energy, pressure, and volume into consideration). Lastly, all T = 0 substances contain zero kinetic thermal energy.

Practical applications for thermodynamic temperature