The primary goal of statistical thermodynamics (also known as equilibrium statistical mechanics) is to derive the
classical thermodynamics of materials in terms of the properties of their constituent particles and the interactions between them. In other words, statistical thermodynamics provides a connection between the macroscopic properties of materials in
thermodynamic equilibrium, and the microscopic behaviours and motions occurring inside the material. Whereas statistical mechanics proper involves dynamics, here the attention is focused on
statistical equilibrium (steady state). Statistical equilibrium does not mean that the particles have stopped moving (mechanical equilibrium); rather, it means only that the ensemble is not evolving.
=== Fundamental postulate ===
A sufficient (but not necessary) condition for statistical equilibrium of an isolated system is that the probability distribution is a function only of conserved properties (total energy, total particle numbers, etc.). Other fundamental postulates for statistical mechanics have also been proposed; one such formalism is based on the fundamental thermodynamic relation together with a set of additional postulates. This approach was developed into the theory of the
concentration of measure phenomenon, which has applications in many areas of science, from functional analysis to methods of
artificial intelligence and
big data technology.

Important cases where the thermodynamic ensembles do not give identical results include:
• Microscopic systems.
• Large systems at a phase transition.
• Large systems with long-range interactions.
In these cases the correct thermodynamic ensemble must be chosen, as there are observable differences between these ensembles not just in the size of fluctuations, but also in average quantities such as the distribution of particles. The correct ensemble is the one that corresponds to the way the system has been prepared and characterized; in other words, the ensemble that reflects the knowledge about that system.

A small number of exact solutions are known; some examples include the Bethe ansatz, the square-lattice Ising model in zero field, and the hard hexagon model.
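As a minimal illustration of what an exact solution looks like (a sketch using the simpler one-dimensional zero-field Ising ring, not one of the harder models just listed; function names and parameter values are illustrative), the transfer-matrix result Z = λ₊^N + λ₋^N, with λ± = 2 cosh(βJ) and 2 sinh(βJ), can be checked against a brute-force sum over all 2^N spin configurations:

```python
import itertools
import math

def ising_Z_exact(N, beta, J=1.0):
    """Partition function of the zero-field 1-D Ising ring of N spins,
    H = -J * sum_i s_i * s_{i+1} (periodic), via the 2x2 transfer matrix:
    Z = lam_plus**N + lam_minus**N with eigenvalues 2*cosh(bJ), 2*sinh(bJ)."""
    lam_plus = 2.0 * math.cosh(beta * J)
    lam_minus = 2.0 * math.sinh(beta * J)
    return lam_plus**N + lam_minus**N

def ising_Z_enumerate(N, beta, J=1.0):
    """Brute-force sum over all 2**N spin configurations (small N only)."""
    Z = 0.0
    for spins in itertools.product((-1, 1), repeat=N):
        E = -J * sum(spins[i] * spins[(i + 1) % N] for i in range(N))
        Z += math.exp(-beta * E)
    return Z

# The two routes agree to floating-point accuracy, e.g. for N = 8, beta*J = 0.7:
print(ising_Z_exact(8, 0.7))
print(ising_Z_enumerate(8, 0.7))
```

The brute-force sum grows as 2^N, which is exactly why closed-form results like the transfer-matrix formula are valuable.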
=== Monte Carlo ===
Although some problems in statistical physics can be solved analytically using approximations and expansions, most current research utilizes the large processing power of modern computers to simulate or approximate solutions. A common approach to statistical problems is to use a
Monte Carlo simulation to yield insight into the properties of a
complex system. Monte Carlo methods are important in
computational physics,
physical chemistry, and related fields, and have diverse applications including
medical physics, where they are used to model radiation transport for radiation dosimetry calculations. The
Monte Carlo method examines just a few of the possible states of the system, with the states chosen randomly (with a fair weight). As long as these states form a representative sample of the whole set of states of the system, the approximate characteristic function is obtained. As more and more random samples are included, the errors are reduced to an arbitrarily low level.
• The Metropolis–Hastings algorithm is a classic Monte Carlo method which was initially used to sample the canonical ensemble.
• Path integral Monte Carlo, also used to sample the canonical ensemble.
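The Metropolis acceptance rule can be sketched in a few lines. The example below is a toy illustration, not from the source: it samples the canonical (Boltzmann) distribution p(x) ∝ exp(−βE(x)) of a one-dimensional harmonic oscillator, E(x) = x²/2, for which the exact canonical average is ⟨x²⟩ = 1/β. The function name and parameter values are illustrative.

```python
import math
import random

def metropolis_harmonic(beta=1.0, steps=200_000, step_size=1.0, seed=0):
    """Metropolis sampling of the canonical distribution p(x) ~ exp(-beta*E(x))
    for a 1-D harmonic oscillator, E(x) = x**2 / 2.
    For this potential the exact canonical average is <x^2> = 1 / beta."""
    rng = random.Random(seed)
    x = 0.0
    total = 0.0
    for _ in range(steps):
        x_new = x + rng.uniform(-step_size, step_size)  # symmetric proposal
        dE = 0.5 * (x_new**2 - x**2)
        # Accept the move with probability min(1, exp(-beta * dE)).
        if dE <= 0.0 or rng.random() < math.exp(-beta * dE):
            x = x_new
        total += x * x
    return total / steps

print(metropolis_harmonic())  # close to the exact value 1.0
```

Because the proposal is symmetric, the Hastings correction factor is 1 and the acceptance probability reduces to the classic Metropolis form min(1, e^(−βΔE)); downhill moves are always accepted, uphill moves only sometimes, which is what produces canonical-ensemble weights.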
=== Other ===
• For rarefied non-ideal gases, approaches such as the cluster expansion use perturbation theory to include the effect of weak interactions, leading to a virial expansion.
• For dense fluids, another approximate approach is based on reduced distribution functions, in particular the radial distribution function.
• Molecular dynamics computer simulations can be used to calculate microcanonical ensemble averages in ergodic systems. With the inclusion of a connection to a stochastic heat bath, they can also model canonical and grand canonical conditions.
• Mixed methods involving non-equilibrium statistical mechanical results (see below) may be useful.

== Non-equilibrium statistical mechanics ==