The number of microstates Ω that a closed system can occupy is proportional to its phase space volume:

Ω(U) ∝ ∫ 1_{[U, U+δU]}(H(x)) dq dp,

where the integral runs over all of phase space and 1_{[U, U+δU]}(H(x)) is an indicator function. It is 1 if the Hamilton function H(x) at the point x = (q, p) in phase space is between U and U + δU, and 0 if not.
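As a rough numerical illustration (not taken from the text above), the integral of the indicator function over phase space can be estimated by Monte Carlo sampling. The sketch below assumes a single one-dimensional harmonic oscillator with H(q, p) = p²/2 + q²/2, purely as a toy Hamiltonian:

```python
import numpy as np

# Monte Carlo estimate of the phase-space volume Omega(U) for a single
# 1-D harmonic oscillator with H(q, p) = p**2/2 + q**2/2 (unit mass and
# spring constant are illustrative assumptions, not taken from the text).
rng = np.random.default_rng(0)

def omega(U, dU, n_samples=1_000_000, box=4.0):
    # Sample points x = (q, p) uniformly in a square that encloses the shell.
    q = rng.uniform(-box, box, n_samples)
    p = rng.uniform(-box, box, n_samples)
    H = 0.5 * p**2 + 0.5 * q**2
    # Indicator function 1_[U, U+dU](H(x)): 1 inside the energy shell, 0 outside.
    inside = (H >= U) & (H <= U + dU)
    # Hit fraction times the sampling area approximates the integral over dq dp.
    return inside.mean() * (2 * box) ** 2

# For this Hamiltonian the shell U <= H <= U + dU is an annulus of exact
# area 2*pi*dU, so the estimate can be checked directly.
print(omega(2.0, 0.1), 2 * np.pi * 0.1)
```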
The large number of particles of the gas provides an infinite number of possible microstates for the sample, but collectively they exhibit a well-defined average configuration, which manifests as the macrostate of the system; the contribution of each individual microstate to that macrostate is negligibly small.
Boltzmann's equation, carved on his gravestone. [1]
In statistical mechanics, Boltzmann's equation (also known as the Boltzmann–Planck equation) is a probability equation relating the entropy S of an ideal gas to the multiplicity W (commonly also denoted Ω), the number of real microstates corresponding to the gas's macrostate: S = k_B ln W.
If all the microstates are equiprobable (a microcanonical ensemble), the statistical thermodynamic entropy reduces to the form given by Boltzmann, S = k_B ln W, where W is the number of microstates that corresponds to the macroscopic thermodynamic state. Therefore S depends on temperature, since the number of accessible microstates W does.
(a) Single possible configuration for a system at absolute zero, i.e., only one microstate is accessible. Thus S = k ln W = 0. (b) At temperatures greater than absolute zero, multiple microstates are accessible due to atomic vibration (exaggerated in the figure). Since the number of accessible microstates is greater than 1, S = k ln W > 0.
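The counting in panels (a) and (b) can be reproduced with a small script. The toy macrostate below, N two-level atoms with a binomial multiplicity, is an assumed example rather than something specified above; it only shows that S = k ln W vanishes for W = 1 and is positive for W > 1:

```python
import math

K_B = 1.380649e-23  # Boltzmann constant in J/K

def boltzmann_entropy(W):
    # S = k ln W, with W the number of microstates of the macrostate.
    return K_B * math.log(W)

# Toy macrostate (an assumption for illustration only): N distinguishable
# two-level atoms with n of them excited, so the multiplicity is the
# binomial coefficient W = C(N, n).
N, n = 100, 50
W = math.comb(N, n)

print(boltzmann_entropy(1))  # a single accessible microstate gives S = 0, as in (a)
print(boltzmann_entropy(W))  # W > 1 accessible microstates give S > 0, as in (b)
```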
These quantities appear in the Maxwell–Boltzmann occupation formula, N_i/N = exp(−E_i/(k_B T)) / Σ_j exp(−E_j/(k_B T)), where N_i is the expected number of particles in the single-particle microstate i, N is the total number of particles in the system, E_i is the energy of microstate i, the sum over index j takes into account all microstates, T is the equilibrium temperature of the system, and k_B is the Boltzmann constant.
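A direct numerical reading of this formula is sketched below; the three energy levels and the temperature are made-up illustrative values, not data from the text:

```python
import numpy as np

K_B = 1.380649e-23  # Boltzmann constant in J/K

def occupation_fractions(energies, T):
    # N_i / N = exp(-E_i / (k_B T)) / sum_j exp(-E_j / (k_B T))
    # Energies are shifted by their minimum for numerical stability;
    # the shift cancels between numerator and denominator.
    E = np.asarray(energies, dtype=float)
    w = np.exp(-(E - E.min()) / (K_B * T))
    return w / w.sum()

# Hypothetical three-level system; the energies (in joules) and the
# temperature are illustrative values only.
levels = [0.0, 1.0e-21, 2.0e-21]
print(occupation_fractions(levels, T=300.0))  # the fractions sum to 1
```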
According to the fundamental postulate of statistical mechanics (which states that all attainable microstates of an isolated system are equally probable), the probability p_i will be proportional to the number of microstates of the total closed system (S, B) in which S is in microstate i with energy E_i.
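This counting argument can be checked on a toy model. The sketch below assumes an Einstein-solid-style bath of M oscillators sharing indivisible energy quanta (an assumption introduced here purely for illustration), fixes the total energy of the closed system (S, B), and obtains p_i by counting equally probable total microstates:

```python
from math import comb

# Toy check of the counting argument. The bath B is modelled as M quantum
# oscillators sharing q indistinguishable energy quanta (an Einstein-solid
# assumption made here for illustration), so its multiplicity is
# Omega_B(q) = C(q + M - 1, M - 1).
def omega_bath(q, M):
    return comb(q + M - 1, M - 1) if q >= 0 else 0

# The system S is taken to be a single oscillator holding E_i quanta, so the
# total closed system (S, B) has a fixed number of quanta E_tot. Counting the
# equally probable total microstates with S pinned to state i gives
# p_i proportional to Omega_B(E_tot - E_i).
M, E_tot = 20, 30
weights = [omega_bath(E_tot - E_i, M) for E_i in range(E_tot + 1)]
Z = sum(weights)
p = [w / Z for w in weights]
print(p[:4])  # the probabilities fall off roughly exponentially with E_i
```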
Thus, an increase in entropy means a greater number of microstates for the final state than for the initial state, and hence more possible arrangements of a system's total energy at any one instant. Here, the greater 'dispersal of the total energy of a system' means the existence of many more possible arrangements of that energy. [11]
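A standard worked example of this statement (not drawn from the text above) is the free expansion of an ideal gas into twice its volume, where the multiplicity grows by a factor of 2^N:

```python
import math

K_B = 1.380649e-23   # Boltzmann constant in J/K
N_A = 6.02214076e23  # Avogadro constant, particles per mole

# Free expansion of one mole of ideal gas into double the volume: each
# molecule gains a factor of 2 in accessible positions, so the final state
# has W_f / W_i = 2**N times as many microstates and
# Delta S = k_B * ln(W_f / W_i) = N * k_B * ln 2.
delta_S = N_A * K_B * math.log(2)
print(delta_S)  # about 5.76 J/K
```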