The Birnbaum–Saunders distribution, also known as the fatigue life distribution, is a probability distribution used extensively in reliability applications to model failure times. Related distributions include the chi distribution, the noncentral chi distribution, and the chi-squared distribution, which is the distribution of the sum of the squares of n independent standard Gaussian random variables.
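As a quick sanity check of that chi-squared construction, here is a minimal NumPy sketch (an illustration, not from the source; the sample sizes are arbitrary choices) that sums squares of standard normals and confirms the sample mean matches the degrees of freedom:

```python
import numpy as np

rng = np.random.default_rng(0)
n, samples = 5, 100_000

# Sum of squares of n independent standard normals is chi-squared with n dof.
z = rng.standard_normal((samples, n))
chi2_samples = (z ** 2).sum(axis=1)

# The mean of a chi-squared distribution with n degrees of freedom is n.
print(chi2_samples.mean())  # ~5.0
```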
Boltzmann's equation, carved on his gravestone. [1] In statistical mechanics, Boltzmann's equation (also known as the Boltzmann–Planck equation) is a probability equation relating the entropy S, also written as S_B, of an ideal gas to the multiplicity (commonly denoted as W or Ω), the number of real microstates corresponding to the gas's macrostate: S = k_B ln W, where k_B is the Boltzmann constant.
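For concreteness, a minimal Python sketch (not from the source; the example multiplicity is an arbitrary assumption) that evaluates the Boltzmann–Planck relation for a given multiplicity:

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K (exact in the 2019 SI)

def boltzmann_entropy(multiplicity: float) -> float:
    """Boltzmann's equation: S = k_B * ln(W)."""
    return K_B * math.log(multiplicity)

# Example: a macrostate realized by W = 1e23 microstates (assumed value).
print(boltzmann_entropy(1e23))  # ~7.3e-22 J/K
```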
Treatments of statistical mechanics [2] [3] define a macrostate as follows: a particular set of values of energy, number of particles, and volume of an isolated thermodynamic system is said to specify a particular macrostate of it. In this description, microstates appear as the different possible ways the system can realize that macrostate.
The ensemble of microstates comprises a statistical distribution of probability for each microstate, and the group of most probable configurations accounts for the macroscopic state. The system can therefore be described as a whole by only a few macroscopic parameters, called the thermodynamic variables: the total energy E, the volume V, and so forth, as the spin-counting sketch below illustrates.
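To make the macrostate/microstate distinction concrete, the following sketch uses a hypothetical system of N two-state spins (an illustrative assumption, not the source's example): each macrostate is a count of up-spins, its multiplicity is a binomial coefficient, and the most probable macrostate dominates the ensemble:

```python
from math import comb

# Toy model: N two-state spins. A "macrostate" is the number n of up-spins;
# its multiplicity (number of microstates) is the binomial coefficient C(N, n).
N = 100
multiplicities = {n: comb(N, n) for n in range(N + 1)}

total = 2 ** N  # total number of equally likely microstates
most_probable = max(multiplicities, key=multiplicities.get)
print(most_probable)               # 50: the balanced macrostate
print(multiplicities[50] / total)  # ~0.0796: its probability
```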
The distribution of the grand canonical ensemble is called the generalized Boltzmann distribution by some authors. [2] Grand ensembles are apt for describing systems such as the electrons in a conductor, or the photons in a cavity, where the shape is fixed but the energy and number of particles can easily fluctuate due to contact with a reservoir.
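As a hedged illustration of that distribution, the sketch below assigns each microstate i the grand-canonical weight exp((μN_i − E_i)/k_B T), normalized by the grand partition function; the microstate list and the values of kT and μ are made-up assumptions, not from the source:

```python
import numpy as np

kT = 1.0   # k_B * T, arbitrary energy units (assumed)
mu = -0.5  # chemical potential, same units (assumed)

# Hypothetical microstates as (energy E_i, particle number N_i) pairs.
states = np.array([(0.0, 0), (1.0, 1), (2.0, 1), (2.5, 2)])
E, N = states[:, 0], states[:, 1]

# Generalized Boltzmann (grand canonical) weight: exp((mu*N - E) / kT).
weights = np.exp((mu * N - E) / kT)
probs = weights / weights.sum()  # the normalizer is the grand partition function
print(probs)
```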
The α-level upper critical value of a probability distribution is the value exceeded with probability α, that is, the value x_α such that F(x_α) = 1 − α, where F is the cumulative distribution function. There are standard notations for the upper critical values of some commonly used distributions in statistics.
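Since the upper critical value is just the (1 − α) quantile, it can be read off from the inverse CDF; a minimal SciPy sketch (an illustration, with α and the degrees of freedom chosen arbitrarily):

```python
from scipy.stats import norm, chi2

alpha = 0.05

# Upper critical value: the point exceeded with probability alpha,
# i.e. the (1 - alpha) quantile of the distribution.
z_crit = norm.ppf(1 - alpha)             # standard normal: ~1.645
chi2_crit = chi2.ppf(1 - alpha, df=10)   # chi-squared, 10 dof: ~18.31

print(z_crit, chi2_crit)
```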
Despite the foregoing, there is a difference between the two quantities. The information entropy Η can be calculated for any probability distribution (if the "message" is taken to be that the event i, which had probability p_i, occurred, out of the space of the events possible), while the thermodynamic entropy S refers specifically to thermodynamic probabilities p_i.
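The information entropy of an arbitrary distribution follows directly from its definition, H = −Σ p_i log p_i; a minimal Python sketch (an illustration, not from the source, using bits as the unit):

```python
import math

def shannon_entropy(p):
    """H = -sum(p_i * log2(p_i)) in bits; zero-probability terms contribute 0."""
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

print(shannon_entropy([0.5, 0.5]))  # 1.0 bit (fair coin)
print(shannon_entropy([0.9, 0.1]))  # ~0.469 bits (biased coin)
```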
On the other hand, in a convenient though arbitrary interpretation, "disorder" may be sharply defined as the Shannon entropy of the probability distribution of microstates given a particular macrostate, [11]: 379 in which case the connection of "disorder" to thermodynamic entropy is straightforward, but arbitrary and not immediately obvious to readers unfamiliar with information theory.