Entropy (statistical thermodynamics)

The concept of '''entropy''' was first developed by German physicist [[Rudolf Clausius]] in the mid-nineteenth century as a thermodynamic property that predicts that certain spontaneous processes are irreversible or impossible. In [[statistical mechanics]], [[entropy]] is formulated as a statistical property using [[probability theory]]. The '''statistical entropy''' perspective was introduced in 1870 by Austrian physicist [[Ludwig Boltzmann]], who established a new field of [[physics]] that provided the descriptive link between the macroscopic observation of nature and the microscopic view based on the rigorous treatment of large ensembles of microscopic states that constitute [[thermodynamic system]]s.
 
== Boltzmann's principle ==
{{main|Boltzmann's entropy formula}}
Ludwig Boltzmann defined entropy as a measure of the number of possible microscopic states (''microstates'') of a system in [[thermodynamic equilibrium]], consistent with its macroscopic thermodynamic properties, which constitute the ''macrostate'' of the system. A useful illustration is a sample of gas in a container. The easily measurable parameters volume, pressure, and temperature of the gas describe its macroscopic condition (''state''). At a microscopic level, the gas consists of a vast number of freely moving [[atom]]s or [[molecule]]s, which randomly collide with one another and with the walls of the container. The collisions with the walls produce the macroscopic pressure of the gas, which illustrates the connection between microscopic and macroscopic phenomena.

Equilibrium may be illustrated with a simple example of a drop of food coloring falling into a glass of water. The dye diffuses in a complicated manner, which is difficult to precisely predict. However, after sufficient time has passed, the system reaches a uniform color, a state much easier to describe and explain.
 
Boltzmann formulated a simple relationship between entropy and the number of possible microstates of a system, which is denoted by the symbol ''Ω''. The entropy ''S'' is [[proportionality (mathematics)|proportional]] to the [[natural logarithm]] of this number:
: <math>S = k_\text{B} \ln \Omega</math>
The proportionality constant ''k''<sub>B</sub> is one of the fundamental constants of physics, and is named the [[Boltzmann constant]] in his honor.
 
Since Ω is a [[natural number]] (1,2,3,...), entropy is either zero or positive ({{nowrap|1=ln 1 = 0}}, {{nowrap|ln Ω ≥ 0}}).
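
A minimal numerical sketch of Boltzmann's formula, assuming a hypothetical system of 100 two-state particles so that ''Ω'' = 2<sup>100</sup>:

<syntaxhighlight lang="python">
import math

k_B = 1.380649e-23  # Boltzmann constant, J/K (exact in the 2019 SI)

# Hypothetical system: 100 independent two-state particles,
# so the number of microstates is Omega = 2**100.
omega = 2 ** 100

# Boltzmann's principle: S = k_B * ln(Omega)
S = k_B * math.log(omega)
print(f"Omega = {omega:.3e}")   # ~1.268e+30
print(f"S     = {S:.3e} J/K")   # ~9.570e-22 J/K
</syntaxhighlight>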
 
Boltzmann's entropy describes the system when all the accessible microstates are equally likely. This uniform distribution over microstates is the configuration of maximum entropy, which corresponds to equilibrium. The randomness or disorder is maximal, and so is our lack of information about any particular microstate.

== Gibbs entropy formula ==
The macroscopic state of a system is characterized by a distribution on its microstates. The entropy of this distribution is given by the '''Gibbs entropy formula''', named after [[Josiah Willard Gibbs|J. Willard Gibbs]]. For a classical system with a discrete set of microstates, if <math>p_i</math> is the probability that the system occupies microstate <math>i</math> during its fluctuations, then the entropy of the system is
: <math>S = -k_\text{B} \sum_i p_i \ln p_i</math>
 
The quantity <math>k_\text{B}</math> is the [[physical constant]] known as the [[Boltzmann constant]]. The remaining factor of the equation, the entire [[summation]], is [[Dimensionless quantity|dimensionless]], since the value <math>p_i</math> is a probability and therefore dimensionless, and {{math|ln}} is the [[natural logarithm]]. Hence the [[SI derived unit]]s on both sides of the equation are the same as those of [[heat capacity]]:
<math display="block"> [S] = [k_\text{B}] = \mathrm{\frac {J} {K}}</math>
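
As an illustrative sketch (with a made-up probability distribution over four microstates), the summation evaluates to a pure number, and the Boltzmann constant supplies the units:

<syntaxhighlight lang="python">
import math

k_B = 1.380649e-23  # Boltzmann constant, J/K

# Made-up microstate probabilities; they must sum to 1.
p = [0.5, 0.25, 0.125, 0.125]
assert abs(sum(p) - 1.0) < 1e-12

# The summation -sum(p_i ln p_i) is dimensionless ...
s_dimensionless = -sum(pi * math.log(pi) for pi in p)
# ... and k_B converts it to SI units of J/K.
S = k_B * s_dimensionless

print(f"-sum p ln p = {s_dimensionless:.4f}")  # ~1.2130 (pure number)
print(f"S = {S:.3e} J/K")                      # ~1.675e-23 J/K
</syntaxhighlight>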
 
The various ensembles used in [[statistical thermodynamics]] are linked to the entropy by the following relations:{{clarify|reason=What are the quantities that are being maintained constant between these different ensembles? Is this relationship only valid in the thermodynamic limit?|date=September 2013}}
<math display="block">S = k_\text{B} \ln \Omega_{\rm mic} = k_\text{B} (\ln Z_{\rm can} + \beta \bar E) = k_\text{B} (\ln \mathcal{Z}_{\rm gr} + \beta (\bar E - \mu \bar N)) </math>
* <math>\Omega_{\rm mic}</math> is the [[microcanonical ensemble|microcanonical partition function]]
* <math>Z_{\rm can}</math> is the [[canonical ensemble|canonical partition function]]
* <math>\mathcal{Z}_{\rm gr}</math> is the [[grand canonical ensemble|grand canonical partition function]]

Here <math>\beta = 1/k_\text{B}T</math> is the inverse temperature, <math>\bar E</math> is the mean energy, <math>\mu</math> is the [[chemical potential]], and <math>\bar N</math> is the mean particle number; a numerical consistency check of the canonical relation is sketched below.
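
The sketch assumes a hypothetical two-level system with energy gap ''ε'' at temperature ''T''; the canonical relation reproduces the Gibbs entropy of the Boltzmann-weighted probabilities:

<syntaxhighlight lang="python">
import math

k_B = 1.380649e-23  # J/K

# Hypothetical two-level system: energies 0 and eps, temperature T.
eps = 1.0e-21   # J
T = 300.0       # K
beta = 1.0 / (k_B * T)

# Canonical partition function and mean energy.
Z = 1.0 + math.exp(-beta * eps)
E_bar = eps * math.exp(-beta * eps) / Z

# Entropy from the canonical relation S = k_B (ln Z + beta * E_bar).
S_canonical = k_B * (math.log(Z) + beta * E_bar)

# Cross-check against the Gibbs form S = -k_B sum p_i ln p_i.
p = [1.0 / Z, math.exp(-beta * eps) / Z]
S_gibbs = -k_B * sum(pi * math.log(pi) for pi in p)

print(f"S (canonical relation) = {S_canonical:.6e} J/K")
print(f"S (Gibbs form)         = {S_gibbs:.6e} J/K")  # identical
</syntaxhighlight>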
 
== Order through chaos and the second law of thermodynamics ==
We can think of ''Ω'' as a measure of our lack of knowledge about a system. To illustrate this idea, consider a set of 100 [[coin]]s, each of which is either [[coin flipping|heads up or tails up]]. In this example, let us suppose that the macrostates are specified by the total number of heads and tails, while the microstates are specified by the facings of each individual coin (i.e., the exact order in which heads and tails occur). For the macrostates of 100 heads or 100 tails, there is exactly one possible configuration, so our knowledge of the system is complete. At the opposite extreme, the macrostate which gives us the least knowledge about the system consists of 50 heads and 50 tails in any order, for which there are {{val|100,891,344,545,564,193,334,812,497,256}} ([[combination|100 choose 50]]) ≈ 10<sup>29</sup> possible microstates.
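
These counts can be reproduced exactly with integer combinatorics; the sketch below reports the entropy in units of ''k''<sub>B</sub>:

<syntaxhighlight lang="python">
import math

# A macrostate with k heads out of 100 coins has C(100, k) microstates.
omega_all_heads = math.comb(100, 100)  # exactly 1
omega_half = math.comb(100, 50)        # 100891344545564193334812497256

print(f"Omega (50 heads) = {omega_half:.3e}")  # ~1.009e+29

# Entropy in units of k_B: S/k_B = ln(Omega).
print(f"S/k_B (100 heads) = {math.log(omega_all_heads):.2f}")  # 0.00
print(f"S/k_B (50 heads)  = {math.log(omega_half):.2f}")       # ~66.78
</syntaxhighlight>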
 
Even when a system is entirely isolated from external influences, its microstate is constantly changing. For instance, the particles in a gas are constantly moving, and thus occupy a different position at each moment of time; their momenta are also constantly changing as they collide with each other or with the container walls. Suppose we prepare the system in an artificially highly ordered equilibrium state. For instance, imagine dividing a container with a partition and placing a gas on one side of the partition, with a vacuum on the other side. If we remove the partition and watch the subsequent behavior of the gas, we will find that its microstate evolves according to some chaotic and unpredictable pattern, and that on average these microstates will correspond to a more disordered macrostate than before. It is ''possible'', but ''extremely unlikely'', for the gas molecules to bounce off one another in such a way that they remain in one half of the container. It is overwhelmingly probable for the gas to spread out to fill the container evenly, which is the new equilibrium macrostate of the system.
 
This is an example illustrating the [[second law of thermodynamics]]:
: ''the total entropy of any isolated thermodynamic system tends to increase over time, approaching a maximum value''.
 
Since its discovery, this idea has been the focus of a great deal of thought, some of it confused. A chief point of confusion is the fact that the Second Law applies only to ''isolated'' systems. For example, the [[Earth]] is not an isolated system because it is constantly receiving energy in the form of [[sunlight]]. In contrast, the [[universe]] may be considered an isolated system, so that its total entropy is constantly increasing.
 
To avoid coarse graining, one can take the entropy as defined by the [[H-theorem#Tolman's_H-theorem|H-theorem]].<ref>{{cite book |isbn=0-486-68455-5 |title=Lectures on Gas Theory |last1=Boltzmann |first1=Ludwig |date=January 1995 |publisher=Courier Corporation}}</ref>
: <math>S = -k_{\rm B} H_{\rm B} := -k_{\rm B} \int f(q_i, p_i) \, \ln f(q_i,p_i) \,d q_1 dp_1 \cdots dq_N dp_N</math>
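
As an illustrative aside, the functional <math>-\int f \ln f</math> can be evaluated numerically for a one-dimensional [[normal distribution|Gaussian]] (a single-coordinate stand-in for the distribution ''f'' above, in dimensionless units) and compared with the known closed-form result:

<syntaxhighlight lang="python">
import numpy as np

sigma = 1.5  # hypothetical width of the distribution (dimensionless)

# Sample the Gaussian density on a grid wide enough that the tails
# contribute negligibly to the integral.
x = np.linspace(-12 * sigma, 12 * sigma, 200_001)
dx = x[1] - x[0]
f = np.exp(-x**2 / (2 * sigma**2)) / (sigma * np.sqrt(2 * np.pi))

h_numeric = -np.sum(f * np.log(f)) * dx              # -∫ f ln f dx
h_exact = 0.5 * np.log(2 * np.pi * np.e * sigma**2)  # closed form

print(f"numeric:     {h_numeric:.4f}")  # ≈ 1.8244
print(f"closed form: {h_exact:.4f}")    # 0.5 ln(2πe σ²) ≈ 1.8244
</syntaxhighlight>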
 
However, this ambiguity can be resolved with [[quantum mechanics]]. The [[quantum state]] of a system can be expressed as a superposition of "basis" states, which can be chosen to be energy [[eigenstate]]s (i.e. eigenstates of the quantum [[Hamiltonian (quantum mechanics)|Hamiltonian]]). Usually, the quantum states are discrete, even though there may be an infinite number of them. For a system with some specified energy ''E'', one takes Ω to be the number of energy eigenstates within a macroscopically small energy range between ''E'' and {{nowrap|''E'' + ''δE''}}. In the [[thermodynamic limit]], the specific entropy becomes independent of the choice of ''δE''.
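
A toy sketch of this counting, assuming a hypothetical system of ''N'' [[quantum harmonic oscillator]]s with level spacing ''ħω'' (so energies are measured in quanta): the eigenstates in a window of a few quanta can be counted exactly, and the entropy per oscillator depends only weakly on the window width:

<syntaxhighlight lang="python">
import math

# Toy model: N oscillators; a state with M total quanta has
# degeneracy C(M + N - 1, N - 1). Omega counts the eigenstates whose
# total quanta lie in the window [M0, M0 + dM), i.e. [E, E + δE).
N = 50
M0 = 50

for dM in (1, 2, 5):  # window width δE, in units of ħω
    omega = sum(math.comb(M + N - 1, N - 1) for M in range(M0, M0 + dM))
    print(f"dM = {dM}:  ln(Omega)/N = {math.log(omega) / N:.3f}")
# Prints values near 1.32-1.39; the spread shrinks further as N grows.
</syntaxhighlight>
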
An important result, known as [[Nernst's theorem]] or the [[third law of thermodynamics]], states that the entropy of a system at [[absolute zero|zero absolute temperature]] is a well-defined constant. This is because a system at zero temperature exists in its lowest-energy state, or [[ground state]], so that its entropy is determined by the [[degenerate energy levels|degeneracy]] of the ground state. Many systems, such as [[crystal|crystal lattices]], have a unique ground state, and (since {{nowrap|1=ln(1) = 0}}) this means that they have zero entropy at absolute zero. Other systems have more than one state with the same, lowest energy, and have a non-vanishing "zero-point entropy". For instance, ordinary [[ice]] has a zero-point entropy of {{val|3.41|u=J/(mol⋅K)}}, because its underlying [[crystal structure]] possesses multiple configurations with the same energy (a phenomenon known as [[geometrical frustration]]).
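
The measured value for ice is close to [[Linus Pauling]]'s classic combinatorial estimate, in which each water molecule effectively retains 3/2 allowed proton configurations, giving a residual entropy of ''R'' ln(3/2) per mole:

<syntaxhighlight lang="python">
import math

R = 8.314462618  # molar gas constant, J/(mol*K)

# Pauling's estimate: ln(3/2) of configurational entropy per molecule.
S_residual = R * math.log(3 / 2)
print(f"S (Pauling estimate) = {S_residual:.2f} J/(mol*K)")  # ≈ 3.37
# Compare with the measured zero-point entropy of ~3.41 J/(mol*K).
</syntaxhighlight>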
 
The third law of thermodynamics states that the entropy of a [[perfect crystal]] at absolute zero ({{val|0|ul=K}}) is zero. This means that nearly all molecular motion should cease. The [[quantum harmonic oscillator|oscillator equation]] for predicting quantized vibrational levels shows that even when the vibrational quantum number is 0, the molecule still has vibrational energy{{Citation needed|date=March 2021}}:
: <math>E_n = h\nu_0 \left(n + \tfrac{1}{2}\right)</math>

where <math>h</math> is Planck's constant, <math>\nu_0</math> is the characteristic frequency of the vibration, and <math>n</math> is the vibrational quantum number. Even when <math>n = 0</math> (the [[zero-point energy]]), <math>E_n</math> does not equal 0, in adherence to the [[Heisenberg uncertainty principle]].
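
A short sketch evaluating the lowest vibrational levels for an illustrative (hypothetical) frequency ''ν''<sub>0</sub>, showing that the ''n'' = 0 level retains non-zero energy:

<syntaxhighlight lang="python">
h = 6.62607015e-34  # Planck constant, J*s
nu0 = 1.0e14        # hypothetical vibrational frequency, Hz

# Vibrational levels E_n = h * nu0 * (n + 1/2); note E_0 > 0.
for n in range(3):
    E_n = h * nu0 * (n + 0.5)
    print(f"n = {n}: E_n = {E_n:.3e} J")  # E_0 ≈ 3.313e-20 J
</syntaxhighlight>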
 
== See also ==
{{div col}}
* [[Boltzmann constant]]
* [[Configuration entropy]]
* [[Conformational entropy]]
* [[Enthalpy]]
* [[Entropy]]
* [[Entropy (classical thermodynamics)]]
* [[Entropy (energy dispersal)]]
* [[Entropy of mixing]]
* [[Entropy (order and disorder)]]
* [[Entropy (information theory)]]
* [[History of entropy]]
* [[Information theory]]
* [[Thermodynamic free energy]]
* [[Tsallis entropy]]
{{div col end}}
 
== References ==
{{reflist}}
 
{{DEFAULTSORT:Entropy (Statistical Thermodynamics)}}