Free entropy: Difference between revisions

From Wikipedia, the free encyclopedia
{{Short description|Thermodynamic potential of entropy, analogous to the free energy}}
{{Thermodynamics|expanded=Potentials}}


A [[thermodynamic]] '''free entropy''' is an entropic [[thermodynamic potential]] analogous to the [[thermodynamic free energy|free energy]]. It is also known as a Massieu, Planck, or Massieu–Planck potential (or function), or (rarely) as free information. In [[statistical mechanics]], free entropies frequently appear as the logarithm of a [[Partition function (statistical mechanics)|partition function]]. The [[Onsager reciprocal relations]], in particular, are developed in terms of entropic potentials. In [[mathematics]], free entropy means something quite different: it is a generalization of entropy defined in the subject of [[free probability]].


A free entropy is generated by a [[Legendre transformation]] of the entropy. The different potentials correspond to different constraints to which the system may be subjected.
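The connection to statistical mechanics can be illustrated with a minimal numerical sketch: for a hypothetical two-level system with energies 0 and <math>\varepsilon</math> (in units <math>k_B = 1</math>; all parameter values below are arbitrary assumptions), the Massieu potential <math>\Phi = S - U/T</math> coincides with the logarithm of the canonical partition function:

```python
import math

# Hypothetical two-level system with energies 0 and eps, in units k_B = 1.
# The parameter values are arbitrary assumptions chosen for illustration.
eps = 1.0          # energy gap
T = 0.5            # temperature
beta = 1.0 / T

Z = 1.0 + math.exp(-beta * eps)               # canonical partition function
p1 = math.exp(-beta * eps) / Z                # excited-state occupation
p0 = 1.0 - p1                                 # ground-state occupation
U = p1 * eps                                  # mean internal energy
S = -(p0 * math.log(p0) + p1 * math.log(p1))  # Gibbs entropy

phi_from_Z = math.log(Z)   # free entropy as the log of the partition function
phi_from_S = S - U / T     # Massieu potential from its Legendre-transform definition

assert abs(phi_from_Z - phi_from_S) < 1e-9
```

The agreement is exact up to rounding, because <math>S = U/T + \ln Z</math> holds identically in the canonical ensemble.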


==Examples==
{{Col-break}}
::<math>S</math> is [[entropy]]
::<math>\Phi</math> is the Massieu potential<ref name="Planes2000">{{cite web |author=Antoni Planes |author2=Eduard Vives |date=2000-10-24 |publisher=Universitat de Barcelona |url=http://www.ecm.ub.es/condensed/eduard/papers/massieu/node2.html |title=Entropic variables and Massieu-Planck functions |access-date=2007-09-18 |work=Entropic Formulation of Statistical Mechanics |archive-date=2008-10-11 |archive-url=https://web.archive.org/web/20081011011717/http://www.ecm.ub.es/condensed/eduard/papers/massieu/node2.html |url-status=dead }}</ref><ref>{{cite journal |author=T. Wada |author2=A.M. Scarfone |date=December 2004 |title=Connections between Tsallis' formalisms employing the standard linear average energy and ones employing the normalized q-average energy |journal=Physics Letters A |volume=335 |issue=5–6 |pages=351–362 |doi=10.1016/j.physleta.2004.12.054 |arxiv=cond-mat/0410527|bibcode = 2005PhLA..335..351W |s2cid=17101164 }}
</ref>
::<math>\Xi</math> is the Planck potential<ref name="Planes2000"/>
{{Col-end}}


Note that the use of the terms "Massieu" and "Planck" for explicit Massieu–Planck potentials is somewhat obscure and ambiguous. In particular, "Planck potential" has alternative meanings. The most standard notation for an entropic potential is <math>\psi</math>, used by both [[Planck]] and [[Schrödinger]]. (Note that Gibbs used <math>\psi</math> to denote the free energy.) Free entropies were invented by French engineer [[François Massieu]] in 1869, and actually predate Gibbs's free energy (1875).


==Dependence of the potentials on the natural variables==

===Entropy===
:<math>S = S(U,V,\{N_i\})</math>


By the definition of a total differential,

:<math>d S = \frac {\partial S} {\partial U} d U + \frac {\partial S} {\partial V} d V + \sum_{i=1}^s \frac {\partial S} {\partial N_i} d N_i. </math>


From the [[thermodynamic potentials#The equations of state|equations of state]],

:<math>d S = \frac{1}{T}dU+\frac{P}{T}dV + \sum_{i=1}^s \left(- \frac{\mu_i}{T}\right) d N_i .</math>


The differentials in the above equation are all of [[Intensive and extensive properties|extensive variables]], so they may be integrated to yield

:<math>S = \frac{U}{T}+\frac{P V}{T} + \sum_{i=1}^s \left(- \frac{\mu_i N_i}{T}\right).</math>
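This integrated (Euler) form can be sanity-checked against a concrete model. The sketch below uses the Sackur–Tetrode entropy of a single-component monatomic ideal gas, with <math>k_B = 1</math> and arbitrary assumed values for the particle number, volume, temperature, mass, and Planck's constant:

```python
import math

# Monatomic ideal gas in units k_B = 1; all numeric values are assumptions.
N, V, T = 100.0, 50.0, 2.0
m, h = 1.0, 1.0                                 # particle mass, Planck constant

lam = h / math.sqrt(2 * math.pi * m * T)        # thermal de Broglie wavelength
S = N * (math.log(V / (N * lam**3)) + 2.5)      # Sackur-Tetrode entropy
U = 1.5 * N * T                                 # internal energy
P = N * T / V                                   # ideal-gas equation of state
mu = -T * math.log(V / (N * lam**3))            # chemical potential

euler = U / T + P * V / T - mu * N / T          # Euler-integrated expression
assert abs(euler - S) < 1e-9                    # agrees with S directly
```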


===Massieu potential / Helmholtz free entropy===
:<math>\Phi = S - \frac {U}{T}</math>
:<math>\Phi = \frac{U}{T}+\frac{P V}{T} + \sum_{i=1}^s \left(- \frac{\mu_i N_i}{T}\right) - \frac {U}{T}</math>
:<math>\Phi = \frac{P V}{T} + \sum_{i=1}^s \left(- \frac{\mu_i N_i}{T}\right)</math>


Starting over at the definition of <math>\Phi</math> and taking the total differential, we have via a Legendre transform (and the [[chain rule]])

:<math>d \Phi = d S - \frac {1} {T} dU - U d \frac {1} {T} ,</math>
:<math>d \Phi = \frac{1}{T}dU + \frac{P}{T}dV + \sum_{i=1}^s \left(- \frac{\mu_i}{T}\right) d N_i - \frac {1} {T} dU - U d \frac {1} {T},</math>
:<math>d \Phi = - U d \frac {1} {T}+\frac{P}{T}dV + \sum_{i=1}^s \left(- \frac{\mu_i}{T}\right) d N_i.</math>


The above differentials are not all of extensive variables, so the equation may not be directly integrated. From <math>d \Phi</math> we see that

:<math>\Phi = \Phi(\frac {1}{T},V, \{N_i\}) .</math>


If reciprocal variables are not desired,<ref name="Debye1954">{{cite book |title=The Collected Papers of Peter J. W. Debye |location=New York, New York |publisher=Interscience Publishers, Inc. |year=1954}}</ref>{{rp|222}}


:<math>d \Phi = d S - \frac {T d U - U d T} {T^2} ,</math>
:<math>d \Phi = d S - \frac {1} {T} d U + \frac {U} {T^2} d T ,</math>
:<math>d \Phi = \frac{1}{T}dU + \frac{P}{T}dV + \sum_{i=1}^s \left(- \frac{\mu_i}{T}\right) d N_i - \frac {1} {T} d U + \frac {U} {T^2} d T,</math>
:<math>d \Phi = \frac {U} {T^2} d T + \frac{P}{T}dV + \sum_{i=1}^s \left(- \frac{\mu_i}{T}\right) d N_i ,</math>
:<math>\Phi = \Phi(T,V,\{N_i\}) .</math>
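The coefficient of <math>dT</math> obtained here, <math>U/T^2</math>, can be checked numerically. For a hypothetical two-level system (energies 0 and <math>\varepsilon</math>, units <math>k_B = 1</math>, an assumption chosen only for illustration), <math>\Phi = \ln Z</math> at fixed volume and particle number, and a finite-difference derivative reproduces <math>U/T^2</math>:

```python
import math

# Two-level system with energies 0 and eps, units k_B = 1 (assumed values).
eps = 1.0
T = 0.7

def phi(T):
    # Massieu potential as the log of the canonical partition function
    return math.log(1.0 + math.exp(-eps / T))

h = 1e-6
dphi_dT = (phi(T + h) - phi(T - h)) / (2 * h)   # central finite difference

p1 = math.exp(-eps / T) / (1.0 + math.exp(-eps / T))  # excited-state occupation
U = p1 * eps                                          # mean internal energy
assert abs(dphi_dT - U / T**2) < 1e-8                 # matches the dT coefficient
```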


===Planck potential / Gibbs free entropy===
:<math>\Xi = \Phi -\frac{P V}{T}</math>
:<math>\Xi = \frac{P V}{T} + \sum_{i=1}^s \left(- \frac{\mu_i N_i}{T}\right) -\frac{P V}{T}</math>
:<math>\Xi = \sum_{i=1}^s \left(- \frac{\mu_i N_i}{T}\right)</math>


Starting over at the definition of <math>\Xi</math> and taking the total differential, we have via a Legendre transform (and the [[chain rule]])

:<math>d \Xi = d \Phi - \frac{P}{T} d V - V d \frac{P}{T}</math>
:<math>d \Xi = - U d \frac {1} {T} + \frac{P}{T}dV + \sum_{i=1}^s \left(- \frac{\mu_i}{T}\right) d N_i - \frac{P}{T} d V - V d \frac{P}{T}</math>
:<math>d \Xi = - U d \frac {1} {T} - V d \frac{P}{T} + \sum_{i=1}^s \left(- \frac{\mu_i}{T}\right) d N_i. </math>


The above differentials are not all of extensive variables, so the equation may not be directly integrated. From <math>d \Xi</math> we see that

:<math>\Xi = \Xi \left(\frac {1}{T}, \frac {P}{T}, \{N_i\} \right) .</math>


If reciprocal variables are not desired,<ref name="Debye1954"/>{{rp|222}}


:<math>d \Xi = d \Phi - \frac{T (P d V + V d P) - P V d T}{T^2} ,</math>
:<math>d \Xi = d \Phi - \frac{P}{T} d V - \frac {V}{T} d P + \frac {P V}{T^2} d T ,</math>
:<math>d \Xi = \frac {U} {T^2} d T + \frac{P}{T}dV + \sum_{i=1}^s \left(- \frac{\mu_i}{T}\right) d N_i - \frac{P}{T} d V - \frac {V}{T} d P + \frac {P V}{T^2} d T ,</math>
:<math>d \Xi = \frac {U + P V} {T^2} d T - \frac {V}{T} d P + \sum_{i=1}^s \left(- \frac{\mu_i}{T}\right) d N_i ,</math>
:<math>\Xi = \Xi(T,P,\{N_i\}) .</math>
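The quotient-rule expansion of <math>d(PV/T)</math> used in this derivation can be verified numerically along an arbitrary smooth path <math>t \mapsto (P, V, T)</math>; the path below is an assumption chosen purely for illustration:

```python
# Arbitrary smooth path t -> (P, V, T); the functional forms are assumptions.
P = lambda t: 1.0 + 0.3 * t
V = lambda t: 2.0 + t * t
T = lambda t: 0.5 + 0.1 * t

t0, h = 0.4, 1e-6
f = lambda t: P(t) * V(t) / T(t)

# Left side: d(PV/T)/dt by central finite difference.
lhs = (f(t0 + h) - f(t0 - h)) / (2 * h)

# Right side: [T (P dV + V dP) - P V dT] / T^2, with finite-difference
# derivatives standing in for dP, dV, dT.
dP = (P(t0 + h) - P(t0 - h)) / (2 * h)
dV = (V(t0 + h) - V(t0 - h)) / (2 * h)
dT = (T(t0 + h) - T(t0 - h)) / (2 * h)
rhs = (T(t0) * (P(t0) * dV + V(t0) * dP) - P(t0) * V(t0) * dT) / T(t0) ** 2

assert abs(lhs - rhs) < 1e-6
```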


==References==

==Bibliography==
*{{cite journal
| first =M.F. |last = Massieu |year=1869 |title=Compt. Rend.
| volume=69
| issue= 858


*{{cite book
| first = Herbert B. | last = Callen | author-link = Herbert Callen | year = 1985
| title = Thermodynamics and an Introduction to Thermostatistics | edition = 2nd
| publisher = John Wiley & Sons | location = New York | isbn = 0-471-86256-8 }}


[[Category:Thermodynamic entropy]]

[[pl:Entropia swobodna]]

Revision as of 16:38, 6 August 2023
