== Derivation from statistical mechanics ==
{{Further|H-theorem}}

The first mechanical argument of the [[kinetic theory of gases]] that molecular collisions entail an equalization of temperatures and hence a tendency towards equilibrium was due to [[James Clerk Maxwell]] in 1860;<ref>{{Cite journal | last1 = Gyenis | first1 = Balazs | doi = 10.1016/j.shpsb.2017.01.001 | title = Maxwell and the normal distribution: A colored story of probability, independence, and tendency towards equilibrium | journal = Studies in History and Philosophy of Modern Physics | volume = 57 | pages = 53–65 | year = 2017 | arxiv = 1702.01411 | bibcode = 2017SHPMP..57...53G | s2cid = 38272381 }}</ref> [[Ludwig Boltzmann]], with his [[H-theorem]] of 1872, also argued that due to collisions gases should over time tend toward the [[Maxwell–Boltzmann distribution]].

Due to [[Loschmidt's paradox]], derivations of the second law have to make an assumption regarding the past, namely that the system is [[Correlation and dependence|uncorrelated]] at some time in the past; this allows for a simple probabilistic treatment. This assumption is usually thought of as a [[boundary condition]], and thus the second law is ultimately a consequence of the initial conditions somewhere in the past, probably at the beginning of the universe (the [[Big Bang]]), though [[Boltzmann brain|other scenarios]] have also been suggested.<ref name="Hawking AOT">{{cite journal|last=Hawking|first=SW|title=Arrow of time in cosmology|journal=Phys. Rev. D|year=1985|volume=32|issue=10|pages=2489–2495|doi=10.1103/PhysRevD.32.2489|pmid=9956019|bibcode = 1985PhRvD..32.2489H }}</ref><ref>{{cite book | last = Greene | first = Brian | author-link = Brian Greene | title = The Fabric of the Cosmos | url = https://archive.org/details/fabricofcosmossp00gree | url-access = registration | publisher = Alfred A. Knopf | year = 2004 | page = [https://archive.org/details/fabricofcosmossp00gree/page/171 171] | isbn = 978-0-375-41288-2}}</ref><ref name=Lebowitz>{{cite journal|last=Lebowitz|first=Joel L.|title=Boltzmann's Entropy and Time's Arrow|journal=Physics Today|date=September 1993|volume=46|issue=9|pages=32–38|url=http://users.df.uba.ar/ariel/materias/FT3_2008_1C/papers_pdf/lebowitz_370.pdf|access-date=2013-02-22|doi=10.1063/1.881363|bibcode = 1993PhT....46i..32L }}</ref>

Given these assumptions, in statistical mechanics the second law is not a postulate; rather, it is a consequence of the [[Statistical mechanics#Fundamental postulate|fundamental postulate]], also known as the equal prior probability postulate, so long as one is clear that simple probability arguments are applied only to the future, while for the past there are auxiliary sources of information which tell us that it was of low entropy.{{citation needed|date=August 2012}}

The first part of the second law, which states that the entropy of a thermally isolated system can only increase, is a trivial consequence of the equal prior probability postulate, if we restrict the notion of the entropy to systems in thermal equilibrium. The entropy of an isolated system in thermal equilibrium containing an amount of energy <math>E</math> is:

: <math>S = k_{\mathrm B} \ln\left[\Omega\left(E\right)\right]</math>

where <math>\Omega\left(E\right)</math> is the number of quantum states in a small interval between <math>E</math> and <math>E +\delta E</math>. Here <math>\delta E</math> is a macroscopically small energy interval that is kept fixed.
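For illustration (a toy model, not taken from the cited sources), consider <math>N</math> independent two-level subsystems with level spacing <math>\epsilon</math>. If <math>n</math> of them are excited, the energy is <math>E = n\epsilon</math>, and the number of microstates with that energy (and hence in any interval <math>\delta E < \epsilon</math> containing it) is

: <math>\Omega\left(E=n\epsilon\right)=\binom{N}{n},</math>

so that, writing <math>p=n/N</math> and using [[Stirling's approximation]],

: <math>S = k_{\mathrm B}\ln\binom{N}{n}\approx -N k_{\mathrm B}\left[p\ln p+\left(1-p\right)\ln\left(1-p\right)\right].</math>

This makes explicit that <math>\Omega</math> is a counting quantity and that the entropy so defined is extensive, i.e. proportional to <math>N</math>.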
Strictly speaking, this means that the entropy depends on the choice of <math>\delta E</math>. However, in the thermodynamic limit (i.e. in the limit of infinitely large system size), the specific entropy (entropy per unit volume or per unit mass) does not depend on <math>\delta E</math>.

Suppose we have an isolated system whose macroscopic state is specified by a number of variables. These macroscopic variables can, e.g., refer to the total volume, the positions of pistons in the system, etc. Then <math>\Omega</math> will depend on the values of these variables. If a variable is not fixed (e.g. we do not clamp a piston in a certain position), then, because all the accessible states are equally likely in equilibrium, the free variable in equilibrium will be such that <math>\Omega</math> is maximized at the given energy of the isolated system,<ref name="Young&FreedmanIS">Young, H. D.; Freedman, R. A. (2004). ''University Physics'', 11th edition. Pearson. p. 731.</ref> as that is the most probable situation in equilibrium.

If the variable was initially fixed to some value then, upon release, once the new equilibrium has been reached, the fact that the variable adjusts itself so that <math>\Omega</math> is maximized implies that the entropy will have increased or stayed the same (if the value at which the variable was fixed happened to be the equilibrium value).

Suppose we start from an equilibrium situation and we suddenly remove a constraint on a variable. Then right after we do this, there are a number <math>\Omega</math> of accessible microstates, but equilibrium has not yet been reached, so the actual probabilities of the system being in some accessible state are not yet equal to the prior probability of <math>1/\Omega</math>. We have already seen that in the final equilibrium state, the entropy will have increased or have stayed the same relative to the previous equilibrium state. Boltzmann's [[H-theorem]], however, proves that the quantity {{math|''H''}} decreases monotonically as a function of time during the intermediate out-of-equilibrium state, which corresponds to a monotonic increase of the entropy.

=== Derivation of the entropy change for reversible processes ===

The second part of the second law states that the entropy change of a system undergoing a reversible process is given by:

: <math>dS =\frac{\delta Q}{T}</math>

where the temperature is defined as:

: <math>\frac{1}{k_{\mathrm B} T}\equiv\beta\equiv\frac{d\ln\left[\Omega\left(E\right)\right]}{dE}</math>

See ''[[Microcanonical ensemble]]'' for the justification of this definition.
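To illustrate this definition of temperature, consider a classical monatomic ideal gas of <math>N</math> particles in a volume <math>V</math> (a standard textbook example, not part of the derivation that follows). To leading order for large <math>N</math>, the number of accessible states scales as <math>\Omega\left(E\right)\propto V^{N}E^{3N/2}</math>, so that

: <math>\beta=\frac{d\ln\left[\Omega\left(E\right)\right]}{dE}=\frac{3N}{2E},</math>

i.e. <math>E=\tfrac{3}{2}N k_{\mathrm B} T</math>, the familiar result for the mean kinetic energy of a monatomic ideal gas.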
Suppose that the system has some external parameter, ''x'', that can be changed. In general, the energy eigenstates of the system will depend on ''x''. According to the [[adiabatic theorem]] of quantum mechanics, in the limit of an infinitely slow change of the system's Hamiltonian, the system will stay in the same energy eigenstate and thus change its energy according to the change in energy of the energy eigenstate it is in.

The generalized force, ''X'', corresponding to the external variable ''x'' is defined such that <math>X dx</math> is the work performed by the system if ''x'' is increased by an amount ''dx''. For example, if ''x'' is the volume, then ''X'' is the pressure. The generalized force for a system known to be in energy eigenstate <math>E_{r}</math> is given by:

: <math>X = -\frac{dE_{r}}{dx}</math>

Since the system can be in any energy eigenstate within an interval of <math>\delta E</math>, we define the generalized force for the system as the expectation value of the above expression:

: <math>X = -\left\langle\frac{dE_{r}}{dx}\right\rangle\,</math>

To evaluate the average, we partition the <math>\Omega\left(E\right)</math> energy eigenstates by counting how many of them have a value for <math>\frac{dE_{r}}{dx}</math> within a range between <math>Y</math> and <math>Y + \delta Y</math>. Calling this number <math>\Omega_{Y}\left(E\right)</math>, we have:

: <math>\Omega\left(E\right)=\sum_{Y}\Omega_{Y}\left(E\right)\,</math>

The average defining the generalized force can now be written:

: <math>X = -\frac{1}{\Omega\left(E\right)}\sum_{Y} Y\Omega_{Y}\left(E\right)\,</math>

We can relate this to the derivative of the entropy with respect to ''x'' at constant energy ''E'' as follows. Suppose we change ''x'' to ''x'' + ''dx''. Then <math>\Omega\left(E\right)</math> will change because the energy eigenstates depend on ''x'', causing energy eigenstates to move into or out of the range between <math>E</math> and <math>E+\delta E</math>. Let us focus again on the energy eigenstates for which <math display="inline">\frac{dE_{r}}{dx}</math> lies within the range between <math>Y</math> and <math>Y + \delta Y</math>. Since these energy eigenstates increase in energy by ''Y dx'', all such energy eigenstates that are in the interval ranging from ''E'' – ''Y'' ''dx'' to ''E'' move from below ''E'' to above ''E''. There are

: <math>N_{Y}\left(E\right)=\frac{\Omega_{Y}\left(E\right)}{\delta E} Y dx\,</math>

such energy eigenstates. If <math>Y dx\leq\delta E</math>, all these energy eigenstates will move into the range between <math>E</math> and <math>E+\delta E</math> and contribute to an increase in <math>\Omega</math>. The number of energy eigenstates that move from below <math>E+\delta E</math> to above <math>E+\delta E</math> is given by <math>N_{Y}\left(E+\delta E\right)</math>. The difference

: <math>N_{Y}\left(E\right) - N_{Y}\left(E+\delta E\right)\,</math>

is thus the net contribution to the increase in <math>\Omega</math>. If ''Y dx'' is larger than <math>\delta E</math>, there will be energy eigenstates that move from below ''E'' to above <math>E+\delta E</math>. They are counted in both <math>N_{Y}\left(E\right)</math> and <math>N_{Y}\left(E+\delta E\right)</math>, therefore the above expression is also valid in that case. Writing the above as a derivative with respect to ''E'' and summing over ''Y'' yields:

: <math>\left(\frac{\partial\Omega}{\partial x}\right)_{E} = -\sum_{Y}Y\left(\frac{\partial\Omega_{Y}}{\partial E}\right)_{x}= \left(\frac{\partial\left(\Omega X\right)}{\partial E}\right)_{x}\,</math>

The logarithmic derivative of <math>\Omega</math> with respect to ''x'' is thus given by:

: <math>\left(\frac{\partial\ln\left(\Omega\right)}{\partial x}\right)_{E} = \beta X +\left(\frac{\partial X}{\partial E}\right)_{x}\,</math>

The first term is intensive, i.e. it does not scale with system size. In contrast, the last term scales as the inverse system size and will thus vanish in the thermodynamic limit. We have thus found that:

: <math>\left(\frac{\partial S}{\partial x}\right)_{E} = \frac{X}{T}\,</math>

Combining this with

: <math>\left(\frac{\partial S}{\partial E}\right)_{x} = \frac{1}{T}\,</math>

gives:

: <math>dS = \left(\frac{\partial S}{\partial E}\right)_{x}dE+\left(\frac{\partial S}{\partial x}\right)_{E}dx = \frac{dE}{T} + \frac{X}{T} dx=\frac{\delta Q}{T}\,</math>
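As a consistency check, consider again the monatomic ideal gas (an illustrative example, not part of the general argument), with ''x'' taken to be the volume ''V'', so that ''X'' is the pressure ''P''. From <math>\Omega\left(E\right)\propto V^{N}E^{3N/2}</math> one has <math>\left(\partial\ln\Omega/\partial V\right)_{E}=N/V</math>, and the relation just derived gives

: <math>\frac{P}{T}=\left(\frac{\partial S}{\partial V}\right)_{E}=\frac{k_{\mathrm B}N}{V},</math>

i.e. the ideal gas law <math>PV=Nk_{\mathrm B}T</math>. The term dropped in the thermodynamic limit, <math>\left(\partial P/\partial E\right)_{V}=2/\left(3V\right)</math> (since <math>P=2E/3V</math> for this gas), is indeed negligible compared with <math>\beta P=N/V</math> for large <math>N</math>.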
=== Derivation for systems described by the canonical ensemble ===

If a system is in thermal contact with a heat bath at some temperature ''T'' then, in equilibrium, the probability distribution over the energy eigenvalues is given by the [[canonical ensemble]]:

: <math>P_{j}=\frac{\exp\left(-\frac{E_{j}}{k_{\mathrm B} T}\right)}{Z}</math>

Here ''Z'' is a factor that normalizes the sum of all the probabilities to 1; this function is known as the [[Partition function (statistical mechanics)|partition function]]. We now consider an infinitesimal reversible change in the temperature and in the external parameters on which the energy levels depend. It follows from the general formula for the entropy:

: <math>S = -k_{\mathrm B}\sum_{j}P_{j}\ln\left(P_{j}\right)</math>

that

: <math>dS = -k_{\mathrm B}\sum_{j}\ln\left(P_{j}\right)dP_{j}</math>

Inserting the formula for <math>P_{j}</math> for the canonical ensemble here gives:

: <math>dS = \frac{1}{T}\sum_{j}E_{j}dP_{j}=\frac{1}{T}\sum_{j}d\left(E_{j}P_{j}\right) - \frac{1}{T}\sum_{j}P_{j}dE_{j}= \frac{dE + \delta W}{T}=\frac{\delta Q}{T}</math>
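As a brief consistency check (not part of the cited derivation), the general formula <math>S = -k_{\mathrm B}\sum_{j}P_{j}\ln\left(P_{j}\right)</math> reduces to the microcanonical expression used earlier when all <math>\Omega</math> accessible states are equally probable, <math>P_{j}=1/\Omega</math>:

: <math>S = -k_{\mathrm B}\sum_{j=1}^{\Omega}\frac{1}{\Omega}\ln\left(\frac{1}{\Omega}\right)=k_{\mathrm B}\ln\left(\Omega\right),</math>

so the two definitions of entropy agree in the situation described by the equal prior probability postulate.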
=== Initial conditions at the Big Bang ===
{{Further|Past hypothesis}}

As elaborated above, it is thought that the second law of thermodynamics is a result of the very low-entropy initial conditions at the [[Big Bang]]. From a statistical point of view, these were very special conditions. On the other hand, they were quite simple, as the universe, or at least the part of it from which the [[observable universe]] developed, seems to have been extremely uniform.<ref>Carroll, S. (2017). ''The Big Picture: On the Origins of Life, Meaning, and the Universe Itself''. Penguin.</ref>

This may seem somewhat paradoxical, since in many physical systems uniform conditions (e.g. mixed rather than separated gases) have high entropy. The paradox is resolved once one realizes that gravitational systems have [[Heat capacity#Negative heat capacity|negative heat capacity]], so that when gravity is important, uniform conditions (e.g. a gas of uniform density) in fact have lower entropy compared to non-uniform ones (e.g. black holes in empty space).<ref>Greene, B. (2004). ''The Fabric of the Cosmos: Space, Time, and the Texture of Reality''. Knopf.</ref> Yet another approach is that the universe had high (or even maximal) entropy given its size, but as the universe grew it rapidly came out of thermodynamic equilibrium; its entropy increased only slightly compared to the increase in the maximal possible entropy, and thus it arrived at a very low entropy when compared to the much larger possible maximum given its later size.<ref>{{cite journal | last1 = Davies | first1 = P. C. | year = 1983 | title = Inflation and time asymmetry in the universe | journal = Nature | volume = 301 | issue = 5899 | pages = 398–400 | doi = 10.1038/301398a0 | bibcode = 1983Natur.301..398D }}</ref>

As for the reason why the initial conditions were such, one suggestion is that [[cosmological inflation]] was enough to wipe out non-smoothness, while another is that the universe was [[Hartle–Hawking state|created spontaneously]], where the mechanism of creation implies low-entropy initial conditions.<ref>[https://www.quantamagazine.org/physicists-debate-hawkings-idea-that-the-universe-had-no-beginning-20190606/ Wolchover, N. "Physicists Debate Hawking's Idea That the Universe Had No Beginning". ''Quanta Magazine'', June 6, 2019. Retrieved 2020-11-28.]</ref>