entropy
[ temporary import ]
please note:
- the content below is remote from Wikipedia
- it has been imported raw for GetWiki
This article is about entropy in thermodynamics. Not to be confused with enthalpy.







In statistical mechanics, entropy is an extensive property of a thermodynamic system. It is closely related to the number {{math|Ω}} of microscopic configurations (known as microstates) that are consistent with the macroscopic quantities that characterize the system (such as its volume, pressure and temperature). Entropy expresses the number {{math|Ω}} of different configurations that a system defined by macroscopic variables could assume (Ligrone, "Entropy", in Biological Innovations that Built the World, Springer, 2019, p. 478). Under the assumption that each microstate is equally probable, the entropy {{math|S}} is the natural logarithm of the number of microstates, multiplied by the Boltzmann constant {{math|kB}}. Formally (assuming equiprobable microstates),
S = k_\mathrm{B} \ln \Omega .
Macroscopic systems typically have a very large number {{math|Ω}} of possible microscopic configurations. For example, the entropy of an ideal gas is proportional to the number of gas molecules {{math|N}}. The number of molecules in twenty liters of gas at room temperature and atmospheric pressure is roughly {{math|N ≈ {{val|6|e=23}}}} (the Avogadro number).

The second law of thermodynamics states that the entropy of an isolated system never decreases over time. Isolated systems spontaneously evolve towards thermodynamic equilibrium, the state with maximum entropy. Non-isolated systems, such as organisms, may lose entropy, provided their environment's entropy increases by at least that amount, so that the total entropy increases. Therefore, total entropy in the Universe does increase. Entropy is a function of the state of the system, so the change in entropy of a system is determined by its initial and final states. In the idealization that a process is reversible, the total entropy does not change, while irreversible processes always increase the total entropy.

Because it is determined by the number of random microstates, entropy is related to the amount of additional information needed to specify the exact physical state of a system, given its macroscopic specification. For this reason, it is often said that entropy is an expression of the disorder, or randomness, of a system, or of the lack of information about it (Rietman et al., "Introduction: Entropy & Information", in Tumor Dormancy & Recurrence, Humana Press, 2017, p. 63). The concept of entropy plays a central role in information theory.
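As a rough numerical sketch (the microstate counts below are illustrative, not from the article), the Boltzmann formula can be evaluated directly, and doing so shows why entropy is extensive: the microstate counts of independent subsystems multiply, so their entropies add.

```python
import math

# Boltzmann constant in J/K (CODATA value)
K_B = 1.380649e-23

def boltzmann_entropy(omega: float) -> float:
    """Entropy S = k_B * ln(Omega) for Omega equiprobable microstates."""
    return K_B * math.log(omega)

# Two independent subsystems: their microstate counts multiply,
# so their entropies add -- entropy is an extensive property.
omega_1, omega_2 = 1e20, 1e30
s_total = boltzmann_entropy(omega_1 * omega_2)
s_sum = boltzmann_entropy(omega_1) + boltzmann_entropy(omega_2)
print(math.isclose(s_total, s_sum))  # True
```

A single microstate ({{math|Ω {{=}} 1}}) gives zero entropy, consistent with a perfectly specified system.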

History

File:Clausius.jpg|thumb|upright|Rudolf Clausius {{nowrap|(1822–1888),}} originator of the concept of entropy

The French mathematician Lazare Carnot proposed in his 1803 paper Fundamental Principles of Equilibrium and Movement that in any machine the accelerations and shocks of the moving parts represent losses of moment of activity. In other words, in any natural process there exists an inherent tendency towards the dissipation of useful energy. Building on this work, in 1824 Lazare's son Sadi Carnot published Reflections on the Motive Power of Fire, which posited that in all heat-engines, whenever "caloric" (what is now known as heat) falls through a temperature difference, work or motive power can be produced from the actions of its fall from a hot to a cold body. He drew an analogy with the way water falls in a water wheel. This was an early insight into the second law of thermodynamics (Wolfram Research, "Carnot, Sadi (1796–1832)", 2007). Carnot based his views of heat partially on the early 18th-century "Newtonian hypothesis" that both heat and light were types of indestructible forms of matter, which are attracted and repelled by other matter, and partially on the contemporary views of Count Rumford, who showed (1789) that heat could be created by friction, as when cannon bores are machined (McCulloch, Treatise on the Mechanical Theory of Heat and its Applications to the Steam-Engine, D. Van Nostrand, 1876). Carnot reasoned that if the body of the working substance, such as a body of steam, is returned to its original state at the end of a complete engine cycle, "no change occurs in the condition of the working body".

The first law of thermodynamics, deduced from the heat-friction experiments of James Joule in 1843, expresses the concept of energy and its conservation in all processes; the first law, however, is unable to quantify the effects of friction and dissipation.

In the 1850s and 1860s, the German physicist Rudolf Clausius objected to the supposition that no change occurs in the working body, and gave this "change" a mathematical interpretation by questioning the nature of the inherent loss of usable heat when work is done, e.g. heat produced by friction (Clausius, "Über die bewegende Kraft der Wärme und die Gesetze, welche sich daraus für die Wärmelehre selbst ableiten lassen" [On the Motive Power of Heat, and on the Laws which can be deduced from it for the Theory of Heat], Annalen der Physik 155(3), 1850). Clausius described entropy as the transformation-content, i.e. dissipative energy use, of a thermodynamic system or working body of chemical species during a change of state. This was in contrast to earlier views, based on the theories of Isaac Newton, that heat was an indestructible particle that had mass.

Later, scientists such as Ludwig Boltzmann, Josiah Willard Gibbs, and James Clerk Maxwell gave entropy a statistical basis. In 1877 Boltzmann visualized a probabilistic way to measure the entropy of an ensemble of ideal gas particles, in which he defined entropy to be proportional to the natural logarithm of the number of microstates such a gas could occupy. Henceforth, the essential problem in statistical thermodynamics has been to determine the distribution of a given amount of energy E over N identical systems. Carathéodory linked entropy with a mathematical definition of irreversibility, in terms of trajectories and integrability.

Etymology

In 1865, Clausius named the concept of {{mvar|S}}, "the differential of a quantity which depends on the configuration of the system", entropy (Entropie), after the Greek word τροπή, 'transformation' (Gillispie, The Edge of Objectivity: An Essay in the History of Scientific Ideas, Princeton University Press, 1960, p. 399). He gives "transformational content" (Verwandlungsinhalt) as a synonym, paralleling his "thermal and ergonal content" (Wärme- und Werkinhalt) as the name of {{mvar|U}}, but preferring the term entropy as a close parallel of the word energy, as he found the concepts to be nearly "analogous in their physical significance". The term was formed by replacing the root of ἔργον ('work') by that of τροπή ('transformation') (Clausius, "Ueber verschiedene für die Anwendung bequeme Formen der Hauptgleichungen der mechanischen Wärmetheorie", Annalen der Physik und Chemie 125(7), 1865, pp. 353–400). On p. 390 Clausius writes, in translation: "If one seeks a descriptive name for S, one could, much as the quantity U is said to be the heat and work content of the body, say of the quantity S that it is the transformation content of the body. Since I think it better, however, to take the names of such quantities, important for science, from the ancient languages, so that they can be used unchanged in all modern languages, I propose to call the quantity S, after the Greek word ἡ τροπή, transformation, the entropy of the body. I have deliberately formed the word entropy to be as similar as possible to the word energy, for the two quantities that these words denote are so closely related in their physical significance that a certain similarity in their names seems appropriate to me."

Definitions and descriptions

There are two equivalent definitions of entropy: the thermodynamic definition and the statistical mechanics definition. Historically, the classical thermodynamics definition developed first. In the classical thermodynamics viewpoint, the microscopic details of a system are not considered. Instead, the behavior of a system is described in terms of a set of empirically defined thermodynamic variables, such as temperature, pressure, entropy, and heat capacity. The classical thermodynamics description assumes a state of equilibrium, although more recently attempts have been made to develop useful definitions of entropy in nonequilibrium systems as well.

The statistical definition of entropy and other thermodynamic properties was developed later. In this viewpoint, thermodynamic properties are defined in terms of the statistics of the motions of the microscopic constituents of a system – modeled at first classically, e.g. Newtonian particles constituting a gas, and later quantum-mechanically (photons, phonons, spins, etc.).

Function of state

There are many thermodynamic properties that are functions of state. This means that at a particular thermodynamic state (which should not be confused with the microscopic state of a system), these properties have a definite value. Often, if two properties of the system are determined, then the state is determined and the other properties' values can also be determined. For instance, a quantity of gas at a particular temperature and pressure has its state fixed by those values and thus has a specific volume that is determined by those values. As another instance, a system composed of a pure substance of a single phase at a particular uniform temperature and pressure is determined (and is thus a particular state) and is at not only a particular volume but also at a particular entropy (J. A. McGovern, "2.5 Entropy", archived 23 September 2012). The fact that entropy is a function of state is one reason it is useful. In the Carnot cycle, the working fluid returns to the same state it had at the start of the cycle, hence the line integral of any state function, such as entropy, over this reversible cycle is zero.
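The state-function property can be illustrated numerically. The sketch below uses the standard ideal-gas entropy-change formula ΔS = n(CV ln(T₂/T₁) + R ln(V₂/V₁)), which is not derived in this article, and an arbitrary closed loop of states: because entropy depends only on the state, the changes summed around any loop return zero (up to floating-point rounding).

```python
import math

R = 8.314     # gas constant, J/(mol K)
CV = 1.5 * R  # molar heat capacity at constant volume, monatomic ideal gas

def delta_s(t1, v1, t2, v2, n=1.0):
    """Entropy change of n moles of ideal gas between states (T1, V1) and (T2, V2)."""
    return n * (CV * math.log(t2 / t1) + R * math.log(v2 / v1))

# A closed cycle of (T in K, V in m^3) states: the last state equals the first.
states = [(300, 0.02), (600, 0.02), (600, 0.05), (300, 0.05), (300, 0.02)]
total = sum(delta_s(*a, *b) for a, b in zip(states, states[1:]))
print(abs(total) < 1e-12)  # True: the loop integral of a state function vanishes
```

Any other closed loop of states gives the same cancellation; this is exactly the behavior the text describes for the Carnot cycle.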

Reversible process

Entropy is conserved for a reversible process. A reversible process is one that does not deviate from thermodynamic equilibrium while producing the maximum work. Any process that happens quickly enough to deviate from thermal equilibrium cannot be reversible. In these cases energy is lost to heat, total entropy increases, and the potential for maximum work to be done in the transition is also lost. More specifically, total entropy is conserved in a reversible process and not conserved in an irreversible process ("6.5 Irreversibility, Entropy Changes, and Lost Work", web.mit.edu). For example, in the Carnot cycle, while the heat flow from the hot reservoir to the cold reservoir represents an increase in entropy, the work output, if reversibly and perfectly stored in some energy storage mechanism, represents a decrease in entropy that could be used to operate the heat engine in reverse and return to the previous state; thus the total entropy change is still zero at all times if the entire process is reversible. An irreversible process increases entropy (Stephen Lower, "What is entropy?", www.chem1.com).
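A minimal sketch of the entropy bookkeeping for heat exchange between two reservoirs (the temperatures and heat quantity below are illustrative): heat flowing directly across a finite temperature difference is irreversible and raises total entropy, while in the reversible limit of equal temperatures the total change vanishes.

```python
def total_entropy_change(q, t_hot, t_cold):
    """Total entropy change when heat q flows from a reservoir at t_hot
    to one at t_cold: the hot side loses q/t_hot, the cold side gains q/t_cold."""
    return q / t_cold - q / t_hot

# 1000 J flowing from 500 K to 300 K: irreversible, total entropy rises.
ds = total_entropy_change(1000.0, 500.0, 300.0)
print(ds > 0)  # True (about 1.33 J/K)

# In the reversible limit the temperatures coincide and the change vanishes.
print(total_entropy_change(1000.0, 400.0, 400.0))  # 0.0
```

The positive result for unequal temperatures is the "lost work" the text refers to: that entropy production can never be undone by running the process backwards.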

Carnot cycle

The concept of entropy arose from Rudolf Clausius's study of the Carnot cycle (Lavenda, A New Perspective on Thermodynamics, Springer, 2010, §2.3.4). In a Carnot cycle, heat {{math|QH}} is absorbed isothermally at temperature {{math|TH}} from a 'hot' reservoir and given up isothermally as heat {{math|QC}} to a 'cold' reservoir at {{math|TC}}. According to Carnot's principle, work can only be produced by the system when there is a temperature difference, and the work should be some function of the difference in temperature and the heat absorbed ({{math|QH}}). Carnot did not distinguish between {{math|QH}} and {{math|QC}}, since he was using the incorrect hypothesis that caloric theory was valid, and hence heat was conserved (the incorrect assumption that {{math|QH}} and {{math|QC}} were equal), when, in fact, {{math|QH}} is greater than {{math|QC}} (Carnot, Reflexions on the Motive Power of Fire, Lilian Barber Press, 1986, p. 26; Truesdell, The Tragicomical History of Thermodynamics 1822–1854, Springer, 1980, pp. 78–85). Through the efforts of Clausius and Kelvin, it is now known that the maximum work that a heat engine can produce is the product of the Carnot efficiency and the heat absorbed from the hot reservoir:

W = \left(\frac{T_\text{H} - T_\text{C}}{T_\text{H}}\right) Q_\text{H} = \left(1 - \frac{T_\text{C}}{T_\text{H}}\right) Q_\text{H} \qquad (1)

To derive the Carnot efficiency, which is {{math|1 − TC/TH}} (a number less than one), Kelvin had to evaluate the ratio of the work output to the heat absorbed during the isothermal expansion with the help of the Carnot–Clapeyron equation, which contained an unknown function known as the Carnot function. The possibility that the Carnot function could be the temperature as measured from a zero temperature was suggested by Joule in a letter to Kelvin. This allowed Kelvin to establish his absolute temperature scale (Maxwell, Theory of Heat, Dover Publications, 2001, pp. 115–158). It is also known that the work produced by the system is the difference between the heat absorbed from the hot reservoir and the heat given up to the cold reservoir:

W = Q_\text{H} - Q_\text{C} \qquad (2)

Since the latter is valid over the entire cycle, this gave Clausius the hint that at each stage of the cycle, work and heat would not be equal, but rather their difference would be a state function that would vanish upon completion of the cycle. The state function was called the internal energy, and it became the first law of thermodynamics (Clausius, The Mechanical Theory of Heat, J. Van Voorst, 1867, p. 28). Now equating (1) and (2) gives
\frac{Q_\text{H}}{T_\text{H}} - \frac{Q_\text{C}}{T_\text{C}} = 0
or
\frac{Q_\text{H}}{T_\text{H}} = \frac{Q_\text{C}}{T_\text{C}}
This implies that there is a function of state which is conserved over a complete cycle of the Carnot cycle. Clausius called this state function entropy. One can see that entropy was discovered through mathematics rather than through laboratory results. It is a mathematical construct and has no easy physical analogy. This makes the concept somewhat obscure or abstract, akin to how the concept of energy arose.Clausius then asked what would happen if there should be less work produced by the system than that predicted by Carnot's principle. The right-hand side of the first equation would be the upper bound of the work output by the system, which would now be converted into an inequality
W < \left(1 - \frac{T_\text{C}}{T_\text{H}}\right) Q_\text{H}
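The Carnot relations above can be checked with illustrative numbers (all values hypothetical): equation (1) gives the maximum work, equation (2) then fixes the rejected heat, and the quantity Clausius found to be conserved, Q/T, comes out equal at both reservoirs.

```python
def carnot_work(q_hot, t_hot, t_cold):
    """Maximum work from heat q_hot absorbed at t_hot, rejecting to t_cold (Eq. 1)."""
    return (1.0 - t_cold / t_hot) * q_hot

q_hot, t_hot, t_cold = 1200.0, 600.0, 300.0
w_max = carnot_work(q_hot, t_hot, t_cold)  # Carnot efficiency 0.5 -> 600 J
q_cold = q_hot - w_max                     # Eq. 2: heat rejected to the cold reservoir

# Clausius's observation: for the ideal (reversible) cycle, Q_H/T_H equals Q_C/T_C.
print(w_max)             # 600.0
print(q_hot / t_hot)     # 2.0
print(q_cold / t_cold)   # 2.0
```

A real engine produces less work than `w_max`; then more heat reaches the cold reservoir, Q_C/T_C exceeds Q_H/T_H, and the total entropy increases, which is the inequality above.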


- content above as imported from Wikipedia
- "entropy" does not exist on GetWiki (yet)
- time: 4:29am EDT - Mon, Sep 16 2019
[ this remote article is provided by Wikipedia ]