entropy
This article is about entropy in thermodynamics. For a more accessible and less technical introduction, see Introduction to entropy. Not to be confused with enthalpy.
In statistical mechanics, entropy is an extensive property of a thermodynamic system. It is closely related to the number Ω of microscopic configurations (known as microstates) that are consistent with the macroscopic quantities that characterize the system (such as its volume, pressure and temperature). Under the assumption that each microstate is equally probable, the entropy S is the natural logarithm of the number of microstates, multiplied by the Boltzmann constant k_B. Formally (assuming equiprobable microstates),
S = k_\mathrm{B} \ln \Omega .
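As a minimal illustrative sketch (a toy system chosen here for clarity, not taken from the article), consider N independent two-state spins, each equally likely to point up or down, so that Ω = 2^N. The Boltzmann formula then gives

S = k_\mathrm{B} \ln 2^N = N k_\mathrm{B} \ln 2 ,

showing directly that the entropy grows linearly with the number of constituents, which is why it is an extensive property.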
Macroscopic systems typically have a very large number Ω of possible microscopic configurations. For example, the entropy of an ideal gas is proportional to the number of gas molecules N. The number of molecules in twenty liters of gas at room temperature and atmospheric pressure is roughly N ≈ 6×10²³ (the Avogadro number). At equilibrium, each of the Ω ≈ e^N configurations can be regarded as random and equally likely.

The second law of thermodynamics states that the entropy of an isolated system never decreases over time. Such systems spontaneously evolve towards thermodynamic equilibrium, the state with maximum entropy. Non-isolated systems may lose entropy, provided their environment's entropy increases by at least that amount, so that the total entropy increases. Entropy is a function of the state of the system, so the change in entropy of a system is determined by its initial and final states. In the idealization that a process is reversible, the entropy does not change, while irreversible processes always increase the total entropy.

Because it is determined by the number of random microstates, entropy is related to the amount of additional information needed to specify the exact physical state of a system, given its macroscopic specification. For this reason, it is often said that entropy is an expression of the disorder, or randomness, of a system, or of the lack of information about it. The concept of entropy plays a central role in information theory.

The Boltzmann constant, and therefore entropy, have dimensions of energy divided by temperature, with the unit joules per kelvin (J⋅K⁻¹) in the International System of Units (or kg⋅m²⋅s⁻²⋅K⁻¹ in terms of base units). The entropy of a substance is usually given as an intensive property: either entropy per unit mass (SI unit: J⋅K⁻¹⋅kg⁻¹) or entropy per unit amount of substance (SI unit: J⋅K⁻¹⋅mol⁻¹).
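The link to information can be made quantitative with a small worked identity (standard, and stated here only for clarity): specifying which of Ω equally likely microstates the system occupies requires log₂ Ω bits, so

S = k_\mathrm{B} \ln \Omega = (k_\mathrm{B} \ln 2) \log_2 \Omega .

Up to the constant factor k_B ln 2 ≈ 9.57×10⁻²⁴ J⋅K⁻¹, the thermodynamic entropy is therefore the number of bits of microscopic information missing from the macroscopic description.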

History

[Image: Rudolf Clausius (1822–1888), originator of the concept of entropy]

The French mathematician Lazare Carnot proposed in his 1803 paper Fundamental Principles of Equilibrium and Movement that in any machine the accelerations and shocks of the moving parts represent losses of moment of activity. In other words, in any natural process there exists an inherent tendency towards the dissipation of useful energy. Building on this work, in 1824 Lazare's son Sadi Carnot published Reflections on the Motive Power of Fire, which posited that in all heat-engines, whenever "caloric" (what is now known as heat) falls through a temperature difference, work or motive power can be produced from the actions of its fall from a hot to a cold body. He drew an analogy with the way water falls in a water wheel. This was an early insight into the second law of thermodynamics ("Carnot, Sadi (1796–1832)", Wolfram Research, 2007). Carnot based his views of heat partially on the early 18th-century "Newtonian hypothesis" that both heat and light were types of indestructible forms of matter, which are attracted and repelled by other matter, and partially on the contemporary views of Count Rumford, who showed in 1789 that heat could be created by friction, as when cannon bores are machined (McCulloch, Treatise on the Mechanical Theory of Heat and its Applications to the Steam-Engine, D. Van Nostrand, 1876). Carnot reasoned that if the body of the working substance, such as a body of steam, is returned to its original state at the end of a complete engine cycle, "no change occurs in the condition of the working body".

The first law of thermodynamics, deduced from the heat-friction experiments of James Joule in 1843, expresses the concept of energy and its conservation in all processes; the first law, however, is unable to quantify the effects of friction and dissipation.

In the 1850s and 1860s, the German physicist Rudolf Clausius objected to the supposition that no change occurs in the working body, and gave this "change" a mathematical interpretation by questioning the nature of the inherent loss of usable heat when work is done, e.g. heat produced by friction (Clausius, "Über die bewegende Kraft der Wärme" [On the Motive Power of Heat], Poggendorff's Annalen der Physik und Chemie, 1850). Clausius described entropy as the transformation-content, i.e. dissipative energy use, of a thermodynamic system or working body of chemical species during a change of state. This was in contrast to earlier views, based on the theories of Isaac Newton, that heat was an indestructible particle that had mass.

Later, scientists such as Ludwig Boltzmann, Josiah Willard Gibbs, and James Clerk Maxwell gave entropy a statistical basis. In 1877 Boltzmann visualized a probabilistic way to measure the entropy of an ensemble of ideal gas particles, in which he defined entropy to be proportional to the natural logarithm of the number of microstates such a gas could occupy. Henceforth, the essential problem in statistical thermodynamics has been to determine the distribution of a given amount of energy E over N identical systems. Carathéodory linked entropy with a mathematical definition of irreversibility, in terms of trajectories and integrability.

Definitions and descriptions

There are two related definitions of entropy: the thermodynamic definition and the statistical mechanics definition. Historically, the classical thermodynamics definition developed first. In the classical thermodynamics viewpoint, the system is composed of very large numbers of constituents (atoms, molecules) and the state of the system is described by the average thermodynamic properties of those constituents; the details of the system's constituents are not directly considered, but their behavior is described by macroscopically averaged properties, e.g. temperature, pressure, entropy, heat capacity. The early classical definition of the properties of the system assumed equilibrium. The classical thermodynamic definition of entropy has more recently been extended into the area of non-equilibrium thermodynamics.

Later, the thermodynamic properties, including entropy, were given an alternative definition in terms of the statistics of the motions of the microscopic constituents of a system, modeled at first classically, e.g. as Newtonian particles constituting a gas, and later quantum-mechanically (photons, phonons, spins, etc.). The statistical mechanics description of a system's behavior is necessary because the classical thermodynamic definition of its properties becomes an increasingly unreliable method of predicting the final state of a system that is subject to some process.

Function of state

There are many thermodynamic properties that are functions of state. This means that at a particular thermodynamic state (which should not be confused with the microscopic state of a system), these properties have a definite value. Often, if two properties of the system are determined, then the state is determined and the other properties' values can also be determined. For instance, a quantity of gas at a particular temperature and pressure has its state fixed by those values and thus has a specific volume that is determined by those values. As another instance, a system composed of a pure substance of a single phase at a particular uniform temperature and pressure is determined (and is thus a particular state) and has not only a particular volume but also a particular entropy (J. A. McGovern, "2.5 Entropy"). The fact that entropy is a function of state is one reason it is useful. In the Carnot cycle, the working fluid returns to the same state it had at the start of the cycle, hence the line integral of any state function, such as entropy, over this reversible cycle is zero.
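In symbols, and using the standard Clausius definition dS = δQ_rev/T (stated here for clarity rather than taken from the article), the state-function property reads

\oint dS = \oint \frac{\delta Q_\text{rev}}{T} = 0 ,

whereas the heat δQ itself is path-dependent, so ∮ δQ is in general nonzero; this is exactly the distinction between a state function and a process quantity.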

Reversible process

Entropy is conserved for a reversible process. A reversible process is one that does not deviate from thermodynamic equilibrium, while producing the maximum work. Any process that happens quickly enough to deviate from thermal equilibrium cannot be reversible. In these cases energy is lost to heat, total entropy increases, and the potential for maximum work to be done in the transition is also lost. More specifically, total entropy is conserved in a reversible process and not conserved in an irreversible process ("6.5 Irreversibility, Entropy Changes, and Lost Work", web.mit.edu). For example, in the Carnot cycle, while the heat flow from the hot reservoir to the cold reservoir represents an increase in entropy, the work output, if reversibly and perfectly stored in some energy storage mechanism, represents a decrease in entropy that could be used to operate the heat engine in reverse and return to the previous state; thus the total entropy change is still zero at all times if the entire process is reversible. An irreversible process increases entropy (Stephen Lower, "What is entropy?", www.chem1.com).
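This bookkeeping can be made concrete with a short Python sketch. The reservoir temperatures and the heat transferred are values assumed purely for the example, not taken from the article:

# Illustrative sketch: entropy bookkeeping for heat transfer between two
# reservoirs (all numerical values are assumed for this example).
T_hot = 600.0   # temperature of the hot reservoir, in kelvin
T_cold = 300.0  # temperature of the cold reservoir, in kelvin
Q = 1200.0      # heat leaving the hot reservoir, in joules

# Irreversible case: Q flows directly from hot to cold, so the total
# entropy of the two reservoirs increases.
dS_irreversible = -Q / T_hot + Q / T_cold
print(f"Direct transfer: dS_total = {dS_irreversible:+.3f} J/K")  # +2.000 J/K

# Reversible case: a Carnot engine extracts the maximum work and rejects
# only Q_cold = Q * T_cold / T_hot to the cold reservoir, so the decrease
# in the hot reservoir's entropy is exactly balanced.
Q_cold = Q * T_cold / T_hot
dS_reversible = -Q / T_hot + Q_cold / T_cold
print(f"Carnot transfer: dS_total = {dS_reversible:+.3f} J/K")  # +0.000 J/K

Run as written, the direct transfer yields a total entropy increase of +2 J/K, while the reversible route yields exactly zero, matching the claim above.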

Carnot cycle

The concept of entropy arose from Rudolf Clausius's study of the Carnot cycle (Lavenda, A New Perspective on Thermodynamics, Springer, 2010, §2.3.4). In a Carnot cycle, heat Q_H is absorbed isothermally at temperature T_H from a 'hot' reservoir and given up isothermally as heat Q_C to a 'cold' reservoir at T_C. According to Carnot's principle, work can only be produced by the system when there is a temperature difference, and the work should be some function of the difference in temperature and the heat absorbed (Q_H). Carnot did not distinguish between Q_H and Q_C, since he was using the incorrect hypothesis that caloric theory was valid, and hence heat was conserved (the incorrect assumption that Q_H and Q_C were equal) when, in fact, Q_H is greater than Q_C (Carnot and Fox, Reflexions on the Motive Power of Fire, Lilian Barber Press, 1986, p. 26; Truesdell, The Tragicomical History of Thermodynamics 1822–1854, Springer, 1980, pp. 78–85). Through the efforts of Clausius and Kelvin, it is now known that the maximum work that a heat engine can produce is the product of the Carnot efficiency and the heat absorbed from the hot reservoir:

W = \left(\frac{T_\text{H} - T_\text{C}}{T_\text{H}}\right) Q_\text{H} = \left(1 - \frac{T_\text{C}}{T_\text{H}}\right) Q_\text{H}   (1)

To derive the Carnot efficiency, which is 1 − T_C/T_H (a number less than one), Kelvin had to evaluate the ratio of the work output to the heat absorbed during the isothermal expansion with the help of the Carnot–Clapeyron equation, which contained an unknown function known as the Carnot function. The possibility that the Carnot function could be the temperature as measured from a zero temperature was suggested by Joule in a letter to Kelvin; this allowed Kelvin to establish his absolute temperature scale (Clerk Maxwell, Theory of Heat, Dover, 2001, pp. 115–158). It is also known that the work produced by the system is the difference between the heat absorbed from the hot reservoir and the heat given up to the cold reservoir:

W = Q_\text{H} - Q_\text{C}   (2)

Since the latter is valid over the entire cycle, this gave Clausius the hint that at each stage of the cycle, work and heat would not be equal, but rather their difference would be a state function that would vanish upon completion of the cycle. The state function was called the internal energy, and this became the first law of thermodynamics (Clausius, The Mechanical Theory of Heat, J. Van Voorst, 1867, p. 28).

Now equating (1) and (2) gives
\frac{Q_\text{H}}{T_\text{H}} - \frac{Q_\text{C}}{T_\text{C}} = 0
or
\frac{Q_\text{H}}{T_\text{H}} = \frac{Q_\text{C}}{T_\text{C}}
This implies that there is a function of state which is conserved over a complete cycle of the Carnot cycle. Clausius called this state function entropy. One can see that entropy was discovered through mathematics rather than through laboratory results. It is a mathematical construct with no easy physical analogy, which makes the concept somewhat obscure or abstract, akin to how the concept of energy arose.

Clausius then asked what would happen if the system should produce less work than that predicted by Carnot's principle. The right-hand side of the first equation would be the upper bound of the work output by the system, and the equation would now be converted into an inequality:
W < \left(1 - \frac{T_\text{C}}{T_\text{H}}\right) Q_\text{H}
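As an illustrative numerical check of relations (1) and (2), take assumed reservoir temperatures T_H = 500 K and T_C = 300 K and absorbed heat Q_H = 1000 J (values chosen for the example, not taken from the article). Then

W = \left(1 - \frac{300}{500}\right) \times 1000 \ \text{J} = 400 \ \text{J}, \qquad Q_\text{C} = Q_\text{H} - W = 600 \ \text{J} ,

and indeed Q_H/T_H = 1000/500 = 2 J/K equals Q_C/T_C = 600/300 = 2 J/K, so the quantity Q/T is conserved over the reversible cycle, exactly as the equality above requires.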

