Entropy

Entropy measures disorder

Entropy is a probabilistic property

Entropy is additive

Entropy is not conserved

Resources

Entropy is a physical quantity that is primarily a measure of the thermodynamic disorder of a physical system. Entropy has the unique property that its global value must always increase or stay the same.

This property is reflected in the second law of thermodynamics. The fact that entropy must always increase in natural processes introduces the concept of irreversibility and defines a unique direction for the flow of time.

Entropy is a property of all physical systems, and its behavior is described by the second law of thermodynamics (thermodynamics being the study of heat). The first law of thermodynamics states that the total energy of an isolated system is constant; the second law states that the entropy of an isolated system must stay the same or increase. Note that entropy, unlike energy, is not conserved but can increase. A system's entropy can also decrease, but only if it is part of a larger system whose overall entropy increases.

French mathematician and engineer Comte Lazare Nicolas Marguerite Carnot (1753–1823) wrote indirectly about entropy when he observed that the motions and movements of machines represent losses of "moment of activity," that is, losses of useful energy to the system. What is now known as heat was called "caloric" by French physicist, mathematician, and engineer Nicolas Léonard Sadi Carnot (who was Lazare Carnot's son), and his analysis of heat engines laid the groundwork for the entropy concept. The concept of entropy was first articulated in 1850 by German physicist Rudolf Clausius (1822–1888), who coined the word itself in 1865. Entropy does not correspond to any property of matter that scientists can sense, such as temperature, and so it is not easy to conceptualize. It can be roughly equated with the amount of energy in a system that is not available for work or, alternatively, with the orderliness of a system, but it is not precisely given by either of these concepts. A basic intuitive grasp of what entropy means can be gained from statistical mechanics, as described below. Still later, American mathematical physicist Josiah Willard Gibbs (1839–1903), Austrian physicist Ludwig Boltzmann (1844–1906), and Scottish physicist James Clerk Maxwell (1831–1879), among others, helped to develop the statistical treatment of entropy.

On a fundamental level, entropy is related to the number of possible physical states of a system by S = k log(γ), where S represents the entropy, k is Boltzmann's constant, and γ is the number of possible states of the system.

Consider a system of three independent atoms that are capable of storing energy in quanta, or units of fixed size ε. If there happen to be only three units of energy in this system, how many possible microstates (that is, distinct ways of distributing the energy among the atoms) are there? This question is most easily answered, for this example, by listing all the possibilities. There are 10 possible configurations.

Let n0 stand for the number of atoms in the system with 0ε energy, n1 for the number with 1ε, n2 for the number with 2ε, and n3 for the number with 3ε. For example, in the microstates labeled 1, 2, and 3 in the figure that accompanies this article, (n0, n1, n2, n3) = (2, 0, 0, 1); that is, two atoms have 0ε energy, no atoms have 1ε or 2ε, and one atom has 3ε.

Each class or group of microstates corresponds to a distinct (n0, n1, n2, n3) distribution. There are three possible distributions, and if P stands for the number of microstates corresponding to each distribution, then P equals 3, 6, or 1. These values of P can be verified by counting the microstates that realize each energy distribution for a system of three atoms sharing three units of energy; the figure that accompanies this article lists the number of possible microstates P for each distribution.

The distribution P2, representing (n0, n1, n2, n3) = (1, 1, 1, 0), has the most possible microstates (six). If one assumes that this system is constantly and randomly shifting from one microstate to another, that any microstate is equally likely to follow any other, and that one inspects the system at some randomly chosen instant, then one is most likely to observe a microstate corresponding to distribution P2. Specifically, the probability of observing a microstate corresponding to distribution P2 is 0.6 (6 chances out of 10), the probability of observing distribution P1 is 0.3 (3 chances out of 10), and the probability of observing distribution P3 is 0.1 (1 chance out of 10).
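
To make this counting concrete, the following short Python sketch (an editorial illustration, not part of the original article; the variable names are arbitrary) enumerates the microstates of three atoms sharing three quanta and groups them by distribution:

    from itertools import product
    from collections import Counter

    ATOMS = 3     # three independent atoms
    QUANTA = 3    # three quanta of energy, each of size epsilon

    # A microstate assigns a whole number of quanta to each atom,
    # with the assignments summing to the total energy available.
    microstates = [m for m in product(range(QUANTA + 1), repeat=ATOMS)
                   if sum(m) == QUANTA]
    print(len(microstates))    # 10 possible configurations

    # Classify each microstate by its (n0, n1, n2, n3) distribution:
    # how many atoms hold 0, 1, 2, or 3 quanta.
    def distribution(microstate):
        return tuple(microstate.count(q) for q in range(QUANTA + 1))

    counts = Counter(distribution(m) for m in microstates)
    for dist, P in sorted(counts.items(), key=lambda kv: -kv[1]):
        print(dist, P, P / len(microstates))
    # (1, 1, 1, 0)  P = 6  probability 0.6   (distribution P2)
    # (2, 0, 0, 1)  P = 3  probability 0.3   (distribution P1)
    # (0, 3, 0, 0)  P = 1  probability 0.1   (distribution P3)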

The entropy of this or any other system, S, is defined as S = k ln(Pmax), where Pmax is the number of microstates corresponding to the most probable distribution of the system (Pmax = 6 in this example), k is the Boltzmann constant (about 1.381 × 10^-16 erg per kelvin, or 1.381 × 10^-23 joule per kelvin), and ln is the natural logarithm. Further inspection of this equation and the three-atom example given above will clarify some of the basic properties of entropy.
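
Continuing the illustrative sketch above (an editorial addition; the numerical value of k follows the paragraph above), the entropy of the three-atom system works out to a very small number:

    import math

    k = 1.381e-23    # Boltzmann constant in joules per kelvin (approximate)
    P_max = 6        # microstates of the most probable distribution found above

    S = k * math.log(P_max)    # natural logarithm
    print(S)                   # about 2.5e-23 J/K: tiny, as expected for only three atoms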

Entropy measures disorder

(1) Microstates 1, 2, and 3 of the three-atom system described above, those in which the energy available to the system is concentrated entirely in one atom, are in some sense clearly the most orderly or organized. Yet these three microstates (distribution P1) are also unlikely; their total probability of occurrence at any moment is only half that of distribution P2. Order is less likely than disorder.

Entropy is a probabilistic property

(2) Any system might transition, at any time, to one of its less probable states, because energy can, in a sense, go anywhere it wants to; it is simply much less likely to go some places than others. In systems containing trillions of atoms or molecules, such as a roomful of air, the probability that the system will transition to a highly organized state analogous to microstate 1 in the figure (say, that all the energy in the room will concentrate itself in one molecule while all the rest cool to absolute zero) is extremely small. One would have to wait many trillions of years for such an event to happen even once.
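
The same counting used for the three-atom example shows how quickly such concentration becomes improbable as a system grows. In a sketch under the assumption that N atoms share N quanta (an editorial illustration, not from the article), the total number of microstates is the binomial coefficient C(2N - 1, N), and only N of them put all the energy on a single atom:

    from math import comb

    def prob_all_energy_on_one_atom(n):
        """Probability that all n quanta sit on a single atom when n atoms
        share n quanta and every microstate is equally likely."""
        total = comb(2 * n - 1, n)    # stars-and-bars count of microstates
        return n / total              # n ways to pick the one energetic atom

    for n in (3, 10, 30, 100):
        print(n, prob_all_energy_on_one_atom(n))
    # 3    0.3            (the 3-in-10 chance of distribution P1 above)
    # 10   about 1.1e-4
    # 30   about 5.1e-16
    # 100  about 2.2e-57

Even at a hundred atoms the odds are negligible; for the vastly larger number of molecules in a roomful of air they are unimaginably smaller.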

Entropy is additive

If two systems with entropies S1 and S2, respectively, are brought into contact with each other, their combined entropy equals S1 + S2. This result is assured by the logarithmic relationship between S and Pmax, as follows: if the number of microstates corresponding to the most likely distribution of the first system is Pmax1 and that of the second system is Pmax2, then the number of possible microstates of the most likely distribution of the combined system is simply Pmax1 × Pmax2. It is a fundamental property of logarithms that ln(ab) = ln(a) + ln(b), so if S1+2 is the entropy of the combined system, then:

S1+2 = k ln(Pmax1 × Pmax2)
     = k [ln(Pmax1) + ln(Pmax2)]
     = k ln(Pmax1) + k ln(Pmax2)
     = S1 + S2
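
A quick numerical check of this additivity, reusing the three-atom system above for both subsystems (an illustrative sketch only), confirms the logarithmic identity:

    import math

    k = 1.381e-23          # Boltzmann constant in joules per kelvin (approximate)
    P_max1 = P_max2 = 6    # most probable distribution of each three-atom subsystem

    S_combined = k * math.log(P_max1 * P_max2)
    S_separate = k * math.log(P_max1) + k * math.log(P_max2)
    print(math.isclose(S_combined, S_separate))    # True: ln(ab) = ln(a) + ln(b)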

Entropy is not conserved

All that is needed to increase the entropy of an isolated system is to increase the number of microstates its particles can occupy, for example by allowing the system to occupy more space. It is beyond the scope of this discussion to prove that the entropy of a closed system can never decrease, but this can be made plausible by considering the first law of thermodynamics, which forbids energy from being created or destroyed. As long as a system has the same number of atoms and the same number of quanta of energy to share among them, it is plausible that the system possesses a minimum number of possible microstates, and hence a minimum entropy.

It is sometimes claimed that "entropy always increases," and that the second law requires that disorder must always increase when nature is left to its own devices. This is incorrect. Note that in the above example, a system of three independent atoms is stipulated; yet atoms rarely behave independently when in proximity to each other at low temperatures. They tend to form bonds, spontaneously organizing themselves into orderly structures (molecules and crystals). Order from disorder is, therefore, just as natural a process as disorder from order. At low temperatures, self-ordering predominates; at high temperatures, entropy effects dominate (i.e., order is broken down). Furthermore, any system that is not isolated can experience decreased entropy (increasing order) at the expense of increasing entropy elsewhere. Earth, which shares the solar system with the Sun (whose entropy is increasing rapidly), is one such non-isolated system. It is therefore an error to claim, as some people do, that biological evolution, which involves spontaneously increasing order in natural molecular systems, contradicts thermodynamics. Entropy does not forbid molecular self-organization because entropy is only one property of matter; entropy does discourage self-organization, but other properties of matter encourage it and, in some circumstances (especially at relatively low temperatures, as on the surface of Earth), will prevail.

An alternative derivation of the entropy concept, based on the properties of heat engines (devices that turn heat flows partially into mechanical work), is often presented in textbooks. This method produces a definition of entropy that seems to differ from S = k ln(Pmax), namely dS = dQrev/T, where dS is an infinitesimal (very small) change in a system's entropy at the fixed temperature T when an infinitesimal quantity of heat, dQrev, flows reversibly into or out of the system. However, it can be shown that these definitions are exactly equivalent; indeed, the entropy concept was originally developed from the analysis of heat engines, and the statistical interpretation given above was not invented until later.
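
For a finite process at constant temperature this relation integrates to ΔS = Qrev/T. As a rough worked example (an editorial addition; the latent-heat figure of about 334 joules per gram of melting ice is a standard textbook value, not taken from this article), melting one gram of ice at 0°C gives:

    Q_rev = 334.0    # heat absorbed reversibly by 1 g of melting ice, in joules (approximate)
    T = 273.15       # melting temperature of ice, in kelvins

    delta_S = Q_rev / T
    print(delta_S)   # about 1.22 J/K gained as the ordered crystal becomes liquid water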

The entropy concept is fundamental to all science that deals with heat, efficiency, the energy of systems, chemical reactions, very low temperatures, and related topics. Its physical meaning is, in essence, that the amount of work the universe can perform is always declining as its orderliness declines, and must eventually approach zero. In other words, things are running down, and there is no way to stop them.

Resources

BOOKS

Greven, Andreas, et al., eds. Entropy. Princeton, NJ: Princeton University Press, 2003.

Jost, Jürgen. Dynamical Systems: Examples of Complex Behavior. Berlin, Germany, and New York: Springer, 2005.

Kirwan, A.D. Mother Nature's Two Laws: Ringmasters for Circus Earth (Lessons on Entropy, Energy, Critical Thinking, and the Practice of Science). Singapore and River Edge, NJ: World Scientific, 2002.

Lee, Joon Chang. Thermal Physics: Entropy and Free Energies. River Edge, NJ: World Scientific Publishing Co., Inc., 2002.

K. Lee Lerner

Larry Gilman

Entropy

Entropy is a thermodynamic quantity whose value depends on the physical state or condition of a system. It is useful in physics as a means of expressing the second law of thermodynamics. That is, while the law may be stated in terms of the impossibility of extracting heat from a reservoir and converting it totally into usable work, in terms of entropy the law states that any changes occurring in a system that is thermally isolated from its surroundings are such that its entropy never decreases.

This behavior corresponds to the fact that entropy is a measure of the disorder of a system. On average, all of nature proceeds toward a greater state of disorder. Examples of irreversible progression toward disorder are pervasive in the world and in everyday experience. Bread crumbs will never gather back into the loaf. Helium atoms that escape from a balloon never return. A drop of ink placed in a glass of water will eventually color the water uniformly and never reassemble into its original drop.

Entropy as a measure of disorder can be shown to depend on the probability that the particles of a system are in a given state of order. The tendency for entropy to increase occurs because the number of possible states of disorder that a system can assume is greater than the number of more ordered states, making a state of disorder more probable. For example, the entropy of the ordered state of the water molecules in an ice crystal is less than it is when the crystal is melted to liquid water. The entropy difference involved corresponds to the transfer of heat to the crystal in order to melt it.

It may appear that there are exceptions to the general rule of ultimate progression toward disorder; the growth of crystals, plants, animals, and humans provides remarkable examples of order and organization. However, these are open systems that exchange matter and energy with their surroundings for their growth and sustenance. If a composite of the system plus its environment is considered, then it can always be shown that its entropy will never decrease, as long as the composite system is isolated.

In physics, the change in entropy is defined as the ratio of the heat absorbed by a system to its absolute temperature (i.e., temperature measured on the Kelvin scale). When a certain amount of heat passes to a system from one at a higher temperature, the entropy of the two systems combined increases. This is an irreversible process characterizing the general tendency of matter to seek temperature equilibrium, a state of maximum entropy or disorder.

This progressive tendency of nature toward disorder has been considered by many scholars as one of the primal natural processes that serve as a gauge for the irreversible nature of time. Accordingly, a considerable number have identified the relentless increase of entropy with what they term the thermodynamic arrow of time. In addition, the degradation associated with the increase of entropy has been discussed by some scholars of science and religion as a meaningful metaphor for evil.

See also Disorder; Thermodynamics, Second Law of


Bibliography

Feynman, Richard P. The Feynman Lectures on Physics, Vol. 1. Reading, Mass.: Addison-Wesley, 1963.

Coveney, Peter, and Roger Highfield. The Arrow of Time: A Voyage Through Science to Solve Time's Greatest Mystery. New York: Ballantine, 1990.

Russell, Robert John. "Entropy and Evil." Zygon: Journal of Religion and Science 19 (1984): 449–468.

Sears, Francis W. Thermodynamics. Reading, Mass.: Addison-Wesley, 1953.

Zemansky, Mark W., and Richard H. Dittman. Heat and Thermodynamics. 6th ed. New York: McGraw-Hill, 1979.

Lawrence W. Fagg

entropy

en·tro·py / ˈentrəpē/ • n. Physics a thermodynamic quantity representing the unavailability of a system's thermal energy for conversion into mechanical work, often interpreted as the degree of disorder or randomness in the system. (Symbol: S) ∎ fig. lack of order or predictability; gradual decline into disorder: a marketplace where entropy reigns supreme. ∎ (in information theory) a logarithmic measure of the rate of transfer of information in a particular message or language. DERIVATIVES: en·tro·pic /enˈträpik/ adj.; en·tro·pi·cal·ly adv.

entropy

entropy A measure of the amount of information that is output by a source, or throughput by a channel, or received by an observer (per symbol or per second). Following Shannon (1948) and later writers, the entropy of a discrete memoryless source with alphabet A = {ai} of size n, and output X at time t, is

H(X) = −Σ p(xi) logb p(xi)   (the sum running over i = 1, …, n)

where p(xi) = Prob{Xt = ai}

The logarithmic base b is chosen to give a convenient scale factor. Usually b = 2, b = e = 2.71828…, or b = 10; entropy is then measured in bits, in natural units (nats), or in hartleys, respectively. When the source has memory, account has to be taken of the dependence between successive symbols output by the source.
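
As an editorial illustration (not part of the dictionary entry), the definition can be computed directly; the function name and example distributions below are arbitrary:

    import math

    def entropy(probs, base=2):
        """Shannon entropy -sum(p * log_b(p)) of a discrete distribution.
        Base 2 gives bits, base e gives nats, base 10 gives hartleys."""
        return -sum(p * math.log(p, base) for p in probs if p > 0)

    print(entropy([0.5, 0.5]))    # 1.0 bit per symbol for a fair coin
    print(entropy([0.9, 0.1]))    # about 0.47 bits: a biased source is more predictable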

The term arises by analogy with entropy in thermodynamics, where the defining expression has the same form but with a physical scale factor k (Boltzmann constant) and with the sign changed. The word negentropy is therefore sometimes used for the measure of information, as is uncertainty or simply “information”.

entropy

entropy Symbol S. A measure of the unavailability of a system's energy to do work; an increase in entropy is accompanied by a decrease in energy availability. When a system undergoes a reversible change, the entropy (S) changes by an amount equal to the energy (Q) absorbed by the system divided by the thermodynamic temperature (T) at which the energy is absorbed, i.e., ΔS = ΔQ/T. However, all real processes are to a certain extent irreversible, and in any closed system an irreversible change is always accompanied by an increase in entropy.

In a wider sense entropy can be interpreted as a measure of a system's disorder; the higher the entropy the greater the disorder. As any real change to a closed system tends towards higher entropy, and therefore higher disorder, it follows that the entropy of the universe (if it can be considered a closed system) is increasing and its available energy is decreasing. This increase in the entropy of the universe is one way of stating the second law of thermodynamics.

Entropy

Entropy ★★ 1999

Dorff gives a gifted lead performance in this excessively stylistic romantic drama. It spans a year in the life of arrogant moviemaker Jake Walsh (Dorff), who describes how he meets French model Stella (Godreche) and experiences love at first sight. Although they live together, their careers frequently keep them apart, and Jake's emotional immaturity eventually separates them. The clichéd film-within-a-film narrative only manages to slow the main story down. 104m/C VHS. Stephen Dorff, Judith Godreche, Kelly Macdonald, Lauren Holly, Jon Tenney, Frank Vincent, Paul Guilfoyle, Hector Elizondo; D: Phil Joanou; W: Phil Joanou; C: Carolyn Chen; M: George Fenton.

entropy

entropy Quantity that specifies the disorder of a physical system; the greater the disorder, the greater the entropy. In thermodynamics, it expresses the degree to which thermal energy is available for work – the less available it is, the greater the entropy. According to the second law of thermodynamics, a system's change in entropy is either zero or positive in any process.

entropy

entropy
1. A measure of disorder or unavailable energy in a thermodynamic system; the measure of the increasing disorganization of the universe.

2. See LEAST-WORK PRINCIPLE and LEAST-WORK PROFILE.
