Entropy Measure
An entropy measure is a system measure of the number of possible arrangements (microstates) available to a system.
- AKA: S.
- Context:
- It can be expressed as:
[math]\displaystyle{ S = -k_B\sum_i p_i\ln p_i }[/math]
- where [math]\displaystyle{ k_B }[/math] is the Boltzmann constant and [math]\displaystyle{ p_i }[/math] is the probability of microstate [math]\displaystyle{ i }[/math] being occupied. For an isolated system in thermal equilibrium, every microstate can be assumed to be equally probable. Thus,
- [math]\displaystyle{ S= k_B \ln \Omega }[/math]
- where [math]\displaystyle{ \Omega }[/math] is the number of microstates (a short numerical sketch of both forms appears after this list).
- Example(s):
- The entropy change of a reversible process is [math]\displaystyle{ \Delta S = \int dQ/T }[/math].
- It can produce an Entropy Value (for some specific system) [math]\displaystyle{ H(X_A) = - \Sigma_{x_A} \Pr(x_A) \log \Pr(x_A) }[/math].
- an Information Entropy Measure.
- a Perplexity Measure.
- Counter-Example(s):
- See: Laws Of Thermodynamics, Gaussian Entropy, Information Entropy, Second Law of Thermodynamics, Log-Determinant Function, Gibbs Paradox.
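The following is a minimal numerical sketch (in Python) of the two expressions above: the Gibbs form [math]\displaystyle{ -k_B\sum_i p_i\ln p_i }[/math] reduces to the Boltzmann form [math]\displaystyle{ k_B\ln\Omega }[/math] when all microstates are equally probable, and with [math]\displaystyle{ k_B }[/math] replaced by 1 the same sum gives the (unitless) information entropy and perplexity. The microstate count and the helper name `gibbs_entropy` are assumed for illustration only.

```python
import numpy as np

K_B = 1.380649e-23  # Boltzmann constant in J/K (exact SI value)

def gibbs_entropy(p, k=K_B):
    """Entropy -k * sum_i p_i ln p_i for a probability vector p."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]                      # convention: 0 * ln 0 = 0
    return -k * np.sum(p * np.log(p))

# Isolated system in equilibrium: all Omega microstates equally likely
# (Omega here is an assumed toy value).
omega = 1_000_000
p_uniform = np.full(omega, 1.0 / omega)

# Gibbs form reduces to the Boltzmann form for a uniform distribution.
assert np.isclose(gibbs_entropy(p_uniform), K_B * np.log(omega))

# Information entropy (k = 1, measured in nats) and the corresponding perplexity.
h_nats = gibbs_entropy(p_uniform, k=1.0)
perplexity = np.exp(h_nats)           # equals Omega for a uniform distribution
print(h_nats, perplexity)
```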
References
2013
- (Wikipedia, 2013) ⇒ http://en.wikipedia.org/wiki/Entropy
- Entropy is a measure of the number of specific ways in which a system may be arranged, often taken to be a measure of disorder. The entropy of an isolated system never decreases, because isolated systems spontaneously evolve towards thermodynamic equilibrium, which is the state of maximum entropy.
Entropy is a thermodynamic quantity that helps to account for the flow of energy through a thermodynamic process. Entropy was originally defined for a thermodynamically reversible process as :[math]\displaystyle{ \Delta S = \int \frac {dQ_{rev}}T }[/math] where the entropy (S) is found from the uniform thermodynamic temperature (T) of a closed system divided into an incremental reversible transfer of heat into that system (dQ). The above definition is sometimes called the macroscopic definition of entropy because it can be used without regard to any microscopic picture of the contents of a system. In thermodynamics, entropy has been found to be more generally useful and it has several other formulations. Entropy was discovered when it was noticed to be a quantity that behaves as a function of state. Entropy is an extensive property, but it is often given as an intensive property of specific entropy as entropy per unit mass or entropy per mole.
In the modern microscopic interpretation of entropy in statistical mechanics, entropy is the amount of additional information needed to specify the exact physical state of a system, given its thermodynamic specification. The role of thermodynamic entropy in various thermodynamic processes can thus be understood by understanding how and why that information changes as the system evolves from its initial condition. It is often said that entropy is an expression of the disorder, or randomness of a system, or of our lack of information about it (which on some views of probability, amounts to the same thing as randomness). The second law is now often seen as an expression of the fundamental postulate of statistical mechanics via the modern definition of entropy.
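As an illustrative worked instance of the macroscopic definition quoted above (the numbers are assumed for illustration, not taken from the article): a reversible, isothermal absorption of [math]\displaystyle{ Q_{rev} = 300\ \mathrm{J} }[/math] of heat by a closed system held at [math]\displaystyle{ T = 300\ \mathrm{K} }[/math] gives
[math]\displaystyle{ \Delta S = \int \frac{dQ_{rev}}{T} = \frac{Q_{rev}}{T} = \frac{300\ \mathrm{J}}{300\ \mathrm{K}} = 1\ \mathrm{J\,K^{-1}}. }[/math]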
- http://en.wikipedia.org/wiki/Entropy#Definitions_and_descriptions
- There are two related definitions of entropy: the thermodynamic definition and the statistical mechanics definition. Historically, the classical thermodynamics definition developed first, and it has more recently been extended in the area of non-equilibrium thermodynamics. Entropy was defined from a classical thermodynamics viewpoint, in which the details of the system's constituents are not directly considered, with their behavior only showing up in macroscopically averaged properties, e.g. heat capacity. Later, thermodynamic entropy was more generally defined from a statistical thermodynamics viewpoint, in which the detailed constituents --- modeled at first classically, e.g. Newtonian particles constituting a gas, and later quantum-mechanically (photons, phonons, spins, etc.)---were explicitly considered.
2012
- (Tong, 2012) ⇒ David Tong. (2012). "Statistical Physics. University of Cambridge Part II Mathematical Tripos." http://www.damtp.cam.ac.uk/user/tong/statphys/sp.pdf
- We define the entropy of the system to be
- [math]\displaystyle{ S(E) = k_B \log \Omega(E)\quad \textrm{(1.2)} }[/math]
- Here [math]\displaystyle{ k_B }[/math] is a fundamental constant, known as Boltzmann’s constant. It has units of Joules per Kelvin.
- [math]\displaystyle{ k_B \approx 1.381 \times 10^{-23}\ \mathrm{J\,K^{-1}} \quad \textrm{(1.3)} }[/math]
- The log in (1.2) is the natural logarithm (base e, not base 10). Why do we take the log in the definition? One reason is that it makes the numbers less silly. While the number of states is of order [math]\displaystyle{ \Omega \approx e^N }[/math], the entropy is merely proportional to the number of particles in the system, [math]\displaystyle{ S \approx N }[/math]. This also has the happy consequence that the entropy is an additive quantity. To see this, consider two non-interacting systems with energies E1 and E2 respectively. Then the total number of states of both systems is
- [math]\displaystyle{ \Omega(E_1, E_2) = \Omega_1(E_1)\,\Omega_2(E_2) }[/math]
- while the entropy for both systems is
- [math]\displaystyle{ S(E_1, E_2) = S_1(E_1) + S_2(E_2) }[/math]
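A small numerical sketch (Python) of this additivity argument: because the combined system has [math]\displaystyle{ \Omega_1\Omega_2 }[/math] microstates and the logarithm turns products into sums, the total entropy is the sum of the individual entropies. The microstate counts below are assumed toy values, not taken from Tong's notes.

```python
import math

K_B = 1.380649e-23  # Boltzmann constant in J/K

# Toy microstate counts for two non-interacting systems (assumed values).
omega_1 = 10**20
omega_2 = 10**15

s_1 = K_B * math.log(omega_1)
s_2 = K_B * math.log(omega_2)

# The combined system has omega_1 * omega_2 microstates, so its entropy
# is the sum of the individual entropies: log(a*b) = log(a) + log(b).
s_total = K_B * math.log(omega_1 * omega_2)

assert math.isclose(s_total, s_1 + s_2)
print(s_1, s_2, s_total)
```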
2011
- http://chemed.chem.wisc.edu/chempaths/GenChem-Textbook/Measuring-the-Entropy-989.html
- Determining the entropy turns out to be both difficult and laborious. In the case of a simple gas, if we know enough about its molecular structure and enough quantum mechanics, we can actually calculate its entropy. For most substances, though, we are forced to derive the entropy from a series of calorimetric measurements, most of them at very low temperatures.
This method for determining the entropy centers around a very simple relationship between q, the heat energy absorbed by a body, the temperature T at which this absorption takes place, and ΔS, the resultant increase in entropy: [math]\displaystyle{ \Delta S=\frac{q}{T} \quad \textrm{(1)} }[/math]
It is possible to derive this relationship from our original definition of entropy, namely, S = k ln W, but the proof is beyond the level of this text.
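A minimal sketch (Python) of how equation (1) is applied to calorimetric data: over each small temperature step the heat absorbed is approximately [math]\displaystyle{ dq \approx C_p\,dT }[/math], so the entropy increase accumulates as [math]\displaystyle{ \Delta S \approx \sum C_p(T)\,\Delta T / T }[/math]. The heat-capacity curve below is an assumed placeholder standing in for a real series of measurements.

```python
import numpy as np

# Assumed, illustrative heat-capacity data C_p(T) in J/K for a sample,
# standing in for a series of calorimetric measurements.
T = np.linspace(10.0, 300.0, 30)   # measurement temperatures in K
C_p = 0.1 * T                      # placeholder heat capacities in J/K

# dq = C_p dT, so dS = C_p dT / T; integrate numerically (trapezoidal rule).
integrand = C_p / T
dT = np.diff(T)
delta_S = np.sum(0.5 * (integrand[:-1] + integrand[1:]) * dT)  # in J/K

print(f"Delta S = {delta_S:.2f} J/K")
```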