What is Entropy?

Entropy is one of the parameters that characterize the state of matter in thermodynamics. It is represented by the symbol S, and its physical meaning is a measure of the degree of disorder in a system.

Name: entropy (Chinese: 熵)
Proposed by: Clausius
Proposed in: 1865

A brief history of entropy discovery

R. Clausius put forward the concept of entropy in 1854. In 1923, the Chinese physicist Professor Hu Gangfu first translated "entropie" into Chinese as 熵, coining the character from the meaning of heat-temperature quotient. A. Einstein once summed up the status of entropy theory in science by calling it the first law of all science. Charles Percy Snow wrote in his book The Two Cultures and the Scientific Revolution that a humanist who knows nothing of thermodynamics is as regrettable as a scientist who knows nothing of Shakespeare. Not long after the law of entropy was established, J. C. Maxwell put forward a famous paradox, the "Maxwell demon", attempting to show that an isolated system could spontaneously pass from thermal equilibrium to non-equilibrium. In fact, through the demon's work, energy and information are fed into the so-called "isolated system"; such a system is really a "self-organizing system".
Historically, the second law of thermodynamics, which rests on the principle of entropy increase, was regarded as a source of pessimism and decline. The American historian H. Adams (1850-1901) said: "This principle means only that the volume of the ruins keeps increasing." Some people even held that the law indicated that the human race would go from bad to worse and eventually die out. The second law of thermodynamics thus had the worst reputation of any law in the society of that time. In fact, society is essentially different from a thermodynamic isolated system; it is a "self-organizing system."

Definition of entropy

Classical thermodynamics

In 1865, Clausius named the new state function he had discovered and defined it in terms of its increment:

$$\mathrm{d}S = \left(\frac{\mathrm{d}Q}{T}\right)_{\mathrm{r}}$$

where T is the thermodynamic temperature of the substance and dQ is the heat absorbed by the substance during the process; the subscript "r" abbreviates the English word "reversible" and indicates that the change brought about by the heating process is reversible.
If the process is irreversible, then

$$\mathrm{d}S > \left(\frac{\mathrm{d}Q}{T}\right)_{\mathrm{ir}}$$

where the subscript "ir" abbreviates the English word "irreversible" and indicates that the change brought about by the heating process is irreversible.
Combining the two expressions above gives

$$\mathrm{d}S \ge \frac{\mathrm{d}Q}{T}$$

where the equality holds for reversible processes and the inequality for irreversible ones. This relation is called the Clausius inequality and is the most common mathematical expression of the second law of thermodynamics.
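
A simple worked example of the definition, for the standard case of an ideal gas: in the reversible isothermal expansion of n moles of an ideal gas from volume V1 to V2, the internal energy does not change, so the heat absorbed equals the work done, dQ_r = p dV = (nRT/V) dV, and

$$\Delta S = \int\frac{\mathrm{d}Q_{\mathrm{r}}}{T} = \int_{V_1}^{V_2}\frac{nR}{V}\,\mathrm{d}V = nR\ln\frac{V_2}{V_1}.$$

For one mole of gas doubling its volume, ΔS = R ln 2 ≈ 5.76 J·K⁻¹.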

Statistical thermodynamics

The magnitude of the entropy is related to the number of microscopic states of the system:

$$S = k\ln\Omega$$

where k is the Boltzmann constant, k = 1.3807 × 10⁻²³ J·K⁻¹, and Ω is the thermodynamic probability, i.e. the number of microscopic states of the many-particle system, obtained from statistical laws. [1] Entropy therefore has a statistical meaning; it is not meaningful for systems containing only a few, a few dozen, or a few hundred molecules.
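
As an illustration of the Boltzmann formula: for a system of N independent particles, each of which can be in either of two equally probable states, the number of microscopic states is Ω = 2^N, so

$$S = k\ln 2^{N} = Nk\ln 2.$$

For one mole of such particles (N equal to Avogadro's number), S = N_A k ln 2 = R ln 2 ≈ 5.76 J·K⁻¹, while for a handful of molecules the value is negligibly small, which is why entropy is a statistical, many-particle concept.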

Properties of entropy

State function

Entropy S is a state function with the additive (capacity) property; it is an extensive, non-conserved quantity, because the heat appearing in its definition is proportional to the amount of substance, while a given state has a definite value of entropy. The change ΔS is determined only by the initial and final states of the system and is independent of whether the path between them is reversible or not. Since the entropy change of the system equals the sum of the heat-temperature quotients dQ/T along a reversible path, the entropy change of a system can only be evaluated along a reversible process. For a reversible change in an isolated system, or for any adiabatic reversible process, ΔS = 0.
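
For example, when a substance with a constant heat capacity C_p is heated at constant pressure from T1 to T2, an imagined reversible heating path with dQ_r = C_p dT gives

$$\Delta S = \int_{T_1}^{T_2}\frac{C_p}{T}\,\mathrm{d}T = C_p\ln\frac{T_2}{T_1},$$

and, because S is a state function, the same ΔS applies even if the actual heating was carried out irreversibly. For one mole of liquid water (C_p ≈ 75.3 J·K⁻¹·mol⁻¹) heated from 298 K to 373 K, this gives ΔS ≈ 75.3 × ln(373/298) ≈ 16.9 J·K⁻¹.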

Macroscopic quantity

Entropy is a macroscopic quantity: it is a property exhibited collectively by the large number of microscopic particles that make up the system. It includes the contributions to the entropy from molecular translation, vibration, and rotation, from electronic motion, and from nuclear spin. It is meaningless to speak of the entropy of an individual microscopic particle.

Absolute entropy

The absolute value of entropy cannot be determined from the second law of thermodynamics. It can be determined by the third law from calorimetric data; entropy obtained in this way is called conventional entropy or calorimetric entropy. The absolute value of entropy can also be calculated by statistical thermodynamics from data on the microscopic structure of the molecules; this is called statistical entropy or spectroscopic entropy. [2]
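
In outline, the third-law (calorimetric) determination integrates the measured heat capacity upward from absolute zero, where the entropy of a perfect crystal is taken as zero, and adds the entropy of each phase transition encountered on the way:

$$S(T) = \int_{0}^{T}\frac{C_p}{T'}\,\mathrm{d}T' + \sum_{\mathrm{trs}}\frac{\Delta H_{\mathrm{trs}}}{T_{\mathrm{trs}}},$$

where the sum runs over the transitions (such as fusion and vaporization) that occur below T.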

Applications of entropy

Entropy was originally a state parameter of matter reflecting the irreversibility of spontaneous processes; it comes from the second law of thermodynamics. The second law is a rule summarized from a large number of observations: in an isolated system, where no energy is exchanged between the system and its surroundings, the system always changes spontaneously in the direction of increasing disorder, so the entropy of the whole system always increases. This is the principle of entropy increase. Friction irreversibly converts part of the mechanical energy into heat and thereby increases the entropy. The universe as a whole can be regarded as an isolated system, and so it evolves in the direction of increasing entropy.
Consider a spontaneous process: heat dQ flows from a hot object at temperature T₁ to a cold object at temperature T₂ (T₁ > T₂). The entropy of the hot object decreases by dS₁ = dQ/T₁, while the entropy of the cold object increases by dS₂ = dQ/T₂. Treating the two objects together as one system, the change in entropy is dS = dS₂ − dS₁ > 0, because T₁ > T₂; that is, the entropy increases.
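
A numerical illustration with arbitrarily chosen values: if Q = 1000 J of heat flows from a body at T₁ = 400 K to a body at T₂ = 300 K, then

$$\Delta S = \frac{Q}{T_2} - \frac{Q}{T_1} = \frac{1000}{300}\ \mathrm{J\,K^{-1}} - \frac{1000}{400}\ \mathrm{J\,K^{-1}} \approx 0.83\ \mathrm{J\,K^{-1}} > 0,$$

so the total entropy of the two bodies increases, as the principle of entropy increase requires.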
