What Is Information Entropy?

Information is a very abstract concept. People often say that something contains a lot of information or only a little, but it is difficult to say exactly how much information there is. For example, how much information does a Chinese book of 500,000 words contain?

Information entropy

Which symbol a source will send next is usually uncertain, and that uncertainty can be measured by the symbol's probability of occurrence: when the probability is high, the symbol is likely and the uncertainty is small; when the probability is low, the uncertainty is large.
The uncertainty can therefore be expressed as a function f of the probability P, as sketched below.
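A standard way to make this precise, sketched here as an added illustration rather than something stated in the article, is to require that f decrease as P grows and be additive over independent symbols; up to the choice of logarithm base (base 2 gives bits), these requirements force f(P) = -log P:
```latex
% Sketch of the usual requirements on the uncertainty function f(P)
% (added illustration; the base-2 convention below matches the formulas later in the article).
% 1. Monotonicity: a more probable symbol is less surprising.
P_1 > P_2 \;\Rightarrow\; f(P_1) < f(P_2)
% 2. Additivity for independent symbols.
f(P_1 P_2) = f(P_1) + f(P_2)
% The only continuous solution, up to a constant factor, is
f(P) = \log_2 \frac{1}{P} = -\log_2 P
```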

Modern definition of information entropy

Information is a marker of matter, energy, information, and their attributes. [Inverse Wiener information definition]
Information is an increase in certainty. [Shannon information definition]
Information is the collection of things, phenomena, and their attributes. [2002]

Original definition of information entropy

Claude E. Shannon, one of the founders of information theory, defined information (entropy) in terms of the probabilities of occurrence of discrete random events.
Information entropy is a rather abstract mathematical concept. Here we may loosely think of information entropy in terms of the probability with which a certain type of information appears. Information entropy is closely related to thermodynamic entropy. According to Charles H. Bennett's reinterpretation of Maxwell's demon, the destruction of information is an irreversible process, so destroying information conforms to the second law of thermodynamics, while generating information introduces negative (thermodynamic) entropy into the system. The signs of information entropy and thermodynamic entropy should therefore be opposite.
In general, when a piece of information appears with higher probability, it has spread more widely or is cited more often. From the perspective of information dissemination, then, information entropy can be taken to represent the value of information. This gives us a standard for measuring the value of information and allows further inferences about the circulation of knowledge.

Information entropy calculation formula

H(X) = E[I(x_i)] = E[log2(1/P(x_i))] = -Σ P(x_i) * log2 P(x_i)   (i = 1, 2, ..., n)
Here X is a random variable, the set of all its possible outputs is called the symbol set, and x denotes a particular output of the random variable; P(x) is the probability of that output. The greater the uncertainty of the variable, the greater the entropy, and the more information is needed to determine its value.
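As a concrete illustration of this formula (an added sketch; the function name shannon_entropy and the example distributions are illustrative choices, not from the article), the following computes H(X) for a discrete distribution given as a list of probabilities:
```python
# Minimal sketch: Shannon entropy of a discrete random variable,
# H(X) = -sum_i P(x_i) * log2 P(x_i), measured in bits.
import math

def shannon_entropy(probabilities):
    """Return the entropy in bits of a distribution given as a list of P(x_i)."""
    return sum(-p * math.log2(p) for p in probabilities if p > 0)

# A fair coin carries 1 bit of uncertainty; a heavily biased coin carries far less.
print(shannon_entropy([0.5, 0.5]))    # 1.0
print(shannon_entropy([0.99, 0.01]))  # ~0.08
```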

Information entropy in the Game Bible

Information entropy: The basic role of information is to eliminate people's uncertainty about things; in a game or match, this uncertainty appears as confusion about the outcome. Suppose, for example, that we want to know which of 32 teams won a championship. By asking yes/no questions that halve the remaining candidates each time, at most five questions are needed, so the answer carries at most five bits of information.
Shannon pointed out that the exact amount of information is
-(p1 * log2(p1) + p2 * log2(p2) + ... + p32 * log2(p32)),
where p1, p2, ..., p32 are the probabilities of each of the 32 teams winning. Shannon called this quantity "entropy"; it is generally denoted by the symbol H, and its unit is the bit.
Interested readers can verify that when the 32 teams are equally likely to win, the corresponding information entropy is exactly five bits; readers with a mathematical background can also prove that the value of the formula above can never exceed five. For any random variable X (such as the winning team), the entropy is defined as
H(X) = -Σ P(x) * log2 P(x),
where the sum runs over all possible values x of X.
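The equally-likely case can be checked directly; the worked calculation below (added as an illustration, not part of the original article) also shows why the value can never exceed five bits:
```latex
% Uniform case: all 32 teams equally likely.
H = -\sum_{i=1}^{32} \frac{1}{32}\,\log_2\frac{1}{32} = \log_2 32 = 5 \text{ bits}.
% The five-bit bound follows from the concavity of the logarithm (Jensen's inequality):
H = \sum_{i=1}^{32} p_i \log_2\frac{1}{p_i}
  \;\le\; \log_2\!\left(\sum_{i=1}^{32} p_i \cdot \frac{1}{p_i}\right)
  = \log_2 32 = 5,
% with equality exactly when every p_i = 1/32.
```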
Information entropy is a concept used in information theory to measure the amount of information. The more ordered a system is, the lower its information entropy; conversely, the more chaotic a system is, the higher its information entropy. Information entropy can therefore also be seen as a measure of a system's degree of order.
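As a small added illustration of this relationship between order and entropy (the helper name entropy_of_states and the example strings are illustrative choices, not from the article), the sketch below compares a system whose elements all occupy a single state with systems spread evenly over several states:
```python
# Sketch: information entropy of a system's state distribution, in bits.
# An "ordered" system concentrates its elements in few states (low entropy);
# a "chaotic" system spreads them evenly over many states (high entropy).
import math
from collections import Counter

def entropy_of_states(states):
    """Entropy in bits of the empirical distribution over the observed states."""
    counts = Counter(states)
    total = len(states)
    return sum(-(c / total) * math.log2(c / total) for c in counts.values())

print(entropy_of_states("aaaaaaaa"))  # 0.0 -> perfectly ordered, one state
print(entropy_of_states("aabbccdd"))  # 2.0 -> spread evenly over 4 states
print(entropy_of_states("abcdefgh"))  # 3.0 -> spread evenly over 8 states
```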
The concept of entropy originally comes from thermal physics.
Suppose there are two gases, a and b. When the two gases are completely mixed, the system reaches a stable equilibrium state, and its entropy is at a maximum. The reverse process, completely separating a and b again, is impossible in a closed system; only external intervention (information), that is, adding something ordered (energy) from outside the system, can separate a and b. The system then enters another stable state, and at that point its entropy is lowest. Thermal physics shows that in a closed system entropy always increases until it reaches its maximum; to reduce the entropy of a system (to make it more ordered), external energy must intervene.
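A toy calculation may help make the gas example concrete. This is an added sketch under a simplifying assumption: we look at one half of the container and treat the fraction p of gas a found there as a probability (p = 1.0 means fully separated, p = 0.5 means fully mixed), then compute the per-molecule entropy of the gas type.
```python
# Toy model (illustrative assumption, not from the article): per-molecule
# information entropy of a two-gas container as a function of how mixed it is.
import math

def binary_entropy(p):
    """H(p) = -p*log2(p) - (1-p)*log2(1-p), in bits."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

for p in (1.0, 0.9, 0.7, 0.5):
    print(f"fraction of gas a = {p:.1f}  entropy = {binary_entropy(p):.3f} bits")
# Entropy rises from 0 bits (fully separated) to 1 bit (completely mixed),
# matching the claim that the fully mixed state has the highest entropy.
```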
Calculating information entropy exactly is very difficult, and information with many preconditions is almost impossible to evaluate, so the value of information in the real world cannot be computed directly. However, because information entropy is closely related to thermodynamic entropy, it can be measured as information decays. The value of information is therefore reflected in its transmission: without the injection of added value (negative entropy), the more widely and the longer a piece of information spreads, the more valuable it is.
Entropy is first and foremost a term from physics.
In communication it refers to the uncertainty of information: the entropy of highly informative content is very low, while the entropy of content carrying little information is high. More specifically, any activity that increases or decreases the certainty, organization, regularity, or orderliness of a set of random events can be measured on a single common scale, the change in information entropy.
