What is Information Theory?

Information theory is a branch of applied mathematics that uses the methods of probability theory and mathematical statistics to study information, information entropy, communication systems, data transmission, cryptography, and data compression. An information system is a communication system in the broad sense: a system composed of all the equipment required to transmit information from one place to another. Information theory is a theory about information, and as such it should have a clearly defined object of study and scope of application; since its birth, however, people have understood it in different ways. [1]

Information theory treats the transfer of information as a statistical phenomenon and gives methods for estimating the capacity of a communication channel. Information transmission and information compression are the two major areas of study in information theory, linked to each other by the coding theorems, in particular the source-channel separation theorem.
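As a small illustration of this statistical view, the sketch below computes the Shannon entropy of a discrete source in Python; entropy is the quantity that bounds how compactly the source's output can be losslessly compressed. The four-symbol alphabet and its probabilities are invented for the example.

    import math

    def shannon_entropy(probs):
        # H = -sum(p * log2(p)) in bits per symbol; terms with p == 0 contribute nothing.
        return -sum(p * math.log2(p) for p in probs if p > 0)

    # A hypothetical four-symbol source; its entropy is 1.75 bits/symbol,
    # so no lossless code can average fewer bits per symbol than that.
    print(shannon_entropy([0.5, 0.25, 0.125, 0.125]))  # 1.75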
Information theory emerged in the late 1940s as a discipline distilled from long experience in communication practice. It is a science that studies the general laws governing the effective processing and reliable transmission of information.
E. C. Cherry wrote an early history of information theory, beginning with the pictographs of stone inscriptions, passing through medieval and Enlightenment linguistics, and continuing to work on the science of telegraphy, such as that of E. N. Gilbert.
In the 1920s, H. Nyquist and R. V. L. Hartley first studied the ability of communication systems to transmit information and tried to measure the channel capacity of a system; modern information theory began to take shape.
Claude Shannon's 1948 paper "A Mathematical Theory of Communication" was the first in the world to establish a mathematical model of the communication process. This paper, together with another he published in 1949, laid the foundation of modern information theory.
Owing to the rapid development of modern communication technology and cross-fertilization with other disciplines, the study of information theory has expanded beyond the narrow scope of Shannon's mathematical theory of communication systems into the vast field now called information science. [2]
Traditional communication systems such as telegraph, telephone, and post deliver written messages, voice, and letters, respectively; systems such as broadcasting, telemetry, remote sensing, and remote control also transmit information, though of different types, so they too are information systems. Sometimes information must be transmitted in both directions: telephone communication requires two-way conversation, and a remote control system must carry control information in one direction and measurement information in the other. Such a two-way information system is in effect composed of two one-way information systems. All information systems can be summarized into the model shown in the figure to study their basic laws; its components are defined below, and a minimal code sketch of the whole chain follows the definitions.
Source: the origin of the information, or the entity that generates the information to be transmitted, such as the speaker in a telephone system. For a telecommunication system it also includes the microphone, whose output electrical signal serves as the carrier of the information.
Sink: the destination or receiver of the information. In a telephone system this is the listener together with the earpiece, which converts the received electrical signal back into sound from which the listener extracts the required information.
Channel: the medium over which the information travels, such as the coaxial cable system (including repeaters) in telephone communication, or the earth-station transceivers, antennas, and satellite repeaters in satellite communication.
Encoder: in information theory, all equipment that transforms the signal; in practice, the transmitting part of the terminal. It comprises everything between the source and the channel, such as the quantizer, compression encoder, and modulator, which convert the source's output into a signal suitable for transmission over the channel.
Decoder: the inverse of the encoder; it converts the signal received from the channel into a signal the sink can accept. It may include a demodulator, decoder, and digital-to-analog converter.
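To make the model concrete, here is a minimal Python sketch of the source, encoder, channel, decoder, sink chain under deliberately simple assumptions: a binary source, a 3-fold repetition encoder, a memoryless bit-flip channel, and a majority-vote decoder. All names and parameters here are illustrative choices, not part of any standard.

    import random

    def encode(bits, n=3):
        # Repetition encoder: transmit each source bit n times.
        return [b for b in bits for _ in range(n)]

    def channel(bits, p=0.1):
        # Binary symmetric channel: flip each bit independently with probability p.
        return [b ^ (random.random() < p) for b in bits]

    def decode(bits, n=3):
        # Majority-vote decoder: recover each source bit from its n noisy copies.
        return [int(sum(bits[i:i + n]) > n // 2) for i in range(0, len(bits), n)]

    random.seed(0)
    source = [random.randint(0, 1) for _ in range(1000)]   # source output
    sink = decode(channel(encode(source)))                 # encoder -> channel -> decoder
    errors = sum(s != r for s, r in zip(source, sink))
    print(f"residual bit errors: {errors} of {len(source)}")

The repetition code buys reliability at the price of effectiveness: three channel uses carry only one information bit, which is exactly the trade-off discussed next.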
Once the source and sink are given and the channel has been chosen, the performance of an information system rests on its encoder and decoder. When designing an information system, besides selecting the channel and designing its auxiliary facilities, the main task is therefore to design the codec. The principal performance indices of an information system are its effectiveness and its reliability. Effectiveness means transmitting as much information as possible through the system; reliability means that the information received by the sink should agree with the information sent by the source as closely as possible, i.e., with as little distortion as possible. The best codec makes the system both maximally effective and maximally reliable. The two goals, however, often conflict: greater effectiveness tends to reduce reliability, and vice versa. Quantitatively, the system should transmit at the maximum information rate under a specified distortion (or essentially no distortion), or achieve the minimum distortion at a specified information rate. Computing this maximum information rate, and proving that a codec reaching or approaching it exists, is the basic task of information theory. The theory that discusses only such questions may be called Shannon information theory. It is generally held that the content of information theory should be broader, also covering the theory of extracting information and of ensuring information security; the latter comprise estimation theory, detection theory, and cryptography.
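The maximum information rate mentioned above has a closed form in the simplest noisy case: for a binary symmetric channel with crossover probability p, Shannon's channel capacity is C = 1 - H(p) bits per channel use, where H is the binary entropy function. A minimal Python sketch:

    import math

    def binary_entropy(p):
        # H(p) = -p*log2(p) - (1-p)*log2(1-p), with H(0) = H(1) = 0.
        if p in (0.0, 1.0):
            return 0.0
        return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

    def bsc_capacity(p):
        # Capacity of a binary symmetric channel with crossover probability p.
        return 1.0 - binary_entropy(p)

    # At a 10% flip probability, about 0.531 information bits can be conveyed
    # reliably per channel use; no codec can exceed this rate.
    print(round(bsc_capacity(0.1), 3))  # 0.531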
Information theory is built on probability theory: it rests on the probabilistic characteristics of source symbols and channel noise. Information of this kind is often called syntactic information. In fact, the basic laws of information systems should also cover semantic and pragmatic information. Syntactic information concerns the structure of the source's output symbols, i.e., their objective characteristics, independent of the subjective requirements of the sink. Semantic information takes into account the meaning of each symbol: the same meaning can be expressed in different languages or words, and the syntactic information these carry can differ. Generally, the semantic information rate can be lower than the syntactic information rate; a telegram's information rate can be lower than that of speech expressing the same meaning. Furthermore, the receiver of information often needs only the information that is useful to him: a language he does not understand is meaningful, yet useless to him. Pragmatic information, i.e., information useful to the sink, is therefore generally smaller still than semantic information. If an information system were required to transmit only semantic or pragmatic information, its efficiency would clearly be higher. At present, a systematic theory of syntactic information has been established on the basis of probability theory and forms a discipline, while the theories of semantic and pragmatic information are not yet mature; discussion of the latter is usually assigned to information science or general information theory, outside the scope of information theory proper. To sum up, the basic laws of information systems cover information measurement, source characteristics and source coding, channel characteristics and channel coding, detection theory, estimation theory, and cryptography.

Information Theory: Basic Information

Chinese book title: Basis of Information Theory
English book title: Elements of Information Theory
Authors: Thomas M. Cover, Joy A. Thomas
Translators: Ruan Jishou, Zhang Hua
Publisher: China Machine Press
Edition: 1st edition (January 1, 2008)
Format: 16mo
Pages: 439
ISBN: 9787111220404

Introduction to Information Theory

This book is a classic, concise textbook in the field of information theory. Its main contents include entropy, sources, channel capacity, rate distortion, data compression, coding theory, and complexity theory. The book also introduces network information theory and hypothesis testing, and, taking the horse-race model as a starting point, brings the study of securities markets into the framework of information theory, offering new investment concepts and research skills for the study of portfolios from a fresh perspective. It is suitable as a basic information theory textbook for advanced undergraduates and graduate students in electrical engineering, statistics, and telecommunications, and can also serve as a reference for researchers and professionals.

Information Theory: Table of Contents

Chapter 1 Introduction and Preview
Chapter 2 Entropy, Relative Entropy, and Mutual Information
Chapter 3 Asymptotic Equipartition Property
Chapter 4 Entropy Rates of a Stochastic Process
Chapter 5 Data Compression
Chapter 6 Gambling and Data Compression
Chapter 7 Channel Capacity
Chapter 8 Differential Entropy
Chapter 9 Gaussian Channel
Chapter 10 Rate Distortion Theory
Chapter 11 Information Theory and Statistics
Chapter 12 Maximum Entropy
Chapter 13 Universal Source Coding
Chapter 14 Kolmogorov Complexity
Chapter 15 Network Information Theory
Chapter 16 Information Theory and Portfolio Theory
Chapter 17 Inequalities in Information Theory
References [3]
