What is the importance of information theory?

Information theory

In 1948, C. E. Shannon established a mathematical theory of information. It abstracts from the meaning of the information's content and, as a part of cybernetics, deals solely with the transmission and storage of character sequences. The behavior of complex systems cannot be predicted because the available information about them is imperfect; for such a system to work, it must be continuously supplied with information. The task of information theory is, despite the indeterminacy of cybernetic systems, to achieve largely error-free acquisition, transmission and conversion of information with the help of statistical methods.

In economic sociology: information theory examines problems of conveying and processing information, in close relation to cybernetics. Its main areas are the development of character systems for coding messages (encryption and decryption) and the transmission of messages over information channels that are prone to interference. Beyond the application of statistical-mathematical information theory to optimizing technical communication systems, information theory is increasingly finding its way into psychology, linguistics and sociology (e.g. the investigation of perceptual capacity, the redundancy of linguistic codes, the "entropy" of group structures). In addition to the structure of sign systems (syntactics), the meaning of signs (semantics) and their social functions (pragmatics) are increasingly becoming the subject of a general information theory.

Information theory was founded in 1948 by Claude E. Shannon. It sees itself as part of cybernetics. As a mathematical theory of communication it is strongly oriented toward communications engineering and uses a special concept of information: information is examined only in its syntactic dimension; its meaning is irrelevant. The focus of information theory is on the coding and transmission of messages, for which it uses the tools of statistics and probability theory.

The figure illustrates the basic elements of a communication system. A message source and a message sink each have a stock of characters; the encoder and decoder each have a stock of signals. The transmitter forms a message as a sequence of characters, which the encoder converts into a sequence of physically perceptible signals that are then transmitted over the channel. At the other end of the channel, a decoder converts the message back into a form understandable to the recipient.

The character and signal stocks of the transmitter and receiver are of particular importance. To form or recognize a message, a selection must be made from these stocks, and in information theory these selections are treated as random. Information is accordingly defined by the degree of freedom of choice one has when selecting a particular message from the set of all possible messages. When selecting one message from, say, two equally probable messages, the degree of freedom of choice is one, i.e. the information is one. Messages that can be predicted with certainty therefore carry no information. The dimensionless unit bit (binary digit) was chosen as the unit of measurement.

Another information measure is the mean information content of a message source. It is also referred to as the entropy of the information and can be understood as the degree of uncertainty (on the part of the recipient) about the expected message.
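The quantities described above can be made concrete with a small numerical sketch. The article states the definitions only in words, so the formulas used here are the standard Shannon measures (information as the base-2 logarithm of the number of equally likely choices, entropy as the probability-weighted mean of the information content):

```python
import math

def self_information(p):
    """Information (in bits) carried by a message that occurs with probability p."""
    return math.log2(1 / p)

def entropy(probs):
    """Mean information content (entropy, in bits) of a message source
    whose messages occur with the given probabilities."""
    return sum(-p * math.log2(p) for p in probs if p > 0)

# Choosing between two equally probable messages yields exactly one bit,
# matching the example in the text.
print(self_information(0.5))    # 1.0

# A source whose message is certain carries no information.
print(entropy([1.0]))           # 0.0

# A source with two equally likely messages has an entropy of one bit:
# maximum uncertainty for the recipient.
print(entropy([0.5, 0.5]))      # 1.0
```

The `if p > 0` guard follows the usual convention that impossible messages contribute nothing to the entropy.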
The entropy, and thus the "disorder", is maximal when all signals are equally likely, so that the sender has the maximum freedom of choice and the receiver faces the greatest uncertainty. Other theorems concern the optimal encoding of a message. Any coding can introduce redundancy: the redundant part of a message is the part that can be omitted without impairing its information content. Every message can be encoded so that its redundancy is as small as desired. Redundancy can, however, also be used deliberately to secure the transmission of a message against the disturbances occurring in the channel. Because of its strongly communications-oriented view, the applicability of information theory to economic problems is very limited.
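The relationship between maximal entropy and redundancy can be sketched numerically. The distributions below are chosen purely for illustration; the redundancy measure used is the common one (one minus the ratio of actual entropy to the maximum entropy a source with that many signals could have):

```python
import math

def entropy(probs):
    """Mean information content (in bits) of a source with the given probabilities."""
    return sum(-p * math.log2(p) for p in probs if p > 0)

def redundancy(probs):
    """Relative redundancy: 1 - H / H_max, where H_max = log2(n) for n signals."""
    return 1 - entropy(probs) / math.log2(len(probs))

uniform = [0.25, 0.25, 0.25, 0.25]  # all signals equally likely
skewed  = [0.7, 0.1, 0.1, 0.1]      # one signal dominates: partly predictable

# Equally likely signals give maximum entropy and zero redundancy.
print(entropy(uniform))      # 2.0 bits, the maximum for four signals
print(redundancy(uniform))   # 0.0

# The predictable source carries less information per signal; the gap
# is redundancy that could be removed by optimal coding, or kept to
# protect the message against channel disturbances.
print(redundancy(skewed))
```

A practical consequence, in line with the text: natural languages are highly redundant, which is why a message with a few garbled characters usually remains readable.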



