The Origin of Entropy

But then, what is entropy? The term entropy was first coined by the German physicist and mathematician Rudolf Clausius and was used in the field of thermodynamics. In 1948, Claude Shannon, a mathematician and electrical engineer, published the paper "A Mathematical Theory of Communication," in which he addressed the problems of measuring information, choice, and uncertainty. Shannon is also known as the "father of information theory," since he invented the field.

"Information theory is a mathematical approach to the study of coding of information, along with the quantification, storage, and communication of information."

In his paper, Shannon set out to mathematically measure the statistical nature of "lost information" in phone-line signals. The work was aimed at the problem of how best to encode the information a sender wants to transmit. For this purpose, information entropy was developed as a way to estimate the information content of a message, that is, the amount of uncertainty the message resolves. So the primary measure in information theory is entropy. In everyday English, entropy means a state of disorder, confusion, and disorganization.

But first things first: what is this "information" I am referring to? Let's look at the concept in depth. In simple words, information is facts learned about something or someone. Notionally, information is something that can be stored in, transferred, or passed on as variables, which in turn can take different values. In other words, a variable is nothing but a unit of storage. So we get information from a variable by seeing its value, in the same way we get information from a message or letter by reading its content.

Entropy measures the "amount of information" present in a variable. This amount is estimated not only from the number of different values the variable can take, but also from the amount of surprise each value holds. Allow me to explain what I mean by surprise. Say you receive a message that is a repeat of an earlier text; that message is not informative at all. However, if the message discloses the results of a cliff-hanger US election, it is certainly highly informative. This tells us that the amount of information in a message or text is directly proportional to the amount of surprise it carries.
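To make "surprise" concrete, here is a minimal Python sketch (an illustration, not code from the original post) of the two quantities this section describes: the surprisal of a single outcome, -log2(p), and Shannon entropy, the probability-weighted average surprisal over all of a variable's values:

```python
import math

def surprisal(p: float) -> float:
    """Self-information of an outcome with probability p, in bits: -log2(p)."""
    return -math.log2(p)

def entropy(probabilities) -> float:
    """Shannon entropy: the average surprisal, sum of p * (-log2 p), in bits."""
    return sum(p * surprisal(p) for p in probabilities if p > 0)

# A repeated message has probability 1: zero surprise, zero information.
print(surprisal(1.0))       # 0.0 bits

# A 50/50 cliff-hanger election result is maximally surprising for two outcomes.
print(surprisal(0.5))       # 1.0 bit

# Entropy depends on how many values a variable takes AND how surprising each is:
print(entropy([0.5, 0.5]))  # 1.0 bit   (fair coin)
print(entropy([0.9, 0.1]))  # ~0.47 bits (biased coin: same two values, less surprise)
print(entropy([0.25] * 4))  # 2.0 bits  (four equally likely values)
```

Note that the fair coin and the biased coin take the same two values, yet their entropies differ. That is exactly the point above: the count of possible values alone does not determine information content; how surprising each value is matters too.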