{"id":2577,"date":"2025-11-12T10:08:31","date_gmt":"2025-11-12T16:08:31","guid":{"rendered":"https:\/\/sites.imsa.edu\/hadron\/?p=2577"},"modified":"2025-11-15T21:27:37","modified_gmt":"2025-11-16T03:27:37","slug":"information-theory-and-its-real-world-applications","status":"publish","type":"post","link":"https:\/\/sites.imsa.edu\/hadron\/2025\/11\/12\/information-theory-and-its-real-world-applications\/","title":{"rendered":"Information Theory and Its Real World Applications"},"content":{"rendered":"<p style=\"text-align: center\"><b>Information Theory and Its Real World Applications<\/b><\/p>\n<p style=\"text-align: center\"><span style=\"font-weight: 400\">Written by: Pranshu Nautiyal<\/span><\/p>\n<p style=\"text-align: center\"><b>Introduction<\/b><\/p>\n<p><span style=\"font-weight: 400\">One of the most important quantitative measures of recent centuries is information and how it is stored. Information theory lets us treat information as a measurable quantity and shows how to reduce the amount of uncertainty, or entropy, in a situation. While information theory might not seem that important in everyday life, it has many real-world applications, such as data storage and security. One of the most fascinating ways to interact with information theory, however, is through board games, as it helps you exploit their rules in many ways.<\/span><\/p>\n<p>&nbsp;<\/p>\n<p style=\"text-align: center\"><b>Information Theory and Entropy<\/b><\/p>\n<p><span style=\"font-weight: 400\">While information is commonly thought of as an abstract concept relating to facts and truth, it can also be described as the arrangement of what is conveyed. For example, combining the sounds humans can make with the gestures we use forms the very basis of the languages humans have spoken for thousands of years. Many animals use vocalizations to convey information to others of their species, such as mating calls or warnings of danger. 
Telegraphs, another medium for conveying information, use a simple system called Morse code, which combines two symbols to send messages over long distances. In the simplest terms, information is measured in bits, single units that hold one of two values, such as true or false. Computers store information in binary, a representation that encodes everything as sequences of 0\u2019s and 1\u2019s, with unique combinations of the two conveying different information. Every unit of communication can eventually be reduced to a certain number of bits at its core. Information theory also defines a quantity called entropy, which measures the average uncertainty, or information content, of a probability distribution. The formula used to calculate entropy is H(P) = \u2212\u2211p(x) log\u2082 p(x), where H(P) is the entropy and p(x) is the probability of a specific event x. Entropy is worth measuring because it captures how uncertain a situation is: it is highest when every outcome is equally likely, and a question or guess whose answer has high entropy gives you the most knowledge once it is resolved. 
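To make the entropy formula above concrete, here is a minimal sketch (not part of the original article); the function name and the example probabilities are illustrative assumptions:

```python
import math

def entropy(probs):
    """Shannon entropy H(P) = -sum p(x) * log2 p(x), in bits."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin has two equally likely outcomes: exactly 1 bit of uncertainty.
print(entropy([0.5, 0.5]))   # 1.0

# A biased coin is more predictable, so its entropy is lower.
print(entropy([0.9, 0.1]))   # about 0.469

# A certain outcome carries no uncertainty at all.
print(entropy([1.0]))        # 0.0
```

Note that a uniform distribution always maximizes entropy, which is why a fair coin is the most uncertain two-outcome event.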
For all of these reasons, entropy and information theory are very important to our everyday lives and can be seen in many areas, such as games and security.<\/span><\/p>\n<p><img loading=\"lazy\" decoding=\"async\" class=\" wp-image-2581 aligncenter\" src=\"http:\/\/sites.imsa.edu\/hadron\/files\/2025\/11\/download.png\" alt=\"\" width=\"752\" height=\"563\" \/><\/p>\n<p><span style=\"font-weight: 400\">Graph by <\/span><a href=\"https:\/\/machinelearningmastery.com\/\"><span style=\"font-weight: 400\">https:\/\/machinelearningmastery.com\/<\/span><\/a><span style=\"font-weight: 400\">\u00a0<\/span><\/p>\n<p><span style=\"font-weight: 400\">This graph shows that the greater the probability of a given event, the less information its occurrence actually conveys.<\/span><\/p>\n<p>&nbsp;<\/p>\n<p style=\"text-align: center\"><b>How to Exploit Entropy<\/b><\/p>\n<p><span style=\"font-weight: 400\">The first person to exploit entropy in games like poker was a scientist named John Kelly Jr., who studied how a gambler can maximize expected earnings on every bet based on information learned in that round. In this setting, a bit is the amount of entropy in a bettable event with two equally likely outcomes, like winning or losing. Kelly\u2019s point was that to make our money grow exponentially from bets, the value of the side information we receive can be calculated as I(X;Y) = E_Y[D_KL(P(X|Y) || P(X|I))], where Y is the side information, X is the outcome of the bettable event, and I is the state of the bookmaker\u2019s knowledge. Since you never know all the information you could have, the formula won\u2019t guarantee a win every single time. This principle and formula can be applied to a wide variety of games, from poker to even Wordle. If you knew which letters appear in the word and how they are arranged, it would be much easier to figure out what the word is. 
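In the spirit of the Wordle example, the value of side information can be sketched as the expected drop in entropy. This is a hypothetical illustration: the eight candidate words and the yes/no hint are made-up numbers, not taken from the article.

```python
import math

def entropy(probs):
    """Shannon entropy in bits: H(P) = -sum p(x) * log2 p(x)."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Hypothetical puzzle: 8 equally likely candidate words, so 3 bits of
# uncertainty before any hints arrive.
h_prior = entropy([1 / 8] * 8)

# A yes/no hint splits the candidates into groups of 2 ("yes") and 6 ("no").
p_yes, p_no = 2 / 8, 6 / 8
h_posterior = p_yes * entropy([1 / 2] * 2) + p_no * entropy([1 / 6] * 6)

# The value of the hint is the expected entropy reduction, i.e. the
# mutual information I(X; hint) between the answer and the hint.
info_gain = h_prior - h_posterior  # about 0.81 bits
print(info_gain)
```

A hint that split the candidates 4/4 would be worth a full bit, which is why strategies like 3Blue1Brown's Wordle solver prefer guesses whose possible responses are as evenly distributed as possible.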
In most gambling, this principle can be exploited to improve your prediction of the winner, especially if you have prior knowledge of the topic.<\/span><\/p>\n<p><img loading=\"lazy\" decoding=\"async\" class=\"wp-image-2580 aligncenter\" src=\"http:\/\/sites.imsa.edu\/hadron\/files\/2025\/11\/download-1.png\" alt=\"\" width=\"864\" height=\"647\" \/><\/p>\n<p style=\"text-align: center\"><span style=\"font-weight: 400\">Graph by <\/span><a href=\"https:\/\/machinelearningmastery.com\/\"><span style=\"font-weight: 400\">https:\/\/machinelearningmastery.com\/<\/span><\/a><span style=\"font-weight: 400\">\u00a0<\/span><\/p>\n<p style=\"text-align: center\"><span style=\"font-weight: 400\">This graph shows how the shape of a probability distribution affects entropy: the more evenly probability is spread across the outcomes, the higher the entropy, and the more information you gain once the outcome is revealed.<\/span><\/p>\n<p style=\"text-align: center\"><b>Real World Applications<\/b><\/p>\n<p><span style=\"font-weight: 400\">While information theory certainly is interesting to exploit for games and gambling, its more practical use comes from its roles in the real world, like security. When a message is encrypted, it is the hacker\u2019s job to decrypt it and steal whatever data it contains. A perfect ciphertext would reveal nothing about the plaintext or the cipher used, keeping the hacker\u2019s uncertainty, and therefore entropy, as high as possible. This is why cybersecurity specialists often have to use information theory.\u00a0<\/span><\/p>\n<p style=\"text-align: center\"><b>Conclusion<\/b><\/p>\n<p><span style=\"font-weight: 400\">Information theory is one of the most fascinating theories, and it can be applied to many different situations due to its broad scope. 
Information is one of the most important resources we have, and a fundamental grasp of it and of how entropy works matters for topics as mundane as games and as serious as security.<\/span><\/p>\n<p>&nbsp;<\/p>\n<p style=\"text-align: center\"><strong>References and Sources<\/strong><\/p>\n<h4><span style=\"font-weight: 400\">Brownlee, Jason. \u201cA Gentle Introduction to Information Entropy.\u201d Machine Learning Mastery, 13 July 2020, machinelearningmastery.com\/what-is-information-entropy. Accessed 9 Oct. 2025.<\/span><\/h4>\n<h4><span style=\"font-weight: 400\">\u201cGambling and Information Theory.\u201d Wikipedia, Wikimedia Foundation, 3 Oct. 2025, en.wikipedia.org\/wiki\/Gambling_and_information_theory. Accessed 9 Oct. 2025.<\/span><\/h4>\n<h4><span style=\"font-weight: 400\">Khan Academy. \u201cOrigins of Written Language Computer Science.\u201d YouTube, uploaded by Khan Academy, 28 Apr. 2014, www.youtube.com\/watch?v=lkeXaqoXDYQ. Accessed 9 Oct. 2025.<\/span><\/h4>\n<h4><span style=\"font-weight: 400\">Khan Academy. \u201cWhat Is Information Theory? Journey into Information Theory.\u201d YouTube, uploaded by Khan Academy, 28 Apr. 2014, www.youtube.com\/watch?v=d9alWZRzBWk. Accessed 9 Oct. 2025.<\/span><\/h4>\n<h4><span style=\"font-weight: 400\">3Blue1Brown. 
\u201cSolving Wordle Using Information Theory.\u201d YouTube, uploaded by 3Blue1Brown, 27 Jan. 2022, www.youtube.com\/watch?v=v68zYyaEmEA. Accessed 9 Oct. 2025.<\/span><\/h4>\n","protected":false},"excerpt":{"rendered":"<p>Information Theory and Its Real World Applications Written by: Pranshu Nautiyal Introduction One of the most qualitative measures of recent centuries is information and how it is stored. Information theory lets us see how information can be a qualitative data factor, and how to reduce<\/p>\n","protected":false},"author":1092,"featured_media":0,"comment_status":"open","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"ngg_post_thumbnail":0,"footnotes":""},"categories":[11],"tags":[],"class_list":["post-2577","post","type-post","status-publish","format-standard","hentry","category-math"],"aioseo_notices":[],"_links":{"self":[{"href":"https:\/\/sites.imsa.edu\/hadron\/wp-json\/wp\/v2\/posts\/2577","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/sites.imsa.edu\/hadron\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/sites.imsa.edu\/hadron\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/sites.imsa.edu\/hadron\/wp-json\/wp\/v2\/users\/1092"}],"replies":[{"embeddable":true,"href":"https:\/\/sites.imsa.edu\/hadron\/wp-json\/wp\/v2\/comments?post=2577"}],"version-history":[{"count":1,"href":"https:\/\/sites.imsa.edu\/hadron\/wp-json\/wp\/v2\/posts\/2577\/revisions"}],"predecessor-version":[{"id":2582,"href":"https:\/\/sites.imsa.edu\/hadron\/wp-json\/wp\/v2\/posts\/2577\/revisions\/2582"}],"wp:attachment":[{"href":"https:\/\/sites.imsa.edu\/hadron\/wp-json\/wp\/v2\/media?parent=2577"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/sites.imsa.edu\/hadron\/wp-json\/wp\/v2\/cat
egories?post=2577"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/sites.imsa.edu\/hadron\/wp-json\/wp\/v2\/tags?post=2577"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}