This whole mess started with Claude Shannon in 1948 figuring out a method to quantify the information in a message, in a paper called A Mathematical Theory of Communication, the work that launched the field we now call information theory. Confused? Don’t Panic! His big idea was that you could assign a number to the amount of information a message contains based upon how many yes-or-no questions it takes to define the message.

Think of it as a mathematical game of hangman where every letter has a number assigned: A = 1, B = 2, C = 3 …  Z = 26. Now, as in hangman, we are given the empty spaces that represent our message. To figure out the message and the amount of information it contains, I begin by asking if the first letter is greater than 13, the halfway point of the alphabet. Finding that it is not, I then ask if the first letter is greater than 6, and I keep halving the remaining range until I find that the first letter is C. Using only the 26-character alphabet, it takes at most 5 yes-or-no questions to pin down each letter, since 5 halvings cover 2^5 = 32 possibilities, which is more than 26. Each yes-or-no question is what Shannon coined a bit. So it takes about 5 bits of information to know a letter for certain. Continuing in the same manner we get all three letters, each taking at most 5 bits, so this message contains roughly 15 bits of information. (A short code sketch of this guessing strategy follows the example below.)

C _ _   →   C A T
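To make the question-counting concrete, here is a minimal sketch in Python of the halving strategy described above. The function name questions_needed and the A = 1 … Z = 26 numbering are illustrative choices of mine, not anything from Shannon’s paper:

```python
import math

def questions_needed(target, alphabet_size=26):
    """Count the yes/no questions the halving strategy asks to pin
    down one letter, with letters numbered 1..alphabet_size."""
    low, high = 1, alphabet_size
    questions = 0
    while low < high:
        mid = (low + high) // 2
        questions += 1                # "Is the letter greater than mid?"
        if target > mid:              # yes: discard the lower half
            low = mid + 1
        else:                         # no: discard the upper half
            high = mid
    return questions

message = "CAT"
per_letter = [questions_needed(ord(c) - ord("A") + 1) for c in message]
print(per_letter)                     # [5, 5, 4] -> 14 questions in all
print(math.ceil(math.log2(26)))       # 5: worst case per letter (2**5 = 32 >= 26)
```

Notice that some letters fall out in 4 questions and others need 5; with 26 equally likely letters the average cost is log2(26) ≈ 4.7 bits, which is why 5 bits per letter is a worst case rather than an exact figure.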

Implications

Almost immediately, the potential of quantifying information was seen in other fields. However, Shannon was very opposed to this expansion and viewed it more as exploitation of his work. Either way, Shannon was key to the beginning of thinking of information as a thing, though there is a lot of disagreement about the nature of that thing.

Nobel Prize winner Gerard ’t Hooft explains that he suspects information is part of a pre-quantum theory, meaning information is more fundamental to the nature of the universe than quantum mechanics, and an Information Theory 2.0 may be emerging as an alternative to String Theory.