Is Noise Information?
Posted on January 14th, 2008 at 7:59 pm by necco

Claude E. Shannon, the founder of modern information theory, argued that it is in his 1948 paper "A Mathematical Theory of Communication." He showed that the information entropy of a signal is H(X) = -∑ p(x) log p(x), summed over all x in the set X, where p(x) is the probability that message x, out of the X possible messages, was received.

Some just think it's noise because they don't want to listen.
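
To make the formula concrete, here is a minimal Python sketch (my own illustration, not from Shannon's paper) that estimates H(X) from a stream of observed messages. It uses relative frequencies as the estimate of p(x) and log base 2, so the result is in bits; the function name is hypothetical.

    import math
    from collections import Counter

    def shannon_entropy(messages):
        # Estimate H(X) = -sum over x of p(x) * log2(p(x)), with p(x)
        # taken as the relative frequency of each symbol in the stream.
        # (Illustrative helper; log base 2 gives the answer in bits.)
        counts = Counter(messages)
        total = len(messages)
        return sum(-(n / total) * math.log2(n / total)
                   for n in counts.values())

    # A constant signal carries no information, while a stream that is
    # uniform over its symbols (what we'd casually call "noise")
    # maximizes entropy -- in Shannon's sense, it is maximally informative.
    print(shannon_entropy("aaaa"))  # 0.0 bits per symbol
    print(shannon_entropy("abcd"))  # 2.0 bits per symbol (uniform over 4)

The punchline of the example is the same as the post's: by this measure, a perfectly unpredictable stream is the richest in information, not the poorest.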