Via: Junk Worth Knowing
The infinite monkey theorem revolves around the idea that a monkey hitting random keys on a keyboard for an infinite amount of time will almost surely type a given text, usually the complete works of William Shakespeare. Responding to the kind of math shown below, physicists Charles Kittel and Herbert Kroemer put it this way: "The probability of Hamlet is therefore zero in any operational sense of an event…", and the statement that the monkeys must eventually succeed "gives a misleading conclusion about very, very large numbers."
Typing monkeys have made numerous appearances in media, from The Simpsons to The Hitchhiker's Guide to the Galaxy to a Bob Newhart stand-up routine, giving them a notable position in pop culture. In 2003, Plymouth University researchers put six macaques in a cage with a desktop computer. The monkeys proceeded to bash the machine with a rock, urinate on it, and type the letter S a lot. The results were published in the book Notes Towards The Complete Works of Shakespeare.
The infinite monkey theorem is often used as an example of the risks of reasoning about infinity by substituting an enormously large, but finite, number for it.
The first step to understanding the theorem is understanding what is meant by "almost surely". In the theorem, "almost surely" is a mathematical term with a precise meaning. The difference between an event being almost sure and being sure is the subtle difference between happening with probability 1 and happening in every possible outcome.
If an event is sure, then it will always happen; no other outcome can possibly occur. If an event is almost sure, then other outcomes remain theoretically possible in the sample space (the set of all possible outcomes), but they have probability zero. In the monkey setting: as the number of keystrokes grows, the probability of never producing the target text shrinks toward zero.
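A quick back-of-the-envelope sketch (my own illustration, not from the article) makes "almost surely" concrete. If one k-letter attempt spells a target word with probability p, then the chance of missing in every one of n independent attempts is (1 − p)ⁿ, which tends to 0 as n grows:

```python
# Chance that n independent 6-letter attempts NEVER spell "banana".
# Each attempt succeeds with probability p = (1/26)^6.
p = (1 / 26) ** 6

for n in [10**6, 10**8, 10**10, 10**12]:
    miss = (1 - p) ** n  # probability of missing in all n attempts
    print(f"n = {n:>16,}: P(never typed) = {miss:.6f}")
```

The "never" probability starts near 1 but collapses toward 0 as n grows, which is exactly what "almost surely succeeds in the limit" means.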
Secondly, the monkeys at a typewriter are only used to represent a source of random data. In reality, monkeys at a typewriter are a very poor metaphor for randomness: a monkey could be taught to type simple words, or be influenced by things in its environment, or use its own intellect to produce non-random results. In the infinite monkey theorem, the source of the data has to be truly random. That is, every letter has the same probability of being typed regardless of what letters come before it.
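A minimal stand-in for the idealized "monkey" (the function name is my own) is a generator that draws each letter independently and uniformly, so every letter has probability 1/26 no matter what came before:

```python
import random
import string

def monkey_keystrokes(n, seed=None):
    """Draw n letters independently and uniformly from a-z."""
    rng = random.Random(seed)
    return "".join(rng.choice(string.ascii_lowercase) for _ in range(n))

print(monkey_keystrokes(20, seed=42))  # 20 uniformly random lowercase letters
```

Unlike a real monkey, this source has no memory and no preferences, which is the property the theorem actually requires.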
When events are independent, the probability of them all happening is the product of the individual probabilities. Think of it this way: the first line of Hamlet is "Who's there?". If we omit spaces, capitalization, and punctuation for simplicity's sake, we get "whosthere". The probability of the first letter correctly being "w" is 1 out of 26 (w being 1 letter out of the 26 in the alphabet). The probability of the second letter being "h" is the same, 1/26, and so on and so forth. Therefore the probability that a monkey will correctly type the first line of 9 letters is: (1/26) x (1/26) x (1/26) x (1/26) x (1/26) x (1/26) x (1/26) x (1/26) x (1/26), or more simply: (1/26)^9
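The multiplication above is small enough to check directly:

```python
# Independent events multiply: the chance that nine uniform random
# letters spell "whosthere" is 1/26 per letter, nine times over.
p_line = (1 / 26) ** 9
print(p_line)   # ≈ 1.84e-13

# Equivalently, there are 26^9 equally likely 9-letter strings and
# only one of them is correct.
print(26 ** 9)  # 5429503678976, about 5.4 trillion
```

So a single random 9-letter attempt succeeds roughly once in 5.4 trillion tries.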
Which equals about 0.0000000000184%, or roughly 1 in 5.4 trillion for those of you who are counting. Given that Hamlet has about 130,000 characters in it, the probability of a monkey typing it out is 1 in 3.4 × 10^183,946. It also means that, on average, the monkey would need about that many attempts before he or she completes Hamlet. If the world were filled with monkeys typing for all time, their total probability of producing a single instance of Hamlet would still be less than one in 10^183,800.
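The Hamlet-sized number is far too small for ordinary floating point, so the standard trick is to work with base-10 logarithms instead. A short sketch, assuming the article's figures of 130,000 characters and a 26-letter alphabet:

```python
import math

# (1/26)^130000 underflows a float, so compute its base-10 logarithm:
# log10(P) = -130000 * log10(26)
chars = 130_000
log10_p = -chars * math.log10(26)
print(log10_p)  # ≈ -183946.5
```

So P ≈ 10^−183946.5 ≈ 1 / (3.4 × 10^183,946), matching the figure quoted above.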