The Monte Carlo method was invented by Stanislaw Ulam (1909-1984). He was born in Poland and became a US citizen in 1941. During WWII he was a team member of the Manhattan Project, working in particular on the neutron diffusion problem in the "gun-type" bomb. This led him to further investigation of numerical computation for very complicated systems. With the new availability of general-purpose electronic computers, he proposed a new way to estimate the probability of a certain process statistically. This is the original idea of the Monte Carlo method, named in reference to his uncle's fondness for gambling at the Monte Carlo Casino (Figure 01).
Figure 01 Monte Carlo Casino
No mathematics is involved in checking for a biased coin, i.e., one with p other than 1/2. The outcome is determined by just counting the numbers of occurrences for p and (1-p). The counting should be done by computer, since the accuracy depends on a large number of counts. Once the process is completed, the skew can be estimated from the mean: mean = p (p = 1/2 is fair). The variance is: variance = p(1-p).
Figure 02 MC with Bernoulli Distribution
For p = 0: k = 0 gives f(0,0) = 1 and k = 1 gives f(0,1) = 0, the case of certain loss. For p = 1: k = 0 gives f(1,0) = 0 and k = 1 gives f(1,1) = 1, the case of certain win. For p = 1/2: k = 0 gives f(1/2,0) = 1/2 and k = 1 gives f(1/2,1) = 1/2, the case of fair play. N.B. f(p,0) + f(p,1) = 1.
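The coin-counting procedure above can be sketched in a few lines of Python (the function name and the choice of 100,000 trials are illustrative, not from the original program):

```python
import random

def simulate_coin(p, n_trials, seed=0):
    """Flip a (possibly biased) coin n_trials times.

    Returns the estimated mean (fraction of heads), which converges
    to p, and the plug-in estimate of the variance p*(1 - p).
    """
    rng = random.Random(seed)
    heads = sum(1 for _ in range(n_trials) if rng.random() < p)
    mean = heads / n_trials
    variance = mean * (1 - mean)   # estimate of p*(1 - p)
    return mean, variance

mean, var = simulate_coin(p=0.5, n_trials=100_000)
# a fair coin gives a mean near 0.5 and a variance near 0.25
```

A skewed coin would show up as a mean drifting away from 1/2 as the number of counts grows.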
The points are inserted randomly into the graph; the accuracy depends on using a large number of points.
Figure 03 Monte Carlo with Random Points
See "Table of Integrals". 
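The random-points idea can be illustrated with a minimal hit-or-miss sketch (the test function x^2 and the point count are chosen here for illustration; its integral over [0,1] is 1/3, which can be checked against a table of integrals):

```python
import random

def area_under(f, n_points, seed=1):
    """Scatter random points in the unit square and count the
    fraction that falls under the curve y = f(x), 0 <= f <= 1."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(n_points):
        x, y = rng.random(), rng.random()
        if y < f(x):
            hits += 1
    return hits / n_points

estimate = area_under(lambda x: x * x, 200_000)
# the exact integral of x^2 over [0,1] is 1/3
```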
 
Figure 04 DIY Monte Carlo
Figure 05 Algorithm 
The area under the irregular curve is A = (0.4×0.4) + (0.4×0.4)/2 = 0.24. Visual counting finds n = 35 points inside the area; with the total number of points N = 5×9 = 45, the ratio is r = n/N ≈ 0.78. Since the total area is 0.4×0.8 = 0.32, A ≈ 0.78×0.32 ≈ 0.25 QED (see Figure 06).
Figure 06 MC for Irregular Function
Indeed, the area under this shape can be computed with an array as suggested above, producing the same value of A ~ 0.24 (see the corresponding algorithm).
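A sketch of the same computation in Python follows. The exact shape of the curve is an assumption read off the stated areas: a flat top of height 0.4 over [0, 0.4] followed by a straight drop to zero at x = 0.8.

```python
import random

def f(x):
    """Assumed shape of the irregular curve in Figure 06."""
    return 0.4 if x <= 0.4 else 0.8 - x

def mc_area(n_points, seed=2):
    rng = random.Random(seed)
    hits = 0
    for _ in range(n_points):
        x = rng.random() * 0.8          # bounding box: 0.8 wide
        y = rng.random() * 0.4          #               0.4 tall
        if y < f(x):
            hits += 1
    return (hits / n_points) * 0.32     # ratio times box area

area = mc_area(500_000)
# the exact area is 0.4*0.4 + (0.4*0.4)/2 = 0.24
```

With many more points than the 45 counted by eye, the estimate settles close to 0.24 rather than 0.25.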
C(N,k) = N!/[k!(N-k)!]  (7), where the factorial q! = q(q-1)(q-2)...1, and 0! = 1 (see red dots in Figure 08, also see a "table"). The binomial probability is defined by the terms of the expansion (a + b)^{N} = _{} C(N,k)a^{N-k}b^{k} = 1  (8), where a = p, b = (1 - p), so that P_{Binomial}(k) = C(N,k)a^{N-k}b^{k}.
Figure 07 Binomial Distribution
In the case a = b, it is called the Gaussian or normal probability P_{Gaussian}, which is symmetric with respect to its maximum. For a ≠ b, P_{Binomial} reveals a skew from symmetry, as shown in Figure 07 for p = 0.7. P_{Binomial} goes over to the continuum as N → ∞ with the following transitions: k → x, Np → μ, Np(1-p) → σ^{2}, giving P_{Gaussian}(x) = [1/(2πσ^{2})^{1/2}] exp[-(x-μ)^{2}/2σ^{2}].
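The binomial probability of Eq. (7)-(8) and its Gaussian limit can be checked numerically; the parameters N = 20 and p = 0.7 below are illustrative choices (the source gives only p = 0.7):

```python
import math

def binom_pmf(N, p, k):
    """Probability of k successes: C(N,k) * p^k * (1-p)^(N-k),
    one term of the binomial expansion in Eq. (8)."""
    return math.comb(N, k) * p**k * (1 - p)**(N - k)

N, p = 20, 0.7
pmf = [binom_pmf(N, p, k) for k in range(N + 1)]
total = sum(pmf)                 # (a + b)^N = 1 when a + b = 1

# de Moivre-Laplace limit: the pmf approaches a Gaussian with
# mean mu = N*p and variance sigma^2 = N*p*(1-p)
mu, sigma = N * p, math.sqrt(N * p * (1 - p))
gauss = [math.exp(-(k - mu)**2 / (2 * sigma**2)) / (sigma * math.sqrt(2 * math.pi))
         for k in range(N + 1)]
```

Both curves peak at k = 14 = Np here, and the binomial terms sum to exactly 1.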
See the "computer program" for detail. Essentially, the program scales the binomial probability onto a 1×1 square graph. Figure 08 shows the area under the probability curve in red. The counting ratio between the number of points under the curve and that in the whole square is: ratio = 0.4, similar to the area obtained by numerical integration. This scaled value of the total probability can be reverted to "1" by multiplying by the inverse of the scaling factor.
Figure 08 Binomial Probability by Monte Carlo
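A minimal sketch of this scaled hit-or-miss counting follows. The parameters and the particular scaling (k mapped to x = k/N, heights divided by the peak value) are assumptions for illustration; the original program's scaling is not given.

```python
import math, random

N, p = 20, 0.7   # illustrative parameters, not from the source
pmf = [math.comb(N, k) * p**k * (1 - p)**(N - k) for k in range(N + 1)]
peak = max(pmf)

def scaled(x):
    """Binomial pmf squeezed onto the unit square: k -> x = k/N,
    P(k) -> P(k)/peak (assumed nearest-k step curve)."""
    k = round(x * N)
    return pmf[k] / peak

rng = random.Random(3)
n_points = 200_000
hits = sum(1 for _ in range(n_points)
           if rng.random() < scaled(rng.random()))
ratio = hits / n_points

# Undo the scaling: each k occupies width 1/N and heights were divided
# by peak, so the total probability is recovered as roughly:
total = ratio * peak * N   # comes back close to 1
```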
In general, the integrand can be very complicated and becomes more formidable by the addition of a probability density φ(x). Fortunately, many standard distributions are available in the form of "subroutines" for public use (at a fee). For example, with f(x) = x^{2}, it can be evaluated using 1/2 of the array for the binomial distribution from the previous examples. Figure 09 shows the integrand obtained by the Monte Carlo method in a few red dots (see "computer program" + scaling). The solid curves are evaluated by the conventional method for comparison. Integration can be performed by the Monte Carlo method as usual if each piece of the integrand is stored in another array.
Figure 09 Find Integrand with Probability Density by Monte Carlo
Essentially, the program picks a random value of x, at which f(x) is multiplied by φ(x). The integrand is evaluated as f(x)φ(x). A corresponding curve would emerge in a plot with an increasing number of iterations, or by just running the x index from x_{i} to x_{f} (5 to 10 here).
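The steps above can be sketched as follows, taking f(x) = x^2 and the binomial pmf as the density φ (the parameters N = 20, p = 0.7 are assumed for illustration). Summing f(k)φ(k) over the support gives the integral directly; drawing random k from φ and averaging f(k) converges to the same value:

```python
import math, random

N, p = 20, 0.7
phi = [math.comb(N, k) * p**k * (1 - p)**(N - k) for k in range(N + 1)]

def f(x):
    return x * x

# Direct sum of the integrand f(x)*phi(x) over the discrete support.
exact = sum(f(k) * phi[k] for k in range(N + 1))

rng = random.Random(4)
def sample_k():
    """Draw one value of k with probability phi[k] (inverse CDF)."""
    u, acc = rng.random(), 0.0
    for k, w in enumerate(phi):
        acc += w
        if u < acc:
            return k
    return N

n_iter = 100_000
mc = sum(f(sample_k()) for _ in range(n_iter)) / n_iter
# E[X^2] = N*p*(1-p) + (N*p)^2 = 200.2 for these parameters
```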
together with another (uncorrelated) normal distribution into the process (see an example in a "computer program"). Figure 10 shows the result pictorially in a 3D graph (no such convenience exists beyond the bivariate case). The newly created normal distribution can be stored in another array for further processing, but the size of the array and the computing time rapidly become unmanageable as the number of dimensions increases.
Figure 10
Figure 11
Assuming unlimited computing resources, there is no problem extending the procedure to multivariate functions (dimension > 2). As shown in Figure 11, the probability density distribution φ(x_{1},x_{2},...) can take various forms to generate f(x_{1},x_{2},...), and then go on to produce the integral of the whole thing.
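Sampling sidesteps the exploding arrays: instead of tabulating φ on a multidimensional grid, one draws the variates directly and averages f over them. A minimal sketch for the bivariate case, with two uncorrelated standard normals and an illustrative test function, assumed here for demonstration:

```python
import random

def mc_bivariate(f, n_samples, seed=5):
    """Estimate the integral of f(x1,x2)*phi(x1)*phi(x2), phi being
    the standard normal density, by averaging f over samples of two
    uncorrelated normal variates."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n_samples):
        x1, x2 = rng.gauss(0, 1), rng.gauss(0, 1)
        total += f(x1, x2)
    return total / n_samples

# E[x1^2 + x2^2] = 1 + 1 = 2 for two independent standard normals
est = mc_bivariate(lambda x1, x2: x1 * x1 + x2 * x2, 200_000)
```

The cost of this approach grows only with the number of samples, not with the number of dimensions, which is why Monte Carlo survives where grid-based arrays do not.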
Bayes' Theorem introduces a new kind of probability P(H|D), which gives the probability of "event H" given that "event D" has taken place. As shown in Figure 12, P(H|D) = [P(D|H) × P(H)] / P(D)  (11), where P(H) and P(D) are the probabilities of observing H and D on their own.

Figure 12 Bayesian Probability
Actually, it is somewhat misleading to label event H as "Hypothesis" (which causes so much distress for Calvin, the comic character; see the insert in Figure 12) and D as "Data".
From available statistical data, the probability of smoke P(B) ~ 50%, i.e., rather common, while the probability of fire P(A) ~ 1%, i.e., very rare. Given that the probability of smoke in the presence of fire is almost certain, P(B|A) ~ 99%, then P(A|B) = [P(B|A) × P(A)] / P(B) = (0.99 × 0.01) / 0.5 ~ 2%.

Figure 13 Smoke and Fire
Lastly, the joint probability P(A and B) = P(A|B) × P(B) = P(B|A) × P(A) ~ 1% is the probability that both fire and smoke appear.
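The smoke-and-fire numbers can be checked directly with Eq. (11):

```python
def bayes(p_d_given_h, p_h, p_d):
    """Eq. (11): P(H|D) = P(D|H) * P(H) / P(D)."""
    return p_d_given_h * p_h / p_d

p_fire = 0.01              # P(A): fire is very rare
p_smoke = 0.50             # P(B): smoke is rather common
p_smoke_given_fire = 0.99  # P(B|A): fire almost always makes smoke

p_fire_given_smoke = bayes(p_smoke_given_fire, p_fire, p_smoke)
# -> 0.0198, i.e., about 2%

joint = p_smoke_given_fire * p_fire   # P(A and B) ~ 1%
```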



Figure 14 Bayesian Monte Carlo
