

Monte Carlo Methods (2022 Edition)


Contents

Original Idea and Bernoulli Probability
Random Probability Distribution
Binomial and Gaussian Probabilities
Generalization
Bayesian Monte Carlo

Original Idea and Bernoulli Probability

The Monte Carlo Method was invented by Stanislaw Ulam (1909-1984). He was born in Poland and became a US citizen in 1941. During WWII, he was a member of the Manhattan Project team that built the nuclear bomb; in particular, he worked on the neutron diffusion problem in the "gun-type" bomb. This led him to further investigation of numerical computation for very complicated systems. With the new availability of general-purpose electronic computers, he proposed a new way to estimate statistically the probability of a certain process. This is the original idea of the Monte Carlo Method, named in reference to his uncle's fondness for gambling at the Monte Carlo Casino (Figure 01).

Figure 01 Monte Carlo Casino

Figure 02 shows the simple case of tossing a biased coin. It is described mathematically by the Bernoulli distribution function f(p,k):
f(p,k) = p^k (1-p)^(1-k) ---------- (1)
where the parameter p is the probability of a win and (1-p) the probability of a loss, with the corresponding discrete variable k = 1 or 0.
No mathematics is needed to check whether a coin is biased, i.e., whether p differs from 1/2. The outcome is determined simply by counting the numbers of wins and losses. The counting should be done by computer, since the accuracy depends on a large number of counts. Once the process is completed, the skew can be estimated from the mean (p = 1/2 is fair):

E[k] = (1)(p) + (0)(1-p) = p

The variance is :

Var[k] = E[k^2] - (E[k])^2 = p - p^2 = p(1-p)
Figure 02 MC with Bernoulli Distribution
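The counting procedure just described is easy to sketch in code. The snippet below is written in Python rather than the BASIC-256 used later in this article; the function name estimate_bias, the bias value p = 0.6, and the toss count are illustrative assumptions, not part of the original demonstration. It tosses a hypothetical biased coin many times and recovers the mean p and the variance p(1-p) by simple counting.

```python
import random

def estimate_bias(p_true, n_tosses=100_000, seed=1):
    """Toss a (possibly biased) coin n_tosses times and estimate p by counting."""
    rng = random.Random(seed)
    # k = 1 (win) with probability p_true, k = 0 (lose) otherwise
    outcomes = [1 if rng.random() < p_true else 0 for _ in range(n_tosses)]
    wins = sum(outcomes)
    mean = wins / n_tosses            # sample mean of k, an estimate of p
    variance = mean * (1.0 - mean)    # plug-in estimate of p(1-p)
    return mean, variance

if __name__ == "__main__":
    p_hat, var_hat = estimate_bias(p_true=0.6)
    print(f"estimated mean     ~ {p_hat:.4f}  (true value = 0.6)")
    print(f"estimated variance ~ {var_hat:.4f}  (true value = {0.6 * 0.4:.4f})")
```

As the number of tosses grows, the estimated mean drifts toward the true p, which is exactly how the skew of a real coin would be detected.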


For p = 0: k = 0 gives f(0,0) = 1 and k = 1 gives f(0,1) = 0 - the case of a certain loss.
For p = 1: k = 0 gives f(1,0) = 0 and k = 1 gives f(1,1) = 1 - the case of a certain win.
For p = 1/2: k = 0 gives f(1/2,0) = 1/2 and k = 1 gives f(1/2,1) = 1/2 - the case of fair play.
N.B. f(p,0) + f(p,1) = 1.
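A couple of lines of Python (an illustrative sketch, not from the original article) reproduce these three special cases and the normalization f(p,0) + f(p,1) = 1 directly from Eq. (1):

```python
def bernoulli_pmf(p, k):
    """Bernoulli probability f(p,k) = p^k * (1-p)^(1-k), Eq. (1)."""
    return p**k * (1.0 - p)**(1 - k)

# Reproduce the special cases and the normalization check above
for p in (0.0, 0.5, 1.0):
    f0, f1 = bernoulli_pmf(p, 0), bernoulli_pmf(p, 1)
    print(f"p = {p}:  f(p,0) = {f0},  f(p,1) = {f1},  sum = {f0 + f1}")
```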
In general, the Bernoulli probability can be considered as a tool for obtaining the possible outcomes of any yes-no inquiry. Such questions always lead to a Boolean-valued answer: a single bit whose value is either success/yes/true/win with probability p or failure/no/false/lose with probability (1-p). It can be used to check a (possibly biased) coin toss as explained above. It can also explain why the casino always wins - the house keeps an edge in every game, i.e., (1-p) slightly greater than 1/2 in its favor, which guarantees a profit in the long run (see "Why Does the House Always Win? A Look at Casino Profitability"). A short simulation of this long-run drift is sketched below.
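The sketch below assumes an illustrative player win probability p = 0.49 for an even-money bet (the exact edge varies by game and is not specified precisely here). Since the expected return per unit staked is p - (1-p) = 2p - 1 < 0, the player's bankroll drifts steadily downward over many bets.

```python
import random

def play(p_win=0.49, n_bets=100_000, seed=2):
    """Simulate even-money bets where the player wins each bet with probability p_win."""
    rng = random.Random(seed)
    bankroll = 0
    for _ in range(n_bets):
        bankroll += 1 if rng.random() < p_win else -1
    return bankroll

if __name__ == "__main__":
    print(f"net result after 100,000 unit bets: {play()}")
    print(f"expected value per bet: {2 * 0.49 - 1:+.2f} units")
```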



Random Probability Distribution

One application of the Monte Carlo Method is the replacement of numerical integration, in the form:

integral of f(x) from a to b ~ (number of random points falling under the curve / total number of random points) x (area of the bounding rectangle)

The points are scattered randomly over the graph, and the accuracy depends on using a large number of points.

Figure 03 Monte Carlo with Random points

See "Table of Integrals".


Instead of counting the points visually, a computer program saves a lot of effort. A DIY graph similar to the full-fledged one in Figure 03 is shown in Figure 04 to compute the value of pi. Due to the limitation of random-number generation in the programming language BASIC-256, it returns a value of ~3.0 instead of 3.1416. Anyway, this is just a demonstration of the principle (see the algorithm in Figure 05).
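For comparison with the BASIC-256 demonstration, here is the same idea sketched in Python (the point count and function name are illustrative): scatter random points in the unit square and count the fraction landing inside the quarter circle of radius 1, which approaches pi/4.

```python
import random

def estimate_pi(n_points=1_000_000, seed=4):
    """Estimate pi by counting random points that fall inside the unit quarter circle."""
    rng = random.Random(seed)
    inside = 0
    for _ in range(n_points):
        x, y = rng.random(), rng.random()
        if x * x + y * y <= 1.0:
            inside += 1
    return 4.0 * inside / n_points

if __name__ == "__main__":
    print(f"pi ~ {estimate_pi():.4f}")  # approaches 3.1416 as n_points grows
```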

Even when the function f(x) is irregular, the Monte Carlo Method still works. For example: