Random Numbers

Introduction

The building block of a simulation study is the ability to generate random numbers,
where a random number represents the value of a random variable uniformly
distributed on (0, 1). In this chapter we explain how such numbers are computer
generated and also begin to illustrate their uses.

3.1 Pseudorandom Number Generation

Whereas random numbers were originally generated either manually or mechanically, using such techniques as spinning wheels, dice rolling, or card shuffling, the modern approach is to use a computer to successively generate pseudorandom numbers. These pseudorandom numbers constitute a sequence of values which, although deterministically generated, have all the appearances of being independent uniform (0, 1) random variables.
One of the most common approaches to generating pseudorandom numbers starts with an initial value $x_0$, called the seed, and then recursively computes successive values $x_n$, $n \geq 1$, by letting

$$x_n = a x_{n-1} \text{ modulo } m \qquad (3.1)$$

where $a$ and $m$ are given positive integers, and where the above means that $a x_{n-1}$ is divided by $m$ and the remainder is taken as the value of $x_n$. Thus, each $x_n$ is one of $0, 1, \ldots, m - 1$, and the quantity $x_n / m$—called a pseudorandom number—is taken as an approximation to the value of a uniform (0, 1) random variable.


The approach specified by Equation (3.1) to generate random numbers is called the multiplicative congruential method. Since each of the numbers $x_n$ assumes one of the values $0, 1, \ldots, m - 1$, it follows that after some finite number (of at most $m$) of generated values a value must repeat itself; and once this happens the whole sequence will begin to repeat. Thus, we want to choose the constants $a$ and $m$ so that, for any initial seed $x_0$, the number of variables that can be generated before this repetition occurs is large.
In general the constants a and m should be chosen to satisfy three criteria:

1. For any initial seed, the resultant sequence has the “appearance” of being a
sequence of independent uniform (0, 1) random variables.
2. For any initial seed, the number of variables that can be generated before
repetition begins is large.
3. The values can be computed efficiently on a digital computer.

A guideline that appears to be of help in satisfying the above three conditions is that $m$ should be chosen to be a large prime number that can be fitted to the computer word size. For a 32-bit word machine (where the first bit is a sign bit) it has been shown that the choices of $m = 2^{31} - 1$ and $a = 7^5 = 16{,}807$ result in desirable properties. (For a 36-bit word machine the choices of $m = 2^{35} - 31$ and $a = 5^5$ appear to work well.)
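The following is a minimal sketch, in Python and not from the text, of the multiplicative congruential method of Equation (3.1), using the constants $m = 2^{31} - 1$ and $a = 7^5$ recommended above for a 32-bit word machine.

```python
# A minimal sketch of the multiplicative congruential method, Equation (3.1),
# with the 32-bit constants suggested in the text: m = 2**31 - 1, a = 7**5.

M = 2**31 - 1     # a large prime fitting the computer word size
A = 7**5          # 16,807

def multiplicative_congruential(seed, count):
    """Yield `count` pseudorandom numbers x_n / m starting from the seed x_0."""
    x = seed      # the seed x_0 should be an integer between 1 and m - 1
    for _ in range(count):
        x = (A * x) % M      # x_n = a * x_{n-1} modulo m
        yield x / M          # x_n / m approximates a uniform (0, 1) value

# Example: the first five pseudorandom numbers from seed x_0 = 1.
print(list(multiplicative_congruential(seed=1, count=5)))
```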
Another generator of pseudorandom numbers uses recursions of the type

$$x_n = (a x_{n-1} + c) \text{ modulo } m$$

Such generators are called mixed congruential generators (as they involve both an additive and a multiplicative term). When using generators of this type, one often chooses $m$ to equal the computer's word length, since this makes the computation of $(a x_{n-1} + c)$ modulo $m$—that is, the division of $a x_{n-1} + c$ by $m$—quite efficient.
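A corresponding sketch of a mixed congruential generator follows; the constants below are illustrative choices only, not prescribed by the text, with $m = 2^{32}$ matching a 32-bit word length.

```python
# A minimal sketch of a mixed congruential generator,
# x_n = (a * x_{n-1} + c) modulo m.  The constants are illustrative only;
# m = 2**32 matches a 32-bit word length, which keeps the modulo cheap.

M = 2**32
A = 1_664_525         # illustrative multiplier (a commonly cited LCG choice)
C = 1_013_904_223     # illustrative increment

def mixed_congruential(seed, count):
    """Yield `count` pseudorandom numbers x_n / m starting from x_0 = seed."""
    x = seed
    for _ in range(count):
        x = (A * x + C) % M
        yield x / M

print(list(mixed_congruential(seed=3, count=5)))
```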
As our starting point in the computer simulation of systems we suppose that
we can generate a sequence of pseudorandom numbers which can be taken as an
approximation to the values of a sequence of independent uniform (0, 1) random
variables. That is, we do not explore the interesting theoretical questions, which
involve material outside the scope of this text, relating to the construction of “good”
pseudorandom number generators. Rather, we assume that we have a “black box”
that gives a random number on request.

3.2 Using Random Numbers to Evaluate Integrals


One of the earliest applications of random numbers was in the computation of
integrals. Let $g(x)$ be a function and suppose we wanted to compute $\theta$ where

$$\theta = \int_0^1 g(x)\, dx$$

To compute the value of $\theta$, note that if $U$ is uniformly distributed over (0, 1), then we can express $\theta$ as

$$\theta = E[g(U)]$$

If $U_1, \ldots, U_k$ are independent uniform (0, 1) random variables, it thus follows that the random variables $g(U_1), \ldots, g(U_k)$ are independent and identically distributed random variables having mean $\theta$. Therefore, by the strong law of large numbers, it follows that, with probability 1,

$$\frac{\sum_{i=1}^{k} g(U_i)}{k} \to E[g(U)] = \theta \quad \text{as } k \to \infty$$

Hence we can approximate $\theta$ by generating a large number of random numbers $u_i$ and taking as our approximation the average value of $g(u_i)$. This approach to approximating integrals is called the Monte Carlo approach.
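A minimal sketch of this Monte Carlo estimate, assuming Python's `random.random()` as the black-box source of random numbers, might look as follows; the integrand used in the check is illustrative and not from the text.

```python
import math
import random

def monte_carlo_01(g, k):
    """Estimate theta, the integral of g over (0, 1), by averaging g(U_1), ..., g(U_k)."""
    return sum(g(random.random()) for _ in range(k)) / k

# Illustrative check: the integral of e^x over (0, 1) equals e - 1, about 1.7183.
print(monte_carlo_01(math.exp, 100_000))
```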
If we wanted to compute

$$\theta = \int_a^b g(x)\, dx$$

then, by making the substitution $y = (x - a)/(b - a)$, $dy = dx/(b - a)$, we see that

$$\theta = \int_0^1 g(a + [b - a]y)\,(b - a)\, dy = \int_0^1 h(y)\, dy$$

where $h(y) = (b - a)\, g(a + [b - a]y)$. Thus, we can approximate $\theta$ by continually generating random numbers and then taking the average value of $h$ evaluated at these random numbers.
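Under the same assumptions, a short sketch for an interval $(a, b)$; the quadratic integrand in the check is illustrative only.

```python
import random

def monte_carlo_ab(g, a, b, k):
    """Estimate the integral of g over (a, b) by averaging
    h(y) = (b - a) * g(a + (b - a) * y) at k uniform (0, 1) points."""
    total = 0.0
    for _ in range(k):
        y = random.random()
        total += (b - a) * g(a + (b - a) * y)
    return total / k

# Illustrative check: the integral of x^2 over (-2, 2) equals 16/3, about 5.333.
print(monte_carlo_ab(lambda x: x * x, -2.0, 2.0, 100_000))
```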
Similarly, if we wanted

$$\theta = \int_0^\infty g(x)\, dx$$

we could apply the substitution $y = 1/(x + 1)$, $dy = -dx/(x + 1)^2 = -y^2\, dx$, to obtain the identity

$$\theta = \int_0^1 h(y)\, dy$$

where

$$h(y) = \frac{g\left(\frac{1}{y} - 1\right)}{y^2}$$
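The infinite-range case translates the same way; the exponential integrand in the check is illustrative only.

```python
import math
import random

def monte_carlo_0_inf(g, k):
    """Estimate the integral of g over (0, infinity) by averaging
    h(y) = g(1/y - 1) / y^2 at k uniform (0, 1) points."""
    total = 0.0
    for _ in range(k):
        y = random.random()
        total += g(1.0 / y - 1.0) / (y * y)
    return total / k

# Illustrative check: the integral of e^{-x} over (0, infinity) equals 1.
print(monte_carlo_0_inf(lambda x: math.exp(-x), 100_000))
```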

The utility of using random numbers to approximate integrals becomes more apparent in the case of multidimensional integrals. Suppose that $g$ is a function with an $n$-dimensional argument and that we are interested in computing

$$\theta = \int_0^1 \int_0^1 \cdots \int_0^1 g(x_1, \ldots, x_n)\, dx_1\, dx_2 \cdots dx_n$$

The key to the Monte Carlo approach to estimating $\theta$ lies in the fact that $\theta$ can be expressed as the following expectation:

$$\theta = E\left[g(U_1, \ldots, U_n)\right]$$

where $U_1, \ldots, U_n$ are independent uniform (0, 1) random variables. Hence, if we generate $k$ independent sets, each consisting of $n$ independent uniform (0, 1) random variables,

$$U_1^1, \ldots, U_n^1$$
$$U_1^2, \ldots, U_n^2$$
$$\vdots$$
$$U_1^k, \ldots, U_n^k$$

then, since the random variables $g(U_1^i, \ldots, U_n^i)$, $i = 1, \ldots, k$, are all independent and identically distributed random variables with mean $\theta$, we can estimate $\theta$ by $\sum_{i=1}^{k} g(U_1^i, \ldots, U_n^i)/k$.

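A sketch of this multidimensional estimator, again assuming `random.random()` as the source of random numbers; the two-dimensional integrand in the check is illustrative.

```python
import random

def monte_carlo_unit_cube(g, n, k):
    """Estimate the integral of g over the n-dimensional unit cube by averaging
    g(U_1, ..., U_n) over k independent sets of n uniform (0, 1) random numbers."""
    total = 0.0
    for _ in range(k):
        u = [random.random() for _ in range(n)]
        total += g(u)
    return total / k

# Illustrative check: the integral of (x1 + x2) over the unit square equals 1.
print(monte_carlo_unit_cube(lambda u: u[0] + u[1], n=2, k=100_000))
```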
For an application of the above, consider the following approach to estimating $\pi$.
Example 3a (The Estimation of $\pi$). Suppose that the random vector $(X, Y)$ is uniformly distributed in the square of area 4 centered at the origin. That is, it is a random point in the region specified in Figure 3.1. Let us consider now the probability that this random point in the square is contained within the inscribed circle of radius 1 (see Figure 3.2). Note that since $(X, Y)$ is uniformly distributed in the square it follows that

$$P\{(X, Y) \text{ is in the circle}\} = P\{X^2 + Y^2 \leq 1\} = \frac{\text{Area of the circle}}{\text{Area of the square}} = \frac{\pi}{4}$$

Hence, if we generate a large number of random points in the square, the proportion of points that fall within the circle will be approximately $\pi/4$. Now if $X$ and $Y$ were independent and both were uniformly distributed over $(-1, 1)$, their joint density would be

$$f(x, y) = f(x)\, f(y) = \frac{1}{2} \cdot \frac{1}{2} = \frac{1}{4}, \qquad -1 \leq x \leq 1,\ -1 \leq y \leq 1$$

[Figure 3.1. Square: the square with corners $(-1, -1)$, $(-1, 1)$, $(1, 1)$, $(1, -1)$, centered at the origin $(0, 0)$.]

[Figure 3.2. Circle within Square: the circle of radius 1 inscribed in the square of Figure 3.1.]

Since the density function of $(X, Y)$ is constant in the square, it thus follows (by definition) that $(X, Y)$ is uniformly distributed in the square. Now if $U$ is uniform on (0, 1) then $2U$ is uniform on (0, 2), and so $2U - 1$ is uniform on $(-1, 1)$. Therefore, if we generate random numbers $U_1$ and $U_2$, set $X = 2U_1 - 1$ and $Y = 2U_2 - 1$, and define

$$I = \begin{cases} 1 & \text{if } X^2 + Y^2 \leq 1 \\ 0 & \text{otherwise} \end{cases}$$

then

$$E[I] = P\{X^2 + Y^2 \leq 1\} = \frac{\pi}{4}$$

Hence we can estimate $\pi/4$ by generating a large number of pairs of random numbers $u_1, u_2$ and taking the fraction of pairs for which $(2u_1 - 1)^2 + (2u_2 - 1)^2 \leq 1$.
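A minimal sketch of this estimator of $\pi$, assuming `random.random()` as the source of the random numbers $u_1, u_2$:

```python
import random

def estimate_pi(k):
    """Estimate pi as 4 times the fraction of k random points in the square of
    area 4 centered at the origin that fall inside the inscribed unit circle."""
    inside = 0
    for _ in range(k):
        x = 2.0 * random.random() - 1.0   # X = 2U1 - 1 is uniform on (-1, 1)
        y = 2.0 * random.random() - 1.0   # Y = 2U2 - 1 is uniform on (-1, 1)
        if x * x + y * y <= 1.0:          # indicator I of X^2 + Y^2 <= 1
            inside += 1
    return 4.0 * inside / k

print(estimate_pi(1_000_000))             # should be close to 3.1416
```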

Thus, random number generators can be used to generate the values of uniform
(0, 1) random variables. Starting with these random numbers we show in Chapters
4 and 5 how we can generate the values of random variables from arbitrary
distributions. With this ability to generate arbitrary random variables we will be
able to simulate a probability system—that is, we will be able to generate, according
to the specified probability laws of the system, all the random quantities of this
system as it evolves over time.

Exercises

1. If $x_0 = 5$ and
$$x_n = 3 x_{n-1} \bmod 150$$
find $x_1, \ldots, x_{10}$.

2. If $x_0 = 3$ and
$$x_n = (5 x_{n-1} + 7) \bmod 200$$
find $x_1, \ldots, x_{10}$.
In Exercises 3–9 use simulation to approximate the following integrals.
Compare your estimate with the exact answer if known.
3. $\int_0^1 \exp\{e^x\}\, dx$

4. $\int_0^1 (1 - x^2)^{3/2}\, dx$

5. $\int_{-2}^{2} e^{x + x^2}\, dx$

6. $\int_0^\infty x(1 + x^2)^{-2}\, dx$

7. $\int_{-\infty}^{\infty} e^{-x^2}\, dx$

8. $\int_0^1 \int_0^1 e^{(x + y)^2}\, dy\, dx$

9. $\int_0^\infty \int_0^x e^{-(x + y)}\, dy\, dx$

   [Hint: Let $I_y(x) = \begin{cases} 1 & \text{if } y < x \\ 0 & \text{if } y \geq x \end{cases}$ and use this function to equate the integral to one in which both terms go from 0 to $\infty$.]
10. Use simulation to approximate $\mathrm{Cov}(U, e^U)$, where $U$ is uniform on (0, 1). Compare your approximation with the exact answer.

11. Let $U$ be uniform on (0, 1). Use simulation to approximate the following:

    (a) $\mathrm{Corr}\left(U, \sqrt{1 - U^2}\right)$.
    (b) $\mathrm{Corr}\left(U^2, \sqrt{1 - U^2}\right)$.

12. For uniform (0, 1) random variables $U_1, U_2, \ldots$ define

$$N = \text{Minimum}\left\{ n : \sum_{i=1}^{n} U_i > 1 \right\}$$

That is, $N$ is equal to the number of random numbers that must be summed to exceed 1.

(a) Estimate $E[N]$ by generating 100 values of $N$.
(b) Estimate $E[N]$ by generating 1000 values of $N$.
(c) Estimate $E[N]$ by generating 10,000 values of $N$.
(d) What do you think is the value of $E[N]$?

13. Let $U_i$, $i \geq 1$, be random numbers. Define $N$ by

$$N = \text{Maximum}\left\{ n : \prod_{i=1}^{n} U_i \geq e^{-3} \right\}$$

where $\prod_{i=1}^{0} U_i \equiv 1$.

(a) Find $E[N]$ by simulation.
(b) Find $P\{N = i\}$, for $i = 0, 1, 2, 3, 4, 5, 6$, by simulation.

14. With $x_1 = 23$, $x_2 = 66$, and

$$x_n = (3 x_{n-1} + 5 x_{n-2}) \bmod 100, \qquad n \geq 3$$

we will call the sequence $u_n = x_n / 100$, $n \geq 1$, the text's random number sequence. Find its first 14 values.

