Homework 4: Stats 217

Due on July 20, 2022 at 11:59 pm. Each problem carries 10 points. Please feel free to use any
programming language of your choice and submit the code.

1. Consider a branching process {Xn}n≥0 with progeny distribution Poisson(λ). For λ ∈
{0.1, 0.5, 0.9, 1, 1.5, 2, 3, 5, 10, 20, 50}, compute (numerically if needed) the extinction
probability and plot it (on the y-axis) against λ (on the x-axis). What can you say
about the nature of the plot? Does it make sense intuitively?

2. (a) Suppose a ball moves on the integers {0, 1, 2, ..., N } between two solid walls placed
at 0 and N respectively. At every time, the ball moves one step right with probability
p and one step left with probability 1 − p independently. If it hits either of the two
walls, it returns to its previous position, i.e., starting at 0 or N, it goes to 1 or N − 1
respectively, almost surely. Argue that the position of the ball describes a discrete-time
Markov chain, identify its state space, and write the transition probability matrix. [This
is also called the random walk with reflecting barriers.]
(b) Simulate the above Markov chain for N = 10 and p = 0.5. Run it for a long-ish time,
say 50000 steps. Compute the observed proportion of time the Markov chain spends
in each state. Plot these proportions, perhaps using a bar plot or histogram. Draw any
interesting conclusions you can from the plot. Compare with the uniform distribution
on {0, 1, ..., 10}.
(c) Now vary p ∈ {0.1, 0.2, ..., 0.9}. Perform the above simulation for each value of p (except
p = 0.5, which is already taken care of) and plot the proportions suitably. Draw any
interesting conclusions and compare with what you got in (b). Also compare with the
uniform distribution on {0, 1, ..., 10}.

Stat 217: Problem Set 4
July 20, 2022

1 Question 1
Consider a branching process {Xn }n≥0 with progeny distribution Poisson(λ). For
λ ∈ {0.1, 0.5, 0.9, 1, 1.5, 2, 3, 5, 10, 20, 50}, compute (numerically if needed) the extinction probability and plot
it (on the y-axis) against λ (on the x-axis). What can you say about the nature of the plot?
Does it make sense intuitively?

Solution: First recall that a branching process whose expected number of progeny is less than one
goes extinct with probability one; the same holds if the expected number of progeny is exactly one
and individuals produce zero offspring with positive probability. Since the Poisson mean is λ and
P(no offspring) = e^{−λ} > 0, the extinction probability is p = 1 for λ = 0.1, 0.5, 0.9, 1.
For λ > 1 we use the fact that the extinction probability is the smallest solution in [0, 1] of ϕ(s) = s, where ϕ is the progeny generating function:

ϕ(s) = E[s^ξ] = Σ_{k=0}^∞ p_k s^k = Σ_{k=0}^∞ e^{−λ} (λ^k / k!) s^k = e^{−λ} Σ_{k=0}^∞ (λs)^k / k! = e^{−λ} e^{λs} = e^{λ(s−1)}.

Setting ϕ(s) = s and multiplying both sides by −λe^{−λs} puts the fixed-point equation in a convenient form:

s = e^{−λ + λs}
−λs e^{−λs} = −λ e^{−λ},

which is an equation of the form w e^w = z with w = −λs and z = −λe^{−λ}. Inverting with the Lambert W function gives w = W(−λe^{−λ}), i.e. s = −W(−λe^{−λ})/λ; the principal branch yields the smallest root and hence the extinction probability.
The extinction probability is plotted in the figure below using the displayed Python code: there is a sharp
dropoff (likely exponential decay, since the curve drops faster than linearly on the semi-log scale) starting at λ = 1,
which results in a very small extinction probability by λ = 5. This matches the intuition that a branching
process with this progeny distribution can still go extinct when λ > 1: extinction remains reasonably
probable if λ is close to 1, but becomes very unlikely once λ is large.

import numpy as np
import matplotlib.pyplot as plt
from scipy.special import lambertw

def p_extinct(lamda):
    # Extinction probability for a branching process with Poisson(lamda) progeny.
    if lamda <= 1:
        return 1.0
    # Smallest root of s = exp(lamda*(s - 1)), via the principal branch of Lambert W.
    w = lambertw(-lamda * np.exp(-lamda)).real
    return -w / lamda

lam = [0.1, 0.5, 0.9, 1, 1.5, 2, 3, 5, 10, 20, 50]

plt.semilogx(lam, [p_extinct(l) for l in lam], 'b.')
plt.xlabel(r'$\lambda$'); plt.ylabel('Extinction probability')
plt.title('Branching process with Poisson progeny distribution')
plt.savefig('hw4q1.png')
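
As a quick sanity check (an addition, not part of the original solution), each computed value can be substituted back into the fixed-point equation s = e^{λ(s−1)}; the residual should be numerically zero for every λ in the list.

# Verify that p_extinct(l) solves s = exp(l*(s - 1)) for each lambda.
for l in lam:
    s = p_extinct(l)
    assert abs(np.exp(l * (s - 1)) - s) < 1e-10
    print('lambda = {:5}: extinction probability = {:.6f}'.format(l, s))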

2 Question 2
(a) Suppose a ball moves on the integers {0, 1, 2, . . . , N } between two solid walls placed at 0 and N
respectively. At every time, the ball moves one step right with probability p and one step left with
probability 1 − p independently. If it hits either of the two walls, it returns to its previous position,
i.e., it gets reflected. Argue that the position of the ball describes a discrete-time Markov chain, identify
its state space, and write the transition probability matrix. [This is also called the random walk with
reflecting barriers.]

Solution: This chain has state space {0, 1, 2, . . . , N}, where X_t denotes the position of the ball
at time step t. Since each step is taken independently of all others, the only thing determining
where the ball ends up at time step t + 1 is where it sits at time t. More formally,
P(X_{t+1} = j | X_0 = k_0, . . . , X_t = k_t) = P(X_{t+1} = j | X_t = k_t), which shows that the chain
satisfies the Markov property, and is therefore a Markov chain.
For the transition matrix M = (m_{ij}), i, j = 0, 1, . . . , N, the entries are as follows (a short construction sketch is given after the list):
- m_{i,i+1} = p, for 1 ≤ i ≤ N − 1
- m_{i,i−1} = 1 − p, for 1 ≤ i ≤ N − 1
- m_{0,1} = m_{N,N−1} = 1
- m_{i,j} = 0, otherwise
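
The following sketch (an addition, not part of the original solution; the helper name transition_matrix is ours) builds M in Python for given N and p and checks that every row sums to one.

import numpy as np

def transition_matrix(N=10, p=0.5):
    # Reflecting random walk on {0, 1, ..., N}: (N+1) x (N+1) transition matrix.
    M = np.zeros((N + 1, N + 1))
    M[0, 1] = 1.0              # reflection at the left wall
    M[N, N - 1] = 1.0          # reflection at the right wall
    for i in range(1, N):      # interior states
        M[i, i + 1] = p        # step right
        M[i, i - 1] = 1 - p    # step left
    return M

M = transition_matrix()
assert np.allclose(M.sum(axis=1), 1.0)   # every row is a probability distribution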

(b) Simulate the above Markov chain for N = 10 and p = 0.5. Run it for a long-ish time, say 50000
steps. Compute the observed proportion of time the Markov chain spends in each state. Plot these
proportions, perhaps using a bar plot or histogram. Draw any interesting conclusions you can from
the plot. Compare with the uniform distribution on {0, 1, ..., 10}.

Solution: See plot below. Code follows in part (c). We conclude that the Markov chain spends a
nearly uniform amount of time in each state, except at the two boundaries, where the observed
proportions are about half as large as the rest. Because of this, the frequencies attained away from
the boundaries are close to 1/N, i.e. they resemble a uniform distribution over N values even though
the state space has N + 1 states.
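
The "about half as large at the boundaries" observation can be checked against the stationary distribution: for p = 0.5, detailed balance gives π_i = 1/N for 1 ≤ i ≤ N − 1 and π_0 = π_N = 1/(2N), i.e. 0.1 in the interior and 0.05 at the walls for N = 10. A minimal numerical check (an addition, reusing the hypothetical transition_matrix helper sketched in part (a)):

import numpy as np

M = transition_matrix(N=10, p=0.5)
# Stationary distribution = left eigenvector of M for eigenvalue 1.
vals, vecs = np.linalg.eig(M.T)
pi = np.real(vecs[:, np.argmax(np.real(vals))])
pi = pi / pi.sum()
print(np.round(pi, 3))   # approximately [0.05, 0.1, ..., 0.1, 0.05]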

(c) Now vary p ∈ {0.1, 0.2, ..., 0.9}. Perform the above simulation for each value of p (except p = 0.5, which is
already taken care of) and plot the proportions suitably. Draw any interesting conclusions and compare
with what you got in (b). Also compare with the uniform distribution on {0, 1, ..., 10}.

Solution: See plot and code below. Distributions for p ≠ 0.5 are skewed in the expected direction
(right if p > 0.5, left otherwise), with symmetry across the center point (5) between p and 1 − p (i.e.,
the distribution for p = 0.1 mirrors that for p = 0.9, etc.). A similar step down in frequency between
N − 1 and N (likewise 1 and 0) is also observed.

import numpy as np
import matplotlib.pyplot as plt

def chain(N=10, p=0.5, steps=50000):
    path = [5]                       # start in the middle of the state space
    for i in range(steps):
        if path[i] == 0:             # deterministic reflections at the walls first
            path.append(1)
        elif path[i] == N:
            path.append(N - 1)
        else:                        # step right with probability p, left otherwise
            path.append(path[i] - 1 + 2*(np.random.random() < p))
    return path

paths = {p: chain(p=p) for p in [0.1, 0.2, 0.3, 0.4, 0.5, 0.6, 0.7, 0.8, 0.9]}

# 2b
plt.hist(paths[0.5], bins=np.arange(12), histtype='step', density=True,
         align='left', label='Walk (p=0.5)')
plt.plot(np.arange(-1, 12), np.ones(13)/11., 'k--', label='Uniform distribution')
plt.xlim(-1, 11); plt.xticks(np.arange(11)); plt.xlabel('State'); plt.ylabel('Frequency')
plt.legend(loc='lower center')   # labels are attached to the artists above
plt.savefig('hw4q2b.png'); plt.clf()

# 2c
for p, path in paths.items():
    plt.hist(path, bins=np.arange(12), histtype='step', density=True,
             align='left', label='p={}'.format(p))
plt.plot(np.arange(-1, 12), np.ones(13)/11., 'k--', label='Uniform')
plt.xlim(-1, 11); plt.xticks(np.arange(11)); plt.xlabel('State'); plt.ylabel('Frequency')
plt.legend(loc='upper center')
plt.savefig('hw4q2c.png')
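
As a further check (an addition beyond what the problem asks), the simulated proportions for a given p can be compared with the exact stationary distribution of the reflecting walk, which by detailed balance satisfies π_{i+1}/π_i = p/(1 − p) in the interior, with π_0 = (1 − p)π_1 and π_N = p·π_{N−1}. The value p0 = 0.3 below is just an illustrative choice.

def stationary(N=10, p=0.3):
    # Normalised detailed-balance weights for the reflecting random walk.
    r = p / (1 - p)
    w = np.empty(N + 1)
    w[1:N] = r ** np.arange(N - 1)   # interior: w_i proportional to r**(i - 1)
    w[0] = (1 - p) * w[1]            # reflection at the left wall
    w[N] = p * w[N - 1]              # reflection at the right wall
    return w / w.sum()

p0 = 0.3
plt.clf()
counts = np.bincount(paths[p0], minlength=11) / len(paths[p0])
plt.bar(np.arange(11), counts, label='simulated (p={})'.format(p0))
plt.plot(np.arange(11), stationary(p=p0), 'k.', label='stationary distribution')
plt.xlabel('State'); plt.ylabel('Frequency'); plt.legend()
plt.savefig('hw4q2c_check.png')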
