Sta4853 2
§2.1: Stationarity §2.2: Autocovariance and Autocorrelation Functions §2.4: White Noise R: Random Walk Homework 1b
Outline
1 §2.1: Stationarity
2 §2.2: Autocovariance and Autocorrelation Functions
3 §2.4: White Noise
4 R: Random Walk
5 Homework 1b
Stochastic Process
Definition (stochastic process)
A stochastic process is a sequence of indexed random variables, denoted Z(ω, t),
where ω belongs to a sample space and t belongs to an index set.
From here on out, we will simply write a stochastic process (or time series) as
{Zt } (dropping the ω).
Notation
For the time units t_1, t_2, \dots, t_n, denote the n-dimensional distribution function of Z_{t_1}, Z_{t_2}, \dots, Z_{t_n} as
\[ F_{Z_{t_1},\dots,Z_{t_n}}(x_1,\dots,x_n) = P(Z_{t_1} \le x_1, \dots, Z_{t_n} \le x_n) \]
Example
Let Z_1, \dots, Z_n \overset{\text{iid}}{\sim} N(0,1). Then
\[ F_{Z_{t_1},\dots,Z_{t_n}}(x_1,\dots,x_n) = \prod_{j=1}^{n} \Phi(x_j). \]
Stationarity
Definition (Strongly Stationary (aka Strictly, aka Completely))
A time series {x_t} is strongly stationary if, for any collection of time points t_1, \dots, t_n and any shift h, the joint distribution of (x_{t_1}, \dots, x_{t_n}) is the same as that of (x_{t_1+h}, \dots, x_{t_n+h}).
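As a quick illustration (a worked check, not on the original slide), the iid normal example above satisfies this definition: for any time points and any shift h,
\[ F_{Z_{t_1+h},\dots,Z_{t_n+h}}(x_1,\dots,x_n) = \prod_{j=1}^{n} \Phi(x_j) = F_{Z_{t_1},\dots,Z_{t_n}}(x_1,\dots,x_n), \]
since the joint distribution function of an iid sequence does not depend on the time points at all. Hence every iid sequence is strongly stationary.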
Mean Function
Definition (Mean Function)
The mean function of a time series {Z_t} (if it exists) is given by
\[ \mu_t = E(Z_t) = \int_{-\infty}^{\infty} x\, f_t(x)\, dx \]
where f_t is the marginal density of Z_t.
White Noise
Notation
a_t ∼ WN(0, σ²) — white noise with mean zero and variance σ²
IID WN
If a_s is independent of a_t for all s ≠ t, then a_t ∼ IID(0, σ²)
Gaussian White Noise ⇒ IID
Suppose the a_t are jointly normally distributed.
uncorrelated + joint normality ⇒ independent
Thus it follows that a_t ∼ IID(0, σ²) (a stronger condition than WN).
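Without normality the implication fails. A cautionary example (not from the slides) showing that WN is strictly weaker than IID: let w_t \overset{\text{iid}}{\sim} N(0,1) and define
\[ a_t = \begin{cases} w_t & t \text{ even} \\ (w_{t-1}^2 - 1)/\sqrt{2} & t \text{ odd.} \end{cases} \]
Each a_t has mean 0 and variance 1, and all autocovariances vanish — for even t, \( \operatorname{Cov}(a_t, a_{t+1}) = E[w_t(w_t^2 - 1)]/\sqrt{2} = 0 \) because odd moments of the standard normal are zero — so a_t ∼ WN(0, 1). But for even t, a_{t+1} is a function of w_t = a_t, so the sequence is not independent, hence not IID.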
Note that
\begin{align*}
Z_t &= \delta + Z_{t-1} + a_t \\
    &= 2\delta + Z_{t-2} + a_t + a_{t-1} \qquad (Z_{t-1} = \delta + Z_{t-2} + a_{t-1}) \\
    &\;\;\vdots \\
    &= \delta t + \sum_{j=1}^{t} a_j \qquad (\text{taking } Z_0 = 0)
\end{align*}
Therefore we see
\[ \mu_t = E[Z_t] = E[\delta t] + E\Big[\sum_{j=1}^{t} a_j\Big] = \delta t + \sum_{j=1}^{t} E[a_j] = \delta t \]
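The same telescoping representation also yields the autocovariance of the random walk (a derivation sketched here to complement the mean calculation; it assumes, as above, that the a_j are uncorrelated with common variance σ²): for s ≤ t, since the drift δt is deterministic,
\[ \gamma(s,t) = \operatorname{Cov}(Z_s, Z_t) = \operatorname{Cov}\Big(\sum_{j=1}^{s} a_j, \sum_{k=1}^{t} a_k\Big) = \sum_{j=1}^{s} \operatorname{Var}(a_j) = s\sigma^2, \]
so in general γ(s,t) = min(s,t) σ². In particular Var(Z_t) = γ(t,t) = tσ² grows without bound, so the random walk is not stationary.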
Arthur Berg Stationarity, ACF, White Noise, Estimation 7/ 21
So in particular, we have
\[ \rho(s,t) = \frac{\gamma(s,t)}{\sqrt{\gamma(s,s)\,\gamma(t,t)}} \]
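For example (a worked case using the random-walk representation Z_t = δt + Σ_{j=1}^{t} a_j from earlier): since Z_s and Z_t share their first min(s,t) noise terms, γ(s,t) = min(s,t) σ², and therefore
\[ \rho(s,t) = \frac{\min(s,t)\,\sigma^2}{\sqrt{s\sigma^2 \cdot t\sigma^2}} = \frac{\min(s,t)}{\sqrt{st}} = \sqrt{s/t} \quad (s \le t), \]
which depends on s and t separately rather than only on t − s, again confirming that the random walk is not stationary.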
Weak Stationarity
Definition (Weakly Stationary (aka Second-Order Stationary))
A time series {Z_t} is weakly stationary if (i) the mean function μ_t = E(Z_t) does not depend on t, and (ii) the autocovariance γ(s,t) depends on s and t only through the lag |s − t|.
For a weakly stationary series the autocovariance depends only on the lag h, and the autocorrelation function becomes
\[ \rho_h = \rho(h) = \frac{\gamma(h)}{\gamma(0)} \]
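For example, white noise a_t ∼ WN(0, σ²) is weakly stationary with γ(0) = σ² and γ(h) = 0 for h ≠ 0, so
\[ \rho(h) = \begin{cases} 1 & h = 0 \\ 0 & h \ne 0. \end{cases} \]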
White Noise
w  <- rnorm(200, 0, 1)         # 200 iid N(0,1) white noise terms
x  <- cumsum(w)                # random walk without drift
wd <- w + .2                   # add drift delta = 0.2 to each step
xd <- cumsum(wd)               # random walk with drift
plot.ts(xd, ylim = c(-5, 55))  # plot the drifting walk
lines(x)                       # overlay the driftless walk
lines(.2 * (1:200), lty = "dashed")  # overlay the mean function mu_t = 0.2 t
Homework 1b