Properties of Estimators contd.
Example 3:
Let $T_1$ and $T_2$ be two independent unbiased estimators of $\theta$. Find an unbiased estimator for each of the following:
1) $K(\theta) = \theta^2$
2) $K(\theta) = \theta(1 - \theta)$
Solution
Since $T_1$ and $T_2$ are unbiased estimators of $\theta$:
$$E(T_1) = \theta, \qquad E(T_2) = \theta$$
Since $T_1$ and $T_2$ are independent:
$$E(T_1 T_2) = E(T_1)\,E(T_2)$$
Solution Contd.
1) Consider
$$T_3 = T_1 T_2$$
$$E(T_3) = E(T_1)\,E(T_2) = \theta \cdot \theta = \theta^2$$
$\therefore T_3 = T_1 T_2$ is an unbiased estimator for $\theta^2$.
2) Consider
$$T_4 = T_1(1 - T_2) \quad \text{or} \quad T_2(1 - T_1)$$
$$E(T_4) = E\big(T_1(1 - T_2)\big) = E(T_1)\,E(1 - T_2) = \theta(1 - \theta)$$
$\therefore T_4 = T_1(1 - T_2)$ is an unbiased estimator for $\theta(1 - \theta)$.
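A quick numerical sanity check (not part of the original example): the sketch below assumes, purely for illustration, that $T_1$ and $T_2$ are the sample means of two independent Bernoulli($\theta$) samples, which makes them independent unbiased estimators of $\theta$.

```python
import numpy as np

rng = np.random.default_rng(0)
theta, n, reps = 0.3, 50, 200_000

# T1, T2: sample means of two independent Bernoulli(theta) samples
# (an assumed, purely illustrative choice of independent unbiased estimators of theta)
T1 = rng.binomial(n, theta, size=reps) / n
T2 = rng.binomial(n, theta, size=reps) / n

print(np.mean(T1 * T2), theta**2)                    # T3 = T1*T2 averages to ~theta^2
print(np.mean(T1 * (1 - T2)), theta * (1 - theta))   # T4 = T1*(1-T2) averages to ~theta(1-theta)
```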
Example 4:
Let $X_1, X_2, \ldots, X_n$ be a random sample from an exponential distribution with pdf
$$f(x; \theta) = \theta e^{-\theta x}, \quad x > 0, \; \theta > 0$$
1) Show that $\bar{X}$ is an unbiased estimator for $\dfrac{1}{\theta}$.
2) Find the constant $c$ such that $E\!\left(\dfrac{c}{\bar{X}}\right) = \theta$.
Solution
1) $E(\bar{X}) = \dfrac{1}{n}\sum_{i=1}^{n} E(X_i) = E(X) = \dfrac{1}{\theta}$, since $E(X_i) = \dfrac{1}{\theta}$ for the $\text{Exp}(\theta)$ distribution.
$\therefore \bar{X}$ is an unbiased estimator for $\dfrac{1}{\theta}$.
2) $E\!\left(\dfrac{c}{\bar{X}}\right) = c\,E\!\left(\dfrac{1}{\bar{X}}\right) = c\,E\!\left(\dfrac{n}{\sum X_i}\right) = cn\,E\!\left(\dfrac{1}{Z}\right)$, where $Z = \sum X_i$.
If $X \sim \text{Exp}(\theta)$, then $Z = \sum X_i \sim \text{Gamma}(n, \theta)$ with pdf
$$f_Z(z; n, \theta) = \frac{\theta^n}{\Gamma(n)}\, z^{\,n-1} e^{-\theta z}, \quad z > 0, \; \theta > 0$$
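As a rough check on the claim that $Z = \sum X_i \sim \text{Gamma}(n, \theta)$, the sketch below compares the simulated mean and variance of $Z$ with the Gamma$(n, \theta)$ values $n/\theta$ and $n/\theta^2$ (this only checks the first two moments, not the full distribution; the values of $\theta$ and $n$ are arbitrary).

```python
import numpy as np

rng = np.random.default_rng(1)
theta, n, reps = 2.0, 5, 200_000

# Z = sum of n iid Exp(rate = theta) variables; numpy parameterizes by scale = 1/theta
z = rng.exponential(scale=1 / theta, size=(reps, n)).sum(axis=1)

print(z.mean(), n / theta)       # Gamma(n, theta) mean: n/theta
print(z.var(), n / theta**2)     # Gamma(n, theta) variance: n/theta^2
```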
$$E\!\left(\frac{1}{Z}\right) = \int_0^{\infty} \frac{1}{z}\, f_Z(z; n, \theta)\, dz = \int_0^{\infty} \frac{1}{z}\,\frac{\theta^n}{\Gamma(n)}\, z^{\,n-1} e^{-\theta z}\, dz$$
$$= \frac{\theta^n}{\Gamma(n)} \int_0^{\infty} z^{\,n-1-1} e^{-\theta z}\, dz$$
$$= \frac{\theta^n\, \Gamma(n-1)}{\Gamma(n)\, \theta^{\,n-1}} \int_0^{\infty} \frac{\theta^{\,n-1}}{\Gamma(n-1)}\, z^{\,n-2} e^{-\theta z}\, dz \qquad (\text{valid for } n > 1)$$
Note that the integral is equal to 1, since the integrand is the Gamma$(n-1, \theta)$ pdf.
$$\therefore E\!\left(\frac{1}{Z}\right) = \frac{\theta\,\Gamma(n-1)}{\Gamma(n)} = \frac{\theta}{n-1}, \quad \text{since } \Gamma(n) = (n-1)\,\Gamma(n-1)$$
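The integral above can also be checked symbolically. The sketch below evaluates $E(1/Z)$ for a concrete sample size ($n = 5$, an arbitrary illustrative choice) and compares it with $\theta/(n-1)$.

```python
import sympy as sp

theta = sp.Symbol('theta', positive=True)
z = sp.Symbol('z', positive=True)
n = 5  # concrete sample size, chosen only to keep the symbolic integral simple

# Gamma(n, theta) pdf of Z = sum of the X_i
f_Z = theta**n / sp.gamma(n) * z**(n - 1) * sp.exp(-theta * z)

# E(1/Z) = integral of (1/z) * f_Z(z) over (0, infinity)
expectation = sp.integrate(f_Z / z, (z, 0, sp.oo))
print(sp.simplify(expectation))   # theta/4, i.e. theta/(n - 1) for n = 5
```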
$$c\,E\!\left(\frac{1}{\bar{X}}\right) = nc\,E\!\left(\frac{1}{Z}\right) = nc\,\frac{\theta}{n-1}$$
To make $\dfrac{c}{\bar{X}}$ an unbiased estimator for $\theta$, we have to choose $c = \dfrac{n-1}{n}$:
$$\frac{c}{\bar{X}} = \frac{n-1}{n\bar{X}} = \frac{(n-1)\,n}{n \sum X_i} = \frac{n-1}{\sum X_i}$$
$\therefore \dfrac{n-1}{\sum X_i}$ is an unbiased estimator for $\theta$.
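A simulation-based check of both parts of Example 4 (an illustrative sketch, with arbitrary values of $\theta$ and $n$):

```python
import numpy as np

rng = np.random.default_rng(2)
theta, n, reps = 2.0, 10, 200_000

# Each row is one sample of size n from Exp(rate = theta)
x = rng.exponential(scale=1 / theta, size=(reps, n))
s = x.sum(axis=1)

print(x.mean(axis=1).mean(), 1 / theta)   # part 1: X-bar averages to ~1/theta
print(np.mean((n - 1) / s), theta)        # part 2: (n-1)/sum(X_i) averages to ~theta
print(np.mean(n / s))                     # for contrast, n/sum(X_i) = 1/X-bar overshoots (~ n*theta/(n-1))
```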
Minimum Variance Unbiased Estimators
Assume $X_1, X_2, \ldots, X_n$ is a random sample from a pdf or pmf $f(x; \theta)$ and that $T$ is an unbiased estimator of $K(\theta)$. Then, under certain regularity conditions,
$$V(T) \ge \frac{\left[K'(\theta)\right]^2}{E\!\left[\left(\dfrac{\partial}{\partial \theta} \ln L(x; \theta)\right)^{2}\right]} \qquad (1)$$
or
$$V(T) \ge \frac{\left[K'(\theta)\right]^2}{-E\!\left[\dfrac{\partial^2}{\partial \theta^2} \ln L(x; \theta)\right]} \qquad (2)$$
where $L(x; \theta) = \prod_{i=1}^{n} f(x_i; \theta)$.
Note that (1) and (2) can also be written as
$$V(T) \ge \frac{\left[K'(\theta)\right]^2}{n\,E\!\left[\left(\dfrac{\partial}{\partial \theta} \ln f(x_i; \theta)\right)^{2}\right]} \qquad (3)$$
or
$$V(T) \ge \frac{\left[K'(\theta)\right]^2}{-n\,E\!\left[\dfrac{\partial^2}{\partial \theta^2} \ln f(x_i; \theta)\right]} \qquad (4)$$
Note that the expression on the right-hand side of any of the above inequalities is called the Cramér-Rao Lower Bound (CRLB).
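As a small illustration of the per-observation form (3)/(4), the hypothetical helper below computes the bound from $K'(\theta)$ and the Fisher information of a single observation, $I(\theta) = E\big[(\partial \ln f(X_i;\theta)/\partial\theta)^2\big]$. The example calls use $N(\theta, 1)$, whose per-observation Fisher information equals 1; that value anticipates the normal example worked out later in this lecture.

```python
def crlb(k_prime, fisher_info_1obs, theta, n):
    """Cramér-Rao Lower Bound in the per-observation form (3)/(4).

    k_prime          : function returning K'(theta)
    fisher_info_1obs : function returning the Fisher information of one observation
    """
    return k_prime(theta) ** 2 / (n * fisher_info_1obs(theta))

# N(theta, 1): per-observation Fisher information is 1 (derived later in this lecture)
print(crlb(lambda t: 1.0,   lambda t: 1.0, theta=0.7, n=25))   # K(theta) = theta   -> 1/n  = 0.04
print(crlb(lambda t: 2 * t, lambda t: 1.0, theta=0.7, n=25))   # K(theta) = theta^2 -> 4*theta^2/n
```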
Note that if we have an unbiased estimator whose variance is equal to the Cramér-Rao Lower Bound, we call it a minimum variance unbiased estimator (MVUE), or best unbiased estimator.
However, this does not work in the other direction: an MVUE need not attain the Cramér-Rao Lower Bound.
Minimum Variance Unbiased Estimators (MVUE)
Definition
Let $X_1, X_2, \ldots, X_n$ be a random sample from $f(x; \theta)$.
An estimator $T^*$ of $K(\theta)$ is defined to be the MVUE of $K(\theta)$ if and only if
• $E(T^*) = K(\theta)$, i.e. $T^*$ is unbiased, and
• $\text{Var}(T^*) \le \text{Var}(T)$ for any other estimator $T$ of $K(\theta)$ that satisfies $E(T) = K(\theta)$.
Example:
Let $X_1, X_2, \ldots, X_n$ be a random sample from a $N(\theta, 1)$ distribution.
Find the Cramér-Rao Lower Bound when the function to be estimated is
1) $K(\theta) = \theta$
2) $K(\theta) = \theta^2$
Solution:
1) $K(\theta) = \theta, \quad K'(\theta) = 1, \quad \left[K'(\theta)\right]^2 = 1$
$$\text{Cramér-Rao Lower Bound} = \frac{\left[K'(\theta)\right]^2}{E\!\left[\left(\dfrac{\partial}{\partial \theta} \ln L(x; \theta)\right)^{2}\right]}$$
$$f(x_i; \theta) = \frac{1}{\sqrt{2\pi}}\, e^{-\frac{1}{2}(x_i - \theta)^2}, \quad -\infty < x_i < \infty, \; -\infty < \theta < \infty$$
$$L(x; \theta) = \prod_{i=1}^{n} f(x_i; \theta) = \frac{1}{(2\pi)^{n/2}}\, e^{-\frac{1}{2}\sum (x_i - \theta)^2}$$
$$\ln L(x; \theta) = -\frac{n}{2}\ln(2\pi) - \frac{1}{2}\sum (x_i - \theta)^2$$
Solution contd.
$$\frac{\partial}{\partial \theta} \ln L(x; \theta) = -\frac{1}{2}\,\frac{\partial}{\partial \theta}\sum (x_i - \theta)^2 = -\frac{1}{2}\sum 2(x_i - \theta)(-1) = \sum (x_i - \theta)$$
$$= \sum x_i - n\theta = n\bar{x} - n\theta = n(\bar{x} - \theta)$$
$$\left(\frac{\partial}{\partial \theta} \ln L(x; \theta)\right)^{2} = n^2 (\bar{x} - \theta)^2$$
$$E\!\left[\left(\frac{\partial}{\partial \theta} \ln L(x; \theta)\right)^{2}\right] = E\!\left[n^2 (\bar{X} - \theta)^2\right] = n^2\,\text{Var}(\bar{X}) = n^2\,\frac{\sigma^2}{n} = n^2\,\frac{1}{n} = n \quad (\text{since } \sigma^2 = 1)$$
$$\text{Cramér-Rao Lower Bound} = \frac{\left[K'(\theta)\right]^2}{n} = \frac{1}{n}$$
Since $\text{Var}(\bar{X}) = \dfrac{1}{n} =$ Cramér-Rao Lower Bound, we say that $\bar{X}$ is the MVUE of $\theta$.
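A quick simulation check (with illustrative values of $\theta$ and $n$) that $\text{Var}(\bar{X})$ for $N(\theta, 1)$ data matches the bound $1/n$:

```python
import numpy as np

rng = np.random.default_rng(3)
theta, n, reps = 1.5, 20, 200_000

# reps samples of size n from N(theta, 1); one sample mean per row
xbar = rng.normal(loc=theta, scale=1.0, size=(reps, n)).mean(axis=1)

print(xbar.var(), 1 / n)   # Var(X-bar) ~ 0.05 = 1/n, i.e. X-bar attains the CRLB
```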
Solution contd.
2) $K(\theta) = \theta^2, \quad K'(\theta) = 2\theta, \quad \left[K'(\theta)\right]^2 = 4\theta^2$
$$\text{Cramér-Rao Lower Bound} = \frac{\left[K'(\theta)\right]^2}{n} = \frac{4\theta^2}{n}$$
In the previous example, if we use the second inequality,
$$V(T) \ge \frac{\left[K'(\theta)\right]^2}{-E\!\left[\dfrac{\partial^2}{\partial \theta^2} \ln L(x; \theta)\right]}$$
we note that
$$\frac{\partial}{\partial \theta} \ln L(x; \theta) = n(\bar{x} - \theta)$$
$$\therefore \frac{\partial^2}{\partial \theta^2} \ln L(x; \theta) = -n$$
$$\therefore -E\!\left[\frac{\partial^2}{\partial \theta^2} \ln L(x; \theta)\right] = n$$
Hence we obtain the same result as with the first inequality.
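The second-derivative computation can also be confirmed symbolically; a minimal sketch, treating $\bar{x}$ as a symbol as on the slide:

```python
import sympy as sp

theta, xbar = sp.symbols('theta xbar')
n = sp.Symbol('n', positive=True)

# Score function derived above: d/d(theta) ln L = n*(xbar - theta)
score = n * (xbar - theta)

# Differentiate once more with respect to theta
print(sp.diff(score, theta))   # -n, so -E[d^2/d(theta)^2 ln L] = n, matching the first inequality
```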