ESTIMATORS Exercises

Answers

[8.12]

a.

For the uniform distribution on the interval $(\theta, \theta + 1)$, $E(Y_i) = \theta + \frac{1}{2}$, and so $E(\bar{Y}) = \theta + \frac{1}{2}$ and the bias is
$$E(\bar{Y}) - \theta = \frac{1}{2}.$$

b.

An unbiased estimator is $\bar{Y} - \frac{1}{2}$.

c.

The variance of $\bar{Y}$ is $\frac{1}{12n}$, and so
$$\mathrm{MSE}(\bar{Y}) = V(\bar{Y}) + B^2 = \frac{1}{12n} + \frac{1}{4}.$$
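As a quick numerical check, the following Python sketch simulates repeated samples from a uniform distribution on $(\theta, \theta + 1)$ and compares the empirical bias and MSE of $\bar{Y}$ with the values derived above; the choices of theta, n and the number of replications are purely illustrative.

import numpy as np

rng = np.random.default_rng(0)
theta = 2.0          # illustrative true parameter
n = 50               # illustrative sample size
reps = 200_000       # number of simulated samples

# Each row is one sample of size n from Uniform(theta, theta + 1).
samples = rng.uniform(theta, theta + 1.0, size=(reps, n))
ybar = samples.mean(axis=1)

print("empirical bias:", ybar.mean() - theta)           # should be close to 1/2
print("empirical MSE: ", ((ybar - theta) ** 2).mean())  # should be close to 1/(12n) + 1/4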

[8.13]

a.

 
$$\begin{aligned}
E\left[n\,\frac{Y}{n}\left(1 - \frac{Y}{n}\right)\right] &= E(Y) - \frac{1}{n}E(Y^2)\\
&= np - \frac{V(Y) + E^2(Y)}{n}\\
&= np - p(1-p) - np^2\\
&= (n-1)p(1-p),
\end{aligned}$$
where $Y$ has a binomial distribution with parameters $n$ and $p$, so that $E(Y) = np$ and $V(Y) = np(1-p)$.

b.

An unbiased estimator is
$$\frac{n}{n-1}\cdot n\,\frac{Y}{n}\left(1 - \frac{Y}{n}\right).$$
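A short simulation sketch confirms the correction factor: since $E\left[n\frac{Y}{n}\left(1-\frac{Y}{n}\right)\right] = (n-1)p(1-p)$, rescaling by $n/(n-1)$ gives an estimator whose mean is $np(1-p) = V(Y)$. The parameter values below are illustrative assumptions.

import numpy as np

rng = np.random.default_rng(1)
n, p = 20, 0.3       # illustrative binomial parameters
reps = 500_000

y = rng.binomial(n, p, size=reps)
estimator = (n / (n - 1)) * n * (y / n) * (1 - y / n)

print("mean of estimator:", estimator.mean())   # should be close to n*p*(1-p)
print("target n*p*(1-p): ", n * p * (1 - p))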


[9.3]

a.

$$E(\hat{\theta}_1) = E\left(\bar{Y} - \tfrac{1}{2}\right) = \left(\theta + \tfrac{1}{2}\right) - \tfrac{1}{2} = \theta,
\qquad
V(\hat{\theta}_1) = V(\bar{Y}) = \frac{1}{12n}.$$

The probability density function of $Y_{(n)}$ is given by
$$f_n(y) = n F^{n-1}(y) f(y) = n(y - \theta)^{n-1}, \qquad \theta \le y \le \theta + 1,$$
and by the method of transformations the variable $Z_{(n)} = Y_{(n)} - \theta$ has a beta distribution, $B(n, 1)$. Hence
$$E(\hat{\theta}_2) = E\left(Z_{(n)} + \theta - \frac{n}{n+1}\right) = \frac{n}{n+1} + \theta - \frac{n}{n+1} = \theta$$
and
$$V(\hat{\theta}_2) = V\left(Z_{(n)} + \theta - \frac{n}{n+1}\right) = V(Z_{(n)}) = \frac{n}{(n+2)(n+1)^2},$$

and so $\hat{\theta}_1$ and $\hat{\theta}_2$ are both unbiased, and the efficiency of $\hat{\theta}_1$ relative to $\hat{\theta}_2$ is
$$\mathrm{eff}(\hat{\theta}_1, \hat{\theta}_2) = \frac{V(\hat{\theta}_2)}{V(\hat{\theta}_1)} = \frac{n/\big[(n+2)(n+1)^2\big]}{1/(12n)} = \frac{12n^2}{(n+2)(n+1)^2}.$$
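The sketch below compares the two estimators empirically, using $\hat{\theta}_1 = \bar{Y} - \frac{1}{2}$ and $\hat{\theta}_2 = Y_{(n)} - \frac{n}{n+1}$ as defined above; the values of theta and n are arbitrary illustrative choices.

import numpy as np

rng = np.random.default_rng(2)
theta, n, reps = 5.0, 10, 200_000   # illustrative values

samples = rng.uniform(theta, theta + 1.0, size=(reps, n))
theta1 = samples.mean(axis=1) - 0.5          # theta_hat_1 = Ybar - 1/2
theta2 = samples.max(axis=1) - n / (n + 1)   # theta_hat_2 = Y(n) - n/(n+1)

print("V(theta1) empirical:", theta1.var(), " theory:", 1 / (12 * n))
print("V(theta2) empirical:", theta2.var(), " theory:", n / ((n + 2) * (n + 1) ** 2))
print("efficiency empirical:", theta2.var() / theta1.var(),
      " theory:", 12 * n**2 / ((n + 2) * (n + 1) ** 2))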


[9.74]

$$E(Y) = \frac{2}{\theta^2}\int_0^{\theta} (\theta y - y^2)\,dy = \frac{2}{\theta^2}\left[\frac{\theta y^2}{2} - \frac{y^3}{3}\right]_0^{\theta} = \frac{\theta}{3},$$
and setting the sample mean equal to the population mean gives $\bar{Y} = \hat{\theta}/3$, or $\hat{\theta} = 3\bar{Y}$.
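As a check, the sketch below assumes the density implied by the integrand above, $f(y) = 2(\theta - y)/\theta^2$ for $0 \le y \le \theta$ (so that $Y/\theta$ follows a $\mathrm{Beta}(1, 2)$ distribution), and verifies that the method-of-moments estimator $3\bar{Y}$ is centred near $\theta$; the parameter values are illustrative.

import numpy as np

rng = np.random.default_rng(3)
theta, n, reps = 4.0, 100, 100_000   # illustrative values

# Y/theta ~ Beta(1, 2) has density 2(1 - x) on [0, 1],
# so Y = theta * Beta(1, 2) has density 2(theta - y)/theta**2 on [0, theta].
samples = theta * rng.beta(1, 2, size=(reps, n))
theta_hat = 3 * samples.mean(axis=1)   # method-of-moments estimator 3*Ybar

print("mean of theta_hat:", theta_hat.mean())  # should be close to theta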

[9.85]

a.

The likelihood function is
$$L = \prod_{i=1}^{n} \frac{1}{\Gamma(\alpha)\,\theta^{\alpha}}\, y_i^{\alpha-1} e^{-y_i/\theta}
= \frac{1}{\Gamma^{n}(\alpha)\,\theta^{n\alpha}}\, e^{-\sum y_i/\theta} \prod_{i=1}^{n} y_i^{\alpha-1}$$
and
$$\ln L = -n\ln\Gamma(\alpha) - n\alpha\ln\theta - \frac{\sum y_i}{\theta} + \sum_{i=1}^{n} \ln y_i^{\alpha-1}.$$

Differentiating with respect to $\theta$ gives
$$\frac{\partial \ln L}{\partial \theta} = \frac{\sum y_i}{\theta^2} - \frac{n\alpha}{\theta},$$
and equating the derivative to zero gives
$$\hat{\theta} = \frac{\sum y_i}{n\alpha} = \frac{\bar{Y}}{\alpha}.$$
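For a numerical illustration of the maximisation step, the profile of $\ln L$ in $\theta$ (with $\alpha$ held fixed and the terms not involving $\theta$ dropped) can be maximised on a grid and compared with the closed form $\bar{Y}/\alpha$; the parameter values below are illustrative assumptions.

import numpy as np

rng = np.random.default_rng(5)
alpha, theta, n = 3.0, 2.0, 200        # illustrative values
y = rng.gamma(alpha, theta, size=n)    # one simulated gamma sample

# Profile of ln L in theta; constant terms in theta are omitted.
grid = np.linspace(0.5, 5.0, 10_000)
loglik = -n * alpha * np.log(grid) - y.sum() / grid

print("grid maximiser:        ", grid[np.argmax(loglik)])
print("closed form Ybar/alpha:", y.mean() / alpha)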


b.

Given that the gamma variable has mean $\alpha\theta$ and variance $\alpha\theta^2$,
$$E(\hat{\theta}) = \theta \qquad\text{and}\qquad V(\hat{\theta}) = \frac{1}{\alpha^2}\cdot\frac{\alpha\theta^2}{n} = \frac{\theta^2}{n\alpha}.$$

c.

The estimator $\hat{\theta}$ is unbiased for $\theta$ and $\lim_{n\to\infty} V(\hat{\theta}) = \lim_{n\to\infty} \frac{\theta^2}{n\alpha} = 0$, and so $\hat{\theta}$ is a consistent estimator of $\theta$.
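A final simulation sketch (with illustrative values of alpha, theta and n) checks that $\hat{\theta} = \bar{Y}/\alpha$ is centred at $\theta$ and that its variance shrinks like $\theta^2/(n\alpha)$ as $n$ grows, in line with parts (b) and (c).

import numpy as np

rng = np.random.default_rng(4)
alpha, theta = 3.0, 2.0          # illustrative shape and scale
reps = 20_000

for n in (10, 100, 1000):
    samples = rng.gamma(alpha, theta, size=(reps, n))
    theta_hat = samples.mean(axis=1) / alpha     # MLE theta_hat = Ybar / alpha
    print(f"n={n:5d}  mean={theta_hat.mean():.4f}  "
          f"var={theta_hat.var():.6f}  theory var={theta**2 / (n * alpha):.6f}")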
