
Test Code MS (Short answer type) 2006

Syllabus for Mathematics

Permutations and Combinations. Binomial and multinomial theorem. Theory of equations. Inequalities.
Determinants, matrices, solution of linear equations and vector spaces.
Trigonometry, Coordinate geometry of two and three dimensions.
Geometry of complex numbers and De Moivre’s theorem. Elements of set theory.
Convergence of sequences and series. Power series. Functions, limits and continuity of functions of one or more variables.
Differentiation, Leibnitz formula, maxima and minima, Taylor’s theorem. Differentiation of functions of several variables. Applications of differential calculus.
Indefinite integral, Fundamental theorem of calculus, Riemann integration and properties. Improper integrals. Differentiation under the integral sign. Double and multiple integrals and applications.

Syllabus for Statistics


Probability and Sampling Distributions
Notions of sample space and probability, combinatorial probability, conditional probability and independence, random variable and expectations, moments, standard discrete and continuous distributions, sampling distributions of statistics based on normal samples, central limit theorem, approximation of binomial to normal or Poisson law. Bivariate normal and multivariate normal distributions.

Descriptive Statistics
Descriptive statistical measures, graduation of frequency curves, product moment, partial and multiple correlation, Regression (bivariate and multivariate).

Inference
Elementary theory and methods of estimation (unbiasedness, minimum variance, sufficiency, maximum likelihood method, method of moments). Tests of hypotheses (basic concepts and simple applications of Neyman-Pearson Lemma). Confidence intervals. Tests of regression. Elements of non-parametric inference.

Design of Experiments and Sample Surveys
Basic designs (CRD/RBD/LSD) and their analyses. Elements of factorial designs. Conventional sampling techniques (SRSWR/SRSWOR) including stratification; ratio and regression methods of estimation.

Sample Questions
1. Let A and B be two invertible n × n real matrices. Assume that A + B is invertible. Show that A⁻¹ + B⁻¹ is also invertible.
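
A quick numerical sanity check (not a proof) can be run in Python with NumPy; the matrix size, number of trials and random seed below are arbitrary illustrative choices.

    import numpy as np

    # Sanity check (not a proof): for random invertible A and B with A + B
    # invertible, A^{-1} + B^{-1} should also come out invertible.
    rng = np.random.default_rng(0)
    n = 4
    for _ in range(1000):
        A = rng.normal(size=(n, n))
        B = rng.normal(size=(n, n))
        dets = [np.linalg.det(M) for M in (A, B, A + B)]
        if min(abs(d) for d in dets) < 1e-8:
            continue  # skip (measure-zero) near-singular draws
        C = np.linalg.inv(A) + np.linalg.inv(B)
        assert abs(np.linalg.det(C)) > 1e-8
    print("no counterexample found in 1000 random trials")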

2. Maximize x + y subject to the condition that 2x² + 3y² ≤ 1.
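
Since x + y is linear, its maximum over this elliptical region is attained on the boundary 2x² + 3y² = 1, so a quick numerical check is to parametrize the boundary and scan it; the grid size is an arbitrary choice.

    import numpy as np

    # Scan the boundary ellipse 2x^2 + 3y^2 = 1 via the parametrization
    # x = cos(t)/sqrt(2), y = sin(t)/sqrt(3).
    t = np.linspace(0.0, 2.0 * np.pi, 1_000_000)
    x = np.cos(t) / np.sqrt(2.0)
    y = np.sin(t) / np.sqrt(3.0)
    i = np.argmax(x + y)
    print("numeric max of x + y ≈", x[i] + y[i], "at (x, y) ≈", (x[i], y[i]))
    # For comparison, the Cauchy-Schwarz bound sqrt(1/2 + 1/3) = sqrt(5/6):
    print("sqrt(5/6) =", np.sqrt(5.0 / 6.0))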

3. Let X₁, X₂, . . . be i.i.d. Bernoulli random variables with parameter 1/4, let Y₁, Y₂, . . . be another sequence of i.i.d. Bernoulli random variables with parameter 3/4, and let N be a geometric random variable with parameter 1/2 (i.e. P(N = k) = 1/2ᵏ for k = 1, 2, . . .). Assume the Xᵢ’s, Yⱼ’s and N are all independent.
Compute Cov(∑ᵢ₌₁ᴺ Xᵢ, ∑ᵢ₌₁ᴺ Yᵢ).
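
A short Monte Carlo sketch (NumPy; the seed and replication count are arbitrary) that estimates this covariance numerically, using the fact that, given N = n, the two sums are independent Binomial(n, 1/4) and Binomial(n, 3/4) variables:

    import numpy as np

    # Monte Carlo estimate of Cov(sum X_i, sum Y_i) for the setup of question 3.
    rng = np.random.default_rng(1)
    reps = 1_000_000
    n = rng.geometric(0.5, size=reps)   # P(N = k) = (1/2)^k on {1, 2, ...}
    sx = rng.binomial(n, 0.25)          # sum of N Bernoulli(1/4) variables
    sy = rng.binomial(n, 0.75)          # sum of N Bernoulli(3/4) variables
    print("simulated covariance ≈", np.cov(sx, sy)[0, 1])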

4. Let U be uniformly distributed on the interval (0, 2) and let V be an independent random variable which has a discrete uniform distribution on {0, 1, . . . , n}, i.e.
P{V = i} = 1/(n + 1) for i = 0, 1, . . . , n.
Find the cumulative distribution function of X = U + V.
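
A simulation sketch of the empirical distribution function of X, for one illustrative choice of n (NumPy; n, the seed, the sample size and the evaluation points are all arbitrary):

    import numpy as np

    # Empirical CDF of X = U + V for n = 3 (illustrative only).
    rng = np.random.default_rng(2)
    n, reps = 3, 500_000
    u = rng.uniform(0.0, 2.0, size=reps)
    v = rng.integers(0, n + 1, size=reps)   # discrete uniform on {0, 1, ..., n}
    x = u + v
    for t in (0.5, 1.0, 1.5, 2.5, 4.0):
        print(f"P(X <= {t}) ≈ {np.mean(x <= t):.4f}")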

5. Suppose X is the number of heads in 10 tosses of a fair coin. Given X = 5, what is the probability that the first head occurred in the third toss?
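
Since there are only 2¹⁰ equally likely head/tail sequences, the conditional probability can be checked by brute-force enumeration (plain Python, no assumptions beyond the fair-coin model):

    from itertools import product

    # Enumerate all 2^10 sequences (1 = head, 0 = tail), keep those with
    # exactly 5 heads, and count those whose first head is on toss 3.
    favourable = with_five_heads = 0
    for seq in product((0, 1), repeat=10):
        if sum(seq) == 5:
            with_five_heads += 1
            if seq[:3] == (0, 0, 1):
                favourable += 1
    print(favourable, "/", with_five_heads, "=", favourable / with_five_heads)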

6. Let Y₁, Y₂, Y₃ be i.i.d. continuous random variables. For i = 1, 2, define Uᵢ as
Uᵢ = 1 if Yᵢ₊₁ > Yᵢ, and 0 otherwise.
Find the mean and variance of U₁ + U₂.
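
A quick simulation sketch (NumPy; standard normal Yᵢ are used here purely for convenience — since the Yᵢ are i.i.d. and continuous, only their relative order matters, so any continuous choice should give the same answer):

    import numpy as np

    # Simulate U1 + U2 where U_i = 1{Y_{i+1} > Y_i} for i.i.d. continuous Y's.
    rng = np.random.default_rng(3)
    y = rng.normal(size=(500_000, 3))
    u1 = (y[:, 1] > y[:, 0]).astype(float)
    u2 = (y[:, 2] > y[:, 1]).astype(float)
    s = u1 + u2
    print("mean ≈", s.mean(), "  variance ≈", s.var())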

7. Let Y be a random variable with probability density function
fY(y | θ) = (1/θ) e^(−y/θ) if y > 0, and 0 otherwise,
with θ > 0. Suppose that the conditional distribution of X given Y = y is N(y, σ²), with σ² > 0. Both θ and σ² are unknown parameters. Let (X₁, Y₁), . . . , (Xₙ, Yₙ) be a random sample from the joint distribution of X and Y.
Find a nontrivial joint sufficient statistic for (θ, σ²).
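
For concreteness, this is how a sample from the joint model can be simulated (NumPy; the parameter values θ = 2, σ = 1.5 and the sample size are arbitrary illustrative choices):

    import numpy as np

    # Y ~ Exponential with mean theta, and X | Y = y ~ N(y, sigma^2).
    rng = np.random.default_rng(4)
    theta, sigma, n = 2.0, 1.5, 10_000
    y = rng.exponential(scale=theta, size=n)   # density (1/theta) exp(-y/theta)
    x = rng.normal(loc=y, scale=sigma)         # conditional normal given Y = y
    print("sample means of X and Y:", x.mean(), y.mean())   # both near theta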

8. Let (X₁, Y₁), . . . , (Xₙ, Yₙ) be a random sample from the discrete distribution with joint probability mass function
fX,Y(x, y) = θ/4 if (x, y) = (0, 0) or (1, 1), and (2 − θ)/4 if (x, y) = (0, 1) or (1, 0),
with 0 ≤ θ ≤ 2. Find the maximum likelihood estimator of θ.
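
A numerical sketch of the likelihood (NumPy): draw a sample for one assumed true value of θ and maximize the log-likelihood over a grid. The true value, sample size and grid are arbitrary choices, and the grid search is only a check, not a derivation of the estimator.

    import numpy as np

    # Pairs (0,0)/(1,1) ("same") each have probability theta/4, so a pair is
    # "same" with probability theta/2; pairs (0,1)/(1,0) each have (2 - theta)/4.
    rng = np.random.default_rng(5)
    theta_true, n = 1.4, 2000
    n_same = int((rng.random(n) < theta_true / 2.0).sum())

    grid = np.linspace(0.001, 1.999, 4000)
    loglik = n_same * np.log(grid / 4.0) + (n - n_same) * np.log((2.0 - grid) / 4.0)
    print("grid maximizer of the likelihood ≈", grid[np.argmax(loglik)])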

9. Let X₁, X₂, . . . be i.i.d. random variables with density fθ(x), x ∈ R, θ ∈ (0, 1) being the unknown parameter. Suppose that there exists an unbiased estimator T of θ based on sample size 1, i.e. Eθ(T(X₁)) = θ. Assume that Var(T(X₁)) < ∞.

(a) Find an estimator Vₙ for θ based on X₁, . . . , Xₙ such that Vₙ is consistent for θ.
(b) Let Sₙ be the MVUE (minimum variance unbiased estimator) of θ based on X₁, . . . , Xₙ. Show that Var(Sₙ) → 0 as n → ∞.
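
The following toy simulation is not a solution to either part; it only illustrates, for one hypothetical family in which such a T exists (X ~ Uniform(0, 2θ), so T(X₁) = X₁ is unbiased for θ), how the variance of an estimator built by averaging T over the sample shrinks as n grows. All numerical choices are arbitrary.

    import numpy as np

    rng = np.random.default_rng(6)
    theta, reps = 0.6, 5_000
    for n in (1, 10, 100, 1000):
        # average of T(X_i) = X_i under the hypothetical uniform family
        vn = rng.uniform(0.0, 2.0 * theta, size=(reps, n)).mean(axis=1)
        print(f"n = {n:4d}: mean ≈ {vn.mean():.4f}, variance ≈ {vn.var():.6f}")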

10. For the data collected via a randomized block design with v treatments and b blocks, the following model is postulated:
E(yᵢⱼ) = µ + τᵢ + βⱼ, 1 ≤ i ≤ v, 1 ≤ j ≤ b,
where τᵢ and βⱼ are the effects of the ith treatment and the jth block respectively, and µ is a general mean. For 1 ≤ i ≤ v, define Qᵢ = Tᵢ − G/v, where Tᵢ is the total of observations under the ith treatment and G = ∑ᵢ₌₁ᵛ Tᵢ. Show that
E(Qᵢ) = (b − b/v)τᵢ, Var(Qᵢ) = σ²(b − b/v), Cov(Qᵢ, Qⱼ) = −(b/v)σ² for i ≠ j,
where σ² is the per observation variance.
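
The variance and covariance expressions can be checked by simulation (NumPy). The design size, effect values, σ and the number of replications below are arbitrary illustrative choices; observations are generated as the model mean plus independent noise with variance σ².

    import numpy as np

    rng = np.random.default_rng(7)
    v, b, sigma, mu = 4, 6, 1.0, 5.0
    tau = np.array([0.5, -0.2, 0.9, -1.2])          # treatment effects
    beta = rng.normal(size=b)                        # block effects (held fixed)
    reps = 100_000

    eps = rng.normal(scale=sigma, size=(reps, v, b))
    y = mu + tau[None, :, None] + beta[None, None, :] + eps
    T = y.sum(axis=2)                                # treatment totals T_i
    Q = T - T.sum(axis=1, keepdims=True) / v         # Q_i = T_i - G/v
    print("Var(Q_1) ≈", Q[:, 0].var(), "  vs sigma^2 (b - b/v) =", sigma**2 * (b - b / v))
    print("Cov(Q_1, Q_2) ≈", np.cov(Q[:, 0], Q[:, 1])[0, 1], "  vs -(b/v) sigma^2 =", -(b / v) * sigma**2)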

11. A straight line regression E(y) = α + βx is to be fitted using four observations. Assume Var(y|x) = σ² for all x. The values of x at which observations are to be made lie in the closed interval [−1, 1]. The following choices of the values of x where observations are to be made are available:

(a) two observations each at x = −1 and x = 1,
(b) one observation each at x = −1 and x = 1 and two observations at x = 0,
(c) one observation each at x = −1, −1/2, 1/2, 1.

If the interest is to estimate the slope with least variance, which of the above strategies would you choose and why?
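
For least squares, the variance of the slope estimate is σ² times the (slope, slope) entry of (X′X)⁻¹, so the three designs can be compared numerically by that entry alone (NumPy; σ² cancels as a common factor):

    import numpy as np

    designs = {
        "(a)": [-1.0, -1.0, 1.0, 1.0],
        "(b)": [-1.0, 0.0, 0.0, 1.0],
        "(c)": [-1.0, -0.5, 0.5, 1.0],
    }
    for name, xs in designs.items():
        X = np.column_stack([np.ones(4), xs])      # columns for alpha and beta
        var_slope = np.linalg.inv(X.T @ X)[1, 1]   # Var(slope estimate) / sigma^2
        print(name, "Var(slope)/sigma^2 =", var_slope)

The design with the smallest printed value is the one with the least slope variance.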

12. Consider a possibly unbalanced coin with probability of heads in each toss being p, where p is unknown. Let X be the number of tails before the first head occurs. Find the uniformly most powerful test of level α for testing H₀ : p = 1/6 against H₁ : p > 1/6.
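
A small simulation (NumPy; the values of p and the sample size are arbitrary) showing how the distribution of X shifts as p moves away from 1/6. It does not by itself determine the form of the UMP test, but it shows what evidence against H₀ looks like here.

    import numpy as np

    rng = np.random.default_rng(8)
    reps = 200_000
    for p in (1.0 / 6.0, 0.3, 0.5):
        x = rng.geometric(p, size=reps) - 1     # tails before the first head
        print(f"p = {p:.3f}: mean(X) ≈ {x.mean():.2f}, P(X >= 10) ≈ {np.mean(x >= 10):.3f}")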
