Chapter 3
1. In the equation, y = β0 + β1 x1 + β2 x2 + u, β2 is a(n) _____.
a. independent variable
b. dependent variable
c. slope parameter
d. intercept parameter
2. Consider the following regression equation: y = β0 + β1 x1 + β2 x2 + u. What does β1 imply?
a. β1 measures the ceteris paribus effect of x1 on x2.
b. β1 measures the ceteris paribus effect of y on x1 .
c. β1 measures the ceteris paribus effect of x1 on y.
d. β1 measures the ceteris paribus effect of x1 on u.
3. If the explained sum of squares is 35 and the total sum of squares is 49, what is the residual sum of
squares?
a. 10
b. 12
c. 18
d. 14
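Note: this follows from the sum-of-squares decomposition SST = SSE + SSR, so the residual sum of squares is SSR = SST − SSE = 49 − 35 = 14.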
4. Which of the following is true of R2?
a. R2 is also called the standard error of regression.
b. A low R2 indicates that the Ordinary Least Squares line fits the data well.
c. R2 usually decreases with an increase in the number of independent variables in a regression.
d. R2 shows what percentage of the total variation in the dependent variable, Y, is explained by the
explanatory variables.
5. The value of R2 always _____.
a. lies below 0
b. lies above 1
c. lies between 0 and 1
d. lies between 1 and 1.5
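Note: for questions 4 and 5, recall that R2 = SSE/SST = 1 − SSR/SST, the fraction of the total sample variation in the dependent variable explained by the regressors; since 0 ≤ SSE ≤ SST, R2 always lies between 0 and 1.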
6. If an independent variable in a multiple linear regression model is an exact linear combination of
other independent variables, the model suffers from the problem of _____.
a. perfect collinearity
b. homoskedasticity
c. heteroskedasticity
d. omitted variable bias
7. The assumption that there are no exact linear relationships among the independent variables in a
multiple linear regression model fails if _____, where n is the sample size and k is the number of
parameters.
a. n>2
b. n=k+1
c. n>k
d. n<k+1
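Note: estimating an intercept plus k slope parameters requires at least k + 1 observations; if n < k + 1, the sample values of the regressors must satisfy an exact linear relationship, so the no-perfect-collinearity assumption fails.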
8. Exclusion of a relevant variable from a multiple linear regression model leads to the problem of
_____.
a. misspecification of the model
b. multicollinearity
c. perfect collinearity
d. homoskedasticity
9. Suppose the variable x2 has been omitted from the following regression equation, y = β0 + β1 x1 + β2 x2 + u. β̃1 is the estimator obtained when x2 is omitted from the equation. The bias in β̃1 is positive if _____.
a. β2 > 0 and x1 and x2 are positively correlated
b. β2 < 0 and x1 and x2 are positively correlated
c. β2 > 0 and x1 and x2 are negatively correlated
d. β2 = 0 and x1 and x2 are negatively correlated
10. Suppose the variable x2 has been omitted from the following regression equation, y = β0 + β1 x1 + β2 x2 + u. β̃1 is the estimator obtained when x2 is omitted from the equation. The bias in β̃1 is negative if _____.
a. β2 > 0 and x1 and x2 are positively correlated
b. β2 < 0 and x1 and x2 are positively correlated
c. β2 = 0 and x1 and x2 are negatively correlated
d. β2 = 0 and x1 and x2 are negatively correlated
11. Suppose the variable x2 has been omitted from the following regression equation, y = β0 + β1 x1 + β2 x2 + u. β̃1 is the estimator obtained when x2 is omitted from the equation. If E(β̃1) > β1, β̃1 is said to _____.
a. have an upward bias
b. have a downward bias
c. be unbiased
d. be biased toward zero
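Note: questions 9-11 use the standard omitted variable bias result. When x2 is left out, E(β̃1) = β1 + β2 δ̃1, where δ̃1 is the slope from regressing x2 on x1, so its sign matches the sign of the correlation between x1 and x2. The bias β2 δ̃1 is therefore positive when β2 and that correlation share the same sign, negative when their signs differ, and zero when β2 = 0; an upward bias means E(β̃1) > β1. A minimal simulation sketch of this result, assuming only numpy is available (names such as beta1_tilde are illustrative):

import numpy as np

rng = np.random.default_rng(0)
n = 100_000
beta0, beta1, beta2 = 1.0, 2.0, 0.5      # true parameters, with beta2 > 0

x1 = rng.normal(size=n)
x2 = 0.8 * x1 + rng.normal(size=n)       # x1 and x2 positively correlated
u = rng.normal(size=n)
y = beta0 + beta1 * x1 + beta2 * x2 + u

# Simple regression of y on x1 alone (x2 omitted): slope = cov(x1, y) / var(x1)
beta1_tilde = np.cov(x1, y)[0, 1] / np.var(x1, ddof=1)
print(beta1_tilde)                        # close to beta1 + beta2 * 0.8 = 2.4, an upward bias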
12. High (but not perfect) correlation between two or more independent variables is called _____.
a. heteroskedasticity
b. homoskedasticity
c. multicollinearity
d. micronumerosity
13. The term _____ refers to the problem of small sample size.
a. micronumerosity
b. multicollinearity
c. homoskedasticity
d. heteroskedasticity
14. Find the degrees of freedom in a regression model that has 10 observations and 7 independent
variables.
a. 17
b. 2
c. 3
d. 4
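Note: with an intercept, the degrees of freedom equal n − k − 1 (observations minus estimated parameters); here, 10 − 7 − 1 = 2.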
15. The Gauss-Markov theorem will not hold if _____.
a. the error term has the same variance given any values of the explanatory variables
b. the error term has an expected value of zero given any values of the independent variables
c. the independent variables have exact linear relationships among them
d. the regression model relies on the method of random sampling for collection of data
16. The term “linear” in a multiple linear regression model means that the equation is linear in
parameters.
17. The key assumption for the general multiple regression model is that all factors in the unobserved
error term be correlated with the explanatory variables.
18. The coefficient of determination (R2) decreases when an independent variable is added to a multiple
regression model.
19. An explanatory variable is said to be exogenous if it is correlated with the error term.
20. A larger error variance makes it difficult to estimate the partial effect of any of the independent
variables on the dependent variable.
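Note: statement 20 reflects the sampling variance of an OLS slope estimator, Var(β̂j) = σ2 / [SSTj (1 − Rj2)], where σ2 is the error variance, SSTj is the total sample variation in xj, and Rj2 is the R-squared from regressing xj on the other independent variables; a larger σ2 inflates every Var(β̂j), making the partial effects harder to estimate precisely.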