Logit Analysis

Chapter 10 of 'Introduction to Econometrics' by Christopher Dougherty discusses binary choice models, specifically focusing on logit analysis and maximum likelihood estimation. It explains the limitations of linear probability models and introduces the logistic function as a solution for predicting probabilities that remain within the range of 0 to 1. The chapter also details the application of the logit model to real-world scenarios, such as predicting high school graduation based on the ASVABC score.


Dougherty
Introduction to Econometrics, 5th edition

Chapter 10: Binary Choice and Limited Dependent Variable Models, and Maximum Likelihood Estimation

© Christopher Dougherty, 2016. All rights reserved.


BINARY CHOICE MODELS: LOGIT ANALYSIS

[Figure: the linear probability model. The fitted line b1 + b2Xi rises with X, exceeding p = 1 at point A and falling below p = 0 at point B, so fitted probabilities can lie outside the unit interval.]

The linear probability model may make the nonsense predictions that an event will occur
with probability greater than 1 or less than 0.

1
BINARY CHOICE MODELS: LOGIT ANALYSIS

p = F(Z) = 1/(1 + e^(−Z))

Z = β1 + β2X

[Figure: F(Z) plotted for Z from −8 to 6, rising in an S-shape from 0 to 1.]

The usual way of avoiding this problem is to hypothesize that the probability is a sigmoid
(S-shaped) function of Z, F(Z), where Z is a function of the explanatory variables.

2
BINARY CHOICE MODELS: LOGIT ANALYSIS

p = F(Z) = 1/(1 + e^(−Z))

Z = β1 + β2X

[Figure: F(Z) plotted for Z from −8 to 6, rising in an S-shape from 0 to 1.]

Several mathematical functions are sigmoid in character. One is the logistic function shown here. As Z goes to infinity, e^(−Z) goes to 0 and p goes to 1 (but cannot exceed 1). As Z goes to minus infinity, e^(−Z) goes to infinity and p goes to 0 (but cannot fall below 0).
3
3
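The limiting behavior just described is easy to check numerically. The following minimal Python sketch (not part of the original slides) evaluates the logistic function at a few points:

```python
import math

def F(Z):
    """Logistic function: p = 1 / (1 + e^(-Z))."""
    return 1.0 / (1.0 + math.exp(-Z))

# As Z increases, e^(-Z) shrinks toward 0 and p approaches (but never exceeds) 1;
# as Z decreases, e^(-Z) blows up and p approaches (but never falls below) 0.
print(F(0))    # 0.5, the midpoint of the sigmoid
print(F(10))   # close to 1
print(F(-10))  # close to 0
```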
BINARY CHOICE MODELS: LOGIT ANALYSIS

p = F(Z) = 1/(1 + e^(−Z))

Z = β1 + β2X

[Figure: F(Z) plotted for Z from −8 to 6, rising in an S-shape from 0 to 1.]

The model implies that, for values of Z less than –2, the probability of the event occurring is
low and insensitive to variations in Z. Likewise, for values greater than 2, the probability is
high and insensitive to variations in Z.
4
BINARY CHOICE MODELS: LOGIT ANALYSIS

p = F(Z) = 1/(1 + e^(−Z))

Quotient rule: if Y = U/V, then

dY/dZ = (V dU/dZ − U dV/dZ) / V²

To obtain an expression for the sensitivity, we differentiate F(Z) with respect to Z. The box
gives the general rule for differentiating a quotient.

5
BINARY CHOICE MODELS: LOGIT ANALYSIS

p = F(Z) = 1/(1 + e^(−Z))

U = 1, so dU/dZ = 0
V = 1 + e^(−Z), so dV/dZ = −e^(−Z)

dp/dZ = [(1 + e^(−Z)) × 0 − 1 × (−e^(−Z))] / (1 + e^(−Z))² = e^(−Z) / (1 + e^(−Z))²

We apply the rule to the expression for F(Z).

6
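As a quick check on this derivation, a short Python sketch (not in the original slides) can compare the analytical derivative with a central finite difference:

```python
import math

def F(Z):
    # Logistic function p = 1 / (1 + e^(-Z))
    return 1.0 / (1.0 + math.exp(-Z))

def f(Z):
    # Analytical derivative from the quotient rule: e^(-Z) / (1 + e^(-Z))^2
    return math.exp(-Z) / (1.0 + math.exp(-Z)) ** 2

h = 1e-6
for Z in (-2.0, 0.0, 1.5):
    numeric = (F(Z + h) - F(Z - h)) / (2 * h)   # central difference
    print(Z, f(Z), numeric)                      # the two columns should agree
```

Note that f(0) = 1/4 exactly, which is where the marginal function reaches its maximum.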
BINARY CHOICE MODELS: LOGIT ANALYSIS

p = F(Z) = 1/(1 + e^(−Z))

f(Z) = dp/dZ = e^(−Z) / (1 + e^(−Z))²

[Figure: f(Z) plotted for Z from −8 to 6, a bell-shaped curve peaking at f(0) = 0.25.]

The sensitivity, as measured by the slope, is greatest when Z is 0. The marginal function,
f(Z), reaches a maximum at this point.

7
BINARY CHOICE MODELS: LOGIT ANALYSIS

p = F(Z) = 1/(1 + e^(−Z))

Z = β1 + β2X

[Figure: F(Z) plotted for Z from −8 to 6, rising in an S-shape from 0 to 1.]

For a nonlinear model of this kind, maximum likelihood estimation is much superior to the
use of the least squares principle for estimating the parameters. More details concerning
its application are given at the end of this sequence.
8
BINARY CHOICE MODELS: LOGIT ANALYSIS

p = F(Z) = 1/(1 + e^(−Z))

Z = β1 + β2 ASVABC

[Figure: F(Z) plotted for Z from −8 to 6, rising in an S-shape from 0 to 1.]

We will apply this model to the graduating from high school example described in the linear
probability model sequence. We will begin by assuming that ASVABC is the only relevant
explanatory variable, so Z is a simple function of it.
9
BINARY CHOICE MODELS: LOGIT ANALYSIS

. logit GRAD ASVABC


Iteration 0: log likelihood = -158.10206
Iteration 1: log likelihood = -135.72326
Iteration 2: log likelihood = -131.33688
Iteration 3: log likelihood = -131.22325
Iteration 4: log likelihood = -131.223

The Stata command is logit, followed by the outcome variable and the explanatory
variable(s). Maximum likelihood estimation is an iterative process, so the first part of the
output will be like that shown.
10
BINARY CHOICE MODELS: LOGIT ANALYSIS

. logit GRAD ASVABC


Iteration 0: log likelihood = -158.10206
Iteration 1: log likelihood = -135.72326
Iteration 2: log likelihood = -131.33688
Iteration 3: log likelihood = -131.22325
Iteration 4: log likelihood = -131.223
----------------------------------------------------------------------------
Logistic regression Number of obs = 500
LR chi2(1) = 53.76
Prob > chi2 = 0.0000
Log likelihood = -131.223 Pseudo R2 = 0.1700
----------------------------------------------------------------------------
GRAD | Coef. Std. Err. z P>|z| [95% Conf. Interval]
-----------+----------------------------------------------------------------
ASVABC | 1.336514 .2021962 6.61 0.000 .9402163 1.732811
_cons | 2.413086 .187316 12.88 0.000 2.045953 2.780218
----------------------------------------------------------------------------

Zˆ 2.4131  1.3365 ASVABC

In this case the coefficients of the Z function are as shown.

11
BINARY CHOICE MODELS: LOGIT ANALYSIS

p̂ = 1 / (1 + e^(−(2.4131 + 1.3365 ASVABCi)))

[Figure: the cumulative probability (left axis, 0 to 1.00) and the marginal effect (right axis, 0 to 0.4) plotted against ASVABC from −3 to 3.]

Ẑ = 2.4131 + 1.3365 ASVABC

Since there is only one explanatory variable, we can draw the probability function and
marginal effect function as functions of ASVABC.

12
BINARY CHOICE MODELS: LOGIT ANALYSIS

p̂ = 1 / (1 + e^(−(2.4131 + 1.3365 ASVABCi)))

[Figure: the cumulative probability (left axis, 0 to 1.00) and the marginal effect (right axis, 0 to 0.4) plotted against ASVABC from −3 to 3.]

Ẑ = 2.4131 + 1.3365 ASVABC

We see that ASVABC has its greatest effect on graduating when it is below ‒1, that is, in the
low ability range. Any individual with a score above the average (0) is almost certain to
graduate.
13
BINARY CHOICE MODELS: LOGIT ANALYSIS

. logit GRAD ASVABC


Iteration 0: log likelihood = -158.10206
Iteration 1: log likelihood = -135.72326
Iteration 2: log likelihood = -131.33688
Iteration 3: log likelihood = -131.22325
Iteration 4: log likelihood = -131.223
----------------------------------------------------------------------------
Logistic regression Number of obs = 500
LR chi2(1) = 53.76
Prob > chi2 = 0.0000
Log likelihood = -131.223 Pseudo R2 = 0.1700
----------------------------------------------------------------------------
GRAD | Coef. Std. Err. z P>|z| [95% Conf. Interval]
-----------+----------------------------------------------------------------
ASVABC | 1.336514 .2021962 6.61 0.000 .9402163 1.732811
_cons | 2.413086 .187316 12.88 0.000 2.045953 2.780218
----------------------------------------------------------------------------

Zˆ 2.4131  1.3365 ASVABC

The t statistic indicates that the effect of variations in ASVABC on the probability of
graduating from high school is highly significant.

14
BINARY CHOICE MODELS: LOGIT ANALYSIS

. logit GRAD ASVABC


Iteration 0: log likelihood = -158.10206
Iteration 1: log likelihood = -135.72326
Iteration 2: log likelihood = -131.33688
Iteration 3: log likelihood = -131.22325
Iteration 4: log likelihood = -131.223
----------------------------------------------------------------------------
Logistic regression Number of obs = 500
LR chi2(1) = 53.76
Prob > chi2 = 0.0000
Log likelihood = -131.223 Pseudo R2 = 0.1700
----------------------------------------------------------------------------
GRAD | Coef. Std. Err. z P>|z| [95% Conf. Interval]
-----------+----------------------------------------------------------------
ASVABC | 1.336514 .2021962 6.61 0.000 .9402163 1.732811
_cons | 2.413086 .187316 12.88 0.000 2.045953 2.780218
----------------------------------------------------------------------------

Zˆ 2.4131  1.3365 ASVABC

Strictly speaking, the t statistic is valid only for large samples, so the normal distribution is
the reference distribution. For this reason the statistic is denoted z in the Stata output.
This z has nothing to do with our Z function.
15
BINARY CHOICE MODELS: LOGIT ANALYSIS

p̂ = 1 / (1 + e^(−(2.4131 + 1.3365 ASVABCi)))

[Figure: the cumulative probability (left axis, 0 to 1.00) and the marginal effect (right axis, 0 to 0.4) plotted against ASVABC from −3 to 3.]

Ẑ = 2.4131 + 1.3365 ASVABC

The coefficients of the Z function do not have any direct intuitive interpretation.

16
BINARY CHOICE MODELS: LOGIT ANALYSIS

p = F(Z) = 1/(1 + e^(−Z))

Z = β1 + β2X2 + ... + βkXk

However, we can use them to quantify the marginal effect of a change in ASVABC on the
probability of graduating. We will do this theoretically for the general case where Z is a
function of several explanatory variables.
17
BINARY CHOICE MODELS: LOGIT ANALYSIS

p = F(Z) = 1/(1 + e^(−Z))

Z = β1 + β2X2 + ... + βkXk

∂p/∂Xi = (dp/dZ)(∂Z/∂Xi) = f(Z) βi = βi e^(−Z) / (1 + e^(−Z))²

Since p is a function of Z, and Z is a function of the X variables, the marginal effect of Xi on p can be written as the product of the marginal effect of Z on p and the marginal effect of Xi on Z.
18
BINARY CHOICE MODELS: LOGIT ANALYSIS

p = F(Z) = 1/(1 + e^(−Z))

Z = β1 + β2X2 + ... + βkXk

∂p/∂Xi = (dp/dZ)(∂Z/∂Xi) = f(Z) βi

f(Z) = dp/dZ = e^(−Z) / (1 + e^(−Z))²

We have already derived an expression for dp/dZ. The marginal effect of Xi on Z is given by
its b coefficient.

19
BINARY CHOICE MODELS: LOGIT ANALYSIS

p = F(Z) = 1/(1 + e^(−Z))

Z = β1 + β2X2 + ... + βkXk

∂p/∂Xi = (dp/dZ)(∂Z/∂Xi) = f(Z) βi

f(Z) = dp/dZ = e^(−Z) / (1 + e^(−Z))²

Hence we obtain an expression for the marginal effect of Xi on p.

20
BINARY CHOICE MODELS: LOGIT ANALYSIS

p = F(Z) = 1/(1 + e^(−Z))

Z = β1 + β2X2 + ... + βkXk

∂p/∂Xi = (dp/dZ)(∂Z/∂Xi) = f(Z) βi

f(Z) = dp/dZ = e^(−Z) / (1 + e^(−Z))²

The marginal effect is not constant because it depends on the value of Z, which in turn
depends on the values of the explanatory variables. A common procedure is to evaluate it
for the sample means of the explanatory variables.
21
BINARY CHOICE MODELS: LOGIT ANALYSIS

. sum ASVABC

Variable | Obs Mean Std. Dev. Min Max
-----------+--------------------------------------------------------
ASVABC | 500 .2715089 .8985844 -2.219903 2.640667

The sample mean of ASVABC in this sample is 0.2715.

22
BINARY CHOICE MODELS: LOGIT ANALYSIS

. sum ASVABC

Variable | Obs Mean Std. Dev. Min Max
-----------+--------------------------------------------------------
ASVABC | 500 .2715089 .8985844 -2.219903 2.640667

Ẑ = b̂1 + b̂2X̄ = 2.4131 + 1.3365 × 0.2715 = 2.7760


Logistic regression Number of obs = 500
LR chi2(1) = 53.76
Prob > chi2 = 0.0000
Log likelihood = -131.223 Pseudo R2 = 0.1700
----------------------------------------------------------------------------
GRAD | Coef. Std. Err. z P>|z| [95% Conf. Interval]
-----------+----------------------------------------------------------------
ASVABC | 1.336514 .2021962 6.61 0.000 .9402163 1.732811
_cons | 2.413086 .187316 12.88 0.000 2.045953 2.780218
----------------------------------------------------------------------------

When evaluated at the mean, Z is equal to 2.7760.

23
BINARY CHOICE MODELS: LOGIT ANALYSIS

. sum ASVABC

Variable | Obs Mean Std. Dev. Min Max
-----------+--------------------------------------------------------
ASVABC | 500 .2715089 .8985844 -2.219903 2.640667

Ẑ = 2.4131 + 1.3365 × 0.2715 = 2.7760

e^(−Ẑ) = e^(−2.7760) = 0.0623

p̂ = F(Ẑ) = 1 / (1 + e^(−Ẑ)) = 1/1.0623 = 0.9414

e^(−Ẑ) is 0.0623, so F(Ẑ) is 0.9414. There is a 94.1 percent probability that an individual with an average ASVABC score will graduate from high school.

24
BINARY CHOICE MODELS: LOGIT ANALYSIS

. sum ASVABC

Variable | Obs Mean Std. Dev. Min Max
-----------+--------------------------------------------------------
ASVABC | 500 .2715089 .8985844 -2.219903 2.640667

Ẑ = 2.4131 + 1.3365 × 0.2715 = 2.7760

e^(−Ẑ) = e^(−2.7760) = 0.0623

f(Ẑ) = e^(−Ẑ) / (1 + e^(−Ẑ))² = 0.0623 / (1.0623)² = 0.0552

f(Z) is 0.0552.

25
BINARY CHOICE MODELS: LOGIT ANALYSIS

. sum ASVABC

Variable | Obs Mean Std. Dev. Min Max
-----------+--------------------------------------------------------
ASVABC | 500 .2715089 .8985844 -2.219903 2.640667

Ẑ = 2.4131 + 1.3365 × 0.2715 = 2.7760

e^(−Ẑ) = e^(−2.7760) = 0.0623

f(Ẑ) = e^(−Ẑ) / (1 + e^(−Ẑ))² = 0.0552

dp/dASVABC = (dp/dZ)(dZ/dASVABC) = f(Ẑ) × b̂2 = 0.0552 × 1.3365 = 0.0738

The marginal effect, evaluated at the mean, is therefore 0.0738. This implies that a one unit
increase in ASVABC would increase the probability of graduating from high school by 7.4
percent.
26
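The chain of calculations on this slide can be reproduced in a few lines of Python (a sketch using the rounded coefficients from the Stata output, not part of the original slides):

```python
import math

b1, b2 = 2.4131, 1.3365       # logit coefficients from the Stata output
mean_asvabc = 0.2715          # sample mean of ASVABC

Z = b1 + b2 * mean_asvabc                        # about 2.7760
p = 1.0 / (1.0 + math.exp(-Z))                   # about 0.9414
fZ = math.exp(-Z) / (1.0 + math.exp(-Z)) ** 2    # about 0.0552
marginal = fZ * b2                               # about 0.0738
print(round(Z, 4), round(p, 4), round(fZ, 4), round(marginal, 4))
```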
BINARY CHOICE MODELS: LOGIT ANALYSIS

[Figure: cumulative and marginal effect curves plotted against ASVABC. At the sample mean ASVABC = 0.27, the cumulative probability is 0.941 and the marginal effect is 0.074.]

In this example, the marginal effect at the mean of ASVABC is low. An increase of a whole
standard deviation increases the probability by only 7.4 percent. The reason is that anyone
with an average score is almost certain to graduate anyway.
27
BINARY CHOICE MODELS: LOGIT ANALYSIS

. sum ASVABC

Variable | Obs Mean Std. Dev. Min Max
-----------+--------------------------------------------------------
ASVABC | 500 .2715089 .8985844 -2.219903 2.640667

Ẑ = 2.4131 + 1.3365 × (−2) = −0.2599

e^(−Ẑ) = e^(0.2599) = 1.2969

p̂ = F(Ẑ) = 1 / (1 + e^(−Ẑ)) = 1/2.2969 = 0.4354

To show that the marginal effect varies, we will also calculate it for ASVABC equal to ‒2,
two standard deviations below the mean. For this value of ASVABC, the probability of
graduating is only 43.5 percent.
28
BINARY CHOICE MODELS: LOGIT ANALYSIS

. sum ASVABC

Variable | Obs Mean Std. Dev. Min Max
-----------+--------------------------------------------------------
ASVABC | 500 .2715089 .8985844 -2.219903 2.640667

Ẑ = 2.4131 + 1.3365 × (−2) = −0.2599

e^(−Ẑ) = e^(0.2599) = 1.2969

f(Ẑ) = e^(−Ẑ) / (1 + e^(−Ẑ))² = 1.2969 / (2.2969)² = 0.2458

dp/dASVABC = f(Ẑ) × b̂2 = 0.2458 × 1.3365 = 0.3285

For ASVABC equal to ‒2, a one unit increase in ASVABC increases the probability of
graduating by 32.9 percent.

29
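Wrapping the same arithmetic in a function makes it easy to see how the marginal effect varies with the score (a Python sketch, not part of the original slides):

```python
import math

B1, B2 = 2.4131, 1.3365   # logit coefficients from the Stata output

def marginal_effect(asvabc):
    """Marginal effect dp/dASVABC = f(Z) * b2 at a given ASVABC score."""
    Z = B1 + B2 * asvabc
    fZ = math.exp(-Z) / (1.0 + math.exp(-Z)) ** 2
    return fZ * B2

print(round(marginal_effect(0.2715), 4))   # at the mean: about 0.0738
print(round(marginal_effect(-2.0), 4))     # two s.d. below the mean: about 0.3285
```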
BINARY CHOICE MODELS: LOGIT ANALYSIS

[Figure: cumulative and marginal effect curves plotted against ASVABC. At ASVABC = −2, the cumulative probability is 0.435 and the marginal effect is 0.329.]

For an individual with a score of ‒2, with only a 44 percent probability of graduating, an
increase in the score has a relatively large impact.

30
BINARY CHOICE MODELS: LOGIT ANALYSIS

. logit GRAD ASVABC SM SF MALE


----------------------------------------------------------------------------
Logistic regression Number of obs = 500
LR chi2(4) = 62.72
Prob > chi2 = 0.0000
Log likelihood = -126.74232 Pseudo R2 = 0.1984
----------------------------------------------------------------------------
GRAD | Coef. Std. Err. z P>|z| [95% Conf. Interval]
-----------+----------------------------------------------------------------
ASVABC | 1.145117 .2108465 5.43 0.000 .731865 1.558368
SM | .1250287 .078239 1.60 0.110 -.028317 .2783744
SF | .1077591 .0729015 1.48 0.139 -.0351251 .2506434
MALE | -.3405648 .3390381 -1.00 0.315 -1.005067 .3239376
_cons | -.350845 1.031741 -0.34 0.734 -2.37302 1.67133
----------------------------------------------------------------------------

Here is the output for a model with a somewhat better specification, with the iteration
messages deleted.

31
BINARY CHOICE MODELS: LOGIT ANALYSIS

. sum ASVABC SM SF MALE

Variable | Obs Mean Std. Dev. Min Max
-----------+--------------------------------------------------------
ASVABC | 500 .2715089 .8985844 -2.219903 2.640667
SM | 500 13.542 2.649202 1 20
SF | 500 13.282 2.877331 3 20
MALE | 500 .5 .5005008 0 1

We will estimate the marginal effects, putting all the explanatory variables equal to their
sample means.

32
BINARY CHOICE MODELS: LOGIT ANALYSIS

Logit: Marginal Effects

          mean      b̂        product
ASVABC    0.2715    1.1451    0.3109
SM       13.54      0.1250    1.6925
SF       13.28      0.1078    1.4316
MALE      0.50     -0.3406   -0.1703
constant  1.00     -0.3508   -0.3508
Total                         2.9139

Z = β1 + β2X2 + ... + βkXk

The first step is to calculate Z, when the X variables are equal to their sample means.

33
BINARY CHOICE MODELS: LOGIT ANALYSIS

Logit: Marginal Effects

          mean      b̂        product
ASVABC    0.2715    1.1451    0.3109
SM       13.54      0.1250    1.6925
SF       13.28      0.1078    1.4316
MALE      0.50     -0.3406   -0.1703
constant  1.00     -0.3508   -0.3508
Total                         2.9139

e^(−Z) = e^(−2.9139) = 0.0543

f(Z) = e^(−Z) / (1 + e^(−Z))² = 0.0488

We then calculate f(Z).

34
BINARY CHOICE MODELS: LOGIT ANALYSIS

Logit: Marginal Effects

          mean      b̂        product   f(Z)      b̂ f(Z)
ASVABC    0.2715    1.1451    0.3109    0.0488    0.0559
SM       13.54      0.1250    1.6925    0.0488    0.0061
SF       13.28      0.1078    1.4316    0.0488    0.0053
MALE      0.50     -0.3406   -0.1703    0.0488   -0.0166
constant  1.00     -0.3508   -0.3508
Total                         2.9139

∂p/∂Xi = (dp/dZ)(∂Z/∂Xi) = f(Z) βi

The estimated marginal effects are f(Z) multiplied by the respective coefficients.

35
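The whole table can be generated programmatically. This Python sketch (using the rounded coefficients and sample means from the slides, not part of the original deck) reproduces Z, f(Z), and the marginal effects:

```python
import math

# Rounded coefficients and sample means from the slides.
coef = {"ASVABC": 1.1451, "SM": 0.1250, "SF": 0.1078,
        "MALE": -0.3406, "_cons": -0.3508}
mean = {"ASVABC": 0.2715, "SM": 13.54, "SF": 13.28,
        "MALE": 0.50, "_cons": 1.0}

Z = sum(coef[k] * mean[k] for k in coef)          # about 2.9139
fZ = math.exp(-Z) / (1.0 + math.exp(-Z)) ** 2     # about 0.0488
marginal = {k: round(fZ * coef[k], 4) for k in coef if k != "_cons"}

print(round(Z, 4), round(fZ, 4))
print(marginal)   # ASVABC about 0.0559, MALE about -0.0166
```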
BINARY CHOICE MODELS: LOGIT ANALYSIS

Logit: Marginal Effects

          mean      b̂        product   f(Z)      b̂ f(Z)
ASVABC    0.2715    1.1451    0.3109    0.0488    0.0559
SM       13.54      0.1250    1.6925    0.0488    0.0061
SF       13.28      0.1078    1.4316    0.0488    0.0053
MALE      0.50     -0.3406   -0.1703    0.0488   -0.0166
constant  1.00     -0.3508   -0.3508
Total                         2.9139

∂p/∂Xi = (dp/dZ)(∂Z/∂Xi) = f(Z) βi

According to the computations, a one standard deviation increase in the ASVABC score
increases the probability of graduating from high school by 5.6 percent.

36
BINARY CHOICE MODELS: LOGIT ANALYSIS

Logit: Marginal Effects

          mean      b̂        product   f(Z)      b̂ f(Z)
ASVABC    0.2715    1.1451    0.3109    0.0488    0.0559
SM       13.54      0.1250    1.6925    0.0488    0.0061
SF       13.28      0.1078    1.4316    0.0488    0.0053
MALE      0.50     -0.3406   -0.1703    0.0488   -0.0166
constant  1.00     -0.3508   -0.3508
Total                         2.9139

∂p/∂Xi = (dp/dZ)(∂Z/∂Xi) = f(Z) βi

Being male decreases it by 1.7 percent. Variations in parental education appear to have
negligible effects.

37
BINARY CHOICE MODELS: LOGIT ANALYSIS

. logit GRAD ASVABC SM SF MALE


----------------------------------------------------------------------------
Logistic regression Number of obs = 500
LR chi2(4) = 62.72
Prob > chi2 = 0.0000
Log likelihood = -126.74232 Pseudo R2 = 0.1984
----------------------------------------------------------------------------
GRAD | Coef. Std. Err. z P>|z| [95% Conf. Interval]
-----------+----------------------------------------------------------------
ASVABC | 1.145117 .2108465 5.43 0.000 .731865 1.558368
SM | .1250287 .078239 1.60 0.110 -.028317 .2783744
SF | .1077591 .0729015 1.48 0.139 -.0351251 .2506434
MALE | -.3405648 .3390381 -1.00 0.315 -1.005067 .3239376
_cons | -.350845 1.031741 -0.34 0.734 -2.37302 1.67133
----------------------------------------------------------------------------

From the regression output it can be seen that the effect of ASVABC was significant at the
0.1 percent level but the effects of the other variables were not significant.

38
BINARY CHOICE MODELS: LOGIT ANALYSIS

p = F(Z) = 1/(1 + e^(−Z)) = 1 / (1 + e^(−β1 − β2 ASVABC))

Z = β1 + β2 ASVABC

Individuals who graduated: outcome probability

1 / (1 + e^(−β1 − β2 ASVABCi))

This sequence will conclude with an outline explanation of how the model is fitted using
maximum likelihood estimation.

39
BINARY CHOICE MODELS: LOGIT ANALYSIS

p = F(Z) = 1/(1 + e^(−Z)) = 1 / (1 + e^(−β1 − β2 ASVABC))

Z = β1 + β2 ASVABC

Individuals who graduated: outcome probability

1 / (1 + e^(−β1 − β2 ASVABCi))

In the case of an individual who graduated, the probability of that outcome is F(Z). We will
give subscripts 1, ..., s to the individuals who graduated.

40
BINARY CHOICE MODELS: LOGIT ANALYSIS

p = F(Z) = 1/(1 + e^(−Z)) = 1 / (1 + e^(−β1 − β2 ASVABC))

Z = β1 + β2 ASVABC

Individuals who graduated: outcome probability

1 / (1 + e^(−β1 − β2 ASVABCi))

Individuals who did not graduate: outcome probability

1 − 1 / (1 + e^(−β1 − β2 ASVABCi))

In the case of an individual who did not graduate, the probability of that outcome is 1 – F(Z).
We will give subscripts s+1, ..., n to these individuals.

41
BINARY CHOICE MODELS: LOGIT ANALYSIS

Maximize the joint probability of the observed outcomes:

F(Z1) × ... × F(Zs) × [1 − F(Zs+1)] × ... × [1 − F(Zn)]

= 1/(1 + e^(−b̂1 − b̂2 ASVABC1)) × ... × 1/(1 + e^(−b̂1 − b̂2 ASVABCs))   (did graduate)
  × [1 − 1/(1 + e^(−b̂1 − b̂2 ASVABCs+1))] × ... × [1 − 1/(1 + e^(−b̂1 − b̂2 ASVABCn))]   (did not graduate)

We choose the estimates of β1 and β2 so as to maximize the joint probability of the outcomes, that is, F(Z1) × ... × F(Zs) × [1 − F(Zs+1)] × ... × [1 − F(Zn)]. There are no closed-form expressions for the estimates. They have to be determined iteratively by a trial-and-error process.
42
42
Copyright Christopher Dougherty 2016.

These slideshows may be downloaded by anyone, anywhere for personal use.


Subject to respect for copyright and, where appropriate, attribution, they may be
used as a resource for teaching an econometrics course. There is no need to
refer to the author.

The content of this slideshow comes from Section 10.2 of C. Dougherty,


Introduction to Econometrics, fifth edition 2016, Oxford University Press.
Additional (free) resources for both students and instructors may be
downloaded from the OUP Online Resource Centre
http://www.oxfordtextbooks.co.uk/orc/dougherty5e/.

Individuals who are studying econometrics on their own who feel that they
might benefit from participation in a formal course should consider the London
School of Economics summer school course
EC212 Introduction to Econometrics
http://www2.lse.ac.uk/study/summerSchools/summerSchool/Home.aspx
or the University of London International Programmes distance learning course
EC2020 Elements of Econometrics
www.londoninternational.ac.uk/lse.

2016.05.19
