
UNIT 2: INFERENCE

Mª Carmen García Centeno


Roberto Atanes Torres
INFERENCE

OBJECTIVES

1. Individual significance tests.

2. Joint significance tests.

3. Linear hypothesis tests.

4. Use PcGive to estimate and test hypotheses.

2
INFERENCE

INDEX

1. Introduction
2. Single parameter significance test
3. Two or more parameters being equal to zero test
4. Intervals and confidence regions
5. Linear hypothesis testing
6. Structural change test
7. Conclusions
8. Bibliography

3
INFERENCE

INTRODUCTION
Estimation and hypothesis testing are particularly important in
Econometrics.

Regarding estimation there are two options: point estimation and interval estimation.

We have already seen point estimation with OLS and ML.

Interval estimation builds an interval around a point estimate and is crucial to measure the accuracy of that point estimate.

Confidence intervals can also be used to determine whether a given hypothesis is compatible with the data.
4
INFERENCE

INTRODUCTION
Main concepts regarding hypothesis testing:
§ Type I error is the probability of rejecting the null hypothesis when it is true. It is denoted by α, it is also known as the significance level, and it determines the size of the critical region.

§ The most commonly used significance levels in Econometrics are 1%, 5% or, at most, 10%. Its complement, (1 − α), is called the confidence level and is the probability of accepting the null hypothesis when it is true.

§ Type II error is the probability of accepting the null hypothesis when it is false.
Which error do you think is worse?

5
INFERENCE

INTRODUCTION
§ The interval estimator provides a set of values within which we expect to find the true value of the parameter. If we wish to know how close $\hat{\beta}_1$ is to $\beta_1$, we must find a positive value $\delta$ such that:

$$\Pr\left(\hat{\beta}_1 - \delta \le \beta_1 \le \hat{\beta}_1 + \delta\right) = 1 - \alpha \qquad (1)$$

§ Both ends of the interval are known as the confidence limits, $\hat{\beta}_1 + \delta$ being the upper limit and $\hat{\beta}_1 - \delta$ the lower limit of the interval.
§ Equation (1) says that the probability that the interval contains $\beta_1$ is $(1 - \alpha)$: the parameter, although unknown, is a constant, so it is the interval that is random.

6
INFERENCE

INTRODUCTION
§ The goal of testing a single parameter is to know whether it is close enough to a hypothetical value. Nonetheless, tests can be arranged for one or more parameters.
§ The hypothesis we take as true is the null hypothesis ($H_0$). The null hypothesis will be simple $(H_0: \beta_1 = 0)$.
§ The hypothesis against $H_0$ is the alternative hypothesis ($H_1$) and it can be simple or composite $(H_1: \beta_1 = 1;\ H_1: \beta_1 \ne 1;\ H_1: \beta_1 < 1;\ H_1: \beta_1 > 1)$.
§ To arrange the test we need to know the probability distribution of the statistic, the significance level, the critical value, and the size of the critical and acceptance regions.

7
INFERENCE

SAMPLING DISTRIBUTIONS OF THE OLS ESTIMATORS
The OLS assumptions establish that the estimated parameters are unbiased and, by the Gauss-Markov theorem, have minimum variance. Knowing their mean and variance is useful to assess their accuracy, but we also need to know their probability distribution.

We assume that the population error u is independent of the explanatory variables and follows a normal distribution:

$$u \sim N\left(0, \sigma^2\right)$$

If $u \sim N(0, \sigma^2 I)$, then

$$\hat{\beta} \sim N\left(\beta, \sigma^2 (X'X)^{-1}\right),$$

so each estimator $\hat{\beta}_i$ is normally distributed with variance $\sigma^2 a_{ii}$, where $a_{ii}$ is the $i$-th diagonal element of $(X'X)^{-1}$.
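A minimal simulation sketch (not part of the original slides) illustrating this result: with normal errors, the OLS slope estimator is itself normally distributed around the true value. All numbers below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100
beta = np.array([1.0, 0.5])                       # hypothetical true parameters
X = np.column_stack([np.ones(n), rng.normal(size=n)])

slope_draws = []
for _ in range(2000):
    u = rng.normal(scale=2.0, size=n)             # u ~ N(0, sigma^2) with sigma = 2
    y = X @ beta + u
    b_hat = np.linalg.solve(X.T @ X, X.T @ y)     # OLS estimate
    slope_draws.append(b_hat[1])

print(np.mean(slope_draws), np.std(slope_draws))  # centred near 0.5, roughly normal spread
```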

8
INFERENCE

TESTING ONE PARAMETER


Parameters are unknown population constants, but we can formulate hypotheses about their values and use statistical inference to test them. Under the normality assumption, the estimators follow this distribution:

$$\hat{\beta}_i \sim N\left(\beta_i, \sigma^2 a_{ii}\right)$$

If we standardize it we obtain a variable Z that follows a standard normal distribution:

$$Z = \frac{\hat{\beta}_i - \beta_i}{\sqrt{\sigma^2 a_{ii}}}$$

But we cannot use the normal distribution because the variance $\sigma^2$ is unknown. For that reason we work with a Student's t distribution with n − k degrees of freedom:

$$t = \frac{\hat{\beta}_i - \beta_i}{\sqrt{\hat{\sigma}^2 a_{ii}}} \sim t_{(n-k)}$$
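As a quick illustration (a sketch, not part of the original slides), the t statistic for a single coefficient can be computed and compared with its critical value. The inputs below are the Years educ. coefficient and standard error from the salary model shown on a later slide; the choice of a two-sided 5% test is an assumption for the example.

```python
from scipy import stats

beta_hat, se, beta_0 = 0.011242, 0.0093, 0.0      # estimate, SE, hypothesised value
n, k, alpha = 254, 4, 0.05

t_stat = (beta_hat - beta_0) / se                 # about 1.21
t_crit = stats.t.ppf(1 - alpha / 2, df=n - k)     # two-sided 5% critical value, about 1.97
print(t_stat, t_crit, abs(t_stat) > t_crit)       # False: H0 is not rejected
```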

9
INFERENCE

TESTING ONE PARAMETER

The most common null hypothesis is that a single parameter equals zero:

$$H_0: \beta_i = 0, \qquad \text{for example } H_0: \beta_2 = 0$$

10
INFERENCE

TESTING ONE PARAMETER

The t statistic for this test is:

$$t = \frac{\hat{\beta}_2 - 0}{\sqrt{\hat{\sigma}^2 a_{22}}}$$

Since the standard error (SE) is always positive, the sign of the statistic depends on the sign of the estimated parameter. For a given SE, the larger the estimate, the larger the value of t. If the sample estimate is far from zero, there is evidence to reject H0. Note that t measures how many standard errors $\hat{\beta}_2$ is away from zero.

To arrange the test we need to choose the significance level (we will usually work with 5%, although 1% and 10% can also be used) and obtain the critical values.

11
INFERENCE

TESTING ONE PARAMETER


Depending on the alternative hypothesis, the test can be:
§ Unilateral (one-tailed)
§ Bilateral (two-tailed)
There are two unilateral tests:

$$H_0: \beta_i = 0 \qquad\qquad H_0: \beta_i = 0$$
$$H_1: \beta_i < 0 \qquad\qquad H_1: \beta_i > 0$$

In the first one we disregard the positive population values and in the second one the negative values (economic theory helps to choose which one to use).
We look for the critical value at the chosen significance level; it determines both the critical region and the acceptance region.
Decision rule: if the t statistic falls in the critical region we reject H0; if it falls in the acceptance region we do not reject H0.

12
INFERENCE

TESTING ONE PARAMETER


Modelling L Salary by OLS
Coefficient Std.Error t-value t-prob
Constant 0.370530 0.1033 3.58 0.0004
Years educ. 0.011242 0.0093 1.20 0.2313
Years exp. 0.005227 0.0013 4.02 0.0001
Leisure -0.001406 0.0004 -2.99 0.0031

sigma 1.951358 RSS 12.0197221


R^2 0.714582 F(4,250) = 24.21 [0.000]**
log-likelihood -27.1297 DW 2.01
no. of observations 254 no. of parameters 4

H0: β2 = 0 vs H1: β2 > 0, α = 5%, t(250; 0.05) ≈ 1.64. Since t = 1.20 < 1.64, H0 is not rejected.
H0: β4 = 0 vs H1: β4 < 0, α = 5%, critical value ≈ −1.64. Since t = −2.99 < −1.64, H0 is rejected.

Years of education has no significant effect on salary, while one additional hour of leisure reduces salary by roughly 0.1%.
13
INFERENCE

TESTING ONE PARAMETER


A bilateral test is defined as:

$$H_0: \beta_i = 0 \qquad H_1: \beta_i \ne 0$$

This hypothesis is used when the sign of the parameter is not determined by economic theory (or common sense).
We look for the critical value at the chosen significance level; it determines both the acceptance region and the critical region. In a bilateral test each tail has area $\alpha/2$.
Decision rule: if the t statistic falls in the critical region we reject H0 and the variable $x_i$ is significant; if it falls in the acceptance region we do not reject H0 and the variable is not significant.

Bilateral tests are the most commonly used in Econometrics.

14
INFERENCE

TESTING ONE PARAMETER


Modelling L Grades Language 6º by OLS
Coefficient Std.Error t-value t-prob
Constant 0.77053 0.1633 4.72 0.000
Math grades 1.21327 0.5504 2.20 0.031
%Immigrants 0.52278 0.4323 1.20 0.234
%Repeating course -0.74066 0.2476 -2.99 0.004
Income 0.54089 0.1567 3.45 0.001
sigma 0.451358 RSS 12.0197221
R^2 0.645821 F(4,59) = 4.808 [0.002]**
log-likelihood -37.2974 DW 2.03
no. of observations 64 no. of parameters 5

H0: β4 = 0 vs H1: β4 ≠ 0, α = 5%, t(59; 0.025) ≈ 2. Since |t| = 2.99 > 2, H0 is rejected.
H0: β3 = 0 vs H1: β3 ≠ 0, α = 5%, t(59; 0.025) ≈ 2. Since |t| = 1.20 < 2, H0 is not rejected.

The % of immigrants has no significant effect on grades, while the % of students repeating the course does.
This refers not to specific students but to the population average.
15
INFERENCE

TESTING ONE PARAMETER


Other hypothesis tests (unilateral or bilateral):

$$H_0: \beta_i = c_i \qquad H_1: \beta_i \ne c_i$$

The t statistic and its probability distribution are:

$$t = \frac{\hat{\beta}_i - c_i}{\sqrt{\hat{\sigma}^2 a_{ii}}} \sim t_{(n-k;\,\alpha/2)}$$

Once the critical values are obtained, we can determine the critical and acceptance regions. If the value of the t statistic falls in the critical region, we reject H0 and conclude that the parameter is not equal to $c_i$. If it falls in the acceptance region, we do not reject H0 and the value of the parameter is compatible with $c_i$.

16
INFERENCE

TESTING ONE PARAMETER


Modelling L Grades Language 6º by OLS
Coefficient Std.Error t-value t-prob
Constant 0.77053 0.1633 4.72 0.000
Math grades 1.21327 0.5504 2.20 0.031
%Immigrants 0.52278 0.4323 1.20 0.234
%Repeating course -0.74066 0.2476 -2.99 0.004
Income 0.54089 0.1567 3.45 0.001
sigma 0.451358 RSS 12.0197221
R^2 0.645821 F(4,59) = 4.808 [0.002]**
log-likelihood -37.2974 DW 2.03
no. of observations 64 no. of parameters 5

H0: β2 = 1 vs H1: β2 ≠ 1, α = 5%, t(59; 0.025) ≈ 2.

$$t = \frac{1.21327 - 1}{0.5504} = 0.3874 < 2 \;\Rightarrow\; H_0 \text{ is not rejected}$$

An increase of one point in Maths implies (on average) that Language grades rise by one point.
17
INFERENCE

TESTING ONE PARAMETER

18
INFERENCE

TESTING ONE PARAMETER


Using p-values
If the p-value is small, there is evidence to reject H0, because the result obtained from the available data would have a small probability of occurring if H0 were true. Conversely, when the p-value is large there is little evidence against H0, so we do not reject it.
Modelling L Grades Language 6º by OLS
Coefficient Std.Error t-value t-prob
Constant 0.77053 0.1633 4.72 0.000
Maths grades 1.21327 0.5504 2.20 0.031
%Immigrants 0.52278 0.4323 1.20 0.234
%Repeating course -0.74066 0.2476 -2.99 0.004
Income 0.54089 0.1567 3.45 0.001

sigma 0.451358 RSS 12.0197221


R^2 0.645821 F(4,59) = 4.808 [0.002]**
log-likelihood -37.2974 DW 2.03
no. of observations 64 no. of parameters 5

For a 5% significance level we reject H0 in all cases except for %Immigrants.
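A short sketch (not part of the original slides) of how the two-sided p-value reported next to each coefficient is obtained, using the %Immigrants t-value from the table above:

```python
from scipy import stats

t_obs, n, k = 1.20, 64, 5                      # %Immigrants t-value, language-grades model
p_value = 2 * stats.t.sf(abs(t_obs), df=n - k) # two-sided p-value: P(|t(n-k)| > |t observed|)
print(round(p_value, 3))                       # about 0.23, so H0 is not rejected at 5%
```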

19
INFERENCE

TESTING ONE PARAMETER


Additional considerations:
§ We must take into account both statistical and economic significance (if a variable is statistically significant, examining the size of its coefficient gives an idea of its practical and economic importance).
§ If a variable is not statistically significant we should consider whether its expected effect on y is important. If it is, and a larger sample cannot be obtained, we can examine the p-value of its t statistic to judge how close the variable is to being significant.
§ If the sign of a coefficient is not the expected one and its t statistic is small, we can conclude that the variable is not significant. If the t statistic is large, there could be a problem and we should reconsider the model and the data we are working with.

20
INFERENCE

TESTING ONE PARAMETER


§ When accepting H0 we must bear in mind that other null hypotheses may also be compatible with the data.
§ When accepting H0 all we can claim is that, based on the available sample, there is no evidence to reject H0. We cannot say that H0 is true with complete certainty.
§ The most frequent H0 is $H_0: \beta_i = 0$, that is, testing whether there is a relation between the explanatory variable and the explained variable. Indeed, if there were no relation it would not make sense to consider further hypotheses.
§ There are no fixed rules for establishing hypotheses; the most common criterion is to rely on economic theory.
§ The tests can also be carried out by comparing the p-value with the significance level or by using confidence intervals.

21
INFERENCE

CONFIDENCE INTERVALS
Confidence intervals (interval estimation) provide an interval that contains a set of probable values for the population parameter; they are not point estimates.
To build a confidence interval we use the t distribution:

$$\Pr\left(-t_{n-k,\,\alpha/2} \le t \le t_{n-k,\,\alpha/2}\right) = 1 - \alpha$$

Substituting the t statistic:

$$\Pr\left(-t_{n-k,\,\alpha/2} \le \frac{\hat{\beta}_i - \beta_i}{\sqrt{\widehat{\mathrm{Var}}(\hat{\beta}_i)}} \le t_{n-k,\,\alpha/2}\right) = 1 - \alpha$$

$$\Pr\left(\hat{\beta}_i - t_{n-k,\,\alpha/2}\sqrt{\widehat{\mathrm{Var}}(\hat{\beta}_i)} \le \beta_i \le \hat{\beta}_i + t_{n-k,\,\alpha/2}\sqrt{\widehat{\mathrm{Var}}(\hat{\beta}_i)}\right) = 1 - \alpha$$

The compact version of the interval is:

$$\hat{\beta}_i \pm t_{n-k,\,\alpha/2}\sqrt{\widehat{\mathrm{Var}}(\hat{\beta}_i)}$$
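A minimal sketch (not part of the original slides) of the interval above, using the Years exp. coefficient from the salary model. The 5% two-sided level is an assumption for this example, which gives a multiplier of about 1.97 (the worked example on the next slide uses ≈1.64 from the t table instead).

```python
from scipy import stats

beta_hat, se = 0.005227, 0.0013               # Years exp. coefficient and its SE
n, k, alpha = 254, 4, 0.05
t_crit = stats.t.ppf(1 - alpha / 2, df=n - k) # two-sided critical value, about 1.97
lower, upper = beta_hat - t_crit * se, beta_hat + t_crit * se
print(lower, upper)                           # zero lies outside: the variable is significant
```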

22
INFERENCE

CONFIDENCE INTERVALS
Modelling L Salary by OLS
Coefficient Std.Error t-value t-prob
Constant 0.370530 0.1033 3.58 0.0004
Years educ. 0.011242 0.0093 1.20 0.2313
Years exp. 0.005227 0.0013 4.02 0.0001
Leisure -0.001406 0.0004 -2.99 0.0031

Confidence intervals (critical value ≈ 1.64 obtained from the Student's t table):

For β1: 0.3705 ± 1.64 (0.1033) ⇒ (0.20108, 0.5399): H0 is rejected
For β2: 0.0112 ± 1.64 (0.0093) ⇒ (−0.00405, 0.02645): H0 is not rejected
For β3: 0.0052 ± 1.64 (0.0013) ⇒ (0.00306, 0.00733): H0 is rejected
For β4: −0.0014 ± 1.64 (0.0004) ⇒ (−0.00205, −0.00074): H0 is rejected

These intervals allow us to test H0: βi = 0 against H1: βi ≠ 0: if zero lies inside the interval we do not reject H0; if zero lies outside we reject H0 and the variable is significant.
Years of education is not significant because zero is included in its interval. The rest of the variables are significant.

23
INFERENCE

CONFIDENCE INTERVALS FOR σ²

$$\frac{(T-k)\,\hat{\sigma}^2}{\sigma^2} \sim \chi^2_{T-k}$$

$$\Pr\left(\chi^2_{1-\alpha/2} \le \chi^2 \le \chi^2_{\alpha/2}\right) = 1 - \alpha$$

The $\chi^2$ distribution is not symmetric (unlike the normal distribution), so the two critical values are different.

$$\Pr\left(\frac{(T-k)\,\hat{\sigma}^2}{\chi^2_{\alpha/2}} \le \sigma^2 \le \frac{(T-k)\,\hat{\sigma}^2}{\chi^2_{1-\alpha/2}}\right) = 1 - \alpha$$
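A minimal sketch (not part of the original slides) of this interval as a reusable function. The inputs in the final line are purely illustrative and do not reproduce the worked example on the next slide, which uses tabulated critical values.

```python
from scipy import stats

def variance_ci(sigma_hat2, T, k, alpha=0.05):
    """Confidence interval for sigma^2 from (T-k)*sigma_hat^2/sigma^2 ~ chi2(T-k)."""
    df = T - k
    chi_lo = stats.chi2.ppf(alpha / 2, df)        # lower-tail critical value
    chi_hi = stats.chi2.ppf(1 - alpha / 2, df)    # upper-tail critical value
    return df * sigma_hat2 / chi_hi, df * sigma_hat2 / chi_lo

print(variance_ci(sigma_hat2=2.5, T=100, k=4))    # purely illustrative inputs
```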

24
INFERENCE

CONFIDENCE INTERVALS FOR σ²

We are going to calculate the confidence interval for the population variance with a 5% significance level and test $H_0: \sigma^2 = 10$, where the tabulated critical values are $\chi^2_{0.975} \approx 74.27$ and $\chi^2_{0.025} \approx 129.561$.

Modelling L Salary by OLS
Coefficient Std.Error t-value t-prob
Constant 0.370530 0.1033 3.58 0.0004
Years educ. 0.011242 0.0093 1.20 0.2313
Years exp. 0.005227 0.0013 4.02 0.0001
Leisure -0.001406 0.0004 -2.99 0.0031

sigma 1.951358 RSS 12.0197221
R^2 0.714582 F(4,250) = 24.21 [0.000]**
no. of observations 254 no. of parameters 4

With T − k = 254 − 4 = 250 and $\hat{\sigma}^2 = 1.951358^2 = 3.8077$:

$$\Pr\left(\frac{(254-4)\,3.8077}{129.56} \le \sigma^2 \le \frac{(254-4)\,3.8077}{74.27}\right) = 0.95$$

$$\Pr\left(7.3473 \le \sigma^2 \le 12.8170\right) = 0.95$$

Since 10 (the value stated in H0) lies inside the interval, we accept $H_0: \sigma^2 = 10$.

25
INFERENCE

LINEAR COMBINATION OF PARAMETERS TEST

This test allows us to test hypotheses involving more than one population parameter. The methodology is similar to what we have already seen.

$$H_0: \beta_i = \beta_j \;\Rightarrow\; H_0: \beta_i - \beta_j = 0 \qquad H_1: \beta_i > \beta_j$$

$$t = \frac{\hat{\beta}_i - \hat{\beta}_j}{\sqrt{\widehat{\mathrm{Var}}(\hat{\beta}_i - \hat{\beta}_j)}} \sim t_{n-k,\,\alpha}$$

To carry out this test we can proceed in two different ways:
§ Estimate a restricted model that includes one variable defined as the difference between $x_i$ and $x_j$, and then perform an individual significance test on its parameter.
§ Obtain the standard error from the variance of the difference of the two coefficients:

$$\mathrm{Var}(\hat{\beta}_i - \hat{\beta}_j) = \mathrm{Var}(\hat{\beta}_i) + \mathrm{Var}(\hat{\beta}_j) - 2\,\mathrm{cov}(\hat{\beta}_i, \hat{\beta}_j)$$
26
INFERENCE

LINEAR COMBINATION OF PARAMETERS TEST


Modelling LPrice by OLS
Coefficient Std.Error t-value t-prob
Constant 10.9710 0.2827 38.8 0.0000
m^2 0.171680 0.07434 2.31 0.0603
Bedrooms. 0.019962 0.007269 2.74 0.0578
Bathrooms 0.016375 0.004111 3.98 0.0073
LPrice = + 10.97 + 0.1717*m^2 + 0.019*Bedrooms + 0.016375*Bathrooms
(SE) (0.283) (0.0743) (0.00727) (0.004111)
N = 100; cov(β̂3, β̂4) = 0.00001; t(96; 0.05) ≈ 1.65
Test whether one more bedroom has the same effect on the price of a house as one more bathroom, or whether its effect is greater.

$$H_0: \beta_3 = \beta_4 \;\Rightarrow\; H_0: \beta_3 - \beta_4 = 0 \qquad H_1: \beta_3 > \beta_4$$

$$\mathrm{Var}(\hat{\beta}_3 - \hat{\beta}_4) = 0.007269^2 + 0.004111^2 - 2(0.00001) = 0.000049$$

$$t = \frac{\hat{\beta}_3 - \hat{\beta}_4}{\sqrt{\widehat{\mathrm{Var}}(\hat{\beta}_3 - \hat{\beta}_4)}} = \frac{0.003587}{\sqrt{0.000049}} = 0.5124 < 1.65 \;\Rightarrow\; H_0 \text{ is not rejected.}$$
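A short sketch (not part of the original slides) reproducing the computation above with the house-price figures:

```python
import math

b3, b4 = 0.019962, 0.016375            # Bedrooms and Bathrooms coefficients
se3, se4 = 0.007269, 0.004111          # their standard errors
cov34 = 0.00001                        # their covariance
var_diff = se3**2 + se4**2 - 2 * cov34
t_stat = (b3 - b4) / math.sqrt(var_diff)
print(t_stat)                          # about 0.51 < 1.65, so H0 is not rejected
```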

27
INFERENCE

LINEAR RESTRICTION TEST


A linear hypothesis test can be formulated as:

$$H_0: R\beta = r \qquad H_1: R\beta \ne r$$

where R is a matrix of dimension q × k (rows = number of hypotheses, columns = number of parameters) and r is a vector of dimension q.

If $\hat{\beta} \sim N\left(\beta, \sigma^2 (X'X)^{-1}\right)$, then $R\hat{\beta} \sim N\left(R\beta, \sigma^2 R (X'X)^{-1} R'\right)$.

Therefore, if H0 is true,

$$\frac{(R\hat{\beta} - r)'\left[R (X'X)^{-1} R'\right]^{-1}(R\hat{\beta} - r)}{\sigma^2} \sim \chi^2_q,$$

and replacing the unknown $\sigma^2$ by its estimator $\hat{\sigma}^2$ leads to the F statistic shown on the next slide.

28
INFERENCE

LINEAR RESTRICTION TEST

$$F = \frac{(R\hat{\beta} - r)'\left[R (X'X)^{-1} R'\right]^{-1}(R\hat{\beta} - r)\,/\,q}{\hat{u}'\hat{u}\,/\,(T-k)} \sim F_{q,\,T-k}$$

For the joint significance test of all slope coefficients (q = k − 1), the statistic can also be written in terms of $R^2$:

$$F = \frac{R^2/(k-1)}{(1-R^2)/(T-k)} \sim F_{k-1,\,T-k}$$

(here T = N, the number of observations)
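A short sketch (not part of the original slides) of the R²-based form, using the house-price output that appears on the next slide:

```python
R2, T, k = 0.882146, 100, 4                  # house-price model output
F = (R2 / (k - 1)) / ((1 - R2) / (T - k))
print(F)                                     # about 239.5, matching PcGive's F(3,96)
```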

29
INFERENCE

LINEAR RESTRICTION TEST


Modelling LPrice by OLS
Coefficient Std.Error t-value t-prob
Constant 10.9710 0.2827 38.8 0.0000
m^2 0.171680 0.07434 2.31 0.0603
Bedrooms 0.019962 0.007269 2.74 0.0578
Bathrooms 0.016375 0.004111 3.98 0.0073

sigma 0.0901387 RSS 0.048749954


R^2 0.882146 F(3,96) = 239.52 [0.000]**
log-likelihood 12.4288 DW 1.74
no. of observations 100 no. of parameters 4

$$H_0: \beta_2 = \beta_3 = \beta_4 = 0 \qquad H_1: \text{at least one } \beta_i \ne 0$$

$$R = \begin{pmatrix} 0 & 1 & 0 & 0 \\ 0 & 0 & 1 & 0 \\ 0 & 0 & 0 & 1 \end{pmatrix}; \qquad r = \begin{pmatrix} 0 \\ 0 \\ 0 \end{pmatrix}$$

Since F = 239.52 > 2.68 = F(3, 96; 0.05), we reject H0.

30
INFERENCE

LINEAR RESTRICTION TEST


§Modelling LPrice by OLS
Coefficient Std.Error t-value t-prob
Constant 10.9710 0.2827 38.8 0.0000
m^2 0.17168 0.0743 2.31 0.0603
Bedrooms 0.01996 0.0072 2.74 0.0578
Bathrooms 0.01637 0.00411 3.98 0.0073

Test whether one more bedroom in a house has the same effect on the price as an additional bathroom, and also test whether m² is significant.

$$H_0: \begin{cases} \beta_3 = \beta_4 \\ \beta_2 = 0 \end{cases} \qquad H_1: \begin{cases} \beta_3 \ne \beta_4 \\ \beta_2 \ne 0 \end{cases}$$

PcGive output:
Test for linear restrictions (Rb=r):
R matrix
Constant   m^2   bathrooms   bedrooms
0.00000  0.00000   1.0000    -1.0000
0.00000  1.0000    0.00000    0.00000
r vector
0.00000  0.00000
LinRes F(2,96) = 127.43 [0.000]

We reject H0.

31
INFERENCE

LINEAR RESTRICTION TEST

$$H_0: \beta^* = 0 \qquad H_1: \beta^* \ne 0$$

Notes: if the RSS of the unrestricted model is much smaller than that of the restricted model (which excludes the variable, e.g. x3), the excluded variable was relevant; if both RSS are roughly equal, it was not. Recall that $R^2 = ESS/TSS$ and $TSS = ESS + RSS$.

32
INFERENCE

LINEAR RESTRICTION TEST


§ Significance test using the residual sum of squares.
If H0 is true, the variables under consideration are not relevant to explain the behaviour of y, and accordingly the associated parameters are not statistically significant.

If those variables are excluded from the model, the remaining coefficients should not change much with respect to the original model with all the variables, and the RSS and the restricted RSS (RRSS) should be very similar. Hence the value of the statistic will be smaller than the critical value and we will not reject H0.

Conversely, if the RSS and the RRSS differ substantially, we reject H0 and the variables are statistically significant.
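A minimal sketch (not part of the original slides) of this comparison, assuming the standard exclusion F statistic F = ((RRSS − RSS)/q) / (RSS/(T − k)); the inputs in the final line are purely illustrative.

```python
def exclusion_f(rrss, rss, q, T, k):
    """F statistic for excluding q variables, from restricted (RRSS) and unrestricted (RSS) fits."""
    return ((rrss - rss) / q) / (rss / (T - k))

print(exclusion_f(rrss=0.25, rss=0.10, q=2, T=100, k=4))   # purely illustrative inputs
```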

33
INFERENCE

LINEAR RESTRICTION TEST

Modelling LPrice by OLS


Coefficient Std.Error t-value t-prob
Constant 10.9710 0.2827 38.8 0.0000
m^2 0.171680 0.0743 2.31 0.0603
Bedrooms 0.019962 0.0072 2.74 0.0578
Bathrooms 0.016375 0.0041 3.98 0.0073

Test whether the number of bathrooms and the number of bedrooms have no influence on the price.

$$H_0: \beta_3 = \beta_4 = 0 \qquad H_1: \beta_3 \ne 0 \text{ and/or } \beta_4 \ne 0$$

PcGive output:
Test for excluding:
[0] = bathrooms
[1] = bedrooms
Subset F(2,96) = 94.21 [0.000]

We reject H0.

34
INFERENCE

CONFIDENCE REGIONS

If we look for a range of values for several coefficients, then we are not
considering an interval, but a region and we will work with a F distribution.

$$F = \frac{(R\hat{\beta} - r)'\left[R (X'X)^{-1} R'\right]^{-1}(R\hat{\beta} - r)\,/\,q}{\hat{u}'\hat{u}\,/\,(T-k)} \sim F_{q,\,T-k}$$

The R matrix selects the coefficients we will use to build up our confidence
region. Thus, the F statistic is:
$$F = \frac{(\hat{\beta}_2 - \beta_2)'\left[\left(X'X\right)_2^{-1}\right]^{-1}(\hat{\beta}_2 - \beta_2)\,/\,q}{\hat{\sigma}^2} \sim F_{q,\,T-k}$$
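A minimal sketch (not part of the original slides): checking whether a hypothesised pair of coefficients lies inside the joint confidence region amounts to evaluating the quadratic form above, with the covariance block playing the role of $\hat{\sigma}^2[(X'X)_2]^{-1}$, and comparing it with the F critical value. All inputs below are hypothetical.

```python
import numpy as np
from scipy import stats

b_hat = np.array([0.20, -0.10])                  # estimated coefficients of interest
V = np.array([[0.004, 0.001],
              [0.001, 0.002]])                   # their estimated covariance block
b0 = np.array([0.0, 0.0])                        # hypothesised values
q, T, k, alpha = 2, 100, 4, 0.05

diff = b_hat - b0
W = diff @ np.linalg.inv(V) @ diff / q           # F-type statistic
crit = stats.f.ppf(1 - alpha, q, T - k)
print(W, crit, W <= crit)                        # the pair lies inside the region iff W <= crit
```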

35
INFERENCE

CONFIDENCE REGIONS
If $\lambda_\alpha$ is the value from the $F_{q,T-k}$ tables for a significance level of 5%:

$$\Pr\left((\hat{\beta}_2 - \beta_2)'\left[\left(X'X\right)_2^{-1}\right]^{-1}(\hat{\beta}_2 - \beta_2) \le \lambda_\alpha\, q\, \hat{\sigma}^2\right) = 1 - \alpha$$

This confidence region is defined by a quadratic form in as many dimensions as coefficients considered (an ellipsoid).

These regions are mostly used for just two coefficients; in that case the confidence region is an ellipse centred on the OLS estimates.

The greater the accuracy of the estimates, the smaller the variances and therefore the smaller the ellipse.
The tilt of the ellipse depends on the covariance of the estimators: if the covariance is positive, the ellipse slopes upward from left to right, and vice versa.

36
INFERENCE

STRUCTURAL CHANGE TEST


Structural change appears when the structure of the model changes at a certain moment in time. We apply the Chow test to determine whether this change is large enough to warrant different coefficients in each sub-period.

In that sense we must consider:


§ The moment when we suspect there has been a change, which splits our sample (T) into two independent subsets (T1 and T2).
§ Formulate and estimate the restricted model for the whole sample and calculate its RSS:
$$y_t = x_t'\beta + u_t, \qquad t = 1, \dots, T \;\rightarrow\; RSS$$
§ Formulate and estimate the two unrestricted models and calculate RSS1 and RSS2:
$$y_t = x_t'\beta_1 + u_t, \qquad t = 1, \dots, T_1 \;\rightarrow\; RSS_1$$
$$y_t = x_t'\beta_2 + u_t, \qquad t = T_1 + 1, \dots, T \;\rightarrow\; RSS_2$$

37
INFERENCE

STRUCTURAL CHANGE TEST

$$H_0: \beta_1 = \beta_2 = \beta \qquad H_1: \beta_1 \ne \beta_2$$

Under H0 the Chow statistic, in its standard form, compares the restricted and unrestricted residual sums of squares:

$$F = \frac{\left(RSS - (RSS_1 + RSS_2)\right)/k}{(RSS_1 + RSS_2)/(T - 2k)} \sim F_{k,\,T-2k}$$
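A minimal sketch (not part of the original slides) of this statistic as a function, assuming the standard Chow form stated above; the inputs in the final lines are purely illustrative and do not reproduce the example on the next slide.

```python
from scipy import stats

def chow_f(rss, rss1, rss2, T, k):
    """Chow statistic: F = ((RSS - (RSS1+RSS2))/k) / ((RSS1+RSS2)/(T-2k))."""
    rss_u = rss1 + rss2
    return ((rss - rss_u) / k) / (rss_u / (T - 2 * k))

F = chow_f(rss=0.30, rss1=0.13, rss2=0.14, T=80, k=4)   # purely illustrative inputs
print(F, stats.f.ppf(0.95, 4, 80 - 2 * 4))              # reject H0 if F exceeds the critical value
```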

38
INFERENCE

STRUCTURAL CHANGE TEST


Example:
A consumption function is estimated using income, production and inflation over 1990-2017 (quarterly data). We test whether there is a structural change in 2000, knowing that RSS = 0.0017, RSS1 = 0.0006419 and RSS2 = 0.00070632.

The test is:
$$H_0: \beta_1 = \beta_2 = \beta \qquad H_1: \beta_1 \ne \beta_2$$

The F statistic is F = 0.57609, and the critical value at the 5% significance level is F(4, 60) = 2.53.

As F = 0.57609 < 2.53, we accept H0, which implies that there is no structural change and we can use a single model for the whole 1990-2017 period.

39
INFERENCE

CONCLUSIONS

§ Hypothesis tests are used to answer the following question: is a specific circumstance compatible with the hypothesis under consideration?
§ To answer the previous question we can work with confidence intervals or with significance tests (individual or joint).
§ Finding one or more coefficients that aren’t significant doesn’t mean that
all coefficients are simultaneously non-significant.
§ The F test works for several kinds of hypotheses: whether a coefficient is significant, whether two or more coefficients are equal, whether the coefficients satisfy linear restrictions, whether there is a structural change in the model, etc.
§ If the model is validated it can be used for predictive purposes, situation
analysis, economic policy review…

40
INFERENCE

REFERENCES

• Gujarati, D.N. (2010). Basic Econometrics. McGraw-Hill International.

• Greene, W. H. (2006). Econometric Analysis. Prentice-Hall.

• Wooldridge, J.M. (2006). Introductory Econometrics. Thomson.

41
