MA1
Aim:
Generate 1000 data points from the MA(1) process Xt = et − et−1, where et is Gaussian white noise WN(0, 1). Treating the generated data as a sample from an ARMA model, identify its order using the sample correlogram and the sample partial correlogram.
Theory :
Auto Regressive Integrated Moving Average (ARIMA):
This model is a forecasting technique that projects future values of a series based entirely on its own inertia. Its main application is in short-term forecasting, which requires at least 40 historical data points. It works best when the data exhibit a stable or consistent pattern over time with a minimum number of outliers.
ARIMA is usually superior to exponential smoothing techniques when the data set is reasonably large and there is significant correlation between past observations.
ARIMA methodology attempts to describe the movements in a stationary time series as a function of what are called 'autoregressive' and 'moving average' parameters. These are referred to as AR parameters (autoregressive) and MA parameters (moving average).
A correlogram is an aid to interpreting a set of ACF (autocorrelation function) values, in which the sample autocorrelations are plotted against lag h.
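For the specific process in this experiment, Xt = et − et−1 (an MA(1) with coefficient θ = −1), the theoretical autocorrelations are known in closed form, and it is this pattern the sample correlogram is compared against. A minimal Python sketch (Python is used here purely for illustration; the experiment itself uses SPSS):

```python
# Theoretical ACF of an MA(1) process X_t = e_t + theta * e_{t-1}.
# For X_t = e_t - e_{t-1} we have theta = -1.
theta = -1.0
rho1 = theta / (1 + theta ** 2)  # lag-1 autocorrelation = -0.5
# For an MA(1) process, autocorrelations at all lags k >= 2 are exactly zero,
# which is why the sample ACF "cutting off" after lag 1 identifies the order.
print(rho1)
```

The corresponding identification rule: an ACF that cuts off after lag q, combined with a gradually decaying PACF, points to an MA(q) model.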
Steps:
1. Create a series of 1 to 1000 in an Excel sheet and copy the column into SPSS. Rename it observation.
2. Transform -> Compute Variable -> Target Variable: e -> Function Group: Random Numbers -> choose Rv.Normal -> Parameters: 0, 1 -> OK.
3. Transform -> Compute Variable -> Target Variable: X -> Numeric Expression: e − LAG(e, 1) -> OK.
4. Analyze -> Forecasting -> Autocorrelations -> Variables: X -> Options -> Maximum Number of Lags: 15 -> choose Bartlett's approximation -> Continue -> Display ACF and PACF -> OK.
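The SPSS steps above can be sketched equivalently in Python (the variable names, the fixed seed, and the hand-rolled ACF routine are assumptions for illustration, not part of the SPSS procedure):

```python
import math
import random

random.seed(0)                                    # fixed seed for reproducibility (assumption)
n = 1000
e = [random.gauss(0, 1) for _ in range(n + 1)]    # Gaussian WN(0, 1)
x = [e[t] - e[t - 1] for t in range(1, n + 1)]    # X_t = e_t - e_{t-1}

def sample_acf(series, nlags):
    """Sample autocorrelations r_1 .. r_nlags about the sample mean."""
    m = sum(series) / len(series)
    s = [v - m for v in series]
    denom = sum(v * v for v in s)
    return [sum(s[t] * s[t - k] for t in range(k, len(s))) / denom
            for k in range(1, nlags + 1)]

r = sample_acf(x, 15)
print([round(v, 3) for v in r])
```

With n = 1000 the sample lag-1 autocorrelation should land near the theoretical value of −0.5, while the higher lags stay inside the approximate ±2/√n ≈ ±0.063 significance bounds.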
Output:
Table 1:
Case Processing Summary
X
Series Length 1000
Number of Missing Values User-Missing 0
System-Missing 0
Number of Valid Values 1000
Number of Computable First Lags 999
Table 2:
Autocorrelations
Series: X

Lag   Autocorrelation   Std. Error(a)   Box-Ljung Value   df   Sig.(b)
1 .018 .032 .309 1 .578
2 -.025 .032 .944 2 .624
3 .001 .032 .944 3 .815
4 .053 .032 3.780 4 .437
5 .011 .032 3.903 5 .563
6 .007 .032 3.954 6 .683
7 -.050 .032 6.465 7 .487
8 .043 .032 8.375 8 .398
9 -.020 .032 8.800 9 .456
10 -.003 .032 8.807 10 .550
11 .036 .032 10.156 11 .516
12 .011 .032 10.285 12 .591
13 .002 .032 10.289 13 .670
14 -.024 .032 10.860 14 .697
15 -.030 .032 11.788 15 .695
a. The underlying process assumed is MA with order equal to the lag number minus one; the Bartlett approximation is used.
b. Based on the asymptotic chi-square approximation.
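The Std. Error and Box-Ljung columns of Table 2 can be checked by hand: under the white-noise null the Bartlett standard error is approximately 1/√n, and the Box-Ljung statistic up to lag h is Q(h) = n(n + 2) Σ r_k²/(n − k). A Python sketch using the rounded autocorrelations printed in Table 2 (the rounding means the result only approximately matches the printed .944 at lag 3):

```python
import math

n = 1000
# Bartlett standard error under the white-noise assumption: 1/sqrt(n).
se = 1 / math.sqrt(n)  # ~0.0316, reported as .032 in Table 2

# Box-Ljung statistic up to lag 3, with r_1..r_3 taken (rounded) from Table 2.
r = [0.018, -0.025, 0.001]
q3 = n * (n + 2) * sum(rk ** 2 / (n - k) for k, rk in enumerate(r, start=1))
print(round(se, 4), round(q3, 3))
```

The non-significant Sig. values at every lag indicate that none of these sample autocorrelations differ significantly from zero under the stated null.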
Figure 1: ACF plot (correlogram) of series X.
Table 3:
Partial Autocorrelations
Series: X

Lag   Partial Autocorrelation   Std. Error
1 .018 .032
2 -.025 .032
3 .002 .032
4 .052 .032
5 .009 .032
6 .009 .032
7 -.050 .032
8 .043 .032
9 -.026 .032
10 .000 .032
11 .041 .032
12 .006 .032
13 .006 .032
14 -.027 .032
15 -.028 .032
Graph 2: PACF plot (partial correlogram) of series X.
Conclusion:
For the MA(1) process Xt = et − et−1, the theoretical ACF cuts off after lag 1 (with ρ(1) = −0.5) while the PACF tails off gradually; this signature in the sample correlogram and partial correlogram is what identifies the order as MA(1). A sample autocorrelation is judged significant when it falls outside roughly ±2 standard errors, i.e. ±2(0.032) ≈ ±0.063 for this series.