Section 5 Flashcards

1
Q

What are autocorrelated errors?

A

When the regression errors are correlated with one another across observations (in time series, across time periods)

2
Q

What makes a time series variable autocorrelated?

A

When it is correlated with itself at different points in time

3
Q

What is a first-order autoregressive process (AR(1))?

A

When a variable is a linear function of its own value in the previous period, plus a random shock (see notes on P1 example regarding AR(1))

4
Q

Give an example of an AR(1) process? What is alpha?

A

X(t) = αX(t-1) + ε(t)

where α is a parameter called the autocorrelation coefficient, with -1 < α < 1

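The AR(1) equation above is easy to simulate. A minimal Python sketch (the function name, seed, and standard-normal shocks are illustrative assumptions, not from the notes):

```python
import random

def simulate_ar1(alpha, n, seed=0):
    """Generate n observations of X(t) = alpha * X(t-1) + eps(t),
    with eps(t) drawn as independent standard-normal shocks."""
    rng = random.Random(seed)
    x = [0.0]  # start the process at zero
    for _ in range(n - 1):
        x.append(alpha * x[-1] + rng.gauss(0, 1))
    return x

series = simulate_ar1(alpha=0.8, n=500)

# Crude check that adjacent values move together for alpha > 0:
# the sample lag-1 covariance should come out positive.
mean = sum(series) / len(series)
pairs = list(zip(series[:-1], series[1:]))
cov1 = sum((a - mean) * (b - mean) for a, b in pairs) / len(pairs)
print(cov1 > 0)
```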
5
Q

Prove the equation for the variance of X(t) in an AR(1) process? What does this proof assume?

A

See notes. Sketch: take variances of X(t) = αX(t-1) + ε(t); since ε(t) is uncorrelated with X(t-1), this gives Var(X(t)) = α²Var(X(t-1)) + σ_ε², and homoskedasticity (Var(X(t)) = Var(X(t-1))) then gives Var(X(t)) = σ_ε²/(1 - α²)

Assumes that X(t) is homoskedastic

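The standard stationary AR(1) result Var(X(t)) = σ_ε²/(1 - α²) can be checked numerically. A sketch (seed, burn-in length, and sample size are arbitrary choices):

```python
import random

# Numerical check of Var(X(t)) = sigma_eps^2 / (1 - alpha^2);
# with unit-variance shocks and alpha = 0.8 the theoretical
# value is 1 / (1 - 0.64) = 2.777...
alpha, n, burn = 0.8, 20000, 500
rng = random.Random(1)
x, xs = 0.0, []
for t in range(n + burn):
    x = alpha * x + rng.gauss(0, 1)
    if t >= burn:              # drop the burn-in so X(t) is ~stationary
        xs.append(x)

mean = sum(xs) / len(xs)
sample_var = sum((v - mean) ** 2 for v in xs) / len(xs)
theory = 1 / (1 - alpha ** 2)
print(round(sample_var, 2), round(theory, 2))
```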
6
Q

By comparing the covariance between the variable and its first lag, show that there is a non-zero first-order autocorrelation?

A

See notes. Sketch: Cov(X(t), X(t-1)) = Cov(αX(t-1) + ε(t), X(t-1)) = αVar(X(t-1)), so the first-order autocorrelation equals α, which is non-zero whenever α ≠ 0

7
Q

How do we know that as we look at autocorrelations further into the past, the correlation with current Xt decreases?

A

Since the absolute value of α is less than 1, α^j → 0 as j → ∞. The correlation between X(t) and X(t-j) is α^j, so the correlation with the current X(t) must decrease as we look further into the past

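This geometric decay shows up clearly in sample autocorrelations. A Python sketch (parameters and seed are illustrative):

```python
import random

# For a stationary AR(1), the autocorrelation at lag j is alpha^j,
# so it shrinks geometrically as j grows. Simulate and check:
alpha, n, burn = 0.8, 20000, 500
rng = random.Random(2)
x, xs = 0.0, []
for t in range(n + burn):
    x = alpha * x + rng.gauss(0, 1)
    if t >= burn:
        xs.append(x)

mean = sum(xs) / len(xs)
var = sum((v - mean) ** 2 for v in xs) / len(xs)

def acf(lag):
    """Sample autocorrelation of xs at the given lag."""
    cov = sum((xs[t] - mean) * (xs[t - lag] - mean)
              for t in range(lag, len(xs))) / len(xs)
    return cov / var

for j in (1, 2, 4, 8):
    print(j, round(acf(j), 2))   # roughly 0.8, 0.64, 0.41, 0.17
```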
8
Q

See

A

Bits in notes S5 saying to see bits in lectures

9
Q

See and read 4.2 autocorrelated regression errors

A

Now, read the whole section

10
Q

Why, and how, can errors be written as an AR process?

A

The error is now correlated with itself across time periods, so it can itself be modelled as an AR(1) process.

Written as: ε(t) = ρε(t-1) + u(t)

where ρ is the autocorrelation coefficient

11
Q

Given we want to estimate a multiple regression model (MRM) whose error process can be written as AR(1): if we ignore the autocorrelation in the errors and estimate the β parameters anyway, what are the consequences? (2) And what is the solution?

A
  • OLS estimators are still unbiased (the no-autocorrelation assumption is not used when proving unbiasedness)
  • When errors are AR(1), the usual equations for the variances of the OLS estimators are wrong. There is a corrected variance equation in notes, BUT even then OLS is not the best estimator, since it does not have the smallest variance; therefore use GLS
12
Q

What are the informal and formal ways of testing for autocorrelated errors?

A

Informal: plot the residuals and visually inspect them for any correlation

Formal: the Durbin-Watson test

13
Q

Briefly describe what the Durbin-Watson test does?

A

Tests for first-order AC (only), assuming the error term can be written as ε(t) = ρε(t-1) + u(t)
It then tests the hypotheses:
H0: ρ = 0
H1: ρ ≠ 0

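The DW statistic itself is simple to compute from the residuals. A sketch (the simulated residual series and ρ = 0.7 are illustrative assumptions):

```python
import random

def durbin_watson(e):
    """d = sum_{t=2..T} (e_t - e_{t-1})^2 / sum_{t=1..T} e_t^2.
    d lies in [0, 4]: near 2 suggests no first-order AC, near 0
    positive AC, near 4 negative AC (d is roughly 2 * (1 - rho_hat))."""
    num = sum((e[t] - e[t - 1]) ** 2 for t in range(1, len(e)))
    den = sum(v ** 2 for v in e)
    return num / den

# Residuals with strong positive first-order AC (rho = 0.7) should
# give d well below 2, roughly 2 * (1 - 0.7) = 0.6.
rng = random.Random(3)
e, res = 0.0, []
for _ in range(2000):
    e = 0.7 * e + rng.gauss(0, 1)
    res.append(e)
print(round(durbin_watson(res), 2))
```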
14
Q

See

A

The Durbin-Watson equations, and the number-line diagram of the critical values and inconclusive regions

15
Q

What values can the DW test statistic take?

A

Anywhere between 0 and 4 (a value near 2 suggests no first-order autocorrelation)

16
Q

3 drawbacks of the Durbin-Watson test, and a solution to the third problem?

A
  • 2 inconclusive regions in the distribution
  • only tests for first-order AC
  • the test is invalid if one regressor in the model is Y(t-1), the lagged dependent variable (solution: use Durbin's h test)
17
Q

Explain durbin’s h test?

A

Used when Y(t-1) is present as a regressor. The hypotheses are the same as before. The test statistic is compared against the standard normal distribution (see equations)

18
Q

See

A

h stat note bottom of side 2 page 1

19
Q

What is the idea behind GLS estimation?

A

Apply OLS to a version of the model that has been transformed so that its errors are no longer autocorrelated

20
Q

Explain the method of GLS estimation?

A

Given a simple RM whose error is autocorrelated such that ε(t) = ρε(t-1) + u(t):
1) Lag the model by one period and multiply it through by ρ
2) Subtract this lagged model from the original to give the quasi-differenced model:
Y(t) - ρY(t-1) = α1(1 - ρ) + β2(X(t) - ρX(t-1)) + u(t) (see notes)

Here u(t) satisfies all of the classical assumptions, so the normal OLS estimators can be used on the transformed model (the method can also be used on a MRM)
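The quasi-differencing step can be sketched in Python for a known ρ (the true parameter values, seed, and sample size below are illustrative assumptions):

```python
import random

# GLS by quasi-differencing with a KNOWN rho: given
# Y(t) = a + b*X(t) + eps(t), eps(t) = rho*eps(t-1) + u(t),
# the transformed model in (Y(t) - rho*Y(t-1)) and (X(t) - rho*X(t-1))
# has well-behaved error u(t), so plain OLS recovers b.
a, b, rho = 1.0, 2.0, 0.7
rng = random.Random(4)

X, Y, eps = [], [], 0.0
for _ in range(3000):
    eps = rho * eps + rng.gauss(0, 1)      # AR(1) error
    x = rng.gauss(0, 1)
    X.append(x)
    Y.append(a + b * x + eps)

# Quasi-differenced series (the first observation is dropped).
Xs = [X[t] - rho * X[t - 1] for t in range(1, len(X))]
Ys = [Y[t] - rho * Y[t - 1] for t in range(1, len(Y))]

def ols_slope(x, y):
    """Slope of a simple OLS regression of y on x (with intercept)."""
    mx, my = sum(x) / len(x), sum(y) / len(y)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    sxx = sum((xi - mx) ** 2 for xi in x)
    return sxy / sxx

print(round(ols_slope(Xs, Ys), 2))   # close to the true b = 2.0
```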

21
Q

When is the Cochrane iterative procedure used? Explain the Cochrane iterative procedure?

A

Used when ρ is unknown (an extension of GLS)
METHOD:
1) Estimate the parameters of the original model by OLS
2) Using information from the residuals, estimate ρ from:
ε(t) = ρε(t-1) + u(t) to give ρ(hat)
3) Use ρ(hat) to form the quasi-differences (see form in notes!)
4) Use OLS to estimate the parameters of the transformed model (a): Y(t) = α1 + α2X(t) + u(t), written in the quasi-differenced variables
5) Use the residuals of (a) to estimate ρ again, denoted ρ(hat-hat)
6) Repeat steps 3-5 until the estimates of ρ converge
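The iteration can be sketched end to end in Python (a minimal illustration, assuming a simple regression with AR(1) errors; function names, true parameter values, and the convergence tolerance are assumptions):

```python
import random

def ols(x, y):
    """Intercept and slope of a simple OLS regression of y on x."""
    mx, my = sum(x) / len(x), sum(y) / len(y)
    slope = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
             / sum((xi - mx) ** 2 for xi in x))
    return my - slope * mx, slope

def cochrane_orcutt(X, Y, tol=1e-6, max_iter=50):
    """Iterate: quasi-difference with the current rho estimate,
    re-fit by OLS, re-estimate rho from the residuals, repeat."""
    rho = 0.0                               # first pass = plain OLS
    a = b = 0.0
    for _ in range(max_iter):
        # Steps 3-4: quasi-difference, then estimate a and b by OLS.
        Xs = [X[t] - rho * X[t - 1] for t in range(1, len(X))]
        Ys = [Y[t] - rho * Y[t - 1] for t in range(1, len(Y))]
        a_star, b = ols(Xs, Ys)
        a = a_star / (1 - rho)              # recover original intercept
        # Step 5: residuals of the ORIGINAL model, then re-estimate rho
        # from e(t) = rho * e(t-1) + u(t) (regression through origin).
        e = [yi - a - b * xi for xi, yi in zip(X, Y)]
        rho_new = (sum(e[t] * e[t - 1] for t in range(1, len(e)))
                   / sum(v ** 2 for v in e[:-1]))
        if abs(rho_new - rho) < tol:        # step 6: stop on convergence
            return a, b, rho_new
        rho = rho_new
    return a, b, rho

# Simulated data with true a = 1, b = 2, rho = 0.7:
rng = random.Random(5)
X, Y, eps = [], [], 0.0
for _ in range(3000):
    eps = 0.7 * eps + rng.gauss(0, 1)
    x = rng.gauss(0, 1)
    X.append(x)
    Y.append(1.0 + 2.0 * x + eps)

a_hat, b_hat, rho_hat = cochrane_orcutt(X, Y)
print(round(b_hat, 2), round(rho_hat, 2))
```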

22
Q

What is artificial autocorrelation?

A

See notes 5.6