Kume section Flashcards

1
Q

When is a stochastic process weakly stationary?

A

If and only if:

  1. E(yt) = µ for all t
  2. var(yt) = E[(yt − µ)^2] = σ^2_y < ∞ for all t
  3. cov(yt, yt−s) = E[(yt − µ)(yt−s − µ)] = γs for all t and s, i.e. depending only on the lag s
2
Q

Give the covariance formula

A

Cov(X, Y) = Corr(X, Y) * sqrt(Var(X)) * sqrt(Var(Y))
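A quick numerical check of this identity (a minimal sketch in Python; the two series below are arbitrary illustrative values):

import numpy as np

# arbitrary illustrative data
x = np.array([1.0, 2.0, 4.0, 3.0, 5.0])
y = np.array([2.0, 1.0, 3.0, 5.0, 4.0])

cov_xy = np.cov(x, y, ddof=1)[0, 1]        # sample covariance
corr_xy = np.corrcoef(x, y)[0, 1]          # sample correlation
sd_x = np.sqrt(np.var(x, ddof=1))
sd_y = np.sqrt(np.var(y, ddof=1))
print(cov_xy, corr_xy * sd_x * sd_y)       # the two numbers agree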

3
Q

How is the autocorrelation between time yt and yt-s found?

A

ρs = γs / γ0

= corr(yt, yt−s) = cov(yt, yt−s) / (√γ0 * √γ0), since var(yt) = var(yt−s) = γ0 under stationarity

4
Q

Give the formula for sample autocorrelation

A

ρ̂s = γ̂s / γ̂0 = rs

= [sum(t=s+1 to T) (yt − ȳ)(yt−s − ȳ)] / [sum(t=1 to T) (yt − ȳ)^2]
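A minimal sketch of this estimator in Python (the function name sample_acf and the white-noise test series are my own illustration):

import numpy as np

def sample_acf(y, max_lag):
    # rho_hat_s = sum_{t=s+1}^{T} (y_t - ybar)(y_{t-s} - ybar) / sum_{t=1}^{T} (y_t - ybar)^2
    y = np.asarray(y, dtype=float)
    ybar = y.mean()
    denom = np.sum((y - ybar) ** 2)
    return np.array([np.sum((y[s:] - ybar) * (y[:-s] - ybar)) / denom
                     for s in range(1, max_lag + 1)])

rng = np.random.default_rng(0)
print(sample_acf(rng.standard_normal(500), 5))   # near 0 at every lag for white noise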

5
Q

Define a white noise process

A

A sequence {εt} is a WN process if for all t:
E(εt) = 0
var(εt) = E(εt^2) = σ^2
cov(εt, εt−s) = E(εtεt−s) = 0 for all s ≠ 0

6
Q

Give the general formula for an ARMA(p,q) process

A

yt = a0 + sum(i=1 to p) ai yt−i + sum(i=0 to q) βi εt−i

or, in lag-operator form, A(L)yt = a0 + B(L)εt, where A(L) = 1 − a1L − ... − apL^p and B(L) = β0 + β1L + ... + βqL^q
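A minimal sketch of simulating from this recursion, here an ARMA(1,1) with β0 = 1 (the coefficient values are illustrative assumptions):

import numpy as np

rng = np.random.default_rng(1)
T = 1000
a0, a1, b1 = 0.5, 0.7, 0.3            # y_t = a0 + a1*y_{t-1} + eps_t + b1*eps_{t-1}
eps = rng.standard_normal(T)
y = np.zeros(T)
y[0] = a0 / (1 - a1)                   # start at the unconditional mean
for t in range(1, T):
    y[t] = a0 + a1 * y[t - 1] + eps[t] + b1 * eps[t - 1]
print(y.mean(), a0 / (1 - a1))         # sample mean vs. theoretical mean a0/(1 - a1)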

7
Q

How do you present an ARMA process as an infinite MA process?

A
yt = [a0 + sum(i=0 to q) βi εt−i] / (1 − sum(i=1 to p) ai L^i)
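The weights of this MA(∞) expansion can be computed numerically; a minimal sketch with statsmodels' arma2ma (the ARMA(1,1) coefficients are illustrative; statsmodels expects the lag-polynomial coefficients, so the AR side enters with flipped signs):

from statsmodels.tsa.arima_process import arma2ma

# y_t = 0.7*y_{t-1} + eps_t + 0.3*eps_{t-1}:  A(L) = 1 - 0.7L,  B(L) = 1 + 0.3L
psi = arma2ma(ar=[1, -0.7], ma=[1, 0.3], lags=8)
print(psi)   # psi_0 = 1 and psi_j = (0.7 + 0.3) * 0.7**(j-1) for j >= 1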
8
Q

How do you present an ARMA process as an infinite AR process?

A

Invert B(L): εt = B(L)^−1 [A(L)yt − a0]

E.g. for an MA(1): sum(i=0 to ∞) (−β1)^i yt−i = εt
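Analogously, the AR(∞) weights can be computed with statsmodels' arma2ar; a minimal sketch for the MA(1) case shown above (the coefficient 0.3 is illustrative):

from statsmodels.tsa.arima_process import arma2ar

# MA(1): y_t = eps_t + 0.3*eps_{t-1}, i.e. A(L) = 1, B(L) = 1 + 0.3L
pi = arma2ar(ar=[1], ma=[1, 0.3], lags=6)
print(pi)   # 1, -0.3, 0.09, ... matching (-beta1)^i in the formula above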

9
Q

When is an ARMA process stationary?

A

Roots of A(L) outside the unit circle ⇐⇒ characteristic roots inside the unit circle ⇐⇒ MA(∞) representation exists
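A minimal sketch of checking this condition numerically (the AR(2) coefficients are illustrative; the same check on the coefficients of B(L) gives the invertibility condition of the next card):

import numpy as np

# A(z) = 1 - 0.7z - 0.2z^2; stationary iff every root of A(z) = 0 lies outside the unit circle
a_coeffs = [1.0, -0.7, -0.2]                  # coefficients of z^0, z^1, z^2
roots = np.roots(a_coeffs[::-1])              # np.roots expects highest power first
print(roots, np.all(np.abs(roots) > 1))       # True -> stationary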

10
Q

When is an ARMA process invertible

A

Roots of B(L) outside the unit circle ⇐⇒ AR(∞) representation exists

11
Q

Give the Yule-Walker equations

A
γs = a1γs−1 + a2γs−2 + ... + apγs−p, for s ≥ 1
γ0 = a1γ1 + a2γ2 + ... + apγp + σ^2

also

ρs = a1ρs−1 + a2ρs−2 + ... + apρs−p, for s ≥ 1
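A minimal sketch of using these equations for an AR(2) with illustrative coefficients: solve the first p equations for ρ1, ..., ρp, then extend with the recursion for s > p.

import numpy as np

a1, a2 = 0.7, 0.2                      # AR(2): y_t = a1*y_{t-1} + a2*y_{t-2} + eps_t
# Yule-Walker for s = 1, 2 (using rho_0 = 1 and rho_{-1} = rho_1):
#   rho_1 = a1 + a2*rho_1
#   rho_2 = a1*rho_1 + a2
A = np.array([[1 - a2, 0.0],
              [-a1,    1.0]])
rho1, rho2 = np.linalg.solve(A, np.array([a1, a2]))
rho3 = a1 * rho2 + a2 * rho1           # the same recursion gives all higher lags
print(rho1, rho2, rho3)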

12
Q

In an AR(p) model, where are the ACF bars?

A

ρs ≠ 0, ∀s but (exp) decaying to 0

13
Q

In an AR(p) model where are the PACF bars?

A

φss ≠ 0 for s ≤ p; φss = 0 for s > p

14
Q

In an MA(q) model, where are the ACF bars?

A

ρs ≠ 0 for s ≤ q; ρs = 0 for s > q

15
Q

In an MA(q) model where are the PACF bars?

A

φss ≠ 0 ∀s; exponentially decaying to 0

16
Q

In an ARMA(p,q) model, where are the ACF bars?

A

ρs ≠ 0, ∀s but (exp) decaying to 0 for s > q

17
Q

In an ARMA(p,q) model, where are the PACF bars?

A

φss ≠ 0 ∀s, but exponentially decaying to 0 for s > p
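The cut-off and decay patterns described in the last few cards can be checked numerically; a minimal sketch with statsmodels on simulated AR(2) and MA(2) series (coefficients and sample size are illustrative):

import numpy as np
from statsmodels.tsa.arima_process import arma_generate_sample
from statsmodels.tsa.stattools import acf, pacf

n = 5000
ar2 = arma_generate_sample(ar=[1, -0.6, -0.2], ma=[1], nsample=n)   # AR(2)
ma2 = arma_generate_sample(ar=[1], ma=[1, 0.6, 0.3], nsample=n)     # MA(2)

# AR(2): ACF decays, PACF roughly 0 after lag 2
print(np.round(acf(ar2, nlags=5)[1:], 2), np.round(pacf(ar2, nlags=5)[1:], 2))
# MA(2): ACF roughly 0 after lag 2, PACF decays
print(np.round(acf(ma2, nlags=5)[1:], 2), np.round(pacf(ma2, nlags=5)[1:], 2))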

18
Q

What is an ARIMA model?

A

The process yt is ARIMA(p, d, q) if ∆^d yt = yt* is a stationary ARMA(p, q)
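A minimal sketch of the differencing step (the random walk with drift below is an illustrative nonstationary series; one difference makes it stationary, so d = 1):

import numpy as np

rng = np.random.default_rng(3)
y = np.cumsum(0.1 + rng.standard_normal(500))   # random walk with drift: not stationary
y_star = np.diff(y, n=1)                        # Delta^1 y_t
print(y_star.mean(), y_star.std())              # roughly 0.1 and 1: stationary noise around the drift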

19
Q

Give the formula for AIC

A

AIC(p, q) = ln(σ^2) + 2(p + q + 1)/T

20
Q

Give formula for BIC

A

BIC(p, q) = ln(σ^2) + (ln T)(p + q + 1)/T

21
Q

Would a model be better with a higher or lower AIC/BIC?

A

Lower
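A minimal sketch of computing both criteria and picking the order with the lowest value (the simulated series and candidate orders are illustrative; statsmodels' built-in .aic/.bic are on a −2·log-likelihood scale, so their numbers differ from these formulas, but the ranking is the point):

import numpy as np
from statsmodels.tsa.arima.model import ARIMA
from statsmodels.tsa.arima_process import arma_generate_sample

def info_criteria(y, p, q):
    T = len(y)
    resid = ARIMA(y, order=(p, 0, q)).fit().resid
    sigma2 = np.mean(resid ** 2)                         # residual variance estimate
    aic = np.log(sigma2) + 2 * (p + q + 1) / T
    bic = np.log(sigma2) + np.log(T) * (p + q + 1) / T
    return aic, bic

y = arma_generate_sample(ar=[1, -0.7], ma=[1], nsample=500)   # true model: AR(1)
for p, q in [(0, 0), (1, 0), (1, 1)]:
    print((p, q), info_criteria(y, p, q))                # the AR(1) should come out (near) lowest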

22
Q

How can you estimate parameters in a model?

A
  • method of moments
  • maximum likelihood estimation (MLE); generally, for ARMA, use MLE
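A minimal sketch of ML estimation of an ARMA(1,1) with statsmodels (the simulated series and its coefficients are illustrative; ARIMA(...).fit() does Gaussian maximum likelihood by default):

from statsmodels.tsa.arima.model import ARIMA
from statsmodels.tsa.arima_process import arma_generate_sample

y = arma_generate_sample(ar=[1, -0.7], ma=[1, 0.3], nsample=1000)   # ARMA(1,1)
res = ARIMA(y, order=(1, 0, 1)).fit()                               # Gaussian MLE
print(res.params)   # const, ar.L1, ma.L1, sigma2 — should be near 0, 0.7, 0.3, 1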

23
Q

Give the forecast expectation of an ARMA model

A

E_T(y_{T+j}) = a0 + sum(i=1 to j−1) ai E_T(y_{T+j−i}) + sum(i=j to p) ai y_{T+j−i} + 0 + sum(i=j to q) βi ε_{T+j−i}

(the 0 stands for the terms sum(i=0 to j−1) βi E_T(ε_{T+j−i}), which vanish because future shocks have conditional expectation zero)
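A minimal sketch of this recursion for the simplest case, an AR(1) with illustrative parameter values (every future ε has conditional expectation 0, so only the AR part carries forward):

# AR(1): E_T(y_{T+j}) = a0 + a1 * E_T(y_{T+j-1}), starting from the last observation y_T
a0, a1 = 0.5, 0.7
y_T = 2.0                          # last observed value (illustrative)
forecast = []
prev = y_T
for j in range(1, 6):
    prev = a0 + a1 * prev          # future eps terms drop out (expectation 0)
    forecast.append(prev)
print(forecast)                    # converges towards the unconditional mean a0/(1 - a1)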