Section 3 Flashcards

1
Q

What is X in the bivariate linear model?

A

Explanatory/exogenous variable/regressor

2
Q

What is Y in the model?

A

Dependent/endogenous variable/regressand

3
Q

Is a model still linear if it has the form:

a) X^2
b) X1X2
c) β^2?

A

a) Yes, b) Yes, c) No. Linearity means linear in the parameters (the βs), not in the variables, so squared or interacted Xs are allowed but a squared β is not.

4
Q

What’s the best method to estimate the regression line?

A

Minimise the sum of squared errors; this is called ordinary least squares (OLS) estimation

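The OLS idea above can be sketched numerically. This is a minimal illustration with made-up data (the sample values and true parameters are assumptions, not from the notes):

```python
import numpy as np

# Hypothetical data: Y = 2 + 0.5*X + error
rng = np.random.default_rng(0)
X = rng.uniform(0, 10, size=100)
Y = 2.0 + 0.5 * X + rng.normal(0, 1, size=100)

# OLS minimises the sum of squared errors; in the bivariate model the
# closed-form solution uses deviations from the sample means.
x_dev = X - X.mean()
y_dev = Y - Y.mean()
beta2_hat = (x_dev * y_dev).sum() / (x_dev ** 2).sum()  # slope estimate
beta1_hat = Y.mean() - beta2_hat * X.mean()             # intercept estimate
```

The estimates should land close to the true values (2 and 0.5) used to generate the data.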
5
Q

See 3.2.1 OLS method

A

Now

6
Q

When can we say OLS is the best estimation method?

A

If the classical linear regression assumptions hold

7
Q

What are the 5 classical linear regression assumptions?

A

1) E(εi) = 0 (errors have zero mean)
2) Var(εi) = σ^2 for all i (errors are homoskedastic)
3) Cov(εi,εj) = E(εiεj) = 0 for i ≠ j (errors are not autocorrelated)
4) E(Xiεi) = E(Xi)E(εi) = 0 (X and ε are independent)
5) εi ~ N(0,σ^2) (each error has the same normal distribution, with the same μ and σ^2)

8
Q

What does homoskedastic mean?

A

The variance of the error is constant for all observations

9
Q

What does it mean for error terms to be not autocorrelated?

A

There is no correlation between the error terms of different observations: Cov(εi,εj) = 0 for i ≠ j

10
Q

What does the Gauss-Markov theorem state, and why? (2 reasons)

A

It states that the OLS estimators are the best linear unbiased estimators (BLUE)

Why?
Unbiased: the sampling distributions of the estimators are centred on their true values
Efficient: they have the smallest variance of all linear unbiased estimators

11
Q

Why are the β estimates normally distributed?

A

Y is a linear function of the errors, which are normally distributed, so Y is itself normally distributed; and since the OLS estimators are linear functions of Y, they too must be normally distributed

12
Q

Notes

A

Since the Gauss-Markov theorem tells us OLS is unbiased, the means of the sampling distributions of the estimators are simply β1 and β2

(Learn equations to find the variance!)

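A quick simulation sketch of this point (all numbers hypothetical): repeating the regression over many samples shows the sampling distribution of the OLS slope centred on the true β2, as unbiasedness predicts.

```python
import numpy as np

# Hypothetical Monte Carlo: re-estimate the slope on many samples drawn
# from the same model and check that the estimates centre on the true beta2.
rng = np.random.default_rng(1)
beta1, beta2, sigma = 1.0, 2.0, 1.0
X = np.linspace(0, 5, 50)          # fixed regressor across replications

slopes = []
for _ in range(2000):
    eps = rng.normal(0, sigma, size=X.size)   # errors satisfy the CLRM assumptions
    Y = beta1 + beta2 * X + eps
    x_dev = X - X.mean()
    slopes.append((x_dev * (Y - Y.mean())).sum() / (x_dev ** 2).sum())

mean_slope = np.mean(slopes)   # should be close to the true beta2 = 2.0
```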
13
Q

What does the coefficient of determination measure?

A

Goodness of fit

Note: although OLS finds the best-fitting line, that does not necessarily mean the line fits the data well

14
Q

What do TSS, ESS and RSS stand for?

A
TSS = total sum of squares
ESS = explained sum of squares
RSS = residual sum of squares
15
Q

What does a small RSS imply?

A

A good fit

16
Q

What does R^2 tell us?

A

It tells us how much of the total variation in the Y variable is explained by the regression model
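The decomposition behind R^2 (TSS = ESS + RSS) can be checked numerically; this sketch uses made-up data:

```python
import numpy as np

# Hypothetical data for illustrating the sum-of-squares decomposition.
rng = np.random.default_rng(2)
X = rng.uniform(0, 10, 80)
Y = 1.0 + 0.8 * X + rng.normal(0, 2, 80)

# Fit the bivariate OLS line.
x_dev = X - X.mean()
beta2_hat = (x_dev * (Y - Y.mean())).sum() / (x_dev ** 2).sum()
beta1_hat = Y.mean() - beta2_hat * X.mean()
Y_fit = beta1_hat + beta2_hat * X

TSS = ((Y - Y.mean()) ** 2).sum()       # total variation in Y
ESS = ((Y_fit - Y.mean()) ** 2).sum()   # variation explained by the model
RSS = ((Y - Y_fit) ** 2).sum()          # unexplained (residual) variation
R2 = ESS / TSS                          # equivalently 1 - RSS/TSS
```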

17
Q

Learn 3.4.1 and 3.4.2 and 3.5

A

T test and test of significance and confidence bands

18
Q

What are the 5 columns of the Eviews output?

A

1) variable (name of the variable)
2) coefficient (OLS parameter estimate)
3) standard error (of the estimate)
4) t statistic (for testing H0: β = 0; equal to column 2 divided by column 3, i.e. coefficient/standard error)
5) probability (the p-value: the probability, under the t distribution, of a value at least as extreme as the observed t statistic)

19
Q

If doing a two-sided 5% test, for what range of p-values will you reject the null?

A

p-value < 0.05
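A sketch of the t statistic and p-value computation for the slope (hypothetical data; scipy's t distribution supplies the tail probability):

```python
import numpy as np
from scipy import stats

# Hypothetical data; true slope is nonzero, so H0: beta2 = 0 should be rejected.
rng = np.random.default_rng(3)
X = rng.uniform(0, 10, 60)
Y = 0.5 + 1.5 * X + rng.normal(0, 1, 60)

n = X.size
x_dev = X - X.mean()
beta2_hat = (x_dev * (Y - Y.mean())).sum() / (x_dev ** 2).sum()
beta1_hat = Y.mean() - beta2_hat * X.mean()
resid = Y - (beta1_hat + beta2_hat * X)

s2 = (resid ** 2).sum() / (n - 2)                # estimate of sigma^2
se_beta2 = np.sqrt(s2 / (x_dev ** 2).sum())      # standard error of the slope
t_stat = beta2_hat / se_beta2                    # t statistic for H0: beta2 = 0
p_value = 2 * stats.t.sf(abs(t_stat), df=n - 2)  # two-sided p-value

reject = p_value < 0.05   # reject H0 at the 5% level
```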

20
Q

Derive the OLS estimators for the simple regression model

A

see notes
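The notes are the authoritative source; as a reminder, the standard derivation minimises the sum of squared errors and solves the two first-order conditions:

```latex
S(\beta_1,\beta_2)=\sum_{i=1}^{n}(Y_i-\beta_1-\beta_2 X_i)^2

\frac{\partial S}{\partial \beta_1}=-2\sum_i(Y_i-\hat\beta_1-\hat\beta_2 X_i)=0,\qquad
\frac{\partial S}{\partial \beta_2}=-2\sum_i X_i(Y_i-\hat\beta_1-\hat\beta_2 X_i)=0

\hat\beta_2=\frac{\sum_i(X_i-\bar X)(Y_i-\bar Y)}{\sum_i(X_i-\bar X)^2},\qquad
\hat\beta_1=\bar Y-\hat\beta_2\bar X
```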

21
Q

Prove the OLS estimators for the simple regression model are unbiased

A

see notes
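For reference (not a substitute for the notes), the usual argument writes the slope estimator as the true value plus a weighted sum of the errors:

```latex
\hat\beta_2=\beta_2+\sum_i w_i\varepsilon_i,\qquad
w_i=\frac{X_i-\bar X}{\sum_j(X_j-\bar X)^2}

\Rightarrow\quad E(\hat\beta_2)=\beta_2+\sum_i w_i\,E(\varepsilon_i)=\beta_2
```

With non-stochastic X and E(εi) = 0, taking expectations gives E(β2 hat) = β2; a similar argument gives E(β1 hat) = β1.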

22
Q

Derive the variances of the OLS estimators for the simple regression model

A

see notes
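For reference, and using the CLRM assumptions of homoskedasticity and no autocorrelation, the variances follow from the weighted-sum form of the estimators:

```latex
\operatorname{Var}(\hat\beta_2)=\operatorname{Var}\Big(\sum_i w_i\varepsilon_i\Big)
=\sigma^2\sum_i w_i^2=\frac{\sigma^2}{\sum_i(X_i-\bar X)^2},\qquad
\operatorname{Var}(\hat\beta_1)=\sigma^2\,\frac{\sum_i X_i^2}{n\sum_i(X_i-\bar X)^2}
```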

23
Q

What is meant by a “linear estimator”?

A

An estimator is linear if it is a linear function of the sample observations

24
Q

What is meant by an “unbiased estimator”?

A

An estimator is unbiased if, on average, the estimate coincides with the true value: if an estimator x^ of x is unbiased, then E(x^) = x (see 4b, 4c and 4d, class 1)