Econ B S3 Flashcards

1
Q

What does it mean if CLM4 is violated?

A

Perfect multicollinearity, so the OLS estimator can't be computed; problems also arise when variables are close to perfect collinearity (near-multicollinearity)
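A minimal numerical sketch of the violation (the variable names and numbers are illustrative, not from the notes): when one regressor is an exact linear function of another, X is rank-deficient, so (X'X)^(-1) does not exist and OLS cannot be computed.

```python
import numpy as np

# Sketch: x2 is an exact linear function of x1 (perfect multicollinearity)
rng = np.random.default_rng(0)
n = 100
x1 = rng.normal(size=n)
x2 = 2 * x1                          # x2 = 2*x1 exactly
X = np.column_stack([np.ones(n), x1, x2])

# X has 3 columns but only rank 2, so X'X is singular
print(np.linalg.matrix_rank(X))      # 2, not 3
```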

2
Q

What does it mean if CLM5 is violated?

A

Heteroscedasticity (HTSC) and/or autocorrelation (AC); OLS is no longer efficient, and the usual OLS standard errors are invalid, but the estimator remains consistent
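A minimal simulation sketch of this point (the model and numbers are illustrative, not from the notes): with heteroskedastic errors whose conditional variance grows with |x|, the OLS estimates still converge to the true coefficients.

```python
import numpy as np

# Sketch: y = 1 + 2x + u with V(u|x) increasing in |x| but E(u|x) = 0
rng = np.random.default_rng(1)
n = 100_000
x = rng.normal(size=n)
u = rng.normal(size=n) * (0.5 + np.abs(x))   # heteroskedastic errors
y = 1 + 2 * x + u

X = np.column_stack([np.ones(n), x])
beta_hat, *_ = np.linalg.lstsq(X, y, rcond=None)
print(beta_hat)                              # close to [1, 2]: still consistent
```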

3
Q

What are the 3 distinct assumptions for CLM1?

A

1) The random error term enters additively
2) The model is linear in the regression coefficients (its partial derivatives are functions only of known constants and regressors)
3) The population regression coefficients beta(j) are unknown constants that don’t vary across observations

4
Q

What does “The population regression coefficients beta(j) are unknown constants that don’t vary across observations” mean for cross-sectional and time-series data?

A

For cross-sectional data, it means there are no subgroups of the population with different marginal effects on the dependent variable

For time-series data, it means the effect of the regressor x(k) on y is constant over time

5
Q

What does CLM6 allow us to do if it holds?

A

Confidence intervals/hypothesis testing using a t distribution (SEE NOTES)

6
Q

See

A

Read pages 8 and 9 of the notes, learn the ‘assuming x is random’ section, and then try proving that beta(hat)1 is consistent

7
Q

Why is CLM2 normally violated? What does this mean?

A

In economics it is normally violated because x is random (stochastic): the explanatory variables are observed, not chosen and set by the researcher.
This means that assuming x and u are independent is too strong, but assuming they are merely uncorrelated is too weak (u may be uncorrelated with x but correlated with x^2) *
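A minimal sketch of the starred point (taking x standard normal is an assumption for illustration): with u = x^2 - 1 we get E(u) = 0 and Cov(x,u) = 0, yet E(u|x) = x^2 - 1 clearly depends on x, so uncorrelatedness does not give mean independence.

```python
import numpy as np

# Sketch: u is uncorrelated with x but correlated with x^2
rng = np.random.default_rng(0)
x = rng.normal(size=100_000)
u = x**2 - 1                     # E(u)=0, but E(u|x) = x^2 - 1 varies with x

print(np.cov(x, u)[0, 1])        # approx 0: no linear correlation with x
print(np.cov(x**2, u)[0, 1])     # approx 2: correlated with x^2
```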

8
Q

What does * lead to?

A

New assumption, the Zero Conditional Mean Assumption (Mean Independence): the expected value of u doesn’t depend on the value of x:
E(u|x)=0
Therefore the variable x is now ‘exogenous’

9
Q

What does the Zero Conditional Mean Assumption/Mean Independence assumption imply? (2)

A

1) The unconditional mean of the population values of u equals zero: E(u|x)=0 -> E(u)=0

2) x and u have 0 covariance (uncorrelated):
E(u|x)=0 -> Cov(x,u)=0

10
Q

Why does E(u|x)=0 -> E(u)=0?

A

Law of iterated expectations states E(E(u|x))=E(u)

Therefore: E(u)=E(E(u|x))=E(0)=0

11
Q

What does E(u|x)=0 -> Cov(x,u)=0 lead to? Prove that E(u|x)=0 -> Cov(x,u)=0.

A

Cov(x,u)=0 means x and u have no linear correlation; combined with mean independence E(u|x)=0, it makes x (strictly) exogenous with respect to u. The proof is on page 10 of the notes
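A sketch of the standard proof (the full version is the one on page 10 of the notes), using E(u)=0 from the law of iterated expectations:

```latex
\begin{aligned}
\operatorname{Cov}(x,u) &= E(xu) - E(x)E(u) = E(xu) \\
  &= E\big(E(xu \mid x)\big) \quad \text{(law of iterated expectations)} \\
  &= E\big(x\,E(u \mid x)\big) = E(x \cdot 0) = 0
\end{aligned}
```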

12
Q

Why do we need a new set of assumptions?

A

Since x is now random, we need a new set of assumptions with everything stated conditional on x

13
Q

Which 2 assumptions don’t change?

A

CLM1 and CLM4

14
Q

What is MLR2?

A

The error u has an expected value of zero, given any values of the independent variables:

E(u|x1,x2,…,xk)=0

15
Q

What is MLR4?

A

The errors are conditionally homoskedastic:
V(u|x1,…,xk)=σ^2

and conditionally uncorrelated:
Cov(ui,uj|x)=0 for all i not equal to j

16
Q

What is MLR5?

A

The population error u, conditionally on x, is normally distributed with zero mean and variance σ^2

u|x~N(0,σ^2)

17
Q

What is the second new assumption?

A

The sample observations are statistically independent (random sampling)

18
Q

What does the 2nd new assumption imply? (2)

A

1) The error terms u(i) and u(j) are also statistically independent:
cov(u(i),u(j)|x)=0 for all i not equal to j

2) The dependent variables y(i) and y(j) are also statistically independent:
cov(y(i),y(j)|x)=0 for all i not equal to j

19
Q

Why is the random sampling assumption usually suitable for cross-sectional models but rarely for time-series data?

A

see notes

20
Q

2 points about cross sectional data?

A
  • sample of observations taken at a single point/period in time
  • individual observations have no natural ordering
21
Q

3 points about time series data?

A
  • sample observations taken on one or more variables over successive periods
  • individual observations have natural ordering; a chronological one
  • often exhibit a high degree of time dependence, therefore they cannot be assumed to be generated by random sampling
22
Q

What is a stochastic/time series process?

A

A sequence of random variables (RVs) indexed by time

23
Q

Why do we think of TS data as an outcome of random variables?

A

When we collect TS data we obtain one possible outcome (realization) of the stochastic process, since we can’t go back and rerun history under different conditions. BUT if conditions had been different, we would have obtained a different realization of the process. Therefore the set of all possible realizations plays the role of the population

24
Q

What is a static model?

A

A model of a contemporaneous relationship between x and y: both are dated in the same time period, so a change in x immediately causes a change in y

25
Q

Which assumptions are suitable for time series models?

A

MLR1, MLR3 and MLR4

26
Q

finish notes and make flashcards on them (3.5)

A

now

27
Q

What is omitted variable bias?

A

When we omit a variable that belongs in the true/population model (underspecifying the model), the OLS estimator becomes biased (see the example in my notes)
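A minimal simulation sketch of omitted variable bias (the coefficients and correlation are illustrative, not the example from the notes): with true model y = 1 + 2x1 + 3x2 + u and x2 correlated with x1, regressing y on x1 alone gives a slope near beta1 + beta2*delta1 = 2 + 3*0.8 = 4.4, not 2.

```python
import numpy as np

# Sketch: omit x2, which is correlated with x1, and the slope on x1 is biased
rng = np.random.default_rng(0)
n = 100_000
x1 = rng.normal(size=n)
x2 = 0.8 * x1 + rng.normal(size=n)           # delta1 = Cov(x1,x2)/Var(x1) = 0.8
y = 1 + 2 * x1 + 3 * x2 + rng.normal(size=n)  # true beta1=2, beta2=3

X_short = np.column_stack([np.ones(n), x1])   # short regression: x2 omitted
b_short, *_ = np.linalg.lstsq(X_short, y, rcond=None)
print(b_short[1])                             # approx 4.4, biased away from 2
```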

28
Q

When a variable is omitted, say in a simple regression model, the misspecified model’s error will be:
v=β2x2+u
Show that this results in a failure of mean independence.
What happens if we estimate β(hat)1 from this model?

A

See my notes p3 2nd side at bottom
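A sketch of the standard argument (the full working is the version in the notes): with the short model’s error v = β2x2 + u and E(u|x1) = 0,

```latex
\begin{aligned}
E(v \mid x_1) &= \beta_2\,E(x_2 \mid x_1) + E(u \mid x_1)
  = \beta_2\,E(x_2 \mid x_1) \neq 0
  \quad \text{whenever } \operatorname{Cov}(x_1,x_2) \neq 0, \\[4pt]
\operatorname{plim}\hat{\beta}_1 &= \beta_1 + \beta_2\,\delta_1, \qquad
  \delta_1 = \frac{\operatorname{Cov}(x_1,x_2)}{\operatorname{Var}(x_1)}
\end{aligned}
```

So mean independence fails, and β(hat)1 is biased/inconsistent unless β2 = 0 or x1 and x2 are uncorrelated.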

29
Q

Learn lecture 4 proof!!!

A

in my notes pages 4&5