Term 1 Flashcards

1
Q

What are the implications of a statistical relationship?

A

A causes B
B causes A
A 3rd variable causes both
Random occurrence

2
Q

What does the stochastic error include?

A

Omitted explanatory variables (X1, X2, ...)
Measurement error
Incorrect functional form
Random and unpredictable occurrences

3
Q

What does a hat above a variable indicate?

A

It must be estimated

4
Q

How do you calculate the residual and the error term?

A

e = Y − Ŷ (residual)

ε = Y − E(Y|X) (error term)

5
Q

How do you illustrate the residual and error term?

A

The residual is the distance between a data point and the estimated (sample) regression line

The error is the distance between the point and the true (population) regression line

6
Q

How do you estimate a value of B1 using OLS?

A

β̂1 = Σ(X − X̄)(Y − Ȳ) / Σ(X − X̄)²

7
Q

How do you estimate a value of B0 using OLS?

A

β̂0 = Ȳ − β̂1X̄
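
A minimal sketch of these two formulas in Python with NumPy; the x and y arrays are made-up illustrative data, not course data:

import numpy as np

# Hypothetical sample data, for illustration only
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 3.9, 6.2, 8.1, 9.8])

# Slope: sum of (X - Xbar)(Y - Ybar) over sum of (X - Xbar)^2
beta1_hat = np.sum((x - x.mean()) * (y - y.mean())) / np.sum((x - x.mean()) ** 2)
# Intercept: Ybar - beta1_hat * Xbar
beta0_hat = y.mean() - beta1_hat * x.mean()

print(beta0_hat, beta1_hat)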

8
Q

How do you calculate TSS?

A

TSS = Σ(Y − Ȳ)²

TSS = ESS + RSS

9
Q

How do you calculate ESS?

A

ESS = Σ(Ŷ − Ȳ)²

10
Q

How do you calculate RSS?

A

RSS = Σe²

11
Q

How do you calculate R^2?

A

R² = ESS/TSS

or equivalently R² = 1 − RSS/TSS
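
A short sketch showing how TSS, ESS, RSS and R² fit together, using the same made-up data as the earlier OLS sketch:

import numpy as np

# Hypothetical data and a simple OLS fit (illustration only)
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 3.9, 6.2, 8.1, 9.8])
b1 = np.sum((x - x.mean()) * (y - y.mean())) / np.sum((x - x.mean()) ** 2)
b0 = y.mean() - b1 * x.mean()
y_hat = b0 + b1 * x        # fitted values
e = y - y_hat              # residuals

tss = np.sum((y - y.mean()) ** 2)       # total sum of squares
ess = np.sum((y_hat - y.mean()) ** 2)   # explained sum of squares
rss = np.sum(e ** 2)                    # residual sum of squares

r_squared = ess / tss                   # equivalently 1 - rss / tss, since TSS = ESS + RSS
print(r_squared)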

12
Q

What is the DOF?

A

The number of observations (N) minus the number of estimated coefficients:

N − K, where K counts every estimated coefficient
N − K − 1, where K counts only the slope coefficients and an intercept is also estimated

13
Q

How do you calculate adjusted R²?

A

Adjusted R² = 1 − (RSS/(N − K − 1)) / (TSS/(N − 1))
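
A one-line sketch of this formula in Python; the rss, tss, n and k values are placeholders, not real regression output:

rss, tss, n, k = 6.0, 50.0, 30, 3       # placeholders: RSS, TSS, observations, slope coefficients
adj_r_squared = 1 - (rss / (n - k - 1)) / (tss / (n - 1))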

14
Q

How can you calculate the correlation coefficient r?

A

r = ±√R² in the simple two-variable regression (the sign matches the sign of the estimated slope coefficient)

15
Q

What are the steps of applied regression?

A
  1. Choose the dependent variable
  2. Review the literature and develop a theoretical model
  3. Specify the model: select the independent variables and the functional form
  4. Hypothesise the expected signs of the coefficients
  5. Collect Data, Inspect and Clean
  6. Estimate, evaluate and analyse the equation
  7. Document the Results
16
Q

What is the sampling distribution of Bhat?

A

The distribution of β̂ values obtained across different samples

17
Q

How can the mean reveal bias?

A

An unbiased estimator β̂ has an expected value equal to the true β

E(β̂) = β

18
Q

What are the classical assumptions of OLS? (1-4)

A

I. The regression model is linear, is correctly specified, and has an additive error term

II. The error term has a zero population mean

III. All explanatory variables are uncorrelated with the error term

IV. Observations of the error term are uncorrelated with each other (no serial correlation)

19
Q

What are the classical assumptions of OLS? (5-7)

A

V. The error term has a constant variance (no heteroskedasticity)

VI. No explanatory variable is a perfect linear function of any other explanatory variable(s) (no perfect multicollinearity)

VII. The error term is normally distributed (this assumption is optional but usually invoked)

20
Q

If the classical assumptions are met, what can be said?

A

OLS provides the Best Linear Unbiased Estimator (BLUE), by the Gauss-Markov theorem

21
Q

What is the formula for the T-Test?

A

t = (β̂k − βH0) / SE(β̂k)
β̂k is the estimated coefficient
βH0 is the value under the null hypothesis, usually 0

22
Q

How do you calculate the variance of an estimation?

A

s² = Σe² / (N − 2) for the simple two-variable regression (more generally, Σe² divided by the degrees of freedom, N − K − 1)

23
Q

How do you calculate the variance and SE of a coefficient?

A

Var(β̂1) = s² / Σ(X − X̄)²

Take the square root for the SE

24
Q

How do you calculate a confidence interval?

A

β̂ ± tc·SE(β̂), where tc is the critical t-value
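
A sketch tying cards 21-24 together for a simple two-variable regression; the data are the same made-up numbers as before, and scipy's t distribution supplies the critical value:

import numpy as np
from scipy import stats

# Hypothetical data and OLS fit (illustration only)
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 3.9, 6.2, 8.1, 9.8])
b1 = np.sum((x - x.mean()) * (y - y.mean())) / np.sum((x - x.mean()) ** 2)
b0 = y.mean() - b1 * x.mean()
e = y - (b0 + b1 * x)
n = len(y)

s2 = np.sum(e ** 2) / (n - 2)                        # variance of the estimate
se_b1 = np.sqrt(s2 / np.sum((x - x.mean()) ** 2))    # SE of the slope coefficient

t_stat = (b1 - 0) / se_b1                            # t-test against H0: beta1 = 0
t_crit = stats.t.ppf(0.975, df=n - 2)                # two-sided critical value, 5 percent level
ci = (b1 - t_crit * se_b1, b1 + t_crit * se_b1)      # confidence interval for beta1
print(t_stat, ci)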

25
Q

What are the limitations of the T-Test?

A

Does not consider theoretical validity

Does not measure importance, only statistical significance

26
Q

What are the three potential specification errors?

A

Choice of independent variables
Functional Form
Form of the stochastic error term

27
Q

What is the effect of omitting a relevant variable?

A

The omitted variable cannot be held constant, so its effect is picked up by the other coefficients (omitted-variable bias)

Violates Classical Assumption III: the omitted variable's influence moves into the error term, which then correlates with the included explanatory variables

28
Q

What is the effect of an irrelevant variable?

A

Does not cause bias
Increases the variance of the estimated coefficients and therefore lowers t-scores
Will usually reduce adjusted R²

29
Q

What are the four criteria for deciding whether a variable belongs?

A

Theory: Is the variable's place in the equation theoretically sound?
t-Test: Is the variable's estimated coefficient significant in the expected direction?
Adjusted R²: Does the overall fit of the equation improve when the variable is added?
Bias: Do other variables' coefficients change significantly when the variable is added?

30
Q

What is the equation for the F-Test?

A

F = [(RSSM − RSS) / M] / [RSS / (N − K − 1)]

M = number of constraints; RSSM = RSS of the restricted (constrained) equation

31
Q

What is the equation for the F-test if the restricted equation is Y = β0?

A

F = (ESS / K) / (RSS / (N − K − 1))

32
Q

What is the equation for the F-test if the restricted equation is Y = β0, using R²?

A

F = (R² / K) / ((1 − R²) / (N − K − 1))

33
Q

How do you calculate the critical value for an F-test?

A
Numerator degrees of freedom = number of constraints (M)
Denominator degrees of freedom = N − K − 1
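
A sketch of the R² form of the F-test together with its critical value; r2, n and k are placeholder regression output, and scipy's F distribution supplies the critical value:

from scipy import stats

r2, n, k = 0.85, 30, 3                                # placeholder values
f_stat = (r2 / k) / ((1 - r2) / (n - k - 1))
f_crit = stats.f.ppf(0.95, dfn=k, dfd=n - k - 1)      # numerator df = constraints, denominator df = N - K - 1
print(f_stat, f_crit, f_stat > f_crit)
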
34
Q

What is RESET and how do you execute it?

A

Ramsey Regression Specification Error Test
Add Ŷ², Ŷ³ and Ŷ⁴ (powers of the fitted values) as explanatory variables
Compare the R² of the old and new equations
Perform an F-test of the joint significance of the added terms
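
A sketch of the mechanics with NumPy only; x and y are simulated illustrative data, and ols_rss is a small helper defined here, not a library function:

import numpy as np

def ols_rss(X, y):
    """Return the residual sum of squares from an OLS fit of y on X."""
    coef, *_ = np.linalg.lstsq(X, y, rcond=None)
    return np.sum((y - X @ coef) ** 2)

# Simulated data for illustration
rng = np.random.default_rng(0)
x = rng.normal(size=40)
y = 1.0 + 2.0 * x + rng.normal(size=40)

# Original equation and its fitted values
X_old = np.column_stack([np.ones(len(x)), x])
coef, *_ = np.linalg.lstsq(X_old, y, rcond=None)
y_hat = X_old @ coef

# Add Y-hat^2, Y-hat^3, Y-hat^4 and F-test their joint significance
X_new = np.column_stack([X_old, y_hat ** 2, y_hat ** 3, y_hat ** 4])
rss_old, rss_new = ols_rss(X_old, y), ols_rss(X_new, y)
m = 3                                   # number of added terms (constraints)
k_new = X_new.shape[1] - 1              # slope coefficients in the new equation
f_stat = ((rss_old - rss_new) / m) / (rss_new / (len(y) - k_new - 1))
print(f_stat)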

35
Q

What are Akaike’s Information Criterion and the Schwarz Criterion?

A

Methods of comparing alternative specifications
AIC = log(RSS/N) + 2(K + 1)/N
SC = log(RSS/N) + log(N)·(K + 1)/N
The lower the value, the better
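
A sketch of the two criteria exactly as written above; log is the natural log, and rss, n and k are placeholder values:

import numpy as np

rss, n, k = 6.0, 30, 3                              # placeholder regression output
aic = np.log(rss / n) + 2 * (k + 1) / n
sc = np.log(rss / n) + np.log(n) * (k + 1) / n
print(aic, sc)                                      # the specification with the lower value is preferred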

36
Q

What are the effects of changing the scale of x?

A

The coefficient and its SE are rescaled by the inverse of the scaling factor, so that the product βX (and the t-score and fit) is unchanged

37
Q

What are the effects of scaling y?

A

The whole regression will need to be re-run

38
Q

What are the effects of scaling both x and y?

A

The intercept and the residuals will change

39
Q

How can we check the distribution of residuals?

A

Visually, with a histogram of the residuals
Jarque-Bera test:

JB = (N/6)·(S² + (K − 3)²/4)
S = skewness, K = kurtosis

The critical value is obtained from the chi-squared distribution with 2 degrees of freedom
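
A sketch of the JB statistic computed from a residual vector; the residuals are made up, and scipy's skew/kurtosis are used with fisher=False so that the normal benchmark for kurtosis is 3:

import numpy as np
from scipy import stats

resid = np.array([0.3, -0.5, 0.1, 0.4, -0.2, -0.1, 0.6, -0.6])  # illustrative residuals
n = len(resid)

s = stats.skew(resid)                        # skewness
k = stats.kurtosis(resid, fisher=False)      # raw kurtosis (normal distribution = 3)
jb = (n / 6) * (s ** 2 + (k - 3) ** 2 / 4)

chi2_crit = stats.chi2.ppf(0.95, df=2)       # JB is compared with a chi-squared(2) critical value
print(jb, chi2_crit)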

40
Q

What are the three components of B0?

A

The True B0
The constant impact of any specification errors (Omitted variable)
The mean of ε if not equal to zero

41
Q

What happens if you suppress the constant term?

A

You violate Classical Assumption II:

The error term has a zero population mean

42
Q

Discuss linear functional form

A

Linear in the variables: Y and every X enter in levels

OLS requires linearity in the coefficients; a form such as X^β is not linear in the coefficients

43
Q

Discuss Log functional Form

A
Double-log: logs on both sides; still linear in the coefficients (the coefficients are elasticities)

Lin-log: log of the independent variable(s) only
Log-lin: log of the dependent variable only

44
Q

Discuss Other functional forms

A

Polynomial: X²

Inverse: 1/X

45
Q

What can you not use to compare two different functional forms?

A

R^2

TSS

46
Q

What does an intercept dummy variable do?

A

Changes the intercept when a qualitative condition holds

Use one fewer dummy than the number of conditions

47
Q

What is the omitted condition?

A

The category not represented by a dummy variable; it serves as the baseline for comparison

48
Q

What happens if you use two dummies for two conditions?

A

You violate Classical Assumption VI by creating perfect collinearity with the constant term

This is the dummy variable trap

49
Q

What is a slope dummy?

A

Affects both the intercept and the slope

50
Q

What is an indicator variable?

A

A variable, similar to a dummy, that captures the interaction of two variables

51
Q

What is the chow test?

A

Tests whether two regressions (e.g. for two sub-samples) are equivalent
Create an intercept and slope indicator (interaction) variable for each coefficient
See notes
F-test whether the added coefficients are all jointly equal to zero

52
Q

What are the consequences of multicollinearity?

A
Estimates remain unbiased, but:
Larger SEs
t-values go down
Estimates become sensitive to changes in specification
The overall fit of the equation will not change much
53
Q

How can you detect collinearity?

A

It is hard to detect; there is no single test
High simple correlation coefficients between explanatory variables
High variance inflation factors

54
Q

Why is the correlation coefficient not as useful?

A

There is no agreed cutoff point

A group of variables, acting together, may cause multicollinearity even though no pairwise correlation coefficient reveals it

55
Q

How would you use high variance inflation factors?

A

Run an OLS regression with the variable as a function of all the other explanatory variables

Compute VIF = 1/(1 − R²) from that auxiliary regression

If the VIF > 5, there is severe multicollinearity
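
A sketch of the VIF calculation for one explanatory variable, using an auxiliary regression run with NumPy's least squares; the variables x1, x2, x3 are simulated and deliberately correlated for illustration:

import numpy as np

# Simulated, deliberately correlated explanatory variables
rng = np.random.default_rng(0)
x1 = rng.normal(size=50)
x2 = 0.9 * x1 + 0.1 * rng.normal(size=50)
x3 = rng.normal(size=50)

def vif(target, others):
    """Regress one explanatory variable on the others (plus a constant) and return 1 / (1 - R^2)."""
    X = np.column_stack([np.ones(len(target))] + list(others))
    coef, *_ = np.linalg.lstsq(X, target, rcond=None)
    fitted = X @ coef
    r2 = 1 - np.sum((target - fitted) ** 2) / np.sum((target - target.mean()) ** 2)
    return 1 / (1 - r2)

print(vif(x1, [x2, x3]))   # a value above 5 would suggest severe multicollinearity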

56
Q

What are the remedies for multicollinearity?

A

Do nothing (removing a variable may cause specification bias)

Drop a redundant variable (always base this decision on theory)

Increase the sample size