What extra assumptions are introduced in multiple variable regression?

- No exact collinearity between X variables

- No specification bias

What does an estimate in multiple linear regression mean?

The expected change in Y for a one-unit change in that X, holding all other regressors constant (strictly an association, not proof of causation)
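A quick numpy sketch of what "holding all other variables constant" means: by the Frisch-Waugh result, the multiple-regression slope on one regressor equals the slope from regressing residuals on residuals after partialling out the other regressors. All variable names and coefficient values below are made up for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200
x1 = rng.normal(size=n)
x2 = 0.5 * x1 + rng.normal(size=n)          # deliberately correlated with x1
y = 2.0 + 1.5 * x1 - 0.7 * x2 + rng.normal(size=n)

# Full multiple regression of y on [1, x1, x2]
X = np.column_stack([np.ones(n), x1, x2])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)

# Partial out x2 (and the intercept) from both y and x1,
# then regress the y-residuals on the x1-residuals
Z = np.column_stack([np.ones(n), x2])
resid = lambda v: v - Z @ np.linalg.lstsq(Z, v, rcond=None)[0]
b1_partial = np.linalg.lstsq(resid(x1).reshape(-1, 1), resid(y), rcond=None)[0][0]

# The two estimates of the x1 slope coincide
print(beta[1], b1_partial)
```

This is why the coefficient is read as the effect of x1 with the other regressors "netted out".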

How does the correlation between regressors affect the error of the estimates?

The greater the correlation (multicollinearity), the larger the standard errors of the estimates
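A minimal simulation of this, assuming homoskedastic errors so that Var(beta-hat) = sigma^2 (X'X)^-1; the sample size, error variance, and correlation values are arbitrary choices for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)
n, sigma2 = 500, 1.0

def slope_se(rho):
    # Two regressors with correlation roughly rho
    x1 = rng.normal(size=n)
    x2 = rho * x1 + np.sqrt(1 - rho**2) * rng.normal(size=n)
    X = np.column_stack([np.ones(n), x1, x2])
    var = sigma2 * np.linalg.inv(X.T @ X)   # OLS covariance matrix
    return np.sqrt(var[1, 1])               # standard error of the x1 slope

# Standard error grows sharply as the regressors become more correlated
for rho in (0.0, 0.5, 0.9, 0.99):
    print(rho, slope_se(rho))
```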

Why do we use Adjusted R^2?

Because ordinary R^2 can be increased just by adding junk regressors. Adjusted R^2 penalises for the number of regressors

When can we compare R^2 values?

- When sample sizes are the same

- When the dependent variables are the same

Give the formula for adjusted R^2

Adjusted R^2 = 1 - (1 - R^2)(n - 1)/(n - k), where n is the sample size and k is the number of estimated parameters (including the intercept)
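A quick numeric check of the adjusted R^2 formula; the R^2 values, sample size, and parameter counts are invented for illustration.

```python
def adjusted_r2(r2, n, k):
    # k = number of estimated parameters, including the intercept
    return 1 - (1 - r2) * (n - 1) / (n - k)

# A junk regressor nudges R^2 up (0.800 -> 0.805) yet lowers adjusted R^2
print(adjusted_r2(0.800, n=30, k=3))
print(adjusted_r2(0.805, n=30, k=4))
```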

Does R^2 have any intrinsic properties that might favour its use over other calculations?

No, it's a fairly arbitrary choice; it has no special intrinsic virtue over other measures of fit

What is the Gross/Simple correlation coefficient?

Shown as r_{12}, where subscript 1 = Y and subscript i > 1 = X_i. Shows the raw correlation between two variables, without controlling for anything else

What is the partial correlation coefficient?

The correlation between two variables, eliminating the correlation effect of some other variables. Shown as r_{12.34}, where the effects of variables 3 and 4 are eliminated
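A sketch of the single-control case, using the standard formula r_{12.3} = (r_12 - r_13 r_23) / sqrt((1 - r_13^2)(1 - r_23^2)), and checking it against the equivalent residual-based computation; the data-generating setup (a common driver x3) is invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 1000
x3 = rng.normal(size=n)            # common driver of both variables
x1 = x3 + rng.normal(size=n)
x2 = x3 + rng.normal(size=n)

def corr(a, b):
    return np.corrcoef(a, b)[0, 1]

# Partial correlation of x1 and x2, controlling for x3
r12, r13, r23 = corr(x1, x2), corr(x1, x3), corr(x2, x3)
r12_3 = (r12 - r13 * r23) / np.sqrt((1 - r13**2) * (1 - r23**2))

# Equivalent: correlate the residuals after regressing each variable on x3
resid = lambda v: v - np.polyval(np.polyfit(x3, v, 1), x3)

# Gross correlation is sizeable; partial correlation is near zero,
# since the association runs entirely through x3
print(r12, r12_3, corr(resid(x1), resid(x2)))
```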