Ch 7: Multiple Regression 2
In Ch 6, we saw the linear regression model with multiple predictor variables. To overcome the limitations in testing regression coefficients, we introduce a new concept: the extra sum of squares.
Outline
- Extra sum of squares
- definition
- application 1 : Tests
- application 2 : coefficient of partial determination
- Standardized multiple regression model
- correlation transformation and standardized regression model
- multicollinearity issue
1. Extra sum of squares
definition
Recall the decomposition
$$SSTO = SSR + SSE$$
- $SSTO$ can be divided into $SSR$ and $SSE$.
- When we add predictor variables to the regression model, $SSE$ decreases while $SSR$ increases.

Using this idea, define the extra sum of squares
$$SSR(X_2 \mid X_1) = SSE(X_1) - SSE(X_1, X_2) = SSR(X_1, X_2) - SSR(X_1)$$
the marginal explanatory ability gained by adding $X_2$ to a model that already contains $X_1$.
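As a numerical check, the extra sum of squares can be computed directly as the drop in SSE when a predictor is added. This is a minimal sketch using numpy on simulated data; all variable names and coefficient values are made up for illustration.

```python
# Sketch: computing the extra sum of squares SSR(X2 | X1) numerically.
# The data-generating model here is invented for illustration.
import numpy as np

rng = np.random.default_rng(0)
n = 50
x1 = rng.normal(size=n)
x2 = rng.normal(size=n)
y = 1.0 + 2.0 * x1 + 0.5 * x2 + rng.normal(scale=0.3, size=n)

def sse(X, y):
    """Residual sum of squares of an OLS fit with intercept."""
    Xd = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(Xd, y, rcond=None)
    resid = y - Xd @ beta
    return resid @ resid

sse_x1 = sse(x1[:, None], y)                    # SSE(X1)
sse_x1x2 = sse(np.column_stack([x1, x2]), y)    # SSE(X1, X2)

# Extra sum of squares: reduction in SSE from adding X2 to the model with X1.
# It is always nonnegative, since adding a predictor cannot increase SSE.
ssr_x2_given_x1 = sse_x1 - sse_x1x2
print(ssr_x2_given_x1)
```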
application 1 : Tests
Test whether a single coefficient is zero, $H_0 : \beta_k = 0$
- Full model) $Y_i = \beta_0 + \beta_1 X_{i1} + \cdots + \beta_{p-1} X_{i,p-1} + \varepsilon_i$
- Reduced model) the same model with the term $\beta_k X_{ik}$ dropped
- test statistic:
$$F^* = \frac{SSR(X_k \mid X_1, \ldots, X_{k-1}, X_{k+1}, \ldots, X_{p-1}) / 1}{SSE(X_1, \ldots, X_{p-1}) / (n-p)} \sim F(1,\, n-p) \text{ under } H_0$$
Test whether several coefficients are zero, $H_0 : \beta_q = \cdots = \beta_{p-1} = 0$
- Full model) $Y_i = \beta_0 + \beta_1 X_{i1} + \cdots + \beta_{p-1} X_{i,p-1} + \varepsilon_i$
- Reduced model) $Y_i = \beta_0 + \beta_1 X_{i1} + \cdots + \beta_{q-1} X_{i,q-1} + \varepsilon_i$
- test statistic:
$$F^* = \frac{SSR(X_q, \ldots, X_{p-1} \mid X_1, \ldots, X_{q-1}) / (p-q)}{SSE(X_1, \ldots, X_{p-1}) / (n-p)} \sim F(p-q,\, n-p) \text{ under } H_0$$
Test whether the slopes of two predictors are equal, e.g. $H_0 : \beta_1 = \beta_2$
- Full model) $Y_i = \beta_0 + \beta_1 X_{i1} + \beta_2 X_{i2} + \cdots + \varepsilon_i$
- Reduced model) $Y_i = \beta_0 + \beta_c (X_{i1} + X_{i2}) + \cdots + \varepsilon_i$, with a common slope $\beta_c$
- test statistic: the general linear test
$$F^* = \frac{(SSE(R) - SSE(F)) / (df_R - df_F)}{SSE(F) / df_F} \sim F(df_R - df_F,\, df_F) \text{ under } H_0$$
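The general linear test above reduces every case to comparing the SSE of a full and a reduced model. Here is a minimal numpy sketch of that comparison, testing whether two coefficients are jointly zero; the simulated data and the choice of which predictors to drop are made up for illustration.

```python
# Sketch of the general linear (extra-sum-of-squares) F-test,
# testing H0: beta_2 = beta_3 = 0 on simulated data.
import numpy as np

rng = np.random.default_rng(1)
n = 100
X = rng.normal(size=(n, 3))                    # predictors X1, X2, X3
y = 1.0 + 2.0 * X[:, 0] + rng.normal(size=n)   # X2, X3 truly irrelevant here

def fit_sse(Xmat, y):
    """OLS with intercept; returns (SSE, error degrees of freedom)."""
    Xd = np.column_stack([np.ones(len(y)), Xmat])
    beta, *_ = np.linalg.lstsq(Xd, y, rcond=None)
    r = y - Xd @ beta
    return r @ r, len(y) - Xd.shape[1]

sse_f, df_f = fit_sse(X, y)           # full model: X1, X2, X3
sse_r, df_r = fit_sse(X[:, :1], y)    # reduced model: X1 only

# F* = [(SSE_R - SSE_F)/(df_R - df_F)] / [SSE_F / df_F]
f_star = ((sse_r - sse_f) / (df_r - df_f)) / (sse_f / df_f)
print(f_star)  # compare against the F(df_r - df_f, df_f) critical value
```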
application 2 : coefficient of partial determination
We can also express the marginal contribution of a predictor as a proportion, the coefficient of partial determination:
$$R^2_{Y2 \mid 1} = \frac{SSR(X_2 \mid X_1)}{SSE(X_1)}$$
the proportion of the variation in $Y$ not explained by $X_1$ that is explained by adding $X_2$.
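The coefficient of partial determination is a ratio of quantities we can already compute by OLS. A minimal sketch on invented data, assuming numpy:

```python
# Sketch: coefficient of partial determination R^2_{Y2|1} = SSR(X2|X1) / SSE(X1).
# Data-generating coefficients are made up for illustration.
import numpy as np

rng = np.random.default_rng(2)
n = 60
x1 = rng.normal(size=n)
x2 = rng.normal(size=n)
y = x1 + 0.8 * x2 + rng.normal(scale=0.5, size=n)

def sse(Xmat, y):
    """Residual sum of squares of an OLS fit with intercept."""
    Xd = np.column_stack([np.ones(len(y)), Xmat])
    beta, *_ = np.linalg.lstsq(Xd, y, rcond=None)
    r = y - Xd @ beta
    return r @ r

sse_1 = sse(x1[:, None], y)
sse_12 = sse(np.column_stack([x1, x2]), y)

# Fraction of the variation left over after X1 that X2 explains; lies in [0, 1].
r2_partial = (sse_1 - sse_12) / sse_1
print(r2_partial)
```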
2. Standardized multiple regression model
When the predictor variables are not scaled properly, it leads to round-off errors in computing $(X'X)^{-1}$ and makes the regression coefficients hard to compare across predictors.
correlation transformation and standardized regression model
correlation transform
$$Y_i^* = \frac{1}{\sqrt{n-1}} \left( \frac{Y_i - \bar{Y}}{s_Y} \right), \qquad X_{ik}^* = \frac{1}{\sqrt{n-1}} \left( \frac{X_{ik} - \bar{X}_k}{s_k} \right)$$
where $s_Y$ and $s_k$ are the sample standard deviations of $Y$ and $X_k$. When we use the transformed variables in place of $Y$ and $X_k$, we obtain the
standardized regression model
$$Y_i^* = \beta_1^* X_{i1}^* + \cdots + \beta_{p-1}^* X_{i,p-1}^* + \varepsilon_i^*$$
Note
There is no intercept term, and the original coefficients are recovered by $\beta_k = (s_Y / s_k)\, \beta_k^*$ and $\beta_0 = \bar{Y} - \beta_1 \bar{X}_1 - \cdots - \beta_{p-1} \bar{X}_{p-1}$.
Property of correlation matrix of $X^*$
$$X^{*\prime} X^* = r_{XX}, \qquad X^{*\prime} Y^* = r_{YX}$$
where $r_{XX}$ is the correlation matrix of the predictors and $r_{YX}$ is the vector of correlations between $Y$ and each predictor.
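The correlation transformation and the back-transformation of coefficients can be verified numerically. This is a sketch on simulated data, assuming numpy; the two-predictor setup and coefficient values are invented.

```python
# Sketch of the correlation transformation: each column is centered and scaled
# so that X*'X* is exactly the predictor correlation matrix, and the
# standardized model needs no intercept. Data are synthetic.
import numpy as np

rng = np.random.default_rng(3)
n = 80
X = rng.normal(size=(n, 2))
y = 3.0 + 1.5 * X[:, 0] - 2.0 * X[:, 1] + rng.normal(size=n)

def corr_transform(v):
    """(1/sqrt(n-1)) * (v - mean) / s, applied column-wise."""
    return (v - v.mean(axis=0)) / (np.sqrt(len(v) - 1) * v.std(axis=0, ddof=1))

Xs, ys = corr_transform(X), corr_transform(y)

# Standardized model has no intercept: y* = beta1* x1* + beta2* x2* + e*
beta_star, *_ = np.linalg.lstsq(Xs, ys, rcond=None)

# X*'X* equals the correlation matrix of the predictors
print(np.allclose(Xs.T @ Xs, np.corrcoef(X, rowvar=False)))  # True

# Back-transform beta_k = (s_Y / s_k) * beta_k* recovers the raw OLS slopes
beta_back = (y.std(ddof=1) / X.std(axis=0, ddof=1)) * beta_star
print(beta_back)
```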
multicollinearity issue
Why do we need to avoid multicollinearity among variables?
When we solve the normal equations
$$X'X \mathbf{b} = X'Y$$
near-linear dependence among the predictor columns makes the system highly singular (ill-conditioned). This leads to high round-off error during computation, resulting in severe error in $\mathbf{b}$.
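The ill-conditioning is easy to see via the condition number of $X'X$. A small sketch on simulated data, assuming numpy; the near-duplicate predictor is constructed deliberately:

```python
# Sketch: near-collinear predictors make X'X ill-conditioned, so solving the
# normal equations amplifies round-off error. Data are simulated.
import numpy as np

rng = np.random.default_rng(4)
n = 100
x1 = rng.normal(size=n)
x2 = x1 + 1e-6 * rng.normal(size=n)   # almost an exact copy of x1
X = np.column_stack([np.ones(n), x1, x2])

# A huge condition number signals a nearly singular system
cond = np.linalg.cond(X.T @ X)
print(cond)
```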
How do we know if the predictor variables are correlated among themselves?
- uncorrelated: $SSR(X_1 \mid X_2) = SSR(X_1)$ and $SSR(X_2 \mid X_1) = SSR(X_2)$, so each predictor's marginal contribution does not depend on the other.
- perfectly correlated: the least squares estimate $\mathbf{b}$ does not exist uniquely; infinitely many regression surfaces fit the data equally well.
- general effects of multicollinearity (high but imperfect correlation): the extra sums of squares are small, so the increase of explanation ability from adding a correlated predictor is not significant, and the estimated coefficients become unstable.
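The uncorrelated case can be checked numerically: with orthogonal centered predictors, the extra sum of squares of each predictor equals its marginal SSR. A sketch assuming numpy, with predictors deliberately constructed to be orthogonal:

```python
# Sketch: with uncorrelated (orthogonal, centered) predictors,
# SSR(X2 | X1) = SSR(X2). Sine/cosine columns are orthogonal by construction.
import numpy as np

n = 40
t = np.arange(n)
x1 = np.cos(2 * np.pi * t / n)
x2 = np.sin(2 * np.pi * t / n)   # orthogonal to x1, both centered
rng = np.random.default_rng(5)
y = 2.0 * x1 + 3.0 * x2 + rng.normal(scale=0.1, size=n)

def ssr(Xmat, y):
    """Regression sum of squares of an OLS fit with intercept."""
    Xd = np.column_stack([np.ones(len(y)), Xmat])
    beta, *_ = np.linalg.lstsq(Xd, y, rcond=None)
    fit = Xd @ beta
    return ((fit - y.mean()) ** 2).sum()

ssr_1 = ssr(x1[:, None], y)
ssr_2 = ssr(x2[:, None], y)
ssr_12 = ssr(np.column_stack([x1, x2]), y)

# Extra SS of X2 given X1 equals its marginal SSR for orthogonal predictors
print(np.isclose(ssr_12 - ssr_1, ssr_2))  # True
```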