Ch 5: Matrix Approaches to Simple Linear Regression
Linear functions can be expressed through matrix operations such as addition and multiplication. In this chapter, we formulate linear regression in matrix form. Linear algebra provides a great deal of intuition for interpreting linear regression models, and matrix notation makes the calculations easier to describe, even in generalized settings. We therefore briefly review some useful concepts from linear algebra and then write the simple linear regression model in matrix form.
Outline
- Review of linear algebra
  - linear independence and rank
  - quadratic form
  - projection (idempotent) matrix
- Simple linear regression in matrix form
  - notations
  - simple linear regression
  - ANOVA results
  - a sketch of generalized settings
1. Review of linear algebra
Linear independence and rank

A set of vectors $v_1, \dots, v_k$ is linearly independent if $c_1 v_1 + \cdots + c_k v_k = 0$ holds only when $c_1 = \cdots = c_k = 0$.

For a square matrix $A \in \mathbb{R}^{n \times n}$, the following are equivalent:

- The set of columns of $A$ is linearly independent.
- $Ax = b$ is consistent for any $b$ ($A$ is nonsingular).
- $Ax = b$ has a unique solution.
- $Ax = 0$ has only a trivial solution (the zero vector).
- The null space of $A$ is trivial.
- All eigenvalues of $A$ are nonzero.
- $A$ is invertible.

What about a general matrix $A \in \mathbb{R}^{m \times n}$? Its rank is the number of linearly independent columns (which equals the number of linearly independent rows), so $\operatorname{rank}(A) \le \min(m, n)$.
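The equivalences above can be checked numerically. A minimal numpy sketch (the matrices are made-up examples, not from the text):

```python
import numpy as np

# A full-rank 3x3 matrix: columns are linearly independent
A = np.array([[2.0, 0.0, 1.0],
              [0.0, 1.0, 0.0],
              [1.0, 0.0, 1.0]])

print(np.linalg.matrix_rank(A))           # 3 (full rank)
print(np.all(np.linalg.eigvals(A) != 0))  # all eigenvalues nonzero -> True
x = np.linalg.solve(A, np.array([1.0, 2.0, 3.0]))  # unique solution of Ax = b

# A rank-deficient matrix: third row = first row + second row
B = np.array([[1.0, 0.0, 1.0],
              [0.0, 1.0, 1.0],
              [1.0, 1.0, 2.0]])
print(np.linalg.matrix_rank(B))           # 2 < 3, so B is singular
```

Here `np.linalg.solve` succeeds exactly because $A$ is nonsingular; calling it on `B` would raise `LinAlgError`.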
Quadratic form

Let $A$ be a symmetric $n \times n$ matrix. The function $x \mapsto x^\top A x$ is called a quadratic form. $A$ is positive semidefinite if $x^\top A x \ge 0$ for all $x \in \mathbb{R}^n$. Moreover, if $x^\top A x > 0$ for all $x \ne 0$, then $A$ is positive definite. Equivalently, $A$ is positive (semi)definite when all of its eigenvalues are positive (nonnegative).
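A minimal numpy sketch of a quadratic form and the eigenvalue test for definiteness (the matrix is a made-up example):

```python
import numpy as np

# Symmetric matrix defining the quadratic form x^T A x
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

x = np.array([1.0, -1.0])
q = x @ A @ x                  # value of the quadratic form at x
eig = np.linalg.eigvalsh(A)    # eigenvalues of a symmetric matrix (ascending)

print(q)                       # the quadratic form is positive here
print(np.all(eig > 0))         # all eigenvalues > 0 -> A is positive definite
```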
Projection (idempotent) matrix

A matrix $P$ is idempotent if $P^2 = P$.

- Any eigenvalue of an idempotent matrix is 1 or 0.
- The trace (sum of diagonal entries) of an idempotent matrix equals its rank.
- If $P$ is idempotent, then $I - P$ is also idempotent. (Moreover, $P$ decomposes the vector space into two subspaces, $\mathbb{R}^n = \operatorname{range}(P) \oplus \operatorname{range}(I - P)$, where $\oplus$ denotes the direct sum. This concept is different from orthogonality; orthogonality is a stronger condition.)
- Any projection matrix except for $I$ is rank-deficient (not invertible).

When an idempotent matrix $P$ is also symmetric ($P^\top = P$), it is called an orthogonal projection matrix: it projects any vector onto its range space along the orthogonal complement.

Remark. For a matrix $X$ with full column rank, $P = X(X^\top X)^{-1}X^\top$ is symmetric and idempotent, hence an orthogonal projection onto the column space of $X$.
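These properties can be verified numerically. A minimal numpy sketch, building an orthogonal projection from a made-up full-column-rank matrix:

```python
import numpy as np

# Orthogonal projection onto the column space of a (hypothetical) matrix X
X = np.array([[1.0, 0.0],
              [1.0, 1.0],
              [1.0, 2.0]])
P = X @ np.linalg.inv(X.T @ X) @ X.T

print(np.allclose(P @ P, P))   # idempotent: P^2 = P
print(np.allclose(P.T, P))     # symmetric
print(np.trace(P))             # trace = rank = 2 (number of columns of X)

M = np.eye(3) - P
print(np.allclose(M @ M, M))   # I - P is also idempotent
```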
2. Simple linear regression in matrix form
Notations

Response vector $Y = (Y_1, \dots, Y_n)^\top$, error vector $\varepsilon = (\varepsilon_1, \dots, \varepsilon_n)^\top$, coefficient vector $\beta = (\beta_0, \beta_1)^\top$, and design matrix

$$X = \begin{pmatrix} 1 & X_1 \\ \vdots & \vdots \\ 1 & X_n \end{pmatrix}.$$

Note: $E(\varepsilon) = 0$ and $\operatorname{Var}(\varepsilon) = \sigma^2 I_n$, i.e., the errors are uncorrelated with constant variance.

Note: $X$ has full column rank (rank 2) as long as the $X_i$ are not all equal, so $X^\top X$ is invertible.
Simple linear regression

Least squares method, regression coefficients: the model $Y_i = \beta_0 + \beta_1 X_i + \varepsilon_i$, $i = 1, \dots, n$, can be written as $Y = X\beta + \varepsilon$, and the least squares estimator is $b = (b_0, b_1)^\top = (X^\top X)^{-1} X^\top Y$.

Properties of $b$: $E(b) = \beta$ and $\operatorname{Var}(b) = \sigma^2 (X^\top X)^{-1}$.

Objective function: $Q(\beta) = (Y - X\beta)^\top (Y - X\beta)$; setting its gradient to zero yields the normal equations $X^\top X b = X^\top Y$, whose solution is $b$ above.

Regression line: $\hat{Y} = Xb = X(X^\top X)^{-1}X^\top Y = HY$, where $H = X(X^\top X)^{-1}X^\top$ is called the hat matrix.
Properties of the hat matrix: $H^\top = H$ (symmetric) and $H^2 = H$ (idempotent), so $H$ is an orthogonal projection matrix. We can interpret the regression line as an orthogonal projection of $Y$ onto the range (column) space of $X$: $\hat{Y} = HY \in \mathcal{C}(X)$.
Residuals: $e = Y - \hat{Y} = (I - H)Y$.

Property of the residual:

- Since $H$ is an orthogonal projection matrix, so is $I - H$. We can interpret the residual as an orthogonal projection of $Y$ onto the space orthogonal to $\mathcal{C}(X)$: $X^\top e = 0$ and $\hat{Y}^\top e = 0$.
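The fit and the two orthogonal projections can be verified numerically. A minimal numpy sketch using simulated (hypothetical) data:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 20
x = np.linspace(0.0, 1.0, n)
Y = 1.0 + 2.0 * x + 0.1 * rng.standard_normal(n)  # made-up data

X = np.column_stack([np.ones(n), x])      # design matrix [1, x]
b = np.linalg.solve(X.T @ X, X.T @ Y)     # normal equations: b = (X'X)^{-1} X'Y
H = X @ np.linalg.inv(X.T @ X) @ X.T      # hat matrix
Y_hat = H @ Y                             # fitted values: projection onto C(X)
e = Y - Y_hat                             # residuals: (I - H) Y

print(np.allclose(X.T @ e, 0))            # residuals orthogonal to columns of X
print(np.allclose(Y_hat @ e, 0))          # and orthogonal to the fitted values
```

Solving the normal equations with `np.linalg.solve` is preferred over forming the inverse explicitly; the inverse is formed here only to display $H$.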
Mean response at a new level $X_h$: with $x_h = (1, X_h)^\top$, the estimated mean response is $\hat{Y}_h = x_h^\top b$, and $\operatorname{Var}(\hat{Y}_h) = \sigma^2\, x_h^\top (X^\top X)^{-1} x_h$.
ANOVA results

In matrix form, with $J$ the $n \times n$ matrix of all ones, the sums of squares are quadratic forms in $Y$:

$$SSTO = Y^\top \Big(I - \tfrac{1}{n}J\Big) Y, \qquad SSE = e^\top e = Y^\top (I - H) Y, \qquad SSR = Y^\top \Big(H - \tfrac{1}{n}J\Big) Y,$$

and the decomposition $SSTO = SSR + SSE$ holds.
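A minimal numpy sketch that computes the ANOVA sums of squares as quadratic forms in $Y$ and checks the decomposition $SSTO = SSR + SSE$, using simulated (hypothetical) data:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 30
x = rng.uniform(0.0, 10.0, n)
Y = 3.0 + 0.5 * x + rng.standard_normal(n)  # made-up data

X = np.column_stack([np.ones(n), x])
H = X @ np.linalg.inv(X.T @ X) @ X.T        # hat matrix
J = np.ones((n, n))                         # matrix of all ones

SSTO = Y @ (np.eye(n) - J / n) @ Y          # total sum of squares
SSE  = Y @ (np.eye(n) - H) @ Y              # error sum of squares
SSR  = Y @ (H - J / n) @ Y                  # regression sum of squares
print(np.isclose(SSTO, SSE + SSR))          # decomposition holds -> True
```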
A sketch of generalized settings

- In applications we use the normal error regression model, which assumes the errors are normally distributed. The matrix form gives an explicit expression: $\varepsilon \sim N(0, \sigma^2 I_n)$, so $Y \sim N(X\beta, \sigma^2 I_n)$.
- The matrix formulation lets us generalize the current regression model with one predictor variable to multiple predictors. In most cases, the only difference is adding more columns to the design matrix $X$ and more entries to the coefficient vector $\beta$; the formulas for $b$, $H$, and $e$ are unchanged. When we use the normal error regression model for multiple regression, the ANOVA decomposition carries over analogously.
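The "just add columns" point can be illustrated with a small simulation (hypothetical data; the estimator formula is exactly the same as in the simple case):

```python
import numpy as np

rng = np.random.default_rng(2)
n, p = 50, 3                              # hypothetical: three predictors
Z = rng.standard_normal((n, p))
beta_true = np.array([1.0, -2.0, 0.5, 3.0])

X = np.column_stack([np.ones(n), Z])      # same design-matrix recipe, more columns
Y = X @ beta_true + 0.1 * rng.standard_normal(n)

b = np.linalg.solve(X.T @ X, X.T @ Y)     # identical formula: b = (X'X)^{-1} X'Y
print(b)                                  # close to beta_true
```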