Ch 1: Linear Regression with One Predictor Variable
Outline
- Simple Regression Model
  - Formulation by Least Squares Estimation
  - Gauss-Markov theorem
  - Properties of linear regression model
  - Estimation of $\sigma^2$
- Normal Error Regression Model
  - Formulation by Maximum Likelihood Estimation
  - Advantages of Normal error regression model
1. Simple Regression Model
Formulation by Least Squares Estimation
Let

$$Y_i = \beta_0 + \beta_1 X_i + \varepsilon_i, \quad i = 1, \dots, n,$$

where

- $Y_i$ is the value of the response variable in the $i$-th trial,
- $X_i$ is the value of the predictor variable in the $i$-th trial (a known constant),
- $\beta_0$ and $\beta_1$ are parameters (regression coefficients), and
- $\varepsilon_i$ is a random error term with $E[\varepsilon_i] = 0$, $\mathrm{Var}(\varepsilon_i) = \sigma^2$, and $\mathrm{Cov}(\varepsilon_i, \varepsilon_j) = 0$ for all $i \neq j$.

There are various ways to draw a line which crosses the data points $(X_1, Y_1), \dots, (X_n, Y_n)$; least squares chooses the line minimizing the sum of squared vertical deviations.

Define a function

$$Q(\beta_0, \beta_1) = \sum_{i=1}^{n} (Y_i - \beta_0 - \beta_1 X_i)^2.$$

Note

- $Q$ is convex. Every convex function has a global minimum.
- $Q$ is differentiable over $\mathbb{R}^2$.

By differentiating the function $Q$ with respect to $\beta_0$ and $\beta_1$ and setting the derivatives to zero, we obtain the normal equations

$$\sum_{i=1}^{n} (Y_i - b_0 - b_1 X_i) = 0, \qquad \sum_{i=1}^{n} X_i (Y_i - b_0 - b_1 X_i) = 0.$$

The solution for the above simultaneous equations is

$$b_1 = \frac{\sum_{i=1}^{n} (X_i - \bar{X})(Y_i - \bar{Y})}{\sum_{i=1}^{n} (X_i - \bar{X})^2}, \qquad b_0 = \bar{Y} - b_1 \bar{X}.$$
Proof Solving the first normal equation gives $b_0 = \bar{Y} - b_1 \bar{X}$; substituting this into the second equation and rearranging yields the expression for $b_1$. Since $Q$ is convex, this stationary point is the global minimum.
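As a quick illustration (a minimal sketch with simulated data, not part of the original notebook; the true values $\beta_0 = 2$, $\beta_1 = 0.5$ are chosen arbitrarily), the closed-form solution can be computed directly in R and checked against `lm()`:

```r
# Sketch: least squares estimates from the closed-form solution,
# checked against lm(). The data are simulated for illustration.
set.seed(1)
x <- runif(50, 0, 10)
y <- 2 + 0.5 * x + rnorm(50, sd = 1)   # true beta0 = 2, beta1 = 0.5

b1 <- sum((x - mean(x)) * (y - mean(y))) / sum((x - mean(x))^2)
b0 <- mean(y) - b1 * mean(x)

c(b0 = b0, b1 = b1)
coef(lm(y ~ x))   # agrees with (b0, b1)
```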
Remark

- Once a dataset $(X_1, Y_1), \dots, (X_n, Y_n)$ is given, the $X_i$'s are known constants. Also, $\beta_0$ and $\beta_1$ are considered as constants.
- The error term $\varepsilon_i$ is a random variable. $Y_i$ is the sum of the constant $\beta_0 + \beta_1 X_i$ and the random $\varepsilon_i$. Therefore, $Y_i$ is also a random variable.
- Since $E[\varepsilon_i] = 0$ and $\mathrm{Var}(\varepsilon_i) = \sigma^2$,
  $E[Y_i] = \beta_0 + \beta_1 X_i$,
  $\mathrm{Var}(Y_i) = \sigma^2$, and
  $\mathrm{Cov}(Y_i, Y_j) = 0$ for $i \neq j$.
Gauss-Markov theorem

Statement Under the regression model, the least squares estimators $b_0$ and $b_1$ of the regression coefficients $\beta_0, \beta_1$ are

- unbiased (i.e. $E[b_0] = \beta_0$ and $E[b_1] = \beta_1$), and
- of minimum variance among all unbiased linear estimators (i.e. $\mathrm{Var}(b_k) \le \mathrm{Var}(\hat{\beta}_k)$ for every estimator $\hat{\beta}_k$ that is unbiased and linear in $Y_1, \dots, Y_n$, $k = 0, 1$).

In short, it is said that "$b_0$ and $b_1$ are the Best Linear Unbiased Estimators (BLUE)."

Proof: go to link
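The unbiasedness claim can also be seen empirically. The following sketch (an illustration, not a proof) reuses the simulated $x$ from the earlier snippet and the assumed true values $\beta_0 = 2$, $\beta_1 = 0.5$:

```r
# Monte Carlo check of unbiasedness: regenerate the errors many times
# and average the resulting slope estimates.
set.seed(2)
b1_reps <- replicate(2000, {
  y_rep <- 2 + 0.5 * x + rnorm(length(x), sd = 1)
  sum((x - mean(x)) * (y_rep - mean(y_rep))) / sum((x - mean(x))^2)
})
mean(b1_reps)   # approximately 0.5, the true beta1
```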
Properties of linear regression model

Notations

- Observation: $Y_i$
- Fitted value: $\hat{Y}_i = b_0 + b_1 X_i$
- Residual: $e_i = Y_i - \hat{Y}_i$

Properties

- $\sum_{i=1}^{n} e_i^2$ is the minimum by the choice of $b_0, b_1$.
- $\sum_{i=1}^{n} e_i = 0$
- $\sum_{i=1}^{n} X_i e_i = 0$ (residual is orthogonal to $X$)
- $\sum_{i=1}^{n} \hat{Y}_i e_i = 0$ (residual is orthogonal to the fitted line $\hat{Y}$)
- The regression line passes through $(\bar{X}, \bar{Y})$.
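These identities are easy to verify numerically. The sketch below continues the simulated example from earlier (so `x` and `y` are assumed to be in scope):

```r
# Verify the residual identities; all sums are zero up to floating-point error.
fit <- lm(y ~ x)
e   <- resid(fit)
c(sum_e      = sum(e),                  # sum of residuals = 0
  sum_xe     = sum(x * e),              # residuals orthogonal to X
  sum_yhat_e = sum(fitted(fit) * e))    # residuals orthogonal to fitted values
# The regression line passes through (mean(x), mean(y)):
all.equal(unname(predict(fit, newdata = data.frame(x = mean(x)))), mean(y))
```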
Estimation of $\sigma^2$

The residuals give the error sum of squares $SSE = \sum_{i=1}^{n} e_i^2 = \sum_{i=1}^{n} (Y_i - \hat{Y}_i)^2$, and the mean squared error

$$s^2 = MSE = \frac{SSE}{n - 2},$$

where the divisor $n - 2$ accounts for the two estimated parameters $b_0$ and $b_1$.

Observation

- $E[MSE] = \sigma^2$ ($MSE$ is an unbiased estimator of $\sigma^2$).
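Continuing the same example, a short sketch of the computation (using the `fit` object from the previous snippet):

```r
# MSE = SSE / (n - 2), compared with the value R reports:
# sigma(fit) is the residual standard error, so sigma(fit)^2 equals MSE.
n   <- length(y)
sse <- sum(resid(fit)^2)
mse <- sse / (n - 2)
c(mse = mse, sigma_sq = sigma(fit)^2)   # identical
```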
2. Normal Error Regression Model
Formulation by Maximum Likelihood Estimation
Assuming a normal distribution for the error terms, $\varepsilon_i \overset{\text{i.i.d.}}{\sim} N(0, \sigma^2)$.

Now the formulation is given as

$$Y_i = \beta_0 + \beta_1 X_i + \varepsilon_i, \quad \varepsilon_i \overset{\text{i.i.d.}}{\sim} N(0, \sigma^2),$$

so that $Y_i \sim N(\beta_0 + \beta_1 X_i, \sigma^2)$ independently.

We estimate $\beta_0$, $\beta_1$, and $\sigma^2$ by maximizing the likelihood of the observed data. Using the i.i.d. condition of the variables, the likelihood is

$$L(\beta_0, \beta_1, \sigma^2) = \prod_{i=1}^{n} \frac{1}{\sqrt{2\pi\sigma^2}} \exp\!\left(-\frac{(Y_i - \beta_0 - \beta_1 X_i)^2}{2\sigma^2}\right).$$

We maximize $L$, or equivalently $\log L$. For any fixed $\sigma^2$, maximizing $\log L$ over $(\beta_0, \beta_1)$ amounts to minimizing $\sum_{i=1}^{n} (Y_i - \beta_0 - \beta_1 X_i)^2$, so

$$\hat{\beta}_0 = b_0, \quad \hat{\beta}_1 = b_1 \quad \text{(same as LSE)}.$$

cf) The MLE of $\sigma^2$ is $\hat{\sigma}^2 = \frac{1}{n}\sum_{i=1}^{n} (Y_i - \hat{Y}_i)^2$, which divides by $n$ rather than $n - 2$ and is therefore biased; compare with the unbiased $MSE$ above.
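The equivalence can be checked numerically. The sketch below (assuming the simulated `x`, `y`, and `fit` from the earlier snippets) maximizes the log-likelihood with base R's `optim()` and compares the result with the least squares fit:

```r
# Numerical MLE for the normal error model, compared with lm().
negloglik <- function(par, x, y) {
  b0 <- par[1]; b1 <- par[2]; sigma <- exp(par[3])   # log-scale keeps sigma > 0
  -sum(dnorm(y, mean = b0 + b1 * x, sd = sigma, log = TRUE))
}
mle <- optim(c(0, 0, 0), negloglik, x = x, y = y)
rbind(MLE = c(mle$par[1:2], exp(mle$par[3])^2),
      LSE = c(coef(fit), sum(resid(fit)^2) / length(y)))
# The coefficient estimates match; the MLE variance divides SSE by n, not n - 2.
```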
Advantages of Normal error regression model
By introducing the normal distribution on the error terms, the sampling distributions of $b_0$ and $b_1$ become exactly normal, which enables interval estimation and hypothesis testing for the regression coefficients.
Here is the Jupyter notebook script to run several practice code examples using R.