Ridge Regression

Describing Ridge Regression

  • Ridge regression is a linear regression method that adds an L2 regularization penalty to the ordinary least squares (OLS) objective
  • Specifically, the L2 penalty is the squared magnitude (the sum of squares) of the regression coefficients, scaled by a tuning parameter λ
  • Ridge regression mitigates the problem of multicollinearity in regression, and in doing so mitigates overfitting
  • Ridge regression achieves this by shrinking the coefficients toward zero, though never exactly to zero (see the objective below)
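
Written out, the ridge estimator solves a penalized least-squares problem (standard notation: X is the design matrix, y the response vector, and λ ≥ 0 the tuning parameter):

```latex
\hat{\beta}^{\text{ridge}}
  = \arg\min_{\beta} \; \lVert y - X\beta \rVert_2^2
    + \lambda \lVert \beta \rVert_2^2
```

Setting λ = 0 recovers the OLS estimate; as λ grows, the coefficients are pulled ever closer to zero.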

Mathematics behind Ridge Regression

  • When predictors are highly correlated, the OLS loss surface can form a ridge in coefficient space: a long, nearly flat valley along which many different coefficient vectors fit the data equally well (or nearly as well)
  • Adding the penalty term (weighted by the tuning parameter λ) lifts this flat region, making the objective strictly convex, so a single, more stable coefficient estimate is selected (see the sketch after this list)
  • The penalized estimates are biased and aren't guaranteed to be better than OLS, but they often have lower variance, which makes ridge regression worth exploring alongside other regression techniques
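
As a minimal illustration of this stabilizing effect, the NumPy sketch below (synthetic collinear data; all variable names are illustrative) compares the OLS solution of the normal equations with the closed-form ridge solution β̂ = (XᵀX + λI)⁻¹Xᵀy:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic data with two nearly collinear predictors.
n = 100
x1 = rng.normal(size=n)
x2 = x1 + rng.normal(scale=0.01, size=n)  # x2 is almost a copy of x1
X = np.column_stack([x1, x2])
y = 3 * x1 + rng.normal(size=n)

# OLS: solve the normal equations (X'X) beta = X'y.
# X'X is nearly singular here, so the estimate is unstable.
beta_ols = np.linalg.solve(X.T @ X, X.T @ y)

# Ridge: add lambda * I to X'X before solving; the penalty
# makes the system well conditioned.
lam = 1.0
beta_ridge = np.linalg.solve(X.T @ X + lam * np.eye(X.shape[1]), X.T @ y)

print("condition number of X'X:", np.linalg.cond(X.T @ X))
print("OLS coefficients:  ", beta_ols)    # typically large, offsetting values
print("Ridge coefficients:", beta_ridge)  # moderate values of similar size
```

With nearly duplicate predictors, OLS tends to return large coefficients of opposite sign that cancel each other out, while ridge splits the effect more evenly between the two.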

Effects of Ridge Regression

  • The penalty reduces the correlation between parameter estimates, which mitigates the problem of multicollinearity
  • Parameter estimates won't be large in magnitude if the RSS for small parameters isn't much worse; this preference for small coefficients also mitigates overfitting (a shrinkage sketch follows this list)
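
To see the shrinkage directly, the sketch below (again with illustrative synthetic data) fits scikit-learn's Ridge over a range of penalty strengths; the coefficients shrink toward zero as the penalty grows but never reach exactly zero:

```python
import numpy as np
from sklearn.linear_model import Ridge

rng = np.random.default_rng(1)

# Synthetic data: 5 correlated predictors, only 2 truly matter.
n, p = 200, 5
X = rng.normal(size=(n, p))
X[:, 1] = X[:, 0] + rng.normal(scale=0.1, size=n)  # induce collinearity
y = 2 * X[:, 0] - 1.5 * X[:, 2] + rng.normal(size=n)

# Fit ridge regression for increasing penalty strengths.
# (scikit-learn calls the tuning parameter lambda "alpha".)
for alpha in [0.01, 1.0, 10.0, 100.0, 1000.0]:
    model = Ridge(alpha=alpha).fit(X, y)
    print(f"alpha={alpha:8.2f}  coefficients={np.round(model.coef_, 3)}")

# The coefficients shrink toward zero as alpha grows, but
# (unlike the lasso) none of them become exactly zero.
```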
