Regularization via ridge regression and the lasso #1
An introduction and overview
Regularization is a method of fitting a model containing all $p$ predictors while constraining, or shrinking, the coefficient estimates towards zero. By shrinking the coefficient estimates, regularization can substantially reduce the model's variance and thereby lower its test error and improve predictive performance (the standard penalized objectives are sketched below). The two most commonly used...
[Read More]
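
For reference, the penalized least-squares objectives usually given for these two methods are shown below; this is a sketch in standard textbook notation, and the full post may use different symbols:

$$\hat{\beta}^{\text{ridge}} = \underset{\beta}{\arg\min}\;\sum_{i=1}^{n}\Big(y_i - \beta_0 - \sum_{j=1}^{p}\beta_j x_{ij}\Big)^2 + \lambda\sum_{j=1}^{p}\beta_j^2$$

$$\hat{\beta}^{\text{lasso}} = \underset{\beta}{\arg\min}\;\sum_{i=1}^{n}\Big(y_i - \beta_0 - \sum_{j=1}^{p}\beta_j x_{ij}\Big)^2 + \lambda\sum_{j=1}^{p}|\beta_j|$$

Here $\lambda \ge 0$ is a tuning parameter: larger values of $\lambda$ shrink the coefficients more aggressively, and the lasso's $\ell_1$ penalty can set some coefficients exactly to zero, performing variable selection.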