The purpose of ridge regression
A good way to reduce overfitting is to regularize the model: the fewer degrees of freedom it has, the harder it will be for it to overfit the data… For a linear model, regularization is typically achieved by constraining the weights of the model. We will now look at Ridge Regression, Lasso Regression, and Elastic Net, which implement three different ways to constrain the weights.
https://stats.stackexchange.com/questions/603862/what-does-size-of-coefficients-have-to-do-with-multicollinearity-or-overfitting
Ridge regression can effectively reduce overfitting:
The ridge regression parameter $\lambda$ (the regularization strength) should be chosen by cross-validation or a similar method, to find the balance point at which the model neither overfits nor underfits.
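As a minimal sketch of that selection procedure (the data, candidate grid, and fold count below are made up for illustration), one can score each candidate $\lambda$ by k-fold cross-validation and keep the one with the lowest average validation error:

```python
import numpy as np

def ridge_fit(A, b, lam):
    """Closed-form ridge solution: (A^T A + lam*I)^{-1} A^T b."""
    n_features = A.shape[1]
    return np.linalg.solve(A.T @ A + lam * np.eye(n_features), A.T @ b)

def cv_mse(A, b, lam, k=5, seed=0):
    """Average validation MSE of ridge with strength lam over k folds."""
    rng = np.random.default_rng(seed)
    idx = rng.permutation(len(b))
    folds = np.array_split(idx, k)
    errs = []
    for i in range(k):
        val = folds[i]
        train = np.concatenate([folds[j] for j in range(k) if j != i])
        x = ridge_fit(A[train], b[train], lam)
        errs.append(np.mean((A[val] @ x - b[val]) ** 2))
    return float(np.mean(errs))

# Synthetic regression problem (illustrative only).
rng = np.random.default_rng(42)
A = rng.standard_normal((100, 10))
x_true = rng.standard_normal(10)
b = A @ x_true + 0.5 * rng.standard_normal(100)

# Hypothetical candidate grid; in practice it is usually log-spaced.
lams = [0.01, 0.1, 1.0, 10.0, 100.0]
best = min(lams, key=lambda lam: cv_mse(A, b, lam))
print("best lambda:", best)
```

In practice a library routine such as scikit-learn's `RidgeCV` automates the same loop; the manual version above just makes the mechanism explicit.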
In short, ridge regression is least-squares regression with an L2-norm penalty on the weights.
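Written out, "least squares with an L2 penalty" means adding $\lambda\|x\|_2^2$ to the least-squares objective, which shifts the closed-form solution by $\lambda I$:
$$\hat{x}=\arg\min_x \|Ax-b\|_2^2+\lambda\|x\|_2^2=(A^T A+\lambda I)^{-1}A^T b$$
For $\lambda>0$ the matrix $A^T A+\lambda I$ is always invertible, which is exactly why ridge regression works where plain least squares fails.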
https://www.zhihu.com/question/28221429/answer/51527607
Differences and connections between ridge regression and ordinary least squares
By ordinary least squares, the solution is:
$$\hat{x}=(A^T A)^{-1}A^T b$$
However, this formula has a precondition: the matrix $A$ must have full column rank:
$$\operatorname{rank}(A)=\dim(x)$$
Ridge regression can be understood as a refinement of ordinary least squares: it is least squares under L2-norm regularization. When the features are perfectly collinear (so that a solution exists at all), or extremely highly correlated (so that the solution is stable), ridge regression is the appropriate choice.
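The collinear case can be checked directly with a small numpy sketch (synthetic data, illustrative only): with two identical feature columns, $A^T A$ is singular, so the plain least-squares formula $(A^T A)^{-1}A^T b$ breaks down, while the ridge system $A^T A+\lambda I$ remains invertible:

```python
import numpy as np

rng = np.random.default_rng(0)
z = rng.standard_normal(50)
A = np.column_stack([z, z])            # second column duplicates the first
b = 3.0 * z + 0.1 * rng.standard_normal(50)

# rank(A^T A) = 1 < dim(x) = 2, so (A^T A)^{-1} does not exist.
print(np.linalg.matrix_rank(A.T @ A))  # → 1

# Ridge: A^T A + lam*I is invertible for any lam > 0.
lam = 1.0
x_ridge = np.linalg.solve(A.T @ A + lam * np.eye(2), A.T @ b)
print(x_ridge)  # finite coefficients; by symmetry the weight splits evenly
```

Note that ridge does not recover a unique "true" coefficient pair here (none exists when the columns are identical); it picks the minimum-norm compromise, splitting the weight equally between the two copies.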