- [[L1 regularization]], [[elastic net regularization]], [[regularization]], [[regularized logistic regression]], [[L1 vs L2 regularization]], [[coordinate descent]]
# Idea
When [[linear regression]] is combined with [[L2 regularization]], the resulting model is called **Ridge** regression. For other models, such as [[logistic regression]], we simply say L2.
Ridge regression is a special case of [[L2 regularization|Tikhonov regularization]].
Unlike L1, L2 doesn't perform feature selection: it shrinks coefficients toward zero but rarely sets any of them to exactly zero, so every feature stays in the model (see the sketch below).
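A minimal sketch of that contrast, assuming synthetic data and arbitrary `alpha` values (not from the original note): Lasso (L1) tends to zero out the uninformative coefficients, while Ridge (L2) only shrinks them.

```python
import numpy as np
from sklearn.datasets import make_regression
from sklearn.linear_model import Lasso, Ridge

# Synthetic data where only 5 of 20 features are informative
X, y = make_regression(n_samples=200, n_features=20, n_informative=5,
                       noise=10, random_state=0)

lasso = Lasso(alpha=1.0).fit(X, y)   # L1: tends to drive weak coefficients to exactly 0
ridge = Ridge(alpha=1.0).fit(X, y)   # L2: shrinks coefficients, almost never to exactly 0

print("Zero coefficients (L1):", np.sum(lasso.coef_ == 0))
print("Zero coefficients (L2):", np.sum(ridge.coef_ == 0))
```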
Note that by default, [[scikit-learn]]'s [[logistic regression]] uses [[L2 regularization]]. Smaller values of the `C` parameter mean stronger regularization, since `C` is the inverse of the regularization strength.
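A minimal sketch of that behaviour, assuming a synthetic binary classification problem (not from the original note): both models use the default `penalty='l2'`, and the smaller `C` produces smaller, more heavily regularized weights.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression

X, y = make_classification(n_samples=300, n_features=10, random_state=0)

# penalty='l2' is the default; C is the inverse of the regularization strength
weak_reg = LogisticRegression(C=100.0, max_iter=5000).fit(X, y)
strong_reg = LogisticRegression(C=0.01, max_iter=5000).fit(X, y)

print("||w|| with C=100 :", np.linalg.norm(weak_reg.coef_))
print("||w|| with C=0.01:", np.linalg.norm(strong_reg.coef_))  # smaller norm
```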
L2 regularization squares the coefficients, sums them, and adds the resulting penalty (scaled by a regularization strength) to the error term.
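As a generic form (assuming a loss $L$ over $n$ examples, $p$ coefficients $w_j$, and a regularization strength $\lambda$, which corresponds to $1/C$ in scikit-learn's logistic regression):

$$
J(\mathbf{w}) = \sum_{i=1}^{n} L\!\left(y_i, \hat{y}_i\right) + \lambda \sum_{j=1}^{p} w_j^2
$$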
![[s20220620_123202.png]]
![[Pasted image 20210125203758.png]]
# References
- [DataCamp: Linear Classifiers in Python – Logistic regression](https://campus.datacamp.com/courses/linear-classifiers-in-python/logistic-regression-3?ex=1)
- [Udacity (YouTube)](https://www.youtube.com/watch?v=PyFNIcsNma0&feature=emb_logo)