- [[logistic regression]], [[linear classifiers]]
- [[machine-learning algorithms]]
- [[nonlinear support vector machines]]
# Idea
The SVM is one of the more popular [[linear classifiers]]. Its loss function is the [[hinge loss]], and the standard linear SVM also uses [[L2 regularization]].
Unlike [[logistic regression]] and other algorithms, not every sample influences the fitted model: only the [[support vectors]] determine the decision boundary, and the [[non-support vectors]] could be removed without changing it.
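For reference, a minimal sketch of the soft-margin linear SVM objective (hinge loss plus L2 penalty), with $C$ the inverse regularization strength; this standard formulation is assumed here, not quoted from the sources above:
$$
\min_{w,\,b} \; \frac{1}{2}\lVert w \rVert_2^2 + C \sum_{i=1}^{n} \max\bigl(0,\; 1 - y_i (w^\top x_i + b)\bigr), \qquad y_i \in \{-1, +1\}
$$
Samples with $y_i (w^\top x_i + b) \ge 1$ contribute zero loss; the support vectors are the samples on or inside the margin.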
It's good and common practice to standardize data by z-scoring before fitting SVMs [[Guggenmos 2018 multivariate pattern analysis for MEG comparison of dissimilarity measures|(Guggenmos et al., 2018)]].
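A minimal sketch of this workflow in scikit-learn (which the references below use), assuming the iris data purely as a stand-in dataset: z-score the features, fit a linear-kernel SVM, and inspect which samples ended up as support vectors.
```python
from sklearn.datasets import load_iris
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

X, y = load_iris(return_X_y=True)

# StandardScaler does the z-scoring recommended above;
# SVC with a linear kernel is the standard hinge-loss + L2 classifier.
pipe = make_pipeline(StandardScaler(), SVC(kernel="linear", C=1.0))
pipe.fit(X, y)

svc = pipe[-1]  # the fitted SVC step of the pipeline
print("support vectors per class:", svc.n_support_)
print("indices of support vectors:", svc.support_)
# Dropping the non-support vectors and refitting would give the same boundary.
```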
# References
- plotting classifiers: https://scikit-learn.org/stable/auto_examples/svm/plot_iris_svc.html
- https://campus.datacamp.com/courses/linear-classifiers-in-python/loss-functions?ex=7
- https://campus.datacamp.com/courses/linear-classifiers-in-python/support-vector-machines?ex=1
# Figures
- [[Grootswagers 2017 decoding dynamic brain patterns time series neuroimaging data tutorial]]
![[Pasted image 30.png|700]]