# Idea
The [[harmonic mean]] (more generally, a weighted harmonic mean) of the [[precision]] and [[recall]] scores, which addresses the problems with [[accuracy]] scores on imbalanced datasets. It ranges from 0 to 1, with 1 being the best F1 score.
$ F_{\beta} = (1 + \beta^2) \cdot \frac{precision \cdot recall}{\left( \beta^2 \cdot precision \right) + recall} $
When $\beta < 1$ (e.g. $\beta = 0.5$), we place more emphasis on precision; when $\beta > 1$, we place more emphasis on recall. The $\beta = 0.5$ case is called the $F_{0.5}$ score.
When $\beta = 1$, the equation reduces to the following:
$ F_1 = 2 \cdot \frac{precision \cdot recall}{precision + recall} $
The above equation is also known as the [[harmonic mean]] of precision and recall.
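The formula above is straightforward to compute directly. A minimal sketch in Python (the function name `f_beta` and the zero-division guard are my own choices, not from the source):

```python
def f_beta(precision: float, recall: float, beta: float = 1.0) -> float:
    """Weighted harmonic mean of precision and recall.

    beta < 1 weights precision more heavily; beta > 1 weights recall
    more heavily; beta = 1 gives the plain F1 score.
    """
    b2 = beta ** 2
    denominator = b2 * precision + recall
    if denominator == 0:
        # Both precision and recall are 0, so the score is defined as 0.
        return 0.0
    return (1 + b2) * precision * recall / denominator
```

In practice you would typically use a library implementation such as `sklearn.metrics.fbeta_score`, which computes the same quantity from predicted and true labels.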
![[20240111173839.png]]
# References
- [Trading off precision and recall - Advice for applying machine learning | Coursera](https://www.coursera.org/learn/advanced-learning-algorithms/lecture/42TEG/trading-off-precision-and-recall)