- [[economics is constrained optimization|economics is constrained optimization]]
- [[machine-learning algorithms]], [[gradient-based optimization algorithms]], [[gradient descent]], [[genetic algorithms]]
- central to [[deep learning]]
- [[scipy.optimize.minimize]]
# Idea
Optimization is about finding the maximum or minimum values of a function. It lets us find the best solution to any problem that can be expressed as a mathematical function.
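A minimal sketch of this idea with [[scipy.optimize.minimize]], assuming a toy objective $f(x) = (x-3)^2 + 1$ whose minimum is known to be at $x = 3$:

```python
from scipy.optimize import minimize

# Toy objective: f(x) = (x - 3)^2 + 1, minimized at x = 3 with f(3) = 1.
def f(x):
    return (x[0] - 3) ** 2 + 1

result = minimize(f, x0=[0.0])  # start the search at x = 0
print(result.x[0], result.fun)  # approximately 3.0 and 1.0
```

`minimize` only minimizes; to *maximize* a function, minimize its negation.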
[[Pierre de Fermat]] was one of the first to solve optimization problems—before [[calculus]] and [[derivative|derivatives]] were invented. He used the [[double intersection]] method.
## Applications
Given that [[economics is constrained optimization]], optimality is the cornerstone of standard economic theory.
# References
- [youtube using scipy.optimize.minimize](https://www.youtube.com/watch?v=wS5D72wLez8)
- [youtube scipy beginners guide](https://www.youtube.com/watch?v=cXHvC_FGx24)
- https://campus.datacamp.com/courses/linear-classifiers-in-python/loss-functions?ex=4
- https://courses.kristakingmath.com/library/applications-of-derivatives-dc8b34ad/110495/path/step/57969616/
- [Optimizer Selection — Bumps 0.9.0 documentation](https://bumps.readthedocs.io/en/latest/guide/optimizer.html)
![[least squares or squared loss#Minimizing loss in Python with scipy optimize minimize]]