- [[population growth]] vs [[exponential growth]], [[fractional logistic regression]]
- [[chain rule]]: the derivative of the sigmoid function is super easy to calculate and hence it's used as an activation function in neural networks
- [[convolutional neural networks]]
- [[logistic regression]], [[logit]]
- [[Luce 1959]] and [[Luce choice rule]]
- [[machine learning]]: sigmoid function is used to map predictions to probabilities in machine learning
- [[cross-entropy loss or logistic loss]], [[first derivative of the sigmoid function]]
# Idea
The sigmoid (aka *logistic* or *inverse logit*) function was introduced in three papers by Pierre François Verhulst between 1838 and 1847, who used it to model population growth.
Today, it's one of the most popular [[activation function|activation functions]] used in artificial and biological neural networks. It takes any real-valued number ($-\infty$ to $\infty$) and maps it to a value between 0 and 1. It is also used in [[logistic regression]]. Its popularity in neural networks is due to its simple [[first derivative of the sigmoid function|first derivative]], which is necessary for [[back propagation]].
$f(x) = \frac{1}{1 + e^{-x}} = \frac{e^{x}}{e^{x} + 1}$
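That [[first derivative of the sigmoid function|first derivative]] follows from the [[chain rule]] and can be expressed in terms of the function's own output, which is what makes [[back propagation]] through a sigmoid so cheap:
$f'(x) = \frac{e^{-x}}{(1 + e^{-x})^{2}} = f(x)\left(1 - f(x)\right)$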
```functionplot
---
bounds: [-10, 10, 0, 1]
xLabel: x
yLabel: y
---
f(x) = 1 / (1 + exp(-x))
```
```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Numerically stable version that avoids overflow inside np.exp
# https://blog.dailydoseofds.com/p/sigmoid-and-softmax-are-not-implemented
def stable_sigmoid(z):
    if z > 0:
        a = np.exp(-z)  # z > 0, so exp(-z) <= 1: no overflow
        return 1.0 / (1.0 + a)
    else:
        a = np.exp(z)   # z <= 0, so exp(z) <= 1: no overflow
        return a / (1.0 + a)
```
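A quick sanity check (a sketch, assuming the stable variant is renamed `stable_sigmoid` as in the block above): for large negative inputs the naive version overflows inside `np.exp`, while the stable version stays clean.
```python
print(sigmoid(-1000))         # RuntimeWarning: overflow in exp; still returns 0.0
print(stable_sigmoid(-1000))  # 0.0, no warning
print(stable_sigmoid(1000))   # 1.0; exp(-1000) underflows harmlessly to 0
```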
## First-ever sigmoid figure
The first-ever figure of the sigmoid function, by Verhulst.
![[Pasted image 11.png]]
## Avoiding overflow errors in practical implementations
- [Sigmoid and Softmax Are Not Implemented the Way Most People Think](https://blog.dailydoseofds.com/p/sigmoid-and-softmax-are-not-implemented)
# References
- https://www.edn.com/how-the-sigmoid-function-is-used-in-ai/
- https://ml-cheatsheet.readthedocs.io/en/latest/logistic_regression.html
- https://en.wikipedia.org/wiki/Logit
- [Sigmoid and Softmax Are Not Implemented the Way Most People Think](https://blog.dailydoseofds.com/p/sigmoid-and-softmax-are-not-implemented)