- [[neural network good practices]], [[activation function]], [[forward propagation]], [[back propagation]]
- [[dense layer]], [[convolutional layer]]
- [[TensorFlow]], [[PyTorch]]
- [[derivative]]
# Idea
Neural networks were originally developed by researchers trying to build algorithms that mimic the biological brain. They are also known as [[deep learning]] models; each unit in a network is called a neuron.
Using a trained neural network to make predictions is called **inference**; the computation that produces the prediction, layer by layer, is [[forward propagation]].
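A minimal sketch of forward propagation through dense layers in NumPy; the layer sizes, random weights, and sigmoid activation below are illustrative assumptions, not the course's exact example:

```python
import numpy as np

def sigmoid(z):
    return 1 / (1 + np.exp(-z))

def dense(a_in, W, b):
    # One dense layer: each column of W holds one neuron's weights.
    # Compute z = a_in @ W + b, then apply the activation element-wise.
    return sigmoid(a_in @ W + b)

# Toy 2-layer network (sizes and values chosen arbitrarily for illustration).
x = np.array([200.0, 17.0])            # input features
W1 = np.random.randn(2, 3) * 0.01      # layer 1: 3 neurons
b1 = np.zeros(3)
W2 = np.random.randn(3, 1) * 0.01      # layer 2 (output): 1 neuron
b2 = np.zeros(1)

a1 = dense(x, W1, b1)    # forward propagation through layer 1
a2 = dense(a1, W2, b2)   # a2 is the network's output, i.e. the inference
print(a2)
```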
Neural networks rely on [[back propagation]] over the [[computation graph]] to compute [[derivative|derivatives]], which [[gradient descent]] then uses to train the network.
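A sketch of how back propagation and gradient descent fit together, using [[PyTorch]] autograd to build the computation graph and compute the derivatives; the toy data, single linear unit, learning rate, and squared-error loss are illustrative assumptions:

```python
import torch

# Toy data: learn y = 2x (purely illustrative).
x = torch.tensor([[1.0], [2.0], [3.0]])
y = torch.tensor([[2.0], [4.0], [6.0]])

# A single linear unit with weight w and bias b.
w = torch.zeros(1, requires_grad=True)
b = torch.zeros(1, requires_grad=True)

lr = 0.05
for step in range(200):
    y_hat = x * w + b                   # forward propagation
    loss = ((y_hat - y) ** 2).mean()    # squared-error loss
    loss.backward()                     # back propagation: derivatives via the computation graph
    with torch.no_grad():               # gradient descent update
        w -= lr * w.grad
        b -= lr * b.grad
        w.grad.zero_()
        b.grad.zero_()

print(w.item(), b.item())  # w should approach 2, b should approach 0
```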
For notation, see [[neural network notation]].
![[20231224173536.png]]
![[20231224174021.png]]
![[20231224174056.png]]
![[20231226142806 1.png]]
# References
- [Example: Recognizing Images - Neural Networks | Coursera](https://www.coursera.org/learn/advanced-learning-algorithms/lecture/RCpEW/example-recognizing-images)
- [Neural network layer - Neural Networks | Coursera](https://www.coursera.org/learn/advanced-learning-algorithms/lecture/z5sks/neural-network-layer)
- [More complex neural networks - Neural Networks | Coursera](https://www.coursera.org/learn/advanced-learning-algorithms/lecture/a5AfY/more-complex-neural-networks)