- [[matrix]]
# Idea
A single number that reflects the **commonalities** (or similarities) between two mathematical objects (e.g., vectors, matrices, tensors, signals, images).
The dot product of two vectors $\boldsymbol{a}=\left[a_{1}, a_{2}, \ldots, a_{n}\right]$ and $\boldsymbol{b}=\left[b_{1}, b_{2}, \ldots, b_{n}\right]$ is defined as:
$
\boldsymbol{a} \cdot \boldsymbol{b}=\sum_{i=1}^{n} a_{i} b_{i}=a_{1} b_{1}+a_{2} b_{2}+\cdots+a_{n} b_{n}
$
Equivalently, as a matrix product (shown here for $n=4$), it is a row vector times a column vector:
$
\boldsymbol{a} \cdot \boldsymbol{b}=\boldsymbol{a}^{\top} \boldsymbol{b}=\left[\begin{array}{llll}
a_1 & a_2 & a_3 & a_4
\end{array}\right]\left[\begin{array}{l}
b_1 \\
b_2 \\
b_3 \\
b_4
\end{array}\right]=a_1 b_1+a_2 b_2+a_3 b_3+a_4 b_4=c
$
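A minimal sketch of this definition in NumPy (the vectors `a` and `b` are arbitrary examples): the explicit loop mirrors the summation term by term, and `np.dot` returns the same scalar in vectorized form.
```python
import numpy as np

# Example vectors (arbitrary values)
a = np.array([1.0, 2.0, 3.0, 4.0])
b = np.array([5.0, 6.0, 7.0, 8.0])

# Dot product as an explicit sum of element-wise products
c_loop = 0.0
for i in range(a.shape[0]):
    c_loop += a[i] * b[i]      # a_i * b_i, accumulated over i = 1..n

c_np = np.dot(a, b)            # same scalar, computed in vectorized form

print(c_loop, c_np)            # 70.0 70.0
```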
It is one of the most fundamental vector/matrix operations and underlies many of the computations in linear algebra. Examples (a sketch for the last item follows the list):
- [[variance-covariance matrix]]
- [[linear combinations]]
- [[matrix multiplication]]
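A small illustration of the matrix-multiplication case (the matrices `A` and `B` are arbitrary examples): every entry $(i, j)$ of a matrix product is the dot product of row $i$ of the first matrix with column $j$ of the second.
```python
import numpy as np

# Example matrices (arbitrary values)
A = np.array([[1.0, 2.0],
              [3.0, 4.0]])
B = np.array([[5.0, 6.0],
              [7.0, 8.0]])

# Build the product entry by entry: each entry is a row . column dot product
C_manual = np.empty((2, 2))
for i in range(2):
    for j in range(2):
        C_manual[i, j] = np.dot(A[i, :], B[:, j])

print(np.allclose(C_manual, A @ B))    # True
```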
Note that [[dot product is covariance]].
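A quick numerical check of that connection, assuming the linked note refers to the sample covariance of mean-centered vectors: the dot product of the centered vectors divided by $n-1$ matches NumPy's `np.cov`.
```python
import numpy as np

# Assumption: "dot product is covariance" means
# cov(x, y) = (x - mean(x)) . (y - mean(y)) / (n - 1)
rng = np.random.default_rng(0)
x = rng.normal(size=100)
y = rng.normal(size=100)

xc, yc = x - x.mean(), y - y.mean()        # mean-center both vectors
cov_dot = np.dot(xc, yc) / (len(x) - 1)    # covariance via a dot product
cov_np = np.cov(x, y)[0, 1]                # NumPy's sample covariance

print(np.isclose(cov_dot, cov_np))         # True
```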
![[C1_W2_Lab04_dot_notrans.gif]]
# References
- [Dot product - Wikipedia](https://en.wikipedia.org/wiki/Dot_product)