- [[information theory]]
# Idea
Because computing [[conceptual definitions of entropy]] involves multiplying many probabilities, the product quickly becomes vanishingly small and hard to work with. To avoid this, we apply the [[logarithm product rule]] and sum logarithms instead of multiplying probabilities.
![[Pasted image 20210127004142.png]]
Note: in the first row of the table above, $P(\text{blue})$ should be 0.
Because the logarithm of a number less than 1 is negative, we negate the whole expression to make it positive. We divide by 4 in the last column because [[conceptual definitions of entropy]] is the average of the negative logarithms of the probabilities. This table also shows [[how to calculate entropy]].
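The idea above can be sketched in a few lines of Python. The probabilities below are hypothetical example values, not the ones in the table; the point is that summing logs gives the same result as taking the log of the product, and that averaging the negated logs yields the entropy-style quantity described above.

```python
import math

# Hypothetical probabilities of four observed outcomes
probs = [0.5, 0.25, 0.125, 0.125]

# Multiplying many probabilities yields a very small number
product = math.prod(probs)

# Logarithm product rule: log(a * b) = log(a) + log(b),
# so we can sum the logs instead of multiplying the probabilities
log_sum = sum(math.log2(p) for p in probs)
assert math.isclose(log_sum, math.log2(product))

# Negate and average over the 4 outcomes, as in the last column
entropy_like = -log_sum / len(probs)
```

Summing logarithms keeps the numbers in a comfortable range (here $-9$ instead of $\approx 0.002$), which is why the log form is preferred in practice.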
# References
- [udacity](https://www.youtube.com/watch?v=w73JTBVeyjE&feature=emb_logo)