[Intro to Deep Learning with PyTorch -- L2 -- N15] Softmax function
The Softmax Function
In the next video, we'll learn about the softmax function, the multiclass equivalent of the sigmoid activation function, used when a classification problem has 3 or more classes.
![](https://img2020.cnblogs.com/blog/364241/202006/364241-20200611024818098-1827407797.png)
![](https://img2020.cnblogs.com/blog/364241/202006/364241-20200611025029392-2075230950.png)
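Written out, for linear scores $z_1, \dots, z_n$ (one per class), the softmax probability of class $i$ is

$$P(\text{class } i) = \frac{e^{z_i}}{e^{z_1} + e^{z_2} + \cdots + e^{z_n}}$$

Note that when $n = 2$ this reduces to the sigmoid, since $\frac{e^{z_1}}{e^{z_1} + e^{z_2}} = \frac{1}{1 + e^{-(z_1 - z_2)}}$.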
```python
import numpy as np

def softmax(L):
    """Turn a list of scores L into a list of probabilities."""
    expL = np.exp(L)        # exponentiate each score
    sumExpL = sum(expL)     # sum of all the exponentials
    result = []
    for i in expL:
        result.append(i * 1.0 / sumExpL)  # normalize so the outputs sum to 1
    return result
```
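As a quick sanity check (the input scores below are arbitrary), the outputs should all be positive, sum to 1, and preserve the ordering of the scores. This sketch uses an equivalent vectorized one-liner:

```python
import numpy as np

def softmax(L):
    # Vectorized softmax: exponentiate, then divide by the sum
    expL = np.exp(L)
    return list(expL / expL.sum())

probs = softmax([2.0, 1.0, 0.1])  # arbitrary example scores
print(probs)
print(sum(probs))  # sums to 1 (up to floating-point error)
```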