The softmax activation function is another kind of AF applied in neural networks, used to compute a probability distribution from a vector of real numbers. It produces outputs that each lie between 0 and 1, with the probabilities summing to 1.
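As a minimal sketch of this behavior, the following standalone function (the name `softmax` and the sample scores are illustrative, not from the original text) exponentiates each input and normalizes by the total, subtracting the maximum first for numerical stability:

```python
import math

def softmax(z):
    # Subtract the max before exponentiating; this avoids overflow
    # and does not change the result.
    m = max(z)
    exps = [math.exp(v - m) for v in z]
    total = sum(exps)
    return [e / total for e in exps]

scores = [2.0, 1.0, 0.1]
probs = softmax(scores)
# Each probability lies in (0, 1), and they sum to 1
# (up to floating-point rounding).
print(probs)
```

Note that larger input scores receive larger probabilities, which is why softmax is commonly used in the output layer of a classifier.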