What activation function is used in the case of two or more output variables in a neural network?

Activation functions in the neural network:

For two output variables (binary classification), see the sketch below:

  1. Hidden layers: ReLU
  2. Output layer: sigmoid

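A minimal Keras sketch of this setup. The input dimension and hidden layer size are illustrative assumptions, not part of the original answer:

```python
from tensorflow import keras
from tensorflow.keras import layers

# Assumed: 10 input features, one hidden layer of 16 units (both hypothetical).
binary_model = keras.Sequential([
    layers.Dense(16, activation="relu", input_shape=(10,)),  # hidden layer: ReLU
    layers.Dense(1, activation="sigmoid"),                   # output layer: sigmoid for two classes
])
binary_model.compile(optimizer="adam",
                     loss="binary_crossentropy",
                     metrics=["accuracy"])
```

With a sigmoid output, a single output unit is enough for two classes: it produces a probability for one class, and the other class gets the complement.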
For more than two output variables (multi-class classification), see the sketch after this list:

  1. Hidden layers: ReLU
  2. Output layer: softmax
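
A corresponding Keras sketch for the multi-class case. The class count, input dimension, and hidden layer size are again hypothetical:

```python
from tensorflow import keras
from tensorflow.keras import layers

num_classes = 5  # hypothetical number of output classes

# Assumed: 10 input features, one hidden layer of 16 units (both hypothetical).
multiclass_model = keras.Sequential([
    layers.Dense(16, activation="relu", input_shape=(10,)),  # hidden layers: ReLU
    layers.Dense(num_classes, activation="softmax"),         # output layer: softmax over all classes
])
multiclass_model.compile(optimizer="adam",
                         loss="categorical_crossentropy",
                         metrics=["accuracy"])
```

Softmax normalizes the outputs so they sum to 1, giving a probability distribution over all classes, which is why it is the usual choice when there are more than two output variables.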