Activation Functions in Neural Networks and Their Types

An activation function is a curve (such as sigmoid, tanh, or ReLU) applied to the output of every node in the network to map that output into a restricted range. For example, sigmoid maps any real-valued input to a value between 0 and 1, tanh maps it to a value between -1 and 1, and ReLU clips negative inputs to 0 while leaving positive inputs unchanged.
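As a minimal sketch, the three functions named above can be written with NumPy as follows; the array `z` stands in for the raw outputs of a layer's nodes and is purely illustrative:

```python
import numpy as np

def sigmoid(x):
    # Squashes any real input into the range (0, 1)
    return 1.0 / (1.0 + np.exp(-x))

def tanh(x):
    # Squashes any real input into the range (-1, 1)
    return np.tanh(x)

def relu(x):
    # Passes positive inputs through unchanged, maps negatives to 0
    return np.maximum(0.0, x)

# Apply each activation to the same sample of node outputs
z = np.array([-2.0, -0.5, 0.0, 0.5, 2.0])
print("sigmoid:", sigmoid(z))   # values in (0, 1)
print("tanh:   ", tanh(z))      # values in (-1, 1)
print("relu:   ", relu(z))      # negatives clipped to 0
```

Each function is applied element-wise, which matches how activations are used in practice: the same curve is applied independently to every node's output in a layer.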