The activation function plays a very important role in any neural network. As the name suggests, its job is to trigger something: it decides whether a neuron, or node, fires. In other words, the activation function determines whether the information from a node is forwarded to the next node.

Let us discuss the matter with the help of a real-life example. All of us have taken an exam at some point. Suppose the total marks are the neuron's input and the pass mark is the activation threshold. I can move from class one to class two only when I score more than the pass mark, i.e. cross the threshold. The same is true for neural networks: the activation function acts as a filter and determines whether the information from a node will be carried to the next node.

Those who have an idea about diodes or transistors may know that every n-p-n or p-n-p junction has a depletion layer whose job is to act as an insulator between the two regions. When a certain amount of voltage is applied, the junction becomes active and current flows through it; otherwise it remains inactive. The state in which current flows is called forward bias. The forward-bias threshold of this depletion layer behaves very much like an activation function.

If we were to represent what a neuron computes in its simplest form, we could express it as: the neuron takes a weighted sum of its inputs, adds a bias, and passes the result through the activation function –

A=f\left(\sum\left(w\ast v\right)+b\right)

Here A = output of the neuron, f = activation function, w = weight, v = input value and b = bias

That is, the activation function takes the weighted sum of the neuron's inputs plus the bias and decides what, if anything, the neuron passes on.
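As a concrete illustration, here is a minimal sketch of a single neuron in Python with NumPy. The input values, weights, and bias below are made-up numbers chosen only for illustration, and a simple step function stands in for f –

```python
import numpy as np

def step(z):
    """Step activation: fire (1) if the weighted sum crosses 0, else stay silent (0)."""
    return np.where(z > 0, 1, 0)

# Hypothetical inputs, weights, and bias, chosen only for illustration
v = np.array([0.5, -1.2, 3.0])   # input values
w = np.array([0.8, 0.1, 0.4])    # weights
b = -1.0                         # bias

z = np.sum(w * v) + b   # weighted sum plus bias (the pre-activation)
A = step(z)             # the activation function decides the output

print(f"pre-activation z = {z:.2f}, output A = {A}")
```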

Types of Activation Functions

Activation functions can be linear or non-linear, but linear functions are rarely used as activations. The reason is that a composition of linear functions is itself linear: no matter how many layers we stack, a network with linear activations collapses into a single linear transformation.
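A quick NumPy check makes this concrete. Stacking two layers with identity (linear) activations is equivalent to a single layer whose weight matrix is the product of the two; the matrices below are random placeholders –

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.standard_normal(4)        # an arbitrary input vector
W1 = rng.standard_normal((3, 4))  # first layer weights
W2 = rng.standard_normal((2, 3))  # second layer weights

# Two "layers" with a linear (identity) activation...
deep = W2 @ (W1 @ x)
# ...collapse into one linear layer:
shallow = (W2 @ W1) @ x

print(np.allclose(deep, shallow))  # True — depth bought us nothing
```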

Different activation functions suit different tasks, and which one to use depends on the problem at hand. For example, ReLU is a common default for hidden layers, while sigmoid and softmax are typical choices for output layers in classification.

Some common activation functions are listed below, with a small code sketch of each after the list –

Step Function

ReLU Function

Sigmoid Function

Tanh Function

Softmax Function
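As a rough sketch of what these look like, here are minimal NumPy versions of each. The exact forms (thresholds, numerical-stability tricks, and so on) vary between libraries –

```python
import numpy as np

def step(z):
    """Outputs 1 when z crosses 0, else 0."""
    return np.where(z > 0, 1, 0)

def relu(z):
    """Passes positive values through, zeroes out the rest."""
    return np.maximum(0, z)

def sigmoid(z):
    """Squashes any real number into (0, 1)."""
    return 1.0 / (1.0 + np.exp(-z))

def tanh(z):
    """Squashes any real number into (-1, 1)."""
    return np.tanh(z)

def softmax(z):
    """Turns a vector of scores into probabilities that sum to 1.
    Subtracting max(z) first keeps exp() from overflowing."""
    e = np.exp(z - np.max(z))
    return e / np.sum(e)

z = np.array([-2.0, 0.5, 3.0])
for f in (step, relu, sigmoid, tanh, softmax):
    print(f"{f.__name__:8s} -> {f(z)}")
```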
