An activation function defines the output of a node based on its input.
Types of Activation Functions
1) Step Function
y = 1 if x >= 0
y = 0 if x < 0
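A minimal NumPy sketch of the step function (the function name and sample inputs are illustrative, not from a particular library):

import numpy as np

def step(x):
    # Output 1 where x >= 0, otherwise 0
    return np.where(x >= 0, 1, 0)

print(step(np.array([-2.0, 0.0, 3.5])))  # [0 1 1]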
2) Sigmoid Function
This function maps any real input to the range (0, 1) and is defined as
S(x) = 1 / (1 + e^(-x))
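A minimal NumPy sketch of the sigmoid function following the formula above (the sample inputs are illustrative):

import numpy as np

def sigmoid(x):
    # S(x) = 1 / (1 + e^(-x)); the output always lies in (0, 1)
    return 1.0 / (1.0 + np.exp(-x))

print(sigmoid(np.array([-1.0, 0.0, 1.0])))  # approx [0.269 0.5 0.731]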
3) ReLU Function
Rectified Linear Unit: it outputs the input directly if the input is positive; otherwise the output is zero.
y = max(0,x)
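A minimal NumPy sketch of ReLU (the sample inputs are illustrative):

import numpy as np

def relu(x):
    # max(0, x): passes positive inputs through, zeroes out negatives
    return np.maximum(0, x)

print(relu(np.array([-2.0, 0.0, 3.5])))  # [0.  0.  3.5]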
4) Leaky ReLU
Leaky Rectified Linear Unit: this activation function gives negative inputs a small slope a (e.g. a = 0.01) instead of a flat zero slope.
y = ax if x < 0
y = x if x >= 0
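A minimal NumPy sketch of Leaky ReLU; the default slope a = 0.01 is a common choice here, not something fixed by the definition:

import numpy as np

def leaky_relu(x, a=0.01):
    # Small slope a for negative inputs instead of a flat zero
    return np.where(x < 0, a * x, x)

print(leaky_relu(np.array([-2.0, 0.0, 3.5])))  # [-0.02  0.    3.5 ]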
5) ELU
Exponential Linear Unit
y = x if x > 0
y = a(e^x - 1) if x <= 0
If x is less than 0, the output is a small negative value that saturates at -a.
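A minimal NumPy sketch of ELU; a = 1.0 is used as an assumed default scale parameter for illustration:

import numpy as np

def elu(x, a=1.0):
    # x for positive inputs; a*(e^x - 1) otherwise, saturating at -a
    return np.where(x > 0, x, a * (np.exp(x) - 1))

print(elu(np.array([-2.0, 0.0, 3.5])))  # approx [-0.865  0.     3.5  ]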