The activation function calculates the output response of a neuron from its input. Different activation functions have different properties:
The Binary Step Activation Function is a threshold-based function that returns 0 if the input is less than zero and 1 otherwise.
import numpy as np
import matplotlib.pyplot as plt

# Binary Step Activation Function
def binaryStep(x):
    # np.heaviside(x, 1) returns 0 for x < 0 and 1 for x >= 0
    return np.heaviside(x, 1)

x = np.linspace(-10, 10)
plt.plot(x, binaryStep(x))
plt.axis('tight')
plt.title('Activation Function: Binary Step')
plt.show()
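A detail worth noting in the snippet above: the second argument to np.heaviside sets the output at exactly x = 0, so this implementation maps 0 to 1. A quick spot-check, continuing from the code above:
print(binaryStep(np.array([-2.0, 0.0, 3.0])))  # prints [0. 1. 1.]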
The Linear Activation Function returns the input unchanged. It is represented by f(x) = x.
import numpy as np
import matplotlib.pyplot as plt

# Linear Activation Function
def linear(x):
    return x

x = np.linspace(-10, 10)
plt.plot(x, linear(x))
plt.axis('tight')
plt.title('Activation Function: Linear')
plt.show()
The Sigmoid Activation Function, f(x) = 1 / (1 + e^(-x)), squashes its input to values between 0 and 1, which makes it suitable for binary classification problems.
import numpy as np
import matplotlib.pyplot as plt

# Sigmoid Activation Function
def sigmoid(x):
    return 1 / (1 + np.exp(-x))

x = np.linspace(-10, 10)
plt.plot(x, sigmoid(x))
plt.axis('tight')
plt.title('Activation Function: Sigmoid')
plt.show()
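In binary classification, the sigmoid output is commonly read as the probability of the positive class and thresholded at 0.5 (a convention, not a requirement); note that sigmoid(0) = 0.5 exactly. A short example reusing the sigmoid function above:
scores = np.array([-3.0, 0.0, 2.0])
probs = sigmoid(scores)
print(probs)         # approximately [0.047 0.5 0.881]
print(probs >= 0.5)  # [False  True  True]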
The Bipolar Sigmoid Activation Function, f(x) = 2 / (1 + e^(-x)) - 1, returns values between -1 and 1.
import numpy as np
import matplotlib.pyplot as plt

# Bipolar Sigmoid Activation Function
def bipolarSigmoid(x):
    return -1 + 2 / (1 + np.exp(-x))

x = np.linspace(-10, 10)
plt.plot(x, bipolarSigmoid(x))
plt.axis('tight')
plt.title('Activation Function: Bipolar Sigmoid')
plt.show()
The Hyperbolic Tangent Activation Function returns values between -1 and 1, similar to the bipolar sigmoid but with steeper gradients.
import numpy as np
import matplotlib.pyplot as plt

# Hyperbolic Tangent (tanh) Activation Function
def tanh(x):
    return np.tanh(x)

x = np.linspace(-10, 10)
plt.plot(x, tanh(x))
plt.axis('tight')
plt.title('Activation Function: Hyperbolic Tangent (tanh)')
plt.show()
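The relationship between the two curves can be checked numerically: the bipolar sigmoid satisfies bipolarSigmoid(x) = tanh(x/2), so tanh traces the same shape compressed into half the input range, which is exactly the steeper gradient noted above. A quick verification, assuming the bipolarSigmoid function defined earlier is in scope:
x = np.linspace(-10, 10)
print(np.allclose(bipolarSigmoid(x), np.tanh(x / 2)))  # True
print(np.allclose(np.tanh(x), bipolarSigmoid(2 * x)))  # True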
The ReLU Activation Function returns the input if it is positive and zero otherwise, i.e. f(x) = max(0, x).
import numpy as np
import matplotlib.pyplot as plt

# ReLU (Rectified Linear Unit) Activation Function
def ReLU(x):
    return np.maximum(0, x)

x = np.linspace(-10, 10)
plt.plot(x, ReLU(x))
plt.axis('tight')
plt.title('Activation Function: ReLU (Rectified Linear Unit)')
plt.show()
The Leaky ReLU Activation Function is similar to ReLU, but allows a small gradient when the input is negative.
import numpy as np
import matplotlib.pyplot as plt

# Leaky ReLU Activation Function
def leakyReLU(x):
    return np.maximum(0.01 * x, x)  # small slope (0.01) for negative inputs

x = np.linspace(-10, 10)
plt.plot(x, leakyReLU(x))
plt.axis('tight')
plt.title('Activation Function: Leaky ReLU')
plt.show()
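The difference between the two rectifiers appears only for negative inputs, where ReLU outputs exactly zero while Leaky ReLU passes a scaled-down copy of the input. A small comparison, assuming the ReLU and leakyReLU functions defined above are in scope:
vals = np.array([-100.0, -1.0, 0.0, 1.0])
print(ReLU(vals))       # [ 0.    0.    0.    1.  ]
print(leakyReLU(vals))  # [-1.   -0.01  0.    1.  ]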
The Softmax Activation Function is used in the output layer of neural network models for multi-class classification problems. Unlike the element-wise functions above, it operates on a whole vector of scores at once and returns a probability distribution over the classes.
import numpy as np
import matplotlib.pyplot as plt

# Softmax Activation Function
def softmax(x):
    exp_vals = np.exp(x - np.max(x, axis=0))  # subtract the maximum value for numerical stability
    return exp_vals / np.sum(exp_vals, axis=0)

# Softmax normalizes across the whole input vector, so the plotted curve
# shows the probability assigned to each of the sampled points
x = np.linspace(-10, 10)
plt.plot(x, softmax(x))
plt.axis('tight')
plt.title('Activation Function: Softmax')
plt.show()
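Because each output is an exponential divided by the sum of all the exponentials, the softmax outputs are positive and sum to 1, i.e. they form a valid probability distribution. A quick check using the softmax function above:
scores = np.array([1.0, 2.0, 3.0])
probs = softmax(scores)
print(probs)          # approximately [0.09  0.245 0.665]
print(np.sum(probs))  # 1.0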
The choice of activation function has a significant impact on the performance and behavior of neural networks. Each activation function has its own advantages and disadvantages, and the selection should be based on the specific requirements of the problem at hand.
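One way to make these trade-offs visible is to plot the functions on shared axes. The sketch below reuses the definitions from the snippets above and is just one possible layout for the comparison:
x = np.linspace(-10, 10, 200)
for fn, name in [(sigmoid, 'Sigmoid'), (bipolarSigmoid, 'Bipolar Sigmoid'),
                 (np.tanh, 'tanh'), (ReLU, 'ReLU'), (leakyReLU, 'Leaky ReLU')]:
    plt.plot(x, fn(x), label=name)
plt.legend()
plt.title('Activation Functions Compared')
plt.show()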
Reference: Neural Networks: A Comprehensive Foundation by Simon Haykin