Heaviside Step Activation Function
A Heaviside Step Activation Function is a Neuron Activation Function based on the Linear Threshold Function.
- AKA: Binary Step Activation Function, Linear Threshold Activation Function.
- Context:
- It can (typically) be used in the activation of Heaviside Step Neurons.
- Example(s):
- Counter-Example(s):
- a Sigmoid Activation Function,
- a Rectified Linear Unit Activation Function,
- a Hyperbolic Tangent Activation Function.
- See: Artificial Neural Network, Artificial Neuron, Neural Network Topology, Neural Network Layer, Neural Network Learning Rate.
References
2017
- (Mate Labs, 2017) ⇒ Mate Labs (Aug 23, 2017). "Secret Sauce behind the beauty of Deep Learning: Beginners guide to Activation Functions"
- QUOTE: Heaviside (Binary step, 0 or 1, high or low) step function is typically only useful within single-layer perceptrons, an early type of neural networks that can be used for classification in cases where the input data is linearly separable. These functions are useful for binary classification tasks. The output is a certain value, A1, if the input sum is above a certain threshold and A0 if the input sum is below a certain threshold. The values used by the Perceptron were A1 = 1 and A0 = 0
[math]\displaystyle{ f(x) = \begin{cases} 0, & \mbox{for } x \lt 0 \\ 1, & \mbox{for } x \geq 0 \end{cases} }[/math]
Range: [math]\displaystyle{ \{0, 1\} }[/math]
Examples: [math]\displaystyle{ f(2) = 1,\; f(-4) = 0,\; f(0) = 1,\; f(1) = 1 }[/math]
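The quoted definition can be sketched directly in Python; the function name `heaviside_step` below is illustrative, not taken from the source:

```python
def heaviside_step(x: float) -> float:
    """Heaviside step activation: returns 1 for x >= 0, else 0."""
    return 1.0 if x >= 0 else 0.0

# Examples matching the definition above
print(heaviside_step(2))   # 1.0
print(heaviside_step(-4))  # 0.0
print(heaviside_step(0))   # 1.0 (the convention here maps x = 0 to 1)
```

Note that the convention for [math]\displaystyle{ f(0) }[/math] varies across texts; the sketch follows the definition quoted above, which assigns 1 at zero.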
2005
- (Golda, 2005) ⇒ Adam Golda (2005). "Introduction to neural networks"
- QUOTE:
Signal [math]\displaystyle{ \varphi }[/math] is processed by an activation function, which can take different shapes. If the function is linear, the output signal can be described as:
[math]\displaystyle{ y=k\varphi }[/math]
Neural networks described by the above formula are called linear neural networks. The other type of activation function is the threshold function:
[math]\displaystyle{ y_j=\begin{cases} 1, & \mbox{for } \varphi \gt \varphi_h \\ 0, & \mbox{for others}\end{cases} }[/math]
where [math]\displaystyle{ \varphi_h }[/math] is a given constant threshold value.
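Golda's threshold function differs from the Heaviside convention above in two ways: the threshold [math]\displaystyle{ \varphi_h }[/math] is an arbitrary constant rather than zero, and the inequality is strict. A minimal Python sketch of such a linear threshold unit follows; the function names are illustrative:

```python
def threshold_activation(phi: float, phi_h: float = 0.0) -> int:
    """Golda-style threshold: 1 when the signal phi strictly exceeds phi_h, else 0."""
    return 1 if phi > phi_h else 0

def linear_threshold_unit(inputs, weights, phi_h=0.0):
    """A linear threshold unit: weighted sum of inputs followed by thresholding."""
    phi = sum(w * x for w, x in zip(weights, inputs))
    return threshold_activation(phi, phi_h)

# A unit with weights (1, 1) and threshold 1.5 computes logical AND,
# a classic linearly separable classification task.
print(linear_threshold_unit([1, 1], [1, 1], phi_h=1.5))  # 1
print(linear_threshold_unit([1, 0], [1, 1], phi_h=1.5))  # 0
```

Because the inequality is strict, an input signal exactly equal to [math]\displaystyle{ \varphi_h }[/math] yields 0 here, unlike the Heaviside convention that maps the boundary to 1.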
1986
- (Williams, 1986) ⇒ Ronald J. Williams. (1986). “The Logic of Activation Functions.” In: (Rumelhart & McClelland, 1986).
- QUOTE: Example 1. [math]\displaystyle{ A = \{0,1\} }[/math] (the two-point set), [math]\displaystyle{ \alpha=f\circ g }[/math] where [math]\displaystyle{ g }[/math] is linear into [math]\displaystyle{ \mathbb{R} }[/math] and [math]\displaystyle{ f: \mathbb{R}\rightarrow A }[/math] is a thresholding function. (The operator [math]\displaystyle{ \circ }[/math] between two functions here denotes composition in a right-to-left manner.) A unit using this activation function is called a threshold logic unit or a linear threshold unit and is the basis of the simple perceptron (Rosenblatt, 1962; Minsky & Papert, 1969).