Heaviside Step Neuron
A Heaviside Step Neuron is an artificial neuron that uses a Binary Step Activation Function.
- AKA: Binary Step Neuron, Binary Step Activation Unit, Heaviside Step Activation Unit, Threshold Logic Unit, Linear Threshold Unit.
- Context:
- It can be mathematically described as
[math]\displaystyle{ y_j=\begin{cases} 0, & \mbox{for } z_j \lt 0 \\1, & \mbox{for } z_j \geq 0 \end{cases} \text{with} \quad z_j=\sum_{i=0}^nw_{ji}x_i+b \quad \text{for}\quad j=0,\cdots, p }[/math]
where [math]\displaystyle{ x_i }[/math] are the elements of the Neural Network Input vector, [math]\displaystyle{ y_j }[/math] are the elements of the Neural Network Output vector, [math]\displaystyle{ w_{ji} }[/math] are the Neural Network Weights, and [math]\displaystyle{ b }[/math] is the Bias term.
- …
- Example(s):
- Counter-Example(s):
- See: Artificial Neural Network, Perceptron, Neural Network Activation Function.
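The definition above can be sketched as a short Python function (a minimal illustration, not from the source; the function name and the AND-gate weights are chosen for the example):

```python
def heaviside_step_neuron(x, w, b):
    """Output of a single Heaviside step neuron.

    Computes z = sum_i w_i * x_i + b, then applies the binary step
    activation: 1 if z >= 0, else 0.
    """
    z = sum(wi * xi for wi, xi in zip(w, x)) + b
    return 1 if z >= 0 else 0

# Illustrative weights and bias that make the neuron compute logical AND:
w = [1.0, 1.0]
b = -1.5
for x in [(0, 0), (0, 1), (1, 0), (1, 1)]:
    print(x, heaviside_step_neuron(x, w, b))
```

Because the step activation is piecewise constant, its derivative is zero almost everywhere, which is why such neurons cannot be trained with gradient-based methods and appear mainly in single-layer perceptrons.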
References
2017
- (Mate Labs, 2017) ⇒ Mate Labs. (2017, Aug 23). “Secret Sauce behind the beauty of Deep Learning: Beginners guide to Activation Functions.”
- QUOTE: Heaviside (Binary step, 0 or 1, high or low) step function is typically only useful within single-layer perceptrons, an early type of neural networks that can be used for classification in cases where the input data is linearly separable. These functions are useful for binary classification tasks. The output is a certain value, A1, if the input sum is above a certain threshold and A0 if the input sum is below a certain threshold. The values used by the Perceptron were A1 = 1 and A0 = 0
[math]\displaystyle{ f(x) = \begin{cases} 0, & \mbox{for } x \lt 0 \\ 1, & \mbox{for } x \geq 0 \end{cases} }[/math]
Range: [math]\displaystyle{ \{0 \text{ or } 1\} }[/math]
Examples: [math]\displaystyle{ f(2) = 1,\; f(-4) = 0,\; f(0) = 1,\; f(1) = 1 }[/math]
1986
- (Williams, 1986) ⇒ Ronald J. Williams. (1986). “The Logic of Activation Functions.” In: (Rumelhart & McClelland, 1986).
- QUOTE: Example 1. [math]\displaystyle{ A = \{0,1\} }[/math] (the two-point set), [math]\displaystyle{ \alpha=f\circ g }[/math] where [math]\displaystyle{ g }[/math] is linear into [math]\displaystyle{ \mathbb{R} }[/math] and [math]\displaystyle{ f: \mathbb{R}\rightarrow A }[/math] is a thresholding function. (The operator [math]\displaystyle{ \circ }[/math] between two functions here denotes composition in a right-to-left manner.) A unit using this activation function is called a threshold logic unit or a linear threshold unit and is the basis of the simple perceptron (Rosenblatt, 1962; Minsky & Papert, 1969).