Clipped Rectifier Unit Activation Function
A Clipped Rectifier Unit Activation Function is a Rectified-based Activation Function whose output is capped at a clipping value [math]\displaystyle{ z }[/math], i.e. [math]\displaystyle{ f(x,z)=\min(\max(0,x),z) }[/math] (a minimal NumPy sketch of this function is given just before the References section below).
- AKA: Clipped ReLU, Clipped Rectifier Unit Function.
- Context:
- It can (typically) be used in the activation of Clipped Rectifier Neurons.
- Example(s):
- chainer.functions.clipped_relu (see the 2018 reference below).
- Counter-Example(s):
- a Concatenated Rectified Linear Activation Function,
- an Exponential Linear Activation Function,
- a Leaky Rectified Linear Activation Function,
- a Noisy Rectified Linear Activation Function,
- a Parametric Rectified Linear Activation Function,
- a Randomized Leaky Rectified Linear Activation Function,
- a Scaled Exponential Linear Activation Function,
- a Softplus Activation Function,
- an S-shaped Rectified Linear Activation Function.
- See: Artificial Neural Network, Artificial Neuron, Neural Network Topology, Neural Network Layer, Neural Network Learning Rate.
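The clipped rectifier can be sketched directly in NumPy (a minimal, illustrative implementation, not the Chainer one; the function name clipped_relu and the default z=20.0 here simply mirror the reference below):

import numpy as np

def clipped_relu(x, z=20.0):
    # Element-wise min(max(0, x), z): negative inputs map to 0, inputs above z map to z.
    return np.minimum(np.maximum(x, 0.0), z)

x = np.array([-5.0, 0.5, 3.0, 42.0])
y = clipped_relu(x, z=10.0)  # -> [0.0, 0.5, 3.0, 10.0]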
References
2018
- (Chainer, 2018) ⇒ http://docs.chainer.org/en/stable/reference/generated/chainer.functions.clipped_relu.html Retrieved:2018-2-18
- QUOTE:
chainer.functions.clipped_relu(x, z=20.0)
Clipped Rectifier Unit function.
For a clipping value [math]\displaystyle{ z(\gt 0) }[/math], it computes
[math]\displaystyle{ ClippedReLU(x,z)=\min(\max(0,x),z) }[/math].
Parameters:
- x (Variable or numpy.ndarray or cupy.ndarray) – Input variable. A [math]\displaystyle{ (s_1,s_2,\cdots,s_N) }[/math]-shaped float array.
- z (float) – Clipping value. (default = 20.0)
- Returns: Output variable. A [math]\displaystyle{ (s_1,s_2,\cdots,s_N) }[/math]-shaped float array.
- Return type: Variable
- Example:
>>> x = np.random.uniform(-100, 100, (10, 20)).astype('f')
>>> z = 10.0
>>> np.any(x < 0)
True
>>> np.any(x > z)
True
>>> y = F.clipped_relu(x, z=z)
>>> np.any(y.data < 0)
False
>>> np.any(y.data > z)
False