Noisy Rectified Linear Activation Function
A Noisy Rectified Linear Activation Function is a Rectified-based Activation Function that adds Gaussian Noise, i.e. [math]\displaystyle{ f(x) = \max(0, x + Y) }[/math] with [math]\displaystyle{ Y \sim \mathcal{N}(0, \sigma(x)) }[/math].
- AKA: Noisy ReLU.
- Context:
- It can (typically) be used as the activation function of Gaussian Neurons.
- Example(s):
- …
- Counter-Example(s):
- a Clipped Rectifier Unit Activation Function,
- a Concatenated Rectified Linear Activation Function,
- an Exponential Linear Activation Function,
- a Leaky Rectified Linear Activation Function,
- a Parametric Rectified Linear Activation Function,
- a Randomized Leaky Rectified Linear Activation Function,
- a Scaled Exponential Linear Activation Function,
- a Softplus Activation Function,
- a S-shaped Rectified Linear Activation Function.
- See: Artificial Neural Network, Artificial Neuron, Neural Network Topology, Neural Network Layer, Neural Network Learning Rate, Restricted Boltzmann Machine, Gaussian Noise.
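Below is a minimal Python sketch of the definition above, not an implementation prescribed by the source: it treats [math]\sigma(x)[/math] as the standard deviation of the additive noise and assumes [math]\sigma(x) = \mathrm{sigmoid}(x)[/math] as one plausible input-dependent choice (in the spirit of the Nair & Hinton reference below); the names `noisy_relu` and `sigmoid` are hypothetical.

```python
import numpy as np

def sigmoid(x):
    # Logistic sigmoid; used here only as an assumed, illustrative choice
    # for the input-dependent noise scale sigma(x).
    return 1.0 / (1.0 + np.exp(-x))

def noisy_relu(x, rng=None):
    # Noisy ReLU: f(x) = max(0, x + Y) with Y ~ N(0, sigma(x)).
    # sigma(x) = sigmoid(x) is an assumption of this sketch; the noise
    # would normally be sampled only at training time.
    rng = np.random.default_rng() if rng is None else rng
    noise = rng.normal(loc=0.0, scale=sigmoid(x))
    return np.maximum(0.0, x + noise)

# Example usage with a fixed seed for reproducibility.
x = np.array([-2.0, -0.5, 0.0, 0.5, 2.0])
print(noisy_relu(x, rng=np.random.default_rng(0)))
```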
References
2018
- (Wikipedia, 2018) ⇒ https://en.wikipedia.org/wiki/Rectifier_(neural_networks)#Noisy_ReLUs Retrieved: 2018-02-10.
- Rectified linear units can be extended to include Gaussian noise, making them noisy ReLUs, giving[1]: [math]\displaystyle{ f(x) = \max(0, x + Y) }[/math], with [math]\displaystyle{ Y \sim \mathcal{N}(0, \sigma(x)) }[/math]. Noisy ReLUs have been used with some success in restricted Boltzmann machines for computer vision tasks.
- ↑ Vinod Nair and Geoffrey Hinton (2010). Rectified linear units improve restricted Boltzmann machines (PDF). ICML.
2016
- (Gulcehre et al., 2016) ⇒ Caglar Gulcehre, Marcin Moczulski, Misha Denil, and Yoshua Bengio (2016). "Noisy Activation Functions". In: International Conference on Machine Learning (PMLR).