Tanhshrink Activation Function
A Tanhshrink Activation Function is a Hyperbolic Tangent-based Activation Function that is defined as [math]\displaystyle{ f(x)=x-Tanh(x) }[/math].
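As a worked illustration: the function passes through the origin, is nearly flat for small inputs (since [math]Tanh(x)\approx x[/math] there), and approaches [math]x-1[/math] for large positive [math]x[/math] and [math]x+1[/math] for large negative [math]x[/math]. A minimal sketch, assuming only NumPy (the helper names below are illustrative, not from any library):
import numpy as np

def tanhshrink(x):
    # Tanhshrink: f(x) = x - tanh(x)
    return x - np.tanh(x)

def tanhshrink_grad(x):
    # Derivative: f'(x) = 1 - (1 - tanh(x)^2) = tanh(x)^2
    return np.tanh(x) ** 2

x = np.array([-3.0, -1.0, 0.0, 1.0, 3.0])
print(tanhshrink(x))       # approx. [-2.005, -0.238, 0., 0.238, 2.005]
print(tanhshrink_grad(x))  # approx. [ 0.990,  0.580, 0., 0.580, 0.990]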
- Context:
- It can (typically) be used in the activation of Tanhshrink Neurons.
- Example(s):
- torch.nn.Tanhshrink,
- …
- Counter-Example(s):
- a HardTanh Activation Function,
- a Rectified-based Activation Function,
- a Heaviside Step Activation Function,
- a Ramp Function-based Activation Function,
- a Softmax-based Activation Function,
- a Logistic Sigmoid-based Activation Function,
- a Gaussian-based Activation Function,
- a Softmin Activation Function,
- a Softsign Activation Function,
- a Softshrink Activation Function,
- an Adaptive Piecewise Linear Activation Function,
- a Bent Identity Activation Function,
- a Maxout Activation Function.
- See: Hyperbolic Tangent Function, Artificial Neural Network, Artificial Neuron, Neural Network Topology, Neural Network Layer, Neural Network Learning Rate.
References
2018
- (PyTorch, 2018) ⇒ http://pytorch.org/docs/master/nn.html#tanhshrink
- QUOTE:
class torch.nn.Tanhshrink
Applies element-wise, [math]\displaystyle{ Tanhshrink(x)=x−Tanh(x) }[/math]
Shape:
- Input: [math]\displaystyle{ (N,∗) }[/math] where ∗ means any number of additional dimensions.
- Output: [math]\displaystyle{ (N,∗) }[/math], same shape as the input.
- Examples:
>>> m = nn.Tanhshrink()
>>> input = autograd.Variable(torch.randn(2))
>>> print(input)
>>> print(m(input))
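Note that the quoted example follows the pre-0.4 PyTorch API, in which inputs were wrapped in autograd.Variable; in current PyTorch releases, Variable has been merged into torch.Tensor, so a minimal updated usage sketch (assuming a recent torch install) is:
import torch
import torch.nn as nn

m = nn.Tanhshrink()   # applies x - tanh(x) element-wise
x = torch.randn(2)    # random input tensor
print(x)
print(m(x))           # output has the same shape as the input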