Log-Sigmoid Activation Function
A Log-Sigmoid Activation Function is a Sigmoid-based Activation Function that computes the logarithm of a Sigmoid Function.
- Context:
- It can (typically) be used in the activation of LogSigmoid Neurons.
- Example(s):
- torch.nn.LogSigmoid.
- Counter-Example(s):
- a Hard-Sigmoid Activation Function,
- a Rectified-based Activation Function,
- a Heaviside Step Activation Function,
- a Ramp Function-based Activation Function,
- a Softmax-based Activation Function,
- a Hyperbolic Tangent-based Activation Function,
- a Gaussian-based Activation Function,
- a Softmin Activation Function,
- a Softsign Activation Function,
- a Softshrink Activation Function,
- an Adaptive Piecewise Linear Activation Function,
- a Bent Identity Activation Function,
- a Maxout Activation Function.
- See: Sigmoid Function, Artificial Neural Network, Artificial Neuron, Neural Network Topology, Neural Network Layer, Neural Network Learning Rate.
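The log-sigmoid of an input [math]\displaystyle{ x }[/math] is the natural logarithm of the logistic sigmoid, [math]\displaystyle{ \log\left(\dfrac{1}{1+\exp(-x)}\right) = -\log(1+\exp(-x)) }[/math]. A minimal Python sketch of this computation is given below (NumPy is assumed here purely for illustration); it uses the equivalent form min(x, 0) - log1p(exp(-|x|)), which avoids overflow for large negative inputs:
import numpy as np

def log_sigmoid(x):
    # Numerically stable log-sigmoid: log(1 / (1 + exp(-x))).
    # Uses the identity log(sigmoid(x)) = min(x, 0) - log1p(exp(-|x|)).
    x = np.asarray(x, dtype=float)
    return np.minimum(x, 0.0) - np.log1p(np.exp(-np.abs(x)))

print(log_sigmoid([-3.0, 0.0, 3.0]))  # approx. [-3.0486, -0.6931, -0.0486]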
References
2018
- (PyTorch, 2018) ⇒ http://pytorch.org/docs/master/nn.html#logsoftmax
- QUOTE:
class torch.nn.LogSigmoid
Applies element-wise:
[math]\displaystyle{ \text{LogSigmoid}(x)=\log\left(\dfrac{1}{1+\exp(-x)}\right) }[/math]
Shape:
- Input: [math]\displaystyle{ (N, *) }[/math] where * means any number of additional dimensions.
- Output: [math]\displaystyle{ (N, *) }[/math], same shape as the input.
- Examples:
>>> m = nn.LogSigmoid()
>>> input = autograd.Variable(torch.randn(2))
>>> print(input)
>>> print(m(input))
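The quoted snippet uses the older autograd.Variable wrapper; in more recent PyTorch releases, tensors are passed to the module directly. A minimal, self-contained sketch under that assumption (output values will differ from run to run because the input is random):
>>> import torch
>>> import torch.nn as nn
>>> m = nn.LogSigmoid()                       # module form of the activation
>>> x = torch.randn(2)                        # small random input tensor
>>> print(x)
>>> print(m(x))                               # element-wise log(1 / (1 + exp(-x)))
>>> print(torch.nn.functional.logsigmoid(x))  # functional form, same result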