Hyperbolic Tangent Neuron
A Hyperbolic Tangent Neuron is an artificial neuron that uses a Hyperbolic Tangent Activation Function.
- AKA: Hyperbolic Tangent Activation Unit, tanh Unit.
- Context:
- It can be mathematically described as
[math]\displaystyle{ y_j=\tanh(z_j)=\dfrac{2}{1+e^{-2z_j}} -1\quad \text{with} \quad z_j=\sum_{i=0}^nw_{ji}x_i+b \quad \text{for}\quad j=0,\cdots, p }[/math]
where [math]\displaystyle{ x_i }[/math] is the Neural Network Input vector, [math]\displaystyle{ y_j }[/math] is the Neural Network Output vector, [math]\displaystyle{ w_{ji} }[/math] are the Neural Network Weights, and [math]\displaystyle{ b }[/math] is the bias contributed by the Bias Neuron (see the illustrative code sketch after the list below).
- …
- Example(s):
- …
- Counter-Example(s):
- a Sigmoid Neuron.
- a Rectified Linear Neuron.
- See: Artificial Neural Network, Perceptron.
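Below is a minimal Python sketch of the forward pass defined above. The input values, weights, and bias are hypothetical, chosen only to illustrate the formula:

```python
import math

def tanh_neuron(x, w, b):
    """Forward pass of a tanh neuron: y = tanh(z), with z = sum_i w[i]*x[i] + b."""
    z = sum(w_i * x_i for w_i, x_i in zip(w, x)) + b
    return math.tanh(z)

# Hypothetical input vector, weights, and bias (illustrative values only).
x = [0.5, -1.0]
w = [0.8, 0.2]
b = 0.1
y = tanh_neuron(x, w, b)
print(y)  # ≈ 0.2913; the output always lies in the open interval (-1, 1)
```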
References
2017
- (Mate Labs, 2017) ⇒ Mate Labs (Aug 23, 2017). "Secret Sauce behind the beauty of Deep Learning: Beginners guide to Activation Functions".
- QUOTE: Hyperbolic tangent (TanH) — It looks like a scaled sigmoid function. Data is centered around zero, so the derivatives will be higher. Tanh converges faster than the sigmoid and logistic activation functions.
[math]\displaystyle{ f(x)=\tanh(x)=\dfrac{2}{1+e^{-2x}} -1 }[/math]
Range: [math]\displaystyle{ (-1, 1) }[/math]
Examples: [math]\displaystyle{ \tanh(2) = 0.9640,\; \tanh(-0.567) = -0.5131, \; \tanh(0) = 0 }[/math]
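The quoted identity and example values can be checked directly; the following is a minimal sketch using only Python's standard library:

```python
import math

def tanh_via_sigmoid(x):
    # The logistic form quoted above: tanh(x) = 2 / (1 + e^(-2x)) - 1
    return 2.0 / (1.0 + math.exp(-2.0 * x)) - 1.0

for x in (2.0, -0.567, 0.0):
    print(x, math.tanh(x), tanh_via_sigmoid(x))
# Both columns agree, e.g. tanh(2) ≈ 0.9640 and tanh(0) = 0 as quoted,
# and every output lies in the open interval (-1, 1).
```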