Pages that link to "Softmax-based Activation Function"
The following pages link to Softmax-based Activation Function:
Displayed 21 items.
- Neuron Activation Function (← links)
- Rectified Linear Unit (ReLU) Activation Function (← links)
- LogSoftmax Activation Function (← links)
- Log-Sigmoid Activation Function (← links)
- Tanhshrink Activation Function (← links)
- HardTanh Activation Function (← links)
- Softmin Activation Function (← links)
- Softsign Activation Function (← links)
- Softshrink Activation Function (← links)
- Adaptive Piecewise Linear Activation Function (← links)
- Bent Identity Activation Function (← links)
- Maxout Activation Function (← links)
- Hard-Sigmoid Activation Function (← links)
- Long Short-Term Memory Unit Activation Function (← links)
- S-LSTM Unit Activation Function (← links)
- Tree-LSTM Unit Activation Function (← links)
- Inverse Square Root Unit (ISRU) Activation Function (← links)
- Inverse Square Root Linear Unit (ISRLU) Activation Function (← links)
- Soft Exponential Activation Function (← links)
- Sinusoidal Activation Function (← links)
- Sinc Activation Function (← links)
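Several of the listed pages describe functions that are direct transformations of softmax. As an illustrative sketch only (NumPy assumed; function names are my own, not from any of the linked pages), softmin is softmax of the negated inputs, and LogSoftmax is the logarithm of softmax computed in a numerically stable way:

```python
import numpy as np

def softmax(x):
    # Subtract the max before exponentiating; this is the standard
    # numerical-stability trick and does not change the result.
    z = x - np.max(x)
    e = np.exp(z)
    return e / e.sum()

def log_softmax(x):
    # log(softmax(x)) computed directly, avoiding overflow in exp.
    z = x - np.max(x)
    return z - np.log(np.exp(z).sum())

def softmin(x):
    # Softmin is softmax applied to the negated inputs.
    return softmax(-x)
```

On any input vector, `softmax` returns a probability distribution (non-negative entries summing to 1), and `log_softmax(x)` agrees with `np.log(softmax(x))` while remaining stable for large inputs.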