Stacked Neural Network (SNN)
A Stacked Neural Network (SNN) is an Artificial Neural Network obtained by combining several artificial neural networks using stacked generalization.
- Example(s):
- Counter-Example(s):
- See: Encoder-Decoder Neural Network, Stacked Ensemble-based Learning Task, Attention Mechanism.
References
2016
- (Mohammadi & Das, 2016) ⇒ Milad Mohammadi, and Subhasis Das (2016). "SNN: Stacked Neural Networks". Preprint arXiv:1605.08512.
- QUOTE: Stacked Neural Networks (S-NN) is defined as a combination of publicly available neural network architectures whose features are extracted at an intermediate layer of the network, and then concatenated together to form a larger feature set. Figure 2 illustrates this idea in detail. The concatenated feature vector is used to train a classifier layer which consists of an optional dropout layer, an affine layer and an SVM loss function(...)
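A minimal sketch of the feature-concatenation idea described in the quote above, written in PyTorch. The two toy convolutional feature extractors, the layer sizes, and the use of MultiMarginLoss as an SVM-style hinge loss are illustrative assumptions, not the setup used in the paper.

```python
# Sketch of the S-NN idea: concatenate intermediate features from several
# networks and train a small classifier head (dropout + affine + SVM-style loss).
# Assumptions (not from the paper): two toy CNNs stand in for publicly
# available architectures; shapes and hyperparameters are illustrative only.
import torch
import torch.nn as nn

class ToyFeatureNet(nn.Module):
    """Stand-in for a pretrained network truncated at an intermediate layer."""
    def __init__(self, out_dim):
        super().__init__()
        self.body = nn.Sequential(
            nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
            nn.Linear(16, out_dim),
        )

    def forward(self, x):
        return self.body(x)

class SNNClassifier(nn.Module):
    """Classifier head over the concatenated feature vector."""
    def __init__(self, extractors, num_classes, dropout=0.5):
        super().__init__()
        self.extractors = nn.ModuleList(extractors)
        feat_dim = sum(e.body[-1].out_features for e in extractors)
        self.head = nn.Sequential(nn.Dropout(dropout), nn.Linear(feat_dim, num_classes))

    def forward(self, x):
        # Extract intermediate features from each network and concatenate them.
        feats = torch.cat([e(x) for e in self.extractors], dim=1)
        return self.head(feats)

model = SNNClassifier([ToyFeatureNet(32), ToyFeatureNet(64)], num_classes=10)
images = torch.randn(8, 3, 32, 32)                    # dummy image batch
labels = torch.randint(0, 10, (8,))
loss = nn.MultiMarginLoss()(model(images), labels)    # multiclass hinge (SVM-style) loss
loss.backward()
```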
1996
- (Sridhar et al., 1996) ⇒ D. V. Sridhar, R. C. Seagrave, and E. B. Bartlett (1996). "Process Modeling Using Stacked Neural Networks". AIChE Journal, 42(9), 2529-2539.
- QUOTE: When the candidate level 0 models are artificial neural networks and they are combined using stacked generalization, we define the resulting model as a stacked neural network (SNN). In this work, attention is restricted to level 0 models which are single layer backpropagation networks. Of course other level 0 models such as radial basis function networks, general regression networks and backpropagation networks with more than one hidden layer can be used. The methodology developed here to stack level 0 neural networks can easily be extended to include any level 0 model. The level 1 models considered in this paper are linear models. The proposed architecture for SNNs is shown in Figure 2.
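A minimal sketch of this stacked-generalization setup using scikit-learn as a stand-in implementation: the level 0 models are single-hidden-layer backpropagation networks (MLPRegressor) and the level 1 model is linear (LinearRegression). The dataset, hidden-layer sizes, and hyperparameters are illustrative assumptions.

```python
# Sketch of an SNN in the stacked-generalization sense described above:
# several level 0 neural networks whose predictions are combined by a
# level 1 linear model. Data and settings are illustrative only.
from sklearn.datasets import make_regression
from sklearn.ensemble import StackingRegressor
from sklearn.linear_model import LinearRegression
from sklearn.neural_network import MLPRegressor

X, y = make_regression(n_samples=300, n_features=5, noise=0.1, random_state=0)

# Level 0: single-hidden-layer networks with different hidden-layer sizes.
level0 = [
    (f"mlp_{h}", MLPRegressor(hidden_layer_sizes=(h,), max_iter=2000, random_state=0))
    for h in (5, 10, 20)
]

# Level 1: a linear model combines the level 0 predictions
# (cross-validated predictions are used internally, as in stacked generalization).
snn = StackingRegressor(estimators=level0, final_estimator=LinearRegression(), cv=5)
snn.fit(X, y)
print(snn.predict(X[:3]))
```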