Fully-Connected Neural Network
A Fully-Connected Neural Network is an Artificial Neural Network that is composed solely of Fully-Connected Neural Network Layers.
- AKA: FCNN, Fully-Connected NN, Fully-Connected Artificial Neural Network.
- …
- Example(s):
- a 2-Layer Fully-Connected Neural Network;
- a 3-Layer Fully-Connected Neural Network;
- a Multi Hidden-Layer Neural Network.
- Counter-Example(s):
- a Neural Network Layer;
- a Partially-Connected Neural Network;
- a Fully-Connected Mesh Network;
- a Convolutional-Pooling Neural Network (CNN).
- See: Artificial Neural Network, Neural Network Topology, Feed-Forward Neural Network, Perceptron.
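A minimal code sketch of the concept defined above (not taken from any of the cited references; the layer sizes, random initialization, and sigmoid activation are illustrative assumptions):

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class FullyConnectedNN:
    """Every layer is a fully-connected (dense) layer: each neuron is
    connected to every neuron in the adjacent layers."""
    def __init__(self, layer_sizes, seed=0):
        rng = np.random.default_rng(seed)
        # One full weight matrix and bias vector per pair of adjacent layers.
        self.weights = [rng.standard_normal((m, n)) * 0.1
                        for m, n in zip(layer_sizes[:-1], layer_sizes[1:])]
        self.biases = [np.zeros(n) for n in layer_sizes[1:]]

    def forward(self, x):
        # Feed-forward pass: affine transform plus nonlinearity at each layer.
        for W, b in zip(self.weights, self.biases):
            x = sigmoid(x @ W + b)
        return x

# A 3-layer fully-connected network: 4 inputs, 5 hidden neurons, 2 outputs.
net = FullyConnectedNN([4, 5, 2])
print(net.forward(np.ones(4)))
```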
References
2017a
- (Miikkulainen, 2017) ⇒ Miikkulainen R. (2017) "Topology of a Neural Network". In: Sammut, C., Webb, G.I. (eds) "Encyclopedia of Machine Learning and Data Mining". Springer, Boston, MA
- ABSTRACT: Topology of a neural network refers to the way the neurons are connected, and it is an important factor in how the network functions and learns. A common topology in unsupervised learning is a direct mapping of inputs to a collection of units that represents categories (e.g., Self-Organizing Maps). The most common topology in supervised learning is the fully connected, three-layer, feedforward network (see Backpropagation and Radial Basis Function Networks): All input values to the network are connected to all neurons in the hidden layer (hidden because they are not visible in the input or output), the outputs of the hidden neurons are connected to all neurons in the output layer, and the activations of the output neurons constitute the output of the whole network. Such networks are popular partly because they are known theoretically to be universal function approximators (with, e.g., a sigmoid or Gaussian nonlinearity in the hidden layer neurons), although networks with more layers may be easier to train in practice (e.g., Cascade-Correlation).
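A brief numpy sketch of the fully connected, three-layer, feedforward topology described in this abstract: every input connects to every hidden neuron, and every hidden neuron connects to every output neuron. The layer sizes, and the choice of a sigmoid hidden nonlinearity with a linear output, are assumptions made for illustration:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

rng = np.random.default_rng(0)
n_in, n_hidden, n_out = 3, 8, 2                    # assumed layer sizes

W_hidden = rng.standard_normal((n_in, n_hidden))   # input -> hidden: all-to-all
W_out = rng.standard_normal((n_hidden, n_out))     # hidden -> output: all-to-all

x = rng.standard_normal(n_in)                      # an arbitrary input vector
hidden = sigmoid(x @ W_hidden)                     # hidden activations (not visible in input or output)
output = hidden @ W_out                            # activations of the output neurons
print(output)
```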
2017b
- (CS231n, 2017) ⇒ http://cs231n.github.io/neural-networks-1/#layers Retrieved: 2017-12-31
- QUOTE: Neural Networks as neurons in graphs. Neural Networks are modeled as collections of neurons that are connected in an acyclic graph. In other words, the outputs of some neurons can become inputs to other neurons. Cycles are not allowed since that would imply an infinite loop in the forward pass of a network. Instead of an amorphous blobs of connected neurons, Neural Network models are often organized into distinct layers of neurons. For regular neural networks, the most common layer type is the fully-connected layer in which neurons between two adjacent layers are fully pairwise connected, but neurons within a single layer share no connections. Below are two example Neural Network topologies that use a stack of fully-connected layers:
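The acyclic, layer-by-layer structure described in this quote can be sketched as a stack of fully-connected layers; the 3-4-4-1 sizes and the ReLU nonlinearity below are illustrative choices, not taken from the cited page:

```python
import numpy as np

rng = np.random.default_rng(0)
layer_sizes = [3, 4, 4, 1]                 # input, two hidden layers, output
weights = [rng.standard_normal((m, n))
           for m, n in zip(layer_sizes[:-1], layer_sizes[1:])]

x = rng.standard_normal(3)
for i, W in enumerate(weights):
    # Neurons in adjacent layers are fully pairwise connected (a full matrix);
    # neurons within a single layer share no connections.
    x = x @ W
    if i < len(weights) - 1:               # nonlinearity on the hidden layers only
        x = np.maximum(0.0, x)
print(x)                                   # output of the whole network
```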
2017c
- (Wikipedia, 2017) ⇒ https://en.wikipedia.org/wiki/Network_topology#Fully_connected_network Retrieved: 2017-12-17.
- QUOTE: In a fully connected network, all nodes are interconnected. (In graph theory this is called a complete graph.) The simplest fully connected network is a two-node network. A fully connected network doesn't need to use packet switching or broadcasting, and a fault in one node does not trip or affect the other nodes in the network. However, since the number of connections grows quadratically with the number of nodes, [math]\displaystyle{ c = \frac{n(n-1)}{2}, \, }[/math] this kind of topology is impractical for large networks.
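A quick check of the quoted connection count (the quadratic growth is why a fully connected topology is impractical for large networks):

```python
def num_connections(n: int) -> int:
    # c = n * (n - 1) / 2 undirected links in a fully connected (complete) network
    return n * (n - 1) // 2

for n in (2, 10, 100, 1000):
    print(n, num_connections(n))   # 1, 45, 4950, 499500
```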
2016
- (Zhao, 2016) ⇒ Peng Zhao, 2016. "R for Deep Learning (I): Build Fully Connected Neural Network from Scratch".
- QUOTE: Fully connected neural network, called DNN in data science, is that adjacent network layers are fully connected to each other. Every neuron in the network is connected to every neuron in adjacent layers.
A very simple and typical neural network is shown below with 1 input layer, 2 hidden layers, and 1 output layer. Mostly, when researchers talk about network’s architecture, it refers to the configuration of DNN, such as how many layers in the network, how many neurons in each layer, what kind of activation, loss function, and regularization are used.
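Zhao's post builds the network in R; the following is a rough Python/numpy analogue of the architecture described in the quote (1 input layer, 2 hidden layers, 1 output layer). The specific sizes, the tanh activation, and the initialization are assumptions made for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)
sizes = [4, 6, 6, 3]                       # input, hidden 1, hidden 2, output
params = [(rng.standard_normal((m, n)) * 0.1, np.zeros(n))
          for m, n in zip(sizes[:-1], sizes[1:])]

def forward(x):
    *hidden_layers, output_layer = params
    for W, b in hidden_layers:             # every neuron connects to every neuron in the next layer
        x = np.tanh(x @ W + b)
    W, b = output_layer
    return x @ W + b                       # linear output layer

print(forward(np.ones(4)))
```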
1990
- (Hsu et al., 1990) ⇒ Hsu, K. Y., Li, H. Y., & Psaltis, D. (1990). Holographic implementation of a fully connected neural network. Proceedings of the IEEE, 78(10), 1637-1645 DOI: 10.1109/5.58357.
- ABSTRACT: A holographic implementation of a fully connected neural network is presented. This model has a simple structure and is relatively easy to implement, and its operating principles and characteristics can be extended to other types of networks, since any architecture can be considered as a fully connected network with some of its connections missing. The basic principles of the fully connected network are reviewed. The optical implementation of the network is presented. Experimental results which demonstrate its ability to recognize stored images are given, and its performance and analysis are discussed based on a proposed model for the system. Special attention is focused on the dynamics of the feedback loop and the tradeoff between distortion tolerance and image-recognition capability of the associative memory.
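The abstract's remark that any architecture can be considered a fully connected network with some of its connections missing can be illustrated by masking a full weight matrix; the sizes and mask below are arbitrary illustrative choices:

```python
import numpy as np

rng = np.random.default_rng(0)
W_full = rng.standard_normal((5, 5))       # fully connected: every one of the 5x5 links is present
mask = (rng.random((5, 5)) < 0.4)          # keep roughly 40% of the connections
W_sparse = W_full * mask                   # "fully connected with some connections missing"

x = rng.standard_normal(5)
print(x @ W_full)      # output under the complete connection pattern
print(x @ W_sparse)    # output under the restricted (masked) pattern
```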