Perceptron Function
A Perceptron Function is a linear classification function composed of a summation function and an activation function (see the minimal code sketch below the See list).
- Context:
- It can range from being an Abstract Perceptron Function to being a Software Perceptron Function.
- It can range from being a Fixed-Parameter Perceptron Function to being a Free-Parameter Perceptron Function.
- It can range from (typically) being a Binary Perceptron Classifier to being a Multiclass Perceptron Classifier.
- It can be trained by a Perceptron Training System (that applies a perceptron algorithm to solve a perceptron training task).
- Example(s):
- Counter-Example(s):
- a Linear Network.
- a Multi-layer Neural Network, such as a Multi-Layer Perceptron.
- See: Single-layer Neural Network, Linear Classifier Model, Linear Predictor Function, Feature Vector, Online Algorithm, Artificial Neural Network.
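As a hedged illustration of the definition above, the following minimal Python sketch composes a summation function (a weighted sum) with a step activation function. The function name, weights, and bias values are illustrative assumptions, not drawn from the references below.

```python
# Minimal sketch of a perceptron function: a weighted sum of the inputs
# (summation function) followed by a step activation function.
# All names and values here are illustrative assumptions.

def perceptron(inputs, weights, bias):
    """Return 1 if the weighted sum of the inputs plus bias exceeds 0, else 0."""
    summation = sum(w * x for w, x in zip(weights, inputs))  # summation function
    return 1 if summation + bias > 0 else 0                  # step activation

# Usage example: fixed parameters that make the perceptron act as a logical AND gate.
and_weights, and_bias = [1.0, 1.0], -1.5
print(perceptron([1, 1], and_weights, and_bias))  # -> 1
print(perceptron([0, 1], and_weights, and_bias))  # -> 0
```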
References
2014
- (Wikipedia, 2014) ⇒ http://en.wikipedia.org/wiki/perceptron Retrieved:2014-8-20.
- In machine learning, the perceptron is an algorithm for supervised classification of an input into one of several possible non-binary outputs. It is a type of linear classifier, i.e. a classification algorithm that makes its predictions based on a linear predictor function combining a set of weights with the feature vector. The algorithm allows for online learning, in that it processes elements in the training set one at a time.
The perceptron algorithm dates back to the late 1950s; its first implementation, in custom hardware, was one of the first artificial neural networks to be produced.
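As a hedged sketch of the online learning just described, the following Python fragment applies the standard perceptron update rule, adjusting the weights one training example at a time. The toy data set, learning rate, and function name are illustrative assumptions, not part of the quoted source.

```python
# Sketch of online perceptron training: examples are processed one at a
# time, and the weights are adjusted only when the prediction is wrong.
# Data, learning rate, and epoch count are assumed for illustration.

def train_perceptron(examples, n_features, lr=1.0, epochs=10):
    weights = [0.0] * n_features
    bias = 0.0
    for _ in range(epochs):
        for x, target in examples:  # online: one example at a time
            summation = sum(w * xi for w, xi in zip(weights, x)) + bias
            prediction = 1 if summation > 0 else 0
            error = target - prediction  # 0 when correct; +1 or -1 when wrong
            weights = [w + lr * error * xi for w, xi in zip(weights, x)]
            bias += lr * error
    return weights, bias

# Toy linearly separable data (hypothetical): OR-gate labels.
data = [([0, 0], 0), ([0, 1], 1), ([1, 0], 1), ([1, 1], 1)]
w, b = train_perceptron(data, n_features=2)
```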
2003
- (Kanal, 2003) ⇒ Laveen N. Kanal. (2003). "Perceptron." In: "Encyclopedia of Computer Science, 4th edition." John Wiley and Sons. ISBN:0-470-86412-5 http://portal.acm.org/citation.cfm?id=1074686
- QUOTE: In 1957 the psychologist Frank Rosenblatt proposed "The Perceptron: a perceiving and recognizing automaton" as a class of artificial nerve nets, embodying aspects of the brain and receptors of biological systems. Fig. 1 shows the network of the Mark 1 Perceptron. Later, Rosenblatt protested that the term perceptron, originally intended as a generic name for a variety of theoretical nerve nets, was actually associated with a very specific piece of hardware (Rosenblatt, 1962). The basic building block of a perceptron is an element that accepts a number of inputs xi, i = 1,..., N, and computes a weighted sum of these inputs where, for each input, its fixed weight ω can be only + 1 or - 1. The sum is then compared with a threshold θ, and an output y is produced that is either 0 or 1, depending on whether or not the sum exceeds the threshold. In other words ...
... A perceptron is a signal transmission network consisting of sensory units (S units), association units (A units), and output or response units (R units). The receptor of the perceptron is analogous to the retina of the eye and is made of an array of sensory elements (photocells). Depending on whether or not an S-unit is excited, it produces a binary output. A randomly selected set of retinal cells is connected to the next level of the network, the A units. Each A unit behaves like the basic building block discussed above, where the +1, −1 weights for the inputs to each A unit are randomly assigned. The threshold for all A units is the same.
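The formula elided after "In other words" is not reproduced in the excerpt; as a reconstruction consistent with the quoted description (not the original equation), the threshold rule can be written as:

```latex
% Threshold rule described in the quote: output y is 1 iff the
% weighted sum of the N inputs exceeds the threshold \theta.
y =
\begin{cases}
1 & \text{if } \sum_{i=1}^{N} \omega_i x_i > \theta \\
0 & \text{otherwise}
\end{cases}
```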
1990
- (Gallant, 1990) ⇒ S. I. Gallant. (1990). "Perceptron-based Learning Algorithms." In: IEEE Transactions on Neural Networks, 1(2).
1969
- (Minsky & Papert, 1969) ⇒ Marvin L. Minsky, and S. A. Papert. (1969). "Perceptrons." MIT Press
1962
- (Novikoff, 1962) ⇒ A. B. Novikoff. (1962). "On Convergence Proofs on Perceptrons." In: Symposium on the Mathematical Theory of Automata, 12.
1958
- (Rosenblatt, 1958) ⇒ Frank Rosenblatt. (1958). "The Perceptron: A Probabilistic Model for Information Storage and Organization in the Brain." Psychological Review, 65(6).