Constrained Error Back Propagation Network (CEBPN)


A Constrained Error Back Propagation Network (CEBPN) is a Multi-Layer Perceptron that consists of an input layer, a hidden layer of local basis function nodes, and an output layer.
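
The following is a minimal NumPy sketch of how such a hidden layer can behave; it is not code from the papers cited below. It assumes each local basis function node forms a per-dimension "bump" from a pair of sigmoids (a centre and a half-width per input dimension) and multiplies the bumps across dimensions before a linear output layer; the function names, the steepness parameter, and the toy values are illustrative assumptions.

    # Minimal sketch of a CEBPN-style forward pass (illustrative, not the
    # authors' code): hidden nodes are local basis functions built from
    # sigmoid pairs, followed by a linear output layer.
    import numpy as np

    def sigmoid(z):
        return 1.0 / (1.0 + np.exp(-z))

    def local_basis_node(x, centre, breadth, steepness=10.0):
        # Per-dimension bump: a sigmoid rising past (centre - breadth) minus
        # one rising past (centre + breadth); multiplying the bumps across
        # dimensions gives a localized, axis-parallel response region.
        upper = sigmoid(steepness * (x - (centre - breadth)))
        lower = sigmoid(steepness * (x - (centre + breadth)))
        return np.prod(upper - lower)

    def cebpn_forward(x, centres, breadths, out_weights, out_bias=0.0):
        # Input layer -> hidden layer of local basis nodes -> linear output.
        hidden = np.array([local_basis_node(x, c, b)
                           for c, b in zip(centres, breadths)])
        return float(hidden @ out_weights + out_bias)

    # Toy usage: two hidden nodes over a 2-D input (all values arbitrary).
    centres = [np.array([0.2, 0.8]), np.array([0.7, 0.3])]
    breadths = [np.full(2, 0.15), np.full(2, 0.25)]
    print(cebpn_forward(np.array([0.25, 0.75]), centres, breadths,
                        out_weights=np.array([1.0, -1.0])))

Because each node in a sketch like this responds only inside an approximately hyper-rectangular region, a trained network of this form lends itself to rule extraction, which is the focus of the RULEX work referenced below.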



References

1996

  • (Andrews & Geva, 1996) ⇒ Robert Andrews, and Shlomo Geva (1996). “RULEX And CEBP Networks As The Basis For A Rule Refinement System". Technical report, Neurocomputing Research Centre, Faculty of Information Technology, Queensland University of Technology.

1994a

  • (Geva & Sitte, 1994) ⇒ Shlomo Geva, and Joaquin Sitte (1994). “Constrained Gradient Descent". In: Proceedings of Fifth Australian Conference on Neural Computing.

1994b

  • (Andrews & Geva, 1994) ⇒ Robert Andrews, and Shlomo Geva (1994). “Rule Extraction From a Constrained Error Backpropagation Network". In: Proceedings of the 5th ACNN.

1994c

  • (Andrews & Geva, 1994) ⇒ Robert Andrews, and Shlomo Geva (1994). “Rule Extraction From A Constrained Back Propagation MLP".
