Backpropagation

Alex M. Andrew (Earley, Reading, UK)

Kybernetes

ISSN: 0368-492X

Article publication date: 1 December 2001

Abstract

The popular backpropagation algorithm for training neural nets is a special case of an earlier principle of significance feedback, which in turn has much in common with Selfridge’s “Pandemonium” and a connection with McCulloch’s “redundancy of potential command”. Ways in which the effects might operate in real neural nets are reviewed, and the ideas are related to the current interest in heterogeneous agents. The tendency to restrict attention to numerical optimisation is regretted.
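For readers unfamiliar with the algorithm under discussion, backpropagation trains a layered network by propagating the output error backwards: each hidden unit's error signal is the weighted sum of the errors of the units it feeds, scaled by the derivative of its activation. The sketch below is illustrative only; the network size (2-2-1), the sigmoid activation, the OR task, and the learning rate are choices made here for brevity, not details taken from the article:

```python
import math
import random

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def train_net(epochs=3000, lr=0.5, seed=0):
    """Train a tiny 2-2-1 sigmoid network on the OR function with
    plain backpropagation. Returns (initial_loss, final_loss)."""
    rng = random.Random(seed)
    # Hidden layer: 2 neurons, each with 2 input weights + bias.
    w_h = [[rng.uniform(-1, 1) for _ in range(3)] for _ in range(2)]
    # Output neuron: 2 hidden weights + bias.
    w_o = [rng.uniform(-1, 1) for _ in range(3)]
    data = [([0, 0], 0), ([0, 1], 1), ([1, 0], 1), ([1, 1], 1)]

    def loss():
        total = 0.0
        for x, t in data:
            h = [sigmoid(w[0] * x[0] + w[1] * x[1] + w[2]) for w in w_h]
            y = sigmoid(w_o[0] * h[0] + w_o[1] * h[1] + w_o[2])
            total += 0.5 * (y - t) ** 2
        return total

    initial = loss()
    for _ in range(epochs):
        for x, t in data:
            # Forward pass.
            h = [sigmoid(w[0] * x[0] + w[1] * x[1] + w[2]) for w in w_h]
            y = sigmoid(w_o[0] * h[0] + w_o[1] * h[1] + w_o[2])
            # Backward pass: output delta, then hidden deltas obtained
            # by feeding the output error back through the weights.
            d_o = (y - t) * y * (1 - y)
            d_h = [d_o * w_o[j] * h[j] * (1 - h[j]) for j in range(2)]
            # Gradient-descent weight updates.
            for j in range(2):
                w_o[j] -= lr * d_o * h[j]
            w_o[2] -= lr * d_o
            for j in range(2):
                w_h[j][0] -= lr * d_h[j] * x[0]
                w_h[j][1] -= lr * d_h[j] * x[1]
                w_h[j][2] -= lr * d_h[j]
    return initial, loss()
```

The backward pass is the "significance feedback" step the abstract alludes to: each hidden unit receives a measure of how much it contributed to the output error.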

Citation

Andrew, A.M. (2001), "Backpropagation", Kybernetes, Vol. 30 No. 9/10, pp. 1110-1117. https://doi.org/10.1108/03684920110405601

Publisher: MCB UP Ltd

Copyright © 2001, MCB UP Limited