May 10

The machine-learning algorithm called "backpropagation" is a very beautiful one.

An early (~1960) learning machine was the Perceptron of F. Rosenblatt. A Perceptron is a 1-layer neural network, i.e. a linear classifier. This path did not succeed very well. One reason was that the community working on it was unable to extend it to 2-layer (and deeper) systems, which need the backpropagation algorithm in order to be trained, and that algorithm had not been invented yet (although running backpropagation on 1960s computers would also have been a challenge). A small sketch of the difference is given below.
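To make the distinction concrete, here is a minimal sketch (my own illustration, not from the original post): a tiny 2-layer network trained by backpropagation on XOR, a problem that no 1-layer Perceptron can solve because XOR is not linearly separable. The network size, learning rate, and iteration count are arbitrary choices for this example.

```python
import numpy as np

rng = np.random.default_rng(0)

# XOR inputs and targets: not linearly separable, so a 1-layer
# Perceptron (a single linear classifier) cannot learn this mapping.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

# Weights for a 2-4-1 network (one hidden layer of 4 units).
W1 = rng.normal(scale=1.0, size=(2, 4))
b1 = np.zeros((1, 4))
W2 = rng.normal(scale=1.0, size=(4, 1))
b2 = np.zeros((1, 1))

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

lr = 1.0
for step in range(5000):
    # Forward pass
    h = sigmoid(X @ W1 + b1)      # hidden activations
    out = sigmoid(h @ W2 + b2)    # network output

    # Backward pass: propagate the error gradient layer by layer
    d_out = (out - y) * out * (1 - out)   # delta at the output layer
    d_h = (d_out @ W2.T) * h * (1 - h)    # delta at the hidden layer

    # Gradient-descent updates
    W2 -= lr * h.T @ d_out
    b2 -= lr * d_out.sum(axis=0, keepdims=True)
    W1 -= lr * X.T @ d_h
    b1 -= lr * d_h.sum(axis=0, keepdims=True)

print(np.round(out, 2))  # typically close to [0, 1, 1, 0] after training
```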

The sad part is that at the time of the Perceptron, and perhaps even before, the backpropagation algorithm was already known in the physics community. Of course it did not go by this name, and it was not applied to the same problems. So having people who can interface between different communities, who can speak and understand both, can sometimes be very useful.

Today, however, the backpropagation algorithm works, and it is doing many useful jobs.

