Paper
Fast algorithm for a neocognitron neural network with back-propagation
1 October 1991
Kent Pu Qing, Robert W. Means
Abstract
The neocognitron is a neural network that consists of many layers of partially connected cells. A new neocognitron architecture called the multilayer neocognitron with backpropagation learning (MNEOBP) is proposed, and it is shown that the original neocognitron trained by backpropagation is a special case of the architecture proposed here. The MNEOBP has a number of advantages: (1) The MNEOBP algorithm is four times faster than the earlier algorithm, since the number of cells calculated is reduced by a factor of four. (2) During the learning process, the mask (kernel) size is changed; this can reduce the training time by almost a factor of three. (3) The MNEOBP architecture can be implemented with a new digital neural network VLSI chip set called the Vision Processor (ViP). The ViP exploits the convolution structure of the network and can process a single 32 × 32 input layer in only 25.6 microseconds with an 8 × 8 receptive-field kernel.
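To make the convolution structure mentioned in the abstract concrete, the following is a minimal Python sketch of one feature-extracting (S-cell-like) plane computed by sliding an 8 × 8 receptive-field kernel over a 32 × 32 input layer, the sizes quoted for the ViP timing figure. The weighted sum and the half-wave rectification nonlinearity are illustrative assumptions, not the paper's cell equations or the ViP hardware pipeline.

```python
import numpy as np


def s_cell_plane(input_plane, kernel, stride=1):
    """Compute a hypothetical S-cell plane by sliding a kernel over an input plane.

    Illustrative valid-convolution only; it does not reproduce the exact
    neocognitron cell equations or the MNEOBP learning rule.
    """
    in_h, in_w = input_plane.shape
    k_h, k_w = kernel.shape
    out_h = (in_h - k_h) // stride + 1
    out_w = (in_w - k_w) // stride + 1
    output = np.zeros((out_h, out_w))
    for i in range(out_h):
        for j in range(out_w):
            patch = input_plane[i * stride:i * stride + k_h,
                                j * stride:j * stride + k_w]
            # Weighted sum over the receptive field, followed by a simple
            # half-wave rectification as a stand-in nonlinearity.
            output[i, j] = max(0.0, float(np.sum(patch * kernel)))
    return output


# Example: a 32x32 input layer with an 8x8 receptive-field kernel.
rng = np.random.default_rng(0)
layer_in = rng.random((32, 32))
kernel = rng.random((8, 8)) - 0.5
layer_out = s_cell_plane(layer_in, kernel)
print(layer_out.shape)  # (25, 25) valid-convolution output
```

Because every output cell shares the same kernel, the computation is a plain 2-D convolution, which is the structure the ViP chip set exploits for speed.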
© 1991 Society of Photo-Optical Instrumentation Engineers (SPIE).
Kent Pu Qing and Robert W. Means "Fast algorithm for a neocognitron neural network with back-propagation", Proc. SPIE 1569, Stochastic and Neural Methods in Signal Processing, Image Processing, and Computer Vision, (1 October 1991); https://doi.org/10.1117/12.48371
KEYWORDS
Neural networks, Image processing, Computer vision technology, Convolution, Machine vision, Signal processing, Stochastic processes