Paper (2 March 1994)
Neural networks for vision-based collision avoidance
Maddalena Brattoli, Gabriella Convertino, Arcangelo Distante
Abstract
In this paper we investigate the application of an artificial neural network to the computation of the observer's instantaneous heading in a static environment. This parameter plays an important role in vision-based collision avoidance systems and relies upon accurately locating the focus of expansion (FOE) associated with the radial optical flow pattern arising from the translational component of motion. The approach proposed in this paper is based on a feed-forward neural network that computes the image coordinates of the FOE. It is assumed that the input signals are supplied to the network by a sensorial module computing the optical flow associated with a sequence of time-varying images of the viewed scene. Experiments have been performed on both theoretical and realistic optical flow fields. In the latter real-world experiments, the input sensorial module has been simulated by a Hopfield network. Experimental results show that the proposed neural architecture recovers the FOE position of testing flow fields with a mean error of 0.1 pixels for exact theoretical motion fields. Moreover, it appears resistant to noise and its performance remains satisfactory in real-world contexts.
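The abstract does not specify the network architecture or training procedure, so the following is only a minimal illustrative sketch of the general idea: a small feed-forward network regressing the FOE image coordinates from a theoretical radial flow field generated under a pure-translation model. The grid size, layer widths, learning rate, and training regime are all assumptions, not details from the paper.

```python
# Hypothetical sketch (not the authors' code): a feed-forward network,
# implemented with NumPy, that regresses the focus-of-expansion (FOE)
# coordinates from a synthetic radial optical flow field.
import numpy as np

rng = np.random.default_rng(0)
GRID = 8  # assumed 8x8 grid of flow vectors -> 128 inputs (u and v components)

def radial_flow(foe, grid=GRID):
    """Theoretical flow for pure translation: vectors point away from the FOE."""
    ys, xs = np.mgrid[0:grid, 0:grid].astype(float)
    u, v = xs - foe[0], ys - foe[1]
    norm = np.hypot(u, v) + 1e-6
    return np.concatenate([(u / norm).ravel(), (v / norm).ravel()])

def make_batch(n):
    foes = rng.uniform(0, GRID - 1, size=(n, 2))
    return np.stack([radial_flow(f) for f in foes]), foes

# Assumed two-layer network: 128 inputs -> 32 tanh units -> 2 linear outputs (FOE).
W1 = rng.normal(0, 0.1, (2 * GRID * GRID, 32)); b1 = np.zeros(32)
W2 = rng.normal(0, 0.1, (32, 2));               b2 = np.zeros(2)
lr = 0.01

for step in range(5000):
    X, Y = make_batch(64)
    H = np.tanh(X @ W1 + b1)      # hidden layer activations
    P = H @ W2 + b2               # predicted FOE coordinates
    err = P - Y                   # gradient of the squared error w.r.t. the output
    # Backpropagation through both layers.
    gW2 = H.T @ err / len(X); gb2 = err.mean(0)
    dH = (err @ W2.T) * (1 - H ** 2)
    gW1 = X.T @ dH / len(X); gb1 = dH.mean(0)
    W1 -= lr * gW1; b1 -= lr * gb1; W2 -= lr * gW2; b2 -= lr * gb2

# Mean FOE localisation error (in pixels) on unseen theoretical flow fields.
Xt, Yt = make_batch(200)
pred = np.tanh(Xt @ W1 + b1) @ W2 + b2
print("mean FOE error (pixels):", np.abs(pred - Yt).mean())
```

In the paper's setting the flow field would come from the Hopfield-network sensorial module operating on an image sequence rather than from the synthetic generator used here.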
© 1994 Society of Photo-Optical Instrumentation Engineers (SPIE). Downloading of the abstract is permitted for personal use only.
Maddalena Brattoli, Gabriella Convertino, and Arcangelo Distante "Neural networks for vision-based collision avoidance", Proc. SPIE 2243, Applications of Artificial Neural Networks V, (2 March 1994); https://doi.org/10.1117/12.169991
KEYWORDS
Optical flow, Neural networks, Collision avoidance, Sensors, Cameras, Motion models, Neurons