Visual motion pattern extraction and fusion for collision detection in complex dynamic scenes
Computer Vision and Image Understanding
Reliably recognizing objects approaching on a collision course is extremely important. A synthetic vision system is proposed to tackle the problem of collision recognition in dynamic environments. The system combines the outputs of four whole-field motion-detecting neurons, each receiving input from a network of neurons that employ asymmetric lateral inhibition to suppress responses to one direction of motion. An evolutionary algorithm then adjusts the weights between the four motion-detecting neurons to tune the system for collision detection in two test environments. To do this, each agent in a population, representing a candidate synthetic visual system, was shown either image sequences captured by a mobile Khepera robot navigating a simplified laboratory environment or video recorded outdoors from a moving vehicle. Agents had to respond correctly to their local environment in order to survive. After 400 generations, the best agent reliably recognized imminent collisions in the familiar environment in which it had evolved. However, when the environments were swapped, only the agent evolved in the robotic environment still signaled collisions reliably. This study suggests that whole-field direction-selective neurons, with selectivity based on asymmetric lateral inhibition, can be organized into a synthetic vision system, which can then be adapted to play an important role in collision detection in complex dynamic scenes.
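The architecture the abstract describes can be sketched in a few lines: each whole-field neuron sums rectified luminance change opposed by delayed, one-sided (asymmetric) lateral inhibition, four such neurons cover the four cardinal directions, and an evolutionary loop tunes the fusion weights. The connectivity, inhibition strength, and the toy (mu+lambda)-style evolutionary loop below are illustrative assumptions, not the paper's exact model.

```python
import numpy as np

def whole_field_response(frames, axis, shift):
    """One whole-field motion-detecting neuron.

    Direction selectivity via asymmetric lateral inhibition: each cell's
    excitation (frame-to-frame luminance change) is opposed by the delayed
    excitation of its neighbour on one side only, so motion in that
    direction is suppressed. The one-pixel shift and 0.8 inhibition gain
    are illustrative assumptions.
    """
    change = np.abs(np.diff(frames.astype(float), axis=0))   # (T-1, H, W)
    inhibition = np.roll(change[:-1], shift, axis=axis)      # delayed, one-sided
    net = np.maximum(change[1:] - 0.8 * inhibition, 0.0)     # rectified excitation
    return net.sum()                                         # whole-field summation

def collision_signal(frames, weights):
    """Weighted fusion of four direction-selective neurons (up/down/left/right)."""
    configs = [(1, 1), (1, -1), (2, 1), (2, -1)]  # (axis, shift) per neuron
    r = np.array([whole_field_response(frames, a, s) for a, s in configs])
    return float(weights @ r)

def evolve_weights(pop_size, generations, fitness, rng):
    """Minimal evolutionary loop tuning the four fusion weights; a toy
    stand-in for the paper's 400-generation algorithm."""
    pop = rng.uniform(0.0, 1.0, size=(pop_size, 4))
    for _ in range(generations):
        scores = np.array([fitness(w) for w in pop])
        parents = pop[np.argsort(scores)[-pop_size // 2:]]       # keep best half
        children = parents + rng.normal(0, 0.05, parents.shape)  # mutate copies
        pop = np.vstack([parents, children])
    scores = np.array([fitness(w) for w in pop])
    return pop[scores.argmax()]
```

In a full experiment, `fitness` would score an agent's survival on robot- or vehicle-captured image sequences, rewarding collision signals that fire just before impact and stay quiet otherwise.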