This paper presents the first physiologically motivated pulse-coupled neural network (PCNN)-based image fusion network for object detection. Primate vision processing principles, such as expectation-driven filtering, state-dependent modulation, temporal synchronization, and multiple processing paths, are applied to create a physiologically motivated image fusion network. PCNNs are used to fuse the results of several object detection techniques to improve object detection accuracy. Image processing techniques (wavelets, morphological operators, etc.) are used to extract target features, and PCNNs are used to focus attention by segmenting and fusing the information. The object detection property of the resulting image fusion network is demonstrated on mammograms and forward-looking infrared (FLIR) images. The network removed 94% of the false detections without removing any true detections in the FLIR images, and removed 46% of the false detections while removing only 7% of the true detections in the mammograms. The model exceeded the accuracy obtained by any individual filtering method or by logically ANDing the individual object detection technique results.
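The abstract does not reproduce the network equations, but the temporal-synchronization behavior it relies on comes from the standard PCNN neuron model: a linking field modulates the feeding input, and a decaying dynamic threshold causes pixels with similar intensity to pulse together. The sketch below is a minimal, simplified illustration of that iteration (the feeding channel is reduced to the raw stimulus, and all parameter values are illustrative assumptions, not taken from the paper):

```python
import numpy as np

def pcnn_fire_times(S, beta=0.2, alpha_theta=0.5, V_theta=20.0, n_iter=12):
    """Run a simplified PCNN on a normalized image S (values in [0, 1]).

    Returns the iteration at which each pixel first pulses (0 = never).
    Pixels of similar intensity pulse in the same iteration, which is the
    synchronization property PCNN-based segmentation and fusion exploit.
    """
    F = S.astype(float)                # feeding input, simplified to the stimulus
    Y = np.zeros_like(F)               # binary pulse output
    theta = np.full_like(F, V_theta)   # dynamic threshold, starts high and decays
    fire_time = np.zeros_like(F)       # first-pulse iteration per pixel

    # 8-neighbor linking weights (closer neighbors weighted more strongly)
    kernel = np.array([[0.5, 1.0, 0.5],
                       [1.0, 0.0, 1.0],
                       [0.5, 1.0, 0.5]])

    def linking_field(Y):
        # Weighted sum of neighboring pulses (periodic boundary via np.roll)
        L = np.zeros_like(Y)
        for dy in (-1, 0, 1):
            for dx in (-1, 0, 1):
                w = kernel[dy + 1, dx + 1]
                if w:
                    L += w * np.roll(np.roll(Y, dy, axis=0), dx, axis=1)
        return L

    for n in range(1, n_iter + 1):
        L = linking_field(Y)
        U = F * (1.0 + beta * L)                     # linking modulation
        Y = (U > theta).astype(float)                # pulse where activity beats threshold
        theta = np.exp(-alpha_theta) * theta + V_theta * Y  # decay, reset on pulse
        fire_time[(fire_time == 0) & (Y == 1)] = n   # record first pulse only
    return fire_time
```

With a two-intensity test image, the bright region pulses several iterations before the dark region, so thresholding or grouping `fire_time` yields a segmentation; fusing multiple detector outputs amounts to feeding their maps into such networks and combining the synchronized pulse activity.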