Neural Networks: A Comprehensive Foundation
Neural Networks for Pattern Recognition
Comparison of a neural network detector vs. the Neyman-Pearson optimal detector
ICASSP '96: Proceedings of the 1996 IEEE International Conference on Acoustics, Speech, and Signal Processing - Volume 06
Approximating the Neyman-Pearson detector for Swerling I targets with low complexity neural networks
ICANN'05 Proceedings of the 15th international conference on Artificial neural networks: formal models and their applications - Volume Part II
Neural networks for signal detection in non-Gaussian noise
IEEE Transactions on Signal Processing
Multilayer perceptrons (MLPs) trained in a supervised manner to minimize the mean square error are able to approximate the Neyman-Pearson detector. The detection of a known target in Weibull-distributed clutter plus white Gaussian noise is considered. Because of the difficulty of obtaining analytical expressions for the optimum detector in this environment, a suboptimum detector, the Target Sequence Known A Priori (TSKAP) detector, is taken as the reference. The results show that the performance of the MLP-based detector depends on the training algorithm for small MLP sizes, with the Levenberg-Marquardt algorithm outperforming back-propagation, whereas this dependence disappears for large MLP sizes. The detector is also sensitive to the MLP size, but beyond 20 hidden neurons very little additional improvement is achieved. Overall, the MLP-based detector outperforms the TSKAP detector, even for very low-complexity MLPs (6 inputs, 5 hidden neurons, and 1 output).
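The core idea can be sketched in code: an MLP with a sigmoid output, trained by minimizing the mean square error between its output and the 0/1 target-presence label, converges toward the posterior probability of the target hypothesis, so thresholding its output approximates a Neyman-Pearson test. The sketch below is illustrative only and does not reproduce the paper's exact setup: the target sequence, clutter shape parameter, noise level, and learning rate are all assumed for the example, and plain batch back-propagation is used rather than Levenberg-Marquardt. The 6-5-1 architecture matches the smallest MLP discussed in the abstract.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic detection data (assumed parameters, for illustration only):
# 6-sample windows of Weibull clutter plus white Gaussian noise, with a
# known target sequence added under hypothesis H1.
n, d = 2000, 6
target = np.ones(d)                       # assumed known target sequence
clutter = rng.weibull(1.5, (n, d))        # Weibull-distributed clutter
noise = rng.normal(0.0, 0.5, (n, d))      # white Gaussian noise
labels = rng.integers(0, 2, n)            # 1 = target present (H1)
x = clutter + noise + labels[:, None] * target
y = labels.astype(float)[:, None]

# Minimal 6-5-1 MLP trained to minimize the mean square error; its output
# then estimates P(H1 | x), so comparing it to a threshold approximates
# the Neyman-Pearson detector.
w1 = rng.normal(0.0, 0.5, (d, 5)); b1 = np.zeros(5)
w2 = rng.normal(0.0, 0.5, (5, 1)); b2 = np.zeros(1)
sig = lambda z: 1.0 / (1.0 + np.exp(-z))

lr = 0.5
for _ in range(1500):                     # plain batch back-propagation
    h = sig(x @ w1 + b1)                  # hidden layer
    out = sig(h @ w2 + b2)                # detector output in (0, 1)
    g_out = (out - y) * out * (1 - out)   # MSE gradient through sigmoid
    g_h = (g_out @ w2.T) * h * (1 - h)
    w2 -= lr * h.T @ g_out / n; b2 -= lr * g_out.mean(0)
    w1 -= lr * x.T @ g_h / n;  b1 -= lr * g_h.mean(0)

# Threshold at 0.5 (equal priors); in practice the threshold would be
# set to achieve a desired probability of false alarm.
acc = ((out > 0.5) == (y > 0.5)).mean()
print(f"training accuracy: {acc:.3f}")
```

Swapping the plain gradient step for a second-order update (as Levenberg-Marquardt does) is what the abstract reports as the decisive factor at small network sizes.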