Low complexity MLP-based radar detector: influence of the training algorithm and the MLP size

  • Authors:
  • R. Vicen-Bueno; M. P. Jarabo-Amores; D. Mata-Moya; M. Rosa-Zurera; R. Gil-Pita

  • Affiliations:
  • Signal Theory and Communications Department, Escuela Politécnica Superior, Universidad de Alcalá, Alcalá de Henares, Madrid, Spain (all authors)

  • Venue:
  • IWANN'07: Proceedings of the 9th International Work-Conference on Artificial Neural Networks
  • Year:
  • 2007


Abstract

MultiLayer Perceptrons (MLPs) trained in a supervised way to minimize the Mean Square Error (MSE) are able to approximate the Neyman-Pearson detector. The detection of a known target in Weibull-distributed clutter and white Gaussian noise is considered. Because of the difficulty of obtaining analytical expressions for the optimum detector in this environment, a suboptimum detector, the Target Sequence Known A Priori (TSKAP) detector, is taken as reference. The results show that the performance of the MLP-based detector depends on the training algorithm for small MLP sizes, with the Levenberg-Marquardt algorithm outperforming Back-Propagation; this dependency disappears for large MLP sizes. The detector is also sensitive to the MLP size, but for sizes above 20 hidden neurons very little improvement is achieved. Thus, the MLP-based detector outperforms the TSKAP one even for very low complexity MLPs (6 inputs, 5 hidden neurons, and 1 output).
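The detector described above can be illustrated with a minimal sketch: a 6-5-1 MLP (the smallest size mentioned in the abstract) trained with plain back-propagation to minimize the MSE on synthetic data. Note this is not the paper's actual experimental setup: the Weibull shape parameter, noise level, target sequence, and training settings below are all assumptions chosen for illustration, and the paper's preferred Levenberg-Marquardt trainer is replaced by simple batch gradient descent for brevity.

```python
import numpy as np

rng = np.random.default_rng(0)

N_IN, N_HID = 6, 5          # 6-5-1 MLP, the smallest size in the abstract
n_samples = 2000
shape_k = 1.2               # assumed Weibull shape parameter (not from the paper)

# H0: Weibull clutter + white Gaussian noise; H1: the same plus a known target.
clutter = rng.weibull(shape_k, size=(n_samples, N_IN))
noise = 0.1 * rng.standard_normal((n_samples, N_IN))
labels = rng.integers(0, 2, n_samples)          # 0 = H0, 1 = H1
target = np.ones(N_IN)                          # hypothetical known target sequence
X = clutter + noise + np.outer(labels, target)
X = (X - X.mean(0)) / X.std(0)                  # standardize the inputs
y = labels.astype(float)

# Small random initial weights for the 6-5-1 network.
W1 = 0.1 * rng.standard_normal((N_IN, N_HID)); b1 = np.zeros(N_HID)
W2 = 0.1 * rng.standard_normal((N_HID, 1));    b2 = np.zeros(1)

def forward(X):
    """tanh hidden layer, sigmoid output in [0, 1]."""
    h = np.tanh(X @ W1 + b1)
    out = 1.0 / (1.0 + np.exp(-(h @ W2 + b2)))
    return h, out

# Back-propagation: batch gradient descent on the MSE.
lr = 0.5
for epoch in range(2000):
    h, out = forward(X)
    err = out.ravel() - y                        # derivative of the MSE w.r.t. output
    d_out = (err * out.ravel() * (1.0 - out.ravel()))[:, None]
    d_h = (d_out @ W2.T) * (1.0 - h ** 2)        # back-propagate through tanh
    W2 -= lr * h.T @ d_out / n_samples; b2 -= lr * d_out.mean(0)
    W1 -= lr * X.T @ d_h / n_samples;   b1 -= lr * d_h.mean(0)

# Thresholding the trained MLP output gives the detection decision.
_, out = forward(X)
pred = (out.ravel() > 0.5).astype(int)
accuracy = (pred == labels).mean()
```

In a real radar evaluation the threshold on the MLP output would be swept to trace a receiver operating characteristic (probability of detection versus probability of false alarm) rather than fixed at 0.5.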