This paper develops two neural network models, based on Lagrange programming neural networks (LPNNs), for recovering sparse signals in compressive sampling. The first model handles the standard recovery of sparse signals; the second handles the recovery of sparse signals from noisy observations. The properties of both models, including the optimality of their solutions and the convergence behavior of the networks, are analyzed. We show that in the first case the network converges to the global minimum of the objective function, while in the second case the convergence is locally stable.
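As an illustrative sketch (not the paper's exact formulation), an LPNN-style network for the standard sparse-recovery problem — minimize ||x||_1 subject to Ax = b — can be simulated by Euler-discretizing the Lagrangian dynamics dx/dt = -∇_x L(x, λ) and dλ/dt = +∇_λ L(x, λ). The l1 smoothing parameter `eps`, the augmented penalty `c`, the step size `dt`, and the problem data below are all assumptions chosen for illustration:

```python
import numpy as np

# Sketch of Lagrange-programming-neural-network dynamics for basis pursuit:
# minimize a smoothed l1 norm subject to A x = b.
# All problem data and hyperparameters are illustrative assumptions.
rng = np.random.default_rng(0)
m, n = 4, 8
A = rng.standard_normal((m, n)) / np.sqrt(m)   # measurement matrix
x_true = np.zeros(n)
x_true[1], x_true[5] = 1.0, -0.5               # sparse ground truth
b = A @ x_true                                 # noiseless observations

eps = 1e-3   # smoothing so the l1 norm is differentiable
c = 5.0      # augmented-Lagrangian penalty term (aids convergence)

def grad_f(x):
    # gradient of sum(sqrt(x_i^2 + eps)), a smooth surrogate for ||x||_1
    return x / np.sqrt(x**2 + eps)

x = np.zeros(n)      # state neurons
lam = np.zeros(m)    # Lagrangian neurons
dt = 0.005
for _ in range(200_000):
    r = A @ x - b                                    # constraint violation
    dx = -(grad_f(x) + A.T @ lam + c * (A.T @ r))    # dx/dt = -grad_x L
    dlam = r                                         # dlam/dt = +grad_lam L
    x += dt * dx
    lam += dt * dlam

print("constraint violation:", np.linalg.norm(A @ x - b))
```

The augmented penalty term `c * (A.T @ r)` is one common way to stabilize saddle-point flows of this kind; the noisy-observation variant described in the abstract would modify the objective and constraints rather than this basic dynamic structure.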