A homomorphic feedforward network (HFFN) for nonlinear adaptive filtering is introduced. It comprises a two-layer feedforward architecture with an exponential hidden layer preceded by a logarithmic preprocessing step. The overall input-output relationship can therefore be viewed either as a generalized Volterra model or as a bank of homomorphic filters. Gradient-based learning for this architecture is derived, together with practical guidance on the choice of optimal learning parameters and weight initialization. Performance and convergence speed are verified by analysis and extensive simulations. For rigor, the simulations are conducted on both artificial and real-life data, and the performance is compared against that of a sigmoidal feedforward network (FFN) with identical topology. The proposed HFFN proves to be a viable alternative to FFNs, especially in the critical case of online learning on small- and medium-scale data sets.
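To illustrate the architecture described above, the following is a minimal sketch of an HFFN-style forward pass and online gradient update in NumPy. The dimensions, learning rate, initialization scheme, and the toy product target are illustrative assumptions, not taken from the paper; the sketch only shows the log-preprocessing / exponential-hidden-layer structure and its backpropagation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative dimensions (assumed, not from the paper)
n_in, n_hid = 2, 4

# Plain small-random initialization; the paper discusses initialization
# in more detail, so this scheme is an assumption
W1 = 0.1 * rng.standard_normal((n_hid, n_in))
b1 = np.zeros(n_hid)
w2 = 0.1 * rng.standard_normal(n_hid)
b2 = 0.0

def forward(x):
    """HFFN forward pass: log preprocessing -> linear -> exp -> linear."""
    z = np.log(x)        # logarithmic preprocessing (inputs must be positive)
    a = W1 @ z + b1      # hidden pre-activation
    h = np.exp(a)        # exponential hidden layer
    y = w2 @ h + b2      # linear output
    return z, h, y

# Toy target: d = x1 * x2. A product is linear in the log domain,
# so exponential hidden units can represent it, which mirrors the
# generalized-Volterra interpretation of the network.
eta = 0.01               # assumed learning rate
losses = []
for step in range(2000):
    x = rng.uniform(0.5, 2.0, size=n_in)
    d = x[0] * x[1]
    z, h, y = forward(x)
    e = y - d                        # output error
    losses.append(0.5 * e**2)
    # Backprop through the exponential layer: d exp(a)/da = exp(a) = h,
    # so the hidden delta is (e * w2) * h
    delta_h = e * w2 * h
    w2 -= eta * e * h
    b2 -= eta * e
    W1 -= eta * np.outer(delta_h, z)
    b1 -= eta * delta_h

print(np.mean(losses[:100]), np.mean(losses[-100:]))
```

Running the loop, the squared error on the toy product target drops substantially from its initial level, consistent with the gradient-based online learning the abstract describes.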