Functional link networks are single-layer neural networks that impose nonlinearity at the input layer using nonlinear functions of the original input variables. In this paper, we present a fully complex-valued functional link network (CFLN) with multivariate polynomials as the nonlinear functions. Unlike multilayer neural networks, the CFLN is free from the local-minima problem, and its linear-in-parameters structure makes learning very fast. The polynomial-based CFLN also requires no activation function, which is a major concern in complex-valued neural networks. However, since the number of all possible monomials may be quite large, it is important to select a smaller subset of polynomial terms (monomials) for faster and better performance. Here, we use the orthogonal least squares (OLS) method in a constructive fashion (from lower degree to higher) to select a parsimonious subset of monomials. We argue that computing the CFLN in the purely complex domain is more advantageous than computing in the double-dimensional real domain, in terms of the number of connection parameters, design speed, and possibly generalization performance. Simulation results on a function approximation task, wind prediction with real-world data, and a nonlinear channel equalization problem show that the OLS-based CFLN yields a very simple structure with favorable performance.
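The two ingredients of the abstract — a complex monomial expansion of the inputs and greedy OLS selection of a parsimonious monomial subset — can be sketched as follows. This is a minimal illustration, not the authors' implementation: the feature generator, the error-reduction-ratio selection loop, and the synthetic target below are all assumptions made for the example, and the selection shown is standard forward OLS with complex (Hermitian) inner products rather than any problem-specific refinement from the paper.

```python
import numpy as np
from itertools import combinations_with_replacement

def monomial_features(Z, degree):
    """All complex monomials of the input columns up to `degree`
    (constant term included), stacked into an (N, M) design matrix."""
    N, d = Z.shape
    cols = [np.ones(N, dtype=complex)]
    for deg in range(1, degree + 1):
        for idx in combinations_with_replacement(range(d), deg):
            cols.append(np.prod(Z[:, list(idx)], axis=1))
    return np.column_stack(cols)

def ols_select(Phi, y, n_terms, tol=1e-12):
    """Greedy forward OLS: at each step pick the candidate column with
    the largest error-reduction ratio, then orthogonalize the remaining
    candidates against it. Complex inner products throughout."""
    W = Phi.astype(complex).copy()          # working copy, orthogonalized in place
    yy = float((y.conj() @ y).real)
    selected = []
    for _ in range(n_terms):
        num = np.abs(W.conj().T @ y) ** 2
        den = np.einsum('ij,ij->j', W.conj(), W).real * yy
        err = np.where(den > tol, num / den, 0.0)
        err[selected] = -1.0                # never re-pick a chosen column
        k = int(np.argmax(err))
        if err[k] <= 0.0:
            break                           # nothing useful left to add
        selected.append(k)
        q = W[:, k].copy()
        W -= np.outer(q, (q.conj() @ W) / (q.conj() @ q))
        W[:, k] = q                         # keep the chosen column intact
    return selected

# Hypothetical demo: a sparse polynomial target in two complex inputs.
rng = np.random.default_rng(0)
Z = rng.standard_normal((200, 2)) + 1j * rng.standard_normal((200, 2))
y = 2.0 * Z[:, 0] + (1 + 1j) * Z[:, 0] * Z[:, 1]

Phi = monomial_features(Z, degree=3)        # all monomials up to degree 3
picked = ols_select(Phi, y, n_terms=4)      # small subset chosen by OLS
w, *_ = np.linalg.lstsq(Phi[:, picked], y, rcond=None)
resid = np.linalg.norm(Phi[:, picked] @ w - y)
```

Because the model is linear in its parameters, the final fit is a single least-squares solve over the selected monomial columns — this is the "very fast learning" the abstract refers to, in contrast to the iterative gradient descent a multilayer network would need.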