In this paper a model called the symbolic function network (SFN) is introduced, which is based on using elementary functions (for example, powers, the exponential function, and the logarithm) as building blocks. The proposed method uses these building blocks to synthesize a function that best fits the training data in a regression framework. The resulting network takes the form of a tree, where adding nodes horizontally corresponds to summing elementary functions and adding nodes vertically corresponds to composing them. Several new algorithms are proposed to construct the tree, based on the concepts of forward greedy search and backward greedy search combined with steepest-descent optimization. The method is tested on a number of examples and is shown to exhibit good performance.
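The forward greedy construction described above can be sketched in code. The following is a minimal illustrative example, not the authors' exact algorithm: the particular dictionary of elementary functions, the single-weight least-squares fit per term, and the residual-based selection rule are all assumptions made here for clarity. It shows only horizontal growth (a weighted sum of elementary functions); the paper's vertical growth (composition of functions) and backward pruning are omitted.

```python
import numpy as np

# Hypothetical dictionary of elementary building blocks (an assumption;
# the paper's actual function set may differ).
ELEMENTARY = {
    "x":   lambda x: x,
    "x^2": lambda x: x ** 2,
    "exp": lambda x: np.exp(x),
    "log": lambda x: np.log(np.abs(x) + 1e-8),  # guarded log for x near 0
}

def forward_greedy_fit(x, y, max_terms=6):
    """Greedily add one elementary term at a time (horizontal growth:
    the model is a weighted sum of elementary functions of x)."""
    residual = y.astype(float).copy()
    model = []  # list of (name, weight) pairs
    for _ in range(max_terms):
        best = None
        for name, f in ELEMENTARY.items():
            basis = f(x)
            # Closed-form least-squares weight for this basis on the residual.
            w = basis @ residual / (basis @ basis)
            err = np.sum((residual - w * basis) ** 2)
            if best is None or err < best[2]:
                best = (name, w, err)
        name, w, _ = best
        model.append((name, w))
        residual = residual - w * ELEMENTARY[name](x)
    return model

def predict(model, x):
    """Evaluate the sum of weighted elementary functions."""
    return sum(w * ELEMENTARY[name](x) for name, w in model)
```

For example, fitting data generated from 2x^2 + 3x on [1, 2] drives the residual down well below the variance of the target, since the dictionary spans the true function; this residual-fitting loop behaves like matching pursuit over the elementary-function dictionary.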