Backward Elimination Methods for Associative Memory Network Pruning
International Journal of Hybrid Intelligent Systems
A very efficient learning algorithm for model subset selection is introduced, based on a new composite cost function that simultaneously optimizes the model's approximation ability, robustness, and adequacy. Model parameters are estimated via forward orthogonal least squares (OLS), while the subset-selection cost function incorporates a D-optimality design criterion that maximizes the determinant of the design matrix of the selected subset, ensuring the robustness, adequacy, and parsimony of the final model. Because the approach builds on the forward OLS algorithm, the D-optimality-based cost function is constructed within the orthogonalization process itself, so the method retains the computational efficiency inherent in the conventional forward OLS approach. Illustrative examples demonstrate the effectiveness of the new approach.
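As a rough illustration of the idea described above (a sketch, not the authors' exact algorithm), the fragment below combines the standard forward-OLS error reduction ratio with a D-optimality term. Because the orthogonalized regressors are mutually orthogonal, the determinant of the design matrix `W.T @ W` of the selected subset is the product of the squared column norms, so adding `beta * log(w.T @ w)` to the selection score rewards candidates that keep that determinant large. The function name, the weighting `beta`, and the stopping rule are illustrative assumptions.

```python
import numpy as np

def forward_ols_doptimality(P, y, beta=1e-3, max_terms=None):
    """Forward OLS subset selection with a D-optimality term (sketch).

    At each step the candidate maximising  ERR_j + beta * log(w_j' w_j)
    is selected, where w_j is candidate column j orthogonalised against
    the regressors chosen so far.  The log term rewards candidates that
    enlarge det(W' W): with orthogonal columns, the determinant is the
    product of the squared column norms.
    """
    P = np.asarray(P, dtype=float)
    y = np.asarray(y, dtype=float)
    N, M = P.shape
    if max_terms is None:
        max_terms = M
    yty = y @ y
    selected = []
    W = np.zeros((N, 0))                 # orthogonal basis of chosen terms
    remaining = list(range(M))

    for _ in range(max_terms):
        best_j, best_score, best_w = None, -np.inf, None
        for j in remaining:
            # Gram-Schmidt: orthogonalise candidate against selected basis.
            w = P[:, j].copy()
            for k in range(W.shape[1]):
                wk = W[:, k]
                w -= (wk @ P[:, j]) / (wk @ wk) * wk
            g = w @ w
            if g < 1e-12:                # numerically dependent column, skip
                continue
            err = (w @ y) ** 2 / (g * yty)     # error reduction ratio
            score = err + beta * np.log(g)     # composite selection score
            if score > best_score:
                best_j, best_score, best_w = j, score, w
        if best_j is None:
            break
        selected.append(best_j)
        W = np.column_stack([W, best_w])
        remaining.remove(best_j)
    return selected
```

On exact data the score reduces to the usual ERR ordering when `beta` is tiny; a larger `beta` biases selection toward well-conditioned (high-determinant) subsets, which is the robustness effect the composite cost aims at.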