This paper develops a new methodology for pattern classification by concurrently determining k piecewise-linear and convex discriminant functions. To this end, we design a new l1-norm distance metric for measuring misclassification errors and use it to develop a mixed 0-1 integer and linear program (MILP) for the k piecewise-linear and convex separation of data. The proposed model is meritorious in that it considers the synergy as well as the individual role of the k hyperplanes in constructing a decision surface, and it exploits advances in MILP theory and algorithms and the advent of powerful MILP software for its solution. With artificially created data, we illustrate the pros and cons of pattern classification by the proposed methodology. With six benchmark classification datasets, we demonstrate that the proposed approach is effective and competitive with well-established learning methods. In summary, the classifiers constructed by the proposed approach obtain the best prediction rates on three of the six datasets and the second-best rates on two of the remaining three.
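To make the geometry concrete, a convex piecewise-linear discriminant built from k hyperplanes can be evaluated as the pointwise maximum of k affine functions, f(x) = max_j (w_j · x + b_j), with the sign of f(x) assigning the class. The sketch below illustrates only this decision rule; the paper's MILP, which jointly optimizes the k hyperplanes under the l1-norm error metric, is not reproduced here, and the weights shown are illustrative placeholders rather than fitted values.

```python
import numpy as np

def piecewise_linear_discriminant(X, W, b):
    """Evaluate f(x) = max_j (w_j . x + b_j) for each row of X.

    X: (n, d) data matrix; W: (k, d) hyperplane normals; b: (k,) offsets.
    The max over affine pieces makes f convex and piecewise linear.
    """
    return (X @ W.T + b).max(axis=1)

def classify(X, W, b):
    # Points with f(x) <= 0 lie inside the convex region cut out by the
    # k hyperplanes (class "A"); points with f(x) > 0 lie outside ("B").
    return np.where(piecewise_linear_discriminant(X, W, b) <= 0, "A", "B")

# Illustrative example: k = 4 hyperplanes enclosing the box [-1, 1]^2.
W = np.array([[1.0, 0.0], [-1.0, 0.0], [0.0, 1.0], [0.0, -1.0]])
b = np.array([-1.0, -1.0, -1.0, -1.0])
X = np.array([[0.0, 0.0], [2.0, 0.0]])
print(classify(X, W, b))  # inside point -> "A", outside point -> "B"
```

Note that a single max-of-affine discriminant carves out one convex region, which is why several hyperplanes acting together (the "synergy" the abstract mentions) are needed to separate data that no single hyperplane can.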