In many applications, such as bioinformatics and medical decision-making, interpretability is important: it makes a model acceptable to its users and helps experts discover novel, potentially valuable knowledge hidden in the data. This paper presents a novel feature selection and rule extraction method based on the multiple kernel support vector machine (MK-SVM). The method has two notable properties. First, the multiple kernel is expressed as a convex combination of basic kernels, each defined on a single feature, which transforms the feature selection problem in the SVM setting into an ordinary multiple-parameter learning problem; a 1-norm based linear program is proposed to optimize these parameters. Second, the rules are obtained in a simple way: only the support vectors are needed. It is proved that every support vector obtained by this method is a vertex of the hypercube, and a tree-like algorithm is then proposed to extract if-then rules. Experiments on three UCI datasets demonstrate the effectiveness and efficiency of the approach.
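The central construction above — a multiple kernel built as a convex combination of basic kernels, each defined on a single feature — can be sketched as follows. This is a minimal illustration, not the paper's implementation: the per-feature RBF kernel, the `gamma` parameter, and the fixed example weights are assumptions for demonstration. A zero weight on a basic kernel removes its feature from the combined kernel, which is the mechanism that lets kernel-weight learning act as feature selection.

```python
import numpy as np

def basic_kernels(X, Z, gamma=1.0):
    """One basic kernel per feature (here an RBF on that single feature,
    an assumed choice): K_d(x, z) = exp(-gamma * (x_d - z_d)^2).
    Returns an array of shape (n_features, len(X), len(Z))."""
    diffs = X[:, None, :] - Z[None, :, :]      # (n, m, n_features)
    ks = np.exp(-gamma * diffs ** 2)           # per-feature RBF values
    return np.transpose(ks, (2, 0, 1))         # features first

def combined_kernel(X, Z, weights, gamma=1.0):
    """Convex combination K = sum_d beta_d * K_d with beta_d >= 0 and
    sum_d beta_d = 1. In the paper these weights are the 'multiple
    parameters' learned by a 1-norm linear program; here they are given."""
    w = np.asarray(weights, dtype=float)
    assert np.all(w >= 0) and np.isclose(w.sum(), 1.0), "weights must be convex"
    # Weighted sum over the feature axis of the stacked basic kernels.
    return np.tensordot(w, basic_kernels(X, Z, gamma), axes=1)
```

For example, with weights `[1.0, 0.0]` on two features, the combined kernel coincides with the first feature's basic kernel: the second feature has been selected away.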