In some classification problems there is prior information about the joint relevance of groups of features. This knowledge can be encoded in a network whose nodes correspond to features and whose edges connect features that should be either both excluded or both included in the predictive model. In this paper, we introduce a novel network-based sparse Bayesian classifier (NBSBC) that makes use of the information about feature dependencies encoded in such a network to improve its prediction accuracy, especially in problems with a high-dimensional feature space and a limited amount of available training data. Approximate Bayesian inference is efficiently implemented in this model using expectation propagation. The NBSBC method is validated on four real-world classification problems from different domains of application: phonemes, handwritten digits, precipitation records and gene expression measurements. A comparison with state-of-the-art methods (support vector machine, network-based support vector machine and graph lasso) shows that NBSBC has excellent predictive performance. It has the best accuracy in three of the four problems analyzed and ranks second in the modeling of the precipitation data. NBSBC also yields accurate and robust rankings of the individual features according to their relevance to the solution of the classification problem considered. The accuracy and stability of these estimates are an important factor in the good overall performance of this method.
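The abstract does not spell out how the feature network enters the model, but the stated idea (connected features should be either both included or both excluded) can be illustrated with a toy sketch. The following is an illustrative assumption, not the paper's actual prior: a Markov-random-field-style coupling over binary inclusion indicators, with a sparsity penalty `alpha` and an edge-agreement reward `beta`; both parameter names and the exact functional form are hypothetical.

```python
import itertools
import math

def log_prior(z, edges, alpha=-1.0, beta=0.5):
    """Unnormalized log-prior over feature-inclusion indicators z (0/1 tuple).

    alpha < 0 penalizes including each feature (encourages sparsity);
    beta > 0 rewards network edges whose endpoints agree (both included
    or both excluded), encoding the assumption that connected features
    are jointly relevant or jointly irrelevant.
    """
    sparsity = alpha * sum(z)
    agreement = beta * sum(1.0 for i, j in edges if z[i] == z[j])
    return sparsity + agreement

def prior_probs(n_features, edges, alpha=-1.0, beta=0.5):
    """Exact normalized prior by brute-force enumeration (tiny n only)."""
    configs = list(itertools.product([0, 1], repeat=n_features))
    logs = [log_prior(z, edges, alpha, beta) for z in configs]
    m = max(logs)  # subtract max before exponentiating, for stability
    weights = [math.exp(l - m) for l in logs]
    total = sum(weights)
    return {z: w / total for z, w in zip(configs, weights)}

edges = [(0, 1)]  # features 0 and 1 are linked in the network
p = prior_probs(2, edges)
# Agreeing configurations such as (0, 0) are favored over disagreeing
# ones such as (0, 1), which carry the same number of included features
# but receive no edge-agreement reward.
assert p[(0, 0)] > p[(0, 1)]
```

In a real high-dimensional setting this enumeration over 2^n configurations is infeasible, which is one motivation for the approximate inference scheme (expectation propagation) mentioned in the abstract.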