Microarray experiments are a promising tool for early diagnosis and disease treatment. The datasets obtained in these experiments typically consist of a small number of instances and a large number of covariates, most of which are irrelevant for discrimination. These characteristics pose severe difficulties for standard learning algorithms. A Bayesian approach can be useful to overcome these problems and produce more accurate and robust predictions. However, exact Bayesian inference is computationally costly and in many cases infeasible, so in practice some form of approximation has to be made. In this paper we consider a Bayesian linear model for microarray data classification based on a prior distribution that favors sparsity in the model coefficients. Expectation Propagation (EP) is then used to perform approximate inference as an alternative to computationally more expensive methods, such as Markov Chain Monte Carlo (MCMC) sampling. The model is evaluated on 15 microarray datasets and compared with other state-of-the-art classification algorithms. These experiments show that the Bayesian model trained with EP performs well on the datasets investigated and is also useful for identifying relevant genes for subsequent analysis.
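To make the setting concrete, the sketch below builds a toy "few samples, many genes" classification problem and fits a sparse linear classifier. It does not implement the paper's EP algorithm (which computes a full Gaussian posterior approximation); as a hypothetical stand-in it uses the MAP estimate under a Laplace (sparsity-favoring) prior, which reduces to L1-penalized logistic regression solved by proximal gradient descent. All names and parameter values here are illustrative assumptions.

```python
import math
import random

random.seed(0)
n, d = 30, 50  # 30 "patients", 50 "genes": many more covariates than instances
X = [[random.gauss(0.0, 1.0) for _ in range(d)] for _ in range(n)]
# Only genes 0 and 1 carry signal; the remaining 48 are irrelevant.
y = [1 if x[0] - x[1] > 0 else 0 for x in X]

w = [0.0] * d
lam, step = 0.05, 0.1  # Laplace prior scale (L1 penalty) and learning rate


def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))


for _ in range(500):
    # Gradient of the (averaged) negative log-likelihood of logistic regression.
    grad = [0.0] * d
    for xi, yi in zip(X, y):
        p = sigmoid(sum(wj * xj for wj, xj in zip(w, xi)))
        for j in range(d):
            grad[j] += (p - yi) * xi[j]
    # Gradient step followed by soft-thresholding: the proximal map of the
    # L1 penalty, which drives irrelevant coefficients exactly to zero.
    for j in range(d):
        wj = w[j] - step * grad[j] / n
        w[j] = math.copysign(max(abs(wj) - step * lam, 0.0), wj)

selected = [j for j in range(d) if w[j] != 0.0]
print("selected genes:", selected)
```

The sparsity-favoring prior plays the same role here as in the paper's model: most coefficients are shrunk to zero, and the surviving ones point to candidate genes for subsequent analysis. Full Bayesian treatment via EP would additionally yield uncertainty estimates for each coefficient rather than a single point estimate.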