Boosting is a highly successful classification technique that builds a linear combination of "weak" classifiers (a.k.a. base learners) to obtain high-quality classification models. In this paper we propose a new boosting algorithm in which the base learners have structural relationships in the functional space. Although such relationships are generic, our work is particularly motivated by the emerging topic of pattern-based classification for semi-structured data, including graphs. To incorporate this structure information efficiently, we design a general model in which an undirected graph captures the relationships among subgraph-based base learners. Our method combines an L1-norm penalty and a Laplacian-based L2-norm penalty with the logit loss function of LogitBoost, thereby enforcing both model sparsity and smoothness in the functional space spanned by the basis functions. We derive efficient coordinate-descent optimization algorithms for the new boosting formulation and theoretically prove that it exhibits a natural grouping effect for spatially close or overlapping features. A comprehensive experimental study demonstrates the effectiveness of the proposed learning methods.
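The objective described above can be sketched as follows. This is a minimal illustration, not the authors' implementation: it minimizes the logit loss plus an L1 penalty and a Laplacian-based L2 penalty over the coefficients of the base learners, using cyclic coordinate descent with a standard second-order upper bound on the logistic loss. All names (`H`, `y`, `Lap`, `lam1`, `lam2`) are illustrative assumptions.

```python
import numpy as np

def soft_threshold(z, t):
    """Soft-thresholding operator handling the L1 part of the penalty."""
    return np.sign(z) * max(abs(z) - t, 0.0)

def laplacian_logit_boost(H, y, Lap, lam1=0.1, lam2=0.1, n_iter=100):
    """Minimize  sum_i log(1 + exp(-y_i * H[i] @ beta))
                 + lam1 * ||beta||_1 + lam2 * beta @ Lap @ beta
    by cyclic coordinate descent.

    H   : (n, p) matrix of base-learner (e.g. subgraph-indicator) outputs
    y   : (n,) labels in {-1, +1}
    Lap : (p, p) graph Laplacian encoding relationships between base learners
    """
    n, p = H.shape
    beta = np.zeros(p)
    for _ in range(n_iter):
        margins = y * (H @ beta)
        for j in range(p):
            # Gradient of the smooth part (logit loss + Laplacian penalty).
            sigma = 1.0 / (1.0 + np.exp(margins))
            g = -np.sum(y * H[:, j] * sigma) + 2.0 * lam2 * (Lap[j] @ beta)
            # Curvature upper bound: logistic second derivative <= 1/4,
            # plus the quadratic penalty's diagonal term.
            h = 0.25 * np.sum(H[:, j] ** 2) + 2.0 * lam2 * Lap[j, j]
            old = beta[j]
            beta[j] = soft_threshold(old - g / h, lam1 / h)
            margins += y * H[:, j] * (beta[j] - old)  # incremental margin update
    return beta
```

Because the Laplacian term penalizes differences between coefficients of base learners that are adjacent in the graph, features that are spatially close or overlapping tend to receive similar weights, which is the grouping effect the abstract refers to.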