Structured induction in expert systems.
Introduction to statistical pattern recognition (2nd ed.).
C4.5: Programs for machine learning.
Learning Boolean concepts in the presence of many irrelevant features. Artificial Intelligence.
Controlling constructive induction in CiPF: an MDL approach. Proceedings of the European Conference on Machine Learning (ECML-94).
The nature of statistical learning theory.
Error reduction through learning multiple descriptions. Machine Learning.
From data mining to knowledge discovery: an overview. Advances in Knowledge Discovery and Data Mining.
Graphical models for discovering knowledge. Advances in Knowledge Discovery and Data Mining.
On the optimality of the simple Bayesian classifier under zero-one loss. Machine Learning (special issue on learning with probabilistic representations).
Combining labeled and unlabeled data with co-training. Proceedings of the Eleventh Annual Conference on Computational Learning Theory (COLT '98).
Feature selection for knowledge discovery and data mining.
Knowledge discovery and data mining: The Info-Fuzzy Network (IFN) methodology.
Data mining by attribute decomposition with semiconductor manufacturing case study. Data Mining for Design and Manufacturing.
On bias, variance, 0/1-loss, and the curse of dimensionality. Data Mining and Knowledge Discovery.
On comparing classifiers: Pitfalls to avoid and a recommended approach. Data Mining and Knowledge Discovery.
Feature transformation by function decomposition. IEEE Intelligent Systems.
Semi-naive Bayesian classifier. Proceedings of the European Working Session on Machine Learning (EWSL '91).
Problem decomposition and the learning of skills. Proceedings of the 8th European Conference on Machine Learning (ECML '95).
Generalization bounds for decision trees. Proceedings of the Thirteenth Annual Conference on Computational Learning Theory (COLT '00).
An investigation of analysis techniques for software datasets. Proceedings of the 6th International Symposium on Software Metrics (METRICS '99).
A projection pursuit algorithm for exploratory data analysis. IEEE Transactions on Computers.
Solving multiclass learning problems via error-correcting output codes. Journal of Artificial Intelligence Research.
IEEE Transactions on Systems, Man, and Cybernetics, Part C: Applications and Reviews.
Computational Statistics & Data Analysis.
A self-organized, distributed, and adaptive rule-based induction system. IEEE Transactions on Neural Networks.
A class decomposition approach for GA-based classifiers. Engineering Applications of Artificial Intelligence.
Artificial Intelligence Review.
Privacy-preserving data mining: A feature set partitioning approach. Information Sciences.
Develop multi-hierarchy classification model: rough set based feature decomposition method. Proceedings of the Third International Conference on Advances in Pattern Recognition (ICAPR '05), Part I.
Hybrid random subsample classifier ensemble for high dimensional data sets. International Journal of Hybrid Intelligent Systems.
This paper presents the Feature Decomposition Approach for improving supervised learning tasks. Whereas Feature Selection aims to identify a single representative set of features from which to construct a classification model, Feature Decomposition aims to decompose the original set of features into several subsets. A classification model is built for each subset, and all generated models are then combined. The paper presents both theoretical and practical aspects of the Feature Decomposition Approach. A greedy procedure, called DOT (Decomposed Oblivious Trees), is developed to decompose the input feature set into subsets and to build a separate classification model for each subset. The results of an empirical comparison with well-known learning algorithms (such as C4.5) indicate the superiority of the Feature Decomposition Approach on learning tasks that contain a high number of features and a moderate number of tuples.
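The following is a minimal sketch of the feature decomposition idea described above, not the paper's DOT procedure: the feature set is split into disjoint subsets, one classifier is fitted per subset, and the per-subset class-probability estimates are combined at prediction time. The helper `decompose_features`, the round-robin split, the use of scikit-learn's `DecisionTreeClassifier` as a stand-in for DOT's oblivious trees, and the naive-Bayes-style probability product are all illustrative assumptions.

```python
# Sketch of feature decomposition (illustrative assumptions throughout):
# split features into disjoint subsets, fit one model per subset, and
# combine the models by multiplying their class-probability estimates.
import numpy as np
from sklearn.tree import DecisionTreeClassifier


def decompose_features(n_features, n_subsets):
    """Round-robin split of feature indices (stand-in for DOT's greedy search)."""
    return [list(range(i, n_features, n_subsets)) for i in range(n_subsets)]


def fit_decomposed(X, y, n_subsets=3):
    # Build one classifier per feature subset.
    subsets = decompose_features(X.shape[1], n_subsets)
    models = [DecisionTreeClassifier().fit(X[:, cols], y) for cols in subsets]
    return subsets, models


def predict_decomposed(X, subsets, models):
    # Naive-Bayes-like fusion: multiply per-subset class probabilities
    # and predict the class with the largest combined score.
    proba = np.ones((X.shape[0], models[0].n_classes_))
    for cols, model in zip(subsets, models):
        proba *= model.predict_proba(X[:, cols])
    return models[0].classes_[np.argmax(proba, axis=1)]
```

In the paper, DOT selects the subsets with a greedy search and builds an oblivious decision tree for each one; the round-robin split above only marks where that search would plug in.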