Boolean Feature Discovery in Empirical Learning. Machine Learning.
A general lower bound on the number of examples needed for learning. COLT '88: Proceedings of the First Annual Workshop on Computational Learning Theory.
Empirical Learning as a Function of Concept Character. Machine Learning.
Feature construction: an analytic framework and an application to decision trees.
The replication problem: a constructive induction approach. EWSL-91: Proceedings of the European Working Session on Learning on Machine Learning.
Learning hard concepts through constructive induction: framework and rationale. Computational Intelligence.
Opportunistic constructive induction: using fragments of domain knowledge to guide construction.
Iterative feature construction for improving inductive learning algorithms. Expert Systems with Applications: An International Journal.
On preprocessing data for financial credit risk evaluation. Expert Systems with Applications: An International Journal.
Generation of attributes for learning algorithms. AAAI'96: Proceedings of the Thirteenth National Conference on Artificial Intelligence, Volume 1.
Embedding Monte Carlo search of features in tree-based ensemble methods. ECML PKDD'12: Proceedings of the 2012 European Conference on Machine Learning and Knowledge Discovery in Databases, Part I.
Unsupervised feature construction for improving data representation and semantics. Journal of Intelligent Information Systems.
A class of concept learning algorithms, CL, augments standard similarity-based learning (SBL) techniques by performing feature construction based on the SBL output. Pagallo and Haussler's FRINGE, Pagallo's extension Symmetric FRINGE (SymFringe), and a refinement we call DCFringe are all instances of this class that use decision trees as their underlying representation. These methods use patterns at the fringe of the tree to guide their construction, but DCFringe restricts construction to limited forms of conjunction and disjunction. Experiments with small DNF and CNF concepts show that DCFringe outperforms both the purely conjunctive FRINGE and the less restrictive SymFringe in terms of accuracy, conciseness, and efficiency. Further, the gain of these methods is linked to the size of the training set. We discuss the apparent limitation of current methods to concepts exhibiting a low degree of feature interaction, and suggest ways to alleviate it. This leads to a feature construction approach based on a wider variety of patterns, restricted by statistical measures and optional knowledge.
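The core idea shared by these methods can be sketched in a few lines: after an SBL run produces a decision tree, new features are built from patterns at the tree's fringe, e.g. by conjoining the last two tests on the path to a positive leaf. The following is a minimal illustrative sketch, not the authors' implementation; the tree encoding, function names, and the restriction to purely conjunctive fringe patterns (as in FRINGE) are all assumptions made for this example.

```python
# A boolean decision tree is a nested tuple, either ("leaf", label)
# or ("node", feature_name, negative_subtree, positive_subtree).
# This encoding is hypothetical, chosen only for the sketch.

def fringe_features(tree, path=()):
    """Collect candidate conjunctive features from the tree's fringe.

    For each positive leaf, conjoin the last two tests on the path
    to that leaf (the fringe pattern) into one new feature,
    represented as a tuple of (feature, required_value) pairs.
    """
    if tree[0] == "leaf":
        _, label = tree
        if label == 1 and len(path) >= 2:
            return {path[-2:]}          # parent and grandparent tests
        return set()
    _, feat, neg, pos = tree
    return (fringe_features(neg, path + ((feat, 0),)) |
            fringe_features(pos, path + ((feat, 1),)))

def apply_feature(example, pattern):
    """Evaluate the conjunction described by `pattern` on an example."""
    return int(all(example.get(f) == v for f, v in pattern))

# Toy tree learned for the concept (a AND b): test a, then b.
tree = ("node", "a",
        ("leaf", 0),
        ("node", "b", ("leaf", 0), ("leaf", 1)))

new_feats = fringe_features(tree)
# The single fringe pattern found is the conjunction a=1 AND b=1,
# which could then be added as a new attribute and the learner rerun.
```

In an iterative scheme like FRINGE, the constructed features are added to the example representation and the tree learner is invoked again, repeating until no new fringe patterns appear; DCFringe would additionally admit restricted disjunctive patterns at this step.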