Construction of decision trees by using feature importance value for improved learning performance
ICONIP'12 Proceedings of the 19th international conference on Neural Information Processing - Volume Part II
Standard hybrid learners that use domain knowledge require strong knowledge that is hard and expensive to acquire. Weaker domain knowledge, however, can still deliver the benefits of prior knowledge while being cost effective. Weak knowledge in the form of feature relative importance (FRI) is presented and explained. Feature relative importance is a real-valued approximation of a feature's importance provided by experts. The advantage of using this knowledge is demonstrated by IANN, a modified multilayer neural network algorithm. IANN is a very simple modification of the standard neural network algorithm, yet it attains significant performance gains. Experimental results in the field of molecular biology show higher performance than other empirical learning algorithms, including standard backpropagation and support vector machines. IANN's performance is even comparable to that of KBANN, a theory refinement system that uses stronger domain knowledge.
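One way such expert-supplied FRI values could plug into a neural network is by scaling the initial input-to-hidden weights, so that features rated more important start with larger-magnitude connections. This is only an illustrative sketch: the abstract does not specify how IANN incorporates FRI, so the scaling scheme, the function name, and the example FRI values below are all assumptions.

```python
import random

def init_fri_weights(n_features, n_hidden, fri, seed=0):
    """Initialize input-to-hidden weights scaled by feature relative
    importance (FRI): features the expert rates as more important start
    with larger-magnitude weights, biasing early learning toward them.
    NOTE: this scaling scheme is a hypothetical illustration, not the
    published IANN algorithm."""
    rng = random.Random(seed)
    top = max(fri)
    scale = [f / top for f in fri]  # normalize FRI to (0, 1]
    return [[rng.uniform(-0.5, 0.5) * scale[i] for _ in range(n_hidden)]
            for i in range(n_features)]

# Hypothetical FRI values for four features, supplied by a domain expert.
W = init_fri_weights(4, 3, fri=[0.9, 0.1, 0.5, 0.2])
```

After this initialization, training proceeds with ordinary backpropagation; the weak knowledge only biases the starting point rather than constraining the network topology, which is consistent with the abstract's claim that the modification to the standard algorithm is very simple.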