Quantization Complexity and Independent Measurements
IEEE Transactions on Computers
A Classifier Design Technique for Discrete Variable Pattern Recognition Problems
IEEE Transactions on Computers
Probability Error in Global Optimal Hierarchical Classifier with Intuitionistic Fuzzy Observations
HAIS '09 Proceedings of the 4th International Conference on Hybrid Artificial Intelligence Systems
Probability Error in Bayes Optimal Classifier with Intuitionistic Fuzzy Observations
ICIAR '09 Proceedings of the 6th International Conference on Image Analysis and Recognition
Interval-valued fuzzy observations in Bayes classifier
IDEAL'09 Proceedings of the 10th International Conference on Intelligent Data Engineering and Automated Learning
Cost-sensitive classification in multistage classifier with fuzzy observations of object features
HAIS'11 Proceedings of the 6th International Conference on Hybrid Artificial Intelligent Systems - Volume Part II
Estimations of the error in Bayes classifier with fuzzy observations
ICCCI'11 Proceedings of the Third International Conference on Computational Collective Intelligence: Technologies and Applications - Volume Part I
Randomness and fuzziness in Bayes multistage classifier
HAIS'10 Proceedings of the 5th International Conference on Hybrid Artificial Intelligence Systems - Volume Part I
Decision rules for a hierarchical classifier
Pattern Recognition Letters
A performance measure is derived for a multiclass hierarchical classifier under the assumption that a maximum likelihood rule is used at each node and the features at different nodes of the tree are class-conditionally statistically independent. The mean accuracy of an estimated hierarchical classifier is then defined as its performance averaged across all classification problems, when an estimated decision rule is used at every node. For a balanced binary decision tree, it is shown that there exists an optimum number of quantization levels for the features which maximizes the mean accuracy. The optimum quantization level increases with the number of training samples per class available to estimate the node decisions and is a nondecreasing function of the depth of the tree.
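The trade-off behind the optimum quantization level can be illustrated with a small Monte Carlo sketch for a single maximum-likelihood node (an illustration under assumed conventions, not the paper's derivation). Here the class-conditional cell probabilities of each random classification problem are drawn from a uniform Dirichlet prior, the node rule is estimated from `n` training samples per class, and the true accuracy of that estimated rule is averaged over problems; the function name `mean_accuracy` and the Dirichlet/multinomial modeling choices are assumptions of this sketch.

```python
import numpy as np

rng = np.random.default_rng(0)

def mean_accuracy(m, n, trials=2000):
    """Monte Carlo estimate of mean accuracy for one two-class ML node.

    m      -- number of quantization levels (discrete feature cells)
    n      -- training samples per class used to estimate the rule
    trials -- number of random classification problems to average over
    """
    acc = 0.0
    for _ in range(trials):
        # Random problem: class-conditional cell probabilities from a
        # uniform Dirichlet prior (all problems equally likely).
        p = rng.dirichlet(np.ones(m))   # class 1
        q = rng.dirichlet(np.ones(m))   # class 2
        # Training counts per cell, n samples per class.
        c1 = rng.multinomial(n, p)
        c2 = rng.multinomial(n, q)
        # Estimated maximum-likelihood rule: assign each cell to the
        # class with the larger empirical count (ties to class 1).
        assign1 = c1 >= c2
        # True accuracy of the estimated rule, equal priors.
        acc += 0.5 * (p[assign1].sum() + q[~assign1].sum())
    return acc / trials
```

Sweeping `m` for a fixed small `n` with this sketch exhibits the peaked mean-accuracy curve the abstract describes: too few levels lose discriminating power, while too many levels leave the per-cell estimates undertrained.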