Partition entropy is a numerical measure of the uncertainty in a partition of a finite set, while conditional entropy measures the difficulty of predicting a decision partition when a condition partition is given. Because there are two direct ways to define a conditional entropy from a given partition entropy, the monotonicity inequalities that conditional entropy must satisfy act as additional constraints on the underlying partition entropy. In this paper, partition entropy is therefore defined as a function of a probability distribution that satisfies not only the inequality postulates of partition entropy itself but also those of its conditional counterpart. These inequality postulates formalize the intuitive understanding of the uncertainty contained in partitions of finite sets. We study the relationships among these inequalities and reduce the redundancies among them. For each of the two ways of defining conditional entropy from a partition entropy, we present convenient, unified checking conditions that any partition entropy must satisfy. These properties generalize and illuminate the common nature of all partition entropies.
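As a concrete illustration (not taken from the paper, which treats the general axiomatic family), the Shannon entropy is one instance of a partition entropy, and the standard averaged definition is one of the two direct ways of deriving a conditional entropy from it. A minimal sketch, representing partitions of a finite universe as lists of frozensets:

```python
from math import log2

def partition_entropy(block_sizes, n):
    """Shannon entropy (in bits) of a partition, given its block sizes
    and the size n of the universe. One instance of a partition entropy."""
    return -sum((b / n) * log2(b / n) for b in block_sizes if b > 0)

def conditional_entropy(condition, decision, universe):
    """H(D | C): the expected entropy of the decision partition D
    restricted to each block of the condition partition C, weighted
    by block probability. One of the two direct derivations of a
    conditional entropy from a partition entropy."""
    n = len(universe)
    h = 0.0
    for c in condition:
        # Sizes of the blocks of D intersected with the condition block c.
        sizes = [len(c & d) for d in decision if c & d]
        h += (len(c) / n) * partition_entropy(sizes, len(c))
    return h

# Example: a 4-element universe with two crossing partitions.
U = frozenset({1, 2, 3, 4})
C = [frozenset({1, 2}), frozenset({3, 4})]
D = [frozenset({1, 3}), frozenset({2, 4})]

print(partition_entropy([2, 2], 4))   # entropy of either partition: 1.0 bit
print(conditional_entropy(C, D, U))   # C gives no information about D: 1.0 bit
print(conditional_entropy(C, C, U))   # a partition fully determines itself: 0.0
```

The monotonicity postulates mentioned in the abstract are visible here: conditioning never increases entropy, and the conditional entropy of a partition given itself is zero.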