Traditional unsupervised dimensionality reduction methods do not necessarily improve the separability of data residing in different clusters, because they ignore the inherent relationship between subspace selection and clustering. It is known that soft clustering using fuzzy c-means or its variants can provide a better and more meaningful data partition than hard clustering, which motivates us to develop a novel entropy-regularized soft K-means algorithm for discriminant analysis (ResKmeans) in this paper. ResKmeans performs soft clustering and subspace selection simultaneously, giving rise to a generalized linear discriminant analysis (GELDA) that captures both intra-cluster compactness and inter-cluster separability. Furthermore, we clarify the relationship between GELDA and conventional LDA, as well as the inherent relationship between subspace selection and soft clustering. Experimental results on real-world data sets show that ResKmeans is superior to other popular clustering algorithms.
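To make the entropy-regularized soft-clustering idea concrete, the following is a minimal sketch of an entropy-regularized soft K-means update (not the authors' ResKmeans implementation, which also performs subspace selection). Under the standard entropy-regularized formulation, memberships follow a Gibbs distribution over clusters, u_ij ∝ exp(-||x_i - c_j||² / λ), and centers are membership-weighted means; the parameter name `lam` and the function name are illustrative assumptions.

```python
import numpy as np

def entropy_soft_kmeans(X, centers, lam=1.0, n_iter=50):
    """Sketch of entropy-regularized soft K-means.

    X       : (n, d) data matrix
    centers : (k, d) initial cluster centers (hypothetical interface choice)
    lam     : entropy regularization weight; smaller values give
              harder (more peaked) memberships
    Returns the (n, k) membership matrix U and final centers.
    """
    centers = centers.astype(float).copy()
    for _ in range(n_iter):
        # squared Euclidean distances from each point to each center
        d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=-1)
        # soft assignments: Gibbs distribution over clusters
        logits = -d2 / lam
        logits -= logits.max(axis=1, keepdims=True)  # numerical stability
        U = np.exp(logits)
        U /= U.sum(axis=1, keepdims=True)
        # centers become membership-weighted means of the data
        centers = (U.T @ X) / U.sum(axis=0)[:, None]
    return U, centers
```

As λ → 0 the memberships approach the hard assignments of ordinary K-means, while larger λ spreads each point's membership across clusters, which is the soft-partition behavior the abstract contrasts with hard clustering.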