Nonnegative Matrix Tri-Factorization (NMTF) and its graph-regularized extensions have been widely used for co-clustering, which groups data points and features simultaneously. However, existing methods are sensitive to noise and outliers because the squared loss function is used to measure both the quality of data reconstruction and the graph regularization errors. In this paper, we extend graph-regularized NMTF (GNMTF) by introducing a sparse outlier matrix into the data reconstruction function and by applying the l1 norm to measure graph dual regularization errors, which leads to a novel Robust Co-Clustering (RCC) method. Accordingly, RCC is expected to obtain a more faithful approximation to the data once sparse outliers are removed, and to achieve robust regularization by reducing the regularization errors of unreliable graphs via the l1 norm. To solve the RCC optimization problem, we provide an alternating iterative algorithm and prove its convergence. We also show the connection between the sparse outlier matrix in the data reconstruction function and the robust Huber M-estimator. Experimental results on real-world data sets show that RCC consistently outperforms competing algorithms in terms of clustering performance, which validates the effectiveness and robustness of the proposed approach.
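To illustrate the core idea of a sparse outlier matrix in the reconstruction term, the following minimal sketch alternates between NMF-style multiplicative factor updates on the outlier-corrected data and a soft-thresholding update of the outlier matrix (the proximal step induced by the l1 penalty). This is an assumed simplification, not the authors' exact algorithm: it uses a plain two-factor model instead of tri-factorization, omits the graph dual regularization terms, and the update scheme and parameter names (`robust_nmtf`, `lam`) are hypothetical.

```python
import numpy as np

def soft_threshold(A, tau):
    # Proximal operator of the l1 norm: shrinks each entry toward zero by tau.
    return np.sign(A) * np.maximum(np.abs(A) - tau, 0.0)

def robust_nmtf(X, k, lam=0.5, n_iter=100, eps=1e-9, seed=0):
    """Illustrative alternating scheme for
        min_{F,G >= 0, S}  ||X - S - F G^T||_F^2 + lam * ||S||_1,
    where S absorbs sparse outliers. Graph regularization is omitted here.
    """
    rng = np.random.default_rng(seed)
    m, n = X.shape
    F = rng.random((m, k))
    G = rng.random((n, k))
    S = np.zeros_like(X, dtype=float)
    for _ in range(n_iter):
        R = X - S  # data with the current outlier estimate removed
        # Multiplicative NMF updates on the cleaned data R; the numerator is
        # clipped at zero so F and G stay nonnegative even if R has negatives.
        F *= np.clip(R @ G, 0.0, None) / (F @ (G.T @ G) + eps)
        G *= np.clip(R.T @ F, 0.0, None) / (G @ (F.T @ F) + eps)
        # Outlier update: argmin_S ||(X - F G^T) - S||_F^2 + lam * ||S||_1
        # is entrywise soft-thresholding of the residual at lam / 2.
        S = soft_threshold(X - F @ G.T, lam / 2.0)
    return F, G, S
```

Eliminating S from the objective above is exactly what yields the Huber M-estimator connection mentioned in the abstract: minimizing over S first turns the squared loss on each residual into a Huber loss.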