In this paper, we consider semi-supervised classification on evolutionary data, where the distribution of the data and the underlying concept to be learned change over time due to short-term noise and long-term drift, making a single aggregated classifier inapplicable for long-term classification. The drift is smooth when viewed locally along the time dimension, which allows us to impose a temporal smoothness assumption on the learning algorithm. We first discuss how to encode this assumption using temporal regularizers defined structurally with respect to the Hilbert space, and then derive an online algorithm that efficiently finds the closed-form solution for the classification functions. Experimental results on real-world evolutionary mailing list data demonstrate that our algorithm outperforms classical semi-supervised learning algorithms in both algorithmic stability and classification accuracy.
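To illustrate the idea of a temporal regularizer with a closed-form per-step solution, the following is a minimal sketch, not the paper's actual algorithm: it uses a linear model and a quadratic penalty `mu * ||w - w_prev||^2` that ties each time step's classifier to the previous one, so short-term noise is damped while long-term drift is tracked. The function name and parameters (`lam`, `mu`) are hypothetical.

```python
import numpy as np

def temporally_smoothed_ridge(X, y, w_prev, lam=1.0, mu=1.0):
    """Solve, in closed form, the per-time-step objective (illustrative):

        min_w  ||X w - y||^2  +  lam * ||w||^2  +  mu * ||w - w_prev||^2

    The mu-term is the temporal smoothness penalty: it pulls the new
    classifier toward the one learned at the previous time step.
    Setting the gradient to zero gives the linear system solved below.
    """
    d = X.shape[1]
    A = X.T @ X + (lam + mu) * np.eye(d)   # regularized Gram matrix
    b = X.T @ y + mu * w_prev              # data term + temporal anchor
    return np.linalg.solve(A, b)

# Usage: process a drifting stream one batch at a time, warm-starting
# each step from the previous step's solution.
rng = np.random.default_rng(0)
w = np.zeros(3)
for t in range(5):
    w_true = np.array([1.0, -2.0, 0.5]) + 0.05 * t   # slow concept drift
    X = rng.normal(size=(50, 3))
    y = X @ w_true + 0.1 * rng.normal(size=50)       # short-term noise
    w = temporally_smoothed_ridge(X, y, w, lam=0.1, mu=0.1)
```

A large `mu` yields a very stable but slowly adapting classifier; a small `mu` tracks drift quickly but is noisier, which mirrors the stability/accuracy trade-off the paper evaluates.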