A concept-based model for enhancing text categorization
Proceedings of the 13th ACM SIGKDD international conference on Knowledge discovery and data mining
Most text mining techniques are based on word and/or phrase analysis of the text. Statistical analysis of a term's (word or phrase) frequency captures the importance of the term within a document. However, to achieve a more accurate analysis, the underlying mining technique should identify terms that capture the semantics of the text, from which the importance of a term in a sentence and in the document can be derived. A new concept-based mining model is introduced that relies on the analysis of both the sentence and the document, rather than the traditional analysis of the document dataset only. The proposed mining model consists of a concept-based analysis of terms and a concept-based similarity measure. Each term that contributes to the sentence semantics is analyzed with respect to its importance at the sentence and document levels. The model can efficiently find significant matching terms, either words or phrases, across documents according to the semantics of the text. The similarity between documents relies on a new concept-based similarity measure applied to the matching terms between documents. Experiments using the proposed concept-based term analysis and similarity measure in text clustering are conducted, and the results demonstrate that the newly developed concept-based mining model substantially enhances the clustering quality of sets of documents.
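The abstract does not give the model's formulas, but the core idea of weighting a term by its importance at both the sentence and the document level, and then comparing documents only on their matching terms, can be sketched roughly as follows. This is a minimal illustration under assumed definitions: here a term's sentence-level importance is taken as its average frequency over the sentences containing it, its document-level importance as its document frequency, and the similarity as a sum of weight products over shared terms. The function names `concept_weights` and `concept_similarity` are hypothetical, not from the paper.

```python
from collections import Counter

def concept_weights(doc_sentences):
    """Weight each term by combining sentence-level and document-level importance.

    doc_sentences: a document as a list of tokenized sentences (lists of terms).
    For each term, ctf (assumed sentence-level importance) is its average
    frequency over the sentences in which it appears, and tf (document-level
    importance) is its relative frequency in the whole document; the weight
    is their product.
    """
    doc_terms = [t for sent in doc_sentences for t in sent]
    tf = Counter(doc_terms)
    n = len(doc_terms) or 1
    weights = {}
    for term in tf:
        per_sentence = [sent.count(term) for sent in doc_sentences if term in sent]
        ctf = sum(per_sentence) / len(per_sentence)  # sentence-level importance
        weights[term] = (tf[term] / n) * ctf         # combined weight
    return weights

def concept_similarity(doc_a, doc_b):
    """Similarity computed over matching terms only: sum of weight products."""
    wa, wb = concept_weights(doc_a), concept_weights(doc_b)
    return sum(wa[t] * wb[t] for t in wa.keys() & wb.keys())

# Example: two small tokenized documents sharing the term "mining".
doc1 = [["text", "mining", "model"], ["mining", "semantics"]]
doc2 = [["concept", "mining"], ["clustering", "documents"]]
sim = concept_similarity(doc1, doc2)
```

A pairwise matrix of such similarities could then be fed to any standard clustering algorithm (e.g. hierarchical or k-means on the induced feature space), which matches the evaluation setting described in the abstract.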