Automatic grading systems for summaries and essays have been studied for years. Most commercial and research implementations are based on statistical methods, such as Latent Semantic Analysis (LSA), which can measure the similarity between a student essay and graded or standard essays with high accuracy but offer very limited feedback. In the present work, we propose a novel method that provides both grades and meaningful feedback for student summaries using Ontology-based Information Extraction (OBIE). We use ontological concepts and relationships to create extraction rules that identify correct statements. Based on ontology constraints (e.g., disjointness between concepts), we define patterns that are logically inconsistent with the ontology and use them to create rules that extract incorrect statements. Experiments show that the grades OBIE assigns to 18 student summaries on ecosystems correlate with human grades, and that OBIE also provides meaningful feedback on the errors those students made in their summaries.
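The disjointness-based error detection described in the abstract can be sketched roughly as follows. This is a minimal illustration, not the authors' implementation: the ontology facts, class names, and `check_statement` helper are all hypothetical, and a real OBIE system would extract statements from summary text before checking them.

```python
# Toy sketch of flagging a student's statement as incorrect when it
# contradicts a disjointness constraint in the domain ontology.
# All data below is hypothetical (inspired by the Ecosystems domain).

# Known ontology facts: individual -> class it belongs to
ontology = {"grass": "Producer", "rabbit": "Consumer"}

# Pairs of classes declared disjoint in the ontology
disjoint = {("Producer", "Consumer"), ("Consumer", "Producer")}

def check_statement(individual, claimed_class):
    """Classify the claim "<individual> is a <claimed_class>" as
    'correct', 'incorrect', or 'unknown' against the ontology."""
    known = ontology.get(individual)
    if known is None:
        return "unknown"          # nothing asserted about this individual
    if known == claimed_class:
        return "correct"          # matches the ontology
    if (known, claimed_class) in disjoint:
        return "incorrect"        # logically inconsistent with the ontology
    return "unknown"

print(check_statement("grass", "Producer"))   # correct
print(check_statement("grass", "Consumer"))   # incorrect: Producer and Consumer are disjoint
```

In this framing, rules for correct statements come from the ontology's asserted facts, while rules for incorrect statements come from patterns (like the disjointness check above) that are guaranteed to be inconsistent with the ontology, which is what allows the system to attach targeted feedback to each flagged error.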