Interaction of Purposeful Agents that Use Different Ontologies
MICAI '00 Proceedings of the Mexican International Conference on Artificial Intelligence: Advances in Artificial Intelligence
Two agents previously unknown to each other cannot communicate by exchanging concepts (nodes of their own ontologies): they need a common communication language. If they do not use a standard protocol, they most likely use a natural language. Its ambiguities, together with the different concepts the agents possess, give rise to imperfect understanding between them: how closely do the concepts in ontology O_A map to those of O_B? Can we measure these mismatches? Given a concept from ontology O_A, a method is provided to find the most similar concept in O_B and to measure the similarity between the two concepts. The paper also gives an algorithm to gauge du(A, B), the degree of understanding that agent A has about the ontology of B. The procedures rely on word comparison, since no agent (except the Very Wise Creature, VWC) can measure du directly. Examples are given.
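The measures described above can be sketched in code. The following is an illustrative approximation only, not the paper's actual algorithm: concepts are represented as sets of descriptive words, similarity between two concepts is taken as word-set overlap (Jaccard), and du(A, B) is approximated as the average, over B's concepts, of the best match found in A's ontology. All names (`O_A`, `O_B`, the helper functions) are hypothetical.

```python
def word_similarity(words_a, words_b):
    """Jaccard similarity between two word sets (a stand-in for the
    paper's word-comparison procedure)."""
    a, b = set(words_a), set(words_b)
    return len(a & b) / len(a | b) if a | b else 0.0

def most_similar(concept_words, ontology):
    """Find the concept in `ontology` most similar to the given word set.
    Returns (concept_name, similarity)."""
    return max(((name, word_similarity(concept_words, words))
                for name, words in ontology.items()),
               key=lambda pair: pair[1])

def degree_of_understanding(ont_a, ont_b):
    """du(A, B): how well agent A's ontology covers agent B's,
    approximated as the mean best-match similarity over B's concepts."""
    scores = [most_similar(words_b, ont_a)[1] for words_b in ont_b.values()]
    return sum(scores) / len(scores) if scores else 0.0

# Toy ontologies: concept name -> set of words describing the concept.
O_A = {"car": {"car", "vehicle", "wheels"},
       "boat": {"boat", "vessel", "water"}}
O_B = {"automobile": {"car", "vehicle", "engine"},
       "ship": {"vessel", "water", "large"}}

print(most_similar(O_B["automobile"], O_A))          # ('car', 0.5)
print(degree_of_understanding(O_A, O_B))             # 0.5
```

Note that du is not symmetric: du(A, B) averages over B's concepts, so an agent with a richer ontology may understand a poorer one better than the reverse, which is the asymmetry the abstract's phrasing ("the degree of understanding that agent A has about the ontology of B") implies.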