The knowledge-representation tradition in computational lexicon design represents words as static encapsulations of purely lexical knowledge. We argue that this view limits the lexicon's ability to generate nuanced, context-sensitive meanings, because word boundaries are obstructive and the influence of non-lexical knowledge on meaning goes unaccounted for. To address these problems, we explore a context-centered approach to lexicon design called a Bubble Lexicon. Inspired by Ross Quillian's Semantic Memory System, we represent word-concepts as nodes in a symbolic-connectionist network. In a Bubble Lexicon, a word's meaning is defined by a dynamically grown, context-sensitive bubble, giving a more natural account of systematic polysemy. Linguistic assembly tasks such as attribute attachment become context-sensitive, and the incorporation of general world knowledge improves generative capability. Indicative trials with an implementation of the Bubble Lexicon lend support to our hypothesis that unpacking meaning from predefined word structures is a step toward a more natural handling of context in language.
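To make the idea concrete, the following is a minimal sketch of the network-and-bubble mechanism described above: word-concepts as nodes in a weighted graph, with a word's "bubble" grown by spreading activation that is biased by context nodes. The class name `BubbleLexicon`, the decay and threshold parameters, and the doubling boost for context nodes are illustrative assumptions, not the paper's actual algorithm or parameters.

```python
# Hedged sketch of a context-sensitive "meaning bubble" over a
# symbolic-connectionist network. All names and parameters here are
# illustrative assumptions, not taken from the Bubble Lexicon paper.
from collections import defaultdict


class BubbleLexicon:
    def __init__(self):
        # node -> {neighbor: edge weight in (0, 1]}
        self.edges = defaultdict(dict)

    def relate(self, a, b, weight):
        """Add a symmetric weighted link between two word-concepts."""
        self.edges[a][b] = weight
        self.edges[b][a] = weight

    def bubble(self, word, context=(), decay=0.5, threshold=0.1):
        """Grow a context-sensitive bubble around `word` by spreading
        activation; nodes in `context` amplify the paths through them,
        so the same word yields different bubbles in different contexts."""
        activation = {word: 1.0}
        frontier = [word]
        while frontier:
            next_frontier = []
            for node in frontier:
                for neighbor, weight in self.edges[node].items():
                    a = activation[node] * weight * decay
                    if neighbor in context:
                        a *= 2.0  # assumed context boost
                    # Keep only the strongest activation per node,
                    # and prune anything below the bubble threshold.
                    if a > activation.get(neighbor, 0.0) and a >= threshold:
                        activation[neighbor] = a
                        next_frontier.append(neighbor)
            frontier = next_frontier
        return activation
```

Under these assumptions, `bubble("bank", context=("river",))` activates the riparian neighborhood of "bank" more strongly than `bubble("bank", context=("money",))` does, which is the sketch's analogue of the abstract's systematic polysemy claim.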