No human being can understand every text or dialog in his or her native language, and no one should expect a computer to do so. People, however, have a remarkable ability to learn and to extend their understanding without explicit training. Fundamental to human understanding is the ability to learn and use language in the social interactions that Wittgenstein called language games. Those language games use and extend prelinguistic knowledge learned through perception, action, and social interaction. This article surveys the technology that has been developed for natural language processing and the successes and failures of various attempts. Although many useful applications have been implemented, the original goal of language understanding seems as remote as ever. Fundamental to understanding is the ability to recognize an utterance as a move in a social game and to respond in terms of a mental model of the game, the players, and the environment. Those mental models likewise use and extend the prelinguistic models acquired through perception, action, and social interaction. Secondary uses of language, such as reading a book, are derivative processes that elaborate and extend the mental models originally acquired by interacting with people and the environment. A computer system that relates language to virtual models might mimic some aspects of understanding, but full understanding requires the ability to learn and use new knowledge in social and sensory-motor interactions. These issues are illustrated with an analysis of several NLP systems, and a strategy is recommended for the future. None of the systems available today can understand language at the level of a child, but with a shift in strategy there is hope of designing more robust and usable systems in the future.