Natural and intuitive interaction between users and complex systems is a crucial research topic in human-computer interaction. A major direction is the definition and implementation of systems with natural language understanding capabilities. Interaction in natural language is often carried out by systems called chatbots: conversational agents equipped with a knowledge base that enables them to interact with users. A chatbot's appearance can be very sophisticated, with 3D avatars and speech processing modules; however, the interaction between the system and the user is still performed only through textual areas for inputs and replies. An interaction that augments natural language with graphical widgets could be more effective. Conversely, a graphical interaction that also involves natural language can increase the user's comfort compared with graphical widgets alone. In many applications multi-modal communication is preferable when the user and the system have a tight and complex interaction. Typical examples are cultural heritage applications (intelligent museum guides, picture browsing) or systems providing the user with integrated information taken from different and heterogeneous sources, as in the case of the iGoogle™ interface. We propose to mix the two modalities (verbal and graphical) to build systems with a reconfigurable interface that can change according to the particular application context. The result of this proposal is the Graphical Artificial Intelligence Markup Language (GAIML), an extension of AIML that merges both interaction modalities. A chatbot system called Graphbot is presented to support this language. With this language it is possible to define personalized interface patterns that are the most suitable ones for the data types exchanged between the user and the system, according to the context of the dialogue.
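Since GAIML is described as an extension of AIML, the idea can be sketched as an AIML-like category whose template mixes a textual reply with a declared graphical widget. Only `<category>`, `<pattern>`, and `<template>` below are standard AIML; the `<gui>`, `<form>`, and `<choice>` element names are illustrative assumptions for this sketch, not the actual GAIML vocabulary.

```xml
<!-- Standard AIML: a purely textual reply -->
<category>
  <pattern>I WANT TO BOOK A TICKET</pattern>
  <template>Which day would you like to travel?</template>
</category>

<!-- Hypothetical GAIML-style category: the same reply also declares
     a graphical widget, so the interface can render a calendar
     instead of (or alongside) the text prompt.
     Tag names inside <template> beyond plain text are assumptions. -->
<category>
  <pattern>I WANT TO BOOK A TICKET</pattern>
  <template>
    Which day would you like to travel?
    <gui>
      <form name="travel_date">
        <choice type="calendar" bind="departure_day"/>
      </form>
    </gui>
  </template>
</category>
```

Under this scheme, the chatbot engine could fall back to the textual template when no graphical renderer is available, which matches the paper's goal of a reconfigurable, context-dependent interface.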