Autonomic management of multimodal interaction: DynaMo in action
Proceedings of the 4th ACM SIGCHI symposium on Engineering interactive computing systems
The heterogeneity and dynamicity of pervasive environments require flexible multimodal interfaces to be constructed at run time. In this paper, we present how we use an autonomic approach to build and maintain adaptable input multimodal interfaces in smart building environments. We have developed an autonomic solution relying on partial interaction models specified by interaction designers and developers. The role of the autonomic manager is to build complete interaction techniques based on runtime conditions and in conformity with the provided models. Its purpose is to combine and complete these partial models in order to obtain an appropriate multimodal interface. We illustrate our autonomic solution with a running example based on an existing application and several input devices.
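The core idea of the abstract can be sketched as follows. This is a minimal, hypothetical illustration (the class and slot names are ours, not taken from DynaMo): a designer supplies a partial interaction model in which some interaction slots are already bound to devices and others are left open, and an autonomic manager completes the open slots at run time from the set of currently available input devices.

```python
# Hypothetical sketch of runtime completion of partial interaction models.
# All names (PartialModel, AutonomicManager, the slot and device names) are
# illustrative assumptions, not the paper's actual API.

from dataclasses import dataclass, field

@dataclass
class PartialModel:
    """Designer-specified model: some slots bound, others left open."""
    name: str
    bindings: dict = field(default_factory=dict)    # slot -> device fixed by the designer
    open_slots: list = field(default_factory=list)  # slots to bind at run time

class AutonomicManager:
    def __init__(self, capabilities):
        # capabilities: slot -> ordered list of devices able to serve that slot,
        # highest-priority first
        self.capabilities = capabilities

    def complete(self, model, available_devices):
        """Return a fully bound technique, or None if a slot cannot be served."""
        bound = dict(model.bindings)
        for slot in model.open_slots:
            candidates = [d for d in self.capabilities.get(slot, [])
                          if d in available_devices]
            if not candidates:
                return None  # no valid interaction technique under current conditions
            bound[slot] = candidates[0]  # bind the highest-priority available device
        return bound

# Example: a "point and speak" technique where pointing is fixed by the
# designer and the speech channel is bound at run time.
manager = AutonomicManager({"speech": ["headset_mic", "room_mic"]})
model = PartialModel("point_and_speak",
                     bindings={"pointing": "touchscreen"},
                     open_slots=["speech"])
technique = manager.complete(model, available_devices={"touchscreen", "room_mic"})
# technique == {"pointing": "touchscreen", "speech": "room_mic"}
```

If a required device class disappears (e.g. no microphone is available), `complete` returns `None`, modelling the case where no conforming multimodal interface can be built under current runtime conditions.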