Augmented Reality (AR) applications have high potential to support users in task-oriented scenarios such as assembly, maintenance, and repair. AR applications enrich the physical world around the user with additional, useful virtual information. However, integrating physical objects into AR user interfaces poses a challenge to developers. In addition, the information displayed in such an environment often depends strongly on the user's current task. In this paper we present an approach to specify, at the design level, both the integration of real objects into AR user interfaces and the task-dependent visualization of AR user interface elements. To describe user tasks, the AR user interface structure, and the relations between them, we use UML activity diagrams in combination with the Scene Structure and Integration Modelling Language (SSIML), a visual language that supports the description of 3D user interface structures. Furthermore, code can be generated from the visual models. The proposed concepts are illustrated by an example AR application from the assembly domain and lead to a new language called SSIML/AR.
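The core idea of task-dependent visualization can be sketched in code: each action in the activity diagram is bound to the AR scene elements that should be visible while that task is active, and a task switch updates the scene accordingly. The following is a minimal, hypothetical illustration only; the class and method names (`SceneNode`, `TaskModel`, `bind`, `activate`) are invented for this sketch and are not part of SSIML/AR or any generated code.

```python
# Hypothetical sketch of task-dependent AR UI visibility.
# All names here are illustrative, not SSIML/AR's actual API.

class SceneNode:
    """A node in the AR scene graph, e.g. a virtual hint overlay."""
    def __init__(self, name):
        self.name = name
        self.visible = False

class TaskModel:
    """Maps each user task (an activity-diagram action) to the
    scene nodes that should be shown while that task is active."""
    def __init__(self):
        self._bindings = {}  # task name -> list of SceneNode

    def bind(self, task, *nodes):
        self._bindings.setdefault(task, []).extend(nodes)

    def activate(self, task):
        # Hide everything, then show only the nodes bound
        # to the newly active task.
        for nodes in self._bindings.values():
            for node in nodes:
                node.visible = False
        for node in self._bindings.get(task, []):
            node.visible = True

# Usage: two tasks from an assembly scenario share one overlay.
arrow = SceneNode("assembly_arrow")
label = SceneNode("part_label")
model = TaskModel()
model.bind("mount_part", arrow, label)
model.bind("inspect_part", label)

model.activate("mount_part")   # arrow and label shown
model.activate("inspect_part") # only the label remains visible
```

In the paper's approach this mapping is not hand-written but expressed visually (UML activity diagram plus SSIML/AR scene model) and then generated as code.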