Towards an understanding of model executability
Proceedings of the international conference on Formal Ontology in Information Systems - Volume 2001
MDA Distilled
Design and Development of Multidevice User Interfaces through Multiple Logical Descriptions
IEEE Transactions on Software Engineering
TAMODIA '05 Proceedings of the 4th international workshop on Task models and diagrams
The 4C Reference Model for Distributed User Interfaces
ICAS '08 Proceedings of the Fourth International Conference on Autonomic and Autonomous Systems
The COMETs inspector: towards run time plasticity control based on a semantic network
TAMODIA'06 Proceedings of the 5th international conference on Task models and diagrams for users interface design
Model-driven adaptation for plastic user interfaces
INTERACT'07 Proceedings of the 11th IFIP TC 13 international conference on Human-computer interaction
USIXML: a language supporting multi-path development of user interfaces
EHCI-DSVIS'04 Proceedings of the 2004 international conference on Engineering Human Computer Interaction and Interactive Systems
Utilizing Dynamic Executable Models for User Interface Development
Interactive Systems. Design, Specification, and Verification
Automated Usability Evaluation during Model-Based Interactive System Development
HCSE-TAMODIA '08 Proceedings of the 2nd Conference on Human-Centered Software Engineering and 7th International Workshop on Task Models and Diagrams
UAHCI '09 Proceedings of the 5th International Conference on Universal Access in Human-Computer Interaction. Part II: Intelligent and Ubiquitous Interaction Environments
Adjustable context adaptations for user interfaces at runtime
Proceedings of the International Conference on Advanced Visual Interfaces
Towards multimodal interaction in smart home environments: the home operating system
Proceedings of the 8th ACM Conference on Designing Interactive Systems
Dynamic user interface distribution for flexible multimodal interaction
International Conference on Multimodal Interfaces and the Workshop on Machine Learning for Multimodal Interaction
Proceedings of the 3rd ACM SIGCHI symposium on Engineering interactive computing systems
Configurable executable task models supporting the transition from design time to runtime
HCII'11 Proceedings of the 14th international conference on Human-computer interaction: design and development approaches - Volume Part I
Building multimodal interfaces out of executable, model-based interactors and mappings
HCII'11 Proceedings of the 14th international conference on Human-computer interaction: design and development approaches - Volume Part I
Efficient generation of ambient intelligent user interfaces
KES'11 Proceedings of the 15th international conference on Knowledge-based and intelligent information and engineering systems - Volume Part IV
UsiComp: an extensible model-driven composer
Proceedings of the 4th ACM SIGCHI symposium on Engineering interactive computing systems
Engineering device-spanning, multimodal web applications using a model-based design approach
Proceedings of the 18th Brazilian symposium on Multimedia and the web
Model-based user interface development is grounded in the idea of using models at design time to derive user interfaces from the modeled information. There is, however, an increasing demand for user interfaces that adapt to the context of use at runtime. The shift from design time to runtime means that various design decisions are postponed until runtime. Keeping the user interface models available at runtime makes it possible to base these postponed decisions on the same body of information. The approach we follow goes one step further: instead of merely postponing several design decisions, we aim to use stateful, executable models at runtime to express the user interaction and the user interface logic entirely in a model-based way.
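To make the idea of a stateful, executable model concrete, the following is a minimal sketch (not the paper's actual implementation; all class and method names are illustrative assumptions) of a task model that stays alive at runtime: the UI queries the model for the currently enabled tasks instead of having the flow fixed at design time.

```python
# Hypothetical sketch of a stateful, executable task model kept alive at
# runtime. The UI asks the model which leaf tasks to present next, and
# reports completions back, so flow decisions are taken from model state.
# Names (Task, enabled_leaves, complete) are illustrative, not from the paper.

class Task:
    def __init__(self, name, children=None):
        self.name = name
        self.children = children or []
        self.done = False

    def enabled(self):
        # A leaf task is enabled until completed; a composite task is
        # enabled while any child remains enabled.
        if not self.children:
            return not self.done
        return any(c.enabled() for c in self.children)

    def enabled_leaves(self):
        # "Executing" the model: return the leaf tasks the UI should
        # render next, assuming sequential execution of children.
        if not self.children:
            return [self] if not self.done else []
        for c in self.children:  # first child with open leaves wins
            leaves = c.enabled_leaves()
            if leaves:
                return leaves
        return []

    def complete(self, name):
        # Runtime state change: the user finished a task in the UI.
        if not self.children and self.name == name:
            self.done = True
            return True
        return any(c.complete(name) for c in self.children)


# Usage: a tiny login flow modeled as a sequential composite task.
login = Task("login", [Task("enter name"),
                       Task("enter password"),
                       Task("submit")])
print([t.name for t in login.enabled_leaves()])  # -> ['enter name']
login.complete("enter name")
print([t.name for t in login.enabled_leaves()])  # -> ['enter password']
```

Because the model object persists at runtime, an adaptation component could, under this sketch, rewrite the task tree (e.g. reorder or disable tasks for a new context of use) and the UI would follow automatically on its next query.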