With today's abundance of captured and created images, audio, and video in hand, can we make full use of this media to explore new experiences and applications? Along these lines, HP Labs has developed a prototype technology called Mediascapes, or Mscapes. A Mediascape is a context-aware multimedia experience: it triggers multimedia content based on your context, such as your physical location. Although similar ideas have appeared piecemeal before, Mediascapes offers users a genuinely new kind of experience. Want to know more? Follow me into the world of Mediascapes.
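To make the core idea concrete, here is a minimal sketch of location-triggered media in Python. This is not HP's Mediascapes API; it is a hypothetical illustration in which each zone is a circular geofence (centre, radius, media clip), and a position update plays whatever clips the user is standing inside.

```python
import math
from dataclasses import dataclass

@dataclass
class Zone:
    """A hypothetical circular trigger region tied to a media clip."""
    name: str
    lat: float
    lon: float
    radius_m: float
    media: str  # path or URL of the clip to trigger

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two WGS84 points."""
    r = 6371000.0  # mean Earth radius in metres
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = (math.sin(dp / 2) ** 2
         + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2)
    return 2 * r * math.asin(math.sqrt(a))

def triggered_media(zones, lat, lon):
    """Return the media of every zone the current position falls inside."""
    return [z.media for z in zones
            if haversine_m(z.lat, z.lon, lat, lon) <= z.radius_m]

# Illustrative zones (made-up coordinates and clip names):
zones = [
    Zone("castle_gate", 51.5007, -0.1246, 50.0, "gate_story.mp3"),
    Zone("river_walk", 51.5010, -0.1200, 30.0, "river_ambience.mp3"),
]

# A position update inside the first zone triggers only its clip.
print(triggered_media(zones, 51.5007, -0.1246))  # -> ['gate_story.mp3']
```

A real Mediascape player would feed this check from a GPS stream and layer on richer context (heading, time, visit history), but the pattern stays the same: match the current context against authored trigger regions and play the associated media.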