Proceedings of the SIGCHI Conference on Human Factors in Computing Systems
Presenting dynamic information on mobile computing devices is difficult because of their small screens. This paper describes an experiment investigating the use of non-speech sounds to present dynamic information without consuming visual display space. Results showed that non-speech sound could be used in a simple share-dealing scenario to present a "sound graph" of share prices. This reduced the workload participants had to invest in share-price monitoring, as they could listen to the graph whilst they worked in a share-accumulation window.
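The abstract does not specify how prices were mapped to sound. As a rough illustration only (the function name, frequency range, and linear mapping are assumptions, not the paper's method), a common sonification approach maps each value in a data series to a pitch, so that rises and falls in price become rises and falls in frequency:

```python
# Hypothetical sketch of a "sound graph" mapping: share prices -> frequencies.
# The linear price-to-pitch mapping and the 220-880 Hz range are illustrative
# assumptions; the paper's actual design is not described in the abstract.

def price_to_frequency(prices, f_min=220.0, f_max=880.0):
    """Linearly map each price to a frequency in [f_min, f_max] Hz."""
    lo, hi = min(prices), max(prices)
    span = (hi - lo) or 1.0  # avoid division by zero for a flat series
    return [f_min + (p - lo) / span * (f_max - f_min) for p in prices]

prices = [100.0, 102.5, 101.0, 105.0, 103.5]
freqs = price_to_frequency(prices)
# The lowest price maps to f_min, the highest to f_max, and the melodic
# contour of the resulting tones follows the shape of the price curve.
```

The resulting frequency list could then drive any tone generator; the point is that a listener can track the price trend by ear while their eyes remain on another window.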