As the form factors of computational devices diversify, the concept of eyes-free interaction is becoming increasingly relevant: it is no longer hard to imagine use scenarios in which screens are inappropriate. However, there is currently little consensus about what the term means; it is regularly employed in different contexts and with different intents. One key consequence of this multiplicity of meanings is a lack of easily accessible insight into how best to build an eyes-free system. This paper seeks to address this issue by thoroughly reviewing the literature, proposing a concise definition, and presenting a set of design principles. The application of these principles is then elaborated through a case study of the design of an eyes-free motion input system for a wearable device.