Haptic target acquisition to enable spatial gestures in nonvisual displays
Proceedings of Graphics Interface 2013
This study is based on a user scenario in which augmented reality targets can be found by scanning the environment with a mobile device and receiving tactile feedback exactly in the direction of the target. To understand how accurately and quickly such targets can be found, we prepared an experimental setup using a sensor-actuator device consisting of orientation-tracking hardware and a tactile actuator. Targets with widths of 5°, 10°, 15°, 20°, and 25°, separated by various distances, were rendered successively in a 90°-wide space, and the participants' task was to find them as quickly as possible. The experiment comprised two conditions: the first provided tactile feedback only when pointing was on the target, while the second also included a cue indicating proximity to the target. The average target-finding time was 1.8 seconds. The closest targets proved not to be the easiest to find, which we attribute to the adapted scanning velocity causing participants to miss the closest targets. We also found that our data did not correlate well with Fitts' law, which may have been caused by the non-normal data distribution; after filtering out the 30% least representative data items, the correlation reached 0.71. Overall, performance did not differ significantly between the conditions. The only significant improvement offered by the close-to-target cue occurred in tasks where the targets were furthest apart.
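The two feedback conditions described above can be sketched as a simple angular hit-test: the device's pointing angle is compared against the current target, and tactile feedback is triggered when pointing is on the target (condition one), or additionally when it is near the target (condition two). This is an illustrative sketch only; the target layout, proximity threshold, and function names are assumptions, not values or code from the study.

```python
def angular_distance(a: float, b: float) -> float:
    """Smallest absolute difference between two angles, in degrees."""
    d = abs(a - b) % 360
    return min(d, 360 - d)

def feedback(pointing_deg: float, target_center_deg: float,
             target_width_deg: float, proximity_deg: float = 10.0,
             with_proximity_cue: bool = False) -> str:
    """Return 'on_target', 'near_target', or 'none' for the current
    pointing angle. proximity_deg is a hypothetical threshold for the
    close-to-target cue, not a parameter reported in the study."""
    d = angular_distance(pointing_deg, target_center_deg)
    if d <= target_width_deg / 2:
        return "on_target"
    if with_proximity_cue and d <= target_width_deg / 2 + proximity_deg:
        return "near_target"
    return "none"

# Example: a 15°-wide target centered at 40° within a 90°-wide space.
print(feedback(43.0, 40.0, 15.0))                           # on_target
print(feedback(52.0, 40.0, 15.0, with_proximity_cue=True))  # near_target
print(feedback(70.0, 40.0, 15.0))                           # none
```

In the first condition the actuator would fire only on `on_target`; in the second, `near_target` would drive the additional proximity cue.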