Could Olfactory Displays Improve Data Visualization?
Computing in Science and Engineering
Mapping information onto more than one sensory modality might increase human bandwidth for understanding complex, multivariate data. Researchers have done significant work exploring the effective use of individual non-visual senses for data display. Unfortunately, few have examined which data are best expressed in which way. Because we lack a theory of multisensory perception and information processing, the critical issue is determining which data "best" map onto which sensory input channel. Consider the problem of hydrocarbon reservoirs. Most reservoir engineers would agree that the number of variables required to characterize a reservoir is large (perhaps 30 to 50). If you depend only on visual displays limited to seven variables at a time, covering the full range could require as many as seven or eight such displays. You must then either mentally integrate across those displays or go through a process of variable selection and redisplay to achieve a specific goal.
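The display-count arithmetic above can be sketched as a minimal calculation. The seven-variables-per-display limit and the 30-to-50-variable range come from the abstract; the function name is illustrative:

```python
import math

def displays_needed(num_variables: int, vars_per_display: int = 7) -> int:
    """Minimum number of displays to show all variables, assuming each
    display is limited to vars_per_display variables at a time."""
    return math.ceil(num_variables / vars_per_display)

# A reservoir characterized by 30 to 50 variables:
for n in (30, 50):
    print(n, "variables ->", displays_needed(n), "displays")
# 30 variables -> 5 displays
# 50 variables -> 8 displays
```

This makes concrete why the abstract argues for spreading variables across additional sensory channels: at 50 variables, a purely visual presentation needs roughly eight separate displays, and the viewer must integrate across all of them.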