Analyzing Perceptual Representations of Complex, Parametrically-Defined Shapes Using MDS
EuroHaptics '08 Proceedings of the 6th international conference on Haptics: Perception, Devices and Scenarios
The perceived similarity between objects may well vary according to the sensory modality or modalities in which they are experienced, an important consideration for the design of multimodal interfaces. In this study, we present a similarity-based method for comparing the perceptual importance of object properties in touch and in vision, and show how the method can also be used to validate computational measures of object properties. Using either vision or touch, human subjects judged the similarity between novel 3D objects that varied parametrically in shape and texture. Similarities were also computed using a set of state-of-the-art 2D and 3D computational measures. Two resolutions of 2D and 3D object data were used for these computations in order to test for scale dependencies. Multidimensional scaling (MDS) was then performed on all similarity data, yielding maps of the stimuli in both perceptual and computational spaces, as well as the relative weights of the shape and texture dimensions. For this object set, we found that visual subjects accorded more importance to shape than texture, while haptic subjects weighted them roughly evenly. Fit errors between human and computational maps were then calculated to assess each feature's perceptual validity. Shape-biased features provided good overall fits to the human visual data; however, no single feature yielded a good overall fit to the haptic data, in which we observed large individual differences. This work demonstrates how MDS techniques can be used to evaluate computational object features using the criterion of perceptual similarity. It also demonstrates a way of assessing how the perceptual validity of a feature varies as a function of parameters such as the target modality and the resolution of object data. Potential applications of this method for the design of unimodal and multimodal human-machine interfaces are discussed.
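The core computational step described above, recovering a low-dimensional map of stimuli from a matrix of pairwise (dis)similarities, can be illustrated with a minimal classical (Torgerson) MDS sketch. This is not the authors' implementation; it is a generic NumPy version, assuming a symmetric distance matrix `D` as input and Euclidean embedding as the goal:

```python
import numpy as np

def classical_mds(D, k=2):
    """Embed n points in k dimensions from an n x n pairwise distance matrix D.

    Classical MDS: double-center the squared distances to obtain a Gram
    matrix, then take the top-k eigencomponents as coordinates.
    """
    n = D.shape[0]
    J = np.eye(n) - np.ones((n, n)) / n      # centering matrix
    B = -0.5 * J @ (D ** 2) @ J              # double-centered Gram matrix
    w, V = np.linalg.eigh(B)                 # eigenvalues in ascending order
    idx = np.argsort(w)[::-1][:k]            # indices of the k largest
    scale = np.sqrt(np.clip(w[idx], 0, None))
    return V[:, idx] * scale                 # n x k stimulus coordinates
```

When `D` contains exact Euclidean distances, the recovered configuration reproduces them up to rotation and reflection; with behavioral similarity ratings, as in the study, the embedding is instead a least-squares approximation, and the axis weights can then be compared across modalities.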