Synergistic use of direct manipulation and natural language
CHI '89 Proceedings of the SIGCHI Conference on Human Factors in Computing Systems
Integrating simultaneous input from speech, gaze, and hand gestures
Intelligent multimedia interfaces
An approach to natural gesture in virtual environments
ACM Transactions on Computer-Human Interaction (TOCHI) - Special issue on virtual reality software and technology
Integration and synchronization of input modes during multimodal human-computer interaction
Proceedings of the ACM SIGCHI Conference on Human factors in computing systems
Visual Interpretation of Hand Gestures for Human-Computer Interaction: A Review
IEEE Transactions on Pattern Analysis and Machine Intelligence
Synergistic use of direct manipulation and natural language
Readings in intelligent user interfaces
Natural language with integrated deictic and graphic gestures
Readings in intelligent user interfaces
Integrating simultaneous input from speech, gaze, and hand gestures
Readings in intelligent user interfaces
Mutual disambiguation of recognition errors in a multimodal architecture
Proceedings of the SIGCHI conference on Human Factors in Computing Systems
Perceptual user interfaces: multimodal interfaces that process what comes naturally
Communications of the ACM
A Multimedia System for Temporally Situated Perceptual Psycholinguistic Analysis
Multimedia Tools and Applications
Unencumbered Gestural Interaction
IEEE MultiMedia
Vector Coherence Mapping: A Parallelizable Approach to Image Flow Computation
ACCV '98 Proceedings of the Third Asian Conference on Computer Vision-Volume II
Research Challenges in Gesture: Open Issues and Unsolved Problems
Proceedings of the International Gesture Workshop on Gesture and Sign Language in Human-Computer Interaction
Velocity Profile Based Recognition of Dynamic Gestures with Discrete Hidden Markov Models
Proceedings of the International Gesture Workshop on Gesture and Sign Language in Human-Computer Interaction
Progress in Sign Language Recognition
Proceedings of the International Gesture Workshop on Gesture and Sign Language in Human-Computer Interaction
Neural Architecture for Gesture-Based Human-Machine-Interaction
Proceedings of the International Gesture Workshop on Gesture and Sign Language in Human-Computer Interaction
Gesture recognition using the Perseus architecture
CVPR '96 Proceedings of the 1996 Conference on Computer Vision and Pattern Recognition (CVPR '96)
FG '96 Proceedings of the 2nd International Conference on Automatic Face and Gesture Recognition (FG '96)
Invited Speech: "Gestural Interface to a Visual Computing Environment for Molecular Biologists"
FG '96 Proceedings of the 2nd International Conference on Automatic Face and Gesture Recognition (FG '96)
Recovering the Temporal Structure of Natural Gesture
FG '96 Proceedings of the 2nd International Conference on Automatic Face and Gesture Recognition (FG '96)
Robust classification of hand postures against complex backgrounds
FG '96 Proceedings of the 2nd International Conference on Automatic Face and Gesture Recognition (FG '96)
CHI '82 Proceedings of the 1982 Conference on Human Factors in Computing Systems
“Put-that-there”: Voice and gesture at the graphics interface
SIGGRAPH '80 Proceedings of the 7th annual conference on Computer graphics and interactive techniques
A Parallel Algorithm for Dynamic Gesture Tracking
RATFG-RTS '99 Proceedings of the International Workshop on Recognition, Analysis, and Tracking of Faces and Gestures in Real-Time Systems
Natural language with integrated deictic and graphic gestures
HLT '89 Proceedings of the workshop on Speech and Natural Language
Multimodal transformed social interaction
Proceedings of the 6th international conference on Multimodal interfaces
Multimodal model integration for sentence unit detection
Proceedings of the 6th international conference on Multimodal interfaces
A study on the use of semaphoric gestures to support secondary task interactions
CHI '05 Extended Abstracts on Human Factors in Computing Systems
Hand Motion Gesture Frequency Properties and Multimodal Discourse Analysis
International Journal of Computer Vision
Exploratory study of lexical patterns in multimodal cues
MMUI '05 Proceedings of the 2005 NICTA-HCSNet Multimodal User Interaction Workshop - Volume 57
Oscillatory gestures and discourse
ICME '03 Proceedings of the 2003 International Conference on Multimedia and Expo - Volume 2
Using maximum entropy (ME) model to incorporate gesture cues for SU detection
Proceedings of the 8th international conference on Multimodal interfaces
Incorporating gesture and gaze into multimodal models of human-to-human communication
NAACL-DocConsortium '06 Proceedings of the 2006 Conference of the North American Chapter of the Association for Computational Linguistics on Human Language Technology: companion volume: doctoral consortium
Semantic back-pointers from gesture
NAACL-DocConsortium '06 Proceedings of the 2006 Conference of the North American Chapter of the Association for Computational Linguistics on Human Language Technology: companion volume: doctoral consortium
Berlin Brain-Computer Interface-The HCI communication channel for discovery
International Journal of Human-Computer Studies
Vision-based hand pose estimation: A review
Computer Vision and Image Understanding
The catchment feature model: a device for multimodal fusion and a bridge between signal and sense
EURASIP Journal on Applied Signal Processing
Interactive robots as social partners and peer tutors for children: a field trial
Human-Computer Interaction
Visual Hints for Tangible Gestures in Augmented Reality
ISMAR '07 Proceedings of the 2007 6th IEEE and ACM International Symposium on Mixed and Augmented Reality
Artificial Intelligence Review
Multimodality and parallelism in design interaction: co-designers' alignment and coalitions
Proceedings of the 2006 conference on Cooperative Systems Design: Seamless Integration of Artifacts and Conversations -- Enhanced Concepts of Infrastructure for Communication
Proceedings of the 2007 conference on New Trends in Software Methodologies, Tools and Techniques: Proceedings of the sixth SoMeT_07
Journal of Visual Languages and Computing
Multimodal Interaction for Mobile Learning
UAHCI '09 Proceedings of the 5th International Conference on Universal Access in Human-Computer Interaction. Part II: Intelligent and Ubiquitous Interaction Environments
Discourse topic and gestural form
AAAI'08 Proceedings of the 23rd national conference on Artificial intelligence - Volume 2
Gesture salience as a hidden variable for coreference resolution and keyframe extraction
Journal of Artificial Intelligence Research
Between linguistic attention and gaze fixations in multimodal conversational interfaces
Proceedings of the 2009 international conference on Multimodal interfaces
Gesture interaction in cooperation scenarios
CRIWG'09 Proceedings of the 15th international conference on Groupware: design, implementation, and use
Natural exploration of multimedia contents
Proceedings of the 7th International Conference on Advances in Mobile Computing and Multimedia
PRICAI'10 Proceedings of the 11th Pacific Rim international conference on Trends in artificial intelligence
Vocal sketching: a prototype tool for designing multimodal interaction
International Conference on Multimodal Interfaces and the Workshop on Machine Learning for Multimodal Interaction
Utilizing gestures to improve sentence boundary detection
Multimedia Tools and Applications
INTERACT'11 Proceedings of the 13th IFIP TC 13 international conference on Human-computer interaction - Volume Part I
Conversational gaze mechanisms for humanlike robots
ACM Transactions on Interactive Intelligent Systems (TiiS)
Generic gesture kernel modeling and its application with virtual garment design
Proceedings of the 10th International Conference on Virtual Reality Continuum and Its Applications in Industry
A multimodal discourse ontology for meeting understanding
MLMI'05 Proceedings of the Second international conference on Machine Learning for Multimodal Interaction
Gesture features for coreference resolution
MLMI'06 Proceedings of the Third international conference on Machine Learning for Multimodal Interaction
Enhancing user experience through pervasive information systems: The case of pervasive retailing
International Journal of Information Management: The Journal for Information Professionals
Multimodal analysis of the implicit affective channel in computer-mediated textual communication
Proceedings of the 14th ACM international conference on Multimodal interaction
Proceeding of the 16th International Academic MindTrek Conference
Choosing and modeling the hand gesture database for a natural user interface
GW'11 Proceedings of the 9th international conference on Gesture and Sign Language in Human-Computer Interaction and Embodied Communication
Enabling the blind to see gestures
ACM Transactions on Computer-Human Interaction (TOCHI) - Special issue on the theory and practice of embodied interaction in HCI and interaction design
On the naturalness of touchless: Putting the “interaction” back into NUI
ACM Transactions on Computer-Human Interaction (TOCHI) - Special issue on the theory and practice of embodied interaction in HCI and interaction design
Rotating, tilting, bouncing: using an interactive chair to promote activity in office environments
CHI '13 Extended Abstracts on Human Factors in Computing Systems
Towards designing audio assistance for comprehending haptic graphs: a multimodal perspective
UAHCI'13 Proceedings of the 7th international conference on Universal Access in Human-Computer Interaction: design methods, tools, and interaction techniques for eInclusion - Volume Part I
Context-based bounding volume morphing in pointing gesture application
HCI'13 Proceedings of the 15th international conference on Human-Computer Interaction: interaction modalities and techniques - Volume Part IV
Functional gestures for human-environment interaction
HCI'13 Proceedings of the 15th international conference on Human-Computer Interaction: interaction modalities and techniques - Volume Part IV
Multimodal interaction: A review
Pattern Recognition Letters
Gesture and speech combine to form a rich basis for human conversational interaction. To exploit these modalities in HCI, we need to understand the interplay between them and the way in which they support communication. We propose a framework for the gesture research done to date, and present our work on cross-modal cues for discourse segmentation in free-form gesticulation accompanying speech in natural conversation as a new paradigm for such multimodal interaction. The basis for this integration is the psycholinguistic concept of the coequal generation of gesture and speech from the same semantic intent. We present a detailed case study of a gesture and speech elicitation experiment in which a subject describes her living space to an interlocutor. We perform two independent analyses of the data: signal-level video and audio processing to extract segmentation cues, and expert transcription of the speech and gesture, microanalyzing the videotape with a frame-accurate video player to correlate the speech with the gestural entities. We compare the results of the two analyses to identify the cues in the gestural and audio data that correlate well with the expert psycholinguistic analysis. We show that "handedness" and the kind of symmetry in two-handed gestures provide effective suprasegmental discourse cues.
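The abstract's closing claim, that handedness (which hand is active) and the symmetry of two-handed motion carry discourse cues, can be made concrete with a toy sketch. This is not the paper's actual algorithm: the function names, the rest-motion threshold, and the use of mirrored-velocity cosine similarity as a symmetry score are all illustrative assumptions, applied to hand-position tracks such as a vision-based tracker might produce.

```python
import math

def motion_energy(track):
    # Sum of frame-to-frame displacement magnitudes for one hand's (x, y) track.
    return sum(math.dist(a, b) for a, b in zip(track, track[1:]))

def handedness(left, right, rest_thresh=1.0):
    # Crude per-window label: which hand(s) moved enough to count as gesturing.
    # rest_thresh is an invented tuning constant, not from the paper.
    el, er = motion_energy(left), motion_energy(right)
    if el < rest_thresh and er < rest_thresh:
        return "rest"
    if el < rest_thresh:
        return "RH"   # only the right hand is active
    if er < rest_thresh:
        return "LH"   # only the left hand is active
    return "2H"       # two-handed gesture

def symmetry(left, right):
    # Cosine similarity between left-hand velocities and x-mirrored
    # right-hand velocities: +1 for mirror-symmetric two-handed motion.
    vl = [(b[0] - a[0], b[1] - a[1]) for a, b in zip(left, left[1:])]
    vr = [(-(b[0] - a[0]), b[1] - a[1]) for a, b in zip(right, right[1:])]
    dot = sum(ax * bx + ay * by for (ax, ay), (bx, by) in zip(vl, vr))
    norm = (math.sqrt(sum(x * x + y * y for x, y in vl)) *
            math.sqrt(sum(x * x + y * y for x, y in vr)))
    return dot / norm if norm else 0.0
```

For example, a window where the left hand sweeps left while the right sweeps right (a mirror-symmetric two-handed gesture) would be labeled `"2H"` with a symmetry score near 1, whereas a window where only one hand moves yields a one-handed label; runs of such labels over time are the kind of feature the abstract relates to discourse structure.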