This paper considers the application of novel Virtual Environments (VEs) to collaborative product design, focusing on design review activities. Companies are usually anchored to commercial ICT tools, which are mature and reliable; however, two main problems emerge: the difficulty of selecting the most suitable tool for a specific purpose, and the complexity of evaluating the impact that the technology has on design collaboration. The present work addresses both aspects by proposing a structured benchmarking method based on expert judgements and by defining a set of benchmarking weights refined through experimental tests. The method considers both human-human interaction and teamwork-related aspects. A subsequent evaluation protocol, covering both process efficiency and human-human interaction, enables a closed-loop verification process: pilot projects evaluate the candidate technologies, and the benchmarking weights are verified and adjusted for a more reliable system assessment. This paper focuses on synchronous and remote design review activities: three different tools were compared according to expert judgements, and the two best-performing tools were deployed as pilot projects within real industrial chains. Design collaboration was assessed in terms of both process performance and human-human interaction quality, and the benchmarking results were validated, indicating some corrective actions. The final benchmarking weights can therefore be adopted for an agile system benchmark in synchronous and remote design. The main contributions are an innovative process for verifying the reliability of the expert benchmark and a trustworthy benchmarking method for evaluating tools for synchronous and remote design without experimental testing. Furthermore, the proposed method has general validity and can be configured for different collaborative dimensions.
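The weighted, closed-loop benchmarking idea described above can be sketched in a few lines. This is a minimal illustration, not the paper's actual model: the criteria names, weights, rating scale, and the linear weight-update rule are all assumptions introduced here for clarity.

```python
# Hypothetical sketch of expert-weighted benchmarking with closed-loop
# weight adjustment. Criteria and numbers are illustrative only.

def benchmark_score(ratings, weights):
    """Weighted sum of expert ratings; weights are assumed to sum to 1."""
    assert abs(sum(weights.values()) - 1.0) < 1e-9
    return sum(weights[c] * ratings[c] for c in weights)

# Expert-judged weights for two hypothetical criteria groups.
weights = {"human_human_interaction": 0.4, "process_efficiency": 0.6}

# Expert ratings (0-10 scale) for two candidate tools.
tool_a = {"human_human_interaction": 7.0, "process_efficiency": 8.0}
tool_b = {"human_human_interaction": 9.0, "process_efficiency": 6.0}

score_a = benchmark_score(tool_a, weights)  # 0.4*7 + 0.6*8 = 7.6
score_b = benchmark_score(tool_b, weights)  # 0.4*9 + 0.6*6 = 7.2

def adjust_weights(weights, observed, rate=0.5):
    """Closed-loop correction: nudge expert weights toward the relative
    importance observed in pilot projects, then renormalise to sum to 1."""
    raw = {c: (1 - rate) * w + rate * observed[c] for c, w in weights.items()}
    total = sum(raw.values())
    return {c: v / total for c, v in raw.items()}
```

After a pilot project, `adjust_weights` would blend the expert weights with pilot-derived importances, yielding the corrected weights that the abstract proposes reusing for agile benchmarks without further experimental testing.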