Society today has a wealth of information available due to information technology. The challenge facing researchers working in information access is how to help users easily locate the information they need. Evaluation methodologies and metrics are important tools for assessing progress in human-information interaction (HII). To evaluate these systems properly, evaluations must consider the performance of the individual components, the usability of the system, and the impact of the system on the end user. Current usability metrics are adequate for evaluating the efficiency, effectiveness, and user satisfaction of such systems, but performance measures for new intelligent technologies will have to be developed. Regardless of how well the systems perform and how usable they are, it is critical that impact measures also be developed: for HII systems to be useful, we need to assess how well information analysts work with them, and that assessment must go beyond technical performance metrics and usability metrics. What are the metrics for evaluating utility? This paper describes research efforts to develop metrics for the intelligence community that measure the impact of new software designed to facilitate information interaction.
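To make the three usability metrics named above concrete, the following is a minimal sketch of how effectiveness, efficiency, and satisfaction might be computed from per-task session logs. The data, field names, and use of the System Usability Scale (SUS) for satisfaction are illustrative assumptions, not measures taken from this paper.

```python
# Hypothetical sketch: the three classic usability metrics.
# Session records and SUS scores below are invented for illustration.

def effectiveness(sessions):
    """Task completion rate: fraction of sessions where the task succeeded."""
    return sum(s["completed"] for s in sessions) / len(sessions)

def efficiency(sessions):
    """Mean task time (seconds) over successfully completed sessions."""
    times = [s["seconds"] for s in sessions if s["completed"]]
    return sum(times) / len(times)

def satisfaction(sus_scores):
    """Mean System Usability Scale (SUS) score, on a 0-100 scale."""
    return sum(sus_scores) / len(sus_scores)

sessions = [
    {"completed": True,  "seconds": 120},
    {"completed": True,  "seconds": 90},
    {"completed": False, "seconds": 300},
    {"completed": True,  "seconds": 150},
]
sus_scores = [72.5, 80.0, 65.0, 77.5]

print(f"effectiveness: {effectiveness(sessions):.2f}")   # 0.75
print(f"efficiency:    {efficiency(sessions):.1f} s")    # 120.0 s
print(f"satisfaction:  {satisfaction(sus_scores):.1f}")  # 73.8
```

Impact and utility measures of the kind the paper calls for would go beyond such scores, e.g. comparing analyst work products with and without the new software, which is precisely what simple log-based metrics like these cannot capture.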