A new goal-based approach to measuring the usability of web sites is presented, one that takes strong account of customers' expectations, which are often hard to foresee as a whole. After a general discussion of web site design issues, we present a short survey of evaluation methods currently used for web sites. We then introduce a new taxonomy of site categories in a three-dimensional space, derived from Aristotle's rhetorical triangle, which captures different aspects of the site designer's goals. In our approach, this taxonomy is used to identify a number of sites belonging to the same category, so that a comparative analysis of their features can be carried out. This analysis is the basis for a two-shot generation of a form for evaluating that category of sites. In the first shot, users fill in a generic evaluation form, which acquaints them with the sites' characteristics. They are then asked to perform specific tasks of their choice, according to what they expect from a site of the given category. They note their impressions and list the features they found useful; the analysis of their comments is used to formulate statements specific to the given category, which are added to the initial form (second shot). We found that the responses to the second, expanded form provide more comprehensive criteria for site evaluation and help to precisely locate flaws in site functionality. In our tests the methodology has proved very promising, and it may be applied to the evaluation of any other site category, above all those providing a set of specialized services.
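To make the workflow concrete, the following is a minimal sketch (in Python, not taken from the paper) of how the taxonomy and the two-shot form generation might be modeled. The class names, the use of logos/ethos/pathos as the three taxonomy dimensions, the grouping rule, and the comment-analysis step are all illustrative assumptions rather than the authors' implementation.

```python
from dataclasses import dataclass, field

@dataclass
class SitePosition:
    """Position of a site in the three-dimensional taxonomy.
    Dimension names are assumptions for illustration only."""
    logos: float   # content / message emphasis (assumed)
    ethos: float   # designer / credibility emphasis (assumed)
    pathos: float  # audience / emotional emphasis (assumed)

@dataclass
class EvaluationForm:
    statements: list[str] = field(default_factory=list)

def same_category(a: SitePosition, b: SitePosition, tol: float = 0.25) -> bool:
    """Group sites whose taxonomy coordinates are close (hypothetical rule)."""
    return (abs(a.logos - b.logos) <= tol
            and abs(a.ethos - b.ethos) <= tol
            and abs(a.pathos - b.pathos) <= tol)

def expand_form(generic: EvaluationForm, user_comments: list[str]) -> EvaluationForm:
    """Second shot: turn users' task-based comments into category-specific
    statements and append them to the generic first-shot form.
    The 'analysis' here is a trivial placeholder for the manual step."""
    specific = [c.strip() for c in user_comments if c.strip()]
    return EvaluationForm(statements=generic.statements + specific)

# Usage sketch: a generic first-shot form expanded with one category-specific statement.
generic = EvaluationForm(statements=["The site is easy to navigate."])
expanded = expand_form(generic, ["Search results can be filtered by date."])
print(expanded.statements)
```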