The impact of task phrasing on the choice of search keywords and on the search process and success
Journal of the American Society for Information Science and Technology
Several previous studies have measured differences in the information search success of novices and experts. However, the definitions of novices and experts have varied greatly between studies, as have the measures used for search success. Instead of dividing searchers into groups based on their expertise, we chose to model search success with task completion speed (TCS). Toward this goal, 22 participants performed three fact-finding tasks and two broader tasks in an observational user study. Our model included two variables related to the participants' Web experience; other variables included, for example, the speed of query iteration, the length of the queries, the proportion of precise queries, and the speed of evaluating result documents. Our results showed that the variables related to Web experience had the expected effects on TCS: more years of Web use were related to improved TCS in the broader tasks, whereas less frequent Web use was related to decreased TCS in the fact-finding tasks. Other variables with significant effects on TCS in either task type were the speed of composing queries, the average number of query terms per query, the proportion of precise queries, and the participants' own evaluation of their search skills. In addition to the statistical models, we present several qualitative findings about the participants' search strategies. These results give valuable insight into successful Web search strategies beyond previous knowledge of expert–novice differences. © 2006 Wiley Periodicals, Inc.