In the online industry, user engagement is measured with various metrics that assess the depth of users' engagement with a website. Widely used metrics include clickthrough rate, page views and dwell time. Relying solely on these metrics can lead to contradictory, if not erroneous, conclusions about user engagement. In this paper, we propose the time between two user visits, or the absence time, as a measure of user engagement. Our assumption is that if users find a website interesting, engaging or useful, they will return to it sooner, a reflection of their engagement with the site, than if this is not the case. This assumption has the advantage of being simple, intuitive and applicable to a large number of settings. As a case study, we use a community Q&A website and compare the behaviour of users exposed to six functions used to rank past answers, both in terms of traditional metrics and absence time. We use survival analysis to show the relation between absence time and other engagement metrics. We demonstrate that absence time leads to coherent, interpretable results and helps to better understand other metrics commonly used to evaluate user engagement in search.
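As a minimal sketch of how the absence-time approach might be operationalised (the function names and toy data below are illustrative assumptions, not the paper's implementation): compute the gap between consecutive visits for each user, right-censor the gap after a user's last observed visit at the end of the observation window, and estimate the survival function P(absence > t) with a Kaplan-Meier estimator.

```python
from collections import defaultdict

def absence_times(visits, observation_end):
    """Turn (user_id, timestamp) visit records into absence-time samples.

    Returns (duration, observed) pairs: the gap between consecutive visits
    is an observed return; the gap after the last visit is right-censored
    at observation_end (the user had not yet returned).
    """
    by_user = defaultdict(list)
    for user, t in visits:
        by_user[user].append(t)
    samples = []
    for ts in by_user.values():
        ts.sort()
        for a, b in zip(ts, ts[1:]):
            samples.append((b - a, True))                   # observed return
        samples.append((observation_end - ts[-1], False))   # censored gap
    return samples

def kaplan_meier(samples):
    """Kaplan-Meier estimate of S(t) = P(absence time > t).

    At each distinct event time t, S is multiplied by (1 - d/n), where d is
    the number of observed returns at t and n is the number still at risk.
    Returns (t, S(t)) pairs at the observed event times.
    """
    samples = sorted(samples)
    n_at_risk = len(samples)
    curve, s = [], 1.0
    i = 0
    while i < len(samples):
        t, returns, at_t = samples[i][0], 0, 0
        while i < len(samples) and samples[i][0] == t:
            at_t += 1
            returns += samples[i][1]
            i += 1
        if returns:
            s *= 1.0 - returns / n_at_risk
            curve.append((t, s))
        n_at_risk -= at_t
    return curve
```

Comparing the resulting curves across users exposed to different ranking functions is one way to make "users return sooner" precise: a curve that drops faster indicates shorter absences, i.e. higher engagement under the paper's assumption.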