Leveraging clickthrough data has become a popular approach for evaluating and optimizing information retrieval systems. Although such data is plentiful, clicks must be interpreted with care, since user behavior can be affected by various sources of presentation bias. While position bias in clickthrough data has been the topic of much study, other presentation bias effects have received comparatively little attention. For instance, since users must decide whether to click on a result based on its summary (e.g., the title, URL, and abstract), one might expect clicks to favor "more attractive" results. In this paper, we examine result summary attractiveness as a potential source of presentation bias. This study distinguishes itself from prior work by aiming to detect systematic biases in click behavior caused by attractive summaries inflating perceived relevance. Our experiments, conducted on the Google web search engine, show substantial evidence of presentation bias in clicks towards results with more attractive titles.
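To make the bias-detection idea above concrete, the following is a minimal illustrative sketch (not the paper's actual methodology or data). It assumes a FairPairs-style log in which each pair of adjacent results was shown in both orders, so position effects cancel out on average; the `top_is_attractive` flag stands in for any summary feature of interest, such as a bolded query term in the title. All records below are hypothetical.

```python
from collections import Counter

# Hypothetical swapped-pair click records: (clicked_top, top_is_attractive).
# Because the swap randomizes which result occupies the top slot, a
# persistent click-rate gap by attractiveness suggests presentation bias
# rather than position bias.
log = [
    (True,  True), (True, True), (False, True), (True, False),
    (False, False), (True, True), (False, False), (True, True),
]

def click_rate_by_attractiveness(log):
    """Return the top-slot click rate, split by whether the top slot
    held an 'attractive' summary."""
    clicks, shows = Counter(), Counter()
    for clicked_top, top_attractive in log:
        shows[top_attractive] += 1
        if clicked_top:
            clicks[top_attractive] += 1
    return {k: clicks[k] / shows[k] for k in shows}

rates = click_rate_by_attractiveness(log)
# rates[True] is the click rate with an attractive top summary,
# rates[False] the rate with a plain one; a gap hints at bias.
```

In a real study one would also need significance testing and far larger samples; this sketch only shows the shape of the comparison the abstract describes.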