Practitioners and researchers need to stay up-to-date with the latest advances in their fields, but the constant growth in the volume of available literature makes this task increasingly difficult. We investigated the literature browsing task via a user requirements analysis and identified the information needs that biomedical researchers commonly encounter in this scenario. Our analysis reveals that a number of literature-based research tasks are performed which can be served by both generic and contextually tailored preview summaries. Based on this study, we describe the design of an implemented literature browsing support tool that helps readers of scientific literature decide whether or not to pursue and read a cited document. We present findings from a preliminary user evaluation suggesting that our prototype helps users make relevance judgements about cited documents.