Report on the SIGIR workshop on "entertain me": supporting complex search tasks

  • Authors:
  • Nicholas J. Belkin; Charles L.A. Clarke; Ning Gao; Jaap Kamps; Jussi Karlgren

  • Affiliations:
  • Rutgers University, USA; University of Waterloo, Canada; Peking University, China; University of Amsterdam, The Netherlands; SICS, Stockholm, Sweden

  • Venue:
  • ACM SIGIR Forum
  • Year:
  • 2012

Abstract

Searchers with a complex information need typically slice-and-dice their problem into several queries and subqueries, and laboriously combine the answers post hoc to solve their tasks. Consider planning a social event on the last day of SIGIR, in the unknown city of Beijing, factoring in distances, timing, and preferences on budget, cuisine, and entertainment. A system supporting the entire search episode should "know" a lot, either from profiles or implicit information, or from explicit information in the query or from feedback. This may lead to the (interactive) construction of a complexly structured query, but sometimes the most obvious query for a complex need is dead simple: entertain me. Rather than returning ten blue links in response to a 2.4-word query, the desired system should support searchers during their whole task or search episode, by iteratively constructing a complex query or search strategy, by exploring the result space at every stage, and by combining the partial answers into a coherent whole. The workshop brought together a varied group of researchers covering both user- and system-centered approaches, who worked together on the problem and potential solutions. There was a strong feeling that we made substantial progress. First, there was general optimism about the wealth of contextual information that can be derived from context or natural interactions without the need for obtrusive explicit feedback. Second, the task of "contextual suggestions"--matching specific types of results against rich profiles--was identified as a manageable first step, and concrete plans for such a track were discussed in the aftermath of the workshop. Third, the identified dimensions of variation--such as the level of engagement, or user versus system initiative--give clear suggestions of the types of input a searcher is willing or able to give and the type of response expected from a system.
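
To make the "contextual suggestions" idea above a little more concrete, the following Python sketch ranks hypothetical venue candidates by a weighted match against a user profile. It is a minimal illustration only: the field names, weights, and scoring scheme are assumptions for the example, not part of the workshop's proposal or any later track specification.

```python
# Toy "contextual suggestion" ranker: match candidate venues against a profile.
# All fields, weights, and the scoring scheme are illustrative assumptions.
from dataclasses import dataclass, field


@dataclass
class Profile:
    cuisines: set = field(default_factory=set)  # liked cuisines
    max_budget: float = 50.0                    # per-person budget
    max_distance_km: float = 5.0                # acceptable travel distance


@dataclass
class Candidate:
    name: str
    cuisine: str
    price: float
    distance_km: float


def score(candidate: Candidate, profile: Profile) -> float:
    """Higher is better: reward a cuisine match, penalize over-budget
    prices and distances beyond the profile's comfort radius."""
    s = 1.0 if candidate.cuisine in profile.cuisines else 0.0
    s -= max(0.0, candidate.price - profile.max_budget) / profile.max_budget
    s -= max(0.0, candidate.distance_km - profile.max_distance_km) / profile.max_distance_km
    return s


def suggest(candidates, profile, k=3):
    """Return the top-k candidates for this profile."""
    return sorted(candidates, key=lambda c: score(c, profile), reverse=True)[:k]


if __name__ == "__main__":
    profile = Profile(cuisines={"Sichuan", "Cantonese"}, max_budget=40.0)
    candidates = [
        Candidate("Hotpot place", "Sichuan", 35.0, 2.0),
        Candidate("Steakhouse", "Western", 80.0, 1.0),
        Candidate("Dim sum hall", "Cantonese", 30.0, 6.0),
    ]
    for c in suggest(candidates, profile):
        print(c.name, round(score(c, profile), 2))
```

A real system of the kind the workshop discusses would of course derive the profile from implicit or natural interaction signals rather than hand-coded fields; the sketch only shows the matching step.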