
Evaluating Focused Retrieval Tasks

Abstract: Focused retrieval, identified by question answering, passage retrieval, and XML element retrieval, is becoming increasingly important within the broad task of information retrieval. In this paper, we present a taxonomy of text retrieval tasks based on the structure of the answers required by a task. Of particular importance are the in context tasks of focused retrieval, where not only relevant documents should be retrieved but also relevant information within each document should be correctly identified. Answers containing relevant information could be, for example, best entry points, or non-overlapping passages or elements. Our main research question is: How should the effectiveness of focused retrieval be evaluated? We propose an evaluation framework where different aspects of the in context focused retrieval tasks can be consistently evaluated and compared, and use fidelity tests on simulated runs to show what is measured. Results from our fidelity experiments demonstrate the usefulness of the proposed evaluation framework, and show its ability to measure different aspects and model different evaluation assumptions of focused retrieval.
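To illustrate the kind of measurement the abstract describes, the sketch below computes character-level precision and recall of retrieved passages against judged-relevant passages within a single document. This is a minimal illustration of in context passage evaluation in general, not the paper's actual framework or fidelity tests; all function names and ranges are hypothetical.

```python
# Minimal sketch (illustrative only, not the paper's framework) of scoring an
# "in context" focused retrieval result: retrieved character ranges within a
# document are compared against the ranges judged relevant.

def to_char_set(ranges):
    """Expand (start, end) character ranges (end exclusive) into a set of offsets."""
    chars = set()
    for start, end in ranges:
        chars.update(range(start, end))
    return chars

def passage_precision_recall(retrieved, relevant):
    """Character-level precision and recall of retrieved vs. relevant passages."""
    ret, rel = to_char_set(retrieved), to_char_set(relevant)
    overlap = len(ret & rel)
    precision = overlap / len(ret) if ret else 0.0
    recall = overlap / len(rel) if rel else 0.0
    return precision, recall

# Example: the system returns one passage that partially covers the relevant text.
p, r = passage_precision_recall(retrieved=[(100, 200)], relevant=[(150, 300)])
# p == 0.5   (50 of the 100 retrieved characters are relevant)
# r == 50/150 (50 of the 150 relevant characters were retrieved)
```

Character-level overlap is one common way to credit partially correct passages; element-based or best-entry-point tasks would require different matching rules, as the taxonomy in the paper suggests.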
Document type: Conference papers

Cited literature: 22 references

Contributor: Jovan Pehcevski
Submitted on: Thursday, August 9, 2007 - 5:22:25 PM
Last modification on: Thursday, February 3, 2022 - 11:15:49 AM
Long-term archiving on: Friday, April 9, 2010 - 12:35:09 AM


Files produced by the author(s)


  • HAL Id: inria-00166790, version 1



Jovan Pehcevski, James A. Thom. Evaluating Focused Retrieval Tasks. SIGIR 2007 Workshop on Focused Retrieval, Jul 2007, Amsterdam, Netherlands. ⟨inria-00166790⟩


