
Evaluating Focused Retrieval Tasks

Abstract: Focused retrieval, exemplified by question answering, passage retrieval, and XML element retrieval, is becoming increasingly important within the broad task of information retrieval. In this paper, we present a taxonomy of text retrieval tasks based on the structure of the answers required by a task. Of particular importance are the in context tasks of focused retrieval, where not only should relevant documents be retrieved, but the relevant information within each document should also be correctly identified. Answers containing relevant information could be, for example, best entry points, or non-overlapping passages or elements. Our main research question is: how should the effectiveness of focused retrieval be evaluated? We propose an evaluation framework in which different aspects of the in context focused retrieval tasks can be consistently evaluated and compared, and use fidelity tests on simulated runs to show what is measured. Results from our fidelity experiments demonstrate the usefulness of the proposed evaluation framework, and show its ability to measure different aspects and model different evaluation assumptions of focused retrieval.
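To make the in context evaluation problem concrete: a common way to score how well a system identifies relevant information *within* a document is character-overlap precision and recall between retrieved passages and gold-standard relevant passages, as used in INEX-style passage evaluation. The sketch below is illustrative only and is not the specific framework proposed in the paper; passages are represented as hypothetical (start, end) character-offset pairs.

```python
# Illustrative character-overlap precision/recall for passage retrieval
# within a single document. This is a generic INEX-style sketch, not the
# evaluation framework defined in the paper.

def to_char_set(passages):
    """Expand (start, end) character-offset pairs into a set of positions."""
    chars = set()
    for start, end in passages:
        chars.update(range(start, end))
    return chars

def passage_prf(retrieved, relevant):
    """Precision, recall and F1 over highlighted character offsets."""
    ret, rel = to_char_set(retrieved), to_char_set(relevant)
    overlap = len(ret & rel)
    precision = overlap / len(ret) if ret else 0.0
    recall = overlap / len(rel) if rel else 0.0
    f1 = (2 * precision * recall / (precision + recall)
          if precision + recall else 0.0)
    return precision, recall, f1

# Example: a retrieved passage half-overlapping the relevant span
# yields precision = recall = F1 = 0.5.
p, r, f = passage_prf(retrieved=[(0, 100)], relevant=[(50, 150)])
```

A framework like the one the paper proposes would then need to aggregate such per-document scores across a ranked list of documents, which is where the different task interpretations (best entry points versus non-overlapping passages or elements) come into play.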
Document type :
Conference papers

Cited literature: 22 references

https://hal.inria.fr/inria-00166790
Contributor: Jovan Pehcevski
Submitted on: Thursday, August 9, 2007 - 5:22:25 PM
Last modification on: Tuesday, December 10, 2019 - 1:44:03 PM
Long-term archiving on: Friday, April 9, 2010 - 12:35:09 AM

File

jovanp-Focused.pdf
Files produced by the author(s)

Identifiers

  • HAL Id: inria-00166790, version 1

Citation

Jovan Pehcevski, James A. Thom. Evaluating Focused Retrieval Tasks. SIGIR 2007 Workshop on Focused Retrieval, Jul 2007, Amsterdam, Netherlands. ⟨inria-00166790⟩

Metrics

Record views: 350
File downloads: 189