Online Evaluation of Coreference Resolution

Abstract: This paper presents the design of an online evaluation service for coreference resolution in texts. We argue that coreference, as an equivalence relation between referring expressions (REs) in texts, should be properly distinguished from anaphora and must therefore be evaluated separately. The annotation model for coreference is based on links between REs. The program presented in this article compares two such annotations, which may be the output of coreference resolution tools or of human judgement. In order to evaluate the agreement between the two annotations, the evaluator first converts the input annotation format into a pivot format, then abstracts equivalence classes from the links, and provides five scores representing in different ways the similarity between the two partitions: MUC, B3, Kappa, Core-discourse-entity, and Mutual-information. Although we consider that the identification of REs (i.e. the elements of the partition) should not be part of coreference resolution properly speaking, we propose several solutions for the frequent case when the input files do not agree on the elements of the text to consider as REs.
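The two core steps the abstract describes, abstracting equivalence classes (coreference chains) from pairwise RE links and scoring the agreement between two resulting partitions, can be sketched as follows. This is a minimal illustration, not the evaluator's actual code: the union-find representation, the function names, and the restriction to the MUC score (one of the five listed) are assumptions made for the example.

```python
from collections import defaultdict

def entities_from_links(links):
    """Derive equivalence classes (coreference chains) from RE-to-RE links
    using union-find, so that transitively linked REs land in one chain."""
    parent = {}

    def find(x):
        parent.setdefault(x, x)
        while parent[x] != x:
            parent[x] = parent[parent[x]]  # path compression
            x = parent[x]
        return x

    for a, b in links:
        ra, rb = find(a), find(b)
        if ra != rb:
            parent[ra] = rb

    chains = defaultdict(set)
    for x in parent:
        chains[find(x)].add(x)
    return list(chains.values())

def muc_recall(key, response):
    """MUC link-based recall of `response` chains against `key` chains:
    sum over key chains of (|K| - partitions of K by response) / (|K| - 1)."""
    resp_of = {m: i for i, chain in enumerate(response) for m in chain}
    num = den = 0
    for chain in key:
        # REs absent from the response count as distinct singleton partitions
        partitions = {resp_of.get(m, ("singleton", m)) for m in chain}
        num += len(chain) - len(partitions)
        den += len(chain) - 1
    return num / den if den else 1.0

def muc_f1(key_links, response_links):
    """Compare two link annotations: build both partitions, then combine
    MUC recall and precision (recall with the roles swapped) into F1."""
    key = entities_from_links(key_links)
    response = entities_from_links(response_links)
    r = muc_recall(key, response)
    p = muc_recall(response, key)
    return 2 * p * r / (p + r) if (p + r) else 0.0
```

For example, if the reference annotation links REs 1-2 and 2-3 (one chain {1, 2, 3}) while the system only links 1-2, recall is 0.5, precision is 1.0, and MUC F1 is 2/3. The B3, Kappa, and Mutual-information scores operate on the same pair of partitions but weight errors differently.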
Document type:
Conference paper
4th International Conference on Language Resources and Evaluation - LREC'04, 2004, Lisbon, Portugal, 4 p.

https://hal.inria.fr/halshs-00005023
Contributor: Laurent Romary <>
Submitted on: Tuesday, January 13, 2009 - 11:18:10
Last modified on: Thursday, January 11, 2018 - 06:24:27
Long-term archiving on: Saturday, May 14, 2011 - 00:03:31

Identifiers

  • HAL Id : halshs-00005023, version 2

Citation

Andrei Popescu-Belis, Loïs Rigouste, Susanne Salmon-Alt, Laurent Romary. Online Evaluation of Coreference Resolution. 4th International Conference on Language Resources and Evaluation - LREC'04, 2004, Lisbon, Portugal, 4 p. 〈halshs-00005023v2〉
