SiGMa: Simple Greedy Matching for Aligning Large Knowledge Bases

Simon Lacoste-Julien [1,2], Konstantina Palla [3], Alex Davies [3], Gjergji Kasneci [4], Thore Graepel [5], Zoubin Ghahramani [3]
[1] SIERRA - Statistical Machine Learning and Parsimony
[2] DI-ENS - Department of Computer Science, École normale supérieure, Paris; Inria Paris-Rocquencourt; CNRS - Centre National de la Recherche Scientifique: UMR 8548
Abstract: The Internet has enabled the creation of a growing number of large-scale knowledge bases in a variety of domains containing complementary information. Tools for automatically aligning these knowledge bases would make it possible to unify many sources of structured knowledge and answer complex queries. However, the efficient alignment of large-scale knowledge bases still poses a considerable challenge. Here, we present Simple Greedy Matching (SiGMa), a simple algorithm for aligning knowledge bases with millions of entities and facts. SiGMa is an iterative propagation algorithm which leverages both the structural information from the relationship graph and flexible similarity measures between entity properties in a greedy local search, thus making it scalable. Despite its greedy nature, our experiments indicate that SiGMa can efficiently match some of the world's largest knowledge bases with high precision. We provide additional experiments on benchmark datasets which demonstrate that SiGMa can outperform state-of-the-art approaches in both accuracy and efficiency.
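To make the abstract's description concrete, the following is a minimal sketch of a greedy iterative-propagation matcher in the spirit described above, not the paper's actual implementation. All names (`sigma_sketch`, the additive scoring of structural and property evidence, the seed-match interface) are assumptions for illustration; the real SiGMa uses more refined score functions and data structures.

```python
import heapq

def sigma_sketch(neighbors1, neighbors2, sim, seed_matches):
    """Greedy one-to-one alignment sketch between two entity graphs.

    neighbors1/neighbors2: dict mapping each entity to its set of
        graph neighbors in knowledge base 1 / 2.
    sim: property-similarity function sim(e1, e2) -> float
        (e.g. string similarity on entity names) -- an assumed interface.
    seed_matches: initial list of (e1, e2) matched pairs.
    Returns a dict mapping KB1 entities to their KB2 matches.
    """
    matched1, matched2 = {}, {}
    heap = []  # max-heap via negated scores

    def score(e1, e2):
        # Structural evidence: number of neighbors of e1 whose current
        # match is a neighbor of e2, plus flexible property similarity.
        graph = sum(1 for n in neighbors1.get(e1, ())
                    if matched1.get(n) in neighbors2.get(e2, ()))
        return graph + sim(e1, e2)

    def push_candidates(e1, e2):
        # Propagation step: a new match suggests candidate pairs
        # among the neighborhoods of the matched entities.
        for n1 in neighbors1.get(e1, ()):
            for n2 in neighbors2.get(e2, ()):
                if n1 not in matched1 and n2 not in matched2:
                    heapq.heappush(heap, (-score(n1, n2), n1, n2))

    for e1, e2 in seed_matches:
        matched1[e1], matched2[e2] = e2, e1
        push_candidates(e1, e2)

    # Greedy local search: always commit to the best-scoring candidate.
    while heap:
        _, e1, e2 = heapq.heappop(heap)
        if e1 in matched1 or e2 in matched2:
            continue  # stale candidate; one side was matched meanwhile
        matched1[e1], matched2[e2] = e2, e1
        push_candidates(e1, e2)

    return matched1
```

Because candidates are only generated in the neighborhoods of existing matches, the search touches a small fraction of all possible entity pairs, which is what makes this greedy scheme scale to large graphs.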
Document type:
Preprint, working paper
10 pages + 2 pages appendix; 5 figures -- initial preprint. 2012
Contributor: Simon Lacoste-Julien
Submitted on: Thursday, December 20, 2012 - 23:12:03
Last modified on: Thursday, January 11, 2018 - 06:23:26

Full-text link


  • HAL Id : hal-00768180, version 1
  • arXiv: 1207.4525



Simon Lacoste-Julien, Konstantina Palla, Alex Davies, Gjergji Kasneci, Thore Graepel, Zoubin Ghahramani. SiGMa: Simple Greedy Matching for Aligning Large Knowledge Bases. 10 pages + 2 pages appendix; 5 figures -- initial preprint. 2012. 〈hal-00768180〉


