Conference paper, Year: 2020

Transfer learning and distant supervision for multilingual Transformer models: A study on African languages

Abstract

Multilingual transformer models like mBERT and XLM-RoBERTa have obtained great improvements for many NLP tasks on a variety of languages. However, recent work has also shown that results from high-resource languages cannot easily be transferred to realistic, low-resource scenarios. In this work, we study trends in performance for different amounts of available resources for the three African languages Hausa, isiXhosa and Yorùbá on both NER and topic classification. We show that, in combination with transfer learning or distant supervision, these models can reach, with as little as 10 or 100 labeled sentences, the same performance as baselines trained on much more supervised data. However, we also find settings where this does not hold. Our discussions and additional experiments on assumptions such as time and hardware restrictions highlight challenges and opportunities in low-resource learning.
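
As a rough illustration of the setting described in the abstract (not the authors' implementation), the sketch below fine-tunes XLM-RoBERTa for topic classification on a handful of labeled sentences using the Hugging Face transformers library. The model name, label set, and placeholder training data are assumptions for illustration only.

```python
# Minimal sketch (not the authors' code): fine-tuning a multilingual
# Transformer on a very small labeled set, as in the low-resource setting
# described in the abstract. Placeholder data and labels are hypothetical.
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

# Hypothetical tiny training set; in the paper this role would be played by
# e.g. 10-100 Hausa or Yorùbá news sentences labeled with their topic.
texts = [
    "placeholder sentence about politics",
    "placeholder sentence about health",
    "placeholder sentence about sport",
    "placeholder sentence about politics",
]
labels = torch.tensor([0, 1, 2, 0])  # topic ids (hypothetical)

tokenizer = AutoTokenizer.from_pretrained("xlm-roberta-base")
model = AutoModelForSequenceClassification.from_pretrained(
    "xlm-roberta-base", num_labels=3
)

enc = tokenizer(texts, padding=True, truncation=True, return_tensors="pt")
optimizer = torch.optim.AdamW(model.parameters(), lr=2e-5)

model.train()
for epoch in range(10):                  # a few epochs suffice for tiny data
    optimizer.zero_grad()
    out = model(**enc, labels=labels)    # cross-entropy loss over topic ids
    out.loss.backward()
    optimizer.step()
```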
Main file
hedderich_EMNLP2020.pdf (378.74 KB)
Origin: Files produced by the author(s)

Dates and versions

hal-03350901, version 1 (21-09-2021)

Identifiers

  • HAL Id: hal-03350901, version 1

Cite

Michael A. Hedderich, David I. Adelani, Dawei Zhu, Jesujoba Alabi, Udia Markus, et al. Transfer learning and distant supervision for multilingual Transformer models: A study on African languages. 2020 Conference on Empirical Methods in Natural Language Processing (EMNLP), Nov 2020, Punta Cana, Dominican Republic. ⟨hal-03350901⟩

Collections

GENCI
17 views
144 downloads
