Conference Papers, Year: 2005

Rethinking Language Models within the Framework of Dynamic Bayesian Networks

Murat Deviren, Khalid Daoudi, Kamel Smaïli

Abstract

We present a new approach to language modeling based on dynamic Bayesian networks. The philosophy behind this architecture is to learn from data the appropriate dependency relations between the linguistic variables used in the language modeling process. It is an original and coherent framework that processes words and classes in the same model. This approach leads to new data-driven language models capable of outperforming classical ones, sometimes with lower computational complexity. We present experiments on small and medium corpora. The results show that this new technique is very promising and deserves further investigation.
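To make the idea of modeling words and classes jointly more concrete, the sketch below shows one simple instance of such a dependency structure: a class-based bigram model in which the word at time t depends on its class, and the class depends on the previous class. This is only an illustrative example under that assumed factorization, not the authors' actual DBN architecture; the functions `train` and `prob`, the corpus, and the word-to-class mapping are all hypothetical.

```python
# Illustrative sketch (not the paper's exact model): a class-based bigram
# language model, one dependency structure a dynamic Bayesian network over
# words and classes can encode.
# Assumed factorization: P(w_t | history) ~ P(w_t | c_t) * P(c_t | c_{t-1})
from collections import defaultdict

def train(sentences, word2class):
    """Estimate P(c_t | c_{t-1}) and P(w_t | c_t) from tokenized sentences."""
    class_bigrams = defaultdict(lambda: defaultdict(int))
    emissions = defaultdict(lambda: defaultdict(int))
    for sent in sentences:
        classes = ["<s>"] + [word2class[w] for w in sent]
        for prev_c, c, w in zip(classes, classes[1:], sent):
            class_bigrams[prev_c][c] += 1
            emissions[c][w] += 1
    def normalize(table):
        return {k: {v: n / sum(row.values()) for v, n in row.items()}
                for k, row in table.items()}
    return normalize(class_bigrams), normalize(emissions)

def prob(sentence, word2class, class_lm, emit):
    """P(sentence) under the factored model (unseen events get 0, for brevity)."""
    p, prev_c = 1.0, "<s>"
    for w in sentence:
        c = word2class[w]
        p *= class_lm.get(prev_c, {}).get(c, 0.0) * emit.get(c, {}).get(w, 0.0)
        prev_c = c
    return p

# Toy usage with a hypothetical corpus and word-to-class mapping.
corpus = [["the", "cat", "sleeps"], ["the", "dog", "runs"]]
word2class = {"the": "DET", "cat": "N", "dog": "N", "sleeps": "V", "runs": "V"}
class_lm, emit = train(corpus, word2class)
print(prob(["the", "dog", "sleeps"], word2class, class_lm, emit))  # > 0: classes generalize to an unseen word sequence
```

The point of the example is that the class variables let the model assign non-zero probability to word sequences never seen in training, which is the kind of dependency structure the paper proposes to learn from data rather than fix in advance.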
Main file: RethinkingNotre (1).pdf (172.42 KB). Origin: files produced by the author(s).

Dates and versions

inria-00000449, version 1 (21-11-2017)

Identifiers

  • HAL Id: inria-00000449, version 1

Cite

Murat Deviren, Khalid Daoudi, Kamel Smaïli. Rethinking Language Models within the Framework of Dynamic Bayesian Networks. 18th Conference of the Canadian Society for Computational Studies of Intelligence (Canadian AI 2005), May 2005, Victoria, Canada. pp. 432-437. ⟨inria-00000449⟩