Journal article, Journal of Machine Learning Research, 2022

A Random Matrix Perspective on Random Tensors

Abstract

Several machine learning problems such as latent variable model learning and community detection can be addressed by estimating a low-rank signal from a noisy tensor. Despite recent substantial progress on the fundamental limits of the corresponding estimators in the large-dimensional setting, some of the most significant results are based on spin glass theory, which is not easily accessible to non-experts. We propose a sharply distinct and more elementary approach, relying on tools from random matrix theory. The key idea is to study random matrices arising from contractions of a random tensor, which give access to its spectral properties. In particular, for a symmetric dth-order rank-one model with Gaussian noise, our approach yields a novel characterization of maximum likelihood (ML) estimation performance in terms of a fixed-point equation valid in the regime where weak recovery is possible. For d=3, the solution to this equation matches the existing results. We conjecture that the same holds for any order d, based on numerical evidence for d∈{4,5}. Moreover, our analysis illuminates certain properties of the large-dimensional ML landscape. Our approach can be extended to other models, including asymmetric and non-Gaussian ones.
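To make the central idea concrete, the sketch below (not the authors' code; the normalization and parameter values are illustrative assumptions) builds a symmetric rank-one spiked tensor T = beta * x⊗x⊗x + W/sqrt(n) with Gaussian noise and inspects the spectrum of the random matrix obtained by contracting T along one mode with the planted vector x. For sufficiently large beta, the largest eigenvalue detaches from the noise bulk and its eigenvector aligns with x, which is the kind of spectral information the random-matrix approach exploits.

    import numpy as np

    rng = np.random.default_rng(0)
    n, beta = 150, 3.0                     # illustrative dimension and signal-to-noise ratio

    # Planted unit-norm signal x and a symmetrized Gaussian noise tensor W
    # (the exact symmetrization/normalization convention is an assumption here).
    x = rng.standard_normal(n)
    x /= np.linalg.norm(x)

    W = rng.standard_normal((n, n, n))
    W = (W + W.transpose(0, 2, 1) + W.transpose(1, 0, 2) + W.transpose(1, 2, 0)
           + W.transpose(2, 0, 1) + W.transpose(2, 1, 0)) / 6.0

    # Spiked rank-one model: T = beta * x (outer) x (outer) x + W / sqrt(n).
    T = beta * np.einsum('i,j,k->ijk', x, x, x) + W / np.sqrt(n)

    # Contracting T along the third mode with a direction y yields a symmetric
    # n x n random matrix M(y) = T(., ., y), whose spectrum can be studied.
    def contraction(T, y):
        return np.einsum('ijk,k->ij', T, y)

    M = contraction(T, x)
    eigvals, eigvecs = np.linalg.eigh(M)
    print("top two eigenvalues of M(x):", eigvals[-1], eigvals[-2])
    # For beta large enough, the top eigenvector correlates with the planted signal.
    print("alignment |<v, x>|:", abs(eigvecs[:, -1] @ x))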

Dates and versions

hal-03793940, version 1 (02-10-2022)

License

Attribution (CC BY)

Identifiers

hal-03793940

Cite

José Henrique de M Goulart, Romain Couillet, Pierre Comon. A Random Matrix Perspective on Random Tensors. Journal of Machine Learning Research, 2022, 23 (264), pp. 1-36. ⟨hal-03793940⟩