
Learning fast dictionaries for sparse representations using low-rank tensor decompositions

Abstract: A new dictionary learning model is introduced in which the dictionary matrix is constrained to be a sum of R Kronecker products of K terms. It offers a more compact representation and requires less training data than the general dictionary learning model, while generalizing Tucker dictionary learning. The proposed Higher Order Sum of Kroneckers model can be computed by combining dictionary learning approaches with the tensor Canonical Polyadic Decomposition. Experiments on image denoising illustrate the advantages of the proposed approach.
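
To make the constraint concrete, below is a minimal NumPy sketch of a dictionary built as a sum of R Kronecker products of K small factor matrices. The factor sizes, variable names, and random initialization are illustrative assumptions, not values taken from the paper.

import numpy as np
from functools import reduce

def sum_of_kroneckers(factors):
    # factors: a list of R lists, each holding K small matrices.
    # Returns the dense dictionary D = sum_r kron(A_r^(1), ..., A_r^(K)).
    return sum(reduce(np.kron, terms) for terms in factors)

# Illustrative sizes (assumed, not from the paper): R = 2 Kronecker terms,
# each the Kronecker product of K = 3 small 4 x 5 factors.
rng = np.random.default_rng(0)
R = 2
shapes = [(4, 5), (4, 5), (4, 5)]
factors = [[rng.standard_normal(s) for s in shapes] for _ in range(R)]

D = sum_of_kroneckers(factors)
print(D.shape)  # (64, 125): an overcomplete 64 x 125 dictionary

# In practice the dense D need not be formed: a product with a Kronecker
# term can be applied factor by factor, which is what makes such
# structured dictionaries fast to apply.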
Document type: Conference papers

Cited literature: 19 references

https://hal.inria.fr/hal-01709343
Contributor: Cassio F. Dantas
Submitted on: Friday, April 27, 2018 - 5:25:51 PM
Last modification on: Thursday, January 7, 2021 - 4:19:35 PM

File

notes (1).pdf
Files produced by the author(s)

Citation

Cassio Fraga Dantas, Jérémy Cohen, Rémi Gribonval. Learning fast dictionaries for sparse representations using low-rank tensor decompositions. LVA/ICA 2018 - 14th International Conference on Latent Variable Analysis and Signal Separation, Jul 2018, Guildford, United Kingdom. pp.456-466, ⟨10.1007/978-3-319-93764-9_42⟩. ⟨hal-01709343v2⟩

Metrics

Record views: 1172
File downloads: 1704