# Enlarged Krylov Subspace Conjugate Gradient Methods for Reducing Communication

1 ALPINES - Algorithms and parallel tools for integrated numerical simulations
LJLL - Laboratoire Jacques-Louis Lions, Inria Paris-Rocquencourt, Institut National des Sciences Mathématiques et de leurs Interactions
Abstract: In this paper we introduce a new approach for reducing communication in Krylov subspace methods that consists of enlarging the Krylov subspace by a maximum of $t$ vectors per iteration, based on a domain decomposition of the graph of $A$. The resulting enlarged Krylov subspace $\mathscr{K}_{k,t}(A,r_0)$ is a superset of the Krylov subspace $\mathcal{K}_k(A,r_0)$, that is, $\mathcal{K}_k(A,r_0) \subset \mathscr{K}_{k,t}(A,r_0)$. Thus, we search for the solution of the system $Ax=b$ in $\mathscr{K}_{k,t}(A,r_0)$ instead of $\mathcal{K}_k(A,r_0)$. Moreover, we show in this paper that the enlarged Krylov projection subspace methods lead to faster convergence in terms of iterations and to parallelizable algorithms with less communication than classical Krylov methods.
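As a rough illustration of the construction described in the abstract, the following sketch builds a basis of an enlarged Krylov subspace in Python with NumPy. It assumes the splitting operator $T$ decomposes $r_0$ into $t$ vectors according to a given partition of the unknowns (the subdomains); the function names, the toy 1D Laplacian, and the partition are illustrative choices, not the paper's implementation.

```python
import numpy as np

def split_by_domains(r, domains):
    """Split r into t vectors, one per subdomain: piece j keeps the
    entries of r indexed by domain j and is zero elsewhere."""
    pieces = np.zeros((r.size, len(domains)))
    for j, idx in enumerate(domains):
        pieces[idx, j] = r[idx]
    return pieces

def enlarged_krylov_basis(A, r0, domains, k):
    """Columns spanning K_{k,t}(A, r0) =
    span{T(r0), A T(r0), ..., A^{k-1} T(r0)}."""
    V = split_by_domains(r0, domains)
    blocks = [V]
    for _ in range(k - 1):
        V = A @ V
        blocks.append(V)
    return np.hstack(blocks)

# Toy example: 1D Laplacian on 8 points, split into t = 2 subdomains.
n, t, k = 8, 2, 3
A = 2 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)
r0 = np.ones(n)
domains = [np.arange(0, 4), np.arange(4, 8)]
B = enlarged_krylov_basis(A, r0, domains, k)  # shape (8, t*k) = (8, 6)

# Since r0 is the sum of its t pieces, every classical Krylov vector
# A^j r0 is a sum of columns of B, so K_k(A, r0) ⊂ K_{k,t}(A, r0).
```

Each subdomain's pieces can be generated by a separate processor, which is the source of the parallelism and reduced communication the abstract refers to.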
Document type: journal article.
SIAM Journal on Matrix Analysis and Applications, Society for Industrial and Applied Mathematics, 2016, 37 (2), pp. 744-773. doi:10.1137/140989492

https://hal.inria.fr/hal-01357899
Contributor: Laura Grigori
Submitted on: Tuesday, August 30, 2016 - 15:48:53
Last modified on: Friday, May 25, 2018 - 12:02:06

### Citation

Laura Grigori, Sophie Moufawad, Frédéric Nataf. Enlarged Krylov Subspace Conjugate Gradient Methods for Reducing Communication. SIAM Journal on Matrix Analysis and Applications, Society for Industrial and Applied Mathematics, 2016, 37 (2), pp. 744-773. doi:10.1137/140989492. hal-01357899
