Journal article

Enlarged Krylov Subspace Conjugate Gradient Methods for Reducing Communication

Laura Grigori (1), Sophie Moufawad (2), Frédéric Nataf (2)
(1) ALPINES - Algorithms and parallel tools for integrated numerical simulations
(2) LJLL - Laboratoire Jacques-Louis Lions, Inria Paris-Rocquencourt, INSMI - Institut National des Sciences Mathématiques et de leurs Interactions
Abstract: In this paper we introduce a new approach for reducing communication in Krylov subspace methods that consists of enlarging the Krylov subspace by a maximum of $t$ vectors per iteration, based on a domain decomposition of the graph of $A$. The resulting enlarged Krylov subspace $\mathscr{K}_{k,t}(A,r_0)$ is a superset of the Krylov subspace $\mathcal{K}_k(A,r_0)$, i.e., $\mathcal{K}_k(A,r_0) \subset \mathscr{K}_{k,t}(A,r_0)$. Thus, we search for the solution of the system $Ax=b$ in $\mathscr{K}_{k,t}(A,r_0)$ instead of $\mathcal{K}_k(A,r_0)$. Moreover, we show in this paper that the enlarged Krylov projection subspace methods lead to faster convergence in terms of iterations, and to parallelizable algorithms that require less communication than classical Krylov methods.
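The containment $\mathcal{K}_k(A,r_0) \subset \mathscr{K}_{k,t}(A,r_0)$ can be checked numerically. The sketch below is an illustration, not the paper's algorithm: it stands in for the domain decomposition with a simple partition of the index set into $t$ contiguous blocks, splits $r_0$ into $t$ domain-restricted vectors, builds the enlarged basis $[T(r_0), A\,T(r_0), \dots, A^{k-1}T(r_0)]$, and verifies that every vector of the standard Krylov basis lies in its span (since $r_0$ is the sum of the columns of $T(r_0)$).

```python
import numpy as np

# Illustrative sketch (not the paper's algorithm): enlarged Krylov
# subspace built by splitting r0 over t index blocks, a stand-in for
# a domain decomposition of the graph of A.

rng = np.random.default_rng(0)
n, t, k = 12, 3, 3

# SPD matrix A (the CG setting) and an initial residual r0.
M = rng.standard_normal((n, n))
A = M @ M.T + n * np.eye(n)
r0 = rng.standard_normal(n)

# Partition {0, ..., n-1} into t blocks (hypothetical decomposition).
domains = np.array_split(np.arange(n), t)

# T(r0): t vectors, each the restriction of r0 to one domain,
# so that r0 equals the sum of the columns of T.
T = np.zeros((n, t))
for j, dom in enumerate(domains):
    T[dom, j] = r0[dom]

# Enlarged Krylov basis: [T, A T, ..., A^{k-1} T] -> up to k*t vectors.
blocks = [T]
for _ in range(k - 1):
    blocks.append(A @ blocks[-1])
V_enlarged = np.hstack(blocks)

# Standard Krylov basis: [r0, A r0, ..., A^{k-1} r0] -> k vectors.
K = np.column_stack([np.linalg.matrix_power(A, i) @ r0
                     for i in range(k)])

# Each A^i r0 = sum_j (A^i T)[:, j], so every column of K should be
# exactly reproducible from the enlarged basis by least squares.
coef, *_ = np.linalg.lstsq(V_enlarged, K, rcond=None)
print(np.allclose(V_enlarged @ coef, K))  # containment check
```

The enlarged basis has up to $kt$ directions after $k$ iterations instead of $k$, which is the source of the faster convergence per iteration reported in the paper.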
Contributor: Laura Grigori
Submitted on: Tuesday, August 30, 2016
Last modified on: Tuesday, June 16, 2020

Laura Grigori, Sophie Moufawad, Frédéric Nataf. Enlarged Krylov Subspace Conjugate Gradient Methods for Reducing Communication. SIAM Journal on Matrix Analysis and Applications, Society for Industrial and Applied Mathematics, 2016, 37 (2), pp.744-773. ⟨10.1137/140989492⟩. ⟨hal-01357899⟩
