
Expressive Power of Invariant and Equivariant Graph Neural Networks

Abstract

Various classes of Graph Neural Networks (GNN) have been proposed and shown to be successful in a wide range of applications with graph-structured data. In this paper, we propose a theoretical framework able to compare the expressive power of these GNN architectures. The current universality theorems only apply to intractable classes of GNNs. Here, we prove the first approximation guarantees for practical GNNs, paving the way for a better understanding of their generalization. Our theoretical results are proved for invariant GNNs computing a graph embedding (a permutation of the nodes of the input graph does not affect the output) and equivariant GNNs computing an embedding of the nodes (a permutation of the input permutes the output). We show that Folklore Graph Neural Networks (FGNN), which are tensor-based GNNs augmented with matrix multiplication, are the most expressive architectures proposed so far for a given tensor order. We illustrate our results on the Quadratic Assignment Problem (an NP-hard combinatorial problem) by showing that FGNNs are able to learn how to solve the problem, leading to much better average performance than existing algorithms (based on spectral, SDP, or other GNN architectures). On the practical side, we also implement masked tensors to handle batches of graphs of varying sizes.
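The masked-tensor idea mentioned at the end of the abstract can be sketched as follows: pad each adjacency matrix to the size of the largest graph in the batch, keep a boolean mask of the real nodes, and re-apply the mask after each matrix-multiplication update so padded entries never leak into the computation. This is an illustrative sketch, not the authors' released implementation; the function names `batch_graphs` and `masked_matmul_layer` are hypothetical.

```python
import numpy as np

def batch_graphs(adjs):
    """Pad a list of (n_i, n_i) adjacency matrices to a common size.

    Returns a (B, N, N) padded batch and a (B, N) boolean mask that
    marks real (non-padding) nodes. Illustrative helper only.
    """
    n_max = max(a.shape[0] for a in adjs)
    batch = np.zeros((len(adjs), n_max, n_max))
    mask = np.zeros((len(adjs), n_max), dtype=bool)
    for i, a in enumerate(adjs):
        n = a.shape[0]
        batch[i, :n, :n] = a
        mask[i, :n] = True
    return batch, mask

def masked_matmul_layer(h, mask):
    """One FGNN-style update on edge features: a batched matrix
    multiplication, with padded entries zeroed out again via the mask."""
    out = h @ h  # (B, N, N) batched matmul, the FGNN ingredient
    pair_mask = mask[:, :, None] & mask[:, None, :]
    return out * pair_mask
```

In a real FGNN layer the matrix product is taken between learned feature maps of the edge tensor rather than the raw tensor itself, but the masking pattern is the same: any operation that mixes entries across nodes must be followed by a re-masking step.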

Dates and versions

hal-03464024, version 1 (02-12-2021)

Identifiers

Cite

Waïss Azizian, Marc Lelarge. Expressive Power of Invariant and Equivariant Graph Neural Networks. ICLR 2021 - International Conference on Learning Representations, May 2021, Virtual. ⟨hal-03464024⟩