Seeking universal approximation for continuous counterparts of GNNs on large random graphs
- Others:
- GIPSA Pôle Géométrie, Apprentissage, Information et Algorithmes (GIPSA-GAIA), Grenoble Images Parole Signal Automatique (GIPSA-lab), Centre National de la Recherche Scientifique (CNRS), Université Grenoble Alpes (UGA), Institut polytechnique de Grenoble - Grenoble Institute of Technology (Grenoble INP)
- Institut de Recherche en Informatique et Systèmes Aléatoires (IRISA), Université de Rennes (UR), Institut National des Sciences Appliquées - Rennes (INSA Rennes), Université de Bretagne Sud (UBS), École normale supérieure - Rennes (ENS Rennes), Institut National de Recherche en Informatique et en Automatique (Inria), CentraleSupélec, Centre National de la Recherche Scientifique (CNRS), IMT Atlantique, Institut Mines-Télécom [Paris] (IMT)
- Laboratoire Jean Alexandre Dieudonné (LJAD), Université Nice Sophia Antipolis (UNS), Centre National de la Recherche Scientifique (CNRS), Université Côte d'Azur (UniCA)
- ANR-21-CE48-0009, GRANOLA, Determinantal point processes on graphs for randomized numerical linear algebra (2021)
Description
We explore the expressive power of Graph Neural Networks (GNNs) in the context of large random graphs. In this document, we focus on spectral GNNs, defined over random graphs drawn from a latent-space Random Graph Model (RGM).

Definition 1 (GNN). Let G be a graph with adjacency matrix A. A GNN Φ_G over G is a nonlinear map from the set of graph signals over G into itself. It is composed of successive operations among the following: (i) graph filtering by F(A), where F is any continuous real-valued function over the spectrum of A; (ii) composition with smooth functions, typically Multi-Layer Perceptrons (MLPs); (iii) computation of graph properties: norm, mean, etc.

Definition 2 (Latent-space RGM). A latent space is a probability space (X, P) where X ⊂ R^d is compact. A latent-space RGM is a triplet (W, X, P), where W : X × X → [0, 1] is a symmetric connectivity kernel.
A graph drawn from this model has n nodes X_1, …, X_n i.i.d. from P, which are pairwise connected with probability W(X_i, X_j).
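To make Definitions 1 and 2 concrete, here is a minimal sketch in Python/NumPy that samples a graph from a latent-space RGM and applies one spectral-GNN step. The kernel W, the choice P = Unif([0, 1]^d), the polynomial filter, and the names sample_rgm and gnn_layer are illustrative assumptions, not constructions from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def sample_rgm(n, W, d=1):
    """Sample a graph from a latent-space RGM (Definition 2):
    latent positions X_1, ..., X_n i.i.d. from P (here Unif([0,1]^d),
    an assumption), edge (i,j) present with probability W(X_i, X_j)."""
    X = rng.random((n, d))                    # latent positions
    P_edge = W(X[:, None, :], X[None, :, :])  # pairwise connection probabilities
    U = rng.random((n, n))
    A = (np.triu(U, 1) < np.triu(P_edge, 1)).astype(float)
    return X, A + A.T                         # symmetric adjacency, no self-loops

def gnn_layer(A, f, coeffs, mlp):
    """One spectral-GNN step (Definition 1): filter the signal f by a
    polynomial filter F(A) = sum_k c_k (A/n)^k, then apply a pointwise map
    (a ReLU below, standing in for an MLP)."""
    n = A.shape[0]
    S = A / n                                 # scaling keeps the spectrum bounded as n grows
    out = coeffs[0] * f
    power = f
    for c in coeffs[1:]:
        power = S @ power
        out = out + c * power
    return mlp(out)

# Illustrative smooth kernel and graph signal sampled from f(x) = sin(2*pi*x)
W = lambda x, y: 0.5 * (1 + np.cos(np.pi * np.linalg.norm(x - y, axis=-1)))
X, A = sample_rgm(500, W)
signal = np.sin(2 * np.pi * X[:, 0])          # iota_n f: evaluation of f at the latent points
out = gnn_layer(A, signal, coeffs=[0.0, 1.0, 0.5], mlp=lambda z: np.maximum(z, 0.0))
print(out.shape)  # (500,)
```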
The work [1] introduces a notion of limit of the GNN over large random graphs, called the continuous Graph Neural Network (cGNN). This limit cGNN, denoted Φ_W, processes functions over the latent space X, whereas the GNN processes graph signals. Remark that a graph signal over a random graph G_n drawn from the RGM of Def. 2 can be seen as the sampling of some function f over X, since the nodes are n points drawn i.i.d. in X. Informally, Φ_W is the "continuous counterpart" of the GNN over large random graphs, i.e., Φ_{G_n} ≈ Φ_W when n is large. Formally, let ι_n be the evaluation operator at the points X_1, …, X_n drawn i.i.d. from P. Then Φ_W is such that, for any smooth function f,

Φ_{G_n}(ι_n f) → ι_n Φ_W(f) as n → ∞.   (1)
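As a toy numerical check of (1), consider the single filtering step F(A) = A/n. Under this scaling, a natural continuous counterpart is the integral operator (T_W f)(x) = E_{Y∼P}[W(x, Y) f(Y)]. The kernel, the scaling, and the quadrature below are illustrative assumptions, and the paper's exact normalization may differ.

```python
import numpy as np

rng = np.random.default_rng(1)

# Illustrative smooth kernel on X = [0, 1] (an assumption, not the paper's example)
W = lambda x, y: 0.5 * (1 + np.cos(np.pi * np.abs(x - y)))
f = lambda x: np.sin(2 * np.pi * x)          # a smooth test signal

def T_W_f(x, m=2000):
    """Continuous filtering (T_W f)(x) = integral of W(x, y) f(y) dy over [0, 1]
    (here P = Unif([0, 1])), approximated by a midpoint quadrature on m points."""
    y = (np.arange(m) + 0.5) / m
    return W(x[:, None], y[None, :]) @ f(y) / m

for n in [100, 500, 2000]:
    X = rng.random(n)                        # latent positions X_1, ..., X_n ~ P
    U = rng.random((n, n))
    A = np.triu(U, 1) < np.triu(W(X[:, None], X[None, :]), 1)
    A = (A + A.T).astype(float)              # symmetric adjacency, no self-loops
    graph_side = (A / n) @ f(X)              # Phi_{G_n}(iota_n f) for the filter F(A) = A/n
    limit_side = T_W_f(X)                    # iota_n Phi_W(f), up to quadrature error
    print(n, np.abs(graph_side - limit_side).max())
```

The printed maximum deviation should shrink as n grows, matching the informal statement Φ_{G_n}(ι_n f) ≈ ι_n Φ_W(f).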
Additional details
- URL: https://hal.science/hal-04728922
- URN: urn:oai:HAL:hal-04728922v1
- Origin repository: UNICA