- tags
- Neural networks, Graphs

## Basic properties

To operate on graphs, a neural network must be invariant to graph isomorphism, which translates to permutation invariance over the nodes of the graph.

\[ f(\mathbf{PX}) = f(\mathbf{X}) \]

where \(\mathbf{P}\) is a permutation matrix. For simple sets, this amounts to performing a node-wise transformation and then applying a permutation-invariant aggregator (sum/max/avg/…). This was done in (Zaheer et al. 2018).

\[ f(\mathbf{X}) = \phi\left( \bigoplus_i \psi(\mathbf{x}_i) \right) \]
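This construction can be sketched in numpy; the maps `psi` and `phi` below are arbitrary illustrative choices, not the ones from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)
W1 = rng.normal(size=(4, 8))   # parameters of psi (illustrative)
W2 = rng.normal(size=(8, 2))   # parameters of phi (illustrative)

def psi(X):
    return np.tanh(X @ W1)     # node-wise transformation

def phi(h):
    return h @ W2              # applied after aggregation

def f(X):
    return phi(psi(X).sum(axis=0))   # sum plays the role of the aggregator

X = rng.normal(size=(5, 4))          # a set of 5 elements with 4 features
P = np.eye(5)[rng.permutation(5)]    # random permutation matrix
assert np.allclose(f(P @ X), f(X))   # f(PX) == f(X)
```

Because the sum ranges over all elements, reordering the rows of \(\mathbf{X}\) cannot change the result, whatever `psi` and `phi` are.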

In the case of GNNs, the local neighborhood of a node can also be defined, and therefore the neural network can be written:

\[ f(\mathbf{X}, \mathbf{A}) = \begin{bmatrix} g(\mathbf{x}_1, \mathbf{X}_{\mathcal{N}_1}) \\ g(\mathbf{x}_2, \mathbf{X}_{\mathcal{N}_2}) \\ \vdots \\ g(\mathbf{x}_n, \mathbf{X}_{\mathcal{N}_n}) \end{bmatrix} \]

where \(\mathbf{A}\) is the adjacency matrix of the graph. The function \(g\) must be invariant to permutations of the neighborhood \(\mathbf{X}_{\mathcal{N}_i}\); \(f\) is then permutation *equivariant*: \(f(\mathbf{PX}, \mathbf{PAP}^\top) = \mathbf{P} f(\mathbf{X}, \mathbf{A})\).
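A minimal numpy sketch of such a layer, assuming a sum aggregator over the neighborhood read off from \(\mathbf{A}\) (the two weight matrices and the ReLU are illustrative choices):

```python
import numpy as np

def gnn_layer(X, A, W_self, W_nbr):
    """Row i of the output is g(x_i, X_{N_i}), with a sum aggregator
    over the neighborhood N_i read off from the adjacency matrix A."""
    n, d = X.shape
    H = np.empty((n, W_self.shape[1]))
    for i in range(n):
        nbrs = np.flatnonzero(A[i])                        # N_i
        agg = X[nbrs].sum(axis=0) if nbrs.size else np.zeros(d)
        H[i] = np.maximum(0, X[i] @ W_self + agg @ W_nbr)  # illustrative ReLU
    return H

rng = np.random.default_rng(1)
n, d, k = 6, 4, 3
X = rng.normal(size=(n, d))
A = (rng.random((n, n)) < 0.4).astype(float)
np.fill_diagonal(A, 0)
W_self, W_nbr = rng.normal(size=(d, k)), rng.normal(size=(d, k))

# Relabeling the nodes permutes the output rows accordingly (equivariance):
P = np.eye(n)[rng.permutation(n)]
assert np.allclose(gnn_layer(P @ X, P @ A @ P.T, W_self, W_nbr),
                   P @ gnn_layer(X, A, W_self, W_nbr))
```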

The various ways of defining \(g\) give rise to three main flavours of GNNs: convolutional (fixed aggregation weights), attentional (weights computed from pairs of node features), and message-passing (arbitrary learned messages along edges).
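For instance, an attentional choice of \(g\) weights each neighbor with a softmax coefficient computed from the pair of node features. A simplified GAT-style numpy sketch (the weight matrix `W` and attention vector `a` are illustrative, and the usual LeakyReLU on the scores is omitted):

```python
import numpy as np

def attn_layer(X, A, W, a):
    """Attentional g: each neighbor j of node i is weighted by a softmax
    over scores a . [h_i, h_j] (simplified GAT-style sketch)."""
    H = X @ W
    out = np.zeros_like(H)
    for i in range(X.shape[0]):
        nbrs = np.flatnonzero(A[i])
        if nbrs.size == 0:
            continue
        scores = np.array([a @ np.concatenate([H[i], H[j]]) for j in nbrs])
        alpha = np.exp(scores - scores.max())
        alpha /= alpha.sum()          # softmax over the neighborhood N_i
        out[i] = alpha @ H[nbrs]      # weighted aggregation of neighbors
    return out

rng = np.random.default_rng(2)
n, d, k = 5, 3, 4
X = rng.normal(size=(n, d))
A = (rng.random((n, n)) < 0.5).astype(float)
np.fill_diagonal(A, 0)
W = rng.normal(size=(d, k))
a = rng.normal(size=2 * k)
out = attn_layer(X, A, W, a)
```

Since the softmax is taken over the neighborhood as a multiset, this \(g\) is invariant to the ordering of the neighbors.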

## Bibliography

Zaheer, Manzil, Satwik Kottur, Siamak Ravanbakhsh, Barnabas Poczos, Ruslan Salakhutdinov, and Alexander Smola. 2018. “Deep Sets”. *arXiv:1703.06114 [cs, stat]*.