Bi-Weekly Talk: Eran Rosenbluth: Global Affairs
Wednesday, November 22, 2023, 10:30am
Location: RWTH Aachen University, Department of Computer Science - Ahornstr. 55, building E3, room 9u10
Speaker: Eran Rosenbluth
Graph Neural Networks (GNNs) are a class of graph-processing computational models, extensively used in graph-learning tasks. The computation of a GNN comprises a sequence of local computations for each vertex in the graph.
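A minimal sketch of one such local computation, assuming sum aggregation over neighbours and scalar features with illustrative weights (the names `gnn_layer`, `w_self`, `w_neigh` are hypothetical, not from the talk):

```python
def gnn_layer(features, adjacency, w_self, w_neigh):
    # One round of local computation: each vertex combines its own
    # feature with the sum of its neighbours' features.
    return [w_self * features[v] + w_neigh * sum(features[u] for u in adjacency[v])
            for v in range(len(features))]

# Tiny example: the path graph 0 - 1 - 2 with scalar vertex features.
features = [1.0, 2.0, 3.0]
adjacency = [[1], [0, 2], [1]]
print(gnn_layer(features, adjacency, 1.0, 1.0))  # vertex 1 sees both neighbours
```

Stacking k such layers lets each vertex see information from its k-hop neighbourhood, but nothing farther.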
Transformers are a class of sequence-processing computational models, extensively used in sequence-learning tasks, e.g., text. The computation of a transformer comprises a sequence of global computations for each element in the sequence. At the heart of an element's global computation is a sophisticated averaging mechanism known as attention.
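The averaging at the heart of attention can be sketched as follows, using scalar queries, keys, and values for brevity (real transformers use learned vector projections; this toy `attention` is illustrative only):

```python
import math

def attention(queries, keys, values):
    # For each element, weight *all* values by a softmax of query-key
    # scores: a global, learned averaging over the whole sequence.
    out = []
    for q in queries:
        scores = [q * k for k in keys]            # scalar "dot products"
        m = max(scores)                           # stabilise the softmax
        exp = [math.exp(s - m) for s in scores]
        z = sum(exp)
        out.append(sum((e / z) * v for e, v in zip(exp, values)))
    return out
```

When all scores are equal the result is a plain mean of the values; the learned scores let the model skew that average toward relevant elements.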
Graph Transformers (GTs) are a class of graph-processing computational models that combine GNN-style local computations with global-attention computations for each vertex in the graph. Such a model can potentially capture phenomena characterized by a combination of local-structural and global information.
Another class of models that can describe local-global properties augments GNNs with a simpler global computation: global aggregation, a.k.a. a Virtual Node (VN). Compared to GNN+Attention, the GNN+VN model requires fewer compute resources.
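The virtual-node mechanism can be sketched as a single global aggregate broadcast back to every vertex alongside the usual neighbour sum (a toy with scalar features and unit weights; `vn_layer` is an illustrative name, not notation from the talk):

```python
def vn_layer(features, adjacency):
    # Virtual node: compute one global aggregate (here, the mean) and
    # broadcast it to every vertex, in addition to the local neighbour sum.
    global_avg = sum(features) / len(features)
    return [features[v] + sum(features[u] for u in adjacency[v]) + global_avg
            for v in range(len(features))]

# Same path graph 0 - 1 - 2 as before.
print(vn_layer([1.0, 2.0, 3.0], [[1], [0, 2], [1]]))
```

Unlike attention, the broadcast term is identical for all vertices, which is why a VN is cheaper: one aggregate per layer instead of a pairwise score matrix.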
We compare the uniform expressivity of the two hybrid models, GNN+Attention and GNN+VN. We prove that neither model subsumes the other, and demonstrate the theoretical results with experiments.
Joint work with Jan Toenshoff, Martin Ritzert, and Martin Grohe.