Bi-Weekly Talk: Eran Rosenbluth: Global Affairs
Wednesday, 22 November 2023, 10:30 a.m.
Location: RWTH Aachen University, Informatikzentrum - Ahornstr. 55, Extension Building E3, Room 9u10
Speaker: Eran Rosenbluth
Abstract:
Graph Neural Networks (GNNs) are a class of graph-processing computational models, extensively used in graph-learning tasks. The computation of a GNN comprises a sequence of local computations for each vertex in the graph.
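To make "local computation" concrete, here is a minimal sketch (ours, not the speaker's) of one GNN layer in NumPy; all names and weight shapes are illustrative assumptions. Each vertex updates its state using only its own state and an aggregate of its neighbours' states:

```python
import numpy as np

def gnn_layer(X, A, W_self, W_nbr):
    """One local update per vertex.
    X: (n, d) vertex features; A: (n, n) 0/1 adjacency matrix;
    W_self, W_nbr: (d, d) illustrative weight matrices."""
    messages = A @ X  # sum over each vertex's neighbours (purely local)
    return np.maximum(X @ W_self + messages @ W_nbr, 0.0)  # ReLU

rng = np.random.default_rng(0)
n, d = 5, 4
A = (rng.random((n, n)) < 0.4).astype(float)
np.fill_diagonal(A, 0)
A = np.maximum(A, A.T)  # undirected graph
X = rng.normal(size=(n, d))
print(gnn_layer(X, A, rng.normal(size=(d, d)), rng.normal(size=(d, d))).shape)  # (5, 4)
```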
Transformers are a class of sequence-processing computational models, extensively used in sequence-learning tasks, e.g. on text. The computation of a transformer comprises a sequence of global computations for each element in the sequence. At the heart of each element's global computation is a sophisticated averaging mechanism known as Attention.
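The "averaging" character of Attention can be seen directly in a sketch. Below is a minimal single-head self-attention, again an illustrative assumption rather than the talk's definitions: every element's new state is a softmax-weighted average over all elements, not just local neighbours:

```python
import numpy as np

def attention(X, W_q, W_k, W_v):
    """X: (n, d) element states; W_q, W_k, W_v: (d, d) illustrative weights."""
    Q, K, V = X @ W_q, X @ W_k, X @ W_v
    scores = Q @ K.T / np.sqrt(K.shape[1])          # pairwise compatibilities
    weights = np.exp(scores - scores.max(axis=1, keepdims=True))
    weights /= weights.sum(axis=1, keepdims=True)   # softmax: each row sums to 1
    return weights @ V                              # global weighted averaging

rng = np.random.default_rng(1)
X = rng.normal(size=(6, 4))
W_q, W_k, W_v = (rng.normal(size=(4, 4)) for _ in range(3))
print(attention(X, W_q, W_k, W_v).shape)  # (6, 4)
```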
Graph Transformers (GTs) are a class of graph-processing computational models that combine GNN-style local computations with global-attention computations for each vertex in the graph. Such a model can potentially describe phenomena characterized by a combination of local-structural and global information.
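One simple way such a combination could look, sketched under our own assumptions (the talk does not fix a specific architecture): a layer that adds a local neighbourhood aggregate to a global attention average over all vertices.

```python
import numpy as np

def gt_layer(X, A, W_loc, W_q, W_k, W_v):
    """X: (n, d) vertex features; A: (n, n) adjacency; weights illustrative."""
    local = np.maximum((A @ X) @ W_loc, 0.0)       # GNN-style local aggregation
    Q, K, V = X @ W_q, X @ W_k, X @ W_v            # global single-head attention
    s = Q @ K.T / np.sqrt(K.shape[1])
    w = np.exp(s - s.max(axis=1, keepdims=True))
    w /= w.sum(axis=1, keepdims=True)
    return local + w @ V                           # local-structural + global info

rng = np.random.default_rng(2)
n, d = 5, 4
A = np.zeros((n, n))
A[0, 1] = A[1, 0] = A[1, 2] = A[2, 1] = 1.0        # a small path graph
X = rng.normal(size=(n, d))
Ws = [rng.normal(size=(d, d)) for _ in range(4)]
print(gt_layer(X, A, *Ws).shape)  # (5, 4)
```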
Another class of models that can describe local-global properties augments GNNs with a simpler global computation: global aggregation, also known as a Virtual Node (VN). Compared to GNN+Attention, the GNN+VN model requires fewer computational resources.
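A minimal sketch of the VN alternative, again with illustrative names and weights: alongside local aggregation, every vertex receives the same global mean of all vertex states. Computing this single summary costs O(nd) per layer, versus the O(n²d) pairwise attention matrix, which is the source of the resource gap.

```python
import numpy as np

def gnn_vn_layer(X, A, W_nbr, W_vn):
    """X: (n, d) vertex features; A: (n, n) adjacency; weights illustrative."""
    local = A @ X                          # local neighbour aggregation
    vn = X.mean(axis=0, keepdims=True)     # (1, d): one global summary for all
    return np.maximum(local @ W_nbr + vn @ W_vn, 0.0)  # broadcast to every vertex

rng = np.random.default_rng(3)
n, d = 5, 4
A = (rng.random((n, n)) < 0.4).astype(float)
np.fill_diagonal(A, 0)
A = np.maximum(A, A.T)
X = rng.normal(size=(n, d))
print(gnn_vn_layer(X, A, rng.normal(size=(d, d)), rng.normal(size=(d, d))).shape)  # (5, 4)
```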
We compare the uniform expressivity of the two hybrid models, GNN+Attention and GNN+VN. We prove that neither model subsumes the other, and we demonstrate the theoretical results with experiments.
Joint work with Jan Toenshoff, Martin Ritzert, and Martin Grohe.