On Enabling Layer-Parallelism for Graph Neural Networks using IMEX Integration
Abstract
Graph Neural Networks (GNNs) are a class of neural networks designed to perform machine
learning tasks on graph data. Recently, several works have proposed training
differential equation-inspired GNN architectures, which remain amenable to robust training
even when equipped with a relatively large number of layers. Neural networks with more layers
are potentially more expressive; however, the training time grows linearly with the
number of layers. Layer-parallel training is a method that was developed to overcome
this increase in training time for deeper networks and was first applied to training residual
networks. In this thesis, we first give an overview of existing work on layer-parallel training
and on graph neural networks inspired by differential equations. We then discuss the issues
that arise when these graph neural network architectures are trained parallel-in-layer
and propose solutions to address them. Finally, we present and evaluate
experimental results on layer-parallel GNN training using the proposed approach.
Cite this version of the work
Omer Ege Kara (2024). On Enabling Layer-Parallelism for Graph Neural Networks using IMEX Integration. UWSpace. http://hdl.handle.net/10012/20673