2025-07-25 – David Lawrence Hall, Room 107
This talk introduces GraphNeuralNetworks.jl, a Julia framework for deep learning on graphs. It supports both dense and sparse graph representations, multiple GPU backends, and flexible manipulation of standard, heterogeneous, and temporal graphs. The framework provides gather/scatter message-passing primitives for defining custom layers, along with a collection of standard layers for rapid prototyping. Real-world use cases and ongoing developments will also be discussed.
In this talk, I will introduce GraphNeuralNetworks.jl, an open-source framework for deep learning on graphs, developed in the Julia programming language. Designed for flexibility and performance, the library supports multiple GPU backends and works seamlessly with both sparse and dense graph representations. I’ll show how it enables intuitive manipulation of standard, heterogeneous, and temporal graphs, with full support for attributes at the node, edge, and graph levels. The framework allows users to define custom graph convolutional layers using gather/scatter message-passing primitives and includes a suite of popular layers for rapid experimentation. I will also highlight real-world use cases and ongoing developments within the ecosystem.
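As a taste of the workflow the talk describes, here is a minimal sketch of using GraphNeuralNetworks.jl: applying a built-in layer, then defining a custom convolutional layer from the gather/scatter message-passing primitives. It assumes the library's public API (`GNNGraph`, `rand_graph`, `GCNConv`, `propagate`, `copy_xj`); the `SumConv` layer itself is a hypothetical example, not part of the library.

```julia
using GraphNeuralNetworks, Flux

# Random graph with 10 nodes and 30 edges, plus 3-dimensional node features
# (feature matrices are feature_dim × num_nodes, one column per node).
g = rand_graph(10, 30)
x = rand(Float32, 3, 10)

# A built-in layer for rapid prototyping: graph convolution 3 => 8 features.
y = GCNConv(3 => 8)(g, x)

# A custom layer built from the message-passing primitives:
# `propagate` gathers each source neighbor's features (`copy_xj` is the
# message function) and scatters them back to destination nodes,
# aggregating with `+`.
struct SumConv
    weight::Matrix{Float32}
end
SumConv(in::Int, out::Int) = SumConv(randn(Float32, out, in))

function (l::SumConv)(g::GNNGraph, x::AbstractMatrix)
    m = propagate(copy_xj, g, +; xj = x)  # sum neighbor features per node
    return l.weight * m                   # linear transform of the aggregate
end

h = SumConv(3, 8)(g, x)  # 8 × 10 output, one column per node
```

The same `propagate` machinery accepts custom message functions of node, neighbor, and edge features, which is how more elaborate layers are expressed.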