The recent DGL 0.5 release is a major update covering many aspects of the project, including documentation, APIs, system speed, and scalability. This article highlights some of the new features.
Build your models with PyTorch, TensorFlow or MXNet.
DGL adopts advanced optimization techniques like kernel fusion, multi-thread and multi-process acceleration, and
automatic sparse format tuning. Compared with other popular GNN frameworks such as PyTorch Geometric, DGL is both
faster and more memory-friendly.
Ecosystem of Domain-Specific Toolkits
DGL supports a variety of domains. DGL-KE is an easy-to-use and highly scalable package for learning
large-scale knowledge graph embeddings. DGL-LifeSci is a specialized package for applications
in bioinformatics and cheminformatics powered by graph neural networks.
Keep track of what's new in DGL: important bug fixes, new features, new releases, and more.
Brought to you by NYU, NYU Shanghai, and AWS.
NYU Professor, Director of Facebook AI Research
By far the cleanest and most elegant library for graph neural networks in PyTorch. Highly recommended! Unifies Capsule Nets (GNNs on bipartite graphs)
and Transformers (GCNs with attention on fully-connected graphs) in a single API.
Inventor of the Graph Convolutional Network
I taught my students Deep Graph Library (DGL) in my lecture on "Graph Neural Networks" today. It is a great resource to develop GNNs with PyTorch.