Build your models with PyTorch, TensorFlow or Apache MXNet.
Fast and memory-efficient message passing primitives for training Graph Neural Networks. Scale to giant graphs via multi-GPU acceleration and distributed training infrastructure.
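To give a flavor of the message passing API, here is a minimal sketch using DGL's built-in functions with the PyTorch backend; the toy graph and feature size are made up purely for illustration:

```python
# Minimal sketch of one round of message passing in DGL (PyTorch backend).
# The graph and feature dimensions below are illustrative, not from the docs.
import torch
import dgl
import dgl.function as fn

# Toy directed graph with 4 nodes and 4 edges.
src = torch.tensor([0, 1, 2, 3])
dst = torch.tensor([1, 2, 3, 0])
g = dgl.graph((src, dst))

# Attach a random feature vector to every node.
g.ndata['h'] = torch.randn(g.num_nodes(), 8)

# Message passing: copy each source node's feature as the message,
# then sum incoming messages at every destination node.
g.update_all(fn.copy_u('h', 'm'), fn.sum('m', 'h_sum'))
print(g.ndata['h_sum'].shape)  # torch.Size([4, 8])
```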
DGL empowers a variety of domain-specific projects including DGL-KE for learning large-scale knowledge graph embeddings, DGL-LifeSci for bioinformatics and cheminformatics, and many others.
We are thrilled to announce the arrival of DGL 1.0, a significant milestone of the past 3+ years of development.
As Graph Neural Networks (GNNs) have become increasingly popular, there is wide interest in designing deeper GNN architectures. However, deep GNNs suffer from the oversmoothing issue, where the learnt...
Check out how DGL v0.9.1 helps users partition graphs with billions of nodes and edges.
Check out the highlighted features of the new 0.9 release!
Deep learning on graphs is a very new direction. We use blogs to introduce new ideas and research in this area and to explain how easily DGL can support them.
Read All Blogs

Got questions? Interested in contributing? Or simply want to know what others are playing with? Use our forum for all kinds of discussion.
Visit Our Forum

Brought to you by NYU, NYU-Shanghai, and Amazon AWS.
By far the cleanest and most elegant library for graph neural networks in PyTorch. Highly recommended! Unifies Capsule Nets (GNNs on bipartite graphs) and Transformers (GCNs with attention on fully-connected graphs) in a single API.
I taught my students Deep Graph Library (DGL) in my lecture on "Graph Neural Networks" today. It is a great resource to develop GNNs with PyTorch.