.. _sphx_glr_tutorials_models_2_small_graph:

.. _tutorials2-index:

Batching many small graphs
------------------------------

* **Tree-LSTM** `[paper] `__ `[tutorial] <2_small_graph/3_tree-lstm.html>`__ `[PyTorch code] `__:
  Sentences have inherent structures that are thrown away by treating them
  simply as sequences. Tree-LSTM is a powerful model that learns the
  representation by using prior syntactic structures such as a parse tree.
  The challenge in training is that simply padding sentences to the maximum
  length no longer works: trees of different sentences have different sizes
  and topologies. DGL solves this problem by adding the trees to a bigger
  container graph and then using message passing to exploit maximum
  parallelism. Batching is a key API for this; a minimal sketch appears
  after the gallery below.

.. raw:: html

    <div class="sphx-glr-thumbnails">
.. thumbnail-parent-div-open

.. raw:: html

    <div class="sphx-glr-thumbcontainer">
.. only:: html

  .. image:: /tutorials/models/2_small_graph/images/thumb/sphx_glr_3_tree-lstm_thumb.png
    :alt:

  :ref:`sphx_glr_tutorials_models_2_small_graph_3_tree-lstm.py`

.. raw:: html

      <div class="sphx-glr-thumbnail-title">Tree-LSTM in DGL</div>
    </div>
.. thumbnail-parent-div-close

.. raw:: html

    </div>
.. toctree::
   :hidden:

   /tutorials/models/2_small_graph/3_tree-lstm
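
The batching idea described above fits in a few lines. The following is a
minimal sketch, assuming DGL and PyTorch are installed; the toy trees and
the feature size are made up for illustration. ``dgl.batch`` merges several
small graphs into one container graph so that a single message-passing call
runs over all of them in parallel, and ``dgl.unbatch`` recovers the
originals afterwards.

.. code-block:: python

    import dgl
    import dgl.function as fn
    import torch

    # Two toy trees with different sizes and topologies;
    # each edge points from child to parent.
    tree1 = dgl.graph(([0, 1], [2, 2]))        # 3 nodes, root is node 2
    tree2 = dgl.graph(([0, 1, 2], [3, 3, 3]))  # 4 nodes, root is node 3

    # dgl.batch places both trees into a single container graph.
    # Node IDs are relabeled, but the trees stay disconnected.
    bg = dgl.batch([tree1, tree2])
    assert bg.batch_size == 2 and bg.num_nodes() == 7

    # One message-passing call now updates every tree at once:
    # each node sums the features of its children (leaves, which
    # receive no messages, end up with zeros).
    bg.ndata['h'] = torch.ones(bg.num_nodes(), 4)
    bg.update_all(fn.copy_u('h', 'm'), fn.sum('m', 'h'))

    # dgl.unbatch splits the container graph back into the trees.
    tree1, tree2 = dgl.unbatch(bg)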