diff --git a/beginner_source/basics/autogradqs_tutorial.py b/beginner_source/basics/autogradqs_tutorial.py
index 8eff127ddee..2753103eaa8 100644
--- a/beginner_source/basics/autogradqs_tutorial.py
+++ b/beginner_source/basics/autogradqs_tutorial.py
@@ -133,7 +133,8 @@
 # - To mark some parameters in your neural network as **frozen parameters**.
 # - To **speed up computations** when you are only doing forward pass, because computations on tensors that do
 #   not track gradients would be more efficient.
-
+# See this `note `_
+# for more details.
 
 ######################################################################
 
@@ -160,6 +161,14 @@
 # - accumulates them in the respective tensor’s ``.grad`` attribute
 # - using the chain rule, propagates all the way to the leaf tensors.
 #
+# To get a sense of what this computational graph looks like, we can use the following tools:
+#
+# 1. ``torchviz`` is a package for visualizing computational graphs.
+#    See the repository here: `https://github.com/szagoruyko/pytorchviz <https://github.com/szagoruyko/pytorchviz>`_
+#
+# 2. Setting ``TORCH_LOGS="+autograd"`` enables logging for the backward pass. See details in this
+#    discussion: `https://dev-discuss.pytorch.org/t/highlighting-a-few-recent-autograd-features-h2-2023/1787 <https://dev-discuss.pytorch.org/t/highlighting-a-few-recent-autograd-features-h2-2023/1787>`_
+#
 # .. note::
 #   **DAGs are dynamic in PyTorch**
 #   An important thing to note is that the graph is recreated from scratch; after each
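
Below is a minimal sketch of the two inspection approaches the patch describes. It assumes
``torchviz`` (and its Graphviz dependency) is installed via ``pip install torchviz``; the
tensor names and the ``graph`` output filename are illustrative, not part of the patch:

    import torch
    from torchviz import make_dot  # third-party package referenced in the patch

    # Build a tiny computational graph: a weighted sum over two leaf tensors.
    x = torch.randn(3, requires_grad=True)
    w = torch.randn(3, requires_grad=True)
    y = (x * w).sum()

    # 1. Visualize the computational graph with torchviz. make_dot returns a
    #    graphviz.Digraph; render() writes graph.png to the working directory.
    make_dot(y, params={"x": x, "w": w}).render("graph", format="png")

    # 2. Enable autograd logging for the backward pass by setting
    #    TORCH_LOGS="+autograd" in the environment before launching Python:
    #        TORCH_LOGS="+autograd" python this_script.py
    y.backward()

With the environment variable set, the run should emit autograd debug logs while
``y.backward()`` traverses the graph.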