Mirror of https://github.com/zebrajr/pytorch.git, synced 2025-12-07 00:21:07 +01:00.
I think this makes sense to do? Otherwise, if you call `backward()` in your traced function, you can't get gradients out of any tensors that should have been leaves.

Pull Request resolved: https://github.com/pytorch/pytorch/pull/77474
Approved by: https://github.com/ezyang
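As context for the leaf-tensor behavior the commit message refers to, here is a minimal eager-mode sketch (not the traced case itself): a tensor created directly by the user with `requires_grad=True` is a leaf, and calling `backward()` populates its `.grad` field.

```python
import torch

# A leaf tensor: created directly by the user, not by an operation.
x = torch.ones(3, requires_grad=True)

# y is a non-leaf intermediate produced by operations on x.
y = (x * 2).sum()

# backward() accumulates gradients into leaf tensors only.
y.backward()

print(x.is_leaf)  # True
print(x.grad)     # tensor([2., 2., 2.]) — d(sum(2x))/dx = 2 per element
```

If tracing rewrote `x` into a non-leaf proxy, `x.grad` would stay `None` after `backward()`, which is the failure mode the PR avoids.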
| Name |
|---|
| experimental/ |
| passes/ |
| __init__.py |
| __init__.pyi |
| _compatibility.py |
| _pytree.py |
| _symbolic_trace.py |
| annotate.py |
| graph_module.py |
| graph.py |
| immutable_collections.py |
| interpreter.py |
| node.py |
| operator_schemas.py |
| OVERVIEW.md |
| proxy.py |
| subgraph_rewriter.py |
| tensor_type.py |