pytorch/torch/csrc/distributed/autograd
Pritam Damania 6e43f0db8b Use correct signatures for METH_NOARGS. (#45528)
Summary:
Pull Request resolved: https://github.com/pytorch/pytorch/pull/45528

As described in https://github.com/pytorch/pytorch/issues/45419,
this change resolves a number of CPython signature issues for functions registered with METH_NOARGS (a sketch of the corrected pattern follows the commit metadata below).

Closes: https://github.com/pytorch/pytorch/issues/45419
ghstack-source-id: 113385726

Test Plan: sentinel

Reviewed By: albanD

Differential Revision: D24000626

fbshipit-source-id: d334596f1f0256063691aa044c8fb2face260817
2020-10-02 10:43:58 -07:00
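
For context on the fix above: CPython invokes a METH_NOARGS function with two PyObject* arguments, the second of which is always NULL, so declaring the handler with a single parameter and casting it into the method table is undefined behavior under strict calling conventions. Below is a minimal, hypothetical sketch of the corrected pattern; the function and table names are illustrative and not taken from init.cpp.

#include <Python.h>

// Incorrect (pre-fix) pattern: declaring PyObject* f(PyObject* self) and
// force-casting it to PyCFunction. Calling it through the two-argument
// METH_NOARGS convention is undefined behavior.
//
// Correct pattern: accept both arguments and ignore the second.
static PyObject* dist_autograd_noargs_example(
    PyObject* self,
    PyObject* Py_UNUSED(unused)) {
  Py_RETURN_NONE;
}

static PyMethodDef methods[] = {
    // The function already has the PyCFunction signature
    // PyObject* (*)(PyObject*, PyObject*), so no cast is needed.
    {"noargs_example",
     dist_autograd_noargs_example,
     METH_NOARGS,
     "Example of a correctly declared METH_NOARGS function."},
    {nullptr, nullptr, 0, nullptr}};
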
Name                Latest commit                                                                                Date
context             Replace FutureMessage with c10::ivalue::Future in DistEngine. (#44239)                       2020-09-11 01:03:42 -07:00
engine              Replace FutureMessage with c10::ivalue::Future in DistEngine. (#44239)                       2020-09-11 01:03:42 -07:00
functions           Distributed Autograd: Allow multiple backward passes to accumulate gradients. (#32506)      2020-02-06 23:27:21 -08:00
rpc_messages        Add prefix of remote events for RPC profiling (#40066)                                       2020-06-22 11:01:07 -07:00
autograd.cpp        [dist_autograd] expose distributed backward C++ API (#38656)                                 2020-06-08 19:42:21 -07:00
autograd.h          [dist_autograd] expose distributed backward C++ API (#38656)                                 2020-06-08 19:42:21 -07:00
init.cpp            Use correct signatures for METH_NOARGS. (#45528)                                             2020-10-02 10:43:58 -07:00
python_autograd.h   [dist_autograd] expose distributed backward C++ API (#38656)                                 2020-06-08 19:42:21 -07:00
utils.cpp           [RPC profiling] Don't wrap toHere() calls with profiling (#44655)                            2020-09-22 21:17:00 -07:00
utils.h             [RPC profiling] Don't wrap toHere() calls with profiling (#44655)                            2020-09-22 21:17:00 -07:00