Commit Graph

7 Commits

Author SHA1 Message Date
Bugra Akyildiz
27c7158166 Remove __future__ imports for legacy Python 2 support (#45033)
Summary:
There is a tool called `2to3` which you can point at the `future` fixer specifically to remove these; the `caffe2` directory has the most of these redundant imports:

```
2to3 -f future -w caffe2
```
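For context, the imports being stripped are the standard Python 2 compatibility header; a minimal illustration of why they are no-ops on Python 3 (behavior shown is standard Python, not project-specific):

```python
# Legacy Python 2 compatibility header that `2to3 -f future` removes:
#
#   from __future__ import absolute_import
#   from __future__ import division
#   from __future__ import print_function
#   from __future__ import unicode_literals
#
# On Python 3 these behaviors are already the default:
print(3 / 2)              # true division: 1.5, not 1
print(isinstance("a", str))  # string literals are unicode (str) by default
```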

Pull Request resolved: https://github.com/pytorch/pytorch/pull/45033

Reviewed By: seemethere

Differential Revision: D23808648

Pulled By: bugra

fbshipit-source-id: 38971900f0fe43ab44a9168e57f2307580d36a38
2020-09-23 17:57:02 -07:00
Jerry Zhang
d3742603cb DeviceScope support for CUDA and testing (#15357)
Summary:
Pull Request resolved: https://github.com/pytorch/pytorch/pull/15357

Supports the device option in FQ batch-norm folding for ITER-related ops

Reviewed By: wat3rBro

Differential Revision: D13370259

fbshipit-source-id: 4324c2716dfa69ddedc661ae2b1ad34c2f6fc4b6
2019-01-30 18:42:12 -08:00
Yiming Wu
a1494efdfa fix auto grad summing for IfOp where intermediate output needs renaming (#14772)
Summary:
Fix auto grad summing for IfOp where an intermediate output needs renaming.

Bug before this diff:
- we only renamed the output of IfOp without renaming the corresponding outputs of the ops inside the subnet
- this resulted in a "blob not found" error

The unit test provides an example; this diff fixes that for IfOp.
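The bug can be sketched outside Caffe2; a hypothetical toy model of a net wrapping a subnet (dict shapes and blob names here are illustrative, not Caffe2's actual API):

```python
def rename_output(net, subnet, old, new):
    """Rename an IfOp-style output consistently.

    Pre-fix behavior renamed only the outer net's output list; the subnet
    op still wrote to `old`, so a later read of `new` failed with a
    "blob not found"-style error. The fix renames both.
    """
    net["outputs"] = [new if o == old else o for o in net["outputs"]]
    # The fix: also rename the matching outputs inside the subnet's ops.
    for op in subnet["ops"]:
        op["outputs"] = [new if o == old else o for o in op["outputs"]]

# Illustrative blob names for a renamed gradient output:
net = {"outputs": ["x_grad"]}
subnet = {"ops": [{"outputs": ["x_grad"]}]}
rename_output(net, subnet, "x_grad", "x_grad_renamed")
```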
Pull Request resolved: https://github.com/pytorch/pytorch/pull/14772

Differential Revision: D13327090

Pulled By: harouwu

fbshipit-source-id: ec40ee88526ace3619c54551e223dd71158a02f8
2018-12-09 08:26:46 -08:00
Orion Reblitz-Richardson
1d5780d42c Remove Apache headers from source.
* The LICENSE file contains the details, so the headers are removed from individual source files.
2018-03-27 13:10:18 -07:00
Ilia Cherniavskii
d28720b90a Backpropagation for While op
Summary: Adds support for backprop to While op, fixes gradient computation for Pow

Reviewed By: azzolini

Differential Revision: D6456875

fbshipit-source-id: 9f660317ad6f3898ff7d8ce43098f85c3426409b
2017-12-18 16:03:45 -08:00
Yangqing Jia
8286ce1e3a Re-license to Apache
Summary: Closes https://github.com/caffe2/caffe2/pull/1260

Differential Revision: D5906739

Pulled By: Yangqing

fbshipit-source-id: e482ba9ba60b5337d9165f28f7ec68d4518a0902
2017-09-28 16:22:00 -07:00
Ilia Cherniavskii
f8f5e79f5f Backpropagation for If operator
Summary:
Adding backward pass support for If operator:
 - Implemented the necessary changes to the Do operator and to generation of the gradient Do operator to properly forward gradient blobs into and out of the subnet
 - Used WorkspaceManager to keep track of workspaces used by Do, in case we need access to local blobs to compute gradients (also important for the loop's backprop)
 - Updated Workspace to handle blob binding from multiple parent workspaces
 - Implemented generation of the gradient If operator
 - Added a unit test that builds and trains a net with an If control op
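The core idea of differentiating through a conditional can be sketched in plain Python (a toy model of the concept, not Caffe2's Do/If implementation):

```python
def if_forward(cond, then_fn, else_fn, x):
    # Forward: run exactly one branch, as the If operator does.
    return then_fn(x) if cond else else_fn(x)

def if_backward(cond, then_grad, else_grad, x, grad_out):
    # Backward: the gradient flows only through the branch that actually
    # ran in the forward pass; the untaken branch contributes nothing.
    return (then_grad(x) if cond else else_grad(x)) * grad_out

# Example: f(x) = x**2 if cond else 3*x, evaluated at x = 4.0
y = if_forward(True, lambda x: x * x, lambda x: 3 * x, 4.0)        # 16.0
g = if_backward(True, lambda x: 2 * x, lambda x: 3.0, 4.0, 1.0)    # 8.0
```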

Reviewed By: azzolini

Differential Revision: D5745096

fbshipit-source-id: 1023c90a2113716254424d1e50b9e560fe9083e5
2017-09-18 16:17:42 -07:00