anjali411 | 4bf076e964 | 2022-07-08 14:31:24 +00:00
Add __all__ to torch.distributed, futures, fx, nn, package, benchmark submodules (#80520)
Pull Request resolved: https://github.com/pytorch/pytorch/pull/80520
Approved by: https://github.com/rohan-varma
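For context on the `__all__` commits above: defining `__all__` in a module declares its public API, controlling what `from module import *` exports and what tooling treats as public. A minimal sketch (the `demo` module, `public_fn`, and `_private_fn` are hypothetical names, not PyTorch code):

```python
import sys
import types

# Build a throwaway module in memory to demonstrate __all__ semantics.
mod = types.ModuleType("demo")
exec(
    "__all__ = ['public_fn']\n"      # only public_fn is exported by *
    "def public_fn():\n"
    "    return 'public'\n"
    "def _private_fn():\n"
    "    return 'private'\n",
    mod.__dict__,
)
sys.modules["demo"] = mod

# Star-import pulls in only the names listed in __all__.
ns = {}
exec("from demo import *", ns)
print("public_fn" in ns)    # True
print("_private_fn" in ns)  # False
```

Without `__all__`, a star-import would fall back to every top-level name not starting with an underscore, which is why adding explicit `__all__` lists tightens the advertised public surface of these submodules.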
PyTorch MergeBot | 58532256e9 | 2022-06-29 16:20:55 +00:00
Revert "Add __all__ for torch.distributed and fx modules (#80460)"
This reverts commit 5d40c3d5c8.
Reverted https://github.com/pytorch/pytorch/pull/80460 on behalf of https://github.com/malfet due to broken MacOS testing, see https://github.com/pytorch/pytorch/runs/7105579664?check_suite_focus=true
anjali411 | 5d40c3d5c8 | 2022-06-29 02:53:56 +00:00
Add __all__ for torch.distributed and fx modules (#80460)
Pull Request resolved: https://github.com/pytorch/pytorch/pull/80460
Approved by: https://github.com/albanD, https://github.com/rohan-varma
James Reed | 0559cb37cd | 2021-09-17 09:32:43 -07:00
[FX] Ensure BC coverage for all of torch.fx.passes (#65081)
Summary: Pull Request resolved: https://github.com/pytorch/pytorch/pull/65081
Test Plan: Imported from OSS
Reviewed By: jbschlosser, khabinov
Differential Revision: D30967428
Pulled By: jamesr66a
fbshipit-source-id: 2ff83da728dc469f086cf504e71b43396db612d8
James Reed | cf7409e184 | 2021-09-17 09:32:40 -07:00
[FX] Move graph_manipulation and param_fetch out of experimental and into passes (#65183)
Summary: Pull Request resolved: https://github.com/pytorch/pytorch/pull/65183
ghstack-source-id: 138309655
Test Plan: waitforsadcastle
Reviewed By: protonu
Differential Revision: D31007630
fbshipit-source-id: 77d14b284737aabbe2b9e6394177a0c2e40aafba