Aaron Orenstein | 0b2a3687b9 | 2025-01-20 18:11:54 +00:00
PEP585 update - torch/fx (#145166)
See #145101 for details.
Pull Request resolved: https://github.com/pytorch/pytorch/pull/145166
Approved by: https://github.com/bobrenjc93

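For context, PEP 585 lets the built-in collection types be subscripted as generics, replacing the older `typing` aliases. A minimal sketch of the style change (hypothetical function, not code from the diff):

```python
# Before (pre-PEP 585): generic aliases imported from typing
from typing import Dict, List, Optional

def node_users_old(edges: List[int]) -> Dict[int, Optional[str]]:
    return {e: None for e in edges}

# After (PEP 585, Python 3.9+): builtins subscripted directly.
def node_users(edges: list[int]) -> dict[int, str | None]:
    # `str | None` (PEP 604) needs Python 3.10+, or
    # `from __future__ import annotations` on 3.9.
    return {e: None for e in edges}
```
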
Xuehai Pan | abbd71d29d | 2024-10-21 19:15:49 +00:00
[BE][Easy] enable PYFMT for torch.fx (#138443)
Reproduce command:
```bash
# Check out the PR stack, then restore torch/ to its pre-PR state
ghstack checkout https://github.com/pytorch/pytorch/pull/138443
git checkout HEAD~1 torch/
# Re-apply only the PYFMT formatter across all files
lintrunner -a --take "PYFMT" --all-files
```
Pull Request resolved: https://github.com/pytorch/pytorch/pull/138443
Approved by: https://github.com/ezyang

Aaron Orenstein | ed86ac2f25 | 2024-08-26 04:00:27 +00:00
[BE] typing for decorators - fx/_compatibility (#134054)
Summary: See #131429
Test Plan: unit tests pass
Differential Revision: D61493706
Pull Request resolved: https://github.com/pytorch/pytorch/pull/134054
Approved by: https://github.com/oulgen

PyTorch MergeBot | 945bf78894 | 2024-07-28 03:43:39 +00:00
Revert "[BE] typing for decorators - fx/_compatibility (#131568)"
This reverts commit 193f62fde9.
Reverted https://github.com/pytorch/pytorch/pull/131568 on behalf of https://github.com/clee2000 due to the same reason as https://github.com/pytorch/pytorch/pull/131572#issuecomment-2254328359 (that comment was left via the wrong link by accident; the failure actually starts here) ([comment](https://github.com/pytorch/pytorch/pull/131568#issuecomment-2254330781))

Aaron Orenstein | 193f62fde9 | 2024-07-25 22:24:19 +00:00
[BE] typing for decorators - fx/_compatibility (#131568)
See #131429
Pull Request resolved: https://github.com/pytorch/pytorch/pull/131568
Approved by: https://github.com/justinchuby, https://github.com/oulgen, https://github.com/zou3519

Aaron Orenstein | 5a0068cc69 | 2024-07-23 21:50:55 +00:00
[BE] mypy: disallow untyped decorators (#131428)
Untyped decorators strip the types from the functions they decorate, so even if the underlying function is fully typed, its callers get no benefit from the annotations.
Step 1 - Enable the error and add overrides in all the offending files.
See #131429
Pull Request resolved: https://github.com/pytorch/pytorch/pull/131428
Approved by: https://github.com/justinchuby, https://github.com/oulgen

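For context, the mypy error being enabled here is `disallow_untyped_decorators`. A minimal sketch of the problem and the fix, using `ParamSpec` to preserve the decorated signature (hypothetical decorator names, assuming Python 3.10+):

```python
from typing import Callable, ParamSpec, TypeVar  # ParamSpec: Python 3.10+

P = ParamSpec("P")
R = TypeVar("R")

# Untyped decorator: mypy treats the decorated function as untyped,
# so callers lose the original signature entirely.
def untyped(fn):
    def wrapper(*args, **kwargs):
        return fn(*args, **kwargs)
    return wrapper

# Typed decorator: ParamSpec threads the exact parameter and return
# types of `fn` through the wrapper, so the signature is preserved.
def typed(fn: Callable[P, R]) -> Callable[P, R]:
    def wrapper(*args: P.args, **kwargs: P.kwargs) -> R:
        return fn(*args, **kwargs)
    return wrapper

@typed
def add(x: int, y: int) -> int:
    return x + y

# Under mypy, `add` keeps its signature: reveal_type(add) shows
# "def (x: int, y: int) -> int"; with @untyped it would degrade to Any.
```
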
anjali411 | 4bf076e964 | 2022-07-08 14:31:24 +00:00
Add __all__ to torch.distributed, futures, fx, nn, package, benchmark submodules (#80520)
Pull Request resolved: https://github.com/pytorch/pytorch/pull/80520
Approved by: https://github.com/rohan-varma

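For context, `__all__` declares a module's public API. A minimal sketch of the convention these changes apply (hypothetical module, not code from the PR):

```python
# example_module.py -- hypothetical module illustrating the convention

# Names listed here are the module's public API: they are what
# `from example_module import *` exports, and what linters and type
# checkers treat as intentionally public.
__all__ = ["public_fn"]

def public_fn() -> str:
    return "part of the public API"

def _internal_helper() -> str:
    # Leading underscore plus omission from __all__ marks this private.
    return "implementation detail"
```
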
PyTorch MergeBot | 58532256e9 | 2022-06-29 16:20:55 +00:00
Revert "Add __all__ for torch.distributed and fx modules (#80460)"
This reverts commit 5d40c3d5c8.
Reverted https://github.com/pytorch/pytorch/pull/80460 on behalf of https://github.com/malfet due to broken macOS testing, see https://github.com/pytorch/pytorch/runs/7105579664?check_suite_focus=true

anjali411 | 5d40c3d5c8 | 2022-06-29 02:53:56 +00:00
Add __all__ for torch.distributed and fx modules (#80460)
Pull Request resolved: https://github.com/pytorch/pytorch/pull/80460
Approved by: https://github.com/albanD, https://github.com/rohan-varma

James Reed | 0559cb37cd | 2021-09-17 09:32:43 -07:00
[FX] Ensure BC coverage for all of torch.fx.passes (#65081)
Summary: Pull Request resolved: https://github.com/pytorch/pytorch/pull/65081
Test Plan: Imported from OSS
Reviewed By: jbschlosser, khabinov
Differential Revision: D30967428
Pulled By: jamesr66a
fbshipit-source-id: 2ff83da728dc469f086cf504e71b43396db612d8

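For context, torch.fx tracks backward-compatibility (BC) coverage by marking APIs with the `@compatibility` decorator from `torch.fx._compatibility` (the same module whose typing the PRs above address). A rough sketch of the marking pattern, assuming a hypothetical pass; this is an internal API whose details may differ across versions:

```python
from torch.fx import GraphModule
from torch.fx._compatibility import compatibility

# Marking an API as backward compatible registers it with the set of
# signatures that torch.fx's BC tests check against.
@compatibility(is_backward_compatible=True)
def my_pass(gm: GraphModule) -> GraphModule:
    """A hypothetical pass; BC-marked APIs get stability guarantees."""
    return gm
```
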
James Reed | cf7409e184 | 2021-09-17 09:32:40 -07:00
[FX] Move graph_manipulation and param_fetch out of experimental and into passes (#65183)
Summary:
Pull Request resolved: https://github.com/pytorch/pytorch/pull/65183
ghstack-source-id: 138309655
Test Plan: waitforsadcastle
Reviewed By: protonu
Differential Revision: D31007630
fbshipit-source-id: 77d14b284737aabbe2b9e6394177a0c2e40aafba