PyTorch MergeBot
033e733021
Revert "[BE] wrap deprecated function/class with typing_extensions.deprecated ( #126898 )"
...
This reverts commit 749a132fb0.
Reverted https://github.com/pytorch/pytorch/pull/126898 on behalf of https://github.com/fbgheith due to switching typing-extensions=4.3.0 to 4.9.0 causes internal failure ([comment](https://github.com/pytorch/pytorch/pull/126898#issuecomment-2142884456))
2024-05-31 19:47:24 +00:00
Xuehai Pan
749a132fb0
[BE] wrap deprecated function/class with typing_extensions.deprecated (#126898)
...
Use `typing_extensions.deprecated` for the deprecation annotation where possible. Otherwise, add `category=FutureWarning` to `warnings.warn("message")` calls that are missing a category.
Note that only warnings whose messages contain `[Dd]eprecat(ed|ion)` are updated in this PR.
UPDATE: Use `FutureWarning` instead of `DeprecationWarning`.
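A minimal sketch of the two patterns (the function names here are placeholders, not code from this PR):

```python
import warnings

from typing_extensions import deprecated


@deprecated("`old_fn` is deprecated, use `new_fn` instead", category=FutureWarning)
def old_fn() -> None:
    ...


def legacy_helper() -> None:
    # Fallback for cases the decorator cannot cover: spell out the category.
    warnings.warn(
        "`legacy_helper` is deprecated", category=FutureWarning, stacklevel=2
    )
```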
Resolves #126888
- #126888
Pull Request resolved: https://github.com/pytorch/pytorch/pull/126898
Approved by: https://github.com/albanD
2024-05-29 12:09:27 +00:00
Oguz Ulgen
52bcf120e5
Make inductor config hashing more portable (#127022)
...
Summary: masnesral and I noticed that the config contains non-portable artifacts. Let's fix that.
Test Plan: ad hoc testing
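A rough sketch of the general idea (hypothetical helper, not the actual inductor code): hash only the values that are stable across machines.

```python
import hashlib
import json


def portable_config_hash(config: dict, non_portable_keys: set) -> str:
    # Drop machine-specific entries (e.g. absolute paths) before hashing so
    # the same logical config hashes identically everywhere.
    portable = {k: v for k, v in config.items() if k not in non_portable_keys}
    payload = json.dumps(portable, sort_keys=True, default=repr)
    return hashlib.sha256(payload.encode()).hexdigest()
```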
Differential Revision: D57748025
Pull Request resolved: https://github.com/pytorch/pytorch/pull/127022
Approved by: https://github.com/masnesral
2024-05-25 03:01:33 +00:00
Edward Z. Yang
46712b019d
Enable local_partial_types (#118467)
...
When using dmypy, this setting is enabled and cannot be turned off. Force it for regular mypy too.
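For context, a small example of what the flag changes (`local_partial_types` is a real mypy option; the class here is hypothetical):

```python
from typing import Optional


class Widget:
    # Without local_partial_types, mypy may complete the type of a bare
    # `label = None` here from the assignment in set_label below. With the
    # flag on, the attribute must be annotated where it is defined:
    label: Optional[int] = None

    def set_label(self, value: int) -> None:
        self.label = value
```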
Signed-off-by: Edward Z. Yang <ezyang@meta.com>
Pull Request resolved: https://github.com/pytorch/pytorch/pull/118467
Approved by: https://github.com/Skylion007
ghstack dependencies: #118414, #118418, #118432
2024-01-28 13:38:22 +00:00
Jason Ansel
e5e9f390be
[dynamo] Optimize overheads from _TorchDynamoContext (#118070)
...
Based on `python benchmarks/dynamo/microbenchmarks/overheads.py`:
- Before: `18.1us`
- After: `12.2us`
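Roughly the shape of such a measurement (a sketch, not the actual contents of overheads.py):

```python
import timeit

import torch


@torch.compile
def add_one(x):
    return x + 1


x = torch.randn(1)
add_one(x)  # warm-up call triggers the one-time compilation

# Steady-state calls measure the _TorchDynamoContext dispatch overhead.
n = 10_000
per_call_us = timeit.timeit(lambda: add_one(x), number=n) / n * 1e6
print(f"{per_call_us:.1f}us per call")
```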
Pull Request resolved: https://github.com/pytorch/pytorch/pull/118070
Approved by: https://github.com/yanboliang, https://github.com/anijain2305
ghstack dependencies: #118065
2024-01-25 05:04:56 +00:00
David Berard
5c0976fa04
Revert "[dynamo] guarded config ( #111299 )" ( #115386 )
...
This reverts commit 5927e9cbf2.
Differential Revision: [D51959266](https://our.internmc.facebook.com/intern/diff/D51959266)
Pull Request resolved: https://github.com/pytorch/pytorch/pull/115386
Approved by: https://github.com/yanboliang, https://github.com/malfet
ghstack dependencies: #115384, #115401, #115385
2023-12-11 19:35:42 +00:00
Jon Chuang
00b67193ef
[utils] move config_typing.pyi to torch.utils (#113929)
...
Pull Request resolved: https://github.com/pytorch/pytorch/pull/113929
Approved by: https://github.com/ezyang, https://github.com/jansel
ghstack dependencies: #111299, #111300, #113901, #113916
2023-11-17 18:51:57 +00:00
Jon Chuang
5927e9cbf2
[dynamo] guarded config (#111299)
...
---
Fixes: https://github.com/pytorch/pytorch/issues/110682
Replaces: https://github.com/pytorch/pytorch/pull/111074
The guards are installed based on the config in effect at the call to `torch.compile`, rather than at any subsequent call / triggered compilation. If a later compilation finds that the global config no longer matches the saved config, the saved config is restored (see the sketch after the checklist below).
TODO:
- [X] add tests
Follow up PRs:
- [x] add revised cache size computation (follow-up PR: #111300, based on: https://github.com/pytorch/pytorch/pull/107496)
- [ ] handle run-only mode?
- [ ] config restoration itself is not thread-safe (tracked: https://github.com/pytorch/pytorch/issues/111150)
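A hedged illustration of the intended behavior (this commit was later reverted, so current PyTorch may not behave this way):

```python
import torch
import torch._dynamo.config as dynamo_config


def fn(x):
    return x + 1


compiled = torch.compile(fn)  # the config in effect here is what gets saved
compiled(torch.randn(3))      # first compilation runs under that saved config

dynamo_config.cache_size_limit = 1  # mutate a global config knob afterwards

# Per the description above, a later compilation that detects the mismatch
# between the global config and the saved config restores the saved config
# rather than picking up the mutated value.
compiled(torch.randn(3))
```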
Pull Request resolved: https://github.com/pytorch/pytorch/pull/111299
Approved by: https://github.com/ezyang
2023-11-17 09:59:58 +00:00
Jez Ng
dc63248b76
Make dynamo configs more amenable to static type checking (#112130)
...
`install_config_module` makes a regular module into a ConfigModule with
extra methods defined on it. mypy thinks those extra methods (or module
functions) are undefined since it cannot analyze something so
dynamic. As a workaround, I've created a fake module that defines these
extra functions, which I import into the config modules during type
checking.
As part of this change, I've also added more types to config_utils.py
and enabled typechecking for torch/_dynamo/config.py.
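The resulting pattern at the top of a config module looks roughly like this (module paths per the neighboring PRs; treat them as approximate):

```python
import sys
from typing import TYPE_CHECKING

if TYPE_CHECKING:
    # Fake module declaring the methods install_config_module adds at
    # runtime, so mypy sees them as defined.
    from torch.utils._config_typing import *  # noqa: F401, F403

from torch.utils._config_module import install_config_module

some_flag = True  # ordinary config values are plain module globals

# Rewrites this module in place, attaching the ConfigModule machinery.
install_config_module(sys.modules[__name__])
```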
Pull Request resolved: https://github.com/pytorch/pytorch/pull/112130
Approved by: https://github.com/jansel
2023-11-08 21:17:45 +00:00
Peter Bell
65ecb36621
Move ShapeEnv config out of dynamo (#112933)
...
Previously there was a circular dependency between fx and dynamo that happened
to work out since ShapeEnv didn't access the config at module init time.
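A two-file sketch (hypothetical module names) of why the cycle was only latent: a circular import breaks only if a name is read while the partner module is still initializing.

```python
# dyn_config.py -- hypothetical stand-in for the dynamo config module
import shape_env  # completes the cycle

VALIDATE = False
```

```python
# shape_env.py -- hypothetical stand-in for the fx ShapeEnv module
import dyn_config  # back-edge of the cycle


def create_env():
    # Deferred access: by the time this runs, dyn_config is fully
    # initialized. Reading dyn_config.VALIDATE at module init time instead
    # could raise AttributeError mid-import.
    return dyn_config.VALIDATE
```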
Pull Request resolved: https://github.com/pytorch/pytorch/pull/112933
Approved by: https://github.com/ezyang
2023-11-07 01:10:25 +00:00