pytorch/torch/distributed/tensor/parallel
zeshengzong cb71bcc542 Replace clone.detach with detach.clone (#140264)
Fixes #64532

As stated in the issue, replace `clone.detach` with `detach.clone`

Pull Request resolved: https://github.com/pytorch/pytorch/pull/140264
Approved by: https://github.com/soulitzer
2024-11-13 07:01:02 +00:00
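
Both orderings produce the same detached copy, but detaching first means the subsequent clone is not recorded by autograd, so `detach().clone()` avoids unnecessary graph bookkeeping. A minimal sketch illustrating the equivalence (the tensor and variable names are illustrative, not from the patch):

```python
import torch

# A tensor that participates in autograd.
x = torch.randn(4, requires_grad=True)

# clone().detach(): the clone op is recorded in the autograd
# graph before the result is detached -- extra bookkeeping.
a = x.clone().detach()

# detach().clone(): detach first, so the clone is performed on a
# tensor outside the graph -- same result, slightly cheaper.
b = x.detach().clone()

assert torch.equal(a, b)
assert not a.requires_grad and not b.requires_grad
```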
__init__.py
_data_parallel_utils.py [reland][dtensor] move DTensor to public namespace (#134203) 2024-09-08 17:08:40 +00:00
_utils.py Revert "Deprecate torch._utils.is_compiling() and torch._dynamo.external_utils.is_compiling() (#127690)" 2024-11-05 23:10:38 +00:00
api.py Allow parallelize_module to get device_mesh from ambient context (#134247) 2024-10-09 00:19:03 +00:00
ddp.py [DDP][FSDP2] keep DTensor params for replicate(fully_shard) (#133059) 2024-08-09 18:38:05 +00:00
fsdp.py Replace clone.detach with detach.clone (#140264) 2024-11-13 07:01:02 +00:00
input_reshard.py [BE]: Update mypy to 1.11.2 (#133816) 2024-09-16 19:44:11 +00:00
loss.py Remove unused Python variables in torch/[b-z]* (#136963) 2024-10-19 16:45:22 +00:00
style.py [reland][dtensor] move DTensor to public namespace (#134203) 2024-09-08 17:08:40 +00:00