_composable
Add early_stop kwarg to torch.utils.checkpoint (#160781)
2025-08-26 22:32:35 +00:00
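A minimal sketch of the kwarg named in #160781, assuming it slots into the existing non-reentrant checkpoint API (the kwarg name is taken from the commit title; previously early stopping was toggled globally via `torch.utils.checkpoint.set_checkpoint_early_stop`):

```python
import torch
from torch.utils.checkpoint import checkpoint

def block(x):
    # Some activation-heavy segment worth recomputing in backward.
    return torch.relu(x @ x)

x = torch.randn(8, 8, requires_grad=True)
# early_stop=True asks backward recomputation to halt once all needed
# activations are rematerialized (kwarg assumed from the #160781 title).
y = checkpoint(block, x, use_reentrant=False, early_stop=True)
y.sum().backward()
```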
_pycute
[CuTe] Change the logic of pycute manipulation ops like coalesce, complement from co-lex to lex (#162690)
2025-09-16 19:53:45 +00:00
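To make the #162690 ordering change concrete, here is a small illustration (plain Python, independent of pycute itself): co-lexicographic enumeration varies the leftmost mode fastest, lexicographic the rightmost.

```python
from itertools import product

shape = (2, 3)

# Co-lexicographic: leftmost index varies fastest.
colex = [tuple(reversed(c)) for c in product(*map(range, reversed(shape)))]
# Lexicographic: rightmost index varies fastest.
lex = list(product(*map(range, shape)))

print(colex)  # [(0, 0), (1, 0), (0, 1), (1, 1), (0, 2), (1, 2)]
print(lex)    # [(0, 0), (0, 1), (0, 2), (1, 0), (1, 1), (1, 2)]
```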
_shard
[RELAND] Always build USE_DISTRIBUTED (#160449) and Make distributed modules importable even when backend not built (#159889) (#162594)
2025-09-22 21:12:18 +00:00
_sharded_tensor
_sharding_spec
_symmetric_memory
[RELAND] Always build USE_DISTRIBUTED (#160449) and Make distributed modules importable even when backend not built (#159889) (#162594)
2025-09-22 21:12:18 +00:00
_tensor
_tools
[RELAND] Always build USE_DISTRIBUTED (#160449) and Make distributed modules importable even when backend not built (#159889) (#162594)
2025-09-22 21:12:18 +00:00
algorithms
[RELAND] Always build USE_DISTRIBUTED (#160449) and Make distributed modules importable even when backend not built (#159889) (#162594)
2025-09-22 21:12:18 +00:00
autograd
[remove untyped defs] batch 1 (#157011)
2025-06-30 23:54:40 +00:00
benchmarks
checkpoint
[DCP] DTensor slice dequantization with proper block alignment (#163532)
2025-09-23 16:48:16 +00:00
elastic
[BE] Delete all pre py-3.10 checks (#163653)
2025-09-23 23:22:53 +00:00
examples
Support XPU in memory tracker (#150703)
2025-06-12 21:33:52 +00:00
fsdp
[FSDP2] idempotent reset_sharded_param: no-op if _local_tensor is already padded (#163130)
2025-09-18 09:20:37 +00:00
launcher
154849 Add support to handle SIGUSR1 and SIGUSR2 in multiprocessing (#160690)
2025-09-09 22:23:06 +00:00
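The general pattern behind #160690, sketched with the standard-library signal module (illustrative only; the PR wires this into torch.distributed's multiprocessing layer, and the names below are assumptions):

```python
import os
import signal

WORKER_PIDS: list[int] = []  # hypothetical registry, filled after spawn

def _propagate(signum, frame):
    # Re-raise the scheduler's warning signal to every worker process
    # so all ranks get a chance to react (e.g., checkpoint and exit).
    for pid in WORKER_PIDS:
        os.kill(pid, signum)

signal.signal(signal.SIGUSR1, _propagate)
signal.signal(signal.SIGUSR2, _propagate)
```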
nn
[RELAND] Always build USE_DISTRIBUTED (#160449) and Make distributed modules importable even when backend not built (#159889) (#162594)
2025-09-22 21:12:18 +00:00
optim
[BE][5/16] fix typos in torch/ (torch/distributed/) (#156315)
2025-06-23 02:57:28 +00:00
pipelining
Inspect schedule IR comms (#162996)
2025-09-16 16:59:06 +00:00
rpc
[RELAND] Always build USE_DISTRIBUTED (#160449) and Make distributed modules importable even when backend not built (#159889) (#162594)
2025-09-22 21:12:18 +00:00
tensor
Shortcut redistribution when num_shards == 1 (#163742)
2025-09-24 23:49:08 +00:00
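Context for #163742: on a mesh dimension with a single shard, `Shard(0)` and `Replicate()` describe the same data, so redistributing between them needs no communication. A single-rank sketch (the environment setup below stands in for torchrun):

```python
import os
import torch
import torch.distributed as dist
from torch.distributed.device_mesh import init_device_mesh
from torch.distributed.tensor import Replicate, Shard, distribute_tensor

os.environ.setdefault("MASTER_ADDR", "127.0.0.1")
os.environ.setdefault("MASTER_PORT", "29500")
dist.init_process_group("gloo", rank=0, world_size=1)

mesh = init_device_mesh("cpu", (1,))
t = distribute_tensor(torch.randn(4, 4), mesh, [Shard(0)])
# With num_shards == 1 this is a pure placement relabel; the commit
# lets it return without issuing any collective.
r = t.redistribute(mesh, [Replicate()])
dist.destroy_process_group()
```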
__init__.py
[RELAND] Always build USE_DISTRIBUTED (#160449) and Make distributed modules importable even when backend not built (#159889) (#162594)
2025-09-22 21:12:18 +00:00
_C_stubs.py
[RELAND] Always build USE_DISTRIBUTED (#160449) and Make distributed modules importable even when backend not built (#159889) (#162594)
2025-09-22 21:12:18 +00:00
_checkpointable.py
[BE]: Backport runtime_checkable perf improvements/behavior from 3.12 (#155130)
2025-06-06 13:28:05 +00:00
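For context on #155130: runtime-checkable protocols allow structural `isinstance()` checks, which 3.12 made cheaper. A minimal usage sketch (the `Checkpointable` name here is illustrative, not the module's actual protocol):

```python
from typing import Protocol, runtime_checkable

@runtime_checkable
class Checkpointable(Protocol):
    def state_dict(self) -> dict: ...

class Saver:
    def state_dict(self) -> dict:
        return {}

# isinstance() matches structurally: Saver never subclasses the
# protocol, it merely provides the required method.
print(isinstance(Saver(), Checkpointable))  # True
```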
_composable_state.py
_dist2.py
[RELAND] Always build USE_DISTRIBUTED (#160449) and Make distributed modules importable even when backend not built (#159889) (#162594)
2025-09-22 21:12:18 +00:00
_distributed_c10d.py
[RELAND] Always build USE_DISTRIBUTED (#160449) and Make distributed modules importable even when backend not built (#159889) (#162594)
2025-09-22 21:12:18 +00:00
_functional_collectives_impl.py
_functional_collectives.py
[RELAND] Always build USE_DISTRIBUTED (#160449) and Make distributed modules importable even when backend not built (#159889) (#162594)
2025-09-22 21:12:18 +00:00
_mesh_layout.py
[DeviceMesh] Make CuTe layout as mesh layout to be ready for using in DeviceMesh (#162414)
2025-09-15 17:04:41 +00:00
_serialization.py
[BE][5/16] fix typos in torch/ (torch/distributed/) (#156315)
2025-06-23 02:57:28 +00:00
_state_dict_utils.py
fix-unpin-memory-tensor-param (#160992)
2025-08-26 21:55:25 +00:00
argparse_util.py
c10d_logger.py
collective_utils.py
[C10D] add _summarize_ranks util (#160284)
2025-08-28 00:17:53 +00:00
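A generic sketch of what a rank-summarizing helper like the one added in #160284 does: compress a rank list into contiguous spans for readable logs. The function name and output format are assumptions, not the actual implementation:

```python
def summarize_ranks(ranks: list[int]) -> str:
    ranks = sorted(set(ranks))
    spans, start = [], ranks[0]
    for prev, cur in zip(ranks, ranks[1:]):
        if cur != prev + 1:  # gap found: close the current span
            spans.append((start, prev))
            start = cur
    spans.append((start, ranks[-1]))
    return ", ".join(f"{a}" if a == b else f"{a}-{b}" for a, b in spans)

print(summarize_ranks([0, 1, 2, 3, 8, 9, 10]))  # "0-3, 8-10"
```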
constants.py
[RELAND] Always build USE_DISTRIBUTED (#160449) and Make distributed modules importable even when backend not built (#159889) (#162594)
2025-09-22 21:12:18 +00:00
CONTRIBUTING.md
fix torch/distributed contributing doc (#158934)
2025-07-28 17:01:05 +00:00
device_mesh.py
[RELAND] Always build USE_DISTRIBUTED (#160449) and Make distributed modules importable even when backend not built (#159889) (#162594)
2025-09-22 21:12:18 +00:00
distributed_c10d.py
[RELAND] Always build USE_DISTRIBUTED (#160449) and Make distributed modules importable even when backend not built (#159889) (#162594)
2025-09-22 21:12:18 +00:00
launch.py
logging_handlers.py
remote_device.py
rendezvous.py
[BE][5/16] fix typos in torch/ (torch/distributed/) (#156315)
2025-06-23 02:57:28 +00:00
run.py
Support XPU in --nproc-per-node option to torchrun (#159474)
2025-09-12 08:32:04 +00:00
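Companion to #159474: the programmatic analogue of letting --nproc-per-node follow the accelerator count, here queried for Intel GPUs (sketch; requires an XPU-enabled build, where torch.xpu mirrors the torch.cuda device API):

```python
import torch

# What torchrun would resolve --nproc-per-node to on an XPU machine
# (the exact CLI value accepted is defined by #159474, not shown here).
nproc = torch.xpu.device_count() if torch.xpu.is_available() else 1
print(f"launching {nproc} local worker(s)")
```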
utils.py