| Name | Last commit message | Last commit date |
| --- | --- | --- |
| _composable | [DeviceMesh] Clean up the call into mesh_resouces to get root mesh (#165787) | 2025-10-21 02:54:04 +00:00 |
| _local_tensor | [10/N] Apply ruff UP035 rule (#165709) | 2025-10-25 00:20:13 +00:00 |
| _pycute | Enable all PIE rules on ruff (#165814) | 2025-10-18 07:36:18 +00:00 |
| _shard | [2/N] More ruff SIM fixes (#165031) | 2025-10-14 14:22:54 +00:00 |
| _sharded_tensor | | |
| _sharding_spec | | |
| _symmetric_memory | Add doc for Symmetric Memory (#166148) | 2025-10-25 03:41:15 +00:00 |
| _tensor | | |
| _tools | [2/N] More ruff SIM fixes (#165031) | 2025-10-14 14:22:54 +00:00 |
| algorithms | [2/N] More ruff SIM fixes (#165031) | 2025-10-14 14:22:54 +00:00 |
| autograd | | |
| benchmarks | Add pyrefly suppressions to torch/distributed (7/n) (#165002) | 2025-10-09 04:08:25 +00:00 |
| checkpoint | Clean up unused Pyrefly suppressions (#166178) | 2025-10-25 05:32:21 +00:00 |
| elastic | [pytorch][torchelastic] Duplicate stdout and stderr and apply custom filter in torchrun (#160712) | 2025-10-23 14:22:21 +00:00 |
| examples | | |
| fsdp | [DeviceMesh] Clean up the call into mesh_resouces to get root mesh (#165787) | 2025-10-21 02:54:04 +00:00 |
| launcher | [pytorch][torchelastic] Duplicate stdout and stderr and apply custom filter in torchrun (#160712) | 2025-10-23 14:22:21 +00:00 |
| nn | Add pyrefly suppressions to torch/distributed (7/n) (#165002) | 2025-10-09 04:08:25 +00:00 |
| optim | [2/N] More ruff SIM fixes (#165031) | 2025-10-14 14:22:54 +00:00 |
| pipelining | Clean up unused Pyrefly suppressions (#166178) | 2025-10-25 05:32:21 +00:00 |
| rpc | Enable all flake8-logging-format rules (#164655) | 2025-10-19 00:59:28 +00:00 |
| tensor | Clean up unused Pyrefly suppressions (#166178) | 2025-10-25 05:32:21 +00:00 |
| __init__.py | [RFC] Add pyrefly to lintrunner (#165179) | 2025-10-16 20:07:09 +00:00 |
| _checkpointable.py | | |
| _composable_state.py | Revert "[distributed] Replace assert statements with AssertionError exceptions (#165216)" | 2025-10-14 17:05:16 +00:00 |
| _dist2.py | Revert "[distributed] Replace assert statements with AssertionError exceptions (#165216)" | 2025-10-14 17:05:16 +00:00 |
| _functional_collectives_impl.py | Revert "[distributed] Replace assert statements with AssertionError exceptions (#165216)" | 2025-10-14 17:05:16 +00:00 |
| _functional_collectives.py | [RFC] Add pyrefly to lintrunner (#165179) | 2025-10-16 20:07:09 +00:00 |
| _mesh_layout.py | [DeviceMesh] Simplify unflatten method (#165556) | 2025-10-17 17:57:51 +00:00 |
| _serialization.py | distributed/serialization: support zero sized tensors (#164198) | 2025-09-30 08:11:29 +00:00 |
| _state_dict_utils.py | Enable all SIM rules except disabled ones (#164645) | 2025-10-17 07:27:11 +00:00 |
| argparse_util.py | | |
| c10d_logger.py | Add pyrefly suppressions to torch/distributed (7/n) (#165002) | 2025-10-09 04:08:25 +00:00 |
| collective_utils.py | Revert "[distributed] Replace assert statements with AssertionError exceptions (#165216)" | 2025-10-14 17:05:16 +00:00 |
| constants.py | Revert "[RELAND] Always build USE_DISTRIBUTED (#160449) and Make distributed modules importable even when backend not built (#159889) (#162594)" | 2025-09-25 13:47:46 +00:00 |
| CONTRIBUTING.md | | |
| device_mesh.py | Revert "[DeviceMesh] Implement a device mesh concatenate api for submesh and SPMD use case (#163358)" | 2025-10-24 15:58:54 +00:00 |
| distributed_c10d.py | Clean up unused Pyrefly suppressions (#166178) | 2025-10-25 05:32:21 +00:00 |
| launch.py | | |
| logging_handlers.py | | |
| remote_device.py | Add pyrefly suppressions to torch/distributed (7/n) (#165002) | 2025-10-09 04:08:25 +00:00 |
| rendezvous.py | Revert "[distributed] Replace assert statements with AssertionError exceptions (#165216)" | 2025-10-14 17:05:16 +00:00 |
| run.py | [pytorch][torchelastic] Duplicate stdout and stderr and apply custom filter in torchrun (#160712) | 2025-10-23 14:22:21 +00:00 |
| utils.py | Clean up unused Pyrefly suppressions (#166178) | 2025-10-25 05:32:21 +00:00 |