| Name | Last commit message | Last commit date |
| --- | --- | --- |
| _composable | [BE][PYFMT] migrate PYFMT for torch.{distributed,distributions} to ruff format (#144547) | 2025-02-28 07:35:56 +00:00 |
| _shard | [BE][PYFMT] migrate PYFMT for torch.{distributed,distributions} to ruff format (#144547) | 2025-02-28 07:35:56 +00:00 |
| _sharded_tensor | [BE][PYFMT] migrate PYFMT for torch.{distributed,distributions} to ruff format (#144547) | 2025-02-28 07:35:56 +00:00 |
| _sharding_spec | [BE][Easy] enable UFMT for torch/distributed/ (#128870) | 2024-06-22 18:53:28 +00:00 |
| _symmetric_memory | [Async TP] More robust support for rowwise scales when fusing matmul reduce-scatter (#149247) | 2025-03-27 03:15:30 +00:00 |
| _tensor | [BE][PYFMT] migrate PYFMT for torch.{distributed,distributions} to ruff format (#144547) | 2025-02-28 07:35:56 +00:00 |
| _tools | Add support for non functional collectives under FakeTensorMode and fake_pg for memory tracking (#147566) | 2025-03-08 18:00:49 +00:00 |
| algorithms | [BE][PYFMT] migrate PYFMT for torch.{distributed,distributions} to ruff format (#144547) | 2025-02-28 07:35:56 +00:00 |
| autograd | [BE][Easy] enable UFMT for torch/distributed/{algorithms,autograd,benchmarks,checkpoint,elastic}/ (#128866) | 2024-06-18 13:51:53 +00:00 |
| benchmarks | [BE][CI] bump ruff to 0.8.4 (#143753) | 2024-12-24 12:24:10 +00:00 |
| checkpoint | Fix bug in _load_state_dict_from_keys method (#150058) | 2025-03-27 16:36:00 +00:00 |
| elastic | Expose the rendezvous keepalive arguments (#145228) | 2025-03-03 19:11:56 +00:00 |
| examples | [BE][Easy] enable UFMT for torch/distributed/ (#128870) | 2024-06-22 18:53:28 +00:00 |
| fsdp | [dynamo] add dynamo disable reasons to codebase (#150440) | 2025-04-02 04:26:48 +00:00 |
| launcher | PEP585 update - torch/distributed (#145164) | 2025-01-21 04:23:29 +00:00 |
| nn | [BE][PYFMT] migrate PYFMT for torch.{distributed,distributions} to ruff format (#144547) | 2025-02-28 07:35:56 +00:00 |
| optim | [BE][Ez]: Use itertools.chain.from_iterable when possible (#148190) | 2025-03-06 20:37:06 +00:00 |
| pipelining | [BE][PYFMT] migrate PYFMT for torch.{distributed,distributions} to ruff format (#144547) | 2025-02-28 07:35:56 +00:00 |
| rpc | [BE][PYFMT] migrate PYFMT for torch.{distributed,distributions} to ruff format (#144547) | 2025-02-28 07:35:56 +00:00 |
| tensor | [dtensor][tp] add a ParallelStyle PrepareModuleInputOutput (#150372) | 2025-04-01 19:15:43 +00:00 |
| __init__.py | [c10d] Add a collective time estimator for NCCL comms (#149343) | 2025-03-19 07:54:02 +00:00 |
| _checkpointable.py | PEP585 update - torch/distributed (#145164) | 2025-01-21 04:23:29 +00:00 |
| _composable_state.py | [FSDP2] Make module-to-state mapping use weakrefs (#139650) | 2024-11-05 02:16:52 +00:00 |
| _functional_collectives_impl.py | PEP585 update - torch/distributed (#145164) | 2025-01-21 04:23:29 +00:00 |
| _functional_collectives.py | [Async TP] More robust support for rowwise scales when fusing matmul reduce-scatter (#149247) | 2025-03-27 03:15:30 +00:00 |
| _serialization.py | PEP585: More UP006 fixes (#146392) | 2025-02-20 06:18:13 +00:00 |
| _state_dict_utils.py | Create and send full_tensor on ProcessGroup-supported device in _broadcast_tensors (#148865) | 2025-03-12 20:56:31 +00:00 |
| argparse_util.py | Flip default value for mypy disallow_untyped_defs [5/11] (#127842) | 2024-06-08 18:49:18 +00:00 |
| c10d_logger.py | PEP585 update - torch/distributed (#145164) | 2025-01-21 04:23:29 +00:00 |
| collective_utils.py | PEP585 update - torch/distributed (#145164) | 2025-01-21 04:23:29 +00:00 |
| constants.py | [BE][Easy] enable UFMT for torch/distributed/ (#128870) | 2024-06-22 18:53:28 +00:00 |
| CONTRIBUTING.md | Clean up distributed/CONTRIBUTING.md (#128450) | 2024-06-22 02:41:22 +00:00 |
| device_mesh.py | [DeviceMesh] Add some documentation for from_group API and add a 2D test (#146364) | 2025-03-01 00:57:37 +00:00 |
| distributed_c10d.py | [Reland] Launch kernel on current stream & remove record_stream entirely (#150398) | 2025-04-01 16:46:07 +00:00 |
| launch.py | [BE][PYFMT] migrate PYFMT for torch.{distributed,distributions} to ruff format (#144547) | 2025-02-28 07:35:56 +00:00 |
| logging_handlers.py | PEP585 update - torch/distributed (#145164) | 2025-01-21 04:23:29 +00:00 |
| remote_device.py | [BE][Easy] fix ruff rule needless-bool (SIM103) (#130206) | 2024-07-14 08:17:52 +00:00 |
| rendezvous.py | Fix dist.init_process_group on windows (#148266) | 2025-03-05 00:07:56 +00:00 |
| run.py | [BE][PYFMT] migrate PYFMT for torch.{distributed,distributions} to ruff format (#144547) | 2025-02-28 07:35:56 +00:00 |
| utils.py | [BE][PYFMT] migrate PYFMT for torch.{distributed,distributions} to ruff format (#144547) | 2025-02-28 07:35:56 +00:00 |