| Name | Last commit message | Last commit date |
| --- | --- | --- |
| `_composable` | Add global registry to composable API contract (#90579) | 2022-12-10 22:41:10 +00:00 |
| `_shard` | Fix non-existing parameters in docstrings (#90505) | 2022-12-09 21:43:09 +00:00 |
| `_sharded_tensor` | | |
| `_sharding_spec` | Add \_\_all\_\_ for a few distributed modules plus a little typing (reland) (#84872) | 2022-09-13 21:57:49 +00:00 |
| `_spmd` | Remove eager mode support form CommTensor (#84978) | 2022-09-14 17:23:23 +00:00 |
| `_tensor` | [dtensor] delete unused torch_function (#90449) | 2022-12-10 01:29:02 +00:00 |
| `_tools` | [reland] add save and load stats in memory_tracker (#90510) | 2022-12-10 01:39:22 +00:00 |
| `algorithms` | Remove DDP import (#89982) | 2022-12-01 14:56:48 +00:00 |
| `autograd` | Integrate xdoctest - Rebased (#82797) | 2022-08-12 02:08:01 +00:00 |
| `benchmarks` | Fix typos in messages under torch (#89049) | 2022-11-17 04:18:14 +00:00 |
| `checkpoint` | [Checkpoint][2D][6/N] Add optimizer and update default_planner to core distributed (#90212) | 2022-12-08 02:53:29 +00:00 |
| `elastic` | Fix exception causes all over the codebase (#90271) | 2022-12-07 04:29:00 +00:00 |
| `examples` | add memory_tracker tool to help profiling memory usages (#88825) | 2022-11-29 06:42:57 +00:00 |
| `fsdp` | [FSDP] Fix _pre_forward type annotation (#90621) | 2022-12-11 06:39:38 +00:00 |
| `launcher` | Add \_\_all\_\_ to torch.distributed and tensorboard submodules (#80444) | 2022-06-28 16:33:22 +00:00 |
| `nn` | Fix non-existing parameters in docstrings (#90505) | 2022-12-09 21:43:09 +00:00 |
| `optim` | [PT-D][Easy] Reformat the optim code within PTD code base (#90399) | 2022-12-08 06:38:59 +00:00 |
| `pipeline` | Fix exception causes all over the codebase (#90271) | 2022-12-07 04:29:00 +00:00 |
| `rpc` | Fix non-existing parameters in docstrings (#90505) | 2022-12-09 21:43:09 +00:00 |
| `tensor` | Revert "remove torch.equal usages (#89527)" | 2022-12-02 21:36:13 +00:00 |
| `__init__.py` | Add torch.distributed.DistBackendError exception type, thrown from C10D_NCCL_CHECK (#88134) | 2022-11-08 13:26:42 +00:00 |
| `argparse_util.py` | | |
| `c10d_error_logger.py` | [C10D][BE] Add exception handlers to c10d collectives function (#87643) (#87988) | 2022-10-29 04:38:34 +00:00 |
| `constants.py` | | |
| `CONTRIBUTING.md` | Fix some links in torch/distributed/CONTRIBUTING.md (#79855) | 2022-06-21 00:48:30 +00:00 |
| `distributed_c10d.py` | Hybrid Sharded Data Parallel (#89915) | 2022-12-08 16:18:03 +00:00 |
| `launch.py` | Integrate xdoctest - Rebased (#82797) | 2022-08-12 02:08:01 +00:00 |
| `logging_handlers.py` | [C10D][BE] Add exception handlers to c10d collectives function (#87643) (#87988) | 2022-10-29 04:38:34 +00:00 |
| `remote_device.py` | Rewrite ShardedTensor.gather to use dist.gather instead of gather_object (#77272) | 2022-05-17 02:14:40 +00:00 |
| `rendezvous.py` | Fix exception causes all over the codebase (#90271) | 2022-12-07 04:29:00 +00:00 |
| `run.py` | Fix exception causes all over the codebase (#90271) | 2022-12-07 04:29:00 +00:00 |
| `utils.py` | [DDP] Add PackedSequence support when device_ids is specified (#86614) | 2022-10-10 21:50:59 +00:00 |