[distributed][docs] Delete distributed optimizer section from RPC and add reference to namespace docs page (#68068)

Summary:
Pull Request resolved: https://github.com/pytorch/pytorch/pull/68068

cc pietern mrshenli pritamdamania87 zhaojuanmao satgera rohan-varma gqchen aazzolini osalpekar jiayisuse SciPioneer H-Huang

Test Plan: Imported from OSS

Reviewed By: pritamdamania87

Differential Revision: D32286554

Pulled By: jamesr66a

fbshipit-source-id: a43fe1f0cfa74721f467b128f2e878bd02f32546
Author: James Reed (2021-11-09 14:57:52 -08:00), committed by Facebook GitHub Bot
Parent: 7c90bd77ec
Commit: eaf0457eef
2 changed files with 5 additions and 12 deletions

docs/source/distributed.optim.rst

@@ -4,11 +4,8 @@
 Distributed Optimizers
 ======================
-.. autoclass:: torch.distributed.optim.DistributedOptimizer
-    :members:
 .. warning ::
     Distributed optimizer is not currently supported when using CUDA tensors
-.. autoclass:: torch.distributed.optim.PostLocalSGDOptimizer
-    :members:
-.. autoclass:: torch.distributed.optim.ZeroRedundancyOptimizer
-    :members:
+.. automodule:: torch.distributed.optim
+    :members: DistributedOptimizer, PostLocalSGDOptimizer, ZeroRedundancyOptimizer
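For context, the change above swaps per-class `.. autoclass::` directives for a single `.. automodule::` with an explicit `:members:` list, so all three optimizers are documented from one place. Below is a minimal sketch of the documented `DistributedOptimizer` flow with distributed autograd over RPC; the worker name and toy forward pass are illustrative, and it assumes `rpc.init_rpc` has already been called on each participating process.

```python
import torch
import torch.distributed.autograd as dist_autograd
import torch.distributed.rpc as rpc
from torch import optim
from torch.distributed.optim import DistributedOptimizer

# Assumes rpc.init_rpc(...) has run on this process and on "worker1".
with dist_autograd.context() as context_id:
    # Forward pass: create remote references to values living on worker1.
    rref1 = rpc.remote("worker1", torch.add, args=(torch.ones(2), 3))
    rref2 = rpc.remote("worker1", torch.add, args=(torch.ones(2), 1))
    loss = rref1.to_here() + rref2.to_here()

    # Backward pass through the distributed autograd context.
    dist_autograd.backward(context_id, [loss.sum()])

    # DistributedOptimizer takes a local optimizer class plus RRefs to
    # the parameters it should update, and steps them on their owners.
    dist_optim = DistributedOptimizer(
        optim.SGD,
        [rref1, rref2],
        lr=0.05,
    )
    dist_optim.step(context_id)
```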

docs/source/rpc.rst

@@ -261,11 +261,7 @@ using RPC. For more details see :ref:`distributed-autograd-design`.
 Distributed Optimizer
 ---------------------
-.. warning ::
-    Distributed optimizer is not currently supported when using CUDA tensors
-.. automodule:: torch.distributed.optim
-    :members: DistributedOptimizer
+See the `torch.distributed.optim <https://pytorch.org/docs/master/distributed.optim.html>`__ page for documentation on distributed optimizers.
 
 Design Notes
 ------------
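Of the members now documented on the namespace page, `ZeroRedundancyOptimizer` is the one commonly used outside RPC. Here is a minimal sketch of its documented pairing with `DistributedDataParallel`, assuming a default process group has already been initialized (e.g. via `torchrun`); the model size, rank handling, and hyperparameters are illustrative.

```python
import torch
import torch.distributed as dist
from torch.distributed.optim import ZeroRedundancyOptimizer
from torch.nn.parallel import DistributedDataParallel as DDP

def train_step(rank: int):
    # Assumes dist.init_process_group(...) has already been called.
    model = torch.nn.Linear(2000, 2000).to(rank)
    ddp_model = DDP(model, device_ids=[rank])

    # Each rank keeps optimizer state only for its own shard of the
    # parameters, reducing per-rank optimizer memory.
    optimizer = ZeroRedundancyOptimizer(
        ddp_model.parameters(),
        optimizer_class=torch.optim.Adam,
        lr=0.01,
    )

    loss = ddp_model(torch.randn(20, 2000).to(rank)).sum()
    loss.backward()
    optimizer.step()
    optimizer.zero_grad()
```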