[Docs] Fix docstring format (#99396)
Fixes #ISSUE_NUMBER
Pull Request resolved: https://github.com/pytorch/pytorch/pull/99396
Approved by: https://github.com/awgu
This commit is contained in: parent 64efd88845, commit b51f92ebda
@@ -216,7 +216,7 @@ class FullyShardedDataParallel(nn.Module, _FSDPState):
     Args:
         module (nn.Module):
             This is the module to be wrapped with FSDP.
-        process_group: Optional[Union[ProcessGroup, Tuple[ProcessGroup, ProcessGroup]]]
+        process_group (Optional[Union[ProcessGroup, Tuple[ProcessGroup, ProcessGroup]]]):
             This is the process group used for collective communications and
             the one over which the model is sharded. For hybrid sharding strategies such as
             ``ShardingStrategy.HYBRID_SHARD`` users can
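A minimal sketch of the constructor usage this docstring describes, assuming torch.distributed has been initialized (for example via torchrun) and CUDA is available; the module and variable names below are illustrative, not part of the diff:

    import torch
    import torch.distributed as dist
    from torch.distributed.fsdp import FullyShardedDataParallel as FSDP
    from torch.distributed.fsdp import ShardingStrategy

    dist.init_process_group("nccl")
    torch.cuda.set_device(dist.get_rank() % torch.cuda.device_count())

    # A single ProcessGroup: collectives run over it and the model is sharded over it.
    fsdp_model = FSDP(torch.nn.Linear(8, 8).cuda(), process_group=dist.group.WORLD)

    # With ShardingStrategy.HYBRID_SHARD a (shard, replicate) tuple of process groups
    # may be passed instead; omitting process_group lets FSDP construct the groups.
    hybrid_model = FSDP(
        torch.nn.Linear(8, 8).cuda(),
        sharding_strategy=ShardingStrategy.HYBRID_SHARD,
    )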
@@ -1458,7 +1458,7 @@ class FullyShardedDataParallel(nn.Module, _FSDPState):
                 corresponding to the unflattened parameters and holding the
                 sharded optimizer state.
             model (torch.nn.Module):
-                Refer to :meth:``shard_full_optim_state_dict``.
+                Refer to :meth:`shard_full_optim_state_dict`.
             optim (torch.optim.Optimizer): Optimizer for ``model`` 's
                 parameters.
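A hedged sketch of the full/sharded optimizer state-dict round trip that the corrected :meth:`shard_full_optim_state_dict` reference points to, reusing the illustrative fsdp_model from the sketch above:

    optim = torch.optim.Adam(fsdp_model.parameters(), lr=1e-3)
    # ... training steps ...

    # Gather the full optimizer state dict on every rank (rank0_only=False
    # avoids empty dicts on nonzero ranks in this simple sketch).
    full_osd = FSDP.full_optim_state_dict(fsdp_model, optim, rank0_only=False)

    # Re-shard the full state dict so each rank loads only its own slice.
    sharded_osd = FSDP.shard_full_optim_state_dict(full_osd, fsdp_model)
    optim.load_state_dict(sharded_osd)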
@@ -1785,7 +1785,7 @@ class FullyShardedDataParallel(nn.Module, _FSDPState):
     ) -> Dict[str, Any]:
         """
         This hook is intended be used by ``torch.distributed.NamedOptimizer``.
-        The functionality is identical to ``:meth:optim_state_dict`` except
+        The functionality is identical to :meth:`optim_state_dict` except
         for the different arguments.

         Args:
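The hook's docstring refers to the public :meth:`optim_state_dict` API. A minimal sketch of that save path, continuing from the objects above; the file name is illustrative and the saving policy (per rank or rank 0 only) is an assumption:

    # Produce an optimizer state dict keyed by unflattened (original) parameter
    # names; its layout follows the model's configured state_dict_type.
    osd = FSDP.optim_state_dict(fsdp_model, optim)
    torch.save(osd, "optim_state.pt")  # illustrative path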
@@ -1916,7 +1916,7 @@ class FullyShardedDataParallel(nn.Module, _FSDPState):
     ) -> Dict[str, Any]:
         """
         This hook is intended be used by ``torch.distributed.NamedOptimizer``.
-        The functionality is identical to ``:meth:optim_state_dict_to_load``
+        The functionality is identical to :meth:`optim_state_dict_to_load`
         except for the different arguments.

         Args:
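Likewise, a hedged sketch of the matching load path through the public :meth:`optim_state_dict_to_load` API; keyword arguments are used because the positional order of this method has differed across PyTorch releases, and the file name again is illustrative:

    saved_osd = torch.load("optim_state.pt")

    # Convert the saved state dict into a form this rank's optimizer can load.
    flattened_osd = FSDP.optim_state_dict_to_load(
        model=fsdp_model, optim=optim, optim_state_dict=saved_osd
    )
    optim.load_state_dict(flattened_osd)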