pytorch/torch/distributed/_shard/sharder.py
Xuehai Pan 758a0a88a2 [BE][Easy] enable ruff rule PIE790: unnecessary pass statement (#133200)
This PR removes unnecessary `pass` statements. This is semantically safe because the bytecode for the Python code does not change.

Note that if there is a docstring in the function, an empty function does not need a `pass` statement as a placeholder.

Pull Request resolved: https://github.com/pytorch/pytorch/pull/133200
Approved by: https://github.com/malfet, https://github.com/eqy, https://github.com/kit1980
2024-08-15 15:50:19 +00:00


import abc

import torch.nn as nn


class Sharder(abc.ABC):
    """
    This is an interface which allows users to create more advanced
    sharding strategies that cannot easily be composed from the
    `ShardingSpec`.

    :class:`torch.distributed._shard.sharding_plan.ShardingPlan` can
    take a `Sharder` object and call `shard` to shard the module,
    then replace the original module with the sharded module returned.
    """

    @abc.abstractmethod
    def shard(self, module: nn.Module) -> nn.Module:
        """
        Shard a module based on the implementation of this method, and
        return the sharded version of the module.

        Args:
            module (:class:`torch.nn.Module`):
                The module to apply sharding to.

        Returns:
            A :class:`torch.nn.Module` object that represents a module
            that's already been sharded.
        """
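
As an illustration of how the interface is meant to be subclassed, here is a minimal, self-contained sketch of a custom sharder. The `Sharder` ABC is reproduced inline so the example runs without the distributed package; `RowwiseLinearSharder` and `ChunkedLinear` are hypothetical names invented for this sketch and are not part of the PyTorch API. A real implementation would place the chunks on different ranks rather than keeping them local.

```python
import abc

import torch
import torch.nn as nn


# The Sharder interface, reproduced from sharder.py for a self-contained sketch.
class Sharder(abc.ABC):
    @abc.abstractmethod
    def shard(self, module: nn.Module) -> nn.Module:
        """Shard `module` and return the sharded version."""


class ChunkedLinear(nn.Module):
    """Hypothetical module holding a Linear layer's weight split into row chunks."""

    def __init__(self, chunks, bias):
        super().__init__()
        self.chunks = nn.ParameterList(nn.Parameter(c) for c in chunks)
        self.bias = bias

    def forward(self, x):
        # Recombine the row chunks locally; a real sharder would instead
        # dispatch each chunk's computation to its owning rank.
        weight = torch.cat(list(self.chunks), dim=0)
        return nn.functional.linear(x, weight, self.bias)


class RowwiseLinearSharder(Sharder):
    """Hypothetical sharder that splits an nn.Linear's weight row-wise."""

    def __init__(self, world_size: int):
        self.world_size = world_size

    def shard(self, module: nn.Module) -> nn.Module:
        if not isinstance(module, nn.Linear):
            raise TypeError("this sketch only shards nn.Linear modules")
        chunks = module.weight.detach().chunk(self.world_size, dim=0)
        return ChunkedLinear(chunks, module.bias)
```

A `ShardingPlan` would then carry the `RowwiseLinearSharder` instance and call its `shard` method on the matching submodule, swapping in the returned `ChunkedLinear`; since the chunks are recombined in `forward`, the sharded module produces the same output as the original.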