pytorch/torch
Rodrigo Berriel b80bdcc73b Add register_module alias to nn.Module (#65174)
Summary:
Fixes https://github.com/pytorch/pytorch/issues/60397. I'm not sure how aliases are supposed to be implemented, but this is the most basic/direct way, IMO. As a side effect, this implementation results in a "duplicate" doc entry, inheriting the one from `add_module`:

![monkey-patch](https://user-images.githubusercontent.com/7027770/133693137-8408d8e7-1f4f-436b-b176-57dda9bc3a32.png)
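In other words, the direct form is just binding a second name to the same function in the class body, which is why both doc entries show `add_module`'s docstring. A minimal sketch of the idea (not the literal diff from this PR):

```python
from typing import Optional

class Module:
    def add_module(self, name: str, module: Optional['Module']) -> None:
        r"""Adds a child module to the current module."""
        # (the real implementation validates ``name`` and stores the child
        # in self._modules; simplified here)
        setattr(self, name, module)

    # A second name bound to the same function object: the generated docs
    # then show add_module's docstring for both entries, hence the duplicate.
    register_module = add_module
```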

An alternative implementation could be:

```python
def register_module(self, name: str, module: Optional['Module']) -> None:
    r"""Alias for :func:`add_module`."""
    self.add_module(name, module)
```

which results in this documentation:

![image](https://user-images.githubusercontent.com/7027770/133693249-d969a71a-be44-489d-9633-4f38b44ab887.png)
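Either way, the alias is used exactly like `add_module`. A quick usage sketch once this change lands (module and attribute names made up for illustration):

```python
import torch.nn as nn

net = nn.Module()
net.register_module('linear', nn.Linear(4, 2))  # same effect as net.add_module(...)
print(net.linear)  # the child is reachable as an attribute, just as with add_module
```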

Questions:
1. Should I replicate the tests? There are two for `add_module`: [test_add_module_raises_error_if_attr_exists](873255c6d9/test/test_nn.py (L1420-L1434)) and [test_add_module](873255c6d9/test/test_nn.py (L1837-L1855)). (A possible replication is sketched after this list.)
2. This PR only adds `register_module` to `nn.Module`. There is an `add_module` in [`_RemoteModule`](https://github.com/pytorch/pytorch/blob/master/torch/distributed/nn/api/remote_module.py#L311-L312), which raises `NotSupported`, and another one in [`ConcreteModuleTypeBuilder`](873255c6d9/torch/_C/__init__.pyi.in (L468)), which I think means something else. Should I do anything about them?
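On question 1, replicating the tests could amount to rerunning the existing `add_module` assertions through the alias. A hedged sketch following the pattern of the linked `test_add_module` (class and method names hypothetical, standalone rather than inside `test_nn.py`):

```python
import unittest
import torch.nn as nn

class TestRegisterModule(unittest.TestCase):
    def test_register_module(self):
        net = nn.Module()
        net.register_module('linear', nn.Linear(2, 2))
        # The registered child is reachable as an attribute, as with add_module.
        self.assertIsInstance(net.linear, nn.Linear)
        # Non-string names are rejected, mirroring add_module's behavior.
        with self.assertRaises(TypeError):
            net.register_module(1, nn.Linear(2, 2))

if __name__ == '__main__':
    unittest.main()
```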

cc ngimel SsnL

Pull Request resolved: https://github.com/pytorch/pytorch/pull/65174

Reviewed By: soulitzer

Differential Revision: D31089717

Pulled By: jbschlosser

fbshipit-source-id: abd8d14a434fd8c7efa0bd8c242df56da33491e9
2021-09-22 16:37:28 -07:00
| Name | Latest commit message | Commit time |
|------|-----------------------|-------------|
| _C | Cleaning up DDP SPMD in reducer.cpp (#64113) | 2021-09-21 16:13:18 -07:00 |
| ao | [quant] AO migration of the torch/quantization/quantize_fx.py and torch/quantization/fx/* (#65033) | 2021-09-22 09:29:15 -07:00 |
| autograd | Adds keyword only args to gradcheck (#65290) | 2021-09-21 06:31:07 -07:00 |
| backends | [CoreML][fbcode] Add the preprocess python APIs (#64521) | 2021-09-17 00:25:14 -07:00 |
| contrib | Revert D30279364: [codemod][lint][fbcode/c*] Enable BLACK by default | 2021-08-12 11:45:01 -07:00 |
| cpu | Allow disabling cache in autocast (automatic mixed precision) (#63552) | 2021-09-08 07:47:18 -07:00 |
| csrc | [Static Runtime] Added NNC implementation for signed log1p kernel. (#65387) | 2021-09-22 15:53:33 -07:00 |
| cuda | [CUDA graphs] Beta, not prototype (#65247) | 2021-09-20 13:32:36 -07:00 |
| distributed | Cleaning up DDP SPMD in reducer.cpp (#64113) | 2021-09-21 16:13:18 -07:00 |
| distributions | Poisson zero rate (#61511) | 2021-08-19 08:30:28 -07:00 |
| fft | Revert D30279364: [codemod][lint][fbcode/c*] Enable BLACK by default | 2021-08-12 11:45:01 -07:00 |
| for_onnx | | |
| futures | Revert D30279364: [codemod][lint][fbcode/c*] Enable BLACK by default | 2021-08-12 11:45:01 -07:00 |
| fx | [quant] AO migration of the torch/quantization/quantize_fx.py and torch/quantization/fx/* (#65033) | 2021-09-22 09:29:15 -07:00 |
| jit | Add register_module alias to nn.Module (#65174) | 2021-09-22 16:37:28 -07:00 |
| legacy | | |
| lib | | |
| linalg | Array API: Add torch.linalg.matmul alias to torch.matmul (#63227) | 2021-09-07 12:35:32 -07:00 |
| multiprocessing | | |
| nn | Add register_module alias to nn.Module (#65174) | 2021-09-22 16:37:28 -07:00 |
| onnx | [ONNX] Support torch.isfinite export (#64759) | 2021-09-21 15:47:48 -07:00 |
| optim | To add state dict and load_dict for Chained Scheduler (#65034) | 2021-09-15 13:11:41 -07:00 |
| package | [package] Make it possible to re-save a PackageImporter module (#65101) | 2021-09-17 16:25:11 -07:00 |
| profiler | [Profiler] Change FLOP/s to Total FLOPs (#62779) | 2021-08-16 13:43:32 -07:00 |
| quantization | [quant] AO migration of the torch/quantization/quantize_fx.py and torch/quantization/fx/* (#65033) | 2021-09-22 09:29:15 -07:00 |
| sparse | Revert D30279364: [codemod][lint][fbcode/c*] Enable BLACK by default | 2021-08-12 11:45:01 -07:00 |
| special | [special] Alias igamma, igammac to special.gammainc, special.gammaincc (#61902) | 2021-09-07 15:31:26 -07:00 |
| testing | [DDP] Custom buffer reduction (#64513) | 2021-09-22 14:11:35 -07:00 |
| utils | Remove .data from benchmarks and tensorboard (#65389) | 2021-09-22 11:16:59 -07:00 |
| __config__.py | Revert D30279364: [codemod][lint][fbcode/c*] Enable BLACK by default | 2021-08-12 11:45:01 -07:00 |
| __future__.py | Revert D30279364: [codemod][lint][fbcode/c*] Enable BLACK by default | 2021-08-12 11:45:01 -07:00 |
| __init__.py | [quant] AO migration of the torch/quantization/quantize_fx.py and torch/quantization/fx/* (#65033) | 2021-09-22 09:29:15 -07:00 |
| _appdirs.py | Revert D30279364: [codemod][lint][fbcode/c*] Enable BLACK by default | 2021-08-12 11:45:01 -07:00 |
| _classes.py | Revert D30279364: [codemod][lint][fbcode/c*] Enable BLACK by default | 2021-08-12 11:45:01 -07:00 |
| _deploy.py | Revert D30279364: [codemod][lint][fbcode/c*] Enable BLACK by default | 2021-08-12 11:45:01 -07:00 |
| _jit_internal.py | Support Union in TorchScript (#64234) | 2021-09-03 06:12:24 -07:00 |
| _linalg_utils.py | Revert D30279364: [codemod][lint][fbcode/c*] Enable BLACK by default | 2021-08-12 11:45:01 -07:00 |
| _lobpcg.py | Revert D30279364: [codemod][lint][fbcode/c*] Enable BLACK by default | 2021-08-12 11:45:01 -07:00 |
| _lowrank.py | Revert D30279364: [codemod][lint][fbcode/c*] Enable BLACK by default | 2021-08-12 11:45:01 -07:00 |
| _namedtensor_internals.py | Revert D30279364: [codemod][lint][fbcode/c*] Enable BLACK by default | 2021-08-12 11:45:01 -07:00 |
| _ops.py | Revert D30279364: [codemod][lint][fbcode/c*] Enable BLACK by default | 2021-08-12 11:45:01 -07:00 |
| _python_dispatcher.py | Revert D30279364: [codemod][lint][fbcode/c*] Enable BLACK by default | 2021-08-12 11:45:01 -07:00 |
| _six.py | Revert D30279364: [codemod][lint][fbcode/c*] Enable BLACK by default | 2021-08-12 11:45:01 -07:00 |
| _sources.py | Revert D30279364: [codemod][lint][fbcode/c*] Enable BLACK by default | 2021-08-12 11:45:01 -07:00 |
| _storage_docs.py | Revert D30279364: [codemod][lint][fbcode/c*] Enable BLACK by default | 2021-08-12 11:45:01 -07:00 |
| _tensor_docs.py | Initial implementation of nanmean (#62671) | 2021-09-13 05:53:58 -07:00 |
| _tensor_str.py | Revert D30279364: [codemod][lint][fbcode/c*] Enable BLACK by default | 2021-08-12 11:45:01 -07:00 |
| _tensor.py | Adds DLPack support (#57110) | 2021-09-12 19:47:15 -07:00 |
| _torch_docs.py | Fix torch.any documentation (#65310) | 2021-09-22 11:24:20 -07:00 |
| _utils_internal.py | Revert D30279364: [codemod][lint][fbcode/c*] Enable BLACK by default | 2021-08-12 11:45:01 -07:00 |
| _utils.py | Add support for the ONNX Runtime Eager Mode backend (#58248) | 2021-08-20 11:17:13 -07:00 |
| _VF.py | Revert D30279364: [codemod][lint][fbcode/c*] Enable BLACK by default | 2021-08-12 11:45:01 -07:00 |
| _vmap_internals.py | Revert D30279364: [codemod][lint][fbcode/c*] Enable BLACK by default | 2021-08-12 11:45:01 -07:00 |
| abi-check.cpp | | |
| autocast_mode.py | Allow disabling cache in autocast (automatic mixed precision) (#63552) | 2021-09-08 07:47:18 -07:00 |
| CMakeLists.txt | [CoreML][OSS] Integrate with CMake (#64523) | 2021-09-17 10:32:00 -07:00 |
| custom_class_detail.h | | |
| custom_class.h | | |
| deploy.h | | |
| extension.h | | |
| functional.py | implement "xy" indexing for torch.meshgrid (#62724) | 2021-09-17 08:31:17 -07:00 |
| hub.py | Torchhub: More robust assumption regarding main or master branch (#64364) | 2021-09-20 10:36:13 -07:00 |
| library.h | Add support for the ONNX Runtime Eager Mode backend (#58248) | 2021-08-20 11:17:13 -07:00 |
| overrides.py | Add Tensor._make_wrapper_subclass (#65340) | 2021-09-22 11:10:47 -07:00 |
| py.typed | | |
| quasirandom.py | Revert D30279364: [codemod][lint][fbcode/c*] Enable BLACK by default | 2021-08-12 11:45:01 -07:00 |
| random.py | Adds return type annotation for fork_rng function (#63724) | 2021-08-27 09:03:40 -07:00 |
| README.txt | | |
| script.h | | |
| serialization.py | Revert D30279364: [codemod][lint][fbcode/c*] Enable BLACK by default | 2021-08-12 11:45:01 -07:00 |
| storage.py | Revert D30279364: [codemod][lint][fbcode/c*] Enable BLACK by default | 2021-08-12 11:45:01 -07:00 |
| torch_version.py | Added more version comparison operations (#63848) | 2021-09-09 10:30:20 -07:00 |
| types.py | Revert D30279364: [codemod][lint][fbcode/c*] Enable BLACK by default | 2021-08-12 11:45:01 -07:00 |

Note [TH abstraction violation]
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

TH/THC provide some hpp headers, which are proper C++ headers rather than
C headers.  These headers serve double duty as *internal implementation
detail* headers, whose contents should largely not be used by external
clients.

Ideally, we would not install these headers at all; instead, you should
use public functions (in headers like `THTensor.h`, NOT `THTensor.hpp`)
to manipulate these structs.  However, there are a few places
in torch/csrc where we violate this abstraction.  They are marked with
a pointer to this note.  Each of those sites will have to be refactored
when we refactor the guts of THTensor and related structures.