Commit Graph

5 Commits

Author SHA1 Message Date
Michael Suo
341262754f module dedupe (#26666)
Summary:
Pull Request resolved: https://github.com/pytorch/pytorch/pull/26666

Changes:
- Introduce a `ConcreteModuleType` concept. This acts both as the key into
  the type cache and as the source of truth for `ModuleValue::attr` queries.
  It needs to do both jobs because that's how we ensure correctness (if the
  types are different, it's because `ModuleValue::attr` would return
  different things).
- Now `recursive_script` will first construct a `ConcreteModuleType` and
  search for a pre-existing type before starting compilation (sketched
  after this list).
- All previous paths to creating a `ScriptModule` (including inheriting from
  `ScriptModule`) are now rewritten to go through `create_script_module`, so
  that we have only a single place where construction happens.
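
For intuition, here is a minimal Python sketch of the dedupe scheme described above; all names are hypothetical and the real implementation lives in C++:

```python
# Hypothetical sketch: a ConcreteModuleType doubles as the cache key, so
# two modules share a compiled type exactly when attribute lookups would
# behave identically.

class ConcreteModuleType:
    def __init__(self, module):
        # Everything that `ModuleValue::attr` could observe has to feed
        # into the key; attribute names and their Python types stand in
        # for that here.
        self.signature = (
            type(module).__name__,
            tuple(sorted((name, type(val).__name__)
                         for name, val in vars(module).items())),
        )

    def __eq__(self, other):
        return self.signature == other.signature

    def __hash__(self):
        return hash(self.signature)


_type_cache = {}

def get_or_create_type(module):
    # Search for a pre-existing type before "compiling" a new one.
    key = ConcreteModuleType(module)
    if key not in _type_cache:
        _type_cache[key] = object()  # stand-in for an expensive compilation
    return _type_cache[key]


class Counter:
    def __init__(self):
        self.steps = 0

a, b = Counter(), Counter()
assert get_or_create_type(a) is get_or_create_type(b)  # deduped
```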

Behavioral changes:
- Big change to `torch.jit.ScriptModule` inheritance: all attributes are now
  recursively scripted if possible, matching recursive scripting semantics.
  This makes it hard to keep something from being scripted (for example, a
  Python submodule). Possibly we'll need an `ignore()` type thing for
  attributes. In particular, this adds `self.training` to *every*
  ScriptModule, since it's present on every `nn.Module` (see the example
  after this list).
- I believe this change is transparent to existing users of the inheritance
  API: if you had an unscriptable attribute that you never used, there is
  still no error. In some cases we will create new attributes (even if they
  are unused), which will increase the serialized model size relative to
  before.
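
To illustrate the inheritance change, a sketch against the TorchScript API of this era (the module itself is made up):

```python
import torch
import torch.nn as nn

class MyModule(torch.jit.ScriptModule):
    def __init__(self):
        super().__init__()
        # Under the new semantics this attribute is recursively scripted,
        # matching what recursive scripting does for plain nn.Modules.
        self.linear = nn.Linear(4, 4)

    @torch.jit.script_method
    def forward(self, x):
        # `self.training` now exists on every ScriptModule, mirroring
        # nn.Module, so TorchScript code can branch on it directly.
        if self.training:
            return self.linear(x)
        return x

m = MyModule()
m.eval()
print(m(torch.randn(2, 4)))
```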

Test Plan: Imported from OSS

Differential Revision: D17551196

Pulled By: suo

fbshipit-source-id: b476d1c9feb3ddfd63406d90989aaf9dfe890591
2019-10-12 09:51:57 -07:00
davidriazati
0046092178 Reduce special casing around 'training' (#27109)
Summary:
Most of this was old cruft left over from special handling of `training` before we had a `bool` type. This makes all modules have a `training` attribute that is true by default and removes all other special handling.
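
For example (a small sketch; the module here is made up), scripted code can now treat `training` like any other `bool` attribute:

```python
import torch
import torch.nn as nn

class Scale(nn.Module):
    def forward(self, x):
        # `training` is just a bool attribute (True by default) with no
        # special handling in the compiler.
        if self.training:
            return x * 0.5
        return x

m = torch.jit.script(Scale())
print(m(torch.ones(3)))  # training is True by default
m.eval()                 # toggles the attribute like any nn.Module
print(m(torch.ones(3)))  # returns the input unchanged
```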

Fixes #26884
Pull Request resolved: https://github.com/pytorch/pytorch/pull/27109

Pulled By: driazati

Differential Revision: D17728129

fbshipit-source-id: 8ddc9fbb07a953dd05529538bfdd01ed88b5cb57
2019-10-07 13:52:59 -07:00
Jerry Zhang
98c02e6df3 Enable tests (#27103)
Summary:
Pull Request resolved: https://github.com/pytorch/pytorch/pull/27103

As titled.

Test Plan:
python test/test_quantization.py 'GraphModePostTrainingQuantTest'

Imported from OSS

Differential Revision: D17678261

fbshipit-source-id: 5caa7512c6ff4a613980c86b5b221e0cfbe0a173
2019-10-01 12:10:21 -07:00
Jerry Zhang
f742ceaa46 API - add more passes to graph mode (#27093)
Summary:
Pull Request resolved: https://github.com/pytorch/pytorch/pull/27093

Add `insert_prepack_unpack` and `fold_prepack` to `convert_script`
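
As a rough picture of where the new passes sit, a hypothetical pipeline sketch with stand-in functions; none of these names reflect the real implementation:

```python
# Every name below is illustrative, not the actual pass code.

def insert_quant_dequant(model):
    # stand-in for the existing pass that inserts quant/dequant ops
    return model

def insert_prepack_unpack(model):
    # new pass: pairs each quantized weight with prepack/unpack ops
    # (e.g. quantized::linear_prepack / quantized::linear_unpack)
    return model

def fold_prepack(model):
    # new pass: folds constant prepack calls into module attributes, so
    # weights are packed once at load time rather than on every forward
    return model

def convert_script(model):
    model = insert_quant_dequant(model)
    model = insert_prepack_unpack(model)
    model = fold_prepack(model)
    return model
```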

Test Plan:
.

Imported from OSS

Differential Revision: D17678262

fbshipit-source-id: 4bfd6681af6fce226cc77aed8dd84066cbd8ed17
2019-10-01 11:26:02 -07:00
Jerry Zhang
09f0e949cd PyTorch Graph Mode Quantization API (#26390)
Summary:
Pull Request resolved: https://github.com/pytorch/pytorch/pull/26390

`quantize_script`: top level API for graph mode quantization
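
A rough usage sketch; the import path, signature, and calibration flow below are assumptions about the API as of this PR:

```python
import torch
import torch.nn as nn
# Assumed import path for this point in time:
from torch.quantization import default_qconfig, quantize_script

class M(nn.Module):
    def __init__(self):
        super().__init__()
        self.fc = nn.Linear(8, 8)

    def forward(self, x):
        return self.fc(x)

def calibrate(model, data):
    # Run representative inputs through the model so observers can
    # record activation statistics.
    for x in data:
        model(x)

scripted = torch.jit.script(M())
qconfig_dict = {'': default_qconfig}          # quantize the whole model
data = [torch.randn(4, 8) for _ in range(10)]

# Drive the graph mode passes end to end (assumed signature).
quantized = quantize_script(scripted, qconfig_dict, calibrate, [data])
```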

Test Plan:
There are some known issues; we can enable the tests after all known issues are fixed.

Imported from OSS

Differential Revision: D17645132

fbshipit-source-id: 61f261d5607409d493b39a2f4e05ebd017279f6b
2019-09-27 19:23:51 -07:00