Summary: After converting nn.MultiheadAttention, we weren't deleting the old in_proj_weight and in_proj_bias from the converted module, even though they are not (really) used anymore.

Test Plan: python test/test_quantization.py -k "test_custom_module_multi_head_attention"

Pull Request resolved: https://github.com/pytorch/pytorch/pull/110407
Approved by: https://github.com/jerryzh168
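The fix amounts to removing the stale float projection parameters from the converted module so they no longer linger in its state_dict. The sketch below is a minimal illustration of that idea, not the actual patch: the helper name `drop_stale_in_proj` is hypothetical, and only the attribute names `in_proj_weight` and `in_proj_bias` are taken from nn.MultiheadAttention.

```python
import torch.nn as nn


def drop_stale_in_proj(converted: nn.Module) -> nn.Module:
    """Hypothetical helper: after an nn.MultiheadAttention has been swapped
    for its quantized counterpart, drop the float in_proj parameters that the
    converted module no longer reads."""
    for name in ("in_proj_weight", "in_proj_bias"):
        # nn.Module.__delattr__ also removes the entry from _parameters,
        # so the key disappears from state_dict() as well.
        if hasattr(converted, name):
            delattr(converted, name)
    return converted
```

After conversion, `converted.state_dict()` should no longer contain `in_proj_weight` or `in_proj_bias` keys, which is essentially what the referenced test checks.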
| File |
|---|
| __init__.py |
| activation.py |
| batchnorm.py |
| conv.py |
| dropout.py |
| embedding_ops.py |
| functional_modules.py |
| linear.py |
| normalization.py |
| rnn.py |
| utils.py |