Commit Graph

8 Commits

Thibaut Durand
01da732691 Fix type annotation of torch.split (#100655)
The type annotation indicates `list`, but the returned type is `tuple`:
```python
>>> import torch
>>> type(torch.arange(10).split(4))
<class 'tuple'>
```
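A minimal sketch of what the corrected signature would look like, presumably annotating the return type as `Tuple[Tensor, ...]` (the wrapper name and parameters here are illustrative, not the actual patch):

```python
from typing import List, Tuple, Union

import torch
from torch import Tensor


# Illustrative wrapper showing the tuple return annotation; the real fix
# edits the annotation on torch.split itself.
def split(
    tensor: Tensor,
    split_size_or_sections: Union[int, List[int]],
    dim: int = 0,
) -> Tuple[Tensor, ...]:
    return torch.split(tensor, split_size_or_sections, dim)


parts = split(torch.arange(10), 4)
assert isinstance(parts, tuple)  # not a list
```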
Pull Request resolved: https://github.com/pytorch/pytorch/pull/100655
Approved by: https://github.com/kit1980
2023-05-16 21:35:41 +00:00
Peter Bell
b9e919fed7 [fx] fix merge_matmul tests making invalid torch.split calls
The merge_matmul tests currently call `torch.split` with a list of
`torch.fx.Attribute` objects, which won't work once `torch.split` is moved into
C++, because the argument parser checks that the list contains integers
before calling `__torch_function__`.
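A small sketch of the constraint being described: `torch.split` accepts an int or a list of plain Python ints, and a list of anything else is rejected by the argument parser up front (the failing input below is illustrative):

```python
import torch

x = torch.arange(10)

# Valid: split sizes given as plain Python ints.
chunks = x.split([2, 3, 5])
sizes = [t.numel() for t in chunks]  # sizes are 2, 3 and 5

# A list of non-integer objects is rejected by the argument parser
# before __torch_function__ is ever consulted.
parse_error = False
try:
    x.split(["not", "ints"])  # type: ignore[list-item]
except TypeError:
    parse_error = True
```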

Pull Request resolved: https://github.com/pytorch/pytorch/pull/75019

Approved by: https://github.com/jamesr66a, https://github.com/jfix71
2022-05-09 20:04:21 +00:00
Shirong Wu
ea8a0184b7 Fix fuse_parallel_linear (#76202)
Summary:
Pull Request resolved: https://github.com/pytorch/pytorch/pull/76202

Move `legalize_graph` to a common tool class.

Reviewed By: yinghai, jfix71, 842974287

Differential Revision: D35694145

fbshipit-source-id: b044df3b46b3029c383581f7853a4338c2b13c62
(cherry picked from commit 49884d557d220f981f5f894bdcd381df749e3efb)
2022-04-22 18:59:07 +00:00
James Reed
7b73fdf597 [FX] Fix retracing wrapped functions (#58061)
Summary: Pull Request resolved: https://github.com/pytorch/pytorch/pull/58061

Test Plan: Imported from OSS

Reviewed By: yuhc

Differential Revision: D28358801

Pulled By: jamesr66a

fbshipit-source-id: c7c9a8a80e5bfe1eb1f6d2cf858ac7e57153a860
2021-05-17 19:50:16 -07:00
Ansley Ussery
85109ce427 Support submodule manipulation in GraphModule (#52358)
Summary: Pull Request resolved: https://github.com/pytorch/pytorch/pull/52358

Test Plan: Imported from OSS

Reviewed By: jamesr66a

Differential Revision: D26759260

Pulled By: ansley

fbshipit-source-id: 25d2b9124a7d957704f1700a45dca143aaed391d
2021-03-04 14:52:35 -08:00
Meghan Lele
11cdb910b4 [fx] Add matrix multiplication fusion pass (#50151)
Summary:
Pull Request resolved: https://github.com/pytorch/pytorch/pull/50151

**Summary**
This commit adds a graph transformation pass that merges several matrix
multiplications that use the same RHS operand into one large matrix
multiplication. The LHS operands from all of the smaller matrix multiplications
are concatenated together and used as an input in the large matrix multiply,
and the result is split in order to obtain the same products as the original
set of matrix multiplications.
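The transformation described above can be sketched eagerly as follows; the actual pass rewrites an FX graph rather than calling these ops directly, and the function name here is hypothetical:

```python
import torch


def merged_matmul(lhs_list, rhs):
    """Fuse several lhs_i @ rhs into one large matmul, then split the result."""
    # Concatenate all LHS operands along the row dimension.
    big_lhs = torch.cat(lhs_list, dim=0)
    # One large matrix multiplication replaces the individual ones.
    big_out = big_lhs @ rhs
    # Split the rows back apart to recover the original products.
    sizes = [lhs.shape[0] for lhs in lhs_list]
    return big_out.split(sizes, dim=0)


# The fused result matches the individual matmuls.
rhs = torch.randn(4, 5)
lhs_list = [torch.randn(2, 4), torch.randn(3, 4)]
fused = merged_matmul(lhs_list, rhs)
reference = [lhs @ rhs for lhs in lhs_list]
assert all(torch.allclose(f, r) for f, r in zip(fused, reference))
```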

**Test Plan**
This commit adds a simple unit test with two matrix multiplications that share
the same RHS operand.

`python test/test_fx_experimental.py -k merge_matmul -v`

Test Plan: Imported from OSS

Reviewed By: ngimel

Differential Revision: D25809409

Pulled By: SplitInfinity

fbshipit-source-id: fb55c044a54dea9f07b71aa60d44b7a8f3966ed0
2021-01-06 21:49:37 -08:00
Natalia Gimelshein
ad7d208ba5 Revert D25239967: [fx] Add matrix multiplication fusion pass
Test Plan: revert-hammer

Differential Revision: D25239967 (9b7f3fa146)

Original commit changeset: fb99ad25b7d8

fbshipit-source-id: 370167b5ade8bf2b3a6cccdf4290ea07b8347c79
2021-01-05 23:22:26 -08:00
Meghan Lele
9b7f3fa146 [fx] Add matrix multiplication fusion pass (#50120)
Summary:
Pull Request resolved: https://github.com/pytorch/pytorch/pull/50120

This commit adds a graph transformation pass that merges several matrix
multiplications that use the same RHS operand into one large matrix
multiplication. The LHS operands from all of the smaller matrix multiplications
are concatenated together and used as an input in the large matrix multiply,
and the result is split in order to obtain the same products as the original
set of matrix multiplications.

Test Plan:
This commit adds a simple unit test with two matrix multiplications that share
the same RHS operand.

`buck test //caffe2/test:fx_experimental`

Reviewed By: jamesr66a

Differential Revision: D25239967

fbshipit-source-id: fb99ad25b7d83ff876da6d19dc4abd112d13001e
2021-01-05 19:37:08 -08:00