The merge_matmul tests currently call `torch.split` with a list of
`torch.fx.Attribute` objects. This won't work once `torch.split` is moved into
C++, because the argument parser checks that the list actually contains integers
before calling `__torch_function__`.
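For illustration, a minimal sketch of the pattern the tests need (shapes and names here are made up, not taken from the actual test): the split sizes passed to `torch.split` must be plain Python integers rather than fx `Proxy`/`Attribute` objects.
```python
import torch

# Illustrative shapes and values; not taken from the actual test.
a = torch.randn(2, 4)
b = torch.randn(3, 4)
rhs = torch.randn(4, 5)

merged = torch.matmul(torch.cat([a, b]), rhs)  # one large matmul over the stacked LHS
out_a, out_b = torch.split(merged, [2, 3])     # plain int split sizes, not Proxies
assert torch.allclose(out_a, a @ rhs) and torch.allclose(out_b, b @ rhs)
```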
Pull Request resolved: https://github.com/pytorch/pytorch/pull/75019
Approved by: https://github.com/jamesr66a, https://github.com/jfix71
Summary:
Pull Request resolved: https://github.com/pytorch/pytorch/pull/50151
**Summary**
This commit adds a graph transformation pass that merges several matrix
multiplications that share the same RHS operand into one large matrix
multiplication. The LHS operands of all of the smaller matrix multiplications
are concatenated together and used as the LHS of the large matrix
multiplication, and the result is split to obtain the same products as the
original set of matrix multiplications.
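As a rough usage sketch (assuming the pass is exposed as `merge_matmul` under `torch.fx.experimental.merge_matmul`; the exact import path and call signature may differ):
```python
import torch
import torch.fx
from torch.fx.experimental.merge_matmul import merge_matmul  # assumed import path

class TwoMatmuls(torch.nn.Module):
    def __init__(self):
        super().__init__()
        self.rhs = torch.nn.Parameter(torch.randn(4, 5))

    def forward(self, a, b):
        # Two matrix multiplications that share the same RHS operand.
        return torch.matmul(a, self.rhs), torch.matmul(b, self.rhs)

mod = TwoMatmuls()
gm = torch.fx.symbolic_trace(mod)
merged = merge_matmul(gm)  # should rewrite the graph into cat -> one matmul -> split

a, b = torch.randn(2, 4), torch.randn(3, 4)
ref = mod(a, b)
out = merged(a, b)
assert all(torch.allclose(r, o) for r, o in zip(ref, out))
```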
**Test Plan**
This commit adds a simple unit test with two matrix multiplications that share
the same RHS operand.
`python test/test_fx_experimental.py -k merge_matmul -v`
Test Plan: Imported from OSS
Reviewed By: ngimel
Differential Revision: D25809409
Pulled By: SplitInfinity
fbshipit-source-id: fb55c044a54dea9f07b71aa60d44b7a8f3966ed0
Summary:
Pull Request resolved: https://github.com/pytorch/pytorch/pull/50120
Test Plan:
`buck test //caffe2/test:fx_experimental`
Reviewed By: jamesr66a
Differential Revision: D25239967
fbshipit-source-id: fb99ad25b7d83ff876da6d19dc4abd112d13001e