pytorch/torch/fx
Wang Xu b46787d6d7 add cost_aware_partition (#47673)
Summary:
[WIP] This PR adds a cost_aware_partition method to the Partitioner class. The method partitions the fx graph module based on the estimated latency of the whole graph.

Pull Request resolved: https://github.com/pytorch/pytorch/pull/47673

Reviewed By: gcatron

Differential Revision: D24896685

Pulled By: scottxu0730

fbshipit-source-id: 1b1651fe82ce56554f99d68da116e585c74099ed
2020-11-11 19:31:37 -08:00
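The partitioning idea described in the commit summary can be sketched without torch at all. The model below is a hedged, illustrative assumption, not the actual torch.fx implementation: whole-graph latency is the sum of per-node compute times plus a fixed transfer penalty per partition boundary, and a greedy pass merges neighbouring partitions whenever the merge lowers that estimate while staying under a per-partition memory budget. All names, costs, and budgets are hypothetical.

```python
# Hedged sketch of cost-aware partitioning: start with one partition per
# node, greedily merge adjacent partitions whenever the merge lowers the
# estimated whole-graph latency. All constants are illustrative.

TRANSFER_COST = 2.0   # hypothetical cost of moving data across a boundary
MEMORY_CAP = 10.0     # hypothetical per-partition memory budget


def whole_graph_latency(partitions, compute):
    """Latency model: sum of node compute times plus one transfer
    penalty per partition boundary."""
    total = sum(compute[n] for p in partitions for n in p)
    return total + TRANSFER_COST * (len(partitions) - 1)


def cost_aware_partition(nodes, compute, memory):
    # One partition per node to start.
    partitions = [[n] for n in nodes]
    improved = True
    while improved:
        improved = False
        for i in range(len(partitions) - 1):
            merged = partitions[i] + partitions[i + 1]
            if sum(memory[n] for n in merged) > MEMORY_CAP:
                continue  # merge would exceed the memory budget
            candidate = partitions[:i] + [merged] + partitions[i + 2:]
            if whole_graph_latency(candidate, compute) < whole_graph_latency(partitions, compute):
                partitions = candidate
                improved = True
                break  # restart the scan after each accepted merge
    return partitions
```

For example, four nodes with compute costs 1, 2, 3, 4 and memory 4 each end up in two partitions of two nodes under the budget above, since a third merge would exceed the cap.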
experimental/ add cost_aware_partition (#47673) 2020-11-11 19:31:37 -08:00
__init__.py Support default args in symbolic tracing (#47615) 2020-11-10 18:57:00 -08:00
graph_module.py Look in named-buffers of module for tensors (#47641) 2020-11-11 19:08:16 -08:00
graph.py [FX] Fix uses not updating when erasing a node (#47720) 2020-11-11 11:02:15 -08:00
immutable_collections.py [fx] make sure args/kwargs are immutable (#46325) 2020-10-14 15:51:43 -07:00
node.py [FX] Add a bunch of docstrings (#47719) 2020-11-11 10:59:57 -08:00
proxy.py Support default args in symbolic tracing (#47615) 2020-11-10 18:57:00 -08:00
symbolic_trace.py Look in named-buffers of module for tensors (#47641) 2020-11-11 19:08:16 -08:00