pytorch/torch/distributed/_tensor/debug
Kumar Ashutosh 405a0040cf Adds tool to visualize sharding (#114307)
This pull request adds a tool to visualize sharding. It uses the `device_mesh` and placement details of a torch DTensor to construct a visualization of how the tensor is split across the mesh.
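As a rough illustration, the sketch below shows how the helper might be invoked. It is a minimal sketch, not the confirmed API: the entry point `visualize_sharding(dtensor)` and its import path are assumed from the module name in this PR, and the mesh/world size are arbitrary choices for the example.

```python
# Launch with: torchrun --nproc_per_node=4 example.py
import torch
import torch.distributed as dist
from torch.distributed._tensor import DeviceMesh, Shard, distribute_tensor
# Assumed entry point; the exact name/signature comes from visualize_sharding.py in this PR.
from torch.distributed._tensor.debug.visualize_sharding import visualize_sharding

dist.init_process_group("gloo")
world_size = dist.get_world_size()

# 1-D device mesh covering all launched ranks.
mesh = DeviceMesh("cpu", list(range(world_size)))

# Shard an 8x8 tensor along dim 0; with 4 ranks each one holds a 2x8 slice.
global_tensor = torch.arange(64, dtype=torch.float32).reshape(8, 8)
dtensor = distribute_tensor(global_tensor, mesh, placements=[Shard(0)])

# Print which rows/columns of the global tensor each rank owns.
visualize_sharding(dtensor)
```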

Things to fix:

- [x] This implementation only uses the first element of the placements tuple; when can there be more than one element?
- [x] The split calculation happens here, but it may already be implemented inside the Shard class; can that be called directly instead?

Fixes #108746

Pull Request resolved: https://github.com/pytorch/pytorch/pull/114307
Approved by: https://github.com/wanchaol
2023-12-12 06:18:03 +00:00
| File | Last commit | Date |
| --- | --- | --- |
| `__init__.py` | [dtensor] add CommDebugMode for debugging (#113592) | 2023-11-27 02:40:28 +00:00 |
| `comm_mode.py` | [dtensor] add CommDebugMode for debugging (#113592) | 2023-11-27 02:40:28 +00:00 |
| `op_coverage.py` | [dtensor] refactor op dispatch and fix is_same_size/equal (#112927) | 2023-11-13 22:46:31 +00:00 |
| `visualize_sharding.py` | Adds tool to visualize sharding (#114307) | 2023-12-12 06:18:03 +00:00 |