Commit Graph

17 Commits

cyy
df458be4e5 [4/N] Apply py39 ruff and pyupgrade fixes (#143257)
`torch/fx/passes/annotate_getitem_nodes.py` was changed to support the new type hinting annotations.
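The commit body doesn't show the diff, but a typical pyupgrade-style rewrite for a py39 target (an illustrative example, not the actual change to `annotate_getitem_nodes.py`) replaces `typing.List` with the PEP 585 builtin generic:

```python
from typing import Optional  # Optional still comes from typing on py39

# Before (pre-py39 style): def first_item(xs: "typing.List[int]") -> "typing.Optional[int]"
# After (py39+ style, PEP 585 builtin generics):
def first_item(xs: list[int]) -> Optional[int]:
    """Return the first element, or None for an empty list."""
    return xs[0] if xs else None

assert first_item([3, 1, 2]) == 3
assert first_item([]) is None
```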

Pull Request resolved: https://github.com/pytorch/pytorch/pull/143257
Approved by: https://github.com/justinchuby, https://github.com/albanD
2025-01-04 10:47:51 +00:00
Oguz Ulgen
221350e3a4 Add None return type to init -- tests (#132352)
Pull Request resolved: https://github.com/pytorch/pytorch/pull/132352
Approved by: https://github.com/ezyang
ghstack dependencies: #132335, #132351
2024-08-01 15:44:51 +00:00
Xuehai Pan
548c460bf1 [BE][Easy][7/19] enforce style for empty lines in import segments in test/[a-c]*/ and test/[q-z]*/ (#129758)
See https://github.com/pytorch/pytorch/pull/129751#issue-2380881501. Most changes are auto-generated by linter.

You can review these PRs via:

```bash
git diff --ignore-all-space --ignore-blank-lines HEAD~1
```

Pull Request resolved: https://github.com/pytorch/pytorch/pull/129758
Approved by: https://github.com/ezyang
2024-07-31 10:54:03 +00:00
Arun Pa
f71e368969 UFMT formatting on test/autograd test/ao test/cpp test/backends (#123369)
Partially addresses #123062

Ran lintrunner on
- test/_test_bazel.py
- test/ao
- test/autograd
- test/backends
- test/benchmark_utils
- test/conftest.py
- test/bottleneck_test
- test/cpp

Pull Request resolved: https://github.com/pytorch/pytorch/pull/123369
Approved by: https://github.com/huydhn
2024-04-05 18:51:38 +00:00
Kazuaki Ishizaki
deb800ee81 Fix typo under test directory (#111304)
This PR fixes typos in comments under the `test` directory.

Pull Request resolved: https://github.com/pytorch/pytorch/pull/111304
Approved by: https://github.com/Skylion007
2023-10-16 23:06:06 +00:00
Justin Chu
c0d8a4af0a [BE] Enable ruff's UP rules and autoformat ao/ (#105430)
Pull Request resolved: https://github.com/pytorch/pytorch/pull/105430
Approved by: https://github.com/albanD, https://github.com/malfet
2023-07-19 13:44:37 +00:00
Aaron Gokaslan
2f95a3d0fc [BE]: Apply ruff PERF fixes to torch (#104917)
Applies automated ruff fixes for the PERF rules and enables all of the automatic ones. I also updated ruff, which applied some additional fixes.
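As an illustration of the kind of change these rules make (a hypothetical example, not taken from this diff), ruff's PERF401 rewrites manual loop-and-append into a list comprehension:

```python
# Before: manual loop-and-append (the pattern flagged by ruff PERF401)
def even_squares_loop(nums):
    out = []
    for n in nums:
        if n % 2 == 0:
            out.append(n * n)
    return out

# After: the equivalent list comprehension ruff's autofix would produce
def even_squares_fixed(nums):
    return [n * n for n in nums if n % 2 == 0]

assert even_squares_loop([1, 2, 3, 4]) == even_squares_fixed([1, 2, 3, 4]) == [4, 16]
```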

Pull Request resolved: https://github.com/pytorch/pytorch/pull/104917
Approved by: https://github.com/ezyang, https://github.com/albanD
2023-07-11 20:45:21 +00:00
Edward Z. Yang
f216fea44f Remove commented out pdb (#101993)
Signed-off-by: Edward Z. Yang <ezyang@meta.com>
Pull Request resolved: https://github.com/pytorch/pytorch/pull/101993
Approved by: https://github.com/Skylion007, https://github.com/wconstab
2023-05-23 02:20:10 +00:00
PyTorch MergeBot
dda7ce4bb3 Revert "[core][pruning][be] Rename sparsifier folder to pruner (#98758)"
This reverts commit 778fd1922a.

Reverted https://github.com/pytorch/pytorch/pull/98758 on behalf of https://github.com/jcaip due to https://www.internalfb.com/diff/D44905951 need to fix broken import in fbcode
2023-04-13 16:30:47 +00:00
Jesse Cai
778fd1922a [core][pruning][be] Rename sparsifier folder to pruner (#98758)
Summary:
att

Test Plan:
```bash
python test/test_ao_sparsity.py
```
Pull Request resolved: https://github.com/pytorch/pytorch/pull/98758
Approved by: https://github.com/jerryzh168
2023-04-11 17:26:29 +00:00
Zafar
0e30da3f2f [refactor] Renaming ao.sparsity to ao.pruning (#84867)
`Sparsity` as a term doesn't reflect the tools developed by the AO team. The `torch/ao/sparsity` module also has utilities for structured pruning, which internally we have always referred to as just "pruning". To avoid any confusion, we renamed `Sparsity` to `Prune`. We will not be introducing backwards compatibility, as so far this toolset was kept under silent development.

This change will reflect the changes in the documentation as well.

**TODO:**
- [ ] Change the tutorials
- [ ] Confirm no bc-breakages
- [ ] Reflect the changes in the trackers and RFC docs


Pull Request resolved: https://github.com/pytorch/pytorch/pull/84867
Approved by: https://github.com/supriyar
2022-10-07 00:58:41 +00:00
macandro96
85a9e7367c [ao][sparsity] Store mask as sparse coo during serialization for the activation sparsifier (#82181)
While serializing, the stored mask is dumped as a `torch.sparse_coo` tensor. While restoring the state,
the mask is converted back to a dense tensor.
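The commit works on torch tensors, but the COO round-trip idea can be sketched in plain Python (illustrative only; torch's `to_sparse()`/`to_dense()` handle this for real tensors):

```python
def to_coo(dense):
    """Store only the nonzero entries of a 2-D mask as (indices, values) pairs."""
    indices, values = [], []
    for i, row in enumerate(dense):
        for j, v in enumerate(row):
            if v:
                indices.append((i, j))
                values.append(v)
    return {"shape": (len(dense), len(dense[0])), "indices": indices, "values": values}

def to_dense(coo):
    """Rebuild the dense mask from its COO form, as done when restoring state."""
    rows, cols = coo["shape"]
    dense = [[0] * cols for _ in range(rows)]
    for (i, j), v in zip(coo["indices"], coo["values"]):
        dense[i][j] = v
    return dense

mask = [[1, 0], [0, 1]]
assert to_dense(to_coo(mask)) == mask  # the round-trip preserves the mask
```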

Test Plan:
```bash
python test/test_ao_sparsity.py TestActivationSparsifier
```
Pull Request resolved: https://github.com/pytorch/pytorch/pull/82181
Approved by: https://github.com/z-a-f
2022-07-26 23:01:35 +00:00
macandro96
1ba63e5a56 [ao][sparsity] Serialization support (#80890)
Implemented dumping and loading of state dicts, along with `__getstate__` and `__setstate__` functions.
The hook and layer are removed from the data_groups dictionary before serializing.

In the future, functions might have to be treated differently before serializing. Currently, they are
treated the same as other types while serializing.
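A hedged sketch of the pattern described above, with hypothetical class and field names (the real sparsifier's state layout may differ):

```python
import pickle

class SparsifierSketch:
    """Toy stand-in for the data sparsifier; names here are illustrative."""
    def __init__(self):
        # data_groups maps a name to config plus runtime-only objects
        self.data_groups = {
            "layer0": {"sparsity_level": 0.5, "hook": object(), "layer": object()}
        }

    def __getstate__(self):
        # Strip 'hook' and 'layer' (not serializable) before pickling
        return {
            "data_groups": {
                name: {k: v for k, v in group.items() if k not in ("hook", "layer")}
                for name, group in self.data_groups.items()
            }
        }

    def __setstate__(self, state):
        # Hooks and layers get re-attached later, after loading
        self.__dict__.update(state)

restored = pickle.loads(pickle.dumps(SparsifierSketch()))
assert "hook" not in restored.data_groups["layer0"]
assert restored.data_groups["layer0"]["sparsity_level"] == 0.5
```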

Test Plan:
```bash
python test/test_ao_sparsity.py TestActivationSparsifier
```
Pull Request resolved: https://github.com/pytorch/pytorch/pull/80890
Approved by: https://github.com/z-a-f
2022-07-22 21:57:56 +00:00
macandro96
aa23447904 [ao][sparsity] Implementation of squash_mask() (#80889)
Unregisters the aggregate hook that was applied earlier and registers sparsification hooks.
The sparsification hook applies the mask to the activations before they are fed into the
attached layer.
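The hook swap can be sketched with a toy layer (all names are illustrative; the real code uses `nn.Module` forward-pre-hooks):

```python
class LayerSketch:
    """Toy stand-in for a module with a single pre-forward hook slot."""
    def __init__(self):
        self.pre_hook = None
    def forward(self, x):
        if self.pre_hook is not None:
            x = self.pre_hook(x)
        return [v + 1 for v in x]  # the layer's own computation (arbitrary here)

def aggregate_hook(x):
    return x  # placeholder: the real hook accumulates activations over time

def make_sparsify_hook(mask):
    # Zero out masked activations *before* they enter the layer
    return lambda x: [v if m else 0 for v, m in zip(x, mask)]

layer = LayerSketch()
layer.pre_hook = aggregate_hook                  # registered by register_layer()
layer.pre_hook = make_sparsify_hook([1, 0, 1])   # squash_mask() swaps the hook
assert layer.forward([10, 20, 30]) == [11, 1, 31]
```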

Test Plan:
```bash
python test/test_ao_sparsity.py TestActivationSparsifier
```
Pull Request resolved: https://github.com/pytorch/pytorch/pull/80889
Approved by: https://github.com/z-a-f
2022-07-22 21:54:24 +00:00
macandro96
6b3bf3d6d9 [ao][sparsity] Implementation of step() and update_mask() (#80888)
The step() internally calls the update_mask() function for each layer.
update_mask() applies reduce_fn and mask_fn to compute the sparsification mask.
Note:
    reduce_fn and mask_fn are called for each feature/dim over the data
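The composition described above can be sketched as follows (the reduce and mask functions here are illustrative stand-ins, not library defaults):

```python
def update_mask(aggregated_per_feature, reduce_fn, mask_fn):
    """Apply reduce_fn then mask_fn to each feature's aggregated data."""
    return [mask_fn(reduce_fn(feature_data)) for feature_data in aggregated_per_feature]

# Example: mean-reduce each feature, then keep features whose mean exceeds 1.0
reduce_fn = lambda xs: sum(xs) / len(xs)
mask_fn = lambda r: 1 if r > 1.0 else 0

aggregated = [[0.5, 0.5], [2.0, 4.0]]  # two features, two aggregated steps each
assert update_mask(aggregated, reduce_fn, mask_fn) == [0, 1]
```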

Test Plan:
```bash
python test/test_ao_sparsity.py TestActivationSparsifier
```
Pull Request resolved: https://github.com/pytorch/pytorch/pull/80888
Approved by: https://github.com/z-a-f
2022-07-22 21:50:10 +00:00
macandro96
5fe3a1669c [ao][sparsity] Implementation of register_layer() and get_mask() (#80887)
The register_layer() attaches a pre-forward hook to the layer to aggregate
activations over time. The mask shape is also inferred here.

The get_mask() returns the computed mask associated with the attached layer.
The mask is
    - a torch tensor if features for that layer is None
    - a list of torch tensors, one per feature, otherwise
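A toy sketch of the aggregation hook and mask-shape inference (illustrative names; activations are plain lists rather than tensors):

```python
class ActivationAggregatorSketch:
    """Toy version of the pre-forward aggregation register_layer() sets up."""
    def __init__(self):
        self.aggregated = None
        self.mask = None

    def pre_forward_hook(self, activations):
        if self.aggregated is None:
            # First call: infer the mask shape from the activations
            self.aggregated = list(activations)
            self.mask = [1] * len(activations)  # features is None -> single mask
        else:
            self.aggregated = [a + b for a, b in zip(self.aggregated, activations)]
        return activations

    def get_mask(self):
        return self.mask

agg = ActivationAggregatorSketch()
agg.pre_forward_hook([1.0, 2.0])
agg.pre_forward_hook([3.0, 4.0])
assert agg.aggregated == [4.0, 6.0]  # activations summed over calls
assert agg.get_mask() == [1, 1]      # shape inferred from the first call
```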

Test Plan:
```bash
python test/test_ao_sparsity.py TestActivationSparsifier
```
Pull Request resolved: https://github.com/pytorch/pytorch/pull/80887
Approved by: https://github.com/z-a-f
2022-07-22 21:47:34 +00:00
macandro96
f87d8c2f62 [ao][sparsity] Basic implementation of activation sparsifier (#80886)
The ActivationSparsifier class aims to sparsify/prune activations in a neural
network. The idea is to attach the sparsifier to a layer (or layers) so that it
zeroes out the activations based on the mask_fn (or sparsification function)
supplied by the user.
The mask_fn is applied once all the inputs are aggregated and reduced, i.e.
mask = mask_fn(reduce_fn(aggregate_fn(activations)))

Note:
    The sparsification mask is computed on the input **before it goes through the attached layer**.
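The pipeline in the message can be sketched directly (the aggregate/reduce/mask functions below are illustrative choices, not the library's defaults):

```python
def compute_mask(activation_batches, aggregate_fn, reduce_fn, mask_fn):
    """mask = mask_fn(reduce_fn(aggregate_fn(activations))), as described above."""
    return mask_fn(reduce_fn(aggregate_fn(activation_batches)))

# Illustrative choices:
aggregate_fn = lambda batches: [sum(col) for col in zip(*batches)]  # elementwise sum
reduce_fn = lambda agg: [v / 2 for v in agg]                        # average over 2 batches
mask_fn = lambda red: [1 if v > 1.5 else 0 for v in red]            # threshold

batches = [[1.0, 2.0], [1.0, 4.0]]
assert compute_mask(batches, aggregate_fn, reduce_fn, mask_fn) == [0, 1]
```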

Test Plan:
```bash
python test/test_ao_sparsity.py TestActivationSparsifier
```
Pull Request resolved: https://github.com/pytorch/pytorch/pull/80886
Approved by: https://github.com/HDCharles
2022-07-22 21:43:33 +00:00