.. pytorch/docs/source/nn.attention.experimental.rst
.. Boyuan Feng, commit 68134a320e: [Flex Attention] Paged Attention (#137164)
.. This PR adds paged attention for flex attention.
.. Pull Request resolved: https://github.com/pytorch/pytorch/pull/137164
.. Approved by: https://github.com/drisspg
.. 2024-10-29 17:05:22 +00:00

torch.nn.attention.experimental
===============================
.. currentmodule:: torch.nn.attention.experimental
.. py:module:: torch.nn.attention.experimental
.. warning::
    These APIs are experimental and subject to change without notice.