mirror of https://github.com/zebrajr/pytorch.git, synced 2025-12-07 12:21:27 +01:00
# Motivation
Fixes https://github.com/pytorch/pytorch/issues/135726.

After merging two free blocks, the active memory stat was decreased by the wrong amount: it should be decreased by the original block size, not the merged block size.

# Additional Context
Add a UT to guard this scenario.

Pull Request resolved: https://github.com/pytorch/pytorch/pull/135818
Approved by: https://github.com/EikanWang
| Name |
|---|
| .. |
| impl |
| test |
| CMakeLists.txt |
| XPUCachingAllocator.cpp |
| XPUCachingAllocator.h |
| XPUDeviceProp.h |
| XPUException.h |
| XPUFunctions.cpp |
| XPUFunctions.h |
| XPUMacros.h |
| XPUStream.cpp |
| XPUStream.h |