Commit Graph

3 Commits

albanD | f06b70a6e9 | 2018-10-30 08:32:26 -07:00
Fix memory leak during packing in tuples (#13305)

Summary:
Verified on Python 3.6 that it fixes #13243.
Pull Request resolved: https://github.com/pytorch/pytorch/pull/13305

Differential Revision: D12838764

Pulled By: soumith

fbshipit-source-id: 206a8b22d1d05e5f156f1db1baaa82358f3eaa83

Zachary DeVito | d985cf46f1 | 2018-04-24 12:30:19 -07:00
Add workaround to fix include warnings in Python 2 builds. (#6716)

gchanan | 067f799e9f | 2017-11-17 15:57:56 -05:00
Implement remaining Variable fallthrough methods via ATen (#3744)

* Use the ATen version of is_signed.
* Define an is_cuda native function and use it for Variable.
* Use ATen dim for Variable dim/ndimension.
* Get rid of the dim and ndimension fallthroughs in variable.py.
* Move the size/stride Variable methods to use ATen.
* Implement the shape property on Variable via ATen.
* Remove the __getattr__ function from Variable.
* Get rid of dispatch functions and avoid casts.
* Add THPUtils_packInt64Array.
* Throw Python errors.
* Use fallthrough and fix fallthrough generation for native functions.
* is_cuda is a property, not a method.
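
Taken together, the items in this commit route Variable introspection (dim/ndimension, size/stride, shape, is_cuda, is_signed) through ATen while keeping the Python-facing behavior. The sketch below is only a sanity check of that public surface, written against current PyTorch, where Variable and Tensor have since been merged; the exact behavior immediately after this commit may differ in detail.

```python
import torch

x = torch.zeros(2, 3)

# dim()/ndimension() and size()/stride() are answered by ATen
assert x.dim() == x.ndimension() == 2
assert x.size() == torch.Size([2, 3])
assert x.stride() == (3, 1)

# shape is a property that mirrors size()
assert x.shape == x.size()

# is_cuda is a property, not a method; is_signed() remains a method
assert x.is_cuda is False
assert x.is_signed()  # float tensors are signed
```

Note that shape and is_cuda are accessed as attributes (no parentheses), while dim(), size(), stride(), and is_signed() remain method calls, matching the last item in the change list above.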