Mirror of https://github.com/zebrajr/pytorch.git (synced 2025-12-07 00:21:07 +01:00)
* Don't override Tensor/Storage macros defined outside torch/csrc from within torch/csrc. This PR does the following:
  1) Removes the THSTensor macros in torch/csrc, which aren't used.
  2) For macros defined outside of torch/csrc (THTensor, THTensor_, THStorage, THStorage_):
     a) No longer overrides them; previously, THTensor could actually be THCTensor if a generic file was included from a file that includes THCP.h.
     b) Instead, introduces new THW* macros (e.g. THWTensor) that represent a (potentially empty) wildcard character.
  In addition to making this code easier to read and codemod, this lets us change TH/THC more freely. For example, the THC random code currently casts the state to THByteTensor*; this happens to work only because the macros don't override THByteTensor. But if THByteTensor becomes just an alias of THTensor (which is the plan for a single tensor type), that no longer works. The whole thing was previously a bit of a mess because you really had to understand which macros are redefined and which aren't. We could also rename the macros that live in torch/csrc (e.g. the THPTensor macros), but since those are more self-contained, I punted on that for now.
* Don't change the plugin.
12 lines · 254 B · C++
#ifndef TH_GENERIC_FILE
#define TH_GENERIC_FILE "generic/serialization.h"
#else

template <class io>
void THPStorage_(writeFileRaw)(THWStorage *self, io fd);

template <class io>
THWStorage * THPStorage_(readFileRaw)(io fd, THWStorage *storage);

#endif
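For context on how the THW* wildcard described in the commit message is meant to work, here is a minimal sketch (an illustration based on the commit description, not the actual torch/csrc wrapper headers) of how THWStorage could be mapped onto the concrete backend types so that the declarations in this generic header compile for both CPU and CUDA builds:

// Sketch for illustration only; the real wrapper headers in torch/csrc
// may wire this up differently.
// A CPU-side wrapper header (THP.h-style) could define:
#define THWStorage        THStorage
#define THWStorage_(NAME) THStorage_(NAME)
// ...while the CUDA-side wrapper header (THCP.h-style) would instead define:
// #define THWStorage        THCStorage
// #define THWStorage_(NAME) THCStorage_(NAME)
//
// Generic declarations like the ones in this file then expand to the
// appropriate backend storage type, without ever redefining THStorage
// or THCStorage themselves.

Because generic code refers only to the THW* names, the real TH/THC identifiers are never shadowed, so code that mentions a concrete type such as THByteTensor* keeps meaning exactly what it says regardless of which backend's wrapper header is included.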