pytorch/torch/csrc/jit/api
Scott Wolchok e88d1c4f10 [PyTorch] Add tuple inline storage (#64066)
Summary:
Pull Request resolved: https://github.com/pytorch/pytorch/pull/64066

I noticed a bunch of time being spent heap-allocating Tuples
in the unpickler. 1-, 2-, and 3-element Tuples are apparently common
enough that they get their own bytecode instructions, so I decided to
try also giving them their own representation. We store up to 3
IValues inline in `Tuple` rather than doing a second heap allocation
for a `std::vector<IValue>`.
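
This is essentially a small-buffer optimization on the element storage. The sketch below shows the general shape of such a layout; it is illustrative only, with made-up names (`SmallTupleStorage`, `Value`) rather than the actual `TupleElements` type this diff adds, and it omits the copy/move support a real implementation needs:

```cpp
#include <cstddef>
#include <new>
#include <utility>
#include <vector>

// Stand-in for IValue so the sketch compiles on its own.
struct Value {
  int payload = 0;
};

// Hypothetical small-buffer storage for tuple elements: up to kInline
// elements live directly inside the object; larger tuples fall back to a
// heap-allocated std::vector stored in the same union slot.
class SmallTupleStorage {
 public:
  static constexpr std::size_t kInline = 3;

  // Fast path: a 2-element tuple is built straight into the inline buffer,
  // so the only allocation is the one for the object itself.
  SmallTupleStorage(Value a, Value b) : inlineSize_(2), usesInline_(true) {
    new (&inline_[0]) Value(std::move(a));
    new (&inline_[1]) Value(std::move(b));
  }

  // General path: small inputs are moved into the inline buffer, anything
  // larger keeps the vector.
  explicit SmallTupleStorage(std::vector<Value> elems) {
    if (elems.size() <= kInline) {
      usesInline_ = true;
      inlineSize_ = elems.size();
      for (std::size_t i = 0; i < inlineSize_; ++i) {
        new (&inline_[i]) Value(std::move(elems[i]));
      }
    } else {
      usesInline_ = false;
      new (&vector_) std::vector<Value>(std::move(elems));
    }
  }

  ~SmallTupleStorage() {
    if (usesInline_) {
      for (std::size_t i = 0; i < inlineSize_; ++i) {
        inline_[i].~Value();
      }
    } else {
      vector_.~vector();
    }
  }

  // Copy and move are deleted to keep the sketch short; a real
  // implementation must define them to respect the active union member.
  SmallTupleStorage(const SmallTupleStorage&) = delete;
  SmallTupleStorage& operator=(const SmallTupleStorage&) = delete;

  std::size_t size() const {
    return usesInline_ ? inlineSize_ : vector_.size();
  }

  const Value& operator[](std::size_t i) const {
    return usesInline_ ? inline_[i] : vector_[i];
  }

 private:
  union {
    Value inline_[kInline];      // active when usesInline_ is true
    std::vector<Value> vector_;  // active when usesInline_ is false
  };
  std::size_t inlineSize_ = 0;   // element count in the inline case
  bool usesInline_ = true;
};
```

With this layout, a 1- to 3-element tuple costs a single allocation for the tuple object itself; only larger tuples still pay for the separate vector buffer. The trade-off is that every element access first branches on which union member is active.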
ghstack-source-id: 140695395

Test Plan:
Added automated tests for TupleElements.

Pixel 3 before: https://www.internalfb.com/intern/aibench/details/761596366576284
Pixel 3 after: https://www.internalfb.com/intern/aibench/details/591414145082422
We went from 347 ms to 302 ms, about a 13% reduction.

Reviewed By: dhruvbird

Differential Revision: D30592622

fbshipit-source-id: 93625c54c9dca5f765ef6d5c191944179cb281a8
2021-10-15 12:16:51 -07:00
File                  Last commit                                       Last commit date
compilation_unit.h
function_impl.cpp     irange-ify 2 (#62113)                             2021-07-26 12:00:52 -07:00
function_impl.h       Make pytorch clang-tidy clean (#60649)            2021-07-01 12:21:07 -07:00
method.h              Make name() part of IMethod interface (#63995)    2021-08-30 13:31:55 -07:00
module_save.cpp
module.cpp            [PyTorch] Add tuple inline storage (#64066)       2021-10-15 12:16:51 -07:00
module.h              [jit] Reduce refcounting of Types (#65345)        2021-10-08 09:03:04 -07:00
object.cpp
object.h              [jit] Reduce refcounting of Types (#65345)        2021-10-08 09:03:04 -07:00