Commit Graph

15 Commits

Author SHA1 Message Date
Pavithran Ramachandran
d9d34922a0 Extend jit::load to work on flatbuffer file (#75022)
Summary:
Pull Request resolved: https://github.com/pytorch/pytorch/pull/75022

Extending `torch::jit::load` to read flatbuffer files
ghstack-source-id: 152820697
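
A minimal sketch of the call site this enables, assuming a model already exported to flatbuffer; the `model.ff` path and the input shape below are placeholders:

```
#include <torch/script.h>  // torch::jit::load, torch::jit::Module

#include <iostream>
#include <vector>

int main() {
  // With this change the same entry point also accepts a flatbuffer file;
  // the call site stays the same as for the zip/pickle format.
  // "model.ff" is a placeholder path.
  torch::jit::Module module = torch::jit::load("model.ff");

  // Placeholder input; the real shape depends on the exported model.
  std::vector<torch::jit::IValue> inputs;
  inputs.emplace_back(torch::ones({1, 3, 224, 224}));

  at::Tensor output = module.forward(inputs).toTensor();
  std::cout << output.sizes() << std::endl;
  return 0;
}
```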

Test Plan: CI

Reviewed By: iseeyuan

Differential Revision: D35060736

fbshipit-source-id: d653a5af662a46107ff4fd70209fd2a0a4d40f20
(cherry picked from commit 109e14a54bd279011c8f9066e6c29e8e0b1fc4db)
2022-04-02 01:33:34 +00:00
Pavithran Ramachandran
7aaa75af05 Extending _get_bytecode_version to support flatbuffers format (#75021)
Summary:
Pull Request resolved: https://github.com/pytorch/pytorch/pull/75021

Extending `_get_bytecode_version` to support flatbuffers.
ghstack-source-id: 152771695
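
For orientation, a hedged C++ sketch of querying the version; the header path and the spelling `_get_model_bytecode_version` are assumptions based on the model-compatibility utilities, not taken from this diff:

```
// Assumed header; the commit itself refers to the function as _get_bytecode_version.
#include <torch/csrc/jit/mobile/model_compatibility.h>

#include <cstdint>
#include <iostream>

int main() {
  // With this change the query works on flatbuffer files as well as the
  // zip/pickle format. "model.ff" is a placeholder path.
  const uint64_t version =
      torch::jit::_get_model_bytecode_version("model.ff");
  std::cout << "bytecode version: " << version << std::endl;
  return 0;
}
```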

(Note: this ignores all push blocking failures!)

Test Plan:
```
[~/fbsource/xplat] cd ~/fbsource/xplat/ && buck test //xplat/caffe2:test_lite_interpreter
Building: finished in 0.8 sec (100%) 327/327 jobs, 0/327 updated
  Total time: 0.9 sec
Testing: finished in 06:59.5 min (85 PASS/0 FAIL)
BUILD SUCCEEDED
RESULTS FOR //xplat/caffe2:test_lite_interpreter
PASS    412.3s 85 Passed   0 Skipped   0 Failed   //xplat/caffe2:test_lite_interpreter
TESTS PASSED
```

Reviewed By: iseeyuan

Differential Revision: D34900498

fbshipit-source-id: 65743076d43a933c5381ec128d0268f22c0a8441
(cherry picked from commit 457c76c7d1df6050b941c56a8198162e2e4a3388)
2022-04-01 15:05:37 +00:00
Pavithran Ramachandran
6905feea1a Adding versions to flatbuffer schema (#74989)
Summary:
Pull Request resolved: https://github.com/pytorch/pytorch/pull/74989

Adding bytecode and operator versions so they are serialized in flatbuffer files
ghstack-source-id: 152720235

(Note: this ignores all push blocking failures!)

Test Plan: CI

Reviewed By: iseeyuan

Differential Revision: D35265693

fbshipit-source-id: f47a21036e82c0df3e787e3f330a8140f9c922fc
(cherry picked from commit fc1d9b8dadaf454109a5c9ae583f283b2550ee4e)
2022-03-31 20:26:16 +00:00
Han Qi
75d6cbe605 [4/5]Testing jit module in flatbuffer in Python. (#74387)
Summary:
Pull Request resolved: https://github.com/pytorch/pytorch/pull/74387

Make temporary Python bindings for flatbuffer to test ScriptModule save/load.

(Note: this ignores all push blocking failures!)

Test Plan: unittest

Reviewed By: iseeyuan

Differential Revision: D34968080

fbshipit-source-id: d23b16abda6e4b7ecf6b1198ed6e00908a3db903
(cherry picked from commit 5cbbc390c5f54146a1c469106ab4a6286c754325)
2022-03-24 23:29:47 +00:00
Han Qi
4b4f652f79 [3/5] Put JIT source inside flatbuffer (#74245)
Summary:
Pull Request resolved: https://github.com/pytorch/pytorch/pull/74245

See title.

Test Plan: unittest

Reviewed By: iseeyuan

Differential Revision: D34881612

fbshipit-source-id: 7037982e9267ad72b86e91cd5f2d92426d71dd56
(cherry picked from commit 88f34eb55b2bee6ef8ef27188e075fa2b8767fdf)
2022-03-17 18:46:47 +00:00
Dave Bort
6c18a9951b [PyTorchEdge] Start writing magic to flatbuffer output (#74084)
Summary:
Pull Request resolved: https://github.com/pytorch/pytorch/pull/74084

Now that the schema includes a magic file header string, write it to the flatbuffer data generated by `flatbuffer_serializer`.
ghstack-source-id: 151109277
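
A small sketch for inspecting that header from C++; it relies only on the standard flatbuffers layout, where a 4-character file identifier follows the 4-byte root offset in a non-size-prefixed buffer, and the file name is a placeholder:

```
#include <fstream>
#include <iostream>
#include <string>

int main() {
  // "model.ff" stands in for a file produced by flatbuffer_serializer.
  std::ifstream in("model.ff", std::ios::binary);
  if (!in) {
    std::cerr << "could not open file" << std::endl;
    return 1;
  }

  // In a standard (non-size-prefixed) flatbuffer, bytes 4..7 hold the
  // 4-character file identifier declared in the schema.
  char header[8] = {};
  in.read(header, sizeof(header));
  std::cout << "magic: " << std::string(header + 4, 4) << std::endl;
  return 0;
}
```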

Test Plan: A later diff in this stack (D34408538) tests that the output data contains the magic header.

Reviewed By: pavithranrao

Differential Revision: D34809318

fbshipit-source-id: edb45d57e56fa4b30675eb9ce6e4e258abfd5417
(cherry picked from commit f5e8a3ff70eba186ac9e7b91739010e55cd6c5a6)
2022-03-14 23:44:58 +00:00
Pavithran Ramachandran
62eb7d64cf [PyTorchEdge] Extend flatbuffer to support extra files map (#72951)
Summary:
Pull Request resolved: https://github.com/pytorch/pytorch/pull/72951

Extend flatbuffer to support extra files map

The flatbuffer schema now has an extra-files field. Users can write extra files by providing a `map<string, string>`, which becomes part of the flatbuffer model asset and can be loaded back, similar to the pickle format.
ghstack-source-id: 149622799
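
For comparison, a hedged sketch of the existing pickle-path extra-files API that this mirrors; the flatbuffer entry points in this stack accept an equivalent map, but their exact signatures are not shown here, and the file names and keys below are placeholders:

```
#include <torch/script.h>  // torch::jit::load, torch::jit::Module, ExtraFilesMap

#include <iostream>

int main() {
  // Attach arbitrary string payloads when saving; any map<string, string>
  // entries work. "metadata.json" is a placeholder key.
  torch::jit::Module module = torch::jit::load("model.pt");
  torch::jit::ExtraFilesMap to_save{{"metadata.json", R"({"version": 1})"}};
  module.save("model_with_extras.pt", to_save);

  // Read the payloads back by passing a map holding the keys to populate.
  torch::jit::ExtraFilesMap loaded{{"metadata.json", ""}};
  torch::jit::load("model_with_extras.pt", c10::nullopt, loaded);
  std::cout << loaded["metadata.json"] << std::endl;
  return 0;
}
```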

Test Plan:
fb:

```
[pavithran@devvm5216.vll0 ~/fbsource/fbcode] cd ~/fbsource/fbcode/ && buck test caffe2/test/cpp/jit:jit -- FlatbufferTest.ExtraFiles
Parsing buck files: finished in 0.7 sec
Downloaded 0/8 artifacts, 0.00 bytes, 100.0% cache miss (for updated rules)
Building: finished in 20.0 sec (100%) 22343/22343 jobs, 4/22343 updated
  Total time: 20.7 sec
More details at https://www.internalfb.com/intern/buck/build/7dba5034-d623-4a1e-afa1-b0e809df7066
BUILD SUCCEEDED
Tpx test run coordinator for Facebook. See https://fburl.com/tpx for details.
Running with tpx session id: 9c1ac1e0-a8c0-4a62-95df-8f49695aa7d1
Trace available for this run at /tmp/tpx-20220216-144630.207992/trace.log
RemoteExecution session id: reSessionID-9c1ac1e0-a8c0-4a62-95df-8f49695aa7d1-tpx
Started reporting to test run: https://www.internalfb.com/intern/testinfra/testrun/7318349470518809
    ✓ ListingSuccess: caffe2/test/cpp/jit:jit : 468 tests discovered (17.211)
    ✓ Pass: caffe2/test/cpp/jit:jit - FlatbufferTest.ExtraFiles (0.169)
Summary
  Pass: 1
  ListingSuccess: 1
If you need help understanding your runs, please follow the wiki: https://fburl.com/posting_in_tpx_users
Finished test run: https://www.internalfb.com/intern/testinfra/testrun/7318349470518809
```

Reviewed By: iseeyuan

Differential Revision: D34286346

fbshipit-source-id: 4e09ab25b8ed6af6f8923db3aab046c255f13bb8
(cherry picked from commit ce8d88e22a360b25253d8a75f428d523fa88a79a)
2022-02-24 19:39:32 +00:00
Han Qi
57f039b41f Fixing few bugs in torch flatbuffer (#72349)
Summary:
Pull Request resolved: https://github.com/pytorch/pytorch/pull/72349

1. Methods invoked via interface calls need to be registered to the class. Previously, all interface calls were inlined, so there was no such problem.
2. parseDoubleList and parseBoolList got swapped during refactoring.

Test Plan:
1. Get ASR's test model at
```
mkdir ~/asr1 && cd ~/asr1
fbpkg fetch speech.tuna.milan.ondevice.en_us
```
2. Convert model:
```
cd ~/fbsource
buck run //xplat/caffe2/fb/lite_predictor:convert_model -- --model=$HOME/asr1/pytorchmodel.pt --output_name=$HOME/asr1/pytorchmodel.ff
```
3. Ran lite_predictor_flatbuffer
```
 buck run //xplat/caffe2/fb/lite_predictor:lite_predictor_flatbuffer -- --model=$HOME/asr1/pytorchmodel.ff --method_to_call=encode_src --method_to_generate_input=get_all_bundled_inputs_for_encode_src

```

See the perf metrics generated (which means loading and inference succeeded).

Reviewed By: gmagogsfm, zhxchen17

Differential Revision: D33959746

fbshipit-source-id: 24671e1189438119f477032eb6c29bd7736e74ca
(cherry picked from commit 5e18809350)
2022-02-05 00:25:27 +00:00
Martin Yuan
1aa2257cac Error message update: use proper name of custom c++ classes (#71922)
Summary:
Pull Request resolved: https://github.com/pytorch/pytorch/pull/71922

Use the proper name in the error message and remove "torchbind", since it is not an official term in the documentation.

Test Plan: Imported from OSS

Reviewed By: cccclai

Differential Revision: D33824899

Pulled By: iseeyuan

fbshipit-source-id: 41968494c04fab39292d9cc4dc6e15cca99cbff4
(cherry picked from commit 9732a52ed2)
2022-01-28 01:43:19 +00:00
Zhengxu Chen
b486797864 [jit][edge] Make flatbuffer_serailzer print correct type strings. (#71935)
Summary:
Pull Request resolved: https://github.com/pytorch/pytorch/pull/71935

The flatbuffer serializer today prints type strings based on the platform; for example, "DynamicType" will be exported if C10_MOBILE is defined. Although this is not the intended behavior, we should export the correct type name to reduce confusion for users.
ghstack-source-id: 147821109

Test Plan:
```
buck run fbcode/mode/dbg //arvr/firmware/silicon/turing:test_torch -c pt.has_backtraces=1 -c turing.min_runtime=1 -c turing.dsp_op=1 -c turing.model_file=test1.ptl

Downloaded 0/66 artifacts, 0.00 bytes, 100.0% cache miss (for updated rules)
Building: finished in 38.2 sec (100%) 345/345 jobs, 36/345 updated
  Total time: 38.2 sec
BUILD SUCCEEDED
Conv:  input [1, 32, 4, 4] residuals [1] weights [4, 4, 1, 1, 2, 32] nlu_params [4, 128] in_ch 32 out_ch 32 groups 1 kernel  stride  padding  upsample 0 op_type 0 act_type 0
--tensor: 0x7ffdd461c6e8
        device: cpu
        is_quantized: 0
        contiguous: 1
        layout: Strided
        dtype: int
        itemsize: 4
        data_ptr: 0x7f781a0a2c10
        dim: 4
        size: [1, 32, 4, 4]
        stride: [512, 16, 4, 1]
dump data/size: 0x7f781a0a2c10/512
        0       00000004
        1       00000004
        2       00000004
        3       00000004
        4       00000004
        5       00000004
        6       00000004
        7       00000004
        8       00000004
        9       00000004
        10      00000004
        11      00000004
        12      00000004
        13      00000004
        14      00000004
        15      00000004
```

Reviewed By: qihqi

Differential Revision: D33826292

fbshipit-source-id: 3c579d89d31fe8d0df5ea6915746aa70da7e3d5c
(cherry picked from commit 9723a84f83)
2022-01-27 23:22:56 +00:00
Han Qi
1bc3571078 [pytorch][PR] Add ability for a mobile::Module to save as flatbuffer (#70201)
Summary:
Pull Request resolved: https://github.com/pytorch/pytorch/pull/70201

Included functions:
save_mobile_module -> saves a mobile::Module to flatbuffer
load_mobile_module_from_file -> loads a flatbuffer into mobile::Module
parse_mobile_module -> parses from bytes or deserialized flatbuffer module object
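
A hedged sketch of how the first two fit together; the header paths and exact signatures below are assumptions, not taken from this diff:

```
// Assumed locations of the serializer/loader declarations.
#include <torch/csrc/jit/serialization/flatbuffer_serializer.h>
#include <torch/csrc/jit/mobile/flatbuffer_loader.h>
#include <torch/csrc/jit/mobile/module.h>

void round_trip(const torch::jit::mobile::Module& m) {
  // Save a mobile::Module to a flatbuffer file ("model.ff" is a placeholder).
  torch::jit::save_mobile_module(m, "model.ff");

  // Load it back into a mobile::Module. parse_mobile_module covers the
  // case where the flatbuffer bytes are already in memory.
  torch::jit::mobile::Module loaded =
      torch::jit::load_mobile_module_from_file("model.ff");
  (void)loaded;
}
```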

Compared to previous attempts, this diff only adds flatbuffer to the CMake targets and leaves the fbcode/xplat ones unchanged.

Test Plan: unittest

Reviewed By: malfet, gmagogsfm

Differential Revision: D33239362

fbshipit-source-id: b9ca36b83d6af2d78cc50b9eb9e2a6fa7fce0763
2022-01-12 16:30:39 -08:00
Yanan Cao
17f3179d60 Back out "[pytorch][PR] Add ability for a mobile::Module to save as flatbuffer" (#69796)
Summary:
Pull Request resolved: https://github.com/pytorch/pytorch/pull/69796

(Note: this ignores all push blocking failures!)

Test Plan: External CI + Sandcastle

Reviewed By: zhxchen17

Differential Revision: D33032671

fbshipit-source-id: dbf6690e960e25d6a5f19043cbe792add2acd7ef
2021-12-10 21:29:53 -08:00
Han Qi
d3649309e6 [pytorch][PR] Add ability for a mobile::Module to save as flatbuffer (#69306)
Summary:
Pull Request resolved: https://github.com/pytorch/pytorch/pull/69306

Included functions:

save_mobile_module -> saves a mobile::Module to flatbuffer
load_mobile_module_from_file -> loads a flatbuffer into mobile::Module
parse_mobile_module -> parses from bytes or deserialized flatbuffer Module object

Test Plan: unittests

Reviewed By: gmagogsfm

Differential Revision: D32806835

fbshipit-source-id: 71913c6650e225634f878946bd16960d377a7f57
2021-12-09 14:53:31 -08:00
Alban Desmaison
00ebbd5ef6 Revert D32010095: [pytorch][PR] Add ability for a mobile::Module to save as flatbuffer
Test Plan: revert-hammer

Differential Revision:
D32010095 (41d35dc201)

Original commit changeset: d763b0557780

fbshipit-source-id: bf746a0389135c9f5f67f00f449435ce08fb5f6d
2021-12-02 06:41:40 -08:00
Han Qi
41d35dc201 Add ability for a mobile::Module to save as flatbuffer (#67351)
Summary:
Included functions:

* save_mobile_module -> saves a mobile::Module to flatbuffer
* load_mobile_module_from_file -> loads a flatbuffer into mobile::Module
* parse_mobile_module -> parses from bytes or deserialized flatbuffer Module object

Fixes #{issue number}

Pull Request resolved: https://github.com/pytorch/pytorch/pull/67351

Reviewed By: iseeyuan

Differential Revision: D32010095

Pulled By: qihqi

fbshipit-source-id: d763b0557780f7c2661b6485105b045e41a5e8f1
2021-12-01 23:58:15 -08:00