Commit Graph

17 Commits

Author SHA1 Message Date
Bugra Akyildiz
27c7158166 Remove __future__ imports for legacy Python 2 support (#45033)
Summary:
There is a fixer in `2to3` called `future` which you can target specifically to remove these; the `caffe2` directory has the most of these redundant imports:

```2to3 -f future -w caffe2```
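
For reference, the `future` fixer strips `from __future__ import ...` statements, which are no-ops on Python 3. A minimal before/after sketch (the file name is illustrative):

```python
# some_module.py, before running `2to3 -f future -w`:
from __future__ import absolute_import
from __future__ import division
from __future__ import print_function
from __future__ import unicode_literals

# The four imports above are removed by the fixer; behavior on Python 3
# is unchanged, e.g. true division is already the default:
print(1 / 2)  # 0.5
```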

Pull Request resolved: https://github.com/pytorch/pytorch/pull/45033

Reviewed By: seemethere

Differential Revision: D23808648

Pulled By: bugra

fbshipit-source-id: 38971900f0fe43ab44a9168e57f2307580d36a38
2020-09-23 17:57:02 -07:00
David Gisser
91bdb872ce fix spelling mistake: excpected -> expected
Summary: Pull Request resolved: https://github.com/pytorch/pytorch/pull/28817

Differential Revision: D18544562

Pulled By: dgisser

fbshipit-source-id: 51f728e807f9c4bb30f58585d5b6f436cb880153
2020-01-17 00:11:08 -08:00
Yan Zhu
ac9f0a6884 refactor preproc, support dense in TumHistory layer
Summary: Pull Request resolved: https://github.com/pytorch/pytorch/pull/11131

Reviewed By: xianjiec

Differential Revision: D9358415

fbshipit-source-id: 38bf0e597e22d540d9e985ac8da730f80971d745
2018-09-05 16:10:13 -07:00
Orion Reblitz-Richardson
1d5780d42c Remove Apache headers from source.
* The LICENSE file contains the details, so removing them from individual source files.
2018-03-27 13:10:18 -07:00
Lin Yang
252211b001 testPairwiseDotProduct
Summary: as title.

Reviewed By: kennyhorror

Differential Revision: D6793829

fbshipit-source-id: f803e0400635ca37184f1dd5bb711bfe0e4bea21
2018-01-26 11:33:08 -08:00
Ellie Wen
fc3f88d8a4 higher order interaction of embeddings
Summary:
Get higher-order interactions of embeddings, similar to a cross net but applied at the embedding level.
Formula:
  e_(l+1,i) = element_wise_mul[e_(0,i), \sum_j(e_(l,j) * w_(l,j))] + e_(l,i) + b
where l means the l-th layer of this higher-order net, i means the i-th embedding in the list, and j runs over the embeddings inside the sum.

Finally, concat all the embeddings in the last layer, or concat the sum of each embedding, and attach to the output blob of the dot processor.
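
A minimal NumPy sketch of one such layer (the function name, shapes, and argument layout are assumptions for illustration, not the actual layer code):

```python
import numpy as np

def higher_order_layer(e0, el, w, b):
    """One layer of the interaction above.

    e0: (n, d) original embeddings e_(0,i)
    el: (n, d) layer-l embeddings e_(l,i)
    w:  (n, d) per-embedding weights w_(l,j)
    b:  (d,)   bias
    """
    s = (el * w).sum(axis=0)  # \sum_j e_(l,j) * w_(l,j), shape (d,)
    # element-wise product with each original embedding, plus residual and bias
    return e0 * s + el + b    # shape (n, d)

e0 = np.random.randn(4, 32)
out = higher_order_layer(e0, e0, np.random.randn(4, 32), np.zeros(32))
assert out.shape == (4, 32)
```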

Differential Revision: D6244001

fbshipit-source-id: 96292914158347b79fc1299694d65605999b55e8
2017-11-30 08:51:09 -08:00
Ellie Wen
84b76a0712 fix shape info in concat layer
Summary:
The output shape info was incorrect: e.g., if we have 4 embeddings with dim size 32, the actual shape is (4, 32), but the previous implementation in the concat layer gave us (128, 1). This bug doesn't affect the dot-product calculation because the actual shape of the blob is still (4, 32) in concat_split_op.
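
To make the mismatch concrete, a small NumPy illustration (shapes only; this is not the layer code):

```python
import numpy as np

# Four embeddings of dim 32: the blob the concat op actually produces
embeddings = [np.ones(32) for _ in range(4)]
actual = np.stack(embeddings, axis=0)               # shape (4, 32)
buggy_report = np.concatenate(embeddings)[:, None]  # shape (128, 1) -- what the
                                                    # layer's shape info claimed
assert actual.shape == (4, 32) and buggy_report.shape == (128, 1)
```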

Differential Revision: D6264793

fbshipit-source-id: 82995e83a8c859cbd15617ff7850a35b30b453b6
2017-11-07 21:08:39 -08:00
Yangqing Jia
8286ce1e3a Re-license to Apache
Summary: Closes https://github.com/caffe2/caffe2/pull/1260

Differential Revision: D5906739

Pulled By: Yangqing

fbshipit-source-id: e482ba9ba60b5337d9165f28f7ec68d4518a0902
2017-09-28 16:22:00 -07:00
Jiyan Yang
a8695178aa Adding parameter sharing API to Dper2
Summary:
To achieve this, I modified the blob-name scheme defined in a layer.
Before, it was scope/fc_w and scope/fc_w_auto_0 (if there is another fc within the same scope).
Now I change it to scope/fc/w and scope/fc_auto_0/w.
That is, we rely on the uniqueness of the scoped layer name to define names for blobs.

I also overrode the create_param method in LayerModelHelper to let it use the resolved name for blobs given the parameter-sharing context.

There are some details, such as making the initializer more structured, that I need to finalize.
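
A toy sketch of the new naming rule (the helper and its logic are hypothetical, not from this diff):

```python
def blob_name(scope, layer_name, param, n_prior_layers=0):
    """Derive a parameter blob name from the (unique) scoped layer name.

    Old scheme:  scope/fc_w, scope/fc_w_auto_0   (dedup suffix on the blob)
    New scheme:  scope/fc/w, scope/fc_auto_0/w   (dedup suffix on the layer)
    """
    layer = layer_name if n_prior_layers == 0 \
        else f"{layer_name}_auto_{n_prior_layers - 1}"
    return f"{scope}/{layer}/{param}"

assert blob_name("scope", "fc", "w") == "scope/fc/w"
assert blob_name("scope", "fc", "w", n_prior_layers=1) == "scope/fc_auto_0/w"
```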

Reviewed By: kennyhorror

Differential Revision: D5435132

fbshipit-source-id: a0525f5ea0977e255dd5ea765b38913f5951d455
2017-08-03 00:33:18 -07:00
Bangsheng Tang
e5a7891038 dot product using matmul
Summary:
1. PairwiseDotProduct in layers (a minimal sketch follows below)
2. add_axis argument in Concat and Split (just for backward propagation)
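
A minimal NumPy sketch of the matmul formulation (batch size, shapes, and names are assumptions; the actual layer operates on Caffe2 blobs):

```python
import numpy as np

# x: (batch, num_embeddings, dim); all pairwise dot products come from one matmul
x = np.random.randn(8, 4, 32)
pairwise = np.matmul(x, x.transpose(0, 2, 1))  # (8, 4, 4);
# entry [b, i, j] is the dot product of embeddings i and j in example b
assert pairwise.shape == (8, 4, 4)
```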

Reviewed By: xianjiec

Differential Revision: D5383208

fbshipit-source-id: 8e18ce371fff2da2da77b1a728142d69cd48e9c3
2017-07-17 23:20:37 -07:00
Thomas Dudziak
5355634dac Dict fixes/improvements and unittest targets for Python 3 in caffe2 core
Summary: As title

Reviewed By: salexspb

Differential Revision: D5316104

fbshipit-source-id: aee43819d817842e5ce6ba3d045a55b1a2491c30
2017-06-29 17:05:41 -07:00
Kittipat Virochsiri
ffc6bad116 Concat axis=0
Summary: Previously, the code below would go out of bounds.

Reviewed By: xianjiec

Differential Revision: D4968037

fbshipit-source-id: 3760e2cddc919c45d85ac644ac3fabf72dbaf666
2017-05-01 12:19:34 -07:00
Aaron Markham
58f7f2b441 doxygen python block added
Summary: Closes https://github.com/caffe2/caffe2/pull/226

Differential Revision: D4793550

Pulled By: JoelMarcey

fbshipit-source-id: cc33e58186304fa8dcac2ee9115dcc271d785b1e
2017-03-29 06:46:16 -07:00
Xianjie Chen
e5858485ca small change to concat layer to make TensorBoard vis nicer
Summary:
Otherwise the blob will be in a different namescope, e.g., `_nested`: https://fburl.com/ntlsaezv.
This makes the TensorBoard visualization ugly.

Reviewed By: dzhulgakov

Differential Revision: D4696946

fbshipit-source-id: 73627feccd7c4896964e6c549b7241bcce4f49a7
2017-03-12 23:01:18 -07:00
Xianjie Chen
d0621a2449 NextScopedBlob with well-defined behavior that respects namescope
Summary:
Remove the use of `NextName` in the layer model helper, so that the same function returns a `model_helper` that constructs an identical `Net` when under the same NameScope.

`NextScopedBlob` should only take effect when there is a real name conflict; otherwise it returns a ScopedBlobReference.

This is critical for parameter blobs. In the long run, we need to be able to specify parameter blobs more explicitly (kennyhorror is working on this). This solution works in the short term, e.g., for two-tower sparse NN models.
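
A toy sketch of the described conflict-resolution behavior (names and the _auto_<n> suffix scheme are assumptions drawn from the surrounding commits, not the actual caffe2 implementation):

```python
_taken = set()

def next_scoped_blob(namescope, name):
    """Return '<namescope>/<name>', appending _auto_<n> only on a real conflict."""
    base = f"{namescope}/{name}"
    if base not in _taken:
        _taken.add(base)          # no conflict: behave like a plain
        return base               # ScopedBlobReference
    n = 0
    while f"{base}_auto_{n}" in _taken:  # conflict: pick the next free suffix
        n += 1
    _taken.add(f"{base}_auto_{n}")
    return f"{base}_auto_{n}"

assert next_scoped_blob("scope", "fc") == "scope/fc"
assert next_scoped_blob("scope", "fc") == "scope/fc_auto_0"
```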

Reviewed By: kennyhorror

Differential Revision: D4555423

fbshipit-source-id: 2c4b99a61392e5d51aa878f7346466a8f14be187
2017-02-16 17:16:36 -08:00
Yangqing Jia
589398950f fbsync at f5a877 2016-11-18 15:41:06 -08:00
Yangqing Jia
238ceab825 fbsync. TODO: check if build files need update. 2016-11-15 00:00:46 -08:00