pytorch/caffe2/python/layers
Lin Jiang 1f158adeee Add support for attention weight in SparseLookup (#26748)
Summary:
Support attention-weight input to SparseLookup. In attention sum pooling, if attention weights can be pre-computed before the embedding lookup, they can be passed to SparseLookup and processed by the SparseLengthsWeightedSum op. One example is id_score attention sum pooling.

Essentially, the net is converted from:
  LengthsSum(Mul(Gather(keys, w), att_weight))
to:
  SparseLengthsWeightedSum(keys, w, att_weight)

This unblocks potential efficiency gains in distributed training.
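A minimal numpy sketch (not the actual Caffe2 kernels) of why the two nets compute the same pooled output; the names emb_table, ids, weights, and lengths are illustrative assumptions, not SparseLookup's real blob names:

  import numpy as np

  def gather_mul_lengthssum(emb_table, ids, weights, lengths):
      # Original net: LengthsSum(Mul(Gather(...), att_weight))
      gathered = emb_table[ids]              # Gather: (total_ids, dim)
      weighted = gathered * weights[:, None] # Mul with attention weights
      # LengthsSum: segment-sum rows according to `lengths`
      offsets = np.cumsum(lengths)[:-1]
      return np.array([seg.sum(axis=0) for seg in np.split(weighted, offsets)])

  def sparse_lengths_weighted_sum(emb_table, ids, weights, lengths):
      # Fused form: one SparseLengthsWeightedSum-style pass, no
      # materialized gathered/weighted intermediates
      out = np.zeros((len(lengths), emb_table.shape[1]), dtype=emb_table.dtype)
      pos = 0
      for seg, n in enumerate(lengths):
          for j in range(pos, pos + n):
              out[seg] += weights[j] * emb_table[ids[j]]
          pos += n
      return out

  emb_table = np.random.rand(10, 4).astype(np.float32)
  ids = np.array([1, 3, 5, 2, 2])
  weights = np.array([0.5, 0.2, 0.3, 1.0, 0.7], dtype=np.float32)
  lengths = np.array([3, 2])
  assert np.allclose(gather_mul_lengthssum(emb_table, ids, weights, lengths),
                     sparse_lengths_weighted_sum(emb_table, ids, weights, lengths))

The fused form avoids materializing the gathered and element-wise-multiplied tensors, which is where the efficiency gain comes from.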

Pull Request resolved: https://github.com/pytorch/pytorch/pull/26748

Test Plan: unit test

Reviewed By: chocjy

Differential Revision: D17553345

Pulled By: wheatkit

fbshipit-source-id: 60cc3c4b0bc1eade5459ac598e85286f3849a412
2019-10-08 20:22:25 -07:00
__init__.py
adaptive_weight.py
add_bias.py
arc_cosine_feature_map.py
batch_huber_loss.py Add new regression loss function type to FBLearner (#21080) 2019-06-17 17:43:00 -07:00
batch_lr_loss.py Exponential decay of the weight of task loss (#27508) 2019-10-08 09:15:41 -07:00
batch_mse_loss.py
batch_normalization.py
batch_sigmoid_cross_entropy_loss.py
batch_softmax_loss.py
blob_weighted_sum.py
bpr_loss.py Add BPR loss to TTSN (#24439) 2019-08-15 23:20:15 -07:00
bucket_weighted.py Make hashing default for bucket-weighted pooling (#24266) 2019-08-13 13:56:32 -07:00
build_index.py
concat.py
constant_weight.py
conv.py
dropout.py add dropout during eval (#17549) 2019-02-28 23:21:29 -08:00
fc_without_bias.py
fc.py Integrate FC fp16 exporter into Dper2 (#26582) 2019-09-29 10:19:28 -07:00
feature_sparse_to_dense.py Return list of AccessedFeatures from get_accessed_features (#23983) 2019-08-14 10:50:27 -07:00
functional.py
gather_record.py
homotopy_weight.py
label_smooth.py
last_n_window_collector.py
layer_normalization.py
layers.py Return list of AccessedFeatures from get_accessed_features (#23983) 2019-08-14 10:50:27 -07:00
margin_rank_loss.py
merge_id_lists.py
pairwise_similarity.py
position_weighted.py
random_fourier_features.py
reservoir_sampling.py
sampling_train.py
sampling_trainable_mixin.py
select_record_by_context.py
semi_random_features.py
sparse_dropout_with_replacement.py hook up dropout sparse with replacement operator 2019-07-23 14:34:25 -07:00
sparse_feature_hash.py Refactor and expose metadata of tum_history layer for online prediction 2019-08-15 00:27:11 -07:00
sparse_lookup.py Add support for attention weight in SparseLookup (#26748) 2019-10-08 20:22:25 -07:00
split.py Enable variable size embedding (#25782) 2019-09-09 22:08:32 -07:00
tags.py
uniform_sampling.py