pytorch/caffe2/python/layers
Latest commit 33f421027c by Jiyan Yang: Allow recency weight pooling for fp16 (#20506)
Summary:
Pull Request resolved: https://github.com/pytorch/pytorch/pull/20506

as titled

Reviewed By: alex1o1o7cloud

Differential Revision: D15342758

fbshipit-source-id: 89e7cb6d7b9511ef6c70611359736328571d7fc0
2019-05-14 20:13:38 -07:00
__init__.py
adaptive_weight.py
add_bias.py
arc_cosine_feature_map.py
batch_lr_loss.py try to enable uncertainty for lr loss (#17236) 2019-04-11 07:35:19 -07:00
batch_mse_loss.py
batch_normalization.py
batch_sigmoid_cross_entropy_loss.py
batch_softmax_loss.py
blob_weighted_sum.py
bucket_weighted.py Implement bucket-based attention pooling for IdScoreList features (#13004) 2018-10-25 18:04:08 -07:00
build_index.py
concat.py refactor preproc, support dense in TumHistory layer 2018-09-05 16:10:13 -07:00
constant_weight.py
conv.py
dropout.py add dropout during eval (#17549) 2019-02-28 23:21:29 -08:00
fc_without_bias.py
fc.py fc layer accept axis argument (#13822) 2018-11-11 13:44:57 -08:00
feature_sparse_to_dense.py Revert D13551909: [fbcode] logdevice for generic feature type 2019-01-25 00:33:06 -08:00
functional.py
gather_record.py
homotopy_weight.py
label_smooth.py
last_n_window_collector.py
layer_normalization.py Allow use substitute ops for LayerNorm (#12177) 2018-10-11 17:36:10 -07:00
layers.py
margin_rank_loss.py
merge_id_lists.py
pairwise_similarity.py move matrix formation for dot products to precompute/request-only (#10531) 2018-08-15 11:02:10 -07:00
position_weighted.py Remove unused code base for distributed training (#10282) 2018-08-16 20:10:17 -07:00
random_fourier_features.py
reservoir_sampling.py
sampling_train.py
sampling_trainable_mixin.py
select_record_by_context.py
semi_random_features.py
sparse_feature_hash.py
sparse_lookup.py Allow recency weight pooling for fp16 (#20506) 2019-05-14 20:13:38 -07:00
split.py
tags.py Remove unused code base for distributed training (#10282) 2018-08-16 20:10:17 -07:00
uniform_sampling.py