pytorch/caffe2/python/layers
Fei Tian 809ee9d04c Enable personalized FC weight_init and sparse_emb weight_init (#31707)
Summary:
Pull Request resolved: https://github.com/pytorch/pytorch/pull/31707

Change the default initialization for FC weights and for sparse embedding lookups.

The previous default initialization was uniform(-sqrt(1/input_dim), sqrt(1/input_dim)). Now a tunable hyperparameter alpha is passed in, changing it to uniform(-sqrt(alpha/input_dim), sqrt(alpha/input_dim)).
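
As an illustrative sketch only (the function name, signature, and alpha default below are assumptions, not the PR's actual API), the parameterized bound can be computed and sampled like this:

    import numpy as np

    def personalized_uniform_init(input_dim, output_dim, alpha=1.0, seed=None):
        """Sample an FC weight matrix from
        uniform(-sqrt(alpha/input_dim), sqrt(alpha/input_dim)).
        alpha=1.0 reproduces the previous default bound."""
        rng = np.random.default_rng(seed)
        bound = np.sqrt(alpha / input_dim)
        return rng.uniform(-bound, bound, size=(output_dim, input_dim))

    # alpha=1.0 recovers the old default scale; larger alpha widens the range.
    w_default = personalized_uniform_init(input_dim=128, output_dim=64)             # bound ~ 0.088
    w_wider   = personalized_uniform_init(input_dim=128, output_dim=64, alpha=4.0)  # bound ~ 0.177

    # In Caffe2 layer-model terms the same bound would typically be supplied as a
    # weight_init tuple, e.g. ('UniformFill', {'min': -bound, 'max': bound});
    # the exact plumbing depends on each layer's signature.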

Reviewed By: chonglinsun

Differential Revision: D18825615

fbshipit-source-id: 4c5f2e07f2b3f5d642fd96d64dbf68892ebeb30b
2020-01-07 10:10:54 -08:00
__init__.py
adaptive_weight.py
add_bias.py
arc_cosine_feature_map.py
batch_huber_loss.py Add new regression loss function type to FBLearner (#21080) 2019-06-17 17:43:00 -07:00
batch_lr_loss.py Fix typos (#30606) 2019-12-02 20:17:42 -08:00
batch_mse_loss.py Change dper3 loss module to match dper2 (#28265) 2019-10-18 10:08:38 -07:00
batch_normalization.py
batch_sigmoid_cross_entropy_loss.py
batch_softmax_loss.py Linearizable Label: Class Weights, Allow Missing Label, and Average by Batch Size (#29707) 2019-11-13 16:52:27 -08:00
blob_weighted_sum.py
bpr_loss.py Add BPR loss to TTSN (#24439) 2019-08-15 23:20:15 -07:00
bucket_weighted.py add feature name into module and update position weighted to match dper2 2019-10-14 08:06:19 -07:00
build_index.py
concat.py
constant_weight.py
conv.py
dropout.py add dropout during eval (#17549) 2019-02-28 23:21:29 -08:00
fc_with_bootstrap.py Creating new layer FCWithBootstrap used in bootstrapping uncertainty approach (#29152) 2019-11-04 21:18:15 -08:00
fc_without_bias.py Enable personalized FC weight_init and sparse_emb weight_init (#31707) 2020-01-07 10:10:54 -08:00
fc.py Enable personalized FC weight_init and sparse_emb weight_init (#31707) 2020-01-07 10:10:54 -08:00
feature_sparse_to_dense.py Fix typos (#30606) 2019-12-02 20:17:42 -08:00
functional.py
gather_record.py
homotopy_weight.py
label_smooth.py
last_n_window_collector.py
layer_normalization.py
layers.py Return list of AccessedFeatures from get_accessed_features (#23983) 2019-08-14 10:50:27 -07:00
margin_rank_loss.py
merge_id_lists.py
pairwise_similarity.py
position_weighted.py
random_fourier_features.py
reservoir_sampling.py
sampling_train.py
sampling_trainable_mixin.py
select_record_by_context.py
semi_random_features.py
sparse_dropout_with_replacement.py hook up dropout sparse with replacement operator 2019-07-23 14:34:25 -07:00
sparse_feature_hash.py Refactor and expose metadata of tum_history layer for online prediction 2019-08-15 00:27:11 -07:00
sparse_lookup.py Enable personalized FC weight_init and sparse_emb weight_init (#31707) 2020-01-07 10:10:54 -08:00
split.py Enable variable size embedding (#25782) 2019-09-09 22:08:32 -07:00
tags.py
uniform_sampling.py