diff --git a/develop/doc/api/v1/trainer_config_helpers/layers.html b/develop/doc/api/v1/trainer_config_helpers/layers.html
index 73a36aae31ca1f603adcab478eaf9b65ceb81557..989dee056d66090c54770713b4b925ba2f6dc8eb 100644
--- a/develop/doc/api/v1/trainer_config_helpers/layers.html
+++ b/develop/doc/api/v1/trainer_config_helpers/layers.html
@@ -3488,6 +3488,7 @@ A fast and simple algorithm for training neural probabilistic language models.
  • label (LayerOutput) – label layer
  • weight (LayerOutput) – weight layer, can be None(default)
  • num_classes (int) – number of classes.
+
  • act (BaseActivation) – Activation, default is Sigmoid.
  • num_neg_samples (int) – number of negative samples. Default is 10.
  • neg_distribution (list|tuple|collections.Sequence|None) – The distribution for generating the random negative labels. A uniform distribution will be used if not provided.
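The v1 hunk above lists the nce_layer parameters (label, weight, num_classes, act, num_neg_samples, neg_distribution). As a rough sketch only, assuming the layer is exposed as nce_layer in paddle.trainer_config_helpers with SigmoidActivation() as the activation; the vocabulary size, embedding width, and layer names below are invented for illustration:

    # Sketch of an NCE cost in a PaddlePaddle v1 trainer config.
    # dict_size, the embedding width and the layer names are placeholders.
    from paddle.trainer_config_helpers import *

    dict_size = 10000

    word = data_layer(name='word', size=dict_size)
    label = data_layer(name='label', size=dict_size)
    embed = embedding_layer(input=word, size=128)

    # label, num_classes, act, num_neg_samples and neg_distribution mirror the
    # documented parameter list; neg_distribution=None falls back to uniform
    # sampling, and the optional weight input is omitted (None by default).
    cost = nce_layer(input=embed,
                     label=label,
                     num_classes=dict_size,
                     act=SigmoidActivation(),
                     num_neg_samples=10,
                     neg_distribution=None)

    outputs(cost)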
diff --git a/develop/doc/api/v2/config/layer.html b/develop/doc/api/v2/config/layer.html
index be00e43bbfb6a559f1ccd0ed69ebe08293b47c27..65c26eb16bada9244a9561423da37da86c9f07c1 100644
--- a/develop/doc/api/v2/config/layer.html
+++ b/develop/doc/api/v2/config/layer.html
@@ -4140,6 +4140,7 @@ A fast and simple algorithm for training neural probabilistic language models.
  • label (paddle.v2.config_base.Layer) – label layer
  • weight (paddle.v2.config_base.Layer) – weight layer, can be None(default)
  • num_classes (int) – number of classes.
+
  • act (paddle.v2.Activation.Base) – Activation, default is Sigmoid.
  • num_neg_samples (int) – number of negative samples. Default is 10.
  • neg_distribution (list|tuple|collections.Sequence|None) – The distribution for generating the random negative labels. A uniform distribution will be used if not provided.
diff --git a/develop/doc_cn/api/v1/trainer_config_helpers/layers.html b/develop/doc_cn/api/v1/trainer_config_helpers/layers.html
index b549631786cd8f2ac92b7e122e02268e2e5d51c7..8108ce920c3b382f509facd8323d5d8aced8dfa7 100644
--- a/develop/doc_cn/api/v1/trainer_config_helpers/layers.html
+++ b/develop/doc_cn/api/v1/trainer_config_helpers/layers.html
@@ -3495,6 +3495,7 @@ A fast and simple algorithm for training neural probabilistic language models.
  • label (LayerOutput) – label layer
  • weight (LayerOutput) – weight layer, can be None(default)
  • num_classes (int) – number of classes.
+
  • act (BaseActivation) – Activation, default is Sigmoid.
  • num_neg_samples (int) – number of negative samples. Default is 10.
  • neg_distribution (list|tuple|collections.Sequence|None) – The distribution for generating the random negative labels. A uniform distribution will be used if not provided.
diff --git a/develop/doc_cn/api/v2/config/layer.html b/develop/doc_cn/api/v2/config/layer.html
index a43dff00170fe604440ecd925c1a1212986bb478..0ee55275501115283365a2c3e2779d6ad7c62995 100644
--- a/develop/doc_cn/api/v2/config/layer.html
+++ b/develop/doc_cn/api/v2/config/layer.html
@@ -4147,6 +4147,7 @@ A fast and simple algorithm for training neural probabilistic language models.
  • label (paddle.v2.config_base.Layer) – label layer
  • weight (paddle.v2.config_base.Layer) – weight layer, can be None(default)
  • num_classes (int) – number of classes.
+
  • act (paddle.v2.Activation.Base) – Activation, default is Sigmoid.
  • num_neg_samples (int) – number of negative samples. Default is 10.
  • neg_distribution (list|tuple|collections.Sequence|None) – The distribution for generating the random negative labels. A uniform distribution will be used if not provided.
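The v2 hunks document the same layer through the paddle.v2 API. The sketch below is illustrative only; it assumes the layer is exposed as paddle.v2.layer.nce with paddle.v2.activation.Sigmoid() as the activation, and dict_size, the embedding width, and the data layer names are invented placeholders:

    # Sketch of the same NCE cost with the paddle.v2 API.
    # dict_size, the embedding width and the layer names are placeholders.
    import paddle.v2 as paddle

    paddle.init(use_gpu=False, trainer_count=1)

    dict_size = 10000

    word = paddle.layer.data(
        name='word', type=paddle.data_type.integer_value(dict_size))
    label = paddle.layer.data(
        name='label', type=paddle.data_type.integer_value(dict_size))
    embed = paddle.layer.embedding(input=word, size=128)

    # The keyword arguments follow the parameter list above; leaving
    # neg_distribution as None selects the uniform distribution described there.
    cost = paddle.layer.nce(input=embed,
                            label=label,
                            num_classes=dict_size,
                            act=paddle.activation.Sigmoid(),
                            num_neg_samples=10,
                            neg_distribution=None)

The optional weight input documented above is left out of the sketch since it can be None by default.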