Commit bac4e0d4 authored by A. Unique TensorFlower, committed by TensorFlower Gardener

Fix doc for _DNNLinearCombinedBaseEstimator.

Change: 125575345
Parent 45aa96d0
@@ -80,7 +80,6 @@ class _DNNLinearCombinedBaseEstimator(estimator.BaseEstimator):
   Args:
     model_dir: Directory to save model parameters, graph and etc.
-    n_classes: number of target classes. Default is binary classification.
     weight_column_name: A string defining feature column name representing
       weights. It is used to down weight or boost examples during training. It
       will be multiplied by the loss of the example.
@@ -92,10 +91,10 @@ class _DNNLinearCombinedBaseEstimator(estimator.BaseEstimator):
     dnn_feature_columns: An iterable containing all the feature columns used
       by deep part of the model. All items in the set should be instances of
       classes derived from `FeatureColumn`.
-    dnn_hidden_units: List of hidden units per layer. All layers are fully
-      connected.
     dnn_optimizer: An instance of `tf.Optimizer` used to apply gradients to
       the deep part of the model. If `None`, will use an Adagrad optimizer.
+    dnn_hidden_units: List of hidden units per layer. All layers are fully
+      connected.
     dnn_activation_fn: Activation function applied to each layer. If `None`,
       will use `tf.nn.relu`.
     dnn_dropout: When not None, the probability we will drop out
@@ -485,10 +484,10 @@ class DNNLinearCombinedClassifier(_DNNLinearCombinedBaseEstimator):
     dnn_feature_columns: An iterable containing all the feature columns used
       by deep part of the model. All items in the set must be instances of
       classes derived from `FeatureColumn`.
-    dnn_hidden_units: List of hidden units per layer. All layers are fully
-      connected.
     dnn_optimizer: An instance of `tf.Optimizer` used to apply gradients to
       the deep part of the model. If `None`, will use an Adagrad optimizer.
+    dnn_hidden_units: List of hidden units per layer. All layers are fully
+      connected.
     dnn_activation_fn: Activation function applied to each layer. If `None`,
       will use `tf.nn.relu`.
     dnn_dropout: When not None, the probability we will drop out
@@ -732,10 +731,10 @@ class DNNLinearCombinedRegressor(_DNNLinearCombinedBaseEstimator):
     dnn_feature_columns: An iterable containing all the feature columns used
       by deep part of the model. All items in the set must be instances of
       classes derived from `FeatureColumn`.
-    dnn_hidden_units: List of hidden units per layer. All layers are fully
-      connected.
     dnn_optimizer: An instance of `tf.Optimizer` used to apply gradients to
       the deep part of the model. If `None`, will use an Adagrad optimizer.
+    dnn_hidden_units: List of hidden units per layer. All layers are fully
+      connected.
     dnn_activation_fn: Activation function applied to each layer. If None,
       will use `tf.nn.relu`.
     dnn_dropout: When not None, the probability we will drop out
......
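For context (not part of this commit): the arguments documented above are constructor parameters of the wide-and-deep estimators. Below is a minimal usage sketch, assuming the tf.contrib.learn API of this era; the feature names, optimizer settings, and hyperparameter values are illustrative assumptions, not taken from the change itself.

import tensorflow as tf

# Illustrative feature columns (hypothetical feature names).
occupation = tf.contrib.layers.sparse_column_with_hash_bucket(
    "occupation", hash_bucket_size=1000)
age = tf.contrib.layers.real_valued_column("age")

# The wide (linear) part sees the raw columns; the deep (DNN) part sees dense
# representations, e.g. an embedding of the sparse column.
estimator = tf.contrib.learn.DNNLinearCombinedClassifier(
    model_dir="/tmp/wide_and_deep",   # where parameters and the graph are saved
    n_classes=2,                      # binary classification (the default)
    linear_feature_columns=[occupation, age],
    linear_optimizer=tf.train.FtrlOptimizer(learning_rate=0.1),
    dnn_feature_columns=[
        age,
        tf.contrib.layers.embedding_column(occupation, dimension=8)],
    dnn_optimizer=tf.train.AdagradOptimizer(learning_rate=0.1),  # Adagrad is also the default
    dnn_hidden_units=[100, 50],       # two fully connected hidden layers
    dnn_activation_fn=tf.nn.relu,     # the default when None
    dnn_dropout=0.5)                  # probability of dropping a coordinate during training

DNNLinearCombinedRegressor accepts the same linear_* and dnn_* arguments documented in the last hunk, but has no n_classes.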