Unverified commit 8318094b, authored by Tao Luo, committed by GitHub

Merge pull request #11626 from ktlichkid/fix-log

Fix log and relu layer
@@ -4920,16 +4920,16 @@ def random_crop(x, shape, seed=None):
     return out


-def log(x):
+def log(input):
     """
     Calculates the natural log of the given input tensor, element-wise.

     .. math::

-        Out = \\ln(x)
+        Out = \\ln(input)

     Args:
-        x (Variable): Input tensor.
+        input (Variable): Input tensor.

     Returns:
         Variable: The natural log of the input tensor computed element-wise.

@@ -4938,7 +4938,7 @@ def log(x):

     .. code-block:: python

-        output = fluid.layers.log(x)
+        output = fluid.layers.log(input)
     """
     helper = LayerHelper('log', **locals())
     dtype = helper.input_dtype(input_param_name='x')

@@ -4947,18 +4947,18 @@ def log(x):
     return out


-def relu(x):
+def relu(input):
     """
     Relu takes one input data (Tensor) and produces one output data (Tensor)
-    where the rectified linear function, y = max(0, x), is applied to
+    where the rectified linear function, y = max(0, input), is applied to
     the tensor elementwise.

     .. math::

-        Out = \\max(0, x)
+        Out = \\max(0, input)

     Args:
-        x (Variable): The input tensor.
+        input (Variable): The input tensor.

     Returns:
         Variable: The output tensor with the same shape as input.

@@ -4967,7 +4967,7 @@ def relu(x):

     .. code-block:: python

-        output = fluid.layers.relu(x)
+        output = fluid.layers.relu(input)
     """
     helper = LayerHelper('relu', **locals())
     dtype = helper.input_dtype(input_param_name='x')
...
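For context, a minimal sketch of how these element-wise layers would be exercised after this change. The data layer name, shape, and Executor setup below are assumptions for illustration, not part of the commit; positional arguments are used so the snippet does not depend on whether the parameter is named `x` or `input`.

    # Illustrative only: a tiny fluid program applying log and relu element-wise.
    import numpy as np
    import paddle.fluid as fluid

    # Input placeholder (name and shape are assumptions for this sketch).
    x = fluid.layers.data(name='x', shape=[3], dtype='float32')
    log_out = fluid.layers.log(x)    # Out = ln(x), element-wise
    relu_out = fluid.layers.relu(x)  # Out = max(0, x), element-wise

    exe = fluid.Executor(fluid.CPUPlace())
    exe.run(fluid.default_startup_program())

    data = np.array([[0.5, 1.0, 2.0]], dtype='float32')
    log_res, relu_res = exe.run(fluid.default_main_program(),
                                feed={'x': data},
                                fetch_list=[log_out, relu_out])
    print(log_res)   # natural log of each element
    print(relu_res)  # each element clipped at zero from below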