Commit 2bc812d8 authored by Chris Yann, committed by Qingsheng Li

Fix relu and log function by changing input parameter name from 'input' to 'x' (#11683)

* Fix relu and log

* Update nn.py
Parent acfd177d
@@ -4920,16 +4920,16 @@ def random_crop(x, shape, seed=None):
return out
def log(input):
def log(x):
"""
Calculates the natural log of the given input tensor, element-wise.
.. math::
Out = \\ln(input)
Out = \\ln(x)
Args:
input (Variable): Input tensor.
x (Variable): Input tensor.
Returns:
Variable: The natural log of the input tensor computed element-wise.
@@ -4938,7 +4938,7 @@ def log(input):
.. code-block:: python
output = fluid.layers.log(input)
output = fluid.layers.log(x)
"""
helper = LayerHelper('log', **locals())
dtype = helper.input_dtype(input_param_name='x')
@@ -4947,18 +4947,18 @@ def log(input):
return out
def relu(input):
def relu(x):
"""
Relu takes one input data (Tensor) and produces one output data (Tensor)
where the rectified linear function, y = max(0, input), is applied to
where the rectified linear function, y = max(0, x), is applied to
the tensor elementwise.
.. math::
Out = \\max(0, input)
Out = \\max(0, x)
Args:
input (Variable): The input tensor.
x (Variable): The input tensor.
Returns:
Variable: The output tensor with the same shape as input.
@@ -4967,7 +4967,7 @@ def relu(input):
.. code-block:: python
output = fluid.layers.relu(input)
output = fluid.layers.relu(x)
"""
helper = LayerHelper('relu', **locals())
dtype = helper.input_dtype(input_param_name='x')
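
Below is a minimal usage sketch of the renamed parameter, assuming a Fluid build that includes this patch; the `data` variable, its name, and its shape are illustrative only and not part of the commit.

import paddle.fluid as fluid

# Hypothetical input variable, built with fluid.layers.data for illustration.
data = fluid.layers.data(name="data", shape=[32], dtype="float32")

# After this change, both layers take their input under the keyword `x`
# instead of `input`; positional calls keep working unchanged.
log_out = fluid.layers.log(x=data)
relu_out = fluid.layers.relu(x=data)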