Unverified commit 567dabeb, authored by cyberslack_lee, committed by GitHub

Update COPY-FROM No.18 (#54842)

Parent commit: 3138e7aa
@@ -152,6 +152,12 @@ def monkey_patch_math_tensor():
     def _ndim_(var):
         return len(var.shape)
 
+    def ndimension(var):
+        return len(var.shape)
+
+    def dim(var):
+        return len(var.shape)
+
     @property
     def _size_(var):
         return int(np.prod(var.shape))

@@ -174,8 +180,8 @@ def monkey_patch_math_tensor():
         ('__len__', _len_),
         ('__index__', _index_),
         ('astype', astype),
-        ('dim', lambda x: len(x.shape)),
-        ('ndimension', lambda x: len(x.shape)),
+        ('dim', dim),
+        ('ndimension', ndimension),
         ('ndim', _ndim_),
         ('size', _size_),
         ('T', _T_),
......
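With the lambdas replaced by the named helpers above, the patched methods still just report the tensor rank. A minimal dygraph sketch of the resulting behaviour (assuming a recent Paddle build; the tensor name x is illustrative):

    import paddle

    # In dynamic graph mode every Tensor gets the patched helpers, all of
    # which return len(tensor.shape).
    x = paddle.zeros([3, 2, 1])
    print(x.dim())          # 3
    print(x.ndimension())   # 3
    print(x.ndim)           # 3, exposed as a property via _ndim_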
@@ -1260,6 +1260,7 @@ class Variable(metaclass=VariableMetaClass):
         In Static Graph Mode:
 
         .. code-block:: python
+            :name: code-example-1
 
             import paddle.fluid as fluid
             cur_program = fluid.Program()

@@ -1271,6 +1272,7 @@ class Variable(metaclass=VariableMetaClass):
         In Dygraph Mode:
 
         .. code-block:: python
+            :name: code-example-2
 
             import paddle.fluid as fluid
             import numpy as np
@@ -5743,21 +5745,22 @@ class Program:
         use :code:`clone` after :code:`Optimizer.minimize`, but we still
         recommend you to use :code:`clone` before using :code:`Optimizer.minimize`.
 
-        For Example:
-          ::
+        Examples:
+            .. code-block:: python
+                :name: code-example-1
 
                 import paddle
                 import paddle.static as static
 
                 paddle.enable_static()
 
                 img = static.data(name='image', shape=[None, 784])
                 pred = static.nn.fc(x=img, size=10, activation='relu')
                 loss = paddle.mean(pred)
                 # Here we use clone before Momentum
                 test_program = static.default_main_program().clone(for_test=True)
                 optimizer = paddle.optimizer.Momentum(learning_rate=0.01, momentum=0.9)
                 optimizer.minimize(loss)
 
     Args:
@@ -5778,6 +5781,7 @@ class Program:
         after :code:`clone`:
 
         .. code-block:: python
+            :name: code-example-2
 
             import paddle

@@ -5795,6 +5799,7 @@ class Program:
         1. To clone a test program, the sample code is:
 
         .. code-block:: python
+            :name: code-example-3
 
             import paddle
             import paddle.static as static
@@ -5847,6 +5852,7 @@ class Program:
         2. The clone method can be avoided if you create the program for training and the program for testing individually.
 
         .. code-block:: python
+            :name: code-example-4
 
             import paddle
             import paddle.static as static
@@ -7235,30 +7241,32 @@ def program_guard(main_program, startup_program=None):
             Default: None.
 
     Examples:
         .. code-block:: python
+            :name: code-example-1
 
             import paddle
 
             paddle.enable_static()
             main_program = paddle.static.Program()
             startup_program = paddle.static.Program()
             with paddle.static.program_guard(main_program, startup_program):
                 data = paddle.static.data(name='image', shape=[None, 784, 784], dtype='float32')
                 hidden = paddle.static.nn.fc(x=data, size=10, activation='relu')
 
     Notes: The temporary :code:`Program` can be used if the user does not need
     to construct either the startup program or the main program.
 
     Examples:
         .. code-block:: python
+            :name: code-example-2
 
             import paddle
 
             paddle.enable_static()
             main_program = paddle.static.Program()
             # does not care about the startup program. Just pass a temporary value.
             with paddle.static.program_guard(main_program, paddle.static.Program()):
                 data = paddle.static.data(name='image', shape=[None, 784, 784], dtype='float32')
     """
 
     from .data_feeder import check_type
......
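The clone and program_guard docstrings above only build graphs. A minimal end-to-end sketch of the recommended order (clone before Optimizer.minimize, then run both programs) is given below; it assumes a CPU build of Paddle, and the names main, startup, and exe are illustrative.

    import numpy as np
    import paddle
    import paddle.static as static

    paddle.enable_static()

    main = static.Program()
    startup = static.Program()
    with static.program_guard(main, startup):
        img = static.data(name='image', shape=[None, 784], dtype='float32')
        pred = static.nn.fc(x=img, size=10, activation='relu')
        loss = paddle.mean(pred)
        # Clone before minimize so the test program carries no optimizer ops.
        test_program = main.clone(for_test=True)
        opt = paddle.optimizer.Momentum(learning_rate=0.01, momentum=0.9)
        opt.minimize(loss)

    exe = static.Executor(paddle.CPUPlace())
    exe.run(startup)
    feed = {'image': np.random.rand(4, 784).astype('float32')}
    train_loss, = exe.run(main, feed=feed, fetch_list=[loss])
    test_loss, = exe.run(test_program, feed=feed, fetch_list=[loss])
    print(train_loss, test_loss)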
@@ -323,6 +323,48 @@ def monkey_patch_variable():
         """
         return len(self.shape)
 
+    def ndimension(self):
+        """
+        Returns the dimension of the current Variable.
+
+        Returns:
+            the dimension
+
+        Examples:
+            .. code-block:: python
+
+                import paddle
+
+                paddle.enable_static()
+
+                # create a static Variable
+                x = paddle.static.data(name='x', shape=[3, 2, 1])
+
+                # print the dimension of the Variable
+                print(x.ndimension())
+        """
+        return len(self.shape)
+
+    def dim(self):
+        """
+        Returns the dimension of the current Variable.
+
+        Returns:
+            the dimension
+
+        Examples:
+            .. code-block:: python
+
+                import paddle
+
+                paddle.enable_static()
+
+                # create a static Variable
+                x = paddle.static.data(name='x', shape=[3, 2, 1])
+
+                # print the dimension of the Variable
+                print(x.dim())
+        """
+        return len(self.shape)
+
     def _scalar_add_(var, value):
         return _scalar_op_(var, 1.0, value)
@@ -509,8 +551,8 @@ def monkey_patch_variable():
         ('append', append),
         ('item', _item),
         ('pop', pop),
-        ('dim', lambda x: len(x.shape)),
-        ('ndimension', lambda x: len(x.shape)),
+        ('dim', dim),
+        ('ndimension', ndimension),
         ('ndim', _ndim_),
         (
             '__add__',
......
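The docstrings added above only show paddle.static.data with a fully specified shape. A minimal sketch (assuming a recent Paddle build) of how the patched methods count a batch axis declared as None:

    import paddle
    import paddle.static as static

    paddle.enable_static()

    # A None batch axis is stored as -1, so it still counts as one dimension.
    x = static.data(name='x', shape=[None, 784], dtype='float32')
    print(len(x.shape))     # 2
    print(x.dim())          # 2
    print(x.ndimension())   # 2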
@@ -896,16 +896,18 @@ def cond(pred, true_fn=None, false_fn=None, name=None, return_names=None):
     3. If it is in static graph mode, any tensors or operations created outside
        or inside of ``true_fn`` and ``false_fn`` will be in net building
        regardless of which branch is selected at runtime. This has frequently
-       surprised users who expected lazy semantics. For example:
+       surprised users who expected lazy semantics.
 
-       .. code-block:: python
+    Examples:
+        .. code-block:: python
+            :name: code-example-1
 
             import paddle
 
             a = paddle.zeros((1, 1))
             b = paddle.zeros((1, 1))
             c = a * b
             out = paddle.static.nn.cond(a < b, lambda: a + c, lambda: b * b)
 
     No matter whether ``a < b`` holds, ``c = a * b`` will be in net building and
     run. ``a + c`` and ``b * b`` will be in net building, but only one
 
@@ -933,6 +935,7 @@ def cond(pred, true_fn=None, false_fn=None, name=None, return_names=None):
         Examples:
             .. code-block:: python
+                :name: code-example-2
 
                 import paddle
......
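As a complement to the note on eager net building, here is a minimal static-graph sketch (assuming a CPU build of Paddle; the data names x and y are illustrative) showing that both branches are added to the program while only the selected result is fetched at run time:

    import numpy as np
    import paddle
    import paddle.static as static

    paddle.enable_static()

    main = static.Program()
    startup = static.Program()
    with static.program_guard(main, startup):
        x = static.data(name='x', shape=[1], dtype='float32')
        y = static.data(name='y', shape=[1], dtype='float32')
        # Both lambdas are traced while the program is built; at run time only
        # the branch selected by the single-element boolean pred is executed.
        out = paddle.static.nn.cond(x < y, lambda: x + y, lambda: x - y)

    exe = static.Executor(paddle.CPUPlace())
    exe.run(startup)
    res, = exe.run(main,
                   feed={'x': np.array([1.0], dtype='float32'),
                         'y': np.array([2.0], dtype='float32')},
                   fetch_list=[out])
    print(res)  # [3.] because x < y selects the true branch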