# Design Doc: Block and Scope
## The Representation of Computation
Both deep learning systems and programming languages help users describe computation procedures. These systems use various representations of computation:
- Caffe, Torch, and Paddle: sequences of layers.
- TensorFlow, Caffe2, MXNet: graphs of operators.
- PaddlePaddle: nested blocks, like C++ and Java programs.
## Block in Programming Languages and Deep Learning
In programming languages, a block is a pair of curly braces that includes local variable definitions and a sequence of instructions, or operators.
Blocks work with control flow structures like `if`, `else`, and `for`, which have equivalents in deep learning:
| programming languages | PaddlePaddle |
|-----------------------|-----------------------|
| for, while loop | RNN, WhileOp |
| if, if-else, switch | IfElseOp, SwitchOp |
| sequential execution | a sequence of layers |
A key difference is that a C++ program describes a one-pass computation, whereas a deep learning program describes both the forward and backward passes.
## Stack Frames and the Scope Hierarchy
The existence of the backward pass makes the execution of a block in PaddlePaddle different from that in traditional programs:
| programming languages | PaddlePaddle |
|-----------------------|-------------------------------|
| stack | scope hierarchy |
| stack frame | scope |
| push at entering block| push at entering block |
| pop at leaving block  | destroy when minibatch completes |
1. In traditional programs:
   - When the execution enters the left curly brace of a block, the runtime pushes a frame onto the stack, where it realizes local variables.
   - After the execution leaves the right curly brace, the runtime pops the frame.
   - The maximum number of frames in the stack is the maximum depth of nested blocks.
1. In PaddlePaddle:
   - When the execution enters a block, PaddlePaddle adds a new scope, where it realizes variables.
   - PaddlePaddle doesn't pop a scope after the execution of the block, because variables therein are needed by the backward pass. So it has a stack forest known as a *scope hierarchy*.
   - The height of the highest tree is the maximum depth of nested blocks.
   - After processing a minibatch, PaddlePaddle destroys the scope hierarchy. A minimal sketch of such a scope data structure follows this list.
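
To make the scope hierarchy concrete, the following is a minimal sketch of such a scope data structure. It is illustrative only and does not claim to be the actual `framework::Scope` interface; every name in it is an assumption.

```c++
#include <map>
#include <memory>
#include <string>
#include <vector>

// Placeholder for whatever a variable holds at runtime.
struct Variable {};

// A scope keeps the variables realized in one block plus a pointer to the
// scope of the enclosing block. Child scopes are owned by their parent, so
// the whole hierarchy stays alive until the minibatch finishes.
class Scope {
 public:
  explicit Scope(Scope* parent = nullptr) : parent_(parent) {}

  // Entering a nested block adds a child scope; unlike a stack frame, it is
  // not popped when the block finishes.
  Scope& NewChildScope() {
    children_.emplace_back(new Scope(this));
    return *children_.back();
  }

  Variable* NewVar(const std::string& name) { return &vars_[name]; }

  // Look a name up locally first, then in the enclosing scopes.
  Variable* FindVar(const std::string& name) {
    auto it = vars_.find(name);
    if (it != vars_.end()) return &it->second;
    return parent_ ? parent_->FindVar(name) : nullptr;
  }

 private:
  Scope* parent_;
  std::map<std::string, Variable> vars_;
  std::vector<std::unique_ptr<Scope>> children_;
};
```

Destroying the root scope at the end of a minibatch releases the whole hierarchy at once, which is the "destroy when minibatch completes" entry in the table above.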
## Use Blocks in C++ and PaddlePaddle Programs
Let us consolidate the discussion by presenting some examples.
### Blocks with `if-else` and `IfElseOp`
The following C++ program shows how blocks are used with the `if-else` structure:
```c++
int x = 10;
int y = 20;
int out;
bool cond = false;
if (cond) {
  int z = x + y;
  out = softmax(z);
} else {
  int z = fc(x);
  out = z;
}
```
An equivalent PaddlePaddle program from the design doc of the [IfElseOp operator](./if_else_op.md) is as follows:
```python
import paddle as pd
x = var(10)
y = var(20)
cond = var(false)
ie = pd.create_ifelseop(inputs=[x], output_num=1)
with ie.true_block():
    x = ie.inputs(true, 0)
    z = operator.add(x, y)
    ie.set_output(true, 0, operator.softmax(z))
with ie.false_block():
    x = ie.inputs(false, 0)
    z = layer.fc(x)
    ie.set_output(false, 0, z)
out = ie(cond)
```
In both examples, the left branch computes `softmax(x+y)` and the right branch computes `fc(x)`.
A difference is that variables in the C++ program contain scalar values, whereas those in the PaddlePaddle program are mini-batches of instances. The `ie.inputs(true, 0)` invocation returns, as the local variable `x`, the instances in the 0-th input, `x`, that correspond to true values in `cond`, whereas `ie.inputs(false, 0)` returns the instances corresponding to false values.
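
To illustrate this gather semantics with a minimal sketch, assume a mini-batch is simply a vector of rows and `cond` holds one boolean per row; the real `IfElseOp` operates on richer tensor types, so the helper below is purely hypothetical.

```c++
#include <vector>

// ie.inputs(true, 0) behaves like Gather(x, cond, true): it selects the rows
// of the 0-th input whose condition is true. ie.inputs(false, 0) behaves like
// Gather(x, cond, false).
template <typename Row>
std::vector<Row> Gather(const std::vector<Row>& x,
                        const std::vector<bool>& cond,
                        bool branch) {
  std::vector<Row> selected;
  for (size_t i = 0; i < x.size(); ++i) {
    if (cond[i] == branch) selected.push_back(x[i]);
  }
  return selected;
}
```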
### Blocks with `for` and `RNNOp`
The following RNN model from the [RNN design doc](./rnn.md)
```python
x = sequence([10, 20, 30])
m = var(0)
W = tensor()
U = tensor()
rnn = create_rnn(inputs=[x])
with rnn.stepnet() as net:
    x = net.set_inputs(0)
    h = net.add_memory(init=m)
    fc_out = pd.matmul(W, x)
    hidden_out = pd.matmul(U, h.pre(n=1))
    sum = pd.add_two(fc_out, hidden_out)
    act = pd.sigmoid(sum)
    h.update(act)  # update memory with act
    net.set_outputs(0, act, hidden_out)  # two outputs
o1, o2 = rnn()
print(o1, o2)
```
has the following equivalent C++ program:
```c++
int x[] = {10, 20, 30};
int m = 0;
int W = some_value();
int U = some_other_value();
int mem[sizeof(x) / sizeof(x[0]) + 1];
int o1[sizeof(x) / sizeof(x[0]) + 1];
int o2[sizeof(x) / sizeof(x[0]) + 1];
for (int i = 1; i <= sizeof(x) / sizeof(x[0]); ++i) {
  int step_in = x[i - 1];  // the step input; renamed to avoid shadowing the array x
  if (i == 1) mem[0] = m;
  int fc_out = W * step_in;
  int hidden_out = U * mem[i - 1];
  int sum = fc_out + hidden_out;
  int act = sigmoid(sum);
  mem[i] = act;
  o1[i] = act;
  o2[i] = hidden_out;
}
print_array(o1);
print_array(o2);
```
## Compilation and Execution
Like TensorFlow programs, a PaddlePaddle program is written in Python. The first part describes a neural network as a protobuf message, and the second part executes the message for training or inference.
The generation of this protobuf message is like how a compiler generates a binary executable file. The execution of the message is like how the OS executes the binary file.
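
As a minimal sketch of the execution half of this analogy, and assuming only the standard protobuf C++ API plus the `BlockDesc` message defined below (the file name and header are placeholders), the "OS" side could load the "binary" like this:

```c++
#include <fstream>
#include <string>

#include "framework.pb.h"  // assumed location of the generated BlockDesc

// Deserialize the BlockDesc that the Python front end wrote out. A Block
// object (sketched later in this doc) would then InferShape and Run it.
BlockDesc LoadProgram(const std::string& filename) {
  BlockDesc program;
  std::ifstream input(filename, std::ios::binary);
  program.ParseFromIstream(&input);  // standard protobuf deserialization
  return program;
}
```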
## The "Binary Executable File Format"
The definition of the protobuf message is as follows:
```protobuf
message BlockDesc {
  repeated VarDesc vars = 1;
  repeated OpDesc ops = 2;
}
```
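
With the generated C++ API for this message, the front end (or a unit test) could assemble a block programmatically. The sketch below assumes nothing about the fields of `VarDesc` and `OpDesc`, which are not spelled out in this doc, and the header name is an assumption.

```c++
#include <string>

#include "framework.pb.h"  // assumed location of the generated messages

// Build a BlockDesc with one variable and one operator description, then
// serialize it into the "binary executable file format". add_vars()/add_ops()
// are the accessors protoc generates for the repeated fields above.
std::string BuildAndSerializeBlock() {
  BlockDesc block;
  VarDesc* var = block.add_vars();  // e.g. the description of `x`
  OpDesc* op = block.add_ops();     // e.g. the description of `matmul`
  (void)var;
  (void)op;                         // field population intentionally omitted
  return block.SerializeAsString();
}
```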
The step net in the above RNN example would look like:
```
BlockDesc {
  vars = {
    VarDesc {...} // x
    VarDesc {...} // h
    VarDesc {...} // fc_out
    VarDesc {...} // hidden_out
    VarDesc {...} // sum
    VarDesc {...} // act
  }
  ops = {
    OpDesc {...} // matmul
    OpDesc {...} // add_two
    OpDesc {...} // sigmoid
  }
};
```
Also, the RNN operator in the above example is serialized into a protobuf message of type `OpDesc` and would look like:
```
OpDesc {
  inputs = {0}      // the index of x
  outputs = {5, 3}  // indices of act and hidden_out
  attrs {
    "memories" : {1}  // the index of h
    "step_net" : <above step net>
  }
};
```
This `OpDesc` value is in the `ops` field of the `BlockDesc` value representing the global block.
## The Compilation of Blocks
During the generation of the Protobuf message, the Block should store VarDesc (the Protobuf message that describes a Variable) and OpDesc (the Protobuf message that describes an Operator).
A VarDesc in a block should have its own name scope, to prevent local variables from affecting the parent block's name scope.
A child block's name scope should inherit from its parent's, so that an OpDesc in the child block can reference a VarDesc stored in the parent block. For example:
```python
a = pd.Variable(shape=[20, 20])
b = pd.fc(a, params=["fc.w", "fc.b"])
rnn = pd.create_rnn()
with rnn.stepnet() as net:
    x = net.set_inputs(a)
    # reuse fc's parameter
    fc_without_b = pd.get_variable("fc.w")
    net.set_outputs(fc_without_b)
out = rnn()
```
The method `pd.get_variable` retrieves a Variable by name. A Variable may be stored in a parent block but retrieved in a child block, so a block should have a variable scope that supports inheritance.
In compiler design, the symbol table is a data structure created and maintained by compilers to store information about the occurrence of various entities such as variable names, function names, classes, etc.
To store the definitions of variables and operators, we define a C++ class `SymbolTable`, like the one used in compilers.
`SymbolTable` can do the following:
- store the definitions (names and attributes) of variables and operators;
- verify whether a variable was declared;
- make it possible to implement type checking (offer Protobuf message pointers to `InferShape` handlers).
```c++
// Information in SymbolTable is enough to trace the dependency graph. So maybe
// the Eval() interface taking a SymbolTable is enough.
class SymbolTable {
 public:
  SymbolTable(SymbolTable* parent) : parent_(parent) {}

  OpDesc* NewOp(const string& name = "");

  // TODO: determine whether the name is generated by Python or C++.
  // Currently we assume that a unique name will be generated by C++ if the
  // argument `name` is left as the default.
  VarDesc* NewVar(const string& name = "");

  // Find a VarDesc by name; if `recursive` is true, search the parent's
  // SymbolTable recursively.
  // This interface is introduced to support InferShape: find the protobuf
  // messages of variables and operators, and pass the pointers into InferShape.
  //
  // NOTE: maybe some C++ classes such as VarDescBuilder and OpDescBuilder
  // should be proposed and embedded into pybind to let Python operate on C++
  // pointers.
  VarDesc* FindVar(const string& name, bool recursive = true);
  OpDesc* FindOp(const string& name);

  BlockDesc Compile() const;

 private:
  SymbolTable* parent_;

  map<string, OpDesc> ops_;
  map<string, VarDesc> vars_;
};
```
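
As a sketch of how the recursive lookup declared above could be implemented (the class only declares these functions, so the bodies below are illustrative):

```c++
VarDesc* SymbolTable::FindVar(const string& name, bool recursive) {
  auto it = vars_.find(name);
  if (it != vars_.end()) return &it->second;
  // Not found locally: optionally ask the enclosing block's symbol table,
  // which is how an OpDesc in a child block references a VarDesc stored in
  // a parent block.
  if (recursive && parent_ != nullptr) return parent_->FindVar(name, recursive);
  return nullptr;
}

OpDesc* SymbolTable::FindOp(const string& name) {
  auto it = ops_.find(name);
  return it == ops_.end() ? nullptr : &it->second;
}
```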
After all the descriptions of variables and operators have been added to the SymbolTable, the block has enough information to run.
The `Block` class takes a `BlockDesc` as input and provides `Run` and `InferShape` functions.
```c++
namespace {

class Block : public OperatorBase {
 public:
  Block(const BlockDesc& desc) : desc_(desc) {}

  void InferShape(const framework::Scope& scope) const override {
    if (!symbols_ready_) {
      CreateVariables(scope);
      CreateOperators();
    }
    // should run InferShape first.
    for (auto& op : runtime_table_.ops()) {
      op->InferShape(scope);
    }
  }

  void Run(const framework::Scope& scope,
           const platform::DeviceContext& dev_ctx) const override {
    PADDLE_ENFORCE(symbols_ready_, "operators and variables should be created first.");
    for (auto& op : runtime_table_.ops()) {
      op->Run(scope, dev_ctx);
    }
  }

  void CreateVariables(const framework::Scope& scope);
  void CreateOperators();

  // some other necessary interfaces of NetOp are listed below
  // ...

 private:
  BlockDesc desc_;
  bool symbols_ready_{false};
  // Note: runtime_table_, which holds the instantiated operators used above,
  // is declared elsewhere and omitted from this sketch.
};

}  // namespace
```
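
The two helpers declared above are only named in this doc, so the bodies below are an illustrative sketch. They assume that `desc_` exposes the `vars`/`ops` fields through the generated protobuf accessors, that `VarDesc` carries a name, that the runtime `Scope` offers a `NewVar(name)` method similar to the `SymbolTable` above, and that an `OpRegistry` factory can instantiate an operator from an `OpDesc`; all of these are assumptions.

```c++
void Block::CreateVariables(const framework::Scope& scope) {
  // Realizing variables mutates the scope; the const qualifier is kept here
  // only to match the declaration above.
  auto& mutable_scope = const_cast<framework::Scope&>(scope);
  for (const auto& var_desc : desc_.vars()) {
    mutable_scope.NewVar(var_desc.name());  // assumes VarDesc has a name field
  }
}

void Block::CreateOperators() {
  // Instantiate runtime operators from their descriptions. OpRegistry is a
  // hypothetical factory keyed by the operator type.
  for (const auto& op_desc : desc_.ops()) {
    runtime_table_.AddOp(OpRegistry::CreateOp(op_desc));
  }
  symbols_ready_ = true;
}
```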
## The Execution of Blocks
Block inherits from OperatorBase, which has a Run method.
Block's Run method will run its operators sequentially.
There is another important interface called `Eval`, which takes some arguments called targets and generates a minimal graph that treats the targets as end points, from which it creates a new Block.
After running this block, `Eval` fetches the latest values of the targets and returns them.
The definition of Eval is as follows:
```c++
// Prune a block description using the dependency graph induced by the targets.
// Returns a new BlockDesc with a minimal number of operators.
// NOTE: it returns the block's description rather than a Block, so that the
// result can be distributed to a cluster.
BlockDesc Prune(const BlockDesc& desc, vector<string> targets);

void Block::Eval(const vector<string>& targets,
                 const framework::Scope& scope,
                 const platform::DeviceContext& dev_ctx) {
  BlockDesc min_desc = Prune(desc_, targets);
  Block min_block(min_desc);
  min_block.Run(scope, dev_ctx);
}
```
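
A sketch of how `Prune` could work: walk the dependency graph backwards from the targets and keep only the operators that contribute to them. The `op_inputs`/`op_outputs` helpers are hypothetical stand-ins, since this doc does not spell out how an `OpDesc` lists its input and output variables.

```c++
#include <set>
#include <string>
#include <vector>

// Hypothetical accessors for the variable names an operator reads and writes.
std::vector<std::string> op_inputs(const OpDesc& op);
std::vector<std::string> op_outputs(const OpDesc& op);

BlockDesc Prune(const BlockDesc& desc, std::vector<std::string> targets) {
  std::set<std::string> needed(targets.begin(), targets.end());
  std::vector<bool> keep(desc.ops_size(), false);

  // Walk operators in reverse program order: keep an operator if it produces
  // any needed variable, then mark its inputs as needed in turn.
  for (int i = desc.ops_size() - 1; i >= 0; --i) {
    const OpDesc& op = desc.ops(i);
    bool produces_needed = false;
    for (const auto& out : op_outputs(op)) {
      if (needed.count(out)) produces_needed = true;
    }
    if (!produces_needed) continue;
    keep[i] = true;
    for (const auto& in : op_inputs(op)) needed.insert(in);
  }

  // Copy the kept operators (and, for simplicity, all variables) into the
  // pruned description.
  BlockDesc pruned;
  *pruned.mutable_vars() = desc.vars();
  for (int i = 0; i < desc.ops_size(); ++i) {
    if (keep[i]) *pruned.add_ops() = desc.ops(i);
  }
  return pruned;
}
```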
<!DOCTYPE html>
<!--[if IE 8]><html class="no-js lt-ie9" lang="en" > <![endif]-->
<!--[if gt IE 8]><!--> <html class="no-js" lang="en" > <!--<![endif]-->
<head>
<meta charset="utf-8">
<meta name="viewport" content="width=device-width, initial-scale=1.0">
<title>Design Doc: Block and Scope &mdash; PaddlePaddle documentation</title>
<link rel="stylesheet" href="../_static/css/theme.css" type="text/css" />
<link rel="index" title="Index"
href="../genindex.html"/>
<link rel="search" title="Search" href="../search.html"/>
<link rel="top" title="PaddlePaddle documentation" href="../index.html"/>
<link rel="stylesheet" href="https://cdn.jsdelivr.net/perfect-scrollbar/0.6.14/css/perfect-scrollbar.min.css" type="text/css" />
<link rel="stylesheet" href="../_static/css/override.css" type="text/css" />
<script>
var _hmt = _hmt || [];
(function() {
var hm = document.createElement("script");
hm.src = "//hm.baidu.com/hm.js?b9a314ab40d04d805655aab1deee08ba";
var s = document.getElementsByTagName("script")[0];
s.parentNode.insertBefore(hm, s);
})();
</script>
<script src="../_static/js/modernizr.min.js"></script>
</head>
<body class="wy-body-for-nav" role="document">
<header class="site-header">
<div class="site-logo">
<a href="/"><img src="../_static/images/PP_w.png"></a>
</div>
<div class="site-nav-links">
<div class="site-menu">
<a class="fork-on-github" href="https://github.com/PaddlePaddle/Paddle" target="_blank"><i class="fa fa-github"></i>Fork me on Github</a>
<div class="language-switcher dropdown">
<a type="button" data-toggle="dropdown">
<span>English</span>
<i class="fa fa-angle-up"></i>
<i class="fa fa-angle-down"></i>
</a>
<ul class="dropdown-menu">
<li><a href="/doc_cn">中文</a></li>
<li><a href="/doc">English</a></li>
</ul>
</div>
<ul class="site-page-links">
<li><a href="/">Home</a></li>
</ul>
</div>
<div class="doc-module">
<ul>
<li class="toctree-l1"><a class="reference internal" href="../getstarted/index_en.html">GET STARTED</a></li>
<li class="toctree-l1"><a class="reference internal" href="../howto/index_en.html">HOW TO</a></li>
<li class="toctree-l1"><a class="reference internal" href="../api/index_en.html">API</a></li>
</ul>
<div role="search">
<form id="rtd-search-form" class="wy-form" action="../search.html" method="get">
<input type="text" name="q" placeholder="Search docs" />
<input type="hidden" name="check_keywords" value="yes" />
<input type="hidden" name="area" value="default" />
</form>
</div>
</div>
</div>
</header>
<div class="main-content-wrap">
<nav class="doc-menu-vertical" role="navigation">
<ul>
<li class="toctree-l1"><a class="reference internal" href="../getstarted/index_en.html">GET STARTED</a><ul>
<li class="toctree-l2"><a class="reference internal" href="../getstarted/build_and_install/index_en.html">Install and Build</a><ul>
<li class="toctree-l3"><a class="reference internal" href="../getstarted/build_and_install/docker_install_en.html">PaddlePaddle in Docker Containers</a></li>
<li class="toctree-l3"><a class="reference internal" href="../getstarted/build_and_install/build_from_source_en.html">Installing from Sources</a></li>
</ul>
</li>
</ul>
</li>
<li class="toctree-l1"><a class="reference internal" href="../howto/index_en.html">HOW TO</a><ul>
<li class="toctree-l2"><a class="reference internal" href="../howto/usage/cmd_parameter/index_en.html">Set Command-line Parameters</a><ul>
<li class="toctree-l3"><a class="reference internal" href="../howto/usage/cmd_parameter/use_case_en.html">Use Case</a></li>
<li class="toctree-l3"><a class="reference internal" href="../howto/usage/cmd_parameter/arguments_en.html">Argument Outline</a></li>
<li class="toctree-l3"><a class="reference internal" href="../howto/usage/cmd_parameter/detail_introduction_en.html">Detail Description</a></li>
</ul>
</li>
<li class="toctree-l2"><a class="reference internal" href="../howto/usage/cluster/cluster_train_en.html">Run Distributed Training</a></li>
<li class="toctree-l2"><a class="reference internal" href="../howto/usage/k8s/k8s_en.html">Paddle On Kubernetes</a></li>
<li class="toctree-l2"><a class="reference internal" href="../howto/usage/k8s/k8s_aws_en.html">Distributed PaddlePaddle Training on AWS with Kubernetes</a></li>
<li class="toctree-l2"><a class="reference internal" href="../howto/dev/build_en.html">Build PaddlePaddle from Source Code and Run Unit Test</a></li>
<li class="toctree-l2"><a class="reference internal" href="../howto/dev/new_layer_en.html">Write New Layers</a></li>
<li class="toctree-l2"><a class="reference internal" href="../howto/dev/contribute_to_paddle_en.html">Contribute Code</a></li>
<li class="toctree-l2"><a class="reference internal" href="../howto/deep_model/rnn/index_en.html">RNN Models</a><ul>
<li class="toctree-l3"><a class="reference internal" href="../howto/deep_model/rnn/rnn_config_en.html">RNN Configuration</a></li>
</ul>
</li>
<li class="toctree-l2"><a class="reference internal" href="../howto/optimization/gpu_profiling_en.html">Tune GPU Performance</a></li>
</ul>
</li>
<li class="toctree-l1"><a class="reference internal" href="../api/index_en.html">API</a><ul>
<li class="toctree-l2"><a class="reference internal" href="../api/v2/model_configs.html">Model Configuration</a><ul>
<li class="toctree-l3"><a class="reference internal" href="../api/v2/config/activation.html">Activation</a></li>
<li class="toctree-l3"><a class="reference internal" href="../api/v2/config/layer.html">Layers</a></li>
<li class="toctree-l3"><a class="reference internal" href="../api/v2/config/evaluators.html">Evaluators</a></li>
<li class="toctree-l3"><a class="reference internal" href="../api/v2/config/optimizer.html">Optimizer</a></li>
<li class="toctree-l3"><a class="reference internal" href="../api/v2/config/pooling.html">Pooling</a></li>
<li class="toctree-l3"><a class="reference internal" href="../api/v2/config/networks.html">Networks</a></li>
<li class="toctree-l3"><a class="reference internal" href="../api/v2/config/attr.html">Parameter Attribute</a></li>
</ul>
</li>
<li class="toctree-l2"><a class="reference internal" href="../api/v2/data.html">Data Reader Interface and DataSets</a></li>
<li class="toctree-l2"><a class="reference internal" href="../api/v2/run_logic.html">Training and Inference</a></li>
</ul>
</li>
</ul>
</nav>
<section class="doc-content-wrap">
<div role="navigation" aria-label="breadcrumbs navigation">
<ul class="wy-breadcrumbs">
<li>Design Doc: Block and Scope</li>
</ul>
</div>
<div class="wy-nav-content" id="doc-content">
<div class="rst-content">
<div role="main" class="document" itemscope="itemscope" itemtype="http://schema.org/Article">
<div itemprop="articleBody">
<div class="section" id="design-doc-block-and-scope">
<span id="design-doc-block-and-scope"></span><h1>Design Doc: Block and Scope<a class="headerlink" href="#design-doc-block-and-scope" title="Permalink to this headline"></a></h1>
<div class="section" id="the-representation-of-computation">
<span id="the-representation-of-computation"></span><h2>The Representation of Computation<a class="headerlink" href="#the-representation-of-computation" title="Permalink to this headline"></a></h2>
<p>Both deep learning systems and programming languages help users describe computation procedures. These systems use various representations of computation:</p>
<ul class="simple">
<li>Caffe, Torch, and Paddle: sequences of layers.</li>
<li>TensorFlow, Caffe2, Mxnet: graphs of operators.</li>
<li>PaddlePaddle: nested blocks, like C++ and Java programs.</li>
</ul>
</div>
<div class="section" id="block-in-programming-languages-and-deep-learning">
<span id="block-in-programming-languages-and-deep-learning"></span><h2>Block in Programming Languages and Deep Learning<a class="headerlink" href="#block-in-programming-languages-and-deep-learning" title="Permalink to this headline"></a></h2>
<p>In programming languages, a block is a pair of curly braces that includes local variables definitions and a sequence of instructions, or operators.</p>
<p>Blocks work with control flow structures like <code class="docutils literal"><span class="pre">if</span></code>, <code class="docutils literal"><span class="pre">else</span></code>, and <code class="docutils literal"><span class="pre">for</span></code>, which have equivalents in deep learning:</p>
<p>| programming languages | PaddlePaddle |
|&#8212;&#8212;&#8212;&#8212;&#8212;&#8212;&#8212;&#8211;|&#8212;&#8212;&#8212;&#8212;&#8212;&#8212;&#8212;&#8211;|
| for, while loop | RNN, WhileOp |
| if, if-else, switch | IfElseOp, SwitchOp |
| sequential execution | a sequence of layers |</p>
<p>A key difference is that a C++ program describes a one pass computation, whereas a deep learning program describes both the forward and backward passes.</p>
</div>
<div class="section" id="stack-frames-and-the-scope-hierarchy">
<span id="stack-frames-and-the-scope-hierarchy"></span><h2>Stack Frames and the Scope Hierarchy<a class="headerlink" href="#stack-frames-and-the-scope-hierarchy" title="Permalink to this headline"></a></h2>
<p>The existence of the backward makes the execution of a block of traditional programs and PaddlePaddle different to each other:</p>
<p>| programming languages | PaddlePaddle |
|&#8212;&#8212;&#8212;&#8212;&#8212;&#8212;&#8212;&#8211;|&#8212;&#8212;&#8212;&#8212;&#8212;&#8212;&#8212;&#8212;&#8212;&#8212;-|
| stack | scope hierarchy |
| stack frame | scope |
| push at entering block| push at entering block |
| pop at leaving block | destroy at minibatch completes|</p>
<ol class="simple">
<li>In traditional programs:<ul>
<li>When the execution enters the left curly brace of a block, the runtime pushes a frame into the stack, where it realizes local variables.</li>
<li>After the execution leaves the right curly brace, the runtime pops the frame.</li>
<li>The maximum number of frames in the stack is the maximum depth of nested blocks.</li>
</ul>
</li>
<li>In PaddlePaddle<ul>
<li>When the execution enters a block, PaddlePaddle adds a new scope, where it realizes variables.</li>
<li>PaddlePaddle doesn&#8217;t pop a scope after the execution of the block because variables therein are to be used by the backward pass. So it has a stack forest known as a <em>scope hierarchy</em>.</li>
<li>The height of the highest tree is the maximum depth of nested blocks.</li>
<li>After the process of a minibatch, PaddlePaddle destroys the scope hierarchy.</li>
</ul>
</li>
</ol>
</div>
<div class="section" id="use-blocks-in-c-and-paddlepaddle-programs">
<span id="use-blocks-in-c-and-paddlepaddle-programs"></span><h2>Use Blocks in C++ and PaddlePaddle Programs<a class="headerlink" href="#use-blocks-in-c-and-paddlepaddle-programs" title="Permalink to this headline"></a></h2>
<p>Let us consolidate the discussion by presenting some examples.</p>
<div class="section" id="blocks-with-if-else-and-ifelseop">
<span id="blocks-with-if-else-and-ifelseop"></span><h3>Blocks with <code class="docutils literal"><span class="pre">if-else</span></code> and <code class="docutils literal"><span class="pre">IfElseOp</span></code><a class="headerlink" href="#blocks-with-if-else-and-ifelseop" title="Permalink to this headline"></a></h3>
<p>The following C++ programs shows how blocks are used with the <code class="docutils literal"><span class="pre">if-else</span></code> structure:</p>
<div class="highlight-c++"><div class="highlight"><pre><span></span><span class="kt">int</span> <span class="n">x</span> <span class="o">=</span> <span class="mi">10</span><span class="p">;</span>
<span class="kt">int</span> <span class="n">y</span> <span class="o">=</span> <span class="mi">20</span><span class="p">;</span>
<span class="kt">int</span> <span class="n">out</span><span class="p">;</span>
<span class="kt">bool</span> <span class="n">cond</span> <span class="o">=</span> <span class="nb">false</span><span class="p">;</span>
<span class="k">if</span> <span class="p">(</span><span class="n">cond</span><span class="p">)</span> <span class="p">{</span>
<span class="kt">int</span> <span class="n">z</span> <span class="o">=</span> <span class="n">x</span> <span class="o">+</span> <span class="n">y</span><span class="p">;</span>
<span class="n">out</span> <span class="o">=</span> <span class="n">softmax</span><span class="p">(</span><span class="n">z</span><span class="p">);</span>
<span class="p">}</span> <span class="k">else</span> <span class="p">{</span>
<span class="kt">int</span> <span class="n">z</span> <span class="o">=</span> <span class="n">fc</span><span class="p">(</span><span class="n">x</span><span class="p">);</span>
<span class="n">out</span> <span class="o">=</span> <span class="n">z</span><span class="p">;</span>
<span class="p">}</span>
</pre></div>
</div>
<p>An equivalent PaddlePaddle program from the design doc of the <a class="reference internal" href="if_else_op.html"><span class="doc">IfElseOp operator</span></a> is as follows:</p>
<div class="highlight-python"><div class="highlight"><pre><span></span><span class="kn">import</span> <span class="nn">paddle</span> <span class="kn">as</span> <span class="nn">pd</span>
<span class="n">x</span> <span class="o">=</span> <span class="n">var</span><span class="p">(</span><span class="mi">10</span><span class="p">)</span>
<span class="n">y</span> <span class="o">=</span> <span class="n">var</span><span class="p">(</span><span class="mi">20</span><span class="p">)</span>
<span class="n">cond</span> <span class="o">=</span> <span class="n">var</span><span class="p">(</span><span class="n">false</span><span class="p">)</span>
<span class="n">ie</span> <span class="o">=</span> <span class="n">pd</span><span class="o">.</span><span class="n">create_ifelseop</span><span class="p">(</span><span class="n">inputs</span><span class="o">=</span><span class="p">[</span><span class="n">x</span><span class="p">],</span> <span class="n">output_num</span><span class="o">=</span><span class="mi">1</span><span class="p">)</span>
<span class="k">with</span> <span class="n">ie</span><span class="o">.</span><span class="n">true_block</span><span class="p">():</span>
<span class="n">x</span> <span class="o">=</span> <span class="n">ie</span><span class="o">.</span><span class="n">inputs</span><span class="p">(</span><span class="n">true</span><span class="p">,</span> <span class="mi">0</span><span class="p">)</span>
<span class="n">z</span> <span class="o">=</span> <span class="n">operator</span><span class="o">.</span><span class="n">add</span><span class="p">(</span><span class="n">x</span><span class="p">,</span> <span class="n">y</span><span class="p">)</span>
<span class="n">ie</span><span class="o">.</span><span class="n">set_output</span><span class="p">(</span><span class="n">true</span><span class="p">,</span> <span class="mi">0</span><span class="p">,</span> <span class="n">operator</span><span class="o">.</span><span class="n">softmax</span><span class="p">(</span><span class="n">z</span><span class="p">))</span>
<span class="k">with</span> <span class="n">ie</span><span class="o">.</span><span class="n">false_block</span><span class="p">():</span>
<span class="n">x</span> <span class="o">=</span> <span class="n">ie</span><span class="o">.</span><span class="n">inputs</span><span class="p">(</span><span class="n">false</span><span class="p">,</span> <span class="mi">0</span><span class="p">)</span>
<span class="n">z</span> <span class="o">=</span> <span class="n">layer</span><span class="o">.</span><span class="n">fc</span><span class="p">(</span><span class="n">x</span><span class="p">)</span>
<span class="n">ie</span><span class="o">.</span><span class="n">set_output</span><span class="p">(</span><span class="n">true</span><span class="p">,</span> <span class="mi">0</span><span class="p">,</span> <span class="n">operator</span><span class="o">.</span><span class="n">softmax</span><span class="p">(</span><span class="n">z</span><span class="p">))</span>
<span class="n">out</span> <span class="o">=</span> <span class="n">b</span><span class="p">(</span><span class="n">cond</span><span class="p">)</span>
</pre></div>
</div>
<p>In both examples, the left branch computes <code class="docutils literal"><span class="pre">softmax(x+y)</span></code> and the right branch computes <code class="docutils literal"><span class="pre">fc(x)</span></code>.</p>
<p>A difference is that variables in the C++ program contain scalar values, whereas those in the PaddlePaddle programs are mini-batches of instances. The <code class="docutils literal"><span class="pre">ie.input(true,</span> <span class="pre">0)</span></code> invocation returns instances in the 0-th input, <code class="docutils literal"><span class="pre">x</span></code>, that corresponds to true values in <code class="docutils literal"><span class="pre">cond</span></code> as the local variable <code class="docutils literal"><span class="pre">x</span></code>, where <code class="docutils literal"><span class="pre">ie.input(false,</span> <span class="pre">0)</span></code> returns instances corresponding to false values.</p>
</div>
<div class="section" id="blocks-with-for-and-rnnop">
<span id="blocks-with-for-and-rnnop"></span><h3>Blocks with <code class="docutils literal"><span class="pre">for</span></code> and <code class="docutils literal"><span class="pre">RNNOp</span></code><a class="headerlink" href="#blocks-with-for-and-rnnop" title="Permalink to this headline"></a></h3>
<p>The following RNN model from the <a class="reference external" href="design/rnn.md">RNN design doc</a></p>
<div class="highlight-python"><div class="highlight"><pre><span></span><span class="n">x</span> <span class="o">=</span> <span class="n">sequence</span><span class="p">([</span><span class="mi">10</span><span class="p">,</span> <span class="mi">20</span><span class="p">,</span> <span class="mi">30</span><span class="p">])</span>
<span class="n">m</span> <span class="o">=</span> <span class="n">var</span><span class="p">(</span><span class="mi">0</span><span class="p">)</span>
<span class="n">W</span> <span class="o">=</span> <span class="n">tensor</span><span class="p">()</span>
<span class="n">U</span> <span class="o">=</span> <span class="n">tensor</span><span class="p">()</span>
<span class="n">rnn</span> <span class="o">=</span> <span class="n">create_rnn</span><span class="p">(</span><span class="n">inputs</span><span class="o">=</span><span class="p">[</span><span class="nb">input</span><span class="p">])</span>
<span class="k">with</span> <span class="n">rnn</span><span class="o">.</span><span class="n">stepnet</span><span class="p">()</span> <span class="k">as</span> <span class="n">net</span><span class="p">:</span>
<span class="n">x</span> <span class="o">=</span> <span class="n">net</span><span class="o">.</span><span class="n">set_inputs</span><span class="p">(</span><span class="mi">0</span><span class="p">)</span>
<span class="n">h</span> <span class="o">=</span> <span class="n">net</span><span class="o">.</span><span class="n">add_memory</span><span class="p">(</span><span class="n">init</span><span class="o">=</span><span class="n">m</span><span class="p">)</span>
<span class="n">fc_out</span> <span class="o">=</span> <span class="n">pd</span><span class="o">.</span><span class="n">matmul</span><span class="p">(</span><span class="n">W</span><span class="p">,</span> <span class="n">x</span><span class="p">)</span>
<span class="n">hidden_out</span> <span class="o">=</span> <span class="n">pd</span><span class="o">.</span><span class="n">matmul</span><span class="p">(</span><span class="n">U</span><span class="p">,</span> <span class="n">h</span><span class="o">.</span><span class="n">pre</span><span class="p">(</span><span class="n">n</span><span class="o">=</span><span class="mi">1</span><span class="p">))</span>
<span class="nb">sum</span> <span class="o">=</span> <span class="n">pd</span><span class="o">.</span><span class="n">add_two</span><span class="p">(</span><span class="n">fc_out</span><span class="p">,</span> <span class="n">hidden_out</span><span class="p">)</span>
<span class="n">act</span> <span class="o">=</span> <span class="n">pd</span><span class="o">.</span><span class="n">sigmoid</span><span class="p">(</span><span class="nb">sum</span><span class="p">)</span>
<span class="n">h</span><span class="o">.</span><span class="n">update</span><span class="p">(</span><span class="n">act</span><span class="p">)</span> <span class="c1"># update memory with act</span>
<span class="n">net</span><span class="o">.</span><span class="n">set_outputs</span><span class="p">(</span><span class="mi">0</span><span class="p">,</span> <span class="n">act</span><span class="p">,</span> <span class="n">hidden_out</span><span class="p">)</span> <span class="c1"># two outputs</span>
<span class="n">o1</span><span class="p">,</span> <span class="n">o2</span> <span class="o">=</span> <span class="n">rnn</span><span class="p">()</span>
<span class="k">print</span> <span class="n">o1</span><span class="p">,</span> <span class="n">o2</span>
</pre></div>
</div>
<p>has its equivalent C++ program as follows</p>
<div class="highlight-c++"><div class="highlight"><pre><span></span><span class="kt">int</span><span class="o">*</span> <span class="n">x</span> <span class="o">=</span> <span class="p">{</span><span class="mi">10</span><span class="p">,</span> <span class="mi">20</span><span class="p">,</span> <span class="mi">30</span><span class="p">};</span>
<span class="kt">int</span> <span class="n">m</span> <span class="o">=</span> <span class="mi">0</span><span class="p">;</span>
<span class="kt">int</span> <span class="n">W</span> <span class="o">=</span> <span class="n">some_value</span><span class="p">();</span>
<span class="kt">int</span> <span class="n">U</span> <span class="o">=</span> <span class="n">some_other_value</span><span class="p">();</span>
<span class="kt">int</span> <span class="n">mem</span><span class="p">[</span><span class="k">sizeof</span><span class="p">(</span><span class="n">x</span><span class="p">)</span> <span class="o">/</span> <span class="k">sizeof</span><span class="p">(</span><span class="n">x</span><span class="p">[</span><span class="mi">0</span><span class="p">])</span> <span class="o">+</span> <span class="mi">1</span><span class="p">];</span>
<span class="kt">int</span> <span class="n">o1</span><span class="p">[</span><span class="k">sizeof</span><span class="p">(</span><span class="n">x</span><span class="p">)</span> <span class="o">/</span> <span class="k">sizeof</span><span class="p">(</span><span class="n">x</span><span class="p">[</span><span class="mi">0</span><span class="p">])</span> <span class="o">+</span> <span class="mi">1</span><span class="p">];</span>
<span class="kt">int</span> <span class="n">o2</span><span class="p">[</span><span class="k">sizeof</span><span class="p">(</span><span class="n">x</span><span class="p">)</span> <span class="o">/</span> <span class="k">sizeof</span><span class="p">(</span><span class="n">x</span><span class="p">[</span><span class="mi">0</span><span class="p">])</span> <span class="o">+</span> <span class="mi">1</span><span class="p">];</span>
<span class="k">for</span> <span class="p">(</span><span class="kt">int</span> <span class="n">i</span> <span class="o">=</span> <span class="mi">1</span><span class="p">;</span> <span class="n">i</span> <span class="o">&lt;=</span> <span class="k">sizeof</span><span class="p">(</span><span class="n">x</span><span class="p">)</span><span class="o">/</span><span class="k">sizeof</span><span class="p">(</span><span class="n">x</span><span class="p">[</span><span class="mi">0</span><span class="p">]);</span> <span class="o">++</span><span class="n">i</span><span class="p">)</span> <span class="p">{</span>
<span class="kt">int</span> <span class="n">x</span> <span class="o">=</span> <span class="n">x</span><span class="p">[</span><span class="n">i</span><span class="o">-</span><span class="mi">1</span><span class="p">];</span>
<span class="k">if</span> <span class="p">(</span><span class="n">i</span> <span class="o">==</span> <span class="mi">1</span><span class="p">)</span> <span class="n">mem</span><span class="p">[</span><span class="mi">0</span><span class="p">]</span> <span class="o">=</span> <span class="n">m</span><span class="p">;</span>
<span class="kt">int</span> <span class="n">fc_out</span> <span class="o">=</span> <span class="n">W</span> <span class="o">*</span> <span class="n">x</span><span class="p">;</span>
<span class="kt">int</span> <span class="n">hidden_out</span> <span class="o">=</span> <span class="n">Y</span> <span class="o">*</span> <span class="n">mem</span><span class="p">[</span><span class="n">i</span><span class="o">-</span><span class="mi">1</span><span class="p">];</span>
<span class="kt">int</span> <span class="n">sum</span> <span class="o">=</span> <span class="n">fc_out</span> <span class="o">+</span> <span class="n">hidden_out</span><span class="p">;</span>
<span class="kt">int</span> <span class="n">act</span> <span class="o">=</span> <span class="n">sigmoid</span><span class="p">(</span><span class="n">sum</span><span class="p">);</span>
<span class="n">mem</span><span class="p">[</span><span class="n">i</span><span class="p">]</span> <span class="o">=</span> <span class="n">act</span><span class="p">;</span>
<span class="n">o1</span><span class="p">[</span><span class="n">i</span><span class="p">]</span> <span class="o">=</span> <span class="n">act</span><span class="p">;</span>
<span class="n">o2</span><span class="p">[</span><span class="n">i</span><span class="p">]</span> <span class="o">=</span> <span class="n">hidden_out</span><span class="p">;</span>
<span class="p">}</span>
<span class="n">print_array</span><span class="p">(</span><span class="n">o1</span><span class="p">);</span>
<span class="n">print_array</span><span class="p">(</span><span class="n">o2</span><span class="p">);</span>
</pre></div>
</div>
</div>
</div>
<div class="section" id="compilation-and-execution">
<span id="compilation-and-execution"></span><h2>Compilation and Execution<a class="headerlink" href="#compilation-and-execution" title="Permalink to this headline"></a></h2>
<p>Like TensorFlow programs, a PaddlePaddle program is written in Python. The first part describes a neural network as a protobuf message, and the rest part executes the message for training or inference.</p>
<p>The generation of this protobuf message is like what a compiler generates a binary executable file. The execution of the message that the OS executes the binary file.</p>
</div>
<div class="section" id="the-binary-executable-file-format">
<span id="the-binary-executable-file-format"></span><h2>The &#8220;Binary Executable File Format&#8221;<a class="headerlink" href="#the-binary-executable-file-format" title="Permalink to this headline"></a></h2>
<p>The definition of the protobuf message is as follows:</p>
<div class="highlight-protobuf"><div class="highlight"><pre><span></span><span class="kd">message</span> <span class="nc">BlockDesc</span> <span class="p">{</span>
<span class="k">repeated</span> <span class="n">VarDesc</span> <span class="na">vars</span> <span class="o">=</span> <span class="mi">1</span><span class="p">;</span>
<span class="k">repeated</span> <span class="n">OpDesc</span> <span class="na">ops</span> <span class="o">=</span> <span class="mi">2</span><span class="p">;</span>
<span class="p">}</span>
</pre></div>
</div>
<p>The step net in above RNN example would look like</p>
<div class="highlight-default"><div class="highlight"><pre><span></span><span class="n">BlockDesc</span> <span class="p">{</span>
<span class="nb">vars</span> <span class="o">=</span> <span class="p">{</span>
<span class="n">VarDesc</span> <span class="p">{</span><span class="o">...</span><span class="p">}</span> <span class="o">//</span> <span class="n">x</span>
<span class="n">VarDesc</span> <span class="p">{</span><span class="o">...</span><span class="p">}</span> <span class="o">//</span> <span class="n">h</span>
<span class="n">VarDesc</span> <span class="p">{</span><span class="o">...</span><span class="p">}</span> <span class="o">//</span> <span class="n">fc_out</span>
<span class="n">VarDesc</span> <span class="p">{</span><span class="o">...</span><span class="p">}</span> <span class="o">//</span> <span class="n">hidden_out</span>
<span class="n">VarDesc</span> <span class="p">{</span><span class="o">...</span><span class="p">}</span> <span class="o">//</span> <span class="nb">sum</span>
<span class="n">VarDesc</span> <span class="p">{</span><span class="o">...</span><span class="p">}</span> <span class="o">//</span> <span class="n">act</span>
<span class="p">}</span>
<span class="n">ops</span> <span class="o">=</span> <span class="p">{</span>
<span class="n">OpDesc</span> <span class="p">{</span><span class="o">...</span><span class="p">}</span> <span class="o">//</span> <span class="n">matmul</span>
<span class="n">OpDesc</span> <span class="p">{</span><span class="o">...</span><span class="p">}</span> <span class="o">//</span> <span class="n">add_two</span>
<span class="n">OpDesc</span> <span class="p">{</span><span class="o">...</span><span class="p">}</span> <span class="o">//</span> <span class="n">sigmoid</span>
<span class="p">}</span>
<span class="p">};</span>
</pre></div>
</div>
<p>Also, the RNN operator in above example is serialized into a protobuf message of type <code class="docutils literal"><span class="pre">OpDesc</span></code> and would look like:</p>
<div class="highlight-default"><div class="highlight"><pre><span></span><span class="n">OpDesc</span> <span class="p">{</span>
<span class="n">inputs</span> <span class="o">=</span> <span class="p">{</span><span class="mi">0</span><span class="p">}</span> <span class="o">//</span> <span class="n">the</span> <span class="n">index</span> <span class="n">of</span> <span class="n">x</span>
<span class="n">outputs</span> <span class="o">=</span> <span class="p">{</span><span class="mi">5</span><span class="p">,</span> <span class="mi">3</span><span class="p">}</span> <span class="o">//</span> <span class="n">indices</span> <span class="n">of</span> <span class="n">act</span> <span class="ow">and</span> <span class="n">hidden_out</span>
<span class="n">attrs</span> <span class="p">{</span>
<span class="s2">&quot;memories&quot;</span> <span class="p">:</span> <span class="p">{</span><span class="mi">1</span><span class="p">}</span> <span class="o">//</span> <span class="n">the</span> <span class="n">index</span> <span class="n">of</span> <span class="n">h</span>
<span class="s2">&quot;step_net&quot;</span> <span class="p">:</span> <span class="o">&lt;</span><span class="n">above</span> <span class="n">step</span> <span class="n">net</span><span class="o">&gt;</span>
<span class="p">}</span>
<span class="p">};</span>
</pre></div>
</div>
<p>This <code class="docutils literal"><span class="pre">OpDesc</span></code> value is in the <code class="docutils literal"><span class="pre">ops</span></code> field of the <code class="docutils literal"><span class="pre">BlockDesc</span></code> value representing the global block.</p>
</div>
<div class="section" id="the-compilation-of-blocks">
<span id="the-compilation-of-blocks"></span><h2>The Compilation of Blocks<a class="headerlink" href="#the-compilation-of-blocks" title="Permalink to this headline"></a></h2>
<p>During the generation of the Protobuf message, the Block should store VarDesc (the Protobuf message which describes Variable) and OpDesc (the Protobuf message which describes Operator).</p>
<p>VarDesc in a block should have its name scope to avoid local variables affect parent block&#8217;s name scope.
Child block&#8217;s name scopes should inherit the parent&#8217;s so that OpDesc in child block can reference a VarDesc that stored in parent block. For example</p>
<div class="highlight-python"><div class="highlight"><pre><span></span><span class="n">a</span> <span class="o">=</span> <span class="n">pd</span><span class="o">.</span><span class="n">Varaible</span><span class="p">(</span><span class="n">shape</span><span class="o">=</span><span class="p">[</span><span class="mi">20</span><span class="p">,</span> <span class="mi">20</span><span class="p">])</span>
<span class="n">b</span> <span class="o">=</span> <span class="n">pd</span><span class="o">.</span><span class="n">fc</span><span class="p">(</span><span class="n">a</span><span class="p">,</span> <span class="n">params</span><span class="o">=</span><span class="p">[</span><span class="s2">&quot;fc.w&quot;</span><span class="p">,</span> <span class="s2">&quot;fc.b&quot;</span><span class="p">])</span>
<span class="n">rnn</span> <span class="o">=</span> <span class="n">pd</span><span class="o">.</span><span class="n">create_rnn</span><span class="p">()</span>
<span class="k">with</span> <span class="n">rnn</span><span class="o">.</span><span class="n">stepnet</span><span class="p">()</span> <span class="k">as</span> <span class="n">net</span><span class="p">:</span>
<span class="n">x</span> <span class="o">=</span> <span class="n">net</span><span class="o">.</span><span class="n">set_inputs</span><span class="p">(</span><span class="n">a</span><span class="p">)</span>
<span class="c1"># reuse fc&#39;s parameter</span>
<span class="n">fc_without_b</span> <span class="o">=</span> <span class="n">pd</span><span class="o">.</span><span class="n">get_variable</span><span class="p">(</span><span class="s2">&quot;fc.w&quot;</span><span class="p">)</span>
<span class="n">net</span><span class="o">.</span><span class="n">set_outputs</span><span class="p">(</span><span class="n">fc_without_b</span><span class="p">)</span>
<span class="n">out</span> <span class="o">=</span> <span class="n">rnn</span><span class="p">()</span>
</pre></div>
</div>
<p>the method <code class="docutils literal"><span class="pre">pd.get_variable</span></code> can help retrieve a Variable by a name, a Variable may store in a parent block, but might be retrieved in a child block, so block should have a variable scope that supports inheritance.</p>
<p>In compiler design, the symbol table is a data structure created and maintained by compilers to store information about the occurrence of various entities such as variable names, function names, classes, etc.</p>
<p>To store the definition of variables and operators, we define a C++ class <code class="docutils literal"><span class="pre">SymbolTable</span></code>, like the one used in compilers.</p>
<p><code class="docutils literal"><span class="pre">SymbolTable</span></code> can do the following stuff:</p>
<ul class="simple">
<li>store the definitions (some names and attributes) of variables and operators,</li>
<li>to verify if a variable was declared,</li>
<li>to make it possible to implement type checking (offer Protobuf message pointers to <code class="docutils literal"><span class="pre">InferShape</span></code> handlers).</li>
</ul>
<div class="highlight-c++"><div class="highlight"><pre><span></span><span class="c1">// Information in SymbolTable is enough to trace the dependency graph. So maybe</span>
<span class="c1">// the Eval() interface takes a SymbolTable is enough.</span>
<span class="k">class</span> <span class="nc">SymbolTable</span> <span class="p">{</span>
<span class="k">public</span><span class="o">:</span>
<span class="n">SymbolTable</span><span class="p">(</span><span class="n">SymbolTable</span><span class="o">*</span> <span class="n">parent</span><span class="p">)</span> <span class="o">:</span> <span class="n">parent_</span><span class="p">(</span><span class="n">parent</span><span class="p">)</span> <span class="p">{}</span>
<span class="n">OpDesc</span><span class="o">*</span> <span class="n">NewOp</span><span class="p">(</span><span class="k">const</span> <span class="n">string</span><span class="o">&amp;</span> <span class="n">name</span><span class="o">=</span><span class="s">&quot;&quot;</span><span class="p">);</span>
<span class="c1">// TODO determine whether name is generated by python or C++</span>
<span class="c1">// currently assume that a unique name will be generated by C++ if the</span>
<span class="c1">// argument name left default.</span>
<span class="n">VarDesc</span><span class="o">*</span> <span class="nf">NewVar</span><span class="p">(</span><span class="k">const</span> <span class="n">string</span><span class="o">&amp;</span> <span class="n">name</span><span class="o">=</span><span class="s">&quot;&quot;</span><span class="p">);</span>
<span class="c1">// find a VarDesc by name, if recursive true, find parent&#39;s SymbolTable</span>
<span class="c1">// recursively.</span>
<span class="c1">// this interface is introduced to support InferShape, find protobuf messages</span>
<span class="c1">// of variables and operators, pass pointers into InferShape.</span>
<span class="c1">// operator</span>
<span class="c1">//</span>
<span class="c1">// NOTE maybe some C++ classes such as VarDescBuilder and OpDescBuilder should</span>
<span class="c1">// be proposed and embedded into pybind to enable python operate on C++ pointers.</span>
<span class="n">VarDesc</span><span class="o">*</span> <span class="nf">FindVar</span><span class="p">(</span><span class="k">const</span> <span class="n">string</span><span class="o">&amp;</span> <span class="n">name</span><span class="p">,</span> <span class="kt">bool</span> <span class="n">recursive</span><span class="o">=</span><span class="nb">true</span><span class="p">);</span>
<span class="n">OpDesc</span><span class="o">*</span> <span class="nf">FindOp</span><span class="p">(</span><span class="k">const</span> <span class="n">string</span><span class="o">&amp;</span> <span class="n">name</span><span class="p">);</span>
<span class="n">BlockDesc</span> <span class="nf">Compile</span><span class="p">()</span> <span class="k">const</span><span class="p">;</span>
<span class="k">private</span><span class="o">:</span>
<span class="n">SymbolTable</span><span class="o">*</span> <span class="n">parent_</span><span class="p">;</span>
<span class="n">map</span><span class="o">&lt;</span><span class="n">string</span><span class="p">,</span> <span class="n">OpDesc</span><span class="o">&gt;</span> <span class="n">ops_</span><span class="p">;</span>
<span class="n">map</span><span class="o">&lt;</span><span class="n">string</span><span class="p">,</span> <span class="n">VarDesc</span><span class="o">&gt;</span> <span class="n">vars_</span><span class="p">;</span>
<span class="p">};</span>
</pre></div>
</div>
<p>After all the description of variables and operators is added into SymbolTable,
the block has enough information to run.</p>
<p>The <code class="docutils literal"><span class="pre">Block</span></code> class takes a <code class="docutils literal"><span class="pre">BlockDesc</span></code> as input, and provide <code class="docutils literal"><span class="pre">Run</span></code> and <code class="docutils literal"><span class="pre">InferShape</span></code> functions.</p>
<div class="highlight-c++"><div class="highlight"><pre><span></span><span class="k">namespace</span> <span class="p">{</span>
<span class="k">class</span> <span class="nc">Block</span> <span class="o">:</span> <span class="n">OperatorBase</span> <span class="p">{</span>
<span class="k">public</span><span class="o">:</span>
<span class="n">Block</span><span class="p">(</span><span class="k">const</span> <span class="n">BlockDesc</span><span class="o">&amp;</span> <span class="n">desc</span><span class="p">)</span> <span class="n">desc_</span><span class="p">(</span><span class="n">desc</span><span class="p">)</span> <span class="p">{}</span>
<span class="kt">void</span> <span class="n">InferShape</span><span class="p">(</span><span class="k">const</span> <span class="n">framework</span><span class="o">::</span><span class="n">Scope</span><span class="o">&amp;</span> <span class="n">scope</span><span class="p">)</span> <span class="k">const</span> <span class="k">override</span> <span class="p">{</span>
<span class="k">if</span> <span class="p">(</span><span class="o">!</span><span class="n">symbols_ready_</span><span class="p">)</span> <span class="p">{</span>
<span class="n">CreateVariables</span><span class="p">(</span><span class="n">scope</span><span class="p">);</span>
<span class="n">CreateOperators</span><span class="p">();</span>
<span class="p">}</span>
<span class="c1">// should run InferShape first.</span>
<span class="k">for</span> <span class="p">(</span><span class="k">auto</span><span class="o">&amp;</span> <span class="nl">op</span> <span class="p">:</span> <span class="n">runtime_table_</span><span class="p">.</span><span class="n">ops</span><span class="p">())</span> <span class="p">{</span>
<span class="n">op</span><span class="o">-&gt;</span><span class="n">InferShape</span><span class="p">(</span><span class="n">scope</span><span class="p">);</span>
<span class="p">}</span>
<span class="p">}</span>
<span class="kt">void</span> <span class="n">Run</span><span class="p">(</span><span class="k">const</span> <span class="n">framework</span><span class="o">::</span><span class="n">Scope</span><span class="o">&amp;</span> <span class="n">scope</span><span class="p">,</span>
<span class="k">const</span> <span class="n">platform</span><span class="o">::</span><span class="n">DeviceContext</span><span class="o">&amp;</span> <span class="n">dev_ctx</span><span class="p">)</span> <span class="k">const</span> <span class="k">override</span> <span class="p">{</span>
<span class="n">PADDLE_ENFORCE</span><span class="p">(</span><span class="n">symbols_ready_</span><span class="p">,</span> <span class="s">&quot;operators and variables should be created first.&quot;</span><span class="p">);</span>
<span class="k">for</span> <span class="p">(</span><span class="k">auto</span><span class="o">&amp;</span> <span class="nl">op</span> <span class="p">:</span> <span class="n">runtime_table_</span><span class="p">.</span><span class="n">ops</span><span class="p">())</span> <span class="p">{</span>
<span class="n">op</span><span class="o">-&gt;</span><span class="n">Run</span><span class="p">(</span><span class="n">scope</span><span class="p">,</span> <span class="n">dev_ctx</span><span class="p">);</span>
<span class="p">}</span>
<span class="p">}</span>
<span class="kt">void</span> <span class="n">CreateVariables</span><span class="p">(</span><span class="k">const</span> <span class="n">framework</span><span class="o">::</span><span class="n">Scope</span><span class="o">&amp;</span> <span class="n">scope</span><span class="p">);</span>
<span class="kt">void</span> <span class="nf">CreateOperators</span><span class="p">();</span>
<span class="c1">// some other necessary interfaces of NetOp are list below</span>
<span class="c1">// ...</span>
<span class="k">private</span><span class="o">:</span>
<span class="n">BlockDesc</span> <span class="n">desc_</span><span class="p">;</span>
<span class="kt">bool</span> <span class="n">symbols_ready_</span><span class="p">{</span><span class="nb">false</span><span class="p">};</span>
<span class="p">};</span>
</pre></div>
</div>
</div>
<div class="section" id="the-execution-of-blocks">
<span id="the-execution-of-blocks"></span><h2>The Execution of Blocks<a class="headerlink" href="#the-execution-of-blocks" title="Permalink to this headline"></a></h2>
<p>Block inherits from OperatorBase, which has a Run method.
Block&#8217;s Run method will run its operators sequentially.</p>
<p>There is another important interface called <code class="docutils literal"><span class="pre">Eval</span></code>, which take some arguments called targets, and generate a minimal graph which takes targets as the end points and creates a new Block,
after <code class="docutils literal"><span class="pre">Run</span></code>, <code class="docutils literal"><span class="pre">Eval</span></code> will get the latest value and return the targets.</p>
<p>The definition of Eval is as follows:</p>
<div class="highlight-c++"><div class="highlight"><pre><span></span><span class="c1">// clean a block description by targets using the corresponding dependency graph.</span>
<span class="c1">// return a new BlockDesc with minimal number of operators.</span>
<span class="c1">// NOTE not return a Block but the block&#39;s description so that this can be distributed</span>
<span class="c1">// to a cluster.</span>
<span class="n">BlockDesc</span> <span class="nf">Prune</span><span class="p">(</span><span class="k">const</span> <span class="n">BlockDesc</span><span class="o">&amp;</span> <span class="n">desc</span><span class="p">,</span> <span class="n">vector</span><span class="o">&lt;</span><span class="n">string</span><span class="o">&gt;</span> <span class="n">targets</span><span class="p">);</span>
<span class="kt">void</span> <span class="n">Block</span><span class="o">::</span><span class="n">Eval</span><span class="p">(</span><span class="k">const</span> <span class="n">vector</span><span class="o">&lt;</span><span class="n">string</span><span class="o">&gt;&amp;</span> <span class="n">targets</span><span class="p">,</span>
<span class="k">const</span> <span class="n">framework</span><span class="o">::</span><span class="n">Scope</span><span class="o">&amp;</span> <span class="n">scope</span><span class="p">,</span>
<span class="k">const</span> <span class="n">platform</span><span class="o">::</span><span class="n">DeviceContext</span><span class="o">&amp;</span> <span class="n">dev_ctx</span><span class="p">)</span> <span class="p">{</span>
<span class="n">BlockDesc</span> <span class="n">min_desc</span> <span class="o">=</span> <span class="n">Prune</span><span class="p">(</span><span class="n">desc_</span><span class="p">,</span> <span class="n">targets</span><span class="p">);</span>
<span class="n">Block</span> <span class="nf">min_block</span><span class="p">(</span><span class="n">min_desc</span><span class="p">);</span>
<span class="n">min_block</span><span class="p">.</span><span class="n">Run</span><span class="p">(</span><span class="n">scope</span><span class="p">,</span> <span class="n">dev_ctx</span><span class="p">);</span>
<span class="p">}</span>
</pre></div>
</div>
</div>
</div>
</div>
</div>
<footer>
<hr/>
<div role="contentinfo">
<p>
&copy; Copyright 2016, PaddlePaddle developers.
</p>
</div>
Built with <a href="http://sphinx-doc.org/">Sphinx</a> using a <a href="https://github.com/snide/sphinx_rtd_theme">theme</a> provided by <a href="https://readthedocs.org">Read the Docs</a>.
</footer>
</div>
</div>
</section>
</div>
<script type="text/javascript">
var DOCUMENTATION_OPTIONS = {
URL_ROOT:'../',
VERSION:'',
COLLAPSE_INDEX:false,
FILE_SUFFIX:'.html',
HAS_SOURCE: true,
SOURCELINK_SUFFIX: ".txt",
};
</script>
<script type="text/javascript" src="../_static/jquery.js"></script>
<script type="text/javascript" src="../_static/underscore.js"></script>
<script type="text/javascript" src="../_static/doctools.js"></script>
<script type="text/javascript" src="https://cdnjs.cloudflare.com/ajax/libs/mathjax/2.7.0/MathJax.js?config=TeX-AMS-MML_HTMLorMML"></script>
<script type="text/javascript" src="../_static/js/theme.js"></script>
<script src="https://maxcdn.bootstrapcdn.com/bootstrap/3.3.7/js/bootstrap.min.js" integrity="sha384-Tc5IQib027qvyjSMfHjOMaLkfuWVxZxUPnCJA7l2mCWNIpG9mGCD8wGNIcPD7Txa" crossorigin="anonymous"></script>
<script src="https://cdn.jsdelivr.net/perfect-scrollbar/0.6.14/js/perfect-scrollbar.jquery.min.js"></script>
<script src="../_static/js/paddle_doc_init.js"></script>
</body>
</html>
\ No newline at end of file
因为 它太大了无法显示 source diff 。你可以改为 查看blob
### Blocks with `for` and `RNNOp`
The following RNN model from the [RNN design doc](./rnn.md)
```python
x = sequence([10, 20, 30])
m = var(0)
W = tensor()
U = tensor()
rnn = create_rnn(inputs=[x])
with rnn.stepnet() as net:
x = net.set_inputs(0)
h = net.add_memory(init=m)
fc_out = pd.matmul(W, x)
hidden_out = pd.matmul(U, h.pre(n=1))
sum = pd.add_two(fc_out, hidden_out)
act = pd.sigmoid(sum)
h.update(act) # update memory with act
net.set_outputs(0, act, hidden_out) # two outputs
o1, o2 = rnn()
print o1, o2
```
has the following equivalent C++ program:
```c++
int x[] = {10, 20, 30};
int m = 0;
int W = some_value();
int U = some_other_value();
int mem[sizeof(x) / sizeof(x[0]) + 1];
int o1[sizeof(x) / sizeof(x[0]) + 1];
int o2[sizeof(x) / sizeof(x[0]) + 1];
for (int i = 1; i <= sizeof(x)/sizeof(x[0]); ++i) {
  int xi = x[i-1];
  if (i == 1) mem[0] = m;
  int fc_out = W * xi;
  int hidden_out = U * mem[i-1];
  int sum = fc_out + hidden_out;
  int act = sigmoid(sum);
  mem[i] = act;
  o1[i] = act;
  o2[i] = hidden_out;
}
print_array(o1);
print_array(o2);
```
## Compilation and Execution
Like TensorFlow programs, a PaddlePaddle program is written in Python. The first part describes a neural network as a protobuf message, and the rest executes the message for training or inference.

The generation of this protobuf message is analogous to how a compiler generates a binary executable file; the execution of the message is analogous to how the OS executes that binary file.
## The "Binary Executable File Format"
The definition of the protobuf message is as follows:
```protobuf
message BlockDesc {
repeated VarDesc vars = 1;
repeated OpDesc ops = 2;
}
```
The step net in the above RNN example would look like:
```
BlockDesc {
vars = {
VarDesc {...} // x
VarDesc {...} // h
VarDesc {...} // fc_out
VarDesc {...} // hidden_out
VarDesc {...} // sum
VarDesc {...} // act
}
ops = {
OpDesc {...} // matmul
OpDesc {...} // add_two
OpDesc {...} // sigmoid
}
};
```
Also, the RNN operator in the above example is serialized into a protobuf message of type `OpDesc` and would look like:
```
OpDesc {
inputs = {0} // the index of x
outputs = {5, 3} // indices of act and hidden_out
attrs {
"memories" : {1} // the index of h
"step_net" : <above step net>
}
};
```
This `OpDesc` value is in the `ops` field of the `BlockDesc` value representing the global block.
## The Compilation of Blocks
During the generation of the Protobuf message, the Block should store VarDesc (the Protobuf message that describes a Variable) and OpDesc (the Protobuf message that describes an Operator).

A VarDesc in a block should have its own name scope so that local variables do not affect the parent block's name scope.
A child block's name scope should inherit its parent's, so that an OpDesc in the child block can reference a VarDesc stored in the parent block. For example:
```python
a = pd.Variable(shape=[20, 20])
b = pd.fc(a, params=["fc.w", "fc.b"])
rnn = pd.create_rnn()
with rnn.stepnet() as net:
x = net.set_inputs(a)
# reuse fc's parameter
fc_without_b = pd.get_variable("fc.w")
net.set_outputs(fc_without_b)
out = rnn()
```
The method `pd.get_variable` retrieves a Variable by name. A Variable may be stored in a parent block but retrieved in a child block, so a block should have a variable scope that supports inheritance.
In compiler design, the symbol table is a data structure created and maintained by compilers to store information about the occurrence of various entities such as variable names, function names, classes, etc.
To store the definitions of variables and operators, we define a C++ class `SymbolTable`, like the one used in compilers.

`SymbolTable` supports the following:
- storing the definitions (names and attributes) of variables and operators,
- verifying whether a variable has been declared,
- making type checking possible (offering Protobuf message pointers to `InferShape` handlers).
```c++
// Information in SymbolTable is enough to trace the dependency graph. So perhaps
// it is enough for the Eval() interface to take a SymbolTable.
class SymbolTable {
 public:
  SymbolTable(SymbolTable* parent) : parent_(parent) {}

  OpDesc* NewOp(const string& name="");

  // TODO: determine whether `name` is generated by Python or C++;
  // currently we assume that C++ generates a unique name if the
  // argument `name` is left as the default (empty string).
  VarDesc* NewVar(const string& name="");

  // Find a VarDesc by name; if `recursive` is true, search the parent's
  // SymbolTable recursively.
  // This interface is introduced to support InferShape: it finds the protobuf
  // messages of variables and operators and passes pointers into InferShape.
  //
  // NOTE: maybe some C++ classes such as VarDescBuilder and OpDescBuilder should
  // be proposed and embedded into pybind to enable Python to operate on C++ pointers.
  VarDesc* FindVar(const string& name, bool recursive=true);

  OpDesc* FindOp(const string& name);

  BlockDesc Compile() const;

 private:
  SymbolTable* parent_;
  map<string, OpDesc> ops_;
  map<string, VarDesc> vars_;
};
```
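
To make the scope inheritance concrete, here is a minimal sketch of how `FindVar` might walk the parent chain. It assumes only the members declared above and is illustrative rather than a final implementation:

```c++
// Look up a VarDesc locally first; if it is not found and `recursive` is true,
// delegate to the parent block's SymbolTable. This is what lets an OpDesc in a
// child block (e.g. an RNN step net) reference a VarDesc such as "fc.w" that is
// stored in a parent block.
VarDesc* SymbolTable::FindVar(const string& name, bool recursive) {
  auto it = vars_.find(name);
  if (it != vars_.end()) {
    return &it->second;
  }
  if (recursive && parent_ != nullptr) {
    return parent_->FindVar(name, /*recursive=*/true);
  }
  return nullptr;
}
```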
After all the descriptions of variables and operators have been added to the SymbolTable,
the block has enough information to run.

The `Block` class takes a `BlockDesc` as input and provides `Run` and `InferShape` functions.
```c++
namespace {

class Block : public OperatorBase {
 public:
  Block(const BlockDesc& desc) : desc_(desc) {}

  void InferShape(const framework::Scope& scope) const override {
    if (!symbols_ready_) {
      CreateVariables(scope);
      CreateOperators();
    }
    // InferShape should be run first.
    for (auto& op : runtime_table_.ops()) {
      op->InferShape(scope);
    }
  }

  void Run(const framework::Scope& scope,
           const platform::DeviceContext& dev_ctx) const override {
    PADDLE_ENFORCE(symbols_ready_, "operators and variables should be created first.");
    for (auto& op : runtime_table_.ops()) {
      op->Run(scope, dev_ctx);
    }
  }

  void CreateVariables(const framework::Scope& scope);
  void CreateOperators();

  // some other necessary interfaces of NetOp are listed below
  // ...

 private:
  BlockDesc desc_;
  bool symbols_ready_{false};
  // NOTE: a runtime table holding the created operators (runtime_table_) is
  // also needed here; its exact type is left unspecified in this design.
};

}  // namespace
```
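
A hypothetical usage sketch of the interface above, assuming a `BlockDesc` named `desc` has already been produced (for example by `SymbolTable::Compile()`) and that a `scope` and `dev_ctx` are available; this is not the final API:

```c++
Block block(desc);
block.InferShape(scope);    // the first call creates the variables and operators, then infers shapes
block.Run(scope, dev_ctx);  // runs the block's operators sequentially
```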
## The Execution of Blocks
Block inherits from OperatorBase, which has a `Run` method.
Block's `Run` method runs its operators sequentially.

There is another important interface called `Eval`, which takes some arguments called targets, generates a minimal graph that has the targets as end points, and creates a new Block from it.
After running that block, `Eval` fetches the latest values of the targets and returns them.
The definition of Eval is as follows:
```c++
// Prune a block description by targets using the corresponding dependency graph,
// and return a new BlockDesc with a minimal number of operators.
// NOTE: this returns not a Block but the block's description, so that it can be
// distributed to a cluster.
BlockDesc Prune(const BlockDesc& desc, vector<string> targets);

void Block::Eval(const vector<string>& targets,
                 const framework::Scope& scope,
                 const platform::DeviceContext& dev_ctx) {
  BlockDesc min_desc = Prune(desc_, targets);
  Block min_block(min_desc);
  min_block.Run(scope, dev_ctx);
}
```
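
The following is a minimal sketch of how `Prune` might compute the minimal graph. It assumes protobuf-style accessors (`ops()`, `add_ops()`) on `BlockDesc` and name-based `inputs()`/`outputs()` accessors on `OpDesc`; these accessors, and the backward sweep itself, are assumptions for illustration rather than part of the final design:

```c++
BlockDesc Prune(const BlockDesc& desc, vector<string> targets) {
  set<string> needed(targets.begin(), targets.end());
  vector<OpDesc> kept;
  // Sweep the operators backwards: keep an operator only if it produces a
  // variable that a target, or an already-kept operator, needs.
  for (auto it = desc.ops().rbegin(); it != desc.ops().rend(); ++it) {
    bool produces_needed = false;
    for (const auto& out : it->outputs()) {
      if (needed.count(out)) produces_needed = true;
    }
    if (!produces_needed) continue;
    kept.push_back(*it);
    // The inputs of a kept operator become needed in turn.
    for (const auto& in : it->inputs()) {
      needed.insert(in);
    }
  }
  // Rebuild a description with the kept operators in their original order;
  // copying the corresponding VarDesc entries is omitted for brevity.
  BlockDesc min_desc;
  for (auto it = kept.rbegin(); it != kept.rend(); ++it) {
    *min_desc.add_ops() = *it;
  }
  return min_desc;
}
```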