A PaddlePaddle program is structured as a set of nested blocks, just like a C++ or Java program.
As described in [graph.md](./graph.md), the first five lines of the following PaddlePaddle program
```python
x = layer.data("images")
l = layer.data("label")
y = layer.fc(x)
cost = layer.mse(y, l)
optimize(cost)
train(cost, reader=mnist.train())
```
generates, or compiles, a PaddlePaddle program, which is represented by the following protobuf message:
```protobuf
message ProgramDesc {
  repeated BlockDesc blocks = 1;
}

message BlockDesc {
  repeated VarDesc vars = 1;
  repeated OpDesc ops = 2;
}

message OpDesc {
  AttrDesc attrs = 1;
  ...
}

message AttrDesc {
  required AttrType type = 1;
  // index into ProgramDesc::blocks when type==BLOCK
  optional int32 block = 2;
  ...
}
```
When each of the first five lines runs, the related Python function, e.g., `layer.fc`, calls a C++ `InferShape` function. This `InferShape` function needs to access the properties of the `VarDesc`s referenced by the current `OpDesc`. These `VarDesc`s might be defined not in the current block but in some ancestor block, so we must be able to trace the parent of a block.
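The lookup described above can be sketched as follows. This is a hypothetical in-memory mirror of `BlockDesc`, assuming each block records its parent's index (`-1` for the root block); the class and function names are illustrative, not PaddlePaddle's actual API:

```python
class Block:
    def __init__(self, idx, parent, vars):
        self.idx = idx        # this block's index into ProgramDesc::blocks
        self.parent = parent  # parent block's index, -1 for the root block
        self.vars = vars      # variable name -> VarDesc-like properties


def find_var(blocks, current_block, name):
    """Resolve a variable by walking from the current block up its ancestors."""
    b = current_block
    while b != -1:
        block = blocks[b]
        if name in block.vars:
            return block.vars[name]
        b = block.parent
    raise KeyError("variable %r not found in any enclosing block" % name)


blocks = [
    Block(0, -1, {"images": {"shape": [784]}}),  # root block
    Block(1, 0, {"hidden": {"shape": [100]}}),   # nested block inside block 0
]
# A variable used in block 1 but defined in the root block is still found:
print(find_var(blocks, 1, "images"))  # -> {'shape': [784]}
```

This is exactly why a block needs a parent pointer: without it, the walk from the current block to the root is impossible.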
A nested block is often an attribute of an operator, most likely an `IfElseOp` or a `WhileOp`. In the above design, all blocks are in `ProgramDesc::blocks`, which implicitly assigns a zero-based ID to each block: its index in `ProgramDesc::blocks`. `AttrDesc::block` can therefore be an integer block ID.
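The indexing scheme can be illustrated with a small sketch, using plain dicts as a stand-in for the protobuf messages (the field and attribute names mirror the message above; the `"while"` op type is just an example):

```python
# Blocks get zero-based IDs from their position in ProgramDesc::blocks,
# so an operator attribute refers to a nested block by integer index.
program = {
    "blocks": [
        # Block 0 (root): holds an op whose block attribute points at block 1.
        {"ops": [{"type": "while", "attrs": {"block": 1}}]},
        # Block 1: the nested block owned by that op.
        {"ops": []},
    ]
}

op = program["blocks"][0]["ops"][0]
nested = program["blocks"][op["attrs"]["block"]]  # resolve the ID to a block
assert nested is program["blocks"][1]
```

Because the ID is just an index, resolving `AttrDesc::block` is a constant-time array lookup, and the protobuf message stays free of recursive nesting.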
With this design, the `InferShape` function should take the following parameters:
```c++
void InferShape(const ProgramDesc* program,
                int current_block,
                int current_operator) {
  ...
}
```
where
- `current_block` is an index into `ProgramDesc::blocks`,
- `current_operator` is an index into `BlockDesc::ops`.
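A driver for shape inference would then visit every operator of every block with exactly this `(block, operator)` index pair. The sketch below assumes a dict mirror of `ProgramDesc`; the function name is hypothetical:

```python
def shape_inference_schedule(program):
    """Enumerate the (current_block, current_operator) pairs that
    InferShape(program, current_block, current_operator) would be
    called with, one call per operator in the program."""
    calls = []
    for block_id, block in enumerate(program["blocks"]):
        for op_id in range(len(block["ops"])):
            # InferShape(program, block_id, op_id) would run here.
            calls.append((block_id, op_id))
    return calls


program = {"blocks": [{"ops": ["fc", "mse"]}, {"ops": ["fc"]}]}
print(shape_inference_schedule(program))  # -> [(0, 0), (0, 1), (1, 0)]
```

Passing the whole `ProgramDesc` plus two indices, instead of a pointer to the current `OpDesc`, is what lets `InferShape` climb from the current block to its ancestors when a variable is defined in an enclosing scope.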