### Return value of layer functions

A layer function returns a Variable, which is also the output of an operator. However, the output of a layer function carries more attributes than an operator's output: the layer's parameter variables, and their gradient variables, need to be returned as well. Returning them is useful. For example,
1. Users can debug the network by printing parameter gradients.
2. Users can set attributes on a parameter. For example, `param.stop_gradient=True` stops the parameter from generating a gradient, which lets us fix the parameter's value during training (both uses are sketched just below).
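Both uses could look roughly like the following. This is a hypothetical sketch: `fc`, `image`, and the `param` and `grad` fields on the return value follow the convention described in the next paragraph, and are illustrative names rather than a confirmed API.

```python
out = fc(input=image, size=128)  # hypothetical layer call

print(out.grad)                  # 1. inspect the parameter's gradient variable
out.param.stop_gradient = True   # 2. fix the parameter's value during training
```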
Even so, it is good for a layer to return a single Variable, since all layers and operators take Variables as their inputs. Because Python is dynamically typed, we can simply attach a `param` field and a `grad` field to the Variable that the layer function returns.
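The sketch below shows this convention end to end using plain-Python stand-ins; the `Variable` class, the body of `fc`, and the `@GRAD` naming are illustrative placeholders, not the framework's real implementation.

```python
class Variable:
    """Plain-Python stand-in for a framework Variable."""
    def __init__(self, name):
        self.name = name
        self.stop_gradient = False

def fc(input, size):
    # Create the layer's parameter variable and its gradient variable.
    # In a real layer, `size` would determine the parameter's shape;
    # the "@GRAD" suffix is an illustrative naming convention.
    w = Variable("fc.w")
    w_grad = Variable("fc.w@GRAD")

    # The operator's output is an ordinary Variable ...
    out = Variable("fc.out")

    # ... and dynamic typing lets us attach the extra fields to it.
    out.param = w
    out.grad = w_grad
    return out

out = fc(input=Variable("image"), size=128)
print(out.name, out.param.name, out.grad.name)  # fc.out fc.w fc.w@GRAD
```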
<spanid="return-value-of-layer-functions"></span><h3>Return value of layer functions<aclass="headerlink"href="#return-value-of-layer-functions"title="Permalink to this headline">¶</a></h3>
<p>The layer will return a Variable, which is also the output of an operator. However, outputs of a layer function have more attributes than an operator. There are parameter variables, and their gradient variables need to return. To return them is useful. For example,</p>
<olclass="simple">
<li>Users can debug the network by printing parameter gradients.</li>
<li>Users can append attributes to a parameter, such as, <codeclass="docutils literal"><spanclass="pre">param.stop_gradient=True</span></code> will make a parameter stop generate the gradient. We can fix the parameter value during training by using this attribute.</li>
</ol>
<p>However, it is good to return a Variable for layers, since all layers and operators use Variables as their parameters. We can just append a <codeclass="docutils literal"><spanclass="pre">param</span></code> field and a <codeclass="docutils literal"><spanclass="pre">grad</span></code> field for layer function since the Python is dynamic typing.</p>
The layer will return a Variable, which is also the output of an operator. However, outputs of a layer function have more attributes than an operator. There are parameter variables, and their gradient variables need to return. To return them is useful. For example,
1. Users can debug the network by printing parameter gradients.
2. Users can append attributes to a parameter, such as, `param.stop_gradient=True` will make a parameter stop generate the gradient. We can fix the parameter value during training by using this attribute.
However, it is good to return a Variable for layers, since all layers and operators use Variables as their parameters. We can just append a `param` field and a `grad` field for layer function since the Python is dynamic typing.
<spanid="return-value-of-layer-functions"></span><h3>Return value of layer functions<aclass="headerlink"href="#return-value-of-layer-functions"title="永久链接至标题">¶</a></h3>
<p>The layer will return a Variable, which is also the output of an operator. However, outputs of a layer function have more attributes than an operator. There are parameter variables, and their gradient variables need to return. To return them is useful. For example,</p>
<olclass="simple">
<li>Users can debug the network by printing parameter gradients.</li>
<li>Users can append attributes to a parameter, such as, <codeclass="docutils literal"><spanclass="pre">param.stop_gradient=True</span></code> will make a parameter stop generate the gradient. We can fix the parameter value during training by using this attribute.</li>
</ol>
<p>However, it is good to return a Variable for layers, since all layers and operators use Variables as their parameters. We can just append a <codeclass="docutils literal"><spanclass="pre">param</span></code> field and a <codeclass="docutils literal"><spanclass="pre">grad</span></code> field for layer function since the Python is dynamic typing.</p>