Commit f997109b authored by Xin Pan

polish

Parent c1fdacd4
@@ -21,8 +21,7 @@ class Layer(object):
     # ...
     return self.apply(inputs)
-  def apply(inputs):
+  def forward(inputs):
     # forward logic with paddle operators. backward auto-generated.
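The renamed `forward` hook can be sketched as a minimal, framework-free class. A plain sketch of the contract, assuming `__call__` delegates to `forward`; the `Scale` subclass and its `factor` parameter are illustrative, not part of the doc:

```python
class Layer(object):
    """Minimal sketch of the Layer contract: __call__ delegates to forward."""
    def __call__(self, *inputs):
        # parameter creation/caching would happen here on first call
        return self.forward(*inputs)

    def forward(self, *inputs):
        raise NotImplementedError

class Scale(Layer):
    """Hypothetical layer: multiplies its input by a constant."""
    def __init__(self, factor):
        self.factor = factor

    def forward(self, x):
        return x * self.factor

print(Scale(3.0)(2.0))  # 6.0
```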
@@ -35,7 +34,8 @@ class PyLayer(core.PyLayer):
   def forward(inputs):
     # any forward logic implemented with numpy io.

-  @static method
+  @staticmethod
   def backward(inputs):
     # any backward logic implemented with numpy io.
```
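A numpy-only sketch of the `PyLayer` contract above, with `forward` and `backward` as static methods as the diff corrects; the `Tanh` subclass and its saved-activation class attribute are illustrative assumptions, not the real Paddle API:

```python
import numpy as np

class PyLayer(object):
    """Sketch: forward/backward both written with numpy io."""
    @staticmethod
    def forward(inputs):
        raise NotImplementedError

    @staticmethod
    def backward(douts):
        raise NotImplementedError

class Tanh(PyLayer):
    """Hypothetical numpy-implemented layer."""
    _saved_out = None  # toy: a real framework would save per-call context

    @staticmethod
    def forward(x):
        out = np.tanh(x)
        Tanh._saved_out = out
        return out

    @staticmethod
    def backward(dout):
        # d/dx tanh(x) = 1 - tanh(x)^2
        return dout * (1.0 - Tanh._saved_out ** 2)
```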
@@ -67,7 +67,6 @@ class Tracer {

Lots of research already.
https://autodiff-workshop.github.io/

## Tests

* All op tests run once in static graph, once in imperative mode.
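A hypothetical harness for that dual-mode rule might look like the following sketch; `run_static` and `run_imperative` are stand-ins for the real executors, not Paddle functions:

```python
def run_static(op, x):
    # stand-in: would build a program desc, then execute it
    return op(x)

def run_imperative(op, x):
    # stand-in: would eagerly trace and run the op
    return op(x)

def check_op(op, x):
    """Run the op once per mode and require identical results."""
    static_out = run_static(op, x)
    assert static_out == run_imperative(op, x)
    return static_out

check_op(lambda v: v * 2, 21)  # passes when both modes agree
```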
@@ -131,6 +130,7 @@ class MLP(fluid.imperative.Layer):
out._backward()
```
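Conceptually, `out._backward()` replays the trace that the tracer recorded during the forward pass. A toy tape sketch of that idea (all names here are hypothetical, not the Paddle API):

```python
class Var(object):
    """Toy traced variable: multiplication records a grad function."""
    def __init__(self, value):
        self.value = value
        self.grad = 0.0
        self._grad_fn = None
        self._parents = ()

    def __mul__(self, other):
        out = Var(self.value * other.value)
        out._parents = (self, other)
        def grad_fn(dout):
            # backward of z = x * y: dx += dout * y, dy += dout * x
            self.grad += dout * other.value
            other.grad += dout * self.value
        out._grad_fn = grad_fn
        return out

    def _backward(self):
        # walk the recorded trace from this output back to the leaves
        self.grad = 1.0
        stack = [(self, 1.0)]
        while stack:
            var, dout = stack.pop()
            if var._grad_fn is not None:
                var._grad_fn(dout)
                for p in var._parents:
                    stack.append((p, p.grad))
```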
# Plan

2.1, 3 fulltime: can run a few simple models. (Currently, 2 engineers at 20%.)

@@ -143,6 +143,7 @@ class MLP(fluid.imperative.Layer):

12.1, 5 fulltime: can compile to static graph, support more optimizations.

# Discussion

TODO.