Commit 6915cb29 authored by Yi Wang

Add matplotlib for IPython

Parent 92ab6148
...@@ -176,9 +176,11 @@ opt.train(topology, parameters, reader=read, ...)
### Updater

Please be aware that a trainer can accept an updater as its data
member, where an updater is a class derived from
`paddle.trainer.Updater`. This is to make it easier to customize
trainers, as discussed
[here](https://github.com/PaddlePaddle/Paddle/issues/1319).
### Event Handler

...@@ -188,8 +190,8 @@ that handle some events:
1. BeginTraining
1. EndTraining
1. BeginIteration
1. EndIteration
1. BeginPass
1. EndPass
...@@ -200,12 +202,17 @@ An example as follows:
```python
def event_handler(event):
    if isinstance(event, paddle.event.EndIteration):
        print paddle.test(...)

paddle.train(topology, parameters, reader, event_handler)
```
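To make the callback contract concrete, here is a minimal, self-contained sketch of how a trainer might dispatch the events listed above to a user-supplied handler. The class names, the `cost` attribute, and the dummy training loop are illustrative assumptions, not the actual PaddlePaddle API.

```python
# Hypothetical event classes; real PaddlePaddle events live in paddle.event.
class EndIteration(object):
    def __init__(self, iteration, cost):
        self.iteration = iteration
        self.cost = cost

class EndPass(object):
    def __init__(self, pass_id):
        self.pass_id = pass_id

def train(num_passes, batches_per_pass, event_handler):
    # Dummy training loop: fire an EndIteration event per minibatch
    # and an EndPass event per pass.
    for pass_id in range(num_passes):
        for it in range(batches_per_pass):
            # ... run one minibatch here; cost below is a placeholder ...
            cost = 1.0 / (1 + pass_id * batches_per_pass + it)
            event_handler(EndIteration(it, cost))
        event_handler(EndPass(pass_id))

seen = []
def handler(event):
    # The handler inspects the event type, just like in the example above.
    if isinstance(event, EndIteration):
        seen.append(event.cost)

train(2, 3, handler)
print(len(seen))  # prints 6: one EndIteration per minibatch
```

The handler is a plain function, so users can close over any state they need (counters, cost history, timers) without subclassing anything.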
If we are writing a PaddlePaddle program in and for IPython/Jupyter,
we can use matplotlib in the event handler to plot a curve of
cost/error versus iterations, as shown
[here](https://blog.dominodatalab.com/interactive-dashboards-in-jupyter/).
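A minimal sketch of such a plotting handler follows, assuming matplotlib is installed; the event object and its `cost` attribute are stand-ins for whatever the real end-of-iteration event carries. In a notebook you would use `%matplotlib inline` instead of the headless backend chosen here.

```python
import matplotlib
matplotlib.use("Agg")  # headless backend for scripts; notebooks use %matplotlib inline
import matplotlib.pyplot as plt

costs = []

def plotting_handler(event):
    # Assume `event` is an end-of-iteration event carrying a cost value.
    costs.append(event.cost)
    plt.clf()
    plt.plot(costs)
    plt.xlabel("iteration")
    plt.ylabel("cost")

# Hypothetical event object used only to exercise the handler.
class FakeEndIteration(object):
    def __init__(self, cost):
        self.cost = cost

for c in [3.0, 2.0, 1.5, 1.2]:
    plotting_handler(FakeEndIteration(c))

print(len(costs))  # prints 4
```

Redrawing the whole curve on every event is simple and fine for coarse-grained iteration counts; for very frequent events one would plot only every Nth iteration.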
### Distributed Training
If a user wants to do distributed training on a cluster, s/he should
......