How do I use the SGD + Adam style of optimization in Fluid?
Created by: Kayven
In v2 you would pass the optimizer to trainer.SGD through update_equation, as below, but Fluid does not support this pattern:
adam_optimizer = paddle.optimizer.Adam(
    learning_rate=1e-4,
    regularization=paddle.optimizer.L2Regularization(rate=1e-3),
    model_average=paddle.optimizer.ModelAverage(average_window=0.5))
trainer = paddle.trainer.SGD(
    cost=cost,
    extra_layers=paddle.evaluator.auc(input=prob, label=label),
    parameters=parameters,
    update_equation=adam_optimizer,
    is_local=True)