huber_cost runs incorrectly (related to issue #10)
Closed
Created by: chrisxu2016
@lcy-seso huber_cost produces incorrect results when used as below; this is related to issue #10.
## Example
```python
import paddle.v2 as paddle
import paddle.v2.dataset.uci_housing as uci_housing


def main():
    # init
    paddle.init(use_gpu=False, trainer_count=1)

    # network config
    x = paddle.layer.data(
        name='x',
        type=paddle.data_type.dense_vector(13))
    y_predict = paddle.layer.fc(
        input=x,
        size=1,
        act=paddle.activation.Linear())
    y = paddle.layer.data(
        name='y',
        type=paddle.data_type.dense_vector(1))
    cost = paddle.layer.huber_cost(input=y_predict, label=y)

    # create parameters
    parameters = paddle.parameters.create(cost)

    # create optimizer
    optimizer = paddle.optimizer.Momentum(momentum=0)

    trainer = paddle.trainer.SGD(
        cost=cost, parameters=parameters, update_equation=optimizer)

    feeding = {'x': 0, 'y': 1}

    # event_handler to print training and testing info
    def event_handler(event):
        if isinstance(event, paddle.event.EndIteration):
            if event.batch_id % 100 == 0:
                print "Pass %d, Batch %d, Cost %f" % (
                    event.pass_id, event.batch_id, event.cost)

        if isinstance(event, paddle.event.EndPass):
            result = trainer.test(
                reader=paddle.batch(uci_housing.test(), batch_size=2),
                feeding=feeding)
            print "Test %d, Cost %f" % (event.pass_id, result.cost)

    # training
    trainer.train(
        reader=paddle.batch(
            paddle.reader.shuffle(uci_housing.train(), buf_size=500),
            batch_size=2),
        feeding=feeding,
        event_handler=event_handler,
        num_passes=30)


if __name__ == '__main__':
    main()
```
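For reference, the standard Huber loss on a residual `a = y_predict - y` is quadratic for `|a| <= delta` and linear beyond that. The NumPy sketch below (the function name and `delta=1.0` default are my own, not part of any Paddle API) can be used to sanity-check the per-sample values a regression-style Huber cost should produce:

```python
import numpy as np


def huber_loss(y_predict, y, delta=1.0):
    # Reference Huber loss: 0.5 * a^2 for |a| <= delta,
    # delta * (|a| - 0.5 * delta) otherwise.
    residual = np.abs(y_predict - y)
    quadratic = 0.5 * residual ** 2
    linear = delta * (residual - 0.5 * delta)
    return np.where(residual <= delta, quadratic, linear)


# usage: compare against the cost the layer reports per sample
print(huber_loss(np.array([2.5, 0.3]), np.array([2.0, 0.0])))
```

Note that in later PaddlePaddle releases the original `huber_cost` was split into classification and regression variants (`huber_classification_cost` / `huber_regression_cost`); if your build exposes a regression variant, that is the one matching the reference above (treat the exact layer names as an assumption, since they may differ by version).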
Failure information
Created by: shanyi15
Hello, this issue has not been updated in the past month, so we will close it today for the sake of other users' experience. If you still need to follow up on this question after it is closed, please feel free to reopen it and we will get back to you within 24 hours. We apologize for any inconvenience caused by the closure and thank you for your support of PaddlePaddle!