Some confusion about using the model
Created by: jinbiaomao
While defining and training a linear regression model, I ran into a problem.

The training data is read from a file:
```python
# define training dataset reader
def train_reader():
    train_x = []
    train_y = []
    for line in open("test2.txt"):
        line = line.split('|')
        x = [int(line[2])]
        y = [int(line[3])]
        train_x.append(x)
        train_y.append(y)
    train_y = np.array(train_y)
    train_x = np.array(train_x)

    def reader():
        for i in xrange(train_y.shape[0]):
            yield train_x[i], train_y[i]

    return reader
```
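As a quick sanity check, the parsing logic above can be exercised in isolation with plain NumPy (a sketch using two hypothetical sample lines in the same `date|name|x|y|` format; no Paddle involved):

```python
import numpy as np

# Two sample lines in the same "date|name|x|y|" format as test2.txt
lines = [
    "2017/3/3|地图|1|1000|",
    "2017/3/5|地图|2|900|",
]

train_x, train_y = [], []
for line in lines:
    fields = line.split('|')
    train_x.append([int(fields[2])])  # field 2 is the feature
    train_y.append([int(fields[3])])  # field 3 is the label
train_x = np.array(train_x)
train_y = np.array(train_y)

# the inner reader() yields one (x, y) pair per row
def reader():
    for i in range(train_y.shape[0]):
        yield train_x[i], train_y[i]

for x, y in reader():
    print(x, y)
```

If this prints the expected `(x, y)` pairs, the reader itself is not the source of the problem.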
The file contents are shown below:

```
2017/3/3|地图|1|1000|
2017/3/5|地图|2|900|
2017/3/23|地图|3|800|
2017/3/25|地图|4|700|
2017/3/28|地图|5|600|
2017/4/1|地图|6|500|
2017/4/2|地图|7|400|
2017/4/3|地图|8|300|
2017/4/4|地图|9|200|
2017/4/4|地图|10|100|
```

When training finishes and I run inference, the results are:

```
[ 164.00054932]
[ 190.17622375]
[ 216.35189819]
[ 242.52755737]
[ 268.70324707]
```
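For reference, the data in the file is exactly linear (y = 1100 - 100x), so an ordinary least-squares fit (a NumPy sketch, independent of Paddle) recovers the line exactly and shows what converged predictions for the test inputs would look like:

```python
import numpy as np

x = np.arange(1, 11)            # column 3 of the file: 1..10
y = np.arange(1000, 0, -100)    # column 4 of the file: 1000, 900, ..., 100

# degree-1 least-squares fit returns (slope, intercept)
slope, intercept = np.polyfit(x, y, 1)
print(slope, intercept)         # -100.0, 1100.0

test_x = np.array([1, 2, 3, 4, 5])
print(slope * test_x + intercept)   # 1000, 900, 800, 700, 600
```

A fully trained model should therefore predict roughly 1000, 900, 800, 700, 600 for the test inputs, which is quite far from the values above.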
The test code is:

```python
# test
test_data = [[1], [2], [3], [4], [5]]

# inference
probs = paddle.infer(output_layer=y_predict,
                     parameters=parameters,
                     input=test_data)
for data in probs:
    print data
```

The model configuration is as follows:

```python
# network config
x = paddle.layer.data(name='x', type=paddle.data_type.dense_vector(1))
y_predict = paddle.layer.fc(input=x,
                            size=1,
                            act=paddle.activation.Linear())
y = paddle.layer.data(name='y', type=paddle.data_type.dense_vector(1))
cost = paddle.layer.mse_cost(input=y_predict, label=y)
```
```python
# create parameters
parameters = paddle.parameters.create(cost)

# create optimizer
optimizer = paddle.optimizer.Momentum(momentum=0)

# create trainer
trainer = paddle.trainer.SGD(cost=cost,
                             parameters=parameters,
                             update_equation=optimizer)
```

I'm still not entirely clear on what is going wrong here, and would appreciate an explanation.
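One thing worth checking is the optimizer: `Momentum(momentum=0)` is plain SGD with whatever default learning rate applies, and with raw labels in the hundreds the model may simply not have converged after the training pass, which would explain predictions that are far from the data. A pure-NumPy sketch of the same one-variable regression (the learning rate and epoch count here are illustrative assumptions, not Paddle's defaults) shows that this data does converge to the exact line given enough well-tuned SGD steps:

```python
import numpy as np

x = np.arange(1, 11, dtype=float)   # features from the file: 1..10
y = 1100.0 - 100.0 * x              # labels: 1000, 900, ..., 100

w, b = 0.0, 0.0                     # start like an untrained fc layer
lr = 0.01                           # illustrative learning rate
for epoch in range(20000):
    err = w * x + b - y
    # gradients of mean squared error w.r.t. w and b
    w -= lr * 2.0 * np.mean(err * x)
    b -= lr * 2.0 * np.mean(err)

print(w, b)                                   # approaches -100 and 1100
print(w * np.array([1, 2, 3, 4, 5]) + b)      # approaches 1000, 900, 800, 700, 600
```

If the Paddle run stops well short of this, increasing the number of passes, tuning the learning rate, or normalizing the labels to a smaller range are the usual remedies to try.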