Many arguments in the fluid version lack type-compatibility handling
Created by: jshower
While using the examples for Paddle's fluid API, I found a problem: argument types are not handled for compatibility. For example, I define a fully connected layer whose bias initializer is a NormalInitializer with scale set to 1 (an int), as follows:
fc1 = fluid.layers.fc(input=drop, size=4096, act=None, bias_attr=fluid.ParamAttr(initializer=NormalInitializer(loc=0.0, scale=1, seed=0)))
Modifying line 89 of Paddle/python/paddle/fluid/tests/book/test_image_classification.py in this way produces:
Traceback (most recent call last):
  File "test_image_classification.py", line 252, in test_vgg_cpu
    main('vgg', use_cuda=False)
  File "test_image_classification.py", line 237, in main
    train(net_type, use_cuda, save_dirname, is_local)
  File "test_image_classification.py", line 105, in train
    net = vgg16_bn_drop(images)
  File "test_image_classification.py", line 89, in vgg16_bn_drop
    fc1 = fluid.layers.fc(input=drop, size=4096, act=None, bias_attr=fluid.ParamAttr(initializer=NormalInitializer(loc=0.0, scale=1, seed=0)))
  File "/usr/local/lib/python2.7/dist-packages/paddle/fluid/layers/nn.py", line 193, in fc
    pre_activation = helper.append_bias_op(pre_bias, dim_start=num_flatten_dims)
  File "/usr/local/lib/python2.7/dist-packages/paddle/fluid/layer_helper.py", line 385, in append_bias_op
    attr=bias_attr, shape=size, dtype=input_var.dtype, is_bias=True)
  File "/usr/local/lib/python2.7/dist-packages/paddle/fluid/layer_helper.py", line 312, in create_parameter
    dtype=dtype, shape=shape, **attr.to_kwargs(with_initializer=True))
  File "/usr/local/lib/python2.7/dist-packages/paddle/fluid/framework.py", line 810, in create_parameter
    kwargs['initializer'](param, self)
  File "/usr/local/lib/python2.7/dist-packages/paddle/fluid/initializer.py", line 239, in __call__
    "seed": self._seed
  File "/usr/local/lib/python2.7/dist-packages/paddle/fluid/framework.py", line 834, in prepend_op
    op = Operator(self, op_desc, *args, **kwargs)
  File "/usr/local/lib/python2.7/dist-packages/paddle/fluid/framework.py", line 484, in __init__
    self.desc.check_attrs()
EnforceNotMet: Cannot get attribute std by type f, its type is i at [/paddle/Paddle/paddle/fluid/framework/attribute.h:131]
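The message means the std attribute is read back as a float (type f) but was stored as an int (type i), i.e. the Python-side value scale=1 is written into the op's attributes without any conversion. A minimal standalone sketch that should reproduce this outside the test file (assuming the early fluid API, with a hypothetical data layer standing in for the dropout output drop):

```python
# Minimal repro sketch (not the original test file): the integer scale=1 is
# stored as an int attribute while the op reads "std" as a float.
import paddle.fluid as fluid
from paddle.fluid.initializer import NormalInitializer

# Hypothetical stand-in for the dropout output `drop` used in the issue.
x = fluid.layers.data(name='x', shape=[512], dtype='float32')

# Raises EnforceNotMet ("Cannot get attribute std by type f, its type is i")
# already at graph-construction time, inside append_bias_op.
fc1 = fluid.layers.fc(
    input=x,
    size=4096,
    act=None,
    bias_attr=fluid.ParamAttr(
        initializer=NormalInitializer(loc=0.0, scale=1, seed=0)))
```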
If I change scale=1 to scale=1.0, the problem goes away. Similarly, fluid.optimizer.Momentum(momentum=0, learning_rate=1e-3) reports the same kind of error, and changing momentum to 0.0 makes it disappear. This is a fairly obvious case of missing type-compatibility handling; the sketch below shows the float-literal workaround for both cases.
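Until the Python wrappers coerce these arguments themselves, the workaround is simply to pass float literals. A minimal sketch under the same assumptions as above:

```python
import paddle.fluid as fluid
from paddle.fluid.initializer import NormalInitializer

# Hypothetical stand-in input, as in the repro sketch above.
x = fluid.layers.data(name='x', shape=[512], dtype='float32')

# scale=1.0 (float) matches the declared attribute type, so this builds fine.
fc1 = fluid.layers.fc(
    input=x,
    size=4096,
    act=None,
    bias_attr=fluid.ParamAttr(
        initializer=NormalInitializer(loc=0.0, scale=1.0, seed=0)))

# Same story for the optimizer: momentum=0 (int) fails, momentum=0.0 works.
optimizer = fluid.optimizer.Momentum(momentum=0.0, learning_rate=1e-3)
```

On the library side, the fix would presumably amount to calling float() on such numeric arguments in the Python wrappers before they are written into the op's attributes.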