reduce operations on tensors with 5 or more dimensions give wrong results?
Created by: OleNet
import paddle.fluid as fluid
import paddle.fluid.layers as L
import numpy as np

# Build a program that broadcasts two 6-D tensors, multiplies them elementwise,
# and reduces over the last dimension.
mp, sp = fluid.Program(), fluid.Program()
with fluid.program_guard(mp, sp):
    a = fluid.layers.data("a", shape=(1, 1, 2, 1, 1, 2), dtype='float32', append_batch_size=False)
    b = fluid.layers.data("b", shape=(1, 1, 2, 1, 3, 2), dtype='float32', append_batch_size=False)
    c = a * b  # broadcasted elementwise multiply, shape (1, 1, 2, 1, 3, 2)
    fluid.layers.Print(c)
    d = L.reduce_sum(c, dim=-1, keep_dim=True)

a = np.array([0.426218, 0.385291, 0.299569, 0.456716]).reshape([1, 1, 2, 1, 1, 2])
b = np.array([0.417022, 0.720325, 0.146756, 0.0923386, 0.000114375, 0.302333,
              0.417022, 0.720325, 0.000114375, 0.302333, 0.146756, 0.0923386]).reshape([1, 1, 2, 1, 3, 2])

# Reference result computed with numpy.
print(np.sum(a * b, axis=-1).squeeze())

place = fluid.CPUPlace()
exe = fluid.Executor(place)
exe.run(sp)
d = exe.run(mp, feed={"a": a, "b": b}, fetch_list=[d])
print(d[0].squeeze())
The numpy reference result (reduce after broadcasting) is:
[[0.45527702 0.09812728 0.11653493]
[0.45391082 0.13811458 0.08613606]]
What paddle computes after broadcasting and reduce_sum is:
[[1.77742283e-01 2.77534740e-01 6.25500488e-02]
[3.55772315e-02 4.87486838e-05 1.16486184e-01]]
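If the breakage is specific to inputs with rank >= 5 (my assumption, not verified), a possible workaround is to collapse the extra leading dimensions with reshape before reducing and restore them afterwards. An untested sketch, meant to go inside the same program_guard block as the repro above (c_flat, d_flat, d_fixed are names I made up):

# Untested workaround sketch: collapse c to rank 3, reduce over the last
# dimension, then restore the original 6-D layout.
c_flat = L.reshape(c, shape=[2, 3, 2])                 # (1,1,2,1,3,2) -> (2,3,2)
d_flat = L.reduce_sum(c_flat, dim=-1, keep_dim=True)   # (2,3,1)
d_fixed = L.reshape(d_flat, shape=[1, 1, 2, 1, 3, 1])  # back to the 6-D layout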
A quick test shows that both reduce_sum and reduce_mean in paddle have this problem on high-dimensional tensors.
This reproduces on both 1.8.0 and 1.8.2.
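For the reduce_mean case mentioned above, a minimal self-contained sketch of the same kind of check (the random inputs and variable names here are mine, not part of the original report):

import paddle.fluid as fluid
import paddle.fluid.layers as L
import numpy as np

# Same 6-D broadcast setup as the repro above, but reducing with reduce_mean.
mp, sp = fluid.Program(), fluid.Program()
with fluid.program_guard(mp, sp):
    a_var = fluid.layers.data("a", shape=(1, 1, 2, 1, 1, 2), dtype='float32', append_batch_size=False)
    b_var = fluid.layers.data("b", shape=(1, 1, 2, 1, 3, 2), dtype='float32', append_batch_size=False)
    d_mean = L.reduce_mean(a_var * b_var, dim=-1, keep_dim=True)

a = np.random.rand(1, 1, 2, 1, 1, 2).astype('float32')
b = np.random.rand(1, 1, 2, 1, 3, 2).astype('float32')

exe = fluid.Executor(fluid.CPUPlace())
exe.run(sp)
out = exe.run(mp, feed={"a": a, "b": b}, fetch_list=[d_mean])[0]
print(np.mean(a * b, axis=-1).squeeze())  # numpy reference
print(out.squeeze())                      # paddle result, mismatches here as well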