batch_norm in static-graph mode raises "got an unexpected keyword argument 'fuse_with_relu'"
Created by: dbsxdbsx
win10_64, paddlepaddle_1.8, CUDA 7.5, GPU build. While hacking on one of the networks in PARL's examples, I found that batch_norm cannot be used:
```python
import paddle.fluid as fluid
import parl
from parl import layers

LOG_SIG_MAX = 2.0
LOG_SIG_MIN = -20.0


class ActorModel(parl.Model):
    def __init__(self, act_dim):
        hid1_size = 400
        hid2_size = 300

        self.fc1 = layers.fc(size=hid1_size)
        self.bn1 = layers.batch_norm(act='relu')
        self.fc2 = layers.fc(size=hid2_size)
        self.bn2 = layers.batch_norm(act='relu')
        self.mean_linear = layers.fc(size=act_dim)
        self.log_std_linear = layers.fc(size=act_dim)
        # origin version
        # self.fc1 = layers.fc(size=hid1_size, act='relu')
        # self.fc2 = layers.fc(size=hid2_size, act='relu')
        # self.mean_linear = layers.fc(size=act_dim)
        # self.log_std_linear = layers.fc(size=act_dim)

    def policy(self, obs):
        hid1 = self.fc1(obs)
        bn1 = self.bn1(hid1)
        hid2 = self.fc2(bn1)
        bn2 = self.bn2(hid2)
        means = self.mean_linear(bn2)
        log_std = self.log_std_linear(bn2)
        log_std = layers.clip(log_std, min=LOG_SIG_MIN, max=LOG_SIG_MAX)
        # origin version
        # hid1 = self.fc1(obs)
        # hid2 = self.fc2(hid1)
        # means = self.mean_linear(hid2)
        # log_std = self.log_std_linear(hid2)
        # log_std = layers.clip(log_std, min=LOG_SIG_MIN, max=LOG_SIG_MAX)
        return means, log_std
```
The lines commented with `# origin version` are the original code, which runs fine; the uncommented lines are my modified version, which raises:
TypeError: batch_norm() got an unexpected keyword argument 'fuse_with_relu'
Additionally, when I pass an `input` argument to batch_norm(), it also complains that the `input` parameter cannot be found. Has the batch_norm signature changed in the new version? I was following the example here.
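For comparison, calling fluid.layers.batch_norm directly in paddle 1.8 still takes the tensor through `input` and the activation through `act`, but, judging by the TypeError above, no longer accepts `fuse_with_relu`, so the `parl.layers` wrapper appears to forward an argument that was removed from fluid. A minimal standalone sketch of the direct call (the variable names are made up for illustration):

```python
import paddle.fluid as fluid

# a plain static-graph program, independent of PARL
x = fluid.data(name='x', shape=[None, 400], dtype='float32')

# fluid.layers.batch_norm in 1.8 takes the tensor via `input` and the
# activation via `act`; it has no fuse_with_relu parameter
out = fluid.layers.batch_norm(input=x, act='relu')
```

If this direct call runs, the problem would seem to lie in the PARL wrapper rather than in the network definition itself.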