Meet a problem when using lstm_step in recurrent group

Closed · Opened May 09, 2017 by saxon_zh (Guest)

Created by: kuke

I am implementing a grid LSTM demo with the v2 API. When I try to pass a memory object to the state parameter of the function lstm_step(), I get a complaint about an unknown input layer for lstm_step. I also changed the input from a memory object to other types of inputs, but the situation does not get better.

The lstm_step function is called in the step function of a recurrent group, whose context resembles:

def grid_step():
    recurrent_group1(...)
    ...
    lstm_step()
    ...
    recurrent_group2()
    ...

recurrent_group(step=grid_step)
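To make the state wiring concrete, here is a minimal sketch of the pattern I am describing inside grid_step. It is illustrative only: the layer names and sizes, and the assumption that the v2 wrappers mirror the trainer_config_helpers names (memory, mixed, full_matrix_projection, lstm_step, get_output), are mine, not lines copied from my actual script.

import paddle.v2 as paddle

def grid_step(ipt):
    # memories carrying the previous hidden output and the previous cell state
    # (names and size=128 are illustrative)
    out_mem = paddle.layer.memory(name='decoder_lstm1', size=128)
    state_mem = paddle.layer.memory(name='decoder_lstm1_state', size=128)

    # 4*size projection of the current input plus the previous output
    lstm_input = paddle.layer.mixed(
        size=128 * 4,
        input=[
            paddle.layer.full_matrix_projection(input=ipt),
            paddle.layer.full_matrix_projection(input=out_mem)
        ])

    # the state parameter is fed the memory object; this is the call that
    # triggers "Unknown input layer 'decoder_lstm1_state@...'"
    lstm_out = paddle.layer.lstm_step(
        name='decoder_lstm1', input=lstm_input, state=state_mem, size=128)

    # expose the cell state so the state memory can be bound to it
    paddle.layer.get_output(
        name='decoder_lstm1_state', input=lstm_out, arg_name='state')
    return lstm_out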

lstm_step should not go wrong in a simple test, so I wonder what causes this error: the surrounding context or something else.

The usage of lstm_step can be found in the grid LSTM source code, from line 138 to line 151.

And the error information:

/home/work/.jumbo/lib/python2.7/site-packages/sklearn/externals/joblib/_multiprocessing_helpers.py:28: UserWarning: This platform lacks a functioning sem_open implementation, therefore, the required synchronization primitives needed will not function, see issue 3770..  joblib will operate in serial mode
  warnings.warn('%s.  joblib will operate in serial mode' % (e,))
I0509 17:00:01.274682 15010 Util.cpp:166] commandline:  --use_gpu=False --trainer_count=4
[CRITICAL 2017-05-09 17:00:01,554 layers.py:3023] Unknown input layer 'decoder_lstm1_state@anotation_lstm1_lstm_decoder_group@grid_decoder_group' for layer decoder_lstm1@anotation_lstm1_lstm_decoder_group@grid_decoder_group
Traceback (most recent call last):
  File "grid_lstm_v2.py", line 350, in <module>
    main()
  File "grid_lstm_v2.py", line 346, in main
    train()
  File "grid_lstm_v2.py", line 269, in train
    parameters = paddle.parameters.create(cost)
  File "/home/work/.jumbo/lib/python2.7/site-packages/paddle/v2/parameters.py", line 19, in create
    topology = Topology(layers)
  File "/home/work/.jumbo/lib/python2.7/site-packages/paddle/v2/topology.py", line 69, in __init__
    layers, extra_layers=extra_layers)
  File "/home/work/.jumbo/lib/python2.7/site-packages/paddle/v2/layer.py", line 96, in parse_network
    return __parse__(__real_func__)
  File "/home/work/.jumbo/lib/python2.7/site-packages/paddle/trainer_config_helpers/config_parser_utils.py", line 32, in parse_network_config
    config = config_parser.parse_config(network_conf, config_arg_str)
  File "/home/work/.jumbo/lib/python2.7/site-packages/paddle/trainer/config_parser.py", line 3598, in parse_config
    trainer_config()
  File "/home/work/.jumbo/lib/python2.7/site-packages/paddle/v2/layer.py", line 89, in __real_func__
    real_output = [each.to_proto(context=context) for each in output_layers]
  File "/home/work/.jumbo/lib/python2.7/site-packages/paddle/v2/config_base.py", line 109, in to_proto
    context=context)
  File "/home/work/.jumbo/lib/python2.7/site-packages/paddle/v2/config_base.py", line 112, in to_proto
    self.__parent_layers__[layer_name])
  File "/home/work/.jumbo/lib/python2.7/site-packages/paddle/v2/config_base.py", line 111, in <lambda>
    v1_layer = map(lambda x: x.to_proto(context=context),
  File "/home/work/.jumbo/lib/python2.7/site-packages/paddle/v2/config_base.py", line 112, in to_proto
    self.__parent_layers__[layer_name])
  File "/home/work/.jumbo/lib/python2.7/site-packages/paddle/v2/config_base.py", line 111, in <lambda>
    v1_layer = map(lambda x: x.to_proto(context=context),
  File "/home/work/.jumbo/lib/python2.7/site-packages/paddle/v2/config_base.py", line 109, in to_proto
    context=context)
  File "/home/work/.jumbo/lib/python2.7/site-packages/paddle/v2/config_base.py", line 109, in to_proto
    context=context)
  File "/home/work/.jumbo/lib/python2.7/site-packages/paddle/v2/config_base.py", line 112, in to_proto
    self.__parent_layers__[layer_name])
  File "/home/work/.jumbo/lib/python2.7/site-packages/paddle/v2/config_base.py", line 111, in <lambda>
    v1_layer = map(lambda x: x.to_proto(context=context),
  File "/home/work/.jumbo/lib/python2.7/site-packages/paddle/v2/config_base.py", line 100, in to_proto

  ?
    p.to_proto(context=context)
  File "/home/work/.jumbo/lib/python2.7/site-packages/paddle/v2/config_base.py", line 100, in to_proto
    p.to_proto(context=context)
  File "/home/work/.jumbo/lib/python2.7/site-packages/paddle/v2/config_base.py", line 112, in to_proto
    self.__parent_layers__[layer_name])
  File "/home/work/.jumbo/lib/python2.7/site-packages/paddle/v2/config_base.py", line 111, in <lambda>
    v1_layer = map(lambda x: x.to_proto(context=context),
  File "/home/work/.jumbo/lib/python2.7/site-packages/paddle/v2/config_base.py", line 112, in to_proto
    self.__parent_layers__[layer_name])
  File "/home/work/.jumbo/lib/python2.7/site-packages/paddle/v2/config_base.py", line 111, in <lambda>
    v1_layer = map(lambda x: x.to_proto(context=context),
  File "/home/work/.jumbo/lib/python2.7/site-packages/paddle/v2/config_base.py", line 109, in to_proto
    context=context)
  File "/home/work/.jumbo/lib/python2.7/site-packages/paddle/v2/config_base.py", line 116, in to_proto
    ret_val = self.to_proto_impl(**kwargs)
  File "/home/work/.jumbo/lib/python2.7/site-packages/paddle/v2/config_base.py", line 212, in to_proto_impl
    return getattr(conf_helps, method_name)(**args)
  File "/home/work/.jumbo/lib/python2.7/site-packages/paddle/trainer_config_helpers/default_decorators.py", line 53, in __wrapper__
    return func(*args, **kwargs)
  File "/home/work/.jumbo/lib/python2.7/site-packages/paddle/trainer_config_helpers/default_decorators.py", line 53, in __wrapper__
    return func(*args, **kwargs)
  File "/home/work/.jumbo/lib/python2.7/site-packages/paddle/trainer_config_helpers/default_decorators.py", line 53, in __wrapper__
    return func(*args, **kwargs)
  File "/home/work/.jumbo/lib/python2.7/site-packages/paddle/trainer_config_helpers/default_decorators.py", line 53, in __wrapper__
    return func(*args, **kwargs)
  File "/home/work/.jumbo/lib/python2.7/site-packages/paddle/trainer_config_helpers/layers.py", line 331, in wrapper
    return method(*args, **kwargs)
  File "/home/work/.jumbo/lib/python2.7/site-packages/paddle/trainer_config_helpers/layers.py", line 3023, in lstm_step_layer
    **ExtraLayerAttribute.to_kwargs(layer_attr))
  File "/home/work/.jumbo/lib/python2.7/site-packages/paddle/trainer/config_parser.py", line 3181, in Layer
    return layer_func(name, **xargs)
  File "/home/work/.jumbo/lib/python2.7/site-packages/paddle/trainer/config_parser.py", line 2993, in __init__
    **xargs)
  File "/home/work/.jumbo/lib/python2.7/site-packages/paddle/trainer/config_parser.py", line 1428, in __init__
    (input_layer_name, name))
  File "/home/work/.jumbo/lib/python2.7/site-packages/paddle/trainer/config_parser.py", line 163, in config_assert
    logger.fatal(msg)
  File "/home/work/.jumbo/lib/python2.7/site-packages/paddle/trainer/config_parser.py", line 3518, in my_fatal
    raise Exception()
Exception
Reference: paddlepaddle/Paddle#2071