wux_labs / Tensorflow
Commit 2e66ef0e
Authored on Jun 04, 2019 by Ayush Dubey
Committed by TensorFlower Gardener on Jun 04, 2019
Consume stderr from all processes in tests that fork multiple processes.
PiperOrigin-RevId: 251580137
Parent: 51f8f449
Showing 1 changed file with 36 additions and 13 deletions (+36, -13)
tensorflow/python/distribute/multi_worker_test_base.py  (+36, -13)

@@ -23,6 +23,7 @@ import contextlib
 import copy
 import json
 import os
+import six
 import subprocess
 import sys
 import threading
@@ -524,19 +525,41 @@ class MultiWorkerMultiProcessTest(test.TestCase):
     for return_code in return_codes:
       self.assertEqual(return_code, 0)

-  def stream_stderr(self, process):
-    # TODO(yuefengz): calling stream_stderr on a single process will probably
-    # make all processes hang if they have too much output e.g. adding
-    # --vmodule=execute=2 to cmd_args. But this method is useful for debugging
-    # purposes. We should figure out the hanging problem, probably by consuming
-    # outputs of all processes at the same time.
-    while True:
-      output = process.stderr.readline()
-      if not output and process.poll() is not None:
-        break
-      if output:
-        print(output.strip())
-        sys.stdout.flush()
+  def stream_stderr(self, processes, print_only_first=False):
+    """Consume stderr of all processes and print to stdout.
+
+    To reduce the amount of logging, caller can set print_only_first to True.
+    In that case, this function only prints stderr from the first process of
+    each type.
+
+    Arguments:
+      processes: A dictionary from process type string -> list of processes.
+      print_only_first: If true, only print output from first process of each
+        type.
+    """
+
+    def _stream_stderr_single_process(process, type_string, index,
+                                      print_to_stdout):
+      """Consume a single process's stderr and optionally print to stdout."""
+      while True:
+        output = process.stderr.readline()
+        if not output and process.poll() is not None:
+          break
+        if output and print_to_stdout:
+          print('{}{} {}'.format(type_string, index, output.strip()))
+          sys.stdout.flush()
+
+    stream_threads = []
+    for process_type, process_list in six.iteritems(processes):
+      for i in range(len(process_list)):
+        print_to_stdout = (not print_only_first) or (i == 0)
+        thread = threading.Thread(
+            target=_stream_stderr_single_process,
+            args=(process_list[i], process_type, i, print_to_stdout))
+        thread.start()
+        stream_threads.append(thread)
+    for thread in stream_threads:
+      thread.join()

   def get_tf_config_task():
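As a usage illustration (not part of the commit), here is a minimal sketch of how a test built on MultiWorkerMultiProcessTest might drive the new stream_stderr() signature; the ExampleStreamingTest class, the _start helper, the commands and the process counts are hypothetical placeholders:

# Illustrative sketch only: the class, helper, commands and process counts
# below are assumptions, not taken from this commit.
import subprocess

from tensorflow.python.distribute import multi_worker_test_base


class ExampleStreamingTest(
    multi_worker_test_base.MultiWorkerMultiProcessTest):

  def _start(self, cmd_args):
    # stderr must be a pipe so stream_stderr() can read it line by line.
    return subprocess.Popen(
        cmd_args, stderr=subprocess.PIPE, universal_newlines=True)

  def test_stream_all_workers(self):
    # processes maps a process type string to a list of processes, matching
    # the structure described in the new docstring.
    processes = {
        'worker': [self._start(['python', '-c', 'pass']) for _ in range(2)],
        'ps': [self._start(['python', '-c', 'pass'])],
    }
    # With print_only_first=True only worker0 / ps0 output is echoed, but
    # every process's stderr is still drained concurrently.
    self.stream_stderr(processes, print_only_first=True)

Reading each process's stderr on its own thread is what addresses the TODO removed in this commit, which warned that streaming only one process could make all processes hang once they produce too much output.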