机器未来 / Paddle (forked from PaddlePaddle / Paddle)
Commit ae78940a (unverified)

[cherry-pick] inference fix trt problem (#35939)

* update xpu version

Authored on Sep 24, 2021 by Wilber; committed via GitHub on Sep 24, 2021
Parent: 0e19aeb9
Showing 4 changed files with 28 additions and 5 deletions (+28, -5):
cmake/external/lite.cmake (+1, -1)
cmake/external/xpu.cmake (+1, -1)
paddle/fluid/inference/api/analysis_predictor.cc (+18, -3)
paddle/fluid/inference/tensorrt/engine.h (+8, -0)
cmake/external/lite.cmake

@@ -50,7 +50,7 @@ if (NOT LITE_SOURCE_DIR OR NOT LITE_BINARY_DIR)
   set(LITE_INSTALL_DIR ${THIRD_PARTY_PATH}/install/lite)
   if(NOT LITE_GIT_TAG)
-    set(LITE_GIT_TAG 1c4698c6efd9a5f57a4f8369bd5b6374166f5ba4)
+    set(LITE_GIT_TAG 4ab64daecc11fbf74fffdc6a4733f388472e7d5d)
   endif()
   if(NOT CUDA_ARCH_NAME)
cmake/external/xpu.cmake

@@ -35,7 +35,7 @@ ELSE ()
 ENDIF()

 SET(XPU_BASE_URL_WITHOUT_DATE "https://baidu-kunlun-product.cdn.bcebos.com/KL-SDK/klsdk-dev")
-SET(XPU_BASE_URL "${XPU_BASE_URL_WITHOUT_DATE}/20210909")
+SET(XPU_BASE_URL "${XPU_BASE_URL_WITHOUT_DATE}/20210921")
 SET(XPU_XRE_URL "${XPU_BASE_URL}/${XPU_XRE_DIR_NAME}.tar.gz" CACHE STRING "" FORCE)
 SET(XPU_XDNN_URL "${XPU_BASE_URL}/${XPU_XDNN_DIR_NAME}.tar.gz" CACHE STRING "" FORCE)
 SET(XPU_XCCL_URL "${XPU_BASE_URL_WITHOUT_DATE}/20210623/${XPU_XCCL_DIR_NAME}.tar.gz" CACHE STRING "" FORCE)
paddle/fluid/inference/api/analysis_predictor.cc

@@ -686,9 +686,24 @@ void AnalysisPredictor::OptimizeInferenceProgram() {
     // Note, please do NOT use any member variables, because member variables may
     // have been destructed in multiple threads.
 #if PADDLE_WITH_TENSORRT
-    paddle::inference::Singleton<
-        inference::tensorrt::TRTEngineManager>::Global()
-        .DeleteAll();
+    auto &block = prog->Block(0);
+    for (auto &op_desc : block.AllOps()) {
+      if (op_desc->Type() == "tensorrt_engine") {
+        std::string engine_key =
+            BOOST_GET_CONST(std::string, op_desc->GetAttr("engine_key"));
+        int engine_predictor_id =
+            BOOST_GET_CONST(int, op_desc->GetAttr("predictor_id"));
+        std::string engine_name =
+            engine_key + std::to_string(engine_predictor_id);
+        if (paddle::inference::Singleton<
+                inference::tensorrt::TRTEngineManager>::Global()
+                .Has(engine_name)) {
+          paddle::inference::Singleton<
+              inference::tensorrt::TRTEngineManager>::Global()
+              .DeleteKey(engine_name);
+        }
+      }
+    }
 #endif
     delete prog;
   });
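The effect of this hunk is that a predictor being torn down now removes only the TensorRT engines it registered itself (keyed by engine_key plus its own predictor_id) instead of wiping the shared manager, so engines still in use by other predictors survive. A minimal standalone C++ sketch of that behaviour follows; ToyEngine, ToyEngineManager, and the sample keys are invented for illustration and are not Paddle APIs.

#include <iostream>
#include <memory>
#include <string>
#include <unordered_map>
#include <utility>

// Stand-in for an engine object owned by the manager (not a Paddle type).
struct ToyEngine {
  explicit ToyEngine(std::string name) : name(std::move(name)) {}
  std::string name;
};

// Stand-in for the shared TRTEngineManager singleton.
class ToyEngineManager {
 public:
  void Create(const std::string &key) {
    engines_[key] = std::make_unique<ToyEngine>(key);
  }
  bool Has(const std::string &key) const { return engines_.count(key) > 0; }
  // Mirrors the patched DeleteKey(): drop one entry instead of clearing everything.
  void DeleteKey(const std::string &key) {
    auto iter = engines_.find(key);
    if (iter != engines_.end()) {
      iter->second.reset(nullptr);
      engines_.erase(iter);
    }
  }
  std::size_t Size() const { return engines_.size(); }

 private:
  std::unordered_map<std::string, std::unique_ptr<ToyEngine>> engines_;
};

int main() {
  ToyEngineManager manager;
  // Two predictors registered engines under engine_key + predictor_id style keys.
  manager.Create("trt_block_1");  // owned by predictor 1
  manager.Create("trt_block_2");  // owned by predictor 2

  // Predictor 1 is destroyed: only its own engine is removed.
  if (manager.Has("trt_block_1")) manager.DeleteKey("trt_block_1");

  std::cout << "engines left for other predictors: " << manager.Size() << "\n";  // 1
  return 0;
}

With the old DeleteAll() call, the same teardown would have left Size() at 0 and invalidated the engine that the second predictor still expects to find.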
paddle/fluid/inference/tensorrt/engine.h

@@ -631,6 +631,14 @@ class TRTEngineManager {
     }
   }

+  void DeleteKey(const std::string &key) {
+    auto iter = engines_.find(key);
+    if (iter != engines_.end()) {
+      iter->second.reset(nullptr);
+      engines_.erase(iter);
+    }
+  }
+
  private:
   std::unordered_map<std::string, std::unique_ptr<TensorRTEngine>> engines_;
 };
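For context on the reset(nullptr)-then-erase pattern in the new DeleteKey(): erasing a unique_ptr-valued entry already destroys the owned engine, so the explicit reset mainly pins down where destruction happens before the bucket is removed. A small self-contained check, where Tracer is a name made up here rather than a Paddle type:

#include <iostream>
#include <memory>
#include <string>
#include <unordered_map>

// Prints from its destructor so we can see when the owned object dies.
struct Tracer {
  ~Tracer() { std::cout << "engine destroyed\n"; }
};

int main() {
  std::unordered_map<std::string, std::unique_ptr<Tracer>> engines;
  engines["trt_engine_0"] = std::make_unique<Tracer>();

  auto iter = engines.find("trt_engine_0");
  if (iter != engines.end()) {
    iter->second.reset(nullptr);  // destructor runs here ...
    engines.erase(iter);          // ... then the now-empty entry is removed
  }
  std::cout << "remaining entries: " << engines.size() << "\n";  // prints 0
  return 0;
}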