A probable memory leak bug in ProgramDesc
Created by: gmcather
As mentioned in the title, there appears to be a memory leak in the destructor of ProgramDesc.
Here is an example: if you load a model repeatedly (load function: https://github.com/PaddlePaddle/Paddle/blob/develop/paddle/fluid/inference/io.cc#L139 ), memory usage keeps growing while the program runs, because the memory allocated when the ProgramDesc is created is never freed.
Here is our temporary workaround, which frees the leaked memory manually: we clear the proto with p->Proto()->Clear() when the ProgramDesc is destructed.
auto t_pdesc = paddle::inference::Load(*_exec, *_root_scope[i],
                                       program_desc_path, model_path);
ProgramDesc* pdesc_ptr = t_pdesc.release();
_program_desc = std::unique_ptr<ProgramDesc, std::function<void(ProgramDesc*)>>(
    pdesc_ptr,
    [](ProgramDesc* p) {
      p->Proto()->Clear();
      delete p;
    });
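For readers less familiar with this pattern, here is a minimal, standalone sketch of the same idea (not Paddle code; the `Resource` type and `ReleaseInternalBuffers` method are hypothetical names used only for illustration): a std::unique_ptr with a std::function deleter that runs extra cleanup before deleting the object.

```cpp
#include <functional>
#include <iostream>
#include <memory>

// Hypothetical stand-in for ProgramDesc: its destructor alone does not
// release some internally held memory.
struct Resource {
  void ReleaseInternalBuffers() { std::cout << "internal buffers released\n"; }
  ~Resource() { std::cout << "Resource destructed\n"; }
};

int main() {
  // Custom deleter: perform the extra cleanup, then delete the object,
  // mirroring the `p->Proto()->Clear(); delete p;` workaround above.
  std::unique_ptr<Resource, std::function<void(Resource*)>> holder(
      new Resource(),
      [](Resource* p) {
        p->ReleaseInternalBuffers();
        delete p;
      });
  // When `holder` goes out of scope, the lambda runs instead of the
  // default deleter's plain `delete`.
  return 0;
}
```

Using std::function as the deleter type keeps the unique_ptr's type independent of the particular lambda, at the cost of a small type-erasure overhead.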