Cache inference package on CI job
Created by: Yancey1989
CI job: http://ci.paddlepaddle.org/viewLog.html?buildId=13633&buildTypeId=GuochaorongPaddleTest_PrCi&tab=buildLog

Logs:

```
[19:38:13][Step 2/2] -- generating grpc send_recv.proto
[19:38:15][Step 2/2] -- Download inference test stuff ditu_rnn_fluid%2Fmodel.tar.gz from http://paddle-inference-dist.bj.bcebos.com/ditu_rnn_fluid%2Fmodel.tar.gz
[19:38:16][Step 2/2] -- finish downloading ditu_rnn_fluid%2Fmodel.tar.gz
[19:38:16][Step 2/2] -- Download inference test stuff ditu_rnn_fluid%2Fdata.txt.tar.gz from http://paddle-inference-dist.bj.bcebos.com/ditu_rnn_fluid%2Fdata.txt.tar.gz
[19:38:18][Step 2/2] -- finish downloading ditu_rnn_fluid%2Fdata.txt.tar.gz
[19:38:18][Step 2/2] -- Download inference test stuff chinese_ner_model.tar.gz from http://paddle-inference-dist.bj.bcebos.com/chinese_ner_model.tar.gz
```
As the logs show, every CI run re-downloads the inference test packages (models and data tarballs). We should cache these packages on the CI machines so that they are downloaded only once, which would speed up CI builds.
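One possible approach is a small cache-aware fetch helper in the CI/build scripts: keep downloaded tarballs in a directory that persists across CI runs (e.g. a mounted volume on the CI host), and only hit the download server on a cache miss. The sketch below is an illustration, not Paddle's actual implementation; the `CACHE_DIR` location and the `cached_fetch` helper name are assumptions.

```shell
# Sketch of a cache-aware download helper (hypothetical; not Paddle's code).
# CACHE_DIR should live on storage that survives across CI runs,
# e.g. a host directory mounted into the CI container.
CACHE_DIR="${CACHE_DIR:-$HOME/.cache/paddle_inference_dist}"
mkdir -p "$CACHE_DIR"

cached_fetch() {
    url="$1"
    name="$(basename "$url")"
    if [ ! -f "$CACHE_DIR/$name" ]; then
        # Cache miss: download once into the persistent cache.
        curl -sSfL -o "$CACHE_DIR/$name" "$url"
    fi
    # Cache hit (or just cached): reuse the local copy instead of
    # re-downloading on every CI run.
    cp "$CACHE_DIR/$name" .
}
```

With this in place, the first CI run on a machine pays the download cost, and subsequent runs copy the tarballs from the local cache. A refinement worth considering is cache invalidation, e.g. keying the cached file by a checksum or version so that updated models are re-fetched.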