test_analyzer_transformer randomly fails on compare
Created by: luotao1
[ RUN ] Analyzer_Transformer.compare
I0319 23:55:10.920542 17062 analyzer_transformer_tester.cc:176] The number of samples to be test: 8
I0319 23:55:10.922397 17062 tester_helper.h:65] AnalysisConfig {
  NativeConfig {
    PaddlePredictor::Config {
      model_dir:
    }
    use_gpu: 0
    device: 0
    fraction_of_gpu_memory: 0.00219227
    specify_input_name: 1
    cpu_num_threads: 1
  }
  prog_file: /root/.cache/inference_demo/transformer/model/model
  param_file: /root/.cache/inference_demo/transformer/model/params
  enable_ir_optim: 1
  enable_ir_optim: 1
  use_feed_fetch_ops: 1
  use_tensorrt: 0
  use_mkldnn: 0
}
W0319 23:55:10.925616 17062 init.cc:153] AVX is available, Please re-compile on local machine
I0319 23:55:11.322595 17062 tester_helper.h:278] Running thread 0, warm up run...
I0319 23:55:44.447240 17062 helper.h:270] ====== batch_size: 8, repeat: 1, threads: 1, thread id: 0, latency: 33124.5ms, fps: 0.0301891 ======
I0319 23:55:44.447353 17062 tester_helper.h:301] Thread 0 run 1 times...
I0319 23:56:17.681612 17062 helper.h:270] ====== batch_size: 8, repeat: 1, threads: 1, thread id: 0, latency: 33234.2ms, fps: 0.0300895 ======
W0319 23:56:17.688809 17062 init.cc:153] AVX is available, Please re-compile on local machine
--- Running analysis [ir_graph_build_pass]
--- Running analysis [ir_analysis_pass]
--- Running IR pass [infer_clean_graph_pass]
--- Running IR pass [attention_lstm_fuse_pass]
--- Running IR pass [seqpool_concat_fuse_pass]
--- Running IR pass [seqconv_eltadd_relu_fuse_pass]
--- Running IR pass [fc_lstm_fuse_pass]
--- Running IR pass [mul_lstm_fuse_pass]
--- Running IR pass [fc_gru_fuse_pass]
--- Running IR pass [mul_gru_fuse_pass]
--- Running IR pass [seq_concat_fc_fuse_pass]
--- Running IR pass [fc_fuse_pass]
--- detected 12 subgraphs
--- Running IR pass [repeated_fc_relu_fuse_pass]
--- Running IR pass [squared_mat_sub_fuse_pass]
--- Running IR pass [conv_bn_fuse_pass]
--- Running IR pass [conv_eltwiseadd_bn_fuse_pass]
--- Running IR pass [is_test_pass]
--- Running IR pass [identity_scale_op_clean_pass]
--- Running analysis [ir_params_sync_among_devices_pass]
--- Running analysis [ir_graph_to_program_pass]
I0319 23:56:24.455240 17062 analysis_predictor.cc:396] == optimize end ==
I0319 23:56:24.456599 17062 tester_helper.h:278] Running thread 0, warm up run...
W0319 23:56:24.456904 17062 naive_executor.cc:43] The NaiveExecutor can not work properly if the cmake flag ON_INFER is not set.
W0319 23:56:24.456938 17062 naive_executor.cc:45] Unlike the training phase, all the scopes and variables will be reused to save the allocation overhead.
W0319 23:56:24.456956 17062 naive_executor.cc:48] Please re-compile the inference library by setting the cmake flag ON_INFER=ON if you are running Paddle Inference
I0319 23:56:56.635799 17062 helper.h:270] ====== batch_size: 8, repeat: 1, threads: 1, thread id: 0, latency: 32179.1ms, fps: 0.031076 ======
I0319 23:56:56.635892 17062 tester_helper.h:301] Thread 0 run 1 times...
W0319 23:56:56.636173 17062 naive_executor.cc:43] The NaiveExecutor can not work properly if the cmake flag ON_INFER is not set.
W0319 23:56:56.636209 17062 naive_executor.cc:45] Unlike the training phase, all the scopes and variables will be reused to save the allocation overhead.
W0319 23:56:56.636229 17062 naive_executor.cc:48] Please re-compile the inference library by setting the cmake flag ON_INFER=ON if you are running Paddle Inference
I0319 23:57:28.508708 17062 helper.h:270] ====== batch_size: 8, repeat: 1, threads: 1, thread id: 0, latency: 31872.8ms, fps: 0.0313748 ======
/paddle/paddle/fluid/inference/tests/api/tester_helper.h:82: Failure
Expected: size
  Which is: 645
To be equal to: ref_size
  Which is: 666
/paddle/paddle/fluid/inference/tests/api/tester_helper.h:89: Failure
Expected: pdata_ref[j]
  Which is: 6444
To be equal to: pdata[j]
  Which is: 855
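For context, both gtest failures above come from the output-comparison step in tester_helper.h: line 82 checks that the analysis predictor produces the same number of output elements as the native reference, and line 89 then compares the elements one by one. A minimal sketch of what that check effectively does (function name and the int64 element type are illustrative assumptions, not Paddle's actual code):

```cpp
#include <cstdint>
#include <cstdio>
#include <vector>

// Sketch of the compare step: first the element counts must match
// (tester_helper.h:82, size == ref_size), then every element must match
// exactly (tester_helper.h:89, pdata_ref[j] == pdata[j]). The transformer
// output is a sequence of token ids, so the comparison is exact rather
// than tolerance-based.
bool CompareOutputs(const std::vector<int64_t>& ref,
                    const std::vector<int64_t>& out) {
  if (out.size() != ref.size()) {
    std::fprintf(stderr, "size mismatch: %zu vs %zu\n", out.size(), ref.size());
    return false;  // e.g. 645 vs 666 in the log above
  }
  for (std::size_t j = 0; j < ref.size(); ++j) {
    if (ref[j] != out[j]) {
      std::fprintf(stderr, "element %zu: %lld vs %lld\n", j,
                   static_cast<long long>(ref[j]),
                   static_cast<long long>(out[j]));
      return false;  // e.g. 6444 vs 23118 in the log above
    }
  }
  return true;
}
```

Since the outputs are decoded token ids, a single divergent decoding step could plausibly change both the ids and the emitted sequence length, which would explain the size mismatch (645 vs 666) and the element mismatches appearing together.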