Inference results differ significantly when using the same model but different GPU configurations.
Created by: helinwang
A user in the Paddle Hi group reports a large difference in inference results between the configurations `use_gpu=True, trainer_count=2, gpu_id=1` and `use_gpu=True, trainer_count=1, gpu_id=1`.
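To make the two setups concrete, the following is a minimal sketch of the reported configurations, assuming the `paddle.v2`-style `paddle.init` API; everything else (model, parameters, inference input) is assumed identical between the two runs, so only the init flags differ.

```python
# Hedged sketch of the two reported configurations (assumes the
# paddle.v2-style paddle.init API); model and inference code are
# identical in both runs -- only the initialization flags differ.
import paddle.v2 as paddle

# Run 1: two trainer threads on GPU 1 -- reportedly gives
# noticeably different inference output than Run 2.
paddle.init(use_gpu=True, trainer_count=2, gpu_id=1)

# Run 2: a single trainer thread on the same GPU.
# paddle.init(use_gpu=True, trainer_count=1, gpu_id=1)
```

If inference output depends on `trainer_count` alone, that suggests the multi-thread path is splitting or reducing results differently from the single-thread path rather than a problem with the model itself.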
