Incorrect Inference.infer results when running with multiple GPUs.
Created by: xinghai-sun
Inference.infer produces incorrect results when running with multiple GPUs.
Below are the outputs of the first convolution layer for four example instances, with trainer_count=1 (upper figure) and trainer_count=2 (lower figure).
The outputs are wrong whenever trainer_count > 1 (with use_gpu=True).
If we print only the input data layer, no difference is found between the two cases, which suggests the problem lies in the model computation rather than in data allocation across GPUs.
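For reference, a minimal reproduction sketch along these lines, assuming the legacy paddle.v2 API. The network definition, the parameter file `params.tar`, and the seeded random input are hypothetical stand-ins for the actual model and data; the idea is to run the script once per trainer_count value and compare the saved outputs.

```python
# Minimal reproduction sketch (assumed paddle.v2 API).
# Run once with TRAINER_COUNT = 1 and once with TRAINER_COUNT = 2,
# then compare the two saved .npy files.
import numpy as np
import paddle.v2 as paddle

TRAINER_COUNT = 2  # change to 1 for the single-GPU baseline

paddle.init(use_gpu=True, trainer_count=TRAINER_COUNT)

# Build the network only up to the first convolution layer so its
# output can be inspected directly.
image = paddle.layer.data(
    name='image', type=paddle.data_type.dense_vector(28 * 28))
conv1 = paddle.layer.img_conv(
    input=image,
    filter_size=5,
    num_channels=1,
    num_filters=20,
    act=paddle.activation.Relu())

# Hypothetical trained parameters, saved earlier with parameters.to_tar().
with open('params.tar', 'r') as f:
    parameters = paddle.parameters.Parameters.from_tar(f)

# Four fixed example instances; seeding makes both runs see identical input.
np.random.seed(0)
test_data = [(np.random.rand(28 * 28).astype('float32'),) for _ in range(4)]

inferer = paddle.inference.Inference(
    output_layer=conv1, parameters=parameters)
output = inferer.infer(input=test_data)

np.save('conv1_out_tc%d.npy' % TRAINER_COUNT, output)
# Expected: np.allclose() on the two files is True.
# Observed: the outputs differ whenever trainer_count > 1.
```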