... # define the program, cost, and create the SGD optimizer
optimize_ops, params_grads = sgd_optimizer.minimize(avg_cost)  # get the optimize OPs and the (parameter, gradient) pairs
t = fluid.DistributeTranspiler()  # create the transpiler instance
# slice the program into two pieces using the optimize OPs and the (parameter, gradient) list,
# together with pserver_endpoints (a comma-separated list of IP:PORT addresses) and the number of trainers
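# A hedged sketch of the transpile call that the comment above describes; the exact
# keyword names follow the older fluid.DistributeTranspiler API and may differ by
# PaddlePaddle version. pserver_endpoints and trainers are assumed to be defined
# earlier in the elided code, e.g.:
#   pserver_endpoints = "192.168.0.1:6174,192.168.0.2:6174"
#   trainers = 2
t.transpile(optimize_ops, params_grads, pservers=pserver_endpoints, trainers=trainers)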