1. 04 May 2018, 2 commits
      Fluid new API: dist train without modifying code · 8ee23da8
      Committed by Helin Wang
      Works with 1 trainer and 1 pserver. With 2 trainers and 1 pserver it
      gets stuck at the end of the first step; still investigating.
      
      The user only needs to set environment variables to enable distributed
      training.
      
      run pserver:
      
      PADDLE_TRAINING_ROLE=PSERVER PADDLE_PSERVER_IPS=127.0.0.1 PADDLE_TRAINERS=2 PADDLE_CURRENT_IP=127.0.0.1 python no_test_word2vec_new_api.py
      
      run trainer:
      
      PADDLE_TRAINING_ROLE=TRAINER PADDLE_PSERVER_IPS=127.0.0.1 PADDLE_TRAINERS=2 PADDLE_TRAINER_ID=0 python no_test_word2vec_new_api.py
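
      The role selection implied by the commands above can be sketched as
      follows. This is an illustrative Python sketch of reading those
      environment variables, not PaddlePaddle's actual implementation;
      `parse_dist_env` is a hypothetical helper name.

      ```python
      import os

      def parse_dist_env():
          """Parse the distributed-training environment variables.

          The variable names match the commands above; the parsing logic
          here is only a sketch of how a script might branch on them.
          """
          role = os.environ.get("PADDLE_TRAINING_ROLE", "TRAINER")
          # Comma-separated list of parameter-server IPs.
          pserver_ips = os.environ.get("PADDLE_PSERVER_IPS", "").split(",")
          trainers = int(os.environ.get("PADDLE_TRAINERS", "1"))
          cfg = {"role": role, "pserver_ips": pserver_ips, "trainers": trainers}
          if role == "PSERVER":
              # A pserver needs to know which of the listed IPs is its own.
              cfg["current_ip"] = os.environ.get("PADDLE_CURRENT_IP", "")
          else:
              # A trainer needs a unique id in [0, trainers).
              cfg["trainer_id"] = int(os.environ.get("PADDLE_TRAINER_ID", "0"))
          return cfg
      ```

      With the pserver command's variables set, this would return the role
      "PSERVER", the pserver IP list, the trainer count, and the server's
      own IP.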
  2. 03 May 2018, 29 commits
  3. 02 May 2018, 9 commits