1. 15 Nov 2016, 1 commit
    • Add ScalingProjection · bf6f690f
      Committed by xuwei06
      out = w * input
      where w is a parameter of size 1
      
      Change-Id: Ife682d62323ceb1a20cbbf6269421b20a862d888
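      The operation above is simple enough to sketch outside the framework. Below is a minimal NumPy illustration of the ScalingProjection math (out = w * input, with w a single learnable scalar); the function names are hypothetical and not Paddle's API.

      import numpy as np

      def scaling_projection_forward(w, x):
          # x: (batch, dim) input; w: 1-element learnable parameter
          return w * x

      def scaling_projection_backward(w, x, grad_out):
          # The scalar parameter's gradient accumulates over every element of the batch
          grad_w = np.sum(grad_out * x)
          # The input gradient is the upstream gradient scaled by w
          grad_x = w * grad_out
          return grad_w, grad_x

      # Example: scale a batch of 4 three-dimensional vectors by w = 0.5
      x = np.random.randn(4, 3)
      w = np.array(0.5)
      out = scaling_projection_forward(w, x)
      grad_w, grad_x = scaling_projection_backward(w, x, np.ones_like(out))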
2. 12 Nov 2016, 1 commit
3. 11 Nov 2016, 3 commits
4. 10 Nov 2016, 7 commits
5. 08 Nov 2016, 4 commits
6. 07 Nov 2016, 2 commits
7. 05 Nov 2016, 1 commit
    • Add elementwise math operations (#343) · 6c3a678c
      Committed by emailweixu
      * Add elementwise math operations
      This allows us to use expressions like: y = log(1 + exp(x))
      Also added unit tests for ActivationFunction
      * Enforce keyword arguments for non-positional arguments
      * Add LogActivation to doc
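      As a quick illustration of the kind of expression this enables: y = log(1 + exp(x)) is the softplus function. The sketch below is plain NumPy, not Paddle's layer-math API, and the numerically stable rewrite is a standard trick rather than something claimed by the commit.

      import numpy as np

      def softplus_naive(x):
          # Direct form of y = log(1 + exp(x)); exp overflows for large positive x
          return np.log(1.0 + np.exp(x))

      def softplus_stable(x):
          # Equivalent rewrite: log(1 + exp(x)) = max(x, 0) + log(1 + exp(-|x|))
          return np.maximum(x, 0.0) + np.log1p(np.exp(-np.abs(x)))

      x = np.array([-50.0, 0.0, 1.0, 50.0])
      print(softplus_stable(x))  # approx [0.0, 0.693, 1.313, 50.0]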
8. 02 Nov 2016, 1 commit
    • Add job=time in trainer, refine cudnn_conv to reduce gpu memory and speed up training. (#218) · 45c81a41
      Committed by qingqing01
      * Add benchmark for PaddlePaddle, tensorflow and caffe
      
      * ConvProjection to reduce memory for GoogLeNet
      
      * Add unit test for ConvProjection.
      1. unit test in test_LayerGrad.
      2. compare ConvProjection and CudnnConvLayer, and also compare concat_layer+img_conv_layer with concat_layer+conv_projection.
      
      * Reduce cudnn_conv memory and add benchmark document.
      1. Use TmpMatrix as the workspace in cudnn_conv to reduce gpu memory. This saves a large amount of GPU memory.
      2. Add benchmark document.
      3. Fix smallnet_mnist_cifar.py in paddle.
      
      * Add job=time and refine cudnn_conv to reduce gpu memory and speed up training
      
      * Refine cudnn_conv and shared biases operation in concat_layer and mixed_layer.
      
      * follow comments
      
      * follow comments
      
      * Use unique_ptr to prevent memory leaks in CudnnConvLayer.
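      The memory saving described above comes from sharing one scratch buffer across convolution layers instead of giving each layer its own workspace. A rough NumPy sketch of that idea follows; it is illustrative only, since the real change uses Paddle's TmpMatrix on the GPU rather than host arrays.

      import numpy as np

      class SharedWorkspace:
          # One scratch buffer reused by every conv layer, grown on demand.
          def __init__(self):
              self.buf = np.empty(0, dtype=np.float32)

          def get(self, num_floats):
              # Grow only when a layer needs more than the current capacity
              if self.buf.size < num_floats:
                  self.buf = np.empty(num_floats, dtype=np.float32)
              return self.buf[:num_floats]

      workspace = SharedWorkspace()

      # Each layer borrows the shared buffer; peak usage is the largest single
      # request (16 Mi floats) rather than the sum of all requests (28 Mi floats).
      layer_requests = [4 << 20, 16 << 20, 8 << 20]
      for need in layer_requests:
          scratch = workspace.get(need)
          # ... the convolution would use `scratch` as temporary storage here ...

      print(workspace.buf.size == 16 << 20)  # True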
9. 30 Oct 2016, 1 commit
10. 28 Oct 2016, 1 commit
11. 24 Oct 2016, 3 commits
12. 17 Oct 2016, 1 commit
13. 13 Oct 2016, 1 commit
14. 10 Oct 2016, 1 commit
15. 09 Oct 2016, 1 commit
16. 08 Oct 2016, 1 commit
17. 29 Sep 2016, 1 commit
18. 28 Sep 2016, 2 commits
19. 27 Sep 2016, 2 commits
20. 22 Sep 2016, 2 commits
21. 21 Sep 2016, 1 commit
22. 20 Sep 2016, 1 commit
23. 17 Sep 2016, 1 commit