- 30 June 2022 (1 commit)

  Submitted by chentianyu03
  * add relu6 kernel and yaml
  * format files
  * format code and fix bug
  * fix build failed
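The relu6 kernel added above computes a ReLU clamped at 6. As a hedged illustration (not the Paddle kernel code itself), a minimal C++ sketch of the forward and backward formulas:

```cpp
#include <algorithm>

// Illustrative sketch only: relu6 clamps the input to the range [0, 6].
float relu6(float x) { return std::min(std::max(x, 0.0f), 6.0f); }

// The gradient passes through only where the input lies strictly inside (0, 6).
float relu6_grad(float x, float dout) {
  return (x > 0.0f && x < 6.0f) ? dout : 0.0f;
}
```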
- 20 May 2022 (1 commit)

  Submitted by YuanRisheng

- 30 March 2022 (1 commit)

- 29 March 2022 (1 commit)

  Submitted by tianshuo78520a
  This reverts commit e77a947e.
- 28 March 2022 (1 commit)

  Submitted by hong
  * update
  * add forward case
  * update
  * update; test=develop
  * add some grad kernel; test=develop
  * move gpu kernel; test=develop
  * update
  * update;
  * update test;
  * fix selected rows bug;
  * add mix vector include;
  * add mixed vector dep; test=develop
  * add logit grad signature;
  * polish code
  * fix bug;
  * add namespace for abs
  * revert code
  * not move softsign
  * remove duplicate register;
  * fix softsign bug
  * polish code
  * format
  * format
  * fix bug
  * remove cmake dep
  * add square sqrt selected rows support
  * update
  * remove clip norm
  * add standalone executor sqrt dep
  * standalone exec dep sqrt
  * remove sqrt op in CMakeLists
  * open some case

- 25 March 2022 (1 commit)

  Submitted by YuanRisheng

- 23 March 2022 (1 commit)

  Submitted by YuanRisheng
  * move activation
  * fix bugs when running CE

- 17 March 2022 (1 commit)

  Submitted by YuanRisheng
- 16 March 2022 (2 commits)

  Submitted by Lijunhui
  * init commit
  * correct namespace

  Submitted by YuanRisheng

- 15 March 2022 (1 commit)

  Submitted by YuanRisheng
  * move activation op
  * adjust code format
  * fix compile bugs
  * fix ci bugs
  * code format adjust
  * code format adjust2
  * activate ci status
  * modify according to comment
  * move activation kernel
  * revert relu6
  * reduce add code
  * perfect use_phi_functor
  * completing func name
  * fix bugs when running ci
  * fix bugs when running infrt
  * modify infrt get kernel signature
- 10 March 2022 (1 commit)

  Submitted by Lijunhui

- 08 March 2022 (1 commit)

  Submitted by YuanRisheng
  [Phi] Move Relu/Cos/Sin/Tan/Acos/Asin/Atan/Sinh/Cosh/Asinh/Acosh/Atanh kernels in Activation to Phi (#40175)
  * move activation op
  * adjust code format
  * fix compile bugs
  * fix ci bugs
  * code format adjust
  * code format adjust2
  * activate ci status
  * modify according to comment
- 07 March 2022 (1 commit)

  Submitted by zhangbo9674
  * add activ
  * refine unittest
  * refine unittest
  * refine unittest
  * refine unittest
  * refine code

- 02 March 2022 (1 commit)

  Submitted by Lijunhui

- 11 February 2022 (1 commit)

  Submitted by Zhang Ting
  * improve backward performance
  * support different dtypes for elementwise ops
- 27 January 2022 (1 commit)

  Submitted by Feiyu Chan

- 18 January 2022 (1 commit)

  Submitted by Zhanlue Yang
  * Merged LoDTensor with Tensor, test=allcases
  * Patched python level LoDTensor
  * Patched python level LoDTensor
  * Merge Tensor into DenseTensor
  * Fixed namespace issues, test=allcases
  * Fixed merge issues
  * Fixed inference issues
  * Fixed NPU test issues
  * Fixed merge issues

- 12 January 2022 (1 commit)

  Submitted by Zhang Ting
- 07 January 2022 (1 commit)

  Submitted by wangxinxin08
  * add mish operator and api
  * remove redundant code and modify grad_atol of mish unittest
  * modify mish code to be consistent with other activation implementations
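Mish, the operator added in the commit above, is commonly defined as mish(x) = x * tanh(softplus(x)). A minimal C++ sketch of that definition (an illustration only, not the operator's actual kernel):

```cpp
#include <cmath>

// Illustrative sketch only: softplus(x) = ln(1 + e^x), mish(x) = x * tanh(softplus(x)).
// Real kernels typically guard exp(x) against overflow for large x; that is omitted here.
double softplus_ref(double x) { return std::log1p(std::exp(x)); }
double mish_ref(double x) { return x * std::tanh(softplus_ref(x)); }
```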
- 16 December 2021 (1 commit)

  Submitted by xiaoting
  * add activation
  * update activation_op
  * add unittest for activation
  * fix acosh for init, test=develop
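One of the activations touched above is acosh; for reference, its forward formula and derivative (an illustrative sketch, not the kernel implementation):

```cpp
#include <cmath>

// Illustrative sketch only: acosh(x) = ln(x + sqrt(x^2 - 1)), defined for x >= 1.
double acosh_ref(double x) { return std::log(x + std::sqrt(x * x - 1.0)); }

// d/dx acosh(x) = 1 / sqrt(x^2 - 1), valid for x > 1.
double acosh_grad_ref(double x, double dout) {
  return dout / std::sqrt(x * x - 1.0);
}
```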
- 13 December 2021 (1 commit)

  Submitted by wangzhen38
  * add Logit API
  * add unittest
  * conflict
  * pull conflict
  * pull conflict logit
  * fix unittest
  * fix code style
  * update docs style of
  * update en doc
  * fix docs en style
  * fix docs en style1
  * fix docs en style2
  * fix docs en style3
  * fix docs en style4
  * fix docs en style5
  * fix docs en style6
  * fix docs en style7
  * fix docs en style8
  * update by review
  * fix nan bug
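The Logit API added above computes the inverse of the sigmoid, logit(x) = ln(x / (1 - x)), which diverges as x approaches 0 or 1; the "fix nan bug" item hints at exactly that edge case. A hedged C++ sketch with an eps clamp (the clamping scheme is an assumption for illustration, not necessarily the API's exact behavior):

```cpp
#include <algorithm>
#include <cmath>

// Illustrative sketch only: logit(x) = ln(x / (1 - x)), the inverse of sigmoid.
// Clamping x into [eps, 1 - eps] is one common way to avoid NaN/Inf at the
// boundaries; the real API's handling may differ.
double logit_ref(double x, double eps = 1e-6) {
  x = std::min(std::max(x, eps), 1.0 - eps);
  return std::log(x / (1.0 - x));
}
```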
- 03 December 2021 (1 commit)

  Submitted by ronnywang
  * refine structure for cuda and rocm
  * update
  * update
  * update
  * update

- 22 November 2021 (1 commit)

  Submitted by zhupengyang
- 15 October 2021 (1 commit)

  Submitted by Jiabin Yang
  * native commit for triple grad of sigmoid
  * Updated unittests files
  * init functional jacobian api
  * Updated trible_test func
  * Updated gradient_checker & test_script
  * finish test with dtype float32
  * add float64 test case
  * polish code
  * use atol=1e-5 with dtype float64
  * fix for ci
  * set timeout for test_jacobian
  * fix dygraph grad to support high differential
  * polish API docstring
  * Updated gradient checker and some related files
  * fix double grad strip error for high differential
  * fix double grad strip error for high differential
  * Add Sigmoid triple grad tests
  * fix dygraph double grad dtype error when calling for high differential scenario
  * Updated triple grad tests func
  * Use np.random to initialize ddx
  * Updated triple_grad_check func
  * add todo for gradient checker and refine some comments
  * remove additional code
  * add test for warning in backward.py
  * add tanh triple grad
  * format python code
  * refine code

  Co-authored-by: veyron95 <veyron_wu@163.com>
  Co-authored-by: levi131 <limaolin01@baidu.com>
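The triple-grad work above stacks one more chain-rule step on top of sigmoid's second-order gradients. Assuming the second-order outputs listed in the 26 May 2021 entry further down (DDOut = (1 - Out) * Out * DDX and DOutNew = (1 - 2 * Out) * DOut * DDX, with Out = sigmoid(X)), a derivation sketch of the third-order terms (not copied from the kernel code):

```latex
% Sketch only. Write y = Out, and let g1, g2 be the upstream gradients
% flowing into DDOut and DOutNew respectively.
% Second-order outputs: DDOut = y*(1-y)*DDX,  DOutNew = (1-2y)*DOut*DDX.
\begin{align*}
  \frac{\partial L}{\partial \mathrm{DDX}}  &= y(1-y)\,g_1 + (1-2y)\,\mathrm{DOut}\,g_2 \\
  \frac{\partial L}{\partial \mathrm{DOut}} &= (1-2y)\,\mathrm{DDX}\,g_2 \\
  \frac{\partial L}{\partial y}             &= (1-2y)\,\mathrm{DDX}\,g_1 - 2\,\mathrm{DOut}\,\mathrm{DDX}\,g_2
\end{align*}
```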
- 13 October 2021 (2 commits)

  Submitted by yujun
  * update
  * update
  * update
  * try make CI pass
  * doc typo
  * update doc string

  Submitted by Jiabin Yang
  * native commit for triple grad of sigmoid
  * Updated unittests files
  * init functional jacobian api
  * Updated trible_test func
  * Updated gradient_checker & test_script
  * finish test with dtype float32
  * add float64 test case
  * polish code
  * use atol=1e-5 with dtype float64
  * fix for ci
  * set timeout for test_jacobian
  * fix dygraph grad to support high differential
  * polish API docstring
  * Updated gradient checker and some related files
  * fix double grad strip error for high differential
  * fix double grad strip error for high differential
  * Add Sigmoid triple grad tests
  * fix dygraph double grad dtype error when calling for high differential scenario
  * Updated triple grad tests func
  * Use np.random to initialize ddx
  * Updated triple_grad_check func
  * add todo for gradient checker and refine some comments
  * remove additional code
  * add test for warning in backward.py
  * format python code

  Co-authored-by: veyron95 <veyron_wu@163.com>
  Co-authored-by: levi131 <limaolin01@baidu.com>
- 14 September 2021 (1 commit)

  Submitted by Yiqun Liu
  Implement FunctionTraits to support two kinds of elementwise functors and remove some old code for broadcast. (#35688)
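FunctionTraits, as described in the commit above, is about inspecting an elementwise functor's call operator so that unary and binary functors can share one launch path. A generic C++ sketch of the technique (names and details here are illustrative, not Paddle's actual definitions):

```cpp
#include <cstddef>
#include <tuple>

// Illustrative sketch only: deduce arity and argument types from a functor's
// operator(), so the same launcher can handle unary and binary functors.
template <typename Functor>
struct FunctionTraits : FunctionTraits<decltype(&Functor::operator())> {};

// Partial specialization that unpacks the member call operator's signature.
template <typename ClassT, typename ReturnT, typename... Args>
struct FunctionTraits<ReturnT (ClassT::*)(Args...) const> {
  static constexpr std::size_t arity = sizeof...(Args);
  using ReturnType = ReturnT;
  template <std::size_t I>
  using ArgType = typename std::tuple_element<I, std::tuple<Args...>>::type;
};

// Example functors with different arities.
struct Relu    { float operator()(float x) const { return x > 0 ? x : 0; } };
struct AddFunc { float operator()(float a, float b) const { return a + b; } };

static_assert(FunctionTraits<Relu>::arity == 1, "unary");
static_assert(FunctionTraits<AddFunc>::arity == 2, "binary");
```

With the arity available at compile time, a single launcher can branch to a unary or binary code path without duplicating the broadcast machinery.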
- 13 September 2021 (2 commits)

- 11 June 2021 (1 commit)

  Submitted by ronnywang
- 26 May 2021 (1 commit)

  Submitted by Zhanlue Yang
  Sigmoid:     Out = Sigmoid(X)
  SigmoidGrad: DX = DOut * (1 - Out) * Out

  [This Patch] SigmoidGradGrad: (Out, DOut, DDX) -> (DOutNew, DDOut)
    DDOut   = (1 - Out) * Out * DDX
    DOutNew = (1 - 2 * Out) * DOut * DDX
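As a quick check on the formulas above (a derivation sketch, not code from the patch): the first-order gradient DX = DOut * (1 - Out) * Out depends on the two inputs DOut and Out, so the second-order pass differentiates it with respect to each and contracts with DDX:

```latex
% y = Out = sigmoid(X),  DX = DOut * y * (1 - y)
\begin{align*}
  \mathrm{DDOut}   &= \frac{\partial(\mathrm{DX})}{\partial \mathrm{DOut}}\,\mathrm{DDX} = y(1-y)\,\mathrm{DDX} \\
  \mathrm{DOutNew} &= \frac{\partial(\mathrm{DX})}{\partial y}\,\mathrm{DDX} = (1-2y)\,\mathrm{DOut}\,\mathrm{DDX}
\end{align*}
```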
- 20 May 2021 (1 commit)

  Submitted by limingshu

- 18 May 2021 (1 commit)

  Submitted by wuhuanzhou

- 10 May 2021 (1 commit)

  Submitted by Zhang Zheng

- 07 May 2021 (1 commit)

  Submitted by Zhang Zheng

- 27 April 2021 (1 commit)

  Submitted by Zhang Zheng
- 15 April 2021 (1 commit)

  Submitted by Jiabin Yang
  * add IsInitialized
  * rm additional log and add tanh double grad
  * rename is_initialized
- 02 April 2021 (1 commit)

  Submitted by niuliling123
  * add leaky_relu forward and backward in activation_op.cu
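For reference, the leaky_relu forward and backward formulas behind the commit above (an illustrative C++ sketch, not the CUDA kernel from activation_op.cu):

```cpp
// Illustrative sketch only: leaky_relu keeps positive inputs unchanged and
// scales negative inputs by a small slope alpha.
float leaky_relu(float x, float alpha) { return x > 0.0f ? x : alpha * x; }

// Backward: the local gradient is 1 for positive inputs and alpha otherwise.
float leaky_relu_grad(float x, float dout, float alpha) {
  return x > 0.0f ? dout : alpha * dout;
}
```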
- 29 March 2021 (1 commit)

  Submitted by niuliling123