- 16 Dec 2021, 1 commit

Committed by xiaoting

* add activation
* update activation_op
* add unit test for activation
* fix acosh for init, test=develop

- 13 Dec 2021, 1 commit

Committed by wangzhen38

* add Logit API
* add unittest
* conflict
* pull conflict
* pull conflict logit
* fix unittest
* fix code style
* update docs style of
* update en doc
* fix docs en style
* fix docs en style1
* fix docs en style2
* fix docs en style3
* fix docs en style4
* fix docs en style5
* fix docs en style6
* fix docs en style7
* fix docs en style8
* update by review
* fix nan bug

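For context (my reading of the commit message, not the actual kernel): logit(x) = log(x / (1 - x)) blows up at x = 0 and x = 1, which is the likely source of the nan bug noted above; clamping the input with a small eps keeps the result finite. A minimal NumPy sketch, with the eps clamp as an assumption on my part:

```python
import numpy as np

def logit(x, eps=1e-6):
    # clamp inputs away from 0 and 1 so log(x / (1 - x)) stays finite
    x = np.clip(x, eps, 1.0 - eps)
    return np.log(x / (1.0 - x))

print(logit(np.array([0.0, 0.25, 0.5, 1.0])))   # finite values, no nan
```
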
- 22 Nov 2021, 1 commit

Committed by zhupengyang

- 16 Nov 2021, 1 commit

Committed by jakpiase

- 11 Nov 2021, 1 commit

Committed by jakpiase

* added softplus + activation fuse pass
* minor change
* implemented reviewer suggestion
* minor fix
* minor fix
* added scale_out parameter
* minor fix
* fix for iScan CI
* conditionally disabled logs
* refactored pass builder

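For context, a softplus + activation fuse pass replaces the two-op subgraph act(softplus(x)) with a single fused kernel. The sketch below is only a conceptual NumPy reference for what the fused subgraph computes; the names softplus_act and the hard-coded tanh are illustrative assumptions, not the pass's actual attributes:

```python
import numpy as np

def softplus(x):
    # numerically stable softplus: log(1 + exp(x))
    return np.logaddexp(0.0, x)

def softplus_act(x, act=np.tanh):
    # what the fused kernel computes in one pass instead of two ops
    return act(softplus(x))

print(softplus_act(np.array([-1.0, 0.0, 2.0])))
```
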
- 15 Oct 2021, 1 commit

Committed by Jiabin Yang

* native commit for triple grad of sigmoid
* Updated unittests files
* init functional jacobian api
* Updated triple_test func
* Updated gradient_checker & test_script
* finish test with dtype float32
* add float64 test case
* polish code
* use atol=1e-5 with dtype float64
* fix for ci
* set timeout for test_jacobian
* fix dygraph grad to support high differential
* polish API docstring
* Updated gradient checker and some related files
* fix double grad strip error for high differential
* fix double grad strip error for high differential
* Add Sigmoid triple grad tests
* fix dygraph double grad dtype error when calling for high differential scenario
* Updated triple grad tests func
* Use np.random to initialize ddx
* Updated triple_grad_check func
* add todo for gradient checker and refine some comments
* remove additional code
* add test for warning in backward.py
* add tanh triple grad
* format python code
* refine code

Co-authored-by: Nveyron95 <veyron_wu@163.com>
Co-authored-by: Nlevi131 <limaolin01@baidu.com>

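For reference, the derivative chain a sigmoid triple-grad kernel has to reproduce follows from standard calculus (my own summary, not part of the commit), writing s = sigmoid(x):

```latex
s'   = s(1 - s), \qquad
s''  = s'(1 - 2s) = s(1 - s)(1 - 2s), \qquad
s''' = s''(1 - 2s) - 2(s')^{2} = s(1 - s)\,(1 - 6s + 6s^{2})
```
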
- 13 Oct 2021, 2 commits

Committed by yujun

* update
* update
* update
* try make CI pass
* doc typo
* update doc string

Committed by Jiabin Yang

* native commit for triple grad of sigmoid
* Updated unittests files
* init functional jacobian api
* Updated triple_test func
* Updated gradient_checker & test_script
* finish test with dtype float32
* add float64 test case
* polish code
* use atol=1e-5 with dtype float64
* fix for ci
* set timeout for test_jacobian
* fix dygraph grad to support high differential
* polish API docstring
* Updated gradient checker and some related files
* fix double grad strip error for high differential
* fix double grad strip error for high differential
* Add Sigmoid triple grad tests
* fix dygraph double grad dtype error when calling for high differential scenario
* Updated triple grad tests func
* Use np.random to initialize ddx
* Updated triple_grad_check func
* add todo for gradient checker and refine some comments
* remove additional code
* add test for warning in backward.py
* format python code

Co-authored-by: Nveyron95 <veyron_wu@163.com>
Co-authored-by: Nlevi131 <limaolin01@baidu.com>

- 10 Sep 2021, 1 commit

Committed by Shang Zhizhou

* add opdef extra
* add reduce mean
* update style

- 08 Sep 2021, 1 commit

Committed by hong19860320

- 07 Sep 2021, 1 commit

Committed by Pei Yang

- 27 Aug 2021, 1 commit

Committed by zhupengyang

- 11 Jun 2021, 1 commit

Committed by ronnywang

- 26 May 2021, 1 commit

Committed by Zhanlue Yang

Sigmoid:     Out = Sigmoid(X)
SigmoidGrad: DX = DOut * (1 - Out) * Out

[This Patch] SigmoidGradGrad takes Out, DOut, DDX and produces DDOut, DOutNew:

    DDOut   = (1 - Out) * Out * DDX
    DOutNew = (1 - 2 * Out) * DOut * DDX

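A quick NumPy sanity check of the two formulas above (my own verification sketch, not Paddle code): the double-grad outputs are the partial derivatives of DX = DOut * (1 - Out) * Out with respect to DOut and Out, each scaled by the incoming DDX.

```python
import numpy as np

def sigmoid_grad(out, dout):
    # first-order gradient from the commit message: DX = DOut * (1 - Out) * Out
    return dout * (1 - out) * out

rng = np.random.default_rng(0)
out = 1.0 / (1.0 + np.exp(-rng.standard_normal(5)))   # Out = Sigmoid(X)
dout, ddx = rng.standard_normal(5), rng.standard_normal(5)

# central finite differences of DX with respect to Out and DOut
eps = 1e-6
d_dx_d_out = (sigmoid_grad(out + eps, dout) - sigmoid_grad(out - eps, dout)) / (2 * eps)
d_dx_d_dout = (sigmoid_grad(out, dout + eps) - sigmoid_grad(out, dout - eps)) / (2 * eps)

assert np.allclose((1 - out) * out * ddx, d_dx_d_dout * ddx, atol=1e-6)      # DDOut
assert np.allclose((1 - 2 * out) * dout * ddx, d_dx_d_out * ddx, atol=1e-6)  # DOutNew
```
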
- 25 Apr 2021, 1 commit

Committed by minghaoBD

- 15 Apr 2021, 1 commit

Committed by Jiabin Yang

* add IsInitialized
* rm additional log and add tanh double grad
* rename is_initialized

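For reference, the tanh double grad mentioned above reduces to standard derivatives (my own summary, not from the commit), writing t = tanh(x):

```latex
t' = 1 - t^{2}, \qquad t'' = -2\,t\,(1 - t^{2})
```
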
- 03 Mar 2021, 1 commit

Committed by Qi Li

- 19 Feb 2021, 1 commit

Committed by Wojciech Uss

* Modify relu native implementation
* fix GPU performance

- 25 Jan 2021, 1 commit

Committed by arlesniak

* More precise mkldnn kernel choice in GetExpectedKernelType
* Fixes after review
* Refresh develop for CI
* CI experiment
* get back from CI experiment

- 20 Jan 2021, 1 commit

Committed by chentianyu03

* rewrite abs op
* rewrite abs op and remove abs in activation
* remove abs register in old codes
* fix abs_grad type error
* fix abs double_grad output name error
* modify abs_grad, abs_grad_grad functor for windows building
* format code style
* fix the bug where the result is nan when the divisor is zero
* add missing abs attr and add abs for float16

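A minimal NumPy illustration of the nan issue mentioned above (my reading of the commit message, not the actual kernel): computing abs_grad as dout * x / |x| divides 0 by 0 at x == 0, while sign(x) stays finite everywhere.

```python
import numpy as np

x = np.array([-2.0, 0.0, 3.0])
dout = np.ones_like(x)

with np.errstate(invalid="ignore"):
    grad_div = dout * x / np.abs(x)   # [-1. nan  1.]  -> nan at x == 0
grad_sign = dout * np.sign(x)         # [-1.  0.  1.]  -> finite everywhere
print(grad_div, grad_sign)
```
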
- 22 Dec 2020, 1 commit

Committed by whs

- 09 Dec 2020, 1 commit

Committed by joejiong

As the title

- 27 Nov 2020, 1 commit

Committed by arlesniak

- 26 Nov 2020, 1 commit

Committed by Noel

Fix ops doc for some ops

- 19 Nov 2020, 1 commit

Committed by joejiong

Add new operator log10

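The new operator is just the base-10 logarithm; a minimal NumPy reference for what it computes (log10(x) = ln(x) / ln(10)), not the Paddle kernel itself:

```python
import numpy as np

x = np.array([1.0, 10.0, 1000.0])
# log10 reduces to the natural log: log10(x) = ln(x) / ln(10)
print(np.log(x) / np.log(10.0))   # [0. 1. 3.]
print(np.log10(x))                # same result
```
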
- 12 Nov 2020, 1 commit

Committed by joejiong

As the title

- 27 Sep 2020, 1 commit

Committed by Jack Zhou

register log double grad kernel for cpu and cuda

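For reference, the second-order derivative this kernel has to produce is standard calculus (my summary, not quoted from the commit):

```latex
y = \log x, \qquad \frac{dy}{dx} = \frac{1}{x}, \qquad \frac{d^{2}y}{dx^{2}} = -\frac{1}{x^{2}}
```
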
- 25 Sep 2020, 1 commit

Committed by Zhong Hui

add abs support double grad for the api 2.0

- 31 Aug 2020, 3 commits

Committed by hong19860320

Committed by wawltor

update the doc for some ops: ceil, asin, atan

Committed by zhupengyang

- 22 Aug 2020, 1 commit

Committed by zhupengyang

- 20 Aug 2020, 1 commit

Committed by hong19860320

- 14 Aug 2020, 5 commits

Committed by Yang Zhang

test=document_fix; remove activation wording

Committed by Yang Zhang

test=develop, test=document_fix; remove activation wording

Committed by Yang Zhang

test=develop, test=document_fix; remove activation wording

Committed by Yang Zhang

test=develop, test=document_fix; remove activation wording, tanh -> tan

Committed by Yang Zhang

test=develop, test=document_fix; explain input/output range and out-of-boundary behavior

- 13 Aug 2020, 1 commit

Committed by Leo Chen

* add unchanged infershape function
* add broadcast infershape function
* fix bug
* rename infershape functions
* add UnaryOpUnchangedInferShapeCheckAxis
* add error message
* add test for common infer shape functions
* don't update existing ops
* don't update op_desc.h
* add more test
* add error check, refine error message

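As a rough illustration of what these common infer-shape helpers do (a schematic Python sketch under my own naming, not Paddle's actual C++ code): an "unchanged" infer-shape simply propagates the input shape, while a broadcast infer-shape combines two shapes NumPy-style.

```python
def unchanged_infer_shape(x_shape):
    # element-wise ops such as activations keep the input shape as-is
    return list(x_shape)

def broadcast_infer_shape(x_shape, y_shape):
    # NumPy-style broadcasting, aligned from the trailing dimensions
    x, y = list(x_shape), list(y_shape)
    n = max(len(x), len(y))
    x = [1] * (n - len(x)) + x
    y = [1] * (n - len(y)) + y
    out = []
    for a, b in zip(x, y):
        if a != b and a != 1 and b != 1:
            raise ValueError(f"cannot broadcast {x_shape} with {y_shape}")
        out.append(max(a, b))
    return out

assert unchanged_infer_shape([8, 16]) == [8, 16]
assert broadcast_infer_shape([8, 1, 16], [4, 16]) == [8, 4, 16]
```
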
- 10 Aug 2020, 1 commit

Committed by Adam

* Add oneDNN relu6 op
* Lint fixes

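relu6 is the standard ReLU clipped at 6; a minimal NumPy reference for what the op computes (not the oneDNN kernel itself):

```python
import numpy as np

def relu6(x):
    # relu6(x) = min(max(x, 0), 6)
    return np.minimum(np.maximum(x, 0.0), 6.0)

print(relu6(np.array([-3.0, 2.5, 8.0])))   # [0.  2.5 6. ]
```
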