Crayon鑫 / Paddle (forked from PaddlePaddle / Paddle)

Commit 39f69727
Authored by hedaoyuan on January 23, 2017; committed by GitHub on January 23, 2017

Merge pull request #1202 from hedaoyuan/cmrnorm

Add some comment of CrossMapNormalFunc

Parents: f4678331 5b9450ae
Showing 1 changed file with 73 additions and 10 deletions (+73, -10):
paddle/function/CrossMapNormalOp.cpp
@@ -112,11 +112,51 @@ void CrossMapNormalGrad<DEVICE_TYPE_CPU>(real* inputsGrad,
 }
 
 /**
- * \brief {o_0, o_1} = calc(i_0)
+ * \brief Normalization across maps.
  *
- * \param inputs[0] input value.
- * \param outputs[0] output value.
- * \param outputs[1] denoms.
+ * This Function comes from the paper
+ * "ImageNet Classification with Deep Convolutional Neural Networks".
+ *
+ * The original formula is:
+ *
+ *                                Input(i, x, y)
+ * Output(i, x, y) = ----------------------------------------------
+ *                                 -- upper
+ *                    (k + alpha * >  (Input(j, x, y))^2) ^ (beta)
+ *                                 -- j = lower
+ *
+ * upper is `min(C, c + N/2)`
+ * lower is `max(0, c - N/2)`
+ *
+ * Function implementation:
+ *
+ * inputs and outputs are in NCHW format, and input.shape.ndims() is equal to 4.
+ * The meaning of each dimension (0-3) is, respectively, batch size,
+ * feature maps, rows and columns.
+ *
+ * Input and Output in the above formula are for each map (i) of one image, and
+ * Input(i, x, y), Output(i, x, y) represent one element of an image.
+ *
+ * C is the number of feature maps of one image, and N is a hyper-parameter
+ * configured when the Function is initialized. The sum in the denominator
+ * runs over the same position in the neighboring maps.
+ *
+ * In the implementation of the Function, k is equal to 1,
+ * so the Function has no argument for k.
+ *
+ * Function Arguments:
+ *
+ * \param size_      represents N
+ * \param scale_     represents alpha
+ * \param pow_       represents beta
+ * \param inputs[0]  represents Input
+ * \param outputs[0] represents Output
+ * \param outputs[1] represents the denominator in the formula (without the beta exponent)
+ *
+ * Note:
+ * Saving outputs[1] simplifies the backward calculation.
+ * TODO: if only the forward calculation is needed, this can be optimized
+ * by removing outputs[1].
+ */
 template <DeviceType Device>
 class CrossMapNormalFunc : public FunctionBase {
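For reference, the following is a minimal standalone sketch of the forward computation documented in the comment above, using the same NCHW layout and argument meanings (size_ as N, scale_ as alpha, pow_ as beta, k fixed to 1). The function name, the plain nested loops, and the inclusive window over valid channel indices are illustrative assumptions, not the actual CrossMapNormalFunc implementation.

// Illustrative sketch only, not the Paddle implementation: forward
// cross-map normalization on an NCHW tensor with k == 1.
#include <algorithm>
#include <cmath>

void crossMapNormalForwardSketch(const float* input,   // NCHW input values
                                 float* output,        // NCHW output values
                                 float* denoms,        // NCHW, saved for backward
                                 int numSamples, int channels, int height, int width,
                                 int size,        // size_  : N in the formula
                                 float scale,     // scale_ : alpha
                                 float powArg) {  // pow_   : beta
  auto idx = [&](int n, int c, int h, int w) {
    return ((n * channels + c) * height + h) * width + w;
  };
  for (int n = 0; n < numSamples; ++n) {
    for (int c = 0; c < channels; ++c) {
      // Window of neighboring maps, clipped to valid channel indices.
      const int lower = std::max(0, c - size / 2);
      const int upper = std::min(channels - 1, c + size / 2);
      for (int h = 0; h < height; ++h) {
        for (int w = 0; w < width; ++w) {
          // Sum of squares at the same (h, w) across the neighboring maps.
          float sum = 0.0f;
          for (int j = lower; j <= upper; ++j) {
            const float v = input[idx(n, j, h, w)];
            sum += v * v;
          }
          const float denom = 1.0f + scale * sum;  // k == 1, so no k argument
          denoms[idx(n, c, h, w)] = denom;         // kept to simplify backward
          output[idx(n, c, h, w)] = input[idx(n, c, h, w)] * std::pow(denom, -powArg);
        }
      }
    }
  }
}

Saving denoms here mirrors the Note in the comment: the backward pass can reuse (k + alpha * sum) directly instead of recomputing it.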
@@ -161,13 +201,36 @@ private:
 };
 
 /**
- * \brief {o_0} = calc(i_0, i_1, i_2, i_3)
+ * \brief Backward calculation for normalization across maps.
  *
+ * Function implementation:
+ *
+ * The implementation of this Function is derived from the
+ * CrossMapNormalFunc implementation.
+ *
+ * InputGrad = OutputGrad * denoms ^ (-beta)
+ *    -- upper
+ *  + > (OutputGrad * OutputValue * (-2 * alpha * beta) / denoms) * InputValue
+ *    -- lower
+ *
+ * The inputs/outputs data format is the same as in the forward interface
+ * and is NCHW.
+ *
+ * The upper and lower bounds are the same as in the forward calculation,
+ * and the logic of the sum is also the same as in the forward calculation.
+ *
+ * Function Arguments:
+ *
- * \param inputs[0] input value.
- * \param inputs[1] output value.
- * \param inputs[2] output grad.
- * \param inputs[3] denoms.
- * \param outputs[0] input grad.
+ * \param size_      represents N
+ * \param scale_     represents alpha
+ * \param pow_       represents beta
+ * \param inputs[0]  represents InputValue, inputs[0] of CrossMapNormalFunc
+ * \param inputs[1]  represents OutputValue, outputs[0] of CrossMapNormalFunc
+ * \param inputs[2]  represents OutputGrad
+ * \param inputs[3]  represents denoms, outputs[1] of CrossMapNormalFunc;
+ *                   this is the intermediate result that is
+ *                   preserved in the forward calculation.
+ * \param outputs[0] represents InputGrad
+ */
 template <DeviceType Device>
 class CrossMapNormalGradFunc : public FunctionBase {
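Similarly, here is a minimal sketch of the backward formula documented above, under the same layout and window assumptions as the forward sketch; the function name and loop structure are illustrative and not the actual CrossMapNormalGradFunc implementation. The pointer arguments mirror the documented inputs[0..3] (InputValue, OutputValue, OutputGrad, denoms) and outputs[0] (InputGrad).

// Illustrative sketch only, not the Paddle implementation: backward pass of
// cross-map normalization, reusing the denoms saved by the forward pass.
#include <algorithm>
#include <cmath>

void crossMapNormalBackwardSketch(const float* inputValue,   // inputs[0]
                                  const float* outputValue,  // inputs[1]
                                  const float* outputGrad,   // inputs[2]
                                  const float* denoms,       // inputs[3]
                                  float* inputGrad,          // outputs[0]
                                  int numSamples, int channels, int height, int width,
                                  int size, float scale, float powArg) {
  auto idx = [&](int n, int c, int h, int w) {
    return ((n * channels + c) * height + h) * width + w;
  };
  for (int n = 0; n < numSamples; ++n) {
    for (int c = 0; c < channels; ++c) {
      const int lower = std::max(0, c - size / 2);
      const int upper = std::min(channels - 1, c + size / 2);
      for (int h = 0; h < height; ++h) {
        for (int w = 0; w < width; ++w) {
          const int i = idx(n, c, h, w);
          // First term: OutputGrad * denoms ^ (-beta).
          float grad = outputGrad[i] * std::pow(denoms[i], -powArg);
          // Second term: every neighboring map j whose window covers c adds
          // (OutputGrad * OutputValue * (-2 * alpha * beta) / denoms) * InputValue.
          float sum = 0.0f;
          for (int j = lower; j <= upper; ++j) {
            const int k = idx(n, j, h, w);
            sum += outputGrad[k] * outputValue[k] / denoms[k];
          }
          inputGrad[i] = grad + (-2.0f * scale * powArg) * sum * inputValue[i];
        }
      }
    }
  }
}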