Greenplum / Annotated Deep Learning Paper Implementations
Commit f4b2d469
Authored: Sep 24, 2022
Author: Varuna Jayasiri

fix

Parent: eb92824e

2 changed files with 2 additions and 2 deletions (+2 -2):
  docs/diffusion/stable_diffusion/model/unet_attention.html (+1 -1)
  labml_nn/diffusion/stable_diffusion/model/unet_attention.py (+1 -1)
docs/diffusion/stable_diffusion/model/unet_attention.html

The generated docs page renders the same code change; reconstructed from the highlighted markup (leading numbers are the source line numbers shown in the docs):

@@ -602,7 +602,7 @@
 173        k = self.to_k(cond)
 174        v = self.to_v(cond)
 175
-176        print('use flash', CrossAttention.use_flash_attention)
+176        print('use flash', CrossAttention.use_flash_attention, self.flash)
 177
 178        if CrossAttention.use_flash_attention and self.flash is not None and cond is None and self.d_head <= 128:
 179            return self.flash_attention(q, k, v)
labml_nn/diffusion/stable_diffusion/model/unet_attention.py

@@ -173,7 +173,7 @@ class CrossAttention(nn.Module):
         k = self.to_k(cond)
         v = self.to_v(cond)

-        print('use flash', CrossAttention.use_flash_attention)
+        print('use flash', CrossAttention.use_flash_attention, self.flash)

        if CrossAttention.use_flash_attention and self.flash is not None and cond is None and self.d_head <= 128:
            return self.flash_attention(q, k, v)
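The commit extends a debug print so it also shows `self.flash`, i.e. whether the flash-attention module was actually constructed, since the `if` guard below it checks four conditions before taking the flash path. As a minimal sketch of that gating logic (the simplified class below is hypothetical and only mirrors the names in the diff, not the real `labml_nn` implementation):

```python
# Hypothetical, simplified sketch of the gating that the patched print inspects.
# Names (use_flash_attention, flash, d_head, cond) mirror the diff.

class CrossAttention:
    # Class-level switch: flash attention can be enabled globally.
    use_flash_attention = False

    def __init__(self, d_head, flash=None):
        self.d_head = d_head  # dimension of each attention head
        self.flash = flash    # flash-attention module, or None if unavailable

    def should_use_flash(self, cond):
        # Flash attention is taken only when all four conditions hold:
        # 1. the global switch is on,
        # 2. the flash module was constructed (its import may have failed),
        # 3. there is no conditioning, i.e. plain self-attention,
        # 4. the head dimension is within the kernel's supported limit.
        return (CrossAttention.use_flash_attention
                and self.flash is not None
                and cond is None
                and self.d_head <= 128)

attn = CrossAttention(d_head=64, flash=object())
print(attn.should_use_flash(cond=None))       # False: global switch is off
CrossAttention.use_flash_attention = True
print(attn.should_use_flash(cond=None))       # True: all four conditions hold
print(attn.should_use_flash(cond="context"))  # False: conditioned cross-attention
```

This explains why the original print was misleading: `use_flash_attention` alone can be `True` while `self.flash` is still `None`, so the flash path is silently skipped; printing both values makes the actual decision visible.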