Ranting8323 / stable-diffusion-webui-composable-lora
Commit db3552a3
Authored on Jun 28, 2023 by a2569875
Parent: dd29e057

Commit message: trying fix issues: #8 and #11 Not sure if it works

Showing 2 changed files with 28 additions and 1 deletion:
- composable_lora_function_handler.py (+5, -1)
- scripts/composable_lora_script.py (+23, -0)
composable_lora_function_handler.py @ db3552a3

```python
# @@ -24,7 +24,11 @@ def on_enable():
composable_lycoris.backup_MultiheadAttention_forward_before_lyco = torch.nn.MultiheadAttention_forward_before_lyco
if hasattr(torch.nn, 'MultiheadAttention_load_state_dict_before_lyco'):
    composable_lycoris.backup_MultiheadAttention_load_state_dict_before_lyco = torch.nn.MultiheadAttention_load_state_dict_before_lyco
if hasattr(composable_lora, 'lyco_notfound'):
    if composable_lora.lyco_notfound:
        torch.nn.Linear_forward_before_lyco = composable_lora.Linear_forward_before_clora
        torch.nn.Conv2d_forward_before_lyco = composable_lora.Conv2d_forward_before_clora
        torch.nn.MultiheadAttention_forward_before_lyco = composable_lora.MultiheadAttention_forward_before_clora
torch.nn.Linear.forward = composable_lora.lora_Linear_forward
torch.nn.Conv2d.forward = composable_lora.lora_Conv2d_forward
torch.nn.MultiheadAttention.forward = lycoris.lyco_MultiheadAttention_forward
```
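The hunk above uses a backup-and-restore monkey-patching pattern: the original `forward` is stashed under a `*_before_*` attribute before being replaced, and `hasattr` guards keep a second patch from clobbering the saved original. A minimal self-contained sketch of that pattern, using a hypothetical `Greeter` class rather than the extension's actual `torch.nn` targets:

```python
class Greeter:
    def greet(self):
        return "hello"

def patched_greet(self):
    # call the saved original, then extend its behavior
    return Greeter.greet_before_patch(self) + ", world"

# back up the original only once, so patching twice
# never loses the true original implementation
if not hasattr(Greeter, "greet_before_patch"):
    Greeter.greet_before_patch = Greeter.greet
Greeter.greet = patched_greet

def unload():
    # restore the saved original, mirroring unload() in the script diff
    Greeter.greet = Greeter.greet_before_patch
```

After patching, `Greeter().greet()` returns the extended result; calling `unload()` restores the original behavior, which is exactly the contract the `*_before_lora` / `*_before_lyco` attributes implement on `torch.nn`.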
scripts/composable_lora_script.py @ db3552a3

```python
# @@ -15,6 +15,24 @@ def unload():
    torch.nn.Conv2d.forward = torch.nn.Conv2d_forward_before_lora
    torch.nn.MultiheadAttention.forward = torch.nn.MultiheadAttention_forward_before_lora

if not hasattr(composable_lora, 'Linear_forward_before_clora'):
    if hasattr(torch.nn, 'Linear_forward_before_lyco'):
        composable_lora.Linear_forward_before_clora = torch.nn.Linear_forward_before_lyco
    else:
        composable_lora.Linear_forward_before_clora = torch.nn.Linear.forward
if not hasattr(composable_lora, 'Conv2d_forward_before_clora'):
    if hasattr(torch.nn, 'Conv2d_forward_before_lyco'):
        composable_lora.Conv2d_forward_before_clora = torch.nn.Conv2d_forward_before_lyco
    else:
        composable_lora.Conv2d_forward_before_clora = torch.nn.Conv2d.forward
if not hasattr(composable_lora, 'MultiheadAttention_forward_before_clora'):
    if hasattr(torch.nn, 'MultiheadAttention_forward_before_lyco'):
        composable_lora.MultiheadAttention_forward_before_clora = torch.nn.MultiheadAttention_forward_before_lyco
    else:
        composable_lora.MultiheadAttention_forward_before_clora = torch.nn.MultiheadAttention.forward
if not hasattr(torch.nn, 'Linear_forward_before_lora'):
    if hasattr(torch.nn, 'Linear_forward_before_lyco'):
        torch.nn.Linear_forward_before_lora = torch.nn.Linear_forward_before_lyco
```
```python
# @@ -33,6 +51,11 @@ if not hasattr(torch.nn, 'MultiheadAttention_forward_before_lora'):
    else:
        torch.nn.MultiheadAttention_forward_before_lora = torch.nn.MultiheadAttention.forward
if hasattr(torch.nn, 'Linear_forward_before_lyco'):
    composable_lora.lyco_notfound = False
else:
    composable_lora.lyco_notfound = True
torch.nn.Linear.forward = composable_lora.lora_Linear_forward
torch.nn.Conv2d.forward = composable_lora.lora_Conv2d_forward
```
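The `hasattr(torch.nn, 'Linear_forward_before_lyco')` checks in this file are how the script detects whether the LyCORIS extension has already patched `torch.nn`: if the saved-original attribute exists, that saved function is treated as the true original; otherwise `lyco_notfound` is set and the class's own `forward` is used. A hedged sketch of that detection logic, using a stand-in namespace and a hypothetical helper (`detect_prior_patch` is not a function from the extension):

```python
from types import SimpleNamespace

def detect_prior_patch(ns, saved_name, current_fn):
    """Return (original_fn, notfound): the function saved by an earlier
    patcher if one exists on `ns`, else the current function itself."""
    if hasattr(ns, saved_name):
        # another extension already patched and saved the original
        return getattr(ns, saved_name), False
    # no prior patch detected; the current function *is* the original
    return current_fn, True
```

With an empty namespace the helper reports "not found" and falls back to the current function; once a `Linear_forward_before_lyco`-style attribute is set, it returns that saved original instead, which is the same decision the diff encodes with its `if hasattr(...)` / `else` branches.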