Unverified · Commit 496a9a3a · authored by Molly Smith, committed by GitHub

Diffusers 0.15.0 bug fix (#3345)

* diffusers 0.15.0 cross attention class check

* revert diffusers_attention.py
Parent 6e1cbebe
@@ -233,7 +233,10 @@ def generic_injection(module, fp16=False, enable_cuda_graph=True):
     try:
         import diffusers
-        cross_attention = diffusers.models.attention.CrossAttention
+        if hasattr(diffusers.models.attention, 'CrossAttention'):
+            cross_attention = diffusers.models.attention.CrossAttention
+        else:
+            cross_attention = diffusers.models.attention_processor.Attention
         attention_block = diffusers.models.attention.BasicTransformerBlock
         new_policies = {
             cross_attention: replace_attn,
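The fix above guards against diffusers 0.15.0 renaming `CrossAttention` (in `diffusers.models.attention`) to `Attention` (in `diffusers.models.attention_processor`). The same `hasattr` fallback pattern can be sketched without diffusers installed; `resolve_attention_class` and the `SimpleNamespace` stand-ins below are hypothetical, used only to simulate the old and new module layouts.

```python
# Minimal sketch of the compatibility pattern from the fix: prefer the
# pre-0.15.0 class name, fall back to the renamed one. The namespaces
# below are stand-ins for diffusers.models.attention and
# diffusers.models.attention_processor.
from types import SimpleNamespace


def resolve_attention_class(attention_module, processor_module):
    """Return CrossAttention if the module still exposes it, else Attention."""
    if hasattr(attention_module, 'CrossAttention'):
        return attention_module.CrossAttention
    return processor_module.Attention


# Simulate diffusers < 0.15.0: CrossAttention still lives in attention.
old_attention = SimpleNamespace(CrossAttention=type('CrossAttention', (), {}))
old_processor = SimpleNamespace()
print(resolve_attention_class(old_attention, old_processor).__name__)

# Simulate diffusers >= 0.15.0: the class is attention_processor.Attention.
new_attention = SimpleNamespace()
new_processor = SimpleNamespace(Attention=type('Attention', (), {}))
print(resolve_attention_class(new_attention, new_processor).__name__)
```

Resolving the class once, then keying the `new_policies` dict on the result, keeps the injection code itself agnostic to which diffusers version is installed.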