1. 11 Jul, 2023 (2 commits)
  2. 10 Jul, 2023 (1 commit)
  3. 27 Jun, 2023 (7 commits)
  4. 25 Jun, 2023 (1 commit)
  5. 18 Jun, 2023 (1 commit)
  6. 14 Jun, 2023 (2 commits)
  7. 13 Jun, 2023 (3 commits)
  8. 10 Jun, 2023 (9 commits)
  9. 09 Jun, 2023 (3 commits)
  10. 08 Jun, 2023 (2 commits)
  11. 07 Jun, 2023 (3 commits)
    • Merge pull request #11058 from AUTOMATIC1111/api-wiki · cf28aed1
      Committed by AUTOMATIC1111
      link footer API to Wiki when API is not active
    • Merge pull request #11066 from aljungberg/patch-1 · 806ea639
      Committed by AUTOMATIC1111
      Fix upcast attention dtype error.
    • Fix upcast attention dtype error. · d9cc0910
      Committed by Alexander Ljungberg
      Without this fix, enabling the "Upcast cross attention layer to float32" option while also using `--opt-sdp-attention` breaks generation with an error:
      
      ```
        File "/ext3/automatic1111/stable-diffusion-webui/modules/sd_hijack_optimizations.py", line 612, in sdp_attnblock_forward
          out = torch.nn.functional.scaled_dot_product_attention(q, k, v, dropout_p=0.0, is_causal=False)
      RuntimeError: Expected query, key, and value to have the same dtype, but got query.dtype: float key.dtype: float and value.dtype: c10::Half instead.
      ```
      
      The fix is to make sure the value tensor is upcast as well; a minimal sketch of the idea follows below.
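      For anyone hitting the same error, here is a minimal, hypothetical sketch of the shape of the fix. The function name and the `upcast_attn` flag are simplified stand-ins, not the exact code from `modules/sd_hijack_optimizations.py`; the point is only that all three tensors passed to `scaled_dot_product_attention` must share a dtype.

      ```python
      import torch

      def sdp_attnblock_forward_sketch(q, k, v, upcast_attn=True):
          # Simplified, hypothetical stand-in for the webui's sdp_attnblock_forward.
          if upcast_attn:
              # Before the fix, only q and k were promoted to float32, leaving v
              # in float16 and triggering the RuntimeError shown above.
              q, k, v = q.float(), k.float(), v.float()  # upcast v too
          return torch.nn.functional.scaled_dot_product_attention(
              q, k, v, dropout_p=0.0, is_causal=False
          )
      ```

      Downstream code would typically cast the result back to the tensors' original dtype before continuing.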
  12. 06 Jun, 2023 (6 commits)