Mirror of https://github.com/huggingface/diffusers.git, synced 2025-12-08 13:34:27 +08:00.
# Attention Processor
An attention processor is a class that implements a specific attention computation. Swapping the processor on a model's attention blocks switches the attention mechanism without changing the model's architecture.
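Processors are callables: an attention block delegates its forward computation to whichever processor is set on it. As a rough illustration of the core computation every processor implements, here is scaled dot-product attention in plain NumPy (a conceptual sketch, not the diffusers API; `ToyAttnProcessor` is a made-up name):

```python
import numpy as np

def softmax(x, axis=-1):
    # numerically stable softmax
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

class ToyAttnProcessor:
    """Toy stand-in for an attention processor: a callable mapping
    (query, key, value) to the attention output."""

    def __call__(self, query, key, value):
        d = query.shape[-1]
        scores = query @ key.swapaxes(-1, -2) / np.sqrt(d)
        return softmax(scores) @ value

rng = np.random.default_rng(0)
q = rng.standard_normal((2, 4, 8))  # (batch, sequence, head_dim)
k = rng.standard_normal((2, 4, 8))
v = rng.standard_normal((2, 4, 8))
out = ToyAttnProcessor()(q, k, v)   # shape (2, 4, 8)
```

The variants documented below keep this interface but change how (or where) the computation runs: PyTorch 2.0 SDPA, xFormers, sliced execution, LoRA-augmented projections, and so on.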
## AttnProcessor
[[autodoc]] models.attention_processor.AttnProcessor
## AttnProcessor2_0
[[autodoc]] models.attention_processor.AttnProcessor2_0
## LoRAAttnProcessor
[[autodoc]] models.attention_processor.LoRAAttnProcessor
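The LoRA processors augment the frozen attention projection weights with trainable low-rank matrices, scaled by `network_alpha` divided by the rank. A minimal sketch of that update in plain NumPy (the names `down`, `up`, and `lora_proj` are illustrative, not the library's internals):

```python
import numpy as np

rng = np.random.default_rng(0)
d_in, d_out, rank, network_alpha = 8, 8, 4, 4.0

W = rng.standard_normal((d_out, d_in))    # frozen base projection weight
down = rng.standard_normal((rank, d_in))  # trainable low-rank "down" matrix
up = np.zeros((d_out, rank))              # trainable "up" matrix, zero-initialized

def lora_proj(x):
    # base projection plus a low-rank update, scaled by network_alpha / rank
    return x @ W.T + (network_alpha / rank) * (x @ down.T) @ up.T

x = rng.standard_normal((2, d_in))
y = lora_proj(x)
```

Because `up` starts at zero, the LoRA branch contributes nothing at initialization and the layer behaves exactly like the frozen base projection; training only updates `down` and `up`.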
## LoRAAttnProcessor2_0
[[autodoc]] models.attention_processor.LoRAAttnProcessor2_0
## CustomDiffusionAttnProcessor
[[autodoc]] models.attention_processor.CustomDiffusionAttnProcessor
## AttnAddedKVProcessor
[[autodoc]] models.attention_processor.AttnAddedKVProcessor
## AttnAddedKVProcessor2_0
[[autodoc]] models.attention_processor.AttnAddedKVProcessor2_0
## LoRAAttnAddedKVProcessor
[[autodoc]] models.attention_processor.LoRAAttnAddedKVProcessor
## XFormersAttnProcessor
[[autodoc]] models.attention_processor.XFormersAttnProcessor
## LoRAXFormersAttnProcessor
[[autodoc]] models.attention_processor.LoRAXFormersAttnProcessor
## CustomDiffusionXFormersAttnProcessor
[[autodoc]] models.attention_processor.CustomDiffusionXFormersAttnProcessor
## SlicedAttnProcessor
[[autodoc]] models.attention_processor.SlicedAttnProcessor
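The sliced processors trade speed for memory by computing attention over slices of the batch dimension rather than all at once, which caps the peak size of the intermediate score matrix. A sketch of the idea in plain NumPy (`slice_size` mirrors the role of the real argument, but this is not the library code):

```python
import numpy as np

def softmax(x, axis=-1):
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def attention(q, k, v):
    # full scaled dot-product attention over the whole batch
    return softmax(q @ k.swapaxes(-1, -2) / np.sqrt(q.shape[-1])) @ v

def sliced_attention(q, k, v, slice_size):
    # process the batch dimension in slices to cap peak memory
    out = np.empty_like(v)
    for i in range(0, q.shape[0], slice_size):
        s = slice(i, i + slice_size)
        out[s] = attention(q[s], k[s], v[s])
    return out

rng = np.random.default_rng(0)
q, k, v = (rng.standard_normal((6, 4, 8)) for _ in range(3))
full = attention(q, k, v)
sliced = sliced_attention(q, k, v, slice_size=2)
```

Each batch element's attention is independent, so the sliced result matches the full computation exactly; only peak memory changes.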
## SlicedAttnAddedKVProcessor
[[autodoc]] models.attention_processor.SlicedAttnAddedKVProcessor