# Attention Processor

An attention processor is a class for applying different types of attention mechanisms.
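
The snippet below is a minimal sketch of how a processor is typically attached, assuming the usual `set_attn_processor` API on diffusers models; the checkpoint id is only illustrative.

```py
import torch
from diffusers import UNet2DConditionModel
from diffusers.models.attention_processor import AttnProcessor2_0

# Load only the UNet of a Stable Diffusion checkpoint (illustrative id).
unet = UNet2DConditionModel.from_pretrained(
    "runwayml/stable-diffusion-v1-5", subfolder="unet", torch_dtype=torch.float16
)

# Use PyTorch 2.0 scaled dot-product attention in every attention module.
unet.set_attn_processor(AttnProcessor2_0())

# `attn_processors` maps each attention module name to its current processor.
print({type(p).__name__ for p in unet.attn_processors.values()})
```
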
## AttnProcessor

[[autodoc]] models.attention_processor.AttnProcessor

## AttnProcessor2_0

[[autodoc]] models.attention_processor.AttnProcessor2_0

## FusedAttnProcessor2_0

[[autodoc]] models.attention_processor.FusedAttnProcessor2_0
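
As a sketch of how `FusedAttnProcessor2_0` is typically enabled (assuming a pipeline such as Stable Diffusion XL that exposes `fuse_qkv_projections()`; the checkpoint id is illustrative), fusing the QKV projections switches the attention modules to the fused processor:

```py
import torch
from diffusers import StableDiffusionXLPipeline

pipeline = StableDiffusionXLPipeline.from_pretrained(
    "stabilityai/stable-diffusion-xl-base-1.0", torch_dtype=torch.float16
).to("cuda")

# Fuse the separate Q/K/V projections into a single projection; attention
# then runs with FusedAttnProcessor2_0.
pipeline.fuse_qkv_projections()
image = pipeline("an astronaut riding a green horse").images[0]

# Restore the original, unfused projections and processors.
pipeline.unfuse_qkv_projections()
```
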
## LoRAAttnProcessor

[[autodoc]] models.attention_processor.LoRAAttnProcessor

## LoRAAttnProcessor2_0

[[autodoc]] models.attention_processor.LoRAAttnProcessor2_0

## CustomDiffusionAttnProcessor

[[autodoc]] models.attention_processor.CustomDiffusionAttnProcessor

## CustomDiffusionAttnProcessor2_0

[[autodoc]] models.attention_processor.CustomDiffusionAttnProcessor2_0

## AttnAddedKVProcessor

[[autodoc]] models.attention_processor.AttnAddedKVProcessor

## AttnAddedKVProcessor2_0

[[autodoc]] models.attention_processor.AttnAddedKVProcessor2_0

## LoRAAttnAddedKVProcessor

[[autodoc]] models.attention_processor.LoRAAttnAddedKVProcessor

## XFormersAttnProcessor

[[autodoc]] models.attention_processor.XFormersAttnProcessor

## LoRAXFormersAttnProcessor

[[autodoc]] models.attention_processor.LoRAXFormersAttnProcessor

## CustomDiffusionXFormersAttnProcessor

[[autodoc]] models.attention_processor.CustomDiffusionXFormersAttnProcessor

## SlicedAttnProcessor

[[autodoc]] models.attention_processor.SlicedAttnProcessor

## SlicedAttnAddedKVProcessor

[[autodoc]] models.attention_processor.SlicedAttnAddedKVProcessor