comfyanonymous | 6908f9c949 | This makes pytorch 2.0 attention perform a bit faster. | 2 years ago
comfyanonymous | 3696d1699a | Add support for GLIGEN textbox model. | 2 years ago
comfyanonymous | 73c3e11e83 | Fix model_management import so it doesn't get executed twice. | 2 years ago
EllangoK | e5e587b1c0 | Separates out arg parser and imports args. | 2 years ago
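The commit above splits argument parsing into its own module so other files can import the already-parsed args instead of re-parsing. A minimal sketch of that pattern; the module and flag names here are illustrative, not necessarily ComfyUI's exact ones:

```python
# cli_args.py -- a standalone argument-parser module (hypothetical names).
import argparse

parser = argparse.ArgumentParser(description="Example app")
parser.add_argument("--listen", type=str, default="127.0.0.1",
                    help="IP address to bind the server to.")
parser.add_argument("--port", type=int, default=8188,
                    help="Port to listen on.")

# Parsed once at import time; other modules just do
# `from cli_args import args` to read the shared values.
args = parser.parse_args()
```

Because the module parses at import time, every `from cli_args import args` sees the same object and sys.argv is only read once.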
comfyanonymous | 18a6c1db33 | Add a TomePatchModel node to the _for_testing section. Tome increases sampling speed at the expense of quality. | 2 years ago
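For context on the trade-off Tome (token merging) makes: it averages together the most similar token pairs before attention so later layers process fewer tokens, which is where the speed gain and the quality loss both come from. A toy sketch of the core idea, not ComfyUI's or the ToMe paper's exact algorithm:

```python
import torch
import torch.nn.functional as F

def merge_tokens(x: torch.Tensor, r: int) -> torch.Tensor:
    """Merge the r most similar src tokens into their best dst match."""
    # x: (batch, tokens, channels); alternate tokens into src/dst sets.
    src, dst = x[:, ::2, :], x[:, 1::2, :].clone()
    # Cosine similarity of every src token against every dst token.
    scores = F.normalize(src, dim=-1) @ F.normalize(dst, dim=-1).transpose(-1, -2)
    best_score, best_dst = scores.max(dim=-1)   # best dst match per src token
    order = best_score.argsort(dim=-1)          # ascending similarity
    keep, merge = order[:, :-r], order[:, -r:]  # least similar src tokens stay
    b = torch.arange(x.shape[0]).unsqueeze(-1)
    # Average each merged src token into its matched dst token (a real
    # implementation tracks merge counts for a proper weighted mean).
    dst[b, best_dst[b, merge]] = (dst[b, best_dst[b, merge]] + src[b, merge]) / 2
    return torch.cat([src[b, keep], dst], dim=1)

x = torch.randn(2, 64, 320)
print(merge_tokens(x, r=16).shape)  # torch.Size([2, 48, 320])
```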
comfyanonymous | 61ec3c9d5d | Add a way to pass options to the transformer blocks. | 2 years ago
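A minimal sketch of what "passing options to the transformer blocks" can look like: an options dict threaded through forward() so callers can inject per-call behavior without changing the block's signature for every new feature. The names here are hypothetical, not the actual ComfyUI API:

```python
import torch
import torch.nn as nn

class Block(nn.Module):
    def __init__(self, dim: int):
        super().__init__()
        self.attn = nn.MultiheadAttention(dim, num_heads=4, batch_first=True)

    def forward(self, x, transformer_options=None):
        opts = transformer_options or {}
        out, _ = self.attn(x, x, x)
        # Optional hooks supplied by the caller through the options dict.
        for patch in opts.get("attn_output_patches", []):
            out = patch(out)
        return x + out

block = Block(64)
x = torch.randn(2, 16, 64)
y = block(x, transformer_options={"attn_output_patches": [lambda t: t * 0.5]})
```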
comfyanonymous | 3ed4a4e4e6 | Try again with VAE tiled decoding if regular decoding fails because of OOM. | 2 years ago
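The pattern behind this commit is a straightforward OOM fallback: try the full decode, and if the GPU allocation fails, retry tile-by-tile. A hedged sketch, where the vae object and tile size are assumptions; real implementations overlap the tiles and blend the seams, and on PyTorch older than 1.13 you would catch RuntimeError instead of torch.cuda.OutOfMemoryError:

```python
import torch

def decode(vae, latent: torch.Tensor) -> torch.Tensor:
    try:
        return vae.decode(latent)        # full-resolution attempt
    except torch.cuda.OutOfMemoryError:
        torch.cuda.empty_cache()         # drop the failed allocation
        return decode_tiled(vae, latent)

def decode_tiled(vae, latent: torch.Tensor, tile: int = 64) -> torch.Tensor:
    # Decode non-overlapping latent tiles and stitch them back together.
    _, _, h, w = latent.shape
    rows = []
    for y in range(0, h, tile):
        cols = [vae.decode(latent[:, :, y:y + tile, x:x + tile])
                for x in range(0, w, tile)]
        rows.append(torch.cat(cols, dim=-1))
    return torch.cat(rows, dim=-2)
```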
comfyanonymous | 83f23f82b8 | Add pytorch attention support to VAE. | 2 years ago
comfyanonymous | a256a2abde | --disable-xformers should not even try to import xformers. | 2 years ago
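The guard this commit describes is simple: check the flag before the import, so `--disable-xformers` never even attempts it and a missing install degrades gracefully. A self-contained sketch; the flag and variable names are illustrative:

```python
import argparse

parser = argparse.ArgumentParser()
parser.add_argument("--disable-xformers", action="store_true")
args = parser.parse_args()

XFORMERS_IS_AVAILABLE = False
if not args.disable_xformers:          # skip the import entirely when disabled
    try:
        import xformers
        import xformers.ops
        XFORMERS_IS_AVAILABLE = True
    except ImportError:
        print("xformers not installed; using fallback attention.")
```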
comfyanonymous | 0f3ba7482f | Xformers is now properly disabled when --cpu is used. Added --windows-standalone-build option; currently it only makes the code open ComfyUI in the browser. | 2 years ago
comfyanonymous | 798c90e1c0 | Fix pytorch 2.0 cross attention not working. | 2 years ago
comfyanonymous | c1f5855ac1 | Make some cross attention functions work on the CPU. | 2 years ago
comfyanonymous | 1a612e1c74 | Add some pytorch scaled_dot_product_attention code for testing. Use --use-pytorch-cross-attention to enable it. | 2 years ago
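For reference, torch.nn.functional.scaled_dot_product_attention is the PyTorch 2.0 fused attention entry point this commit wires in; it dispatches to a flash or memory-efficient kernel automatically when one applies:

```python
import torch
import torch.nn.functional as F

q = torch.randn(1, 8, 1024, 64)   # (batch, heads, tokens, head_dim)
k = torch.randn(1, 8, 1024, 64)
v = torch.randn(1, 8, 1024, 64)

# Computes softmax(q @ k^T / sqrt(head_dim)) @ v in one fused call.
out = F.scaled_dot_product_attention(q, k, v)
print(out.shape)  # torch.Size([1, 8, 1024, 64])
```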
comfyanonymous | 9502ee45c3 | Hopefully fix a strange issue with xformers + lowvram. | 2 years ago
comfyanonymous | c9daec4c89 | Remove prints that are useless when xformers is enabled. | 2 years ago
comfyanonymous | 773cdabfce | Same thing but for the other places where it's used. | 2 years ago
comfyanonymous | 50db297cf6 | Try to fix OOM issues with cards that have less VRAM than mine. | 2 years ago
comfyanonymous | 051f472e8f | Fix sub-quadratic attention for SD2 and make it the default optimization. | 2 years ago
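For background on what a sub-quadratic attention optimization buys: the score matrix is computed in chunks instead of all at once, so peak memory no longer scales with the full tokens x tokens product. A simplified query-chunked sketch of that idea; the real sub-quadratic algorithm also chunks keys and values using a numerically stable streaming softmax:

```python
import math
import torch

def chunked_attention(q, k, v, chunk: int = 256) -> torch.Tensor:
    # q, k, v: (..., tokens, head_dim). Only a (chunk x tokens) slice of the
    # score matrix exists at any moment, instead of the full (tokens x tokens).
    scale = 1.0 / math.sqrt(q.shape[-1])
    out = torch.empty_like(q)
    for i in range(0, q.shape[-2], chunk):
        scores = torch.softmax(q[..., i:i + chunk, :] @ k.transpose(-1, -2) * scale,
                               dim=-1)
        out[..., i:i + chunk, :] = scores @ v
    return out

q = k = v = torch.randn(1, 8, 4096, 64)
print(chunked_attention(q, k, v).shape)  # torch.Size([1, 8, 4096, 64])
```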
comfyanonymous | 220afe3310 | Initial commit. | 2 years ago