comfyanonymous | 83f23f82b8 | Add pytorch attention support to VAE. | 2 years ago
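The VAE's attention block operates on (B, C, H, W) feature maps, treating the H*W spatial positions as attention tokens. A minimal sketch of what a pytorch attention path there can look like, using torch's fused kernel; the function name and exact reshapes are illustrative assumptions, not code taken from the commit:

```python
import torch
import torch.nn.functional as F

def vae_attention(q, k, v):
    """Attention over VAE feature maps via torch's fused kernel.

    q, k, v: (B, C, H, W) projections from the VAE's attention block.
    """
    b, c, h, w = q.shape
    # Flatten the spatial grid into a token axis: (B, 1, H*W, C)
    q, k, v = (t.reshape(b, 1, c, h * w).transpose(2, 3) for t in (q, k, v))
    out = F.scaled_dot_product_attention(q, k, v)
    # Restore the (B, C, H, W) feature-map layout
    return out.transpose(2, 3).reshape(b, c, h, w)
```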
comfyanonymous | a256a2abde | --disable-xformers should not even try to import xformers. | 2 years ago
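A minimal sketch of the import guard this commit describes: the xformers import is only attempted when the flag is absent, so opting out can never trip over a broken xformers install. The argparse wiring and the XFORMERS_IS_AVAILABLE name are illustrative assumptions:

```python
import argparse

parser = argparse.ArgumentParser()
parser.add_argument("--disable-xformers", action="store_true")
args, _ = parser.parse_known_args()

XFORMERS_IS_AVAILABLE = False
if not args.disable_xformers:
    # Only attempt the import when the user has not opted out, so
    # --disable-xformers skips the import entirely instead of importing
    # xformers and then ignoring it.
    try:
        import xformers
        import xformers.ops
        XFORMERS_IS_AVAILABLE = True
    except ImportError:
        pass
```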
comfyanonymous | 0f3ba7482f | Xformers is now properly disabled when --cpu is used. Added --windows-standalone-build option; currently it only makes the code open up comfyui in the browser. | 2 years ago
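For the browser-opening part, the standard-library webbrowser module is sufficient. A hedged sketch, assuming the helper name and the default address/port (8188 is ComfyUI's usual listen port):

```python
import webbrowser

def open_comfyui_in_browser(address: str = "127.0.0.1", port: int = 8188) -> None:
    # Called once the server is listening; opens the UI in the default browser.
    webbrowser.open(f"http://{address}:{port}")
```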
comfyanonymous | 798c90e1c0 | Fix pytorch 2.0 cross attention not working. | 2 years ago
comfyanonymous | c1f5855ac1 | Make some cross attention functions work on the CPU. | 2 years ago
comfyanonymous | 1a612e1c74 | Add some pytorch scaled_dot_product_attention code for testing; pass --use-pytorch-cross-attention to use it. | 2 years ago
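torch.nn.functional.scaled_dot_product_attention (PyTorch 2.0) dispatches to fused kernels such as FlashAttention when they are available. A minimal sketch of the kind of cross-attention wrapper such a flag would switch to; the function name and reshape details are illustrative, not the commit's exact code:

```python
import torch
import torch.nn.functional as F

def attention_pytorch(q, k, v, heads):
    """Cross attention via torch 2.0's fused scaled_dot_product_attention.

    q, k, v: (batch, tokens, heads * dim_head) tensors.
    """
    b, _, dim = q.shape
    dim_head = dim // heads
    # Split the channel dim into heads: (b, heads, tokens, dim_head)
    q, k, v = (t.reshape(b, -1, heads, dim_head).transpose(1, 2) for t in (q, k, v))
    out = F.scaled_dot_product_attention(q, k, v, attn_mask=None, dropout_p=0.0)
    # Merge heads back: (b, tokens, heads * dim_head)
    return out.transpose(1, 2).reshape(b, -1, heads * dim_head)
```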
comfyanonymous | 9502ee45c3 | Hopefully fix a strange issue with xformers + lowvram. | 2 years ago
comfyanonymous | c9daec4c89 | Remove prints that are useless when xformers is enabled. | 2 years ago
comfyanonymous | 773cdabfce | Same thing but for the other places where it's used. | 2 years ago
comfyanonymous | 50db297cf6 | Try to fix OOM issues with cards that have less vram than mine. | 2 years ago
comfyanonymous | 051f472e8f | Fix sub-quadratic attention for SD2 and make it the default optimization. | 2 years ago
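Sub-quadratic attention here refers to chunked attention in the style of Rabe & Staats, which avoids materializing the full tokens x tokens score matrix at once. A minimal sketch of the query-chunking half only; the real algorithm also chunks keys/values with a numerically stable online softmax, and chunk_size is an illustrative knob rather than the repo's actual parameter:

```python
import torch

def query_chunked_attention(q, k, v, chunk_size=1024):
    """Attention computed one query chunk at a time, so peak memory for the
    score matrix is chunk_size x tokens rather than tokens x tokens."""
    scale = q.shape[-1] ** -0.5
    out = torch.empty_like(q)
    for i in range(0, q.shape[-2], chunk_size):
        scores = q[..., i:i + chunk_size, :] @ k.transpose(-2, -1) * scale
        out[..., i:i + chunk_size, :] = torch.softmax(scores, dim=-1) @ v
    return out
```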
comfyanonymous | 220afe3310 | Initial commit. | 2 years ago