comfyanonymous | 95d796fc85 | Faster VAE loading. | 1 year ago
comfyanonymous | fa28d7334b | Remove useless code. | 1 year ago
comfyanonymous | b8636a44aa | Make scaled_dot_product switch to sliced attention on OOM. | 2 years ago  (see sketch 1 below)
comfyanonymous | 797c4e8d3b | Simplify and improve some vae attention code. | 2 years ago
comfyanonymous | bae4fb4a9d | Fix imports. | 2 years ago
comfyanonymous | 73c3e11e83 | Fix model_management import so it doesn't get executed twice. | 2 years ago
comfyanonymous | e46b1c3034 | Disable xformers in VAE when xformers == 0.0.18 | 2 years ago
comfyanonymous | 3ed4a4e4e6 | Try again with vae tiled decoding if regular fails because of OOM. | 2 years ago  (see sketch 2 below)
comfyanonymous | c692509c2b | Try to improve VAEEncode memory usage a bit. | 2 years ago
comfyanonymous | 83f23f82b8 | Add pytorch attention support to VAE. | 2 years ago  (see sketch 3 below)
comfyanonymous | a256a2abde | --disable-xformers should not even try to import xformers. | 2 years ago
comfyanonymous | 0f3ba7482f | Xformers is now properly disabled when --cpu is used. Added --windows-standalone-build option; currently it only makes the code open up comfyui in the browser. | 2 years ago
comfyanonymous | 1de86851b1 | Try to fix memory issue. | 2 years ago
comfyanonymous | cc8baf1080 | Make VAE use common function to get free memory. | 2 years ago  (see sketch 4 below)
comfyanonymous | 509c7dfc6d | Use real softmax in split op to fix issue with some images. | 2 years ago
comfyanonymous | 773cdabfce | Same real-softmax fix for the other places where the split op is used. | 2 years ago
comfyanonymous | e8c499ddd4 | Split optimization for VAE attention block. | 2 years ago
comfyanonymous | 5b4e312749 | Use inplace operations for fewer OOM issues. | 2 years ago  (see sketch 5 below)
comfyanonymous | 220afe3310 | Initial commit. | 2 years ago
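
Sketch 1. Commit b8636a44aa describes a try/except fallback: run the fused torch.nn.functional.scaled_dot_product_attention kernel first and, if it raises a CUDA out-of-memory error, redo the attention in query slices. The code below is a minimal illustration of that pattern; the helper names, slice size, and (batch, tokens, channels) layout are assumptions, not the repository's actual code, and it needs a PyTorch recent enough to expose torch.cuda.OutOfMemoryError.

```python
import torch

def sliced_attention(q, k, v, slice_size=1024):
    # Hypothetical helper: attend one block of query rows at a time so only
    # a (slice_size x tokens) score matrix is alive at once.
    scale = q.shape[-1] ** -0.5
    out = torch.empty_like(q)
    for i in range(0, q.shape[1], slice_size):
        s = slice(i, i + slice_size)
        scores = torch.softmax(q[:, s] @ k.transpose(-2, -1) * scale, dim=-1)
        out[:, s] = scores @ v
    return out

def vae_attention(q, k, v):
    # Prefer the fused kernel; fall back to slicing only when it OOMs.
    try:
        return torch.nn.functional.scaled_dot_product_attention(q, k, v)
    except torch.cuda.OutOfMemoryError:
        torch.cuda.empty_cache()
        return sliced_attention(q, k, v)
```

Slicing replaces one (tokens x tokens) score matrix with several (slice_size x tokens) ones, so peak memory drops roughly in proportion to the slice size, at the cost of a Python-level loop.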
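
Sketch 2. Commit 3ed4a4e4e6 applies the same retry idea one level up: decode the whole latent in one pass and, only if that runs out of memory, switch to tiled decoding. Below is a rough sketch of the control flow, assuming a vae object that exposes both a full decode and a tiled variant; the method names and tile size are illustrative.

```python
import torch

def decode_samples(vae, samples, tile_size=512):
    # Fast path: decode the whole latent at once.
    try:
        return vae.decode(samples)
    except torch.cuda.OutOfMemoryError:
        # Slow path: free cached blocks and decode in tiles, which bounds
        # memory by the tile size instead of the full image size.
        print("Warning: OOM during VAE decode, retrying with tiled decoding.")
        torch.cuda.empty_cache()
        return vae.decode_tiled(samples, tile_size=tile_size)
```

Tiled decoding can show seams at tile borders unless the tiles overlap and are blended, which is why the one-shot decode stays the preferred path.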
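
Sketch 3. Commit 83f23f82b8 adds PyTorch attention support to the VAE, meaning the attention block's q/k/v projections are fed to torch.nn.functional.scaled_dot_product_attention instead of a hand-rolled matmul plus softmax. The block below shows the general shape of that change; the module layout, normalization settings, and reshapes are a simplified guess rather than the file's exact implementation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class VAEAttnBlock(nn.Module):
    def __init__(self, channels):
        super().__init__()
        self.norm = nn.GroupNorm(32, channels)
        self.q = nn.Conv2d(channels, channels, 1)
        self.k = nn.Conv2d(channels, channels, 1)
        self.v = nn.Conv2d(channels, channels, 1)
        self.proj_out = nn.Conv2d(channels, channels, 1)

    def forward(self, x):
        h = self.norm(x)
        q, k, v = self.q(h), self.k(h), self.v(h)
        b, c, height, width = q.shape
        # Flatten spatial positions into tokens: (b, 1, h*w, c).
        q, k, v = (t.reshape(b, 1, c, height * width).transpose(-2, -1)
                   for t in (q, k, v))
        h = F.scaled_dot_product_attention(q, k, v)  # fused PyTorch kernel
        h = h.transpose(-2, -1).reshape(b, c, height, width)
        return x + self.proj_out(h)
```

With 4-D inputs, scaled_dot_product_attention treats the second dimension as heads, so the dummy head dimension keeps the whole spatial map in a single attention head, matching the single-head attention typically used in VAE blocks.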
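
Sketch 4. Commit cc8baf1080 routes the VAE through a shared free-memory helper instead of ad-hoc queries. The function below shows one way such a helper can be written with standard PyTorch calls; the exact accounting in the repository's model_management module may differ.

```python
import torch

def get_free_memory(device=None):
    # Free VRAM as seen by the allocator: what the driver reports as free,
    # plus memory PyTorch has reserved in its cache but is not actively using.
    free_driver, _total = torch.cuda.mem_get_info(device)
    stats = torch.cuda.memory_stats(device)
    reserved = stats["reserved_bytes.all.current"]
    active = stats["active_bytes.all.current"]
    return free_driver + (reserved - active)
```

A caller can then size attention slices or tile counts from the returned byte count instead of hard-coding thresholds.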
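
Sketch 5. Commit 5b4e312749 swaps some temporaries for in-place tensor operations. The snippet below only illustrates the technique on an attention score matrix, not the specific lines that were changed.

```python
import torch

def scaled_softmax(q, k):
    scores = q @ k.transpose(-2, -1)
    # mul_ rescales the matmul result in place, avoiding the second
    # (tokens x tokens) temporary that `scores * scale` would allocate.
    scores.mul_(q.shape[-1] ** -0.5)
    return torch.softmax(scores, dim=-1)
```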