comfyanonymous
1de86851b1
Try to fix a memory issue.
2 years ago
edikius
165be5828a
Fixed import (#44)
* fixed import error

  Hit an ImportError: cannot import name 'Protocol' from 'typing' while trying to update, so fixed it so the app starts.
* Update main.py
* deleted example files
2 years ago
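For context on the commit above: typing.Protocol only exists on Python 3.8+ (PEP 544), so the quoted ImportError appears on older interpreters. A common compatibility shim, not necessarily the actual fix in #44, looks like this (the Loader class is purely illustrative):

```python
# typing.Protocol was added in Python 3.8 (PEP 544); on older
# interpreters the import fails with exactly the error quoted above.
try:
    from typing import Protocol  # Python 3.8+
except ImportError:
    from typing_extensions import Protocol  # backport for older Pythons

class Loader(Protocol):
    """Illustrative protocol, not a class from this repository."""
    def load(self, path: str) -> bytes: ...
```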
comfyanonymous
cc8baf1080
Make VAE use common function to get free memory.
2 years ago
comfyanonymous
798c90e1c0
Fix pytorch 2.0 cross attention not working.
2 years ago
comfyanonymous
4215206281
Add a node to set CLIP skip.
Use a simpler way to detect whether the model uses v-prediction.
2 years ago
comfyanonymous
94bb0375b0
New CheckpointLoaderSimple to load checkpoints without a config.
2 years ago
comfyanonymous
c1f5855ac1
Make some cross attention functions work on the CPU.
2 years ago
comfyanonymous
1a612e1c74
Add some pytorch scaled_dot_product_attention code for testing.
Pass --use-pytorch-cross-attention to use it.
2 years ago
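The commit above wires in PyTorch's torch.nn.functional.scaled_dot_product_attention, which computes softmax(QK^T / sqrt(d)) V with fused kernels. A minimal pure-Python reference of that computation, illustrative only and not the repository's code:

```python
import math

def scaled_dot_product_attention(q, k, v):
    """Reference softmax(Q K^T / sqrt(d)) V over lists of row vectors.

    PyTorch's torch.nn.functional.scaled_dot_product_attention computes
    the same quantity on tensors, using optimized/fused kernels.
    """
    d = len(q[0])
    scale = 1.0 / math.sqrt(d)
    out = []
    for qi in q:
        # Scaled dot products of this query against every key.
        scores = [scale * sum(a * b for a, b in zip(qi, kj)) for kj in k]
        # Numerically stable softmax: subtract the max before exp.
        m = max(scores)
        exps = [math.exp(s - m) for s in scores]
        z = sum(exps)
        weights = [e / z for e in exps]
        # Attention output: weighted sum of the value rows.
        out.append([sum(w * vj[c] for w, vj in zip(weights, v))
                    for c in range(len(v[0]))])
    return out
```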
comfyanonymous
9502ee45c3
Hopefully fix a strange issue with xformers + lowvram.
2 years ago
comfyanonymous
fcb25d37db
Prepare for t2i adapter.
2 years ago
comfyanonymous
f04dc2c2f4
Implement DDIM sampler.
2 years ago
comfyanonymous
c9daec4c89
Remove prints that are useless when xformers is enabled.
2 years ago
comfyanonymous
09f1d76ed8
Fix an OOM issue.
2 years ago
comfyanonymous
4efa67fa12
Add ControlNet support.
2 years ago
comfyanonymous
1a4edd19cd
Fix overflow issue with inplace softmax.
2 years ago
comfyanonymous
509c7dfc6d
Use real softmax in split op to fix issue with some images.
2 years ago
comfyanonymous
1f6a467e92
Update ldm dir with latest upstream stable diffusion changes.
2 years ago
comfyanonymous
773cdabfce
Apply the same fix in the other places where it's used.
2 years ago
comfyanonymous
df40d4f3bf
torch.cuda.OutOfMemoryError is not present on older pytorch versions.
2 years ago
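Because torch.cuda.OutOfMemoryError is absent on older PyTorch builds (where CUDA OOMs surfaced as RuntimeError, of which the newer class is a subclass), a typical shim resolves the exception class with a getattr fallback. A self-contained sketch using a stand-in namespace in place of the real torch.cuda module:

```python
import types

# Stand-in for torch.cuda on an old PyTorch build, where the
# OutOfMemoryError attribute does not exist yet.
old_torch_cuda = types.SimpleNamespace()

# Resolve the exception class to catch: the real class when present,
# otherwise RuntimeError, which older PyTorch raised for CUDA OOMs.
OOM_EXCEPTION = getattr(old_torch_cuda, "OutOfMemoryError", RuntimeError)
```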
comfyanonymous
e8c499ddd4
Split optimization for VAE attention block.
2 years ago
comfyanonymous
5b4e312749
Use inplace operations for less OOM issues.
2 years ago
comfyanonymous
047775615b
Lower the chances of an OOM.
2 years ago
comfyanonymous
1daccf3678
Run softmax in place if it OOMs.
2 years ago
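A sketch of the retry-in-place pattern the commit above describes, with plain-Python softmax helpers as stand-ins; the real code operates on CUDA tensors and catches a CUDA out-of-memory error rather than Python's MemoryError:

```python
import math

def softmax_(row):
    """In-place softmax over a list of floats; no new list is allocated."""
    m = max(row)  # subtract the max for numerical stability
    total = 0.0
    for i, x in enumerate(row):
        row[i] = math.exp(x - m)
        total += row[i]
    for i in range(len(row)):
        row[i] /= total
    return row

def softmax(row):
    """Out-of-place softmax: allocates a fresh list for the result."""
    return softmax_(list(row))

def softmax_with_oom_fallback(row):
    # Shape of the fallback: try the allocating version first, and on
    # an allocation failure redo the work in place on the input buffer.
    try:
        return softmax(row)
    except MemoryError:
        return softmax_(row)
```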
comfyanonymous
50db297cf6
Try to fix OOM issues with cards that have less vram than mine.
2 years ago
comfyanonymous
051f472e8f
Fix sub quadratic attention for SD2 and make it the default optimization.
2 years ago
comfyanonymous
220afe3310
Initial commit.
2 years ago