67 Commits (392878a2621d131ac9e856fb2d428d9c6e2a022e)

| Author | SHA1 | Message | Date |
| --- | --- | --- | --- |
| comfyanonymous | 809bcc8ceb | Add support for unCLIP SD2.x models. | 2 years ago |
| comfyanonymous | 61ec3c9d5d | Add a way to pass options to the transformers blocks. | 2 years ago |
| comfyanonymous | 3ed4a4e4e6 | Try again with vae tiled decoding if regular fails because of OOM. | 2 years ago |
| comfyanonymous | c692509c2b | Try to improve VAEEncode memory usage a bit. | 2 years ago |
| comfyanonymous | 54dbfaf2ec | Remove omegaconf dependency and some ci changes. | 2 years ago |
| comfyanonymous | 83f23f82b8 | Add pytorch attention support to VAE. | 2 years ago |
| comfyanonymous | a256a2abde | --disable-xformers should not even try to import xformers. | 2 years ago |
| comfyanonymous | 0f3ba7482f | Xformers is now properly disabled when --cpu used. | 2 years ago |
| comfyanonymous | 1de86851b1 | Try to fix memory issue. | 2 years ago |
| comfyanonymous | cc8baf1080 | Make VAE use common function to get free memory. | 2 years ago |
| comfyanonymous | fcb25d37db | Prepare for t2i adapter. | 2 years ago |
| comfyanonymous | 09f1d76ed8 | Fix an OOM issue. | 2 years ago |
| comfyanonymous | 4efa67fa12 | Add ControlNet support. | 2 years ago |
| comfyanonymous | 509c7dfc6d | Use real softmax in split op to fix issue with some images. | 2 years ago |
| comfyanonymous | 1f6a467e92 | Update ldm dir with latest upstream stable diffusion changes. | 2 years ago |
| comfyanonymous | 773cdabfce | Same thing but for the other places where it's used. | 2 years ago |
| comfyanonymous | e8c499ddd4 | Split optimization for VAE attention block. | 2 years ago |
| comfyanonymous | 5b4e312749 | Use inplace operations for less OOM issues. | 2 years ago |
| comfyanonymous | 220afe3310 | Initial commit. | 2 years ago |
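Commit 3ed4a4e6 describes a fallback pattern: attempt a full VAE decode first, and only if it runs out of memory, retry with tiled decoding, which trades speed for a smaller peak memory footprint. A minimal, framework-agnostic sketch of that pattern (the function names here are hypothetical, not ComfyUI's actual API):

```python
def decode_with_fallback(decode_full, decode_tiled, latent):
    """Try a full decode; on out-of-memory, retry with the tiled decoder.

    `decode_full` and `decode_tiled` are hypothetical callables standing in
    for the regular and tiled VAE decode paths.
    """
    try:
        return decode_full(latent)
    except MemoryError:
        # In a GPU framework this would catch the framework's OOM exception
        # instead of Python's MemoryError; the control flow is the same.
        return decode_tiled(latent)
```

The tiled path decodes the latent in overlapping chunks, so it succeeds where a single full-resolution pass exhausts memory, at the cost of extra passes and possible seam blending.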