769 Commits (ea9ac9d30beb23119e590610d3ec5dcd146a12f2)

| Author | SHA1 | Message | Date |
| --- | --- | --- | --- |
| comfyanonymous | 94cc718e9c | Add a way to add patches to the input block. | 1 year ago |
| comfyanonymous | 7339479b10 | Disable xformers when it can't load properly. | 1 year ago |
| comfyanonymous | 4781819a85 | Make memory estimation aware of model dtype. | 1 year ago |
| comfyanonymous | dd4ba68b6e | Allow different models to estimate memory usage differently. | 1 year ago |
| comfyanonymous | 2c9dba8dc0 | sampling_function now has the model object as the argument. | 1 year ago |
| comfyanonymous | 8d80584f6a | Remove useless argument from uni_pc sampler. | 1 year ago |
| comfyanonymous | 248aa3e563 | Fix bug. | 1 year ago |
| comfyanonymous | 4a8a839b40 | Add option to use in place weight updating in ModelPatcher. | 1 year ago |
| comfyanonymous | 412d3ff57d | Refactor. | 1 year ago |
| comfyanonymous | 58d5d71a93 | Working RescaleCFG node. | 1 year ago |
| comfyanonymous | 3e0033ef30 | Fix model merge bug. | 1 year ago |
| comfyanonymous | 002aefa382 | Support lcm models. | 1 year ago |
| comfyanonymous | ec12000136 | Add support for full diff lora keys. | 1 year ago |
| comfyanonymous | 064d7583eb | Add a CONDConstant for passing non tensor conds to unet. | 1 year ago |
| comfyanonymous | 794dd2064d | Fix typo. | 1 year ago |
| comfyanonymous | 0a6fd49a3e | Print leftover keys when using the UNETLoader. | 1 year ago |
| comfyanonymous | fe40109b57 | Fix issue with object patches not being copied with patcher. | 1 year ago |
| comfyanonymous | a527d0c795 | Code refactor. | 1 year ago |
| comfyanonymous | 2a23ba0b8c | Fix unet ops not entirely on GPU. | 1 year ago |
| comfyanonymous | 844dbf97a7 | Add: advanced->model->ModelSamplingDiscrete node. | 1 year ago |
| comfyanonymous | 656c0b5d90 | CLIP code refactor and improvements. | 1 year ago |
| comfyanonymous | b3fcd64c6c | Make SDTokenizer class work with more types of tokenizers. | 1 year ago |
| gameltb | 7e455adc07 | fix unet_wrapper_function name in ModelPatcher | 1 year ago |
| comfyanonymous | 1ffa8858e7 | Move model sampling code to comfy/model_sampling.py | 1 year ago |
| comfyanonymous | ae2acfc21b | Don't convert Nan to zero. | 1 year ago |
| comfyanonymous | d2e27b48f1 | sampler_cfg_function now gets the noisy output as argument again. | 1 year ago |
| comfyanonymous | 2455aaed8a | Allow model or clip to be None in load_lora_for_models. | 1 year ago |
| comfyanonymous | ecb80abb58 | Allow ModelSamplingDiscrete to be instantiated without a model config. | 1 year ago |
| comfyanonymous | e73ec8c4da | Not used anymore. | 1 year ago |
| comfyanonymous | 111f1b5255 | Fix some issues with sampling precision. | 1 year ago |
| comfyanonymous | 7c0f255de1 | Clean up percent start/end and make controlnets work with sigmas. | 1 year ago |
| comfyanonymous | a268a574fa | Remove a bunch of useless code. | 1 year ago |
| comfyanonymous | 1777b54d02 | Sampling code changes. | 1 year ago |
| comfyanonymous | c837a173fa | Fix some memory issues in sub quad attention. | 1 year ago |
| comfyanonymous | 125b03eead | Fix some OOM issues with split attention. | 1 year ago |
| comfyanonymous | a12cc05323 | Add --max-upload-size argument, the default is 100MB. | 1 year ago |
| comfyanonymous | 2a134bfab9 | Fix checkpoint loader with config. | 1 year ago |
| comfyanonymous | e60ca6929a | SD1 and SD2 clip and tokenizer code is now more similar to the SDXL one. | 1 year ago |
| comfyanonymous | 6ec3f12c6e | Support SSD1B model and make it easier to support asymmetric unets. | 1 year ago |
| comfyanonymous | 434ce25ec0 | Restrict loading embeddings from embedding folders. | 1 year ago |
| comfyanonymous | 723847f6b3 | Faster clip image processing. | 1 year ago |
| comfyanonymous | a373367b0c | Fix some OOM issues with split and sub quad attention. | 1 year ago |
| comfyanonymous | 7fbb217d3a | Fix uni_pc returning noisy image when steps <= 3 | 1 year ago |
| Jedrzej Kosinski | 3783cb8bfd | change 'c_adm' to 'y' in ControlNet.get_control | 1 year ago |
| comfyanonymous | d1d2fea806 | Pass extra conds directly to unet. | 1 year ago |
| comfyanonymous | 036f88c621 | Refactor to make it easier to add custom conds to models. | 1 year ago |
| comfyanonymous | 3fce8881ca | Sampling code refactor to make it easier to add more conds. | 1 year ago |
| comfyanonymous | 8594c8be4d | Empty the cache when torch cache is more than 25% free mem. | 1 year ago |
| comfyanonymous | 8b65f5de54 | attention_basic now works with hypertile. | 1 year ago |
| comfyanonymous | e6bc42df46 | Make sub_quad and split work with hypertile. | 1 year ago |