comfyanonymous
65397ce601
Replace prints with logging and add --verbose argument.
8 months ago
comfyanonymous
03c47fc0f2
Add a min_length property to tokenizer class.
9 months ago
comfyanonymous
8ac69f62e5
Make return_projected_pooled settable from the __init__.
9 months ago
comfyanonymous
c2cb8e889b
Always return unprojected pooled output for gligen.
9 months ago
comfyanonymous
1cb3f6a83b
Move text projection into the CLIP model code.
Fix issue with not loading the SSD1B clip correctly.
9 months ago
comfyanonymous
97d03ae04a
StableCascade CLIP model support.
9 months ago
comfyanonymous
4871a36458
Cleanup some unused imports.
10 months ago
comfyanonymous
57926635e8
Switch text encoder to manual cast.
Use fp16 text encoder weights for CPU inference to lower memory usage.
11 months ago
comfyanonymous
9ac0b487ac
Make --gpu-only put intermediate values in GPU memory instead of CPU memory.
12 months ago
comfyanonymous
fbdb14d4c4
Cleaner CLIP text encoder implementation.
Use a simple CLIP model implementation instead of the one from
transformers.
This will allow some interesting things that would be too hackish to
implement using the transformers implementation.
12 months ago
comfyanonymous
be3468ddd5
Less useless downcasting.
12 months ago
comfyanonymous
728613bb3e
Fix last pr.
1 year ago
Jianqi Pan
f2e49b1d57
fix: adaptation to older versions of pytorch
1 year ago
comfyanonymous
656c0b5d90
CLIP code refactor and improvements.
More generic clip model class that can be used on more types of text
encoders.
Don't apply weighting algorithm when weight is 1.0
Don't compute an empty token output when it's not needed.
1 year ago
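The commit body above mentions skipping the weighting pass entirely when every prompt weight is 1.0 and never computing the empty-prompt output unless it is needed. A minimal sketch of that short-circuit (the function name and the interpolation-toward-empty weighting scheme here are assumptions for illustration, not ComfyUI's actual code):

```python
# Illustrative sketch: skip the weighting algorithm when all weights
# are exactly 1.0, so the empty-prompt reference output is never needed.

def apply_weights(token_embeddings, weights, empty_embedding):
    if all(w == 1.0 for w in weights):
        # Nothing to do: weighting would return the embeddings unchanged.
        return token_embeddings
    # Otherwise interpolate each embedding toward the empty-prompt
    # output, scaled by its weight (one common weighting scheme).
    return [
        [e0 + (e - e0) * w for e, e0 in zip(emb, empty_embedding)]
        for emb, w in zip(token_embeddings, weights)
    ]
```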
comfyanonymous
b3fcd64c6c
Make SDTokenizer class work with more types of tokenizers.
1 year ago
comfyanonymous
2a134bfab9
Fix checkpoint loader with config.
1 year ago
comfyanonymous
e60ca6929a
SD1 and SD2 clip and tokenizer code is now more similar to the SDXL one.
1 year ago
comfyanonymous
434ce25ec0
Restrict loading embeddings from embedding folders.
1 year ago
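The restriction in the commit above can be sketched as a path check: resolve the requested embedding name against the configured folder and refuse anything that escapes it (the function name is an assumption, not ComfyUI's actual API):

```python
import os

# Illustrative sketch: only load embeddings whose resolved path stays
# inside the configured embeddings directory.

def resolve_embedding_path(base_dir, name):
    base = os.path.abspath(base_dir)
    path = os.path.abspath(os.path.join(base, name))
    if not path.startswith(base + os.sep):
        return None  # e.g. "../../secret.pt" resolves outside the folder
    return path
```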
comfyanonymous
44361f6344
Support for text encoder models that need attention_mask.
1 year ago
comfyanonymous
fb3b728203
Fix issue where autocast fp32 CLIP gave different results from regular.
1 year ago
comfyanonymous
ec96f6d03a
Move text_projection to base clip model.
1 year ago
comfyanonymous
e3d0a9a490
Fix potential issue with text projection matrix multiplication.
1 year ago
comfyanonymous
00c0b2c507
Initialize text encoder to target dtype.
1 year ago
comfyanonymous
f081017c1a
Save memory by storing text encoder weights in fp16 in most situations.
Do inference in fp32 to make sure quality stays the exact same.
1 year ago
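The storage/compute split described above can be illustrated without torch: values are rounded to IEEE half precision for storage (halving memory), while arithmetic still runs in full precision. Python's `struct` module round-trips half floats with the `'e'` format:

```python
import struct

# Illustrative sketch of fp16 storage with full-precision compute.

def store_fp16(x):
    # Round to the nearest representable fp16 value, as saving a
    # weight tensor in half precision would.
    return struct.unpack('<e', struct.pack('<e', x))[0]

stored = store_fp16(0.1)   # fp16 cannot represent 0.1 exactly
result = stored + stored   # the arithmetic itself is full-precision
```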
comfyanonymous
c99d8002f8
Make sure the pooled output stays at the EOS token with added embeddings.
1 year ago
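A sketch of the idea in the commit above (illustrative, not ComfyUI's code): CLIP reads its pooled embedding from the hidden state at the end-of-text position, so when added embeddings lengthen the prompt, that position must be looked up rather than assumed fixed:

```python
CLIP_EOS = 49407  # end-of-text token id in the CLIP vocabulary

def pooled_output_index(token_ids):
    # The first EOS occurrence marks where the pooled output is read,
    # wherever the inserted embedding tokens pushed it.
    return token_ids.index(CLIP_EOS)
```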
comfyanonymous
50b1180dde
Fix CLIPSetLastLayer not reverting when removed.
1 year ago
comfyanonymous
46dc050c9f
Fix potential tensors being on different devices issues.
1 year ago
comfyanonymous
606a537090
Support SDXL embedding format with 2 CLIP.
1 year ago
comfyanonymous
608fcc2591
Fix bug with weights when prompt is long.
1 year ago
comfyanonymous
ce35d8c659
Lower latency by batching some text encoder inputs.
1 year ago
comfyanonymous
b6a60fa696
Try to keep text encoders loaded and patched to increase speed.
load_model_gpu() is now used with the text encoder models instead of just
the unet.
1 year ago
comfyanonymous
97ee230682
Make highvram and normalvram shift the text encoders to vram and back.
This is faster on big text encoder models than running it on the CPU.
1 year ago
comfyanonymous
9920367d3c
Fix embeddings not working with --gpu-only
1 year ago
comfyanonymous
20f579d91d
Add DualClipLoader to load clip models for SDXL.
Update LoadClip to load clip models for SDXL refiner.
1 year ago
comfyanonymous
f87ec10a97
Support base SDXL and SDXL refiner models.
Large refactor of the model detection and loading code.
1 year ago
comfyanonymous
f7edcfd927
Add a --gpu-only argument to keep and run everything on the GPU.
Make the CLIP model work on the GPU.
1 year ago
comfyanonymous
bb1f45d6e8
Properly disable weight initialization in clip models.
1 year ago
comfyanonymous
0c7cad404c
Don't initialize clip weights to default values.
1 year ago
comfyanonymous
23cf8ca7c5
Fix bug when embedding gets ignored because of mismatched size.
1 year ago
comfyanonymous
af9cc1fb6a
Search recursively in subfolders for embeddings.
2 years ago
comfyanonymous
81d1f00df3
Some refactoring: from_tokens -> encode_from_tokens
2 years ago
BlenderNeko
d0b1b6c6bf
fixed improper padding
2 years ago
comfyanonymous
04d9bc13af
Safely load pickled embeds that don't load with weights_only=True.
2 years ago
BlenderNeko
da115bd78d
ensure backwards compat with optional args
2 years ago
BlenderNeko
752f7a162b
align behavior with old tokenize function
2 years ago
comfyanonymous
334aab05e5
Don't stop workflow if loading embedding fails.
2 years ago
BlenderNeko
8489cba140
add unique ID per word/embedding for tokenizer
2 years ago
comfyanonymous
1718730e80
Ignore embeddings when sizes don't match and print a WARNING.
2 years ago
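The size guard described in the commit above can be sketched as follows (names are assumptions for illustration): an embedding whose vector width differs from the text encoder's hidden size is skipped with a warning instead of crashing the prompt:

```python
# Illustrative sketch: reject embeddings whose dimension does not match
# the text encoder, warning instead of failing the workflow.

def validate_embedding(vectors, expected_dim):
    if any(len(v) != expected_dim for v in vectors):
        print(f"WARNING: embedding shape does not match expected "
              f"dimension {expected_dim}, embedding will be ignored")
        return None
    return vectors
```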
comfyanonymous
50099bcd96
Support multiple paths for embeddings.
2 years ago
comfyanonymous
00a9189e30
Support old pytorch.
2 years ago