19 Commits (6253ec4aef664327880aab1f95d200a7488173ae)

Author          SHA1        Message                                                                Date
comfyanonymous  bb1f45d6e8  Properly disable weight initialization in clip models.                 1 year ago
comfyanonymous  0c7cad404c  Don't initialize clip weights to default values.                       1 year ago
comfyanonymous  23cf8ca7c5  Fix bug when embedding gets ignored because of mismatched size.        2 years ago
comfyanonymous  af9cc1fb6a  Search recursively in subfolders for embeddings.                       2 years ago
comfyanonymous  81d1f00df3  Some refactoring: from_tokens -> encode_from_tokens                    2 years ago
BlenderNeko     d0b1b6c6bf  fixed improper padding                                                  2 years ago
comfyanonymous  04d9bc13af  Safely load pickled embeds that don't load with weights_only=True.     2 years ago
BlenderNeko     da115bd78d  ensure backwards compat with optional args                              2 years ago
BlenderNeko     752f7a162b  align behavior with old tokenize function                               2 years ago
comfyanonymous  334aab05e5  Don't stop workflow if loading embedding fails.                         2 years ago
BlenderNeko     8489cba140  add unique ID per word/embedding for tokenizer                          2 years ago
comfyanonymous  1718730e80  Ignore embeddings when sizes don't match and print a WARNING.           2 years ago
comfyanonymous  50099bcd96  Support multiple paths for embeddings.                                  2 years ago
comfyanonymous  00a9189e30  Support old pytorch.                                                    2 years ago
comfyanonymous  137ae2606c  Support people putting commas after the embedding name in the prompt.  2 years ago
comfyanonymous  324273fff2  Fix embedding not working when on new line.                             2 years ago
comfyanonymous  f73e57d881  Add support for textual inversion embedding for SD1.x CLIP.            2 years ago
comfyanonymous  220afe3310  Initial commit.                                                         2 years ago