pharmapsychotic
f4429b4c9d
Update install instructions
Recommending the stable 0.5.4 version for now.
0.6.0 supports BLIP2 but requires a newer transformers version and could still use some more love.
2 years ago
pharmapsychotic
3385e538ee
Update README to 0.6.0
2 years ago
pharmapsychotic
ac74904908
Expose LabelTable and load_list, and give an example in the README of how they can be used to rank your own list of terms.
2 years ago
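The LabelTable and load_list exposed in the commit above can be used to rank a custom term list against an image. A minimal sketch, assuming the constructor signature from the library's later releases; 'terms.txt' and 'image.jpg' are placeholders:

```python
from PIL import Image
from clip_interrogator import Config, Interrogator, LabelTable, load_list

# load the models once, then rank a custom list of terms against an image
ci = Interrogator(Config())
image = Image.open('image.jpg').convert('RGB')  # placeholder image path

# 'terms.txt' is a placeholder file with one term per line
table = LabelTable(load_list('terms.txt'), 'my_terms', ci)
best_matches = table.rank(ci.image_to_features(image), top_count=5)
print(best_matches)
```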
pharmapsychotic
571ba9844c
Link to CLIP Interrogator on Lambda
2 years ago
pharmapsychotic
80d97f1f96
Link to new extension
2 years ago
pharmapsychotic
08546eae22
Properly sync up all the version numbers
2 years ago
pharmapsychotic
384e234ba2
Minor fix to BLIP offloading
2 years ago
pharmapsychotic
c4e16359a7
More safetensor, download, and VRAM improvements
2 years ago
pharmapsychotic
6a62ce73e8
0.5.1
2 years ago
pharmapsychotic
ae88b07a65
safetensors!
- store cached embeddings in safetensors format
- update the huggingface ci-preprocess repo
- bump version to 0.5.0
2 years ago
pharmapsychotic
bcf1833ae0
0.4.4 use blip-ci instead of blip-vit
2 years ago
pharmapsychotic
93db86fa70
.
2 years ago
pharmapsychotic
78287e17e1
0.4.2:
- upgrade chain to take a min_count parameter so it won't exit early until it has considered at least min_count flavors
- the interrogate method ("best" mode) also checks against the classic and fast modes and uses their output if it's better
- fix a bug where the config.download_cache option was not being used!
- add notes on the Config object to the README
2 years ago
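A minimal sketch of configuring the Config object these notes describe; the fields shown (clip_model_name, cache_path, download_cache) are assumed from the library around this release, and the values are placeholders:

```python
from PIL import Image
from clip_interrogator import Config, Interrogator

# field names assumed from the library around this release; values are placeholders
config = Config(
    clip_model_name="ViT-L-14/openai",  # which CLIP model to load
    cache_path="cache",                 # where cached text embeddings are stored
    download_cache=True,                # the option whose "not used" bug 0.4.2 fixes
)
ci = Interrogator(config)

image = Image.open('image.jpg').convert('RGB')
print(ci.interrogate(image))  # "best" mode, which also compares classic/fast output
```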
pharmapsychotic
290a63b51e
Update README.md
2 years ago
pharmapsychotic
42b3cf4d9e
Bunch of updates! (#40)
- auto-download the cache files from huggingface
- experimental negative prompt mode
- slight quality and performance improvements to best mode
- analyze tab in the Colab and run_gradio to get a table of ranked terms
2 years ago
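A sketch of how the experimental negative prompt mode from #40 might be invoked; the interrogate_negative method name is taken from later releases of the library and is an assumption here:

```python
from PIL import Image
from clip_interrogator import Config, Interrogator

ci = Interrogator(Config())
image = Image.open('image.jpg').convert('RGB')  # placeholder image path

prompt = ci.interrogate(image)             # regular "best" mode prompt
negative = ci.interrogate_negative(image)  # experimental negative prompt mode (name assumed)
print(prompt)
print(negative)
```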
pharmapsychotic
65c560ffac
Reuse existing blip-vit package on PyPI
2 years ago
pharmapsychotic
55fe80c74c
Simplify install further
Big ups to @justindujardin for the proper syntax to get a git dependency into requirements in a way that pip will accept! :D
2 years ago
pharmapsychotic
a8ecf52a38
.
2 years ago
pharmapsychotic
02576df0ce
.
2 years ago
pharmapsychotic
1ec6cd9d45
Update to nicer BLIP packaging
2 years ago
pharmapsychotic
6f17fb09af
Fix for running on CPU
2 years ago
pharmapsychotic
152d5f551f
0.3.1 fix for running on CPU, update README usage instructions
2 years ago
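A minimal sketch of the README usage these instructions cover, including forcing CPU inference; the device field on Config is assumed from later releases:

```python
from PIL import Image
from clip_interrogator import Config, Interrogator

# device="cpu" forces CPU inference (field assumed from later releases of Config)
ci = Interrogator(Config(clip_model_name="ViT-L-14/openai", device="cpu"))
image = Image.open('image.jpg').convert('RGB')  # placeholder image path
print(ci.interrogate(image))
```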
pharmapsychotic
faa56c8ef9
Update Replicate link
2 years ago
pharmapsychotic
e3c1a4df84
Update more stuff for open_clip switch
2 years ago
pharmapsychotic
55b1770386
Update Replicate cog to use clip_interrogator library
2 years ago
pharmapsychotic
8f5ddce2b3
.
2 years ago
pharmapsychotic
f4abcdfd0c
Update readme
2 years ago
pharmapsychotic
7a2ac9aa57
Add to pip!
2 years ago
pharmapsychotic
1b5f9437bd
Make into a reusable library
2 years ago
Chenxi
11a0087004
Replicate demo
2 years ago
pharmapsychotic
6a0f6d457d
Version 2! Flavors upgraded from 400 to 100,000! Tests for ordering of common pieces. Chaining of flavors to progressively improve the prompt. ipywidgets UI for easy uploading and interrogation. Other stuff I forget by now.
2 years ago
amrrs
b09fcc8b35
added Open in Spaces badge
2 years ago
pharmapsychotic
1080560c48
Ported my CLIP Interrogator over to Colab to share
The CLIP Interrogator uses the OpenAI CLIP models to test a given image against a variety of artists, mediums, and styles to study how the different models see the content of the image. It also combines the results with a BLIP caption to suggest a text prompt for creating more images similar to the one given.
2 years ago
pharmapsychotic
65888181d7
Initial commit
2 years ago