From b09fcc8b35f8de76ac37dd5385b2a9cd3ec664a0 Mon Sep 17 00:00:00 2001
From: amrrs <1littlecoder@gmail.com>
Date: Fri, 21 Oct 2022 22:24:55 +0530
Subject: [PATCH] added open in spaces badge

---
 README.md | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/README.md b/README.md
index 0212c9d..11a2b7c 100644
--- a/README.md
+++ b/README.md
@@ -1,6 +1,6 @@
 # clip-interrogator
 
-[![Open In Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/pharmapsychotic/clip-interrogator/blob/main/clip_interrogator.ipynb)
+[![Open In Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/pharmapsychotic/clip-interrogator/blob/main/clip_interrogator.ipynb) [![Generic badge](https://img.shields.io/badge/🤗-Open%20in%20Spaces-blue.svg)](https://huggingface.co/spaces/pharma/CLIP-Interrogator)
 
 The CLIP Interrogator uses the OpenAI CLIP models to test a given image against a variety of artists, mediums, and styles to study how the different models see the content of the image. It also combines the results with BLIP caption to suggest a text prompt to create more images similar to what was given.