{ "nbformat": 4, "nbformat_minor": 0, "metadata": { "colab": { "provenance": [] }, "kernelspec": { "name": "python3", "display_name": "Python 3" }, "language_info": { "name": "python" }, "accelerator": "GPU", "gpuClass": "standard" }, "cells": [ { "cell_type": "markdown", "source": [ "Git clone the repo and install the requirements. (ignore the pip errors about protobuf)" ], "metadata": { "id": "aaaaaaaaaa" } }, { "cell_type": "code", "execution_count": null, "metadata": { "id": "bbbbbbbbbb" }, "outputs": [], "source": [ "!git clone https://github.com/comfyanonymous/ComfyUI\n", "%cd ComfyUI\n", "!pip install -r requirements.txt\n", "!pip install xformers" ] }, { "cell_type": "markdown", "source": [ "Download some models/checkpoints/vae:" ], "metadata": { "id": "cccccccccc" } }, { "cell_type": "code", "source": [ "!wget https://huggingface.co/runwayml/stable-diffusion-v1-5/resolve/main/v1-5-pruned-emaonly.ckpt -P ./models/checkpoints/\n", "!wget https://huggingface.co/stabilityai/sd-vae-ft-mse-original/resolve/main/vae-ft-mse-840000-ema-pruned.safetensors -P ./models/vae/\n", "!wget https://huggingface.co/WarriorMama777/OrangeMixs/resolve/main/Models/AbyssOrangeMix2/AbyssOrangeMix2_hard.safetensors -P ./models/checkpoints/\n" ], "metadata": { "id": "dddddddddd" }, "execution_count": null, "outputs": [] }, { "cell_type": "markdown", "source": [ "Optional: Launch a http server to see the output pics (you can also download them by browsing to the output directory with colab):" ], "metadata": { "id": "eeeeeeeeee" } }, { "cell_type": "code", "source": [ "from google.colab import output\n", "output.serve_kernel_port_as_window(8000)\n", "get_ipython().system_raw('cd output && python3 -m http.server 8000 &') " ], "metadata": { "id": "ffffffffff" }, "execution_count": null, "outputs": [] }, { "cell_type": "markdown", "source": [ "### Run ComfyUI \n", "use the **fp16** model configs for more speed\n", "\n", "You should see the ui appear in an iframe. 
If you get a 403 error, your Firefox settings or a browser extension are likely interfering.\n", "\n", "If you want to open it in another window, use the second link, not the first.\n" ], "metadata": { "id": "gggggggggg" } }, { "cell_type": "code", "source": [ "import threading\n", "import time\n", "import socket\n", "\n", "def iframe_thread(port):\n", "  # Poll until the ComfyUI server is accepting connections on the port\n", "  while True:\n", "    time.sleep(0.5)\n", "    sock = socket.socket(socket.AF_INET, socket.SOCK_STREAM)\n", "    result = sock.connect_ex(('127.0.0.1', port))\n", "    if result == 0:\n", "      break\n", "    sock.close()\n", "  # Embed the UI in an iframe, and also print a separate-window link\n", "  from google.colab import output\n", "  output.serve_kernel_port_as_iframe(port, height=1024)\n", "  print(\"To open it in a window, use this link:\")\n", "  output.serve_kernel_port_as_window(port)\n", "\n", "threading.Thread(target=iframe_thread, daemon=True, args=(8188,)).start()\n", "\n", "!python main.py --highvram" ], "metadata": { "id": "hhhhhhhhhh" }, "execution_count": null, "outputs": [] }, { "cell_type": "markdown", "source": [ "### Run ComfyUI with localtunnel\n", "\n", "If the previous method doesn't work, try this one. 
It also works outside Colab.\n", "\n", "Use the **fp16** model configs for more speed.\n", "\n" ], "metadata": { "id": "kkkkkkkkkkkkkk" } }, { "cell_type": "code", "source": [ "!npm install -g localtunnel\n", "\n", "import subprocess\n", "import threading\n", "import time\n", "import socket\n", "\n", "def iframe_thread(port):\n", "  # Poll until the ComfyUI server is accepting connections on the port\n", "  while True:\n", "    time.sleep(0.5)\n", "    sock = socket.socket(socket.AF_INET, socket.SOCK_STREAM)\n", "    result = sock.connect_ex(('127.0.0.1', port))\n", "    if result == 0:\n", "      break\n", "    sock.close()\n", "  # Start localtunnel and print the public URL it reports\n", "  p = subprocess.Popen([\"lt\", \"--port\", str(port)], stdout=subprocess.PIPE)\n", "  for line in p.stdout:\n", "    print(line.decode(), end='')\n", "\n", "threading.Thread(target=iframe_thread, daemon=True, args=(8188,)).start()\n", "\n", "!python main.py --highvram" ], "metadata": { "id": "jjjjjjjjjjjjj" }, "execution_count": null, "outputs": [] } ] }