From 02af0da87e6486be6382a4a27634d87e9e1e8634 Mon Sep 17 00:00:00 2001
From: Daniel Miessler
Date: Sun, 4 Feb 2024 16:06:04 -0800
Subject: [PATCH] Added TOC

---
 README.md       | 40 ++++++++++++++++++++++++++++++++++++++--
 client/utils.py |  4 ++--
 2 files changed, 40 insertions(+), 4 deletions(-)

diff --git a/README.md b/README.md
index 541757f..5dc8d97 100644
--- a/README.md
+++ b/README.md
@@ -23,6 +23,24 @@
 
+- [fabric](#fabric)
+  - [What and why](#what-and-why)
+  - [Philosophy](#philosophy)
+    - [Breaking problems into components](#breaking-problems-into-components)
+    - [Too many prompts](#too-many-prompts)
+    - [Our approach to prompting](#our-approach-to-prompting)
+  - [Quickstart](#quickstart)
+    - [1. Just use the Patterns](#1-just-use-the-patterns)
+    - [2. Create your own Fabric Mill (Server)](#2-create-your-own-fabric-mill-server)
+    - [3. The standalone client](#3-the-standalone-client)
+  - [Structure](#structure)
+    - [Components](#components)
+    - [CLI-native](#cli-native)
+    - [Directly calling Patterns](#directly-calling-patterns)
+  - [Examples](#examples)
+  - [Meta](#meta)
+    - [Primary contributors](#primary-contributors)
+
 ## What and why
 
 Since the start of 2023 and GenAI we've seen a massive number of AI applications for accomplishing tasks. It's powerful, but **it's not easy to integrate this functionality into our lives.**
@@ -91,46 +109,57 @@ The most feature-rich way to use Fabric is to use the `fabric` client, which can
 
 Follow these steps to get the client installed and configured.
 
 1. Navigate to where you want the Fabric project to live on your system, a semi-permanent place on your computer.
+
 ```bash
 # Find a home for Fabric
 cd /where/you/keep/code
 ```
+
 2. Clone the project to your computer.
+
 ```bash
 # Clone Fabric to your computer
 git clone git@github.com:danielmiessler/fabric.git
 ```
+
 3. Enter Fabric's /client directory
+
 ```bash
 # Enter the project and its /client folder
 cd fabric/client
 ```
+
 4. Install the dependencies
+
 ```bash
 # Install the prerequisites
 pip3 install -r requirements.txt
 ```
+
 5. Add the path to the `fabric` client to your shell
+
 ```bash
 # Tell your shell how to find the `fabric` client
 echo 'alias fabric="/the/path/to/fabric/client/fabric"' >> ~/.bashrc
 
 # Example for ~/.zshrc or ~/.bashrc
 alias fabric="~/Development/fabric/client/fabric"
 ```
+
 6. Restart your shell
+
 ```bash
 # Reload your shell configuration so the new alias takes effect
 source ~/.bashrc
 
 # Example for zsh
 source ~/.zshrc
 ```
+
 ### Using the `fabric` client
 
 Once you have it all set up, here's how to use it.
 
 1. Check out the options
-`fabric -h`
-
+   `fabric -h`
 
 ```bash
 fabric [-h] [--text TEXT] [--copy] [--output [OUTPUT]] [--stream] [--list]
@@ -153,10 +182,13 @@ options:
                         The pattern (prompt) to use
   --setup               Set up your fabric instance
 ```
+
 2. Set up the client
+
 ```bash
 fabric --setup
 ```
+
 You'll be asked to enter your OpenAI API key, which will be written to `~/.config/fabric/.env`. Patterns will then be downloaded from GitHub, which will take a few moments.
 
 ### Example commands
 
@@ -164,13 +196,17 @@ You'll be asked to enter your OpenAI API key, which will be written to `~/.confi
 The client, by default, runs Fabric patterns without needing a server (the Patterns were downloaded during setup). This means the client connects directly to OpenAI using the input given and the Fabric pattern used.
 
 1. Run the `summarize` Pattern based on input from `stdin`. In this case, the body of an article.
+
 ```bash
 pbpaste | fabric --pattern summarize
 ```
+
 2. Run the `analyze_claims` Pattern with the `--stream` option to get immediate and streaming results.
+
 ```bash
 pbpaste | fabric --stream --pattern analyze_claims
 ```
+
 > [!NOTE]
 > More examples coming in the next few days, including a demo video!
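The example commands above each pair a locally downloaded Pattern with whatever arrives on `stdin`. As a rough sketch of that flow in Python (the `load_pattern` and `read_stdin_input` helper names and the patterns directory path are illustrative assumptions, not the client's actual API):

```python
import sys
from pathlib import Path

def load_pattern(name, patterns_dir="~/.config/fabric/patterns"):
    """Read a Pattern's system prompt from the local Patterns download.

    The directory layout (one folder per Pattern, containing system.md)
    and the default path are assumptions for illustration.
    """
    system_file = Path(patterns_dir).expanduser() / name / "system.md"
    return system_file.read_text()

def read_stdin_input():
    """Collect piped input, e.g. the article body from `pbpaste | fabric ...`."""
    return sys.stdin.read().strip()
```

With helpers like these, `pbpaste | fabric --pattern summarize` reduces to pairing `load_pattern("summarize")` as the system prompt with `read_stdin_input()` as the user message.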
diff --git a/client/utils.py b/client/utils.py
index f008a18..c2d3db5 100644
--- a/client/utils.py
+++ b/client/utils.py
@@ -51,7 +51,7 @@ class Standalone:
         messages = [user_message]
         try:
             stream = self.client.chat.completions.create(
-                model="gpt-4-turbo-preview",
+                model="gpt-4",
                 messages=messages,
                 temperature=0.0,
                 top_p=1,
@@ -98,7 +98,7 @@ class Standalone:
         messages = [user_message]
         try:
             response = self.client.chat.completions.create(
-                model="gpt-4-turbo-preview",
+                model="gpt-4",
                 messages=messages,
                 temperature=0.0,
                 top_p=1,
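The `client/utils.py` hunks above change the same hard-coded model name in two call sites. A minimal sketch of how those shared request parameters could live in one place, so the next model bump is a one-line edit (the `DEFAULT_MODEL` constant and `build_request` helper are hypothetical, not part of this patch):

```python
# Hypothetical refactor: gather the chat-completion arguments that both
# the streaming and non-streaming paths in Standalone pass identically.
DEFAULT_MODEL = "gpt-4"

def build_request(user_input, pattern_system_prompt="", stream=False):
    """Assemble keyword arguments for client.chat.completions.create()."""
    messages = [{"role": "user", "content": user_input}]
    if pattern_system_prompt:
        # The Pattern's system prompt, when present, leads the conversation.
        messages.insert(0, {"role": "system", "content": pattern_system_prompt})
    return {
        "model": DEFAULT_MODEL,
        "messages": messages,
        "temperature": 0.0,
        "top_p": 1,
        "stream": stream,
    }
```

Both call sites would then read `self.client.chat.completions.create(**build_request(...))`, keeping the model name and sampling settings in sync across the streaming and non-streaming paths.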