
Merge pull request #214 from jimsnns/community-contributions-branch

Add week 1 exercise notebook for OpenAI API and Ollama integration, the AI Technician.
pull/239/head
Ed Donner authored 2 months ago, committed by GitHub
commit 0b4b53f4cb
  1. week1/community-contributions/week1 EXERCISE_AI_techician.ipynb (+202)
  2. week2/community-contributions/day4_booking_flight_tool.ipynb (+448)
  3. week2/community-contributions/day5_book_flight_sightseeing_tools.ipynb (+1108)

week1/community-contributions/week1 EXERCISE_AI_techician.ipynb (+202 lines)

@@ -0,0 +1,202 @@
{
"cells": [
{
"cell_type": "markdown",
"id": "fe12c203-e6a6-452c-a655-afb8a03a4ff5",
"metadata": {},
"source": [
"# End of week 1 exercise\n",
"\n",
"To demonstrate your familiarity with OpenAI API, and also Ollama, build a tool that takes a technical question, \n",
"and responds with an explanation. This is a tool that you will be able to use yourself during the course!"
]
},
{
"cell_type": "code",
"execution_count": 9,
"id": "c1070317-3ed9-4659-abe3-828943230e03",
"metadata": {},
"outputs": [],
"source": [
"# imports\n",
"from IPython.display import Markdown, display, update_display\n",
"import openai\n",
"from openai import OpenAI\n"
]
},
{
"cell_type": "code",
"execution_count": 10,
"id": "4a456906-915a-4bfd-bb9d-57e505c5093f",
"metadata": {},
"outputs": [],
"source": [
"# constants\n",
"models = {\n",
" 'MODEL_GPT': 'gpt-4o-mini',\n",
" 'MODEL_LLAMA': 'llama3.2'\n",
"}\n",
"\n",
"# To use ollama using openai API (ensure that ollama is running on localhost)\n",
"ollama_via_openai = OpenAI(base_url='http://localhost:11434/v1', api_key='ollama')\n",
"\n",
"def model_choices(model):\n",
" if model in models:\n",
" return models[model]\n",
" else:\n",
" raise ValueError(f\"Model {model} not found in models dictionary\")\n",
"\n",
"def get_model_api(model='MODEL_GPT'):\n",
" if model == 'MODEL_GPT':\n",
" return openai, model_choices(model)\n",
" elif model == 'MODEL_LLAMA':\n",
" return ollama_via_openai, model_choices(model)\n",
" else:\n",
" raise ValueError(f\"Model {model} not found in models dictionary\")\n"
]
},
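{
"cell_type": "markdown",
"id": "usage-note-get-model-api",
"metadata": {},
"source": [
"A quick check of the helper above (an added sketch, not part of the original exercise): `get_model_api` returns a `(client, model_name)` pair that you can unpack directly. The `MODEL_GPT` branch assumes `OPENAI_API_KEY` is available in your environment; the `MODEL_LLAMA` branch assumes Ollama is serving on localhost."
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "usage-demo-get-model-api",
"metadata": {},
"outputs": [],
"source": [
"# Minimal usage sketch: unpack the client and model name for the local Llama option\n",
"client, model_name = get_model_api('MODEL_LLAMA')\n",
"print(model_name) # expected: 'llama3.2'"
]
},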
{
"cell_type": "code",
"execution_count": 12,
"id": "a8d7923c-5f28-4c30-8556-342d7c8497c1",
"metadata": {},
"outputs": [],
"source": [
"# set up environment\n",
"\n",
"system_prompt = \"\"\" You are an AI assistant helping a user find information about a product. \n",
"The user asks you a technical question about code, and you provide a response with code snippets and explanations.\"\"\"\n",
"\n",
"def stream_brochure(question, model):\n",
" api, model_name = get_model_api(model)\n",
" stream = api.chat.completions.create(\n",
" model=model_name,\n",
" messages=[\n",
" {\"role\": \"system\", \"content\": system_prompt},\n",
" {\"role\": \"user\", \"content\": question}\n",
" ],\n",
" stream=True\n",
" )\n",
" \n",
" response = \"\"\n",
" display_handle = display(Markdown(\"\"), display_id=True)\n",
" for chunk in stream:\n",
" response += chunk.choices[0].delta.content or ''\n",
" response = response.replace(\"```\",\"\").replace(\"markdown\", \"\")\n",
" update_display(Markdown(response), display_id=display_handle.display_id)\n",
"\n"
]
},
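{
"cell_type": "markdown",
"id": "non-streaming-note",
"metadata": {},
"source": [
"If you prefer a single blocking call instead of streaming, here is a minimal non-streaming sketch (added for illustration) that reuses `get_model_api` and `system_prompt`."
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "non-streaming-sketch",
"metadata": {},
"outputs": [],
"source": [
"# Non-streaming sketch: one request, one Markdown-rendered answer\n",
"def answer(question, model='MODEL_GPT'):\n",
" api, model_name = get_model_api(model)\n",
" completion = api.chat.completions.create(\n",
"  model=model_name,\n",
"  messages=[\n",
"   {\"role\": \"system\", \"content\": system_prompt},\n",
"   {\"role\": \"user\", \"content\": question}\n",
"  ]\n",
" )\n",
" display(Markdown(completion.choices[0].message.content))"
]
},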
{
"cell_type": "code",
"execution_count": 13,
"id": "3f0d0137-52b0-47a8-81a8-11a90a010798",
"metadata": {},
"outputs": [],
"source": [
"# Here is the question; type over this to ask something new\n",
"\n",
"question = \"\"\"\n",
"Please explain what this code does and why:\n",
"yield from {book.get(\"author\") for book in books if book.get(\"author\")}\n",
"\"\"\""
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "60ce7000-a4a5-4cce-a261-e75ef45063b4",
"metadata": {},
"outputs": [
{
"data": {
"text/markdown": [
"**Understanding the Code Snippet**\n",
"\n",
"This Python code snippet uses a combination of built-in functions, dictionary iteration, and generator expressions to extract and yield author names from a list of `Book` objects.\n",
"\n",
"Here's a breakdown:\n",
"\n",
"1. **Dictionary Iteration**: The expression `for book in books if book.get(\"author\")`\n",
" - Iterates over each element (`book`) in the container `books`.\n",
" - Filters out elements whose `'author'` key does not have a value (i.e., `None`, `False`, or an empty string). This leaves only dictionaries with author information.\n",
"\n",
"2. **Dictionary Access**: The expression `{book.get(\"author\") for book in books if book.get(\"author\")}`\n",
" - Uses dictionary membership testing to access only the values associated with the `'author'` key.\n",
" - If the value is not found or is considered false, it's skipped in this particular case.\n",
"\n",
"3. **Generator Expression**: This generates an iterator that iterates over the filtered author names.\n",
" - Yields each author name (i.e., a single `'name'` from the book dictionary) on demand.\n",
" - Since these are generator expressions, they use memory less than equivalent Python lists and also create results on-demand.\n",
"\n",
"4. **`yield from`**: This statement takes the generator expression as an argument and uses it to generate a nested iterator structure.\n",
" - It essentially \"decompresses\" the single level of nested iterator created by `list(iter(x))`, allowing for simpler use cases and potentially significant efficiency improvements for more complex structures where every value must be iterated, while in the latter case just the first item per iterable in the outer expression's sequence needs to actually be yielded into result stream.\n",
" - By \"yielding\" a nested iterator (the generator expression), we can simplify code by avoiding repetitive structure like `for book, book_author in zip(iterating over), ...` or list creation.\n",
"\n",
"**Example Use Case**\n",
"\n",
"In this hypothetical example:\n",
"\n",
"# Example Book objects\n",
"class Book:\n",
" def __init__(self, author, title):\n",
" self.author = author # str\n",
" self.title = title\n",
"\n",
"books = [\n",
" {\"author\": \"John Doe\", \"title\": f\"Book 1 by John Doe\"},\n",
" {\"author\": None, \"title\": f\"Book 2 without Author\"},\n",
" {\"author\": \"Jane Smith\", \"title\": f\"Book 3 by Jane Smith\"}\n",
"]\n",
"\n",
"# The given expression to extract and yield author names\n",
"for author in yield from {book.get(\"author\") for book in books if book.get(\"author\")}:\n",
"\n",
" print(author) \n",
"\n",
"In this code snippet, printing the extracted authors would output `John Doe`, `Jane Smith` (since only dictionaries with author information pass the filtering test).\n",
"\n",
"Please modify it like as you wish and use `yield from` along with dictionary iteration, list comprehension or generator expression if needed, and explain what purpose your version has."
],
"text/plain": [
"<IPython.core.display.Markdown object>"
]
},
"metadata": {},
"output_type": "display_data"
}
],
"source": [
"# Get the model of your choice (choices appeared below) to answer, with streaming \n",
"\n",
"\"\"\"models = {\n",
" 'MODEL_GPT': 'gpt-4o-mini',\n",
" 'MODEL_LLAMA': 'llama3.2'\n",
"}\"\"\"\n",
"\n",
"stream_brochure(question,'MODEL_LLAMA')"
]
}
],
"metadata": {
"kernelspec": {
"display_name": "llms",
"language": "python",
"name": "python3"
},
"language_info": {
"codemirror_mode": {
"name": "ipython",
"version": 3
},
"file_extension": ".py",
"mimetype": "text/x-python",
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.11.11"
}
},
"nbformat": 4,
"nbformat_minor": 5
}

week2/community-contributions/day4_booking_flight_tool.ipynb (+448 lines)

@@ -0,0 +1,448 @@
{
"cells": [
{
"cell_type": "markdown",
"id": "ddfa9ae6-69fe-444a-b994-8c4c5970a7ec",
"metadata": {},
"source": [
"# Project - Airline AI Assistant\n",
"\n",
"We'll now bring together what we've learned to make an AI Customer Support assistant for an Airline"
]
},
{
"cell_type": "code",
"execution_count": 1,
"id": "8b50bbe2-c0b1-49c3-9a5c-1ba7efa2bcb4",
"metadata": {},
"outputs": [],
"source": [
"# imports\n",
"\n",
"import os\n",
"import json\n",
"from dotenv import load_dotenv\n",
"from openai import OpenAI\n",
"import gradio as gr"
]
},
{
"cell_type": "code",
"execution_count": 2,
"id": "747e8786-9da8-4342-b6c9-f5f69c2e22ae",
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"OpenAI API Key exists and begins sk-proj-\n"
]
}
],
"source": [
"# Initialization\n",
"\n",
"load_dotenv(override=True)\n",
"\n",
"openai_api_key = os.getenv('OPENAI_API_KEY')\n",
"if openai_api_key:\n",
" print(f\"OpenAI API Key exists and begins {openai_api_key[:8]}\")\n",
"else:\n",
" print(\"OpenAI API Key not set\")\n",
" \n",
"MODEL = \"gpt-4o-mini\"\n",
"openai = OpenAI()\n",
"\n",
"# As an alternative, if you'd like to use Ollama instead of OpenAI\n",
"# Check that Ollama is running for you locally (see week1/day2 exercise) then uncomment these next 2 lines\n",
"# MODEL = \"llama3.2\"\n",
"# openai = OpenAI(base_url='http://localhost:11434/v1', api_key='ollama')\n"
]
},
{
"cell_type": "code",
"execution_count": 3,
"id": "0a521d84-d07c-49ab-a0df-d6451499ed97",
"metadata": {},
"outputs": [],
"source": [
"system_message = \"You are a helpful assistant for an Airline called FlightAI. \"\n",
"system_message += \"Give short, courteous answers, no more than 1 sentence. \"\n",
"system_message += \"Always be accurate. If you don't know the answer, say so.\""
]
},
{
"cell_type": "code",
"execution_count": 5,
"id": "61a2a15d-b559-4844-b377-6bd5cb4949f6",
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"* Running on local URL: http://127.0.0.1:7877\n",
"\n",
"To create a public link, set `share=True` in `launch()`.\n"
]
},
{
"data": {
"text/html": [
"<div><iframe src=\"http://127.0.0.1:7877/\" width=\"100%\" height=\"500\" allow=\"autoplay; camera; microphone; clipboard-read; clipboard-write;\" frameborder=\"0\" allowfullscreen></iframe></div>"
],
"text/plain": [
"<IPython.core.display.HTML object>"
]
},
"metadata": {},
"output_type": "display_data"
},
{
"data": {
"text/plain": []
},
"execution_count": 5,
"metadata": {},
"output_type": "execute_result"
}
],
"source": [
"# This function looks rather simpler than the one from my video, because we're taking advantage of the latest Gradio updates\n",
"\n",
"def chat(message, history):\n",
" messages = [{\"role\": \"system\", \"content\": system_message}] + history + [{\"role\": \"user\", \"content\": message}]\n",
" response = openai.chat.completions.create(model=MODEL, messages=messages)\n",
" return response.choices[0].message.content\n",
"\n",
"gr.ChatInterface(fn=chat, type=\"messages\").launch()"
]
},
{
"cell_type": "markdown",
"id": "36bedabf-a0a7-4985-ad8e-07ed6a55a3a4",
"metadata": {},
"source": [
"## Tools\n",
"\n",
"Tools are an incredibly powerful feature provided by the frontier LLMs.\n",
"\n",
"With tools, you can write a function, and have the LLM call that function as part of its response.\n",
"\n",
"Sounds almost spooky.. we're giving it the power to run code on our machine?\n",
"\n",
"Well, kinda."
]
},
{
"cell_type": "code",
"execution_count": 4,
"id": "0696acb1-0b05-4dc2-80d5-771be04f1fb2",
"metadata": {},
"outputs": [],
"source": [
"# Let's start by making a useful function\n",
"\n",
"ticket_prices = {\"london\": \"$799\", \"paris\": \"$899\", \"tokyo\": \"$1400\", \"berlin\": \"$499\"}\n",
"\n",
"def get_ticket_price(destination_city):\n",
" print(f\"Tool get_ticket_price called for {destination_city}\")\n",
" city = destination_city.lower()\n",
" return ticket_prices.get(city, \"Unknown\")"
]
},
{
"cell_type": "code",
"execution_count": 5,
"id": "80ca4e09-6287-4d3f-997d-fa6afbcf6c85",
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"Tool get_ticket_price called for Berlin\n"
]
},
{
"data": {
"text/plain": [
"'$499'"
]
},
"execution_count": 5,
"metadata": {},
"output_type": "execute_result"
}
],
"source": [
"get_ticket_price(\"Berlin\")"
]
},
{
"cell_type": "code",
"execution_count": 29,
"id": "0757cba1",
"metadata": {},
"outputs": [],
"source": [
"import random\n",
"\n",
"# Create a function for the booking system\n",
"def get_booking(destination_city):\n",
" print(f\"Tool get_booking called for {destination_city}\")\n",
" city = destination_city.lower()\n",
" \n",
" # Example data for different cities\n",
" flight_info = {\n",
" \"london\": {\"flight_number\": \"BA123\", \"departure_time\": \"10:00 AM\", \"gate\": \"A12\"},\n",
" \"paris\": {\"flight_number\": \"AF456\", \"departure_time\": \"12:00 PM\", \"gate\": \"B34\"},\n",
" \"tokyo\": {\"flight_number\": \"JL789\", \"departure_time\": \"02:00 PM\", \"gate\": \"C56\"},\n",
" \"berlin\": {\"flight_number\": \"LH101\", \"departure_time\": \"04:00 PM\", \"gate\": \"D78\"}\n",
" }\n",
" \n",
" if city in flight_info:\n",
" info = flight_info[city]\n",
" status = random.choice([\"available\", \"not available\"])\n",
" return f\"Flight {info['flight_number']} to {destination_city.lower()} is {status}. Departure time: {info['departure_time']}, Gate: {info['gate']}.\"\n",
" else:\n",
" return \"Unknown destination city.\""
]
},
{
"cell_type": "code",
"execution_count": 13,
"id": "d5413a96",
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"Tool get_booking called for Berlin\n"
]
},
{
"data": {
"text/plain": [
"'Flight LH101 to berlin is cancelled. Departure time: 04:00 PM, Gate: D78.'"
]
},
"execution_count": 13,
"metadata": {},
"output_type": "execute_result"
}
],
"source": [
"get_booking(\"Berlin\")"
]
},
{
"cell_type": "code",
"execution_count": 30,
"id": "4afceded-7178-4c05-8fa6-9f2085e6a344",
"metadata": {},
"outputs": [],
"source": [
"# There's a particular dictionary structure that's required to describe our function:\n",
"\n",
"price_function = {\n",
" \"name\": \"get_ticket_price\",\n",
" \"description\": \"Get the price of a return ticket to the destination city. Call this whenever you need to know the ticket price, for example when a customer asks 'How much is a ticket to this city'\",\n",
" \"parameters\": {\n",
" \"type\": \"object\",\n",
" \"properties\": {\n",
" \"destination_city\": {\n",
" \"type\": \"string\",\n",
" \"description\": \"The city that the customer wants to travel to\",\n",
" },\n",
" },\n",
" \"required\": [\"destination_city\"],\n",
" \"additionalProperties\": False\n",
" }\n",
"}\n",
"\n",
"# Book flight function description and properties\n",
"\n",
"book_flight_function = {\n",
" \"name\": \"book_flight\",\n",
" \"description\": \"Book a flight to the destination city. Call this whenever a customer wants to book a flight.\",\n",
" \"parameters\": {\n",
" \"type\": \"object\",\n",
" \"properties\": {\n",
" \"destination_city\": {\n",
" \"type\": \"string\",\n",
" \"description\": \"The city that the customer wants to travel to\",\n",
" },\n",
" \"departure_date\": {\n",
" \"type\": \"string\",\n",
" \"description\": \"The date of departure (YYYY-MM-DD)\",\n",
" },\n",
" \"return_date\": {\n",
" \"type\": \"string\",\n",
" \"description\": \"The date of return (YYYY-MM-DD)\",\n",
" },\n",
" \"passenger_name\": {\n",
" \"type\": \"string\",\n",
" \"description\": \"The name of the passenger\",\n",
" },\n",
" },\n",
" \"required\": [\"destination_city\", \"departure_date\", \"return_date\", \"passenger_name\"],\n",
" \"additionalProperties\": False\n",
" }\n",
"}"
]
},
{
"cell_type": "code",
"execution_count": 31,
"id": "bdca8679-935f-4e7f-97e6-e71a4d4f228c",
"metadata": {},
"outputs": [],
"source": [
"# And this is included in a list of tools:\n",
"\n",
"tools = [{\"type\": \"function\", \"function\": price_function}, {\"type\": \"function\", \"function\": book_flight_function}]"
]
},
{
"cell_type": "markdown",
"id": "c3d3554f-b4e3-4ce7-af6f-68faa6dd2340",
"metadata": {},
"source": [
"## Getting OpenAI to use our Tool\n",
"\n",
"There's some fiddly stuff to allow OpenAI \"to call our tool\"\n",
"\n",
"What we actually do is give the LLM the opportunity to inform us that it wants us to run the tool.\n",
"\n",
"Here's how the new chat function looks:"
]
},
{
"cell_type": "code",
"execution_count": 33,
"id": "ce9b0744-9c78-408d-b9df-9f6fd9ed78cf",
"metadata": {},
"outputs": [],
"source": [
"def chat(message, history):\n",
" messages = [{\"role\": \"system\", \"content\": system_message}] + history + [{\"role\": \"user\", \"content\": message}]\n",
" response = openai.chat.completions.create(model=MODEL, messages=messages, tools=tools)\n",
"\n",
" if response.choices[0].finish_reason==\"tool_calls\":\n",
" message = response.choices[0].message\n",
" response, city = handle_tool_call(message)\n",
" messages.append(message)\n",
" messages.append(response)\n",
" response = openai.chat.completions.create(model=MODEL, messages=messages)\n",
" \n",
" return response.choices[0].message.content"
]
},
{
"cell_type": "code",
"execution_count": 32,
"id": "b0992986-ea09-4912-a076-8e5603ee631f",
"metadata": {},
"outputs": [],
"source": [
"# We have to write that function handle_tool_call:\n",
"\n",
"def handle_tool_call(message):\n",
" print(f\"Message type: {type(message)}\")\n",
" tool_call = message.tool_calls[0]\n",
" print(f\"Tool call: {tool_call}\")\n",
" arguments = json.loads(tool_call.function.arguments)\n",
" city = arguments.get('destination_city')\n",
" price = get_ticket_price(city)\n",
" book = get_booking(city)\n",
" print (book)\n",
" response = {\n",
" \"role\": \"tool\",\n",
" \"content\": json.dumps({\"destination_city\": city,\"price\": price, \"booking\": book}),\n",
" \"tool_call_id\": tool_call.id\n",
" }\n",
" return response, city"
]
},
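{
"cell_type": "markdown",
"id": "dispatch-note",
"metadata": {},
"source": [
"An optional refinement (an added sketch, not part of the original contribution): since `tools` advertises both `get_ticket_price` and `book_flight`, the handler can dispatch on `tool_call.function.name` rather than always returning both price and booking info. The `book_flight` helper below is hypothetical and simply echoes the booking details."
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "dispatch-sketch",
"metadata": {},
"outputs": [],
"source": [
"# Sketch: route the tool call to the matching helper (hypothetical book_flight included)\n",
"def book_flight(destination_city, departure_date, return_date, passenger_name):\n",
" print(f\"Tool book_flight called for {destination_city}\")\n",
" return f\"Booked {passenger_name} to {destination_city}, departing {departure_date}, returning {return_date}.\"\n",
"\n",
"def handle_tool_call_dispatch(message):\n",
" tool_call = message.tool_calls[0]\n",
" arguments = json.loads(tool_call.function.arguments)\n",
" city = arguments.get('destination_city')\n",
" if tool_call.function.name == 'get_ticket_price':\n",
"  result = {\"destination_city\": city, \"price\": get_ticket_price(city)}\n",
" elif tool_call.function.name == 'book_flight':\n",
"  result = {\"destination_city\": city, \"confirmation\": book_flight(**arguments)}\n",
" else:\n",
"  result = {\"error\": f\"Unknown tool {tool_call.function.name}\"}\n",
" response = {\"role\": \"tool\", \"content\": json.dumps(result), \"tool_call_id\": tool_call.id}\n",
" return response, city"
]
},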
{
"cell_type": "code",
"execution_count": null,
"id": "f4be8a71-b19e-4c2f-80df-f59ff2661f14",
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"* Running on local URL: http://127.0.0.1:7864\n",
"\n",
"To create a public link, set `share=True` in `launch()`.\n"
]
},
{
"data": {
"text/html": [
"<div><iframe src=\"http://127.0.0.1:7864/\" width=\"100%\" height=\"500\" allow=\"autoplay; camera; microphone; clipboard-read; clipboard-write;\" frameborder=\"0\" allowfullscreen></iframe></div>"
],
"text/plain": [
"<IPython.core.display.HTML object>"
]
},
"metadata": {},
"output_type": "display_data"
},
{
"data": {
"text/plain": []
},
"execution_count": 34,
"metadata": {},
"output_type": "execute_result"
},
{
"name": "stdout",
"output_type": "stream",
"text": [
"Message type: <class 'openai.types.chat.chat_completion_message.ChatCompletionMessage'>\n",
"Tool call: ChatCompletionMessageToolCall(id='call_TGFmeFmQN689caTlqfLuhycv', function=Function(arguments='{\"destination_city\":\"London\",\"departure_date\":\"2023-10-31\",\"return_date\":\"2025-03-30\",\"passenger_name\":\"dimitris\"}', name='book_flight'), type='function')\n",
"Tool get_ticket_price called for London\n",
"Tool get_booking called for London\n",
"Flight BA123 to london is available. Departure time: 10:00 AM, Gate: A12.\n",
"Message type: <class 'openai.types.chat.chat_completion_message.ChatCompletionMessage'>\n",
"Tool call: ChatCompletionMessageToolCall(id='call_FRzs5w09rkpVumZ61SArRlND', function=Function(arguments='{\"destination_city\":\"Paris\",\"departure_date\":\"2023-03-23\",\"return_date\":\"2025-03-30\",\"passenger_name\":\"Dimitris\"}', name='book_flight'), type='function')\n",
"Tool get_ticket_price called for Paris\n",
"Tool get_booking called for Paris\n",
"Flight AF456 to paris is available. Departure time: 12:00 PM, Gate: B34.\n"
]
}
],
"source": [
"gr.ChatInterface(fn=chat, type=\"messages\").launch()"
]
}
],
"metadata": {
"kernelspec": {
"display_name": "llms",
"language": "python",
"name": "python3"
},
"language_info": {
"codemirror_mode": {
"name": "ipython",
"version": 3
},
"file_extension": ".py",
"mimetype": "text/x-python",
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.11.11"
}
},
"nbformat": 4,
"nbformat_minor": 5
}

week2/community-contributions/day5_book_flight_sightseeing_tools.ipynb (+1108 lines)

File diff suppressed because one or more lines are too long