3 changed files with 905 additions and 522 deletions
@@ -0,0 +1,579 @@ |
|||||||
|
{ |
||||||
|
"cells": [ |
||||||
|
{ |
||||||
|
"cell_type": "markdown", |
||||||
|
"id": "d15d8294-3328-4e07-ad16-8a03e9bbfdb9", |
||||||
|
"metadata": {}, |
||||||
|
"source": [ |
||||||
|
"# Instant Gratification\n", |
||||||
|
"\n", |
||||||
|
"## Your first Frontier LLM Project!\n", |
||||||
|
"\n", |
||||||
|
"Let's build a useful LLM solution - in a matter of minutes.\n", |
||||||
|
"\n", |
||||||
|
"By the end of this course, you will have built an autonomous Agentic AI solution with 7 agents that collaborate to solve a business problem. All in good time! We will start with something smaller...\n", |
||||||
|
"\n", |
||||||
|
"Our goal is to code a new kind of Web Browser. Give it a URL, and it will respond with a summary. The Reader's Digest of the internet!!\n", |
||||||
|
"\n", |
||||||
|
"Before starting, you should have completed the setup for [PC](../SETUP-PC.md) or [Mac](../SETUP-mac.md) and you hopefully launched this jupyter lab from within the project root directory, with your environment activated.\n", |
||||||
|
"\n", |
||||||
|
"## If you're new to Jupyter Lab\n", |
||||||
|
"\n", |
||||||
|
"Welcome to the wonderful world of Data Science experimentation! Once you've used Jupyter Lab, you'll wonder how you ever lived without it. Simply click in each \"cell\" with code in it, such as the cell immediately below this text, and hit Shift+Return to execute that cell. As you wish, you can add a cell with the + button in the toolbar, and print values of variables, or try out variations. \n", |
||||||
|
"\n", |
||||||
|
"I've written a notebook called [Guide to Jupyter](Guide%20to%20Jupyter.ipynb) to help you get more familiar with Jupyter Labs, including adding Markdown comments, using `!` to run shell commands, and `tqdm` to show progress.\n", |
||||||
|
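"\n", |
||||||
|
"For example, here's the kind of thing you can try in a fresh cell (an illustrative snippet, assuming `tqdm` is installed in your environment):\n", |
||||||
|
"```python\n", |
||||||
|
"!pip show tqdm               # the ! prefix runs a shell command from a notebook cell\n", |
||||||
|
"\n", |
||||||
|
"import time\n", |
||||||
|
"from tqdm import tqdm\n", |
||||||
|
"for _ in tqdm(range(10)):    # tqdm wraps any loop with a progress bar\n", |
||||||
|
"    time.sleep(0.1)\n", |
||||||
|
"```\n", |
||||||
|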
"\n", |
||||||
|
"## If you'd prefer to work in IDEs\n", |
||||||
|
"\n", |
||||||
|
"If you're more comfortable in IDEs like VSCode or Pycharm, they both work great with these lab notebooks too. \n", |
||||||
|
"If you'd prefer to work in VSCode, [here](https://chatgpt.com/share/676f2e19-c228-8012-9911-6ca42f8ed766) are instructions from an AI friend on how to configure it for the course.\n", |
||||||
|
"\n", |
||||||
|
"## If you'd like to brush up your Python\n", |
||||||
|
"\n", |
||||||
|
"I've added a notebook called [Intermediate Python](Intermediate%20Python.ipynb) to get you up to speed. But you should give it a miss if you already have a good idea what this code does: \n", |
||||||
|
"`yield from {book.get(\"author\") for book in books if book.get(\"author\")}`\n", |
||||||
|
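"\n", |
||||||
|
"(For the curious: assuming `books` is a list of dicts, that one-liner is roughly equivalent to building the set of distinct, non-missing authors and yielding each one, something like this sketch:)\n", |
||||||
|
"```python\n", |
||||||
|
"def authors_of(books):\n", |
||||||
|
"    unique_authors = {book.get(\"author\") for book in books if book.get(\"author\")}\n", |
||||||
|
"    yield from unique_authors    # yield each distinct author in turn\n", |
||||||
|
"```\n", |
||||||
|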
"\n", |
||||||
|
"## I am here to help\n", |
||||||
|
"\n", |
||||||
|
"If you have any problems at all, please do reach out. \n", |
||||||
|
"I'm available through the platform, or at ed@edwarddonner.com, or at https://www.linkedin.com/in/eddonner/ if you'd like to connect (and I love connecting!)\n", |
||||||
|
"\n", |
||||||
|
"## More troubleshooting\n", |
||||||
|
"\n", |
||||||
|
"Please see the [troubleshooting](troubleshooting.ipynb) notebook in this folder to diagnose and fix common problems. At the very end of it is a diagnostics script with some useful debug info.\n", |
||||||
|
"\n", |
||||||
|
"## If this is old hat!\n", |
||||||
|
"\n", |
||||||
|
"If you're already comfortable with today's material, please hang in there; you can move swiftly through the first few labs - we will get much more in depth as the weeks progress.\n", |
||||||
|
"\n", |
||||||
|
"<table style=\"margin: 0; text-align: left;\">\n", |
||||||
|
" <tr>\n", |
||||||
|
" <td style=\"width: 150px; height: 150px; vertical-align: middle;\">\n", |
||||||
|
" <img src=\"../important.jpg\" width=\"150\" height=\"150\" style=\"display: block;\" />\n", |
||||||
|
" </td>\n", |
||||||
|
" <td>\n", |
||||||
|
" <h2 style=\"color:#900;\">Please read - important note</h2>\n", |
||||||
|
" <span style=\"color:#900;\">The way I collaborate with you may be different to other courses you've taken. I prefer not to type code while you watch. Rather, I execute Jupyter Labs, like this, and give you an intuition for what's going on. My suggestion is that you do this with me, either at the same time, or (perhaps better) right afterwards. Add print statements to understand what's going on, and then come up with your own variations. If you have a Github account, use this to showcase your variations. Not only is this essential practice, but it demonstrates your skills to others, including perhaps future clients or employers...</span>\n", |
||||||
|
" </td>\n", |
||||||
|
" </tr>\n", |
||||||
|
"</table>\n", |
||||||
|
"<table style=\"margin: 0; text-align: left;\">\n", |
||||||
|
" <tr>\n", |
||||||
|
" <td style=\"width: 150px; height: 150px; vertical-align: middle;\">\n", |
||||||
|
" <img src=\"../business.jpg\" width=\"150\" height=\"150\" style=\"display: block;\" />\n", |
||||||
|
" </td>\n", |
||||||
|
" <td>\n", |
||||||
|
" <h2 style=\"color:#181;\">Business value of these exercises</h2>\n", |
||||||
|
" <span style=\"color:#181;\">A final thought. While I've designed these notebooks to be educational, I've also tried to make them enjoyable. We'll do fun things like have LLMs tell jokes and argue with each other. But fundamentally, my goal is to teach skills you can apply in business. I'll explain business implications as we go, and it's worth keeping this in mind: as you build experience with models and techniques, think of ways you could put this into action at work today. Please do contact me if you'd like to discuss more or if you have ideas to bounce off me.</span>\n", |
||||||
|
" </td>\n", |
||||||
|
" </tr>\n", |
||||||
|
"</table>" |
||||||
|
] |
||||||
|
}, |
||||||
|
{ |
||||||
|
"cell_type": "code", |
||||||
|
"execution_count": null, |
||||||
|
"id": "4e2a9393-7767-488e-a8bf-27c12dca35bd", |
||||||
|
"metadata": {}, |
||||||
|
"outputs": [], |
||||||
|
"source": [ |
||||||
|
"# imports\n", |
||||||
|
"\n", |
||||||
|
"import os\n", |
||||||
|
"import requests\n", |
||||||
|
"from dotenv import load_dotenv\n", |
||||||
|
"from bs4 import BeautifulSoup\n", |
||||||
|
"from IPython.display import Markdown, display\n", |
||||||
|
"from openai import OpenAI\n", |
||||||
|
"\n", |
||||||
|
"# If you get an error running this cell, then please head over to the troubleshooting notebook!" |
||||||
|
] |
||||||
|
}, |
||||||
|
{ |
||||||
|
"cell_type": "markdown", |
||||||
|
"id": "6900b2a8-6384-4316-8aaa-5e519fca4254", |
||||||
|
"metadata": {}, |
||||||
|
"source": [ |
||||||
|
"# Connecting to OpenAI\n", |
||||||
|
"\n", |
||||||
|
"The next cell is where we load in the environment variables in your `.env` file and connect to OpenAI.\n", |
||||||
|
"\n", |
||||||
|
"## Troubleshooting if you have problems:\n", |
||||||
|
"\n", |
||||||
|
"Head over to the [troubleshooting](troubleshooting.ipynb) notebook in this folder for step by step code to identify the root cause and fix it!\n", |
||||||
|
"\n", |
||||||
|
"If you make a change, try restarting the \"Kernel\" (the python process sitting behind this notebook) by Kernel menu >> Restart Kernel and Clear Outputs of All Cells. Then try this notebook again, starting at the top.\n", |
||||||
|
"\n", |
||||||
|
"Or, contact me! Message me or email ed@edwarddonner.com and we will get this to work.\n", |
||||||
|
"\n", |
||||||
|
"Any concerns about API costs? See my notes in the README - costs should be minimal, and you can control it at every point. You can also use Ollama as a free alternative, which we discuss during Day 2." |
||||||
|
] |
||||||
|
}, |
||||||
|
{ |
||||||
|
"cell_type": "code", |
||||||
|
"execution_count": null, |
||||||
|
"id": "7b87cadb-d513-4303-baee-a37b6f938e4d", |
||||||
|
"metadata": {}, |
||||||
|
"outputs": [], |
||||||
|
"source": [ |
||||||
|
"# Load environment variables in a file called .env\n", |
||||||
|
"\n", |
||||||
|
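"# override=True means values in your .env file win over any variables already set in your environment\n", |
||||||
|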
"load_dotenv(override=True)\n", |
||||||
|
"api_key = os.getenv('OPENAI_API_KEY')\n", |
||||||
|
"\n", |
||||||
|
"# Check the key\n", |
||||||
|
"\n", |
||||||
|
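"# These checks just catch the most common copy/paste mistakes; the course expects a project-style key, which starts with sk-proj-\n", |
||||||
|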
"if not api_key:\n", |
||||||
|
" print(\"No API key was found - please head over to the troubleshooting notebook in this folder to identify & fix!\")\n", |
||||||
|
"elif not api_key.startswith(\"sk-proj-\"):\n", |
||||||
|
" print(\"An API key was found, but it doesn't start sk-proj-; please check you're using the right key - see troubleshooting notebook\")\n", |
||||||
|
"elif api_key.strip() != api_key:\n", |
||||||
|
" print(\"An API key was found, but it looks like it might have space or tab characters at the start or end - please remove them - see troubleshooting notebook\")\n", |
||||||
|
"else:\n", |
||||||
|
" print(\"API key found and looks good so far!\")\n" |
||||||
|
] |
||||||
|
}, |
||||||
|
{ |
||||||
|
"cell_type": "code", |
||||||
|
"execution_count": null, |
||||||
|
"id": "019974d9-f3ad-4a8a-b5f9-0a3719aea2d3", |
||||||
|
"metadata": {}, |
||||||
|
"outputs": [], |
||||||
|
"source": [ |
||||||
|
"!ollama pull llama3.2\n", |
||||||
|
"MODEL = \"llama3.2\"\n", |
||||||
|
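"# Point the OpenAI client library at the local Ollama server, which exposes an OpenAI-compatible endpoint at /v1\n", |
||||||
|
"# (Ollama ignores the api_key value, but the client library needs something set; swap in the commented-out line below to call OpenAI itself)\n", |
||||||
|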
"openai = OpenAI(base_url=\"http://localhost:11434/v1\", api_key=\"ollama\")\n", |
||||||
|
"# openai = OpenAI()\n", |
||||||
|
"\n", |
||||||
|
"# If this doesn't work, try Kernel menu >> Restart Kernel and Clear Outputs Of All Cells, then run the cells from the top of this notebook down.\n", |
||||||
|
"# If it STILL doesn't work (horrors!) then please see the Troubleshooting notebook in this folder for full instructions" |
||||||
|
] |
||||||
|
}, |
||||||
|
{ |
||||||
|
"cell_type": "markdown", |
||||||
|
"id": "442fc84b-0815-4f40-99ab-d9a5da6bda91", |
||||||
|
"metadata": {}, |
||||||
|
"source": [ |
||||||
|
"# Let's make a quick call to a Frontier model to get started, as a preview!" |
||||||
|
] |
||||||
|
}, |
||||||
|
{ |
||||||
|
"cell_type": "code", |
||||||
|
"execution_count": null, |
||||||
|
"id": "a58394bf-1e45-46af-9bfd-01e24da6f49a", |
||||||
|
"metadata": {}, |
||||||
|
"outputs": [], |
||||||
|
"source": [ |
||||||
|
"# To give you a preview -- calling OpenAI with these messages is this easy. Any problems, head over to the Troubleshooting notebook.\n", |
||||||
|
"\n", |
||||||
|
"message = \"Hello, GPT! This is my first ever message to you! Hi!\"\n", |
||||||
|
"response = openai.chat.completions.create(model=MODEL, messages=[{\"role\":\"user\", \"content\":message}])\n", |
||||||
|
"print(response.choices[0].message.content)" |
||||||
|
] |
||||||
|
}, |
||||||
|
{ |
||||||
|
"cell_type": "markdown", |
||||||
|
"id": "2aa190e5-cb31-456a-96cc-db109919cd78", |
||||||
|
"metadata": {}, |
||||||
|
"source": [ |
||||||
|
"## OK onwards with our first project" |
||||||
|
] |
||||||
|
}, |
||||||
|
{ |
||||||
|
"cell_type": "code", |
||||||
|
"execution_count": null, |
||||||
|
"id": "c5e793b2-6775-426a-a139-4848291d0463", |
||||||
|
"metadata": {}, |
||||||
|
"outputs": [], |
||||||
|
"source": [ |
||||||
|
"# A class to represent a Webpage\n", |
||||||
|
"# If you're not familiar with Classes, check out the \"Intermediate Python\" notebook\n", |
||||||
|
"\n", |
||||||
|
"# Some websites need you to use proper headers when fetching them:\n", |
||||||
|
"headers = {\n", |
||||||
|
" \"User-Agent\": \"Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/117.0.0.0 Safari/537.36\"\n", |
||||||
|
"}\n", |
||||||
|
"\n", |
||||||
|
"class Website:\n", |
||||||
|
"\n", |
||||||
|
" def __init__(self, url):\n", |
||||||
|
" \"\"\"\n", |
||||||
|
" Create this Website object from the given url using the BeautifulSoup library\n", |
||||||
|
" \"\"\"\n", |
||||||
|
" self.url = url\n", |
||||||
|
" response = requests.get(url, headers=headers)\n", |
||||||
|
" soup = BeautifulSoup(response.content, 'html.parser')\n", |
||||||
|
" self.title = soup.title.string if soup.title else \"No title found\"\n", |
||||||
|
" for irrelevant in soup.body([\"script\", \"style\", \"img\", \"input\"]):\n", |
||||||
|
" irrelevant.decompose()\n", |
||||||
|
" self.text = soup.body.get_text(separator=\"\\n\", strip=True)" |
||||||
|
] |
||||||
|
}, |
||||||
|
{ |
||||||
|
"cell_type": "code", |
||||||
|
"execution_count": null, |
||||||
|
"id": "2ef960cf-6dc2-4cda-afb3-b38be12f4c97", |
||||||
|
"metadata": {}, |
||||||
|
"outputs": [], |
||||||
|
"source": [ |
||||||
|
"# Let's try one out. Change the website and add print statements to follow along.\n", |
||||||
|
"\n", |
||||||
|
"ed = Website(\"https://edwarddonner.com\")\n", |
||||||
|
"print(ed.title)\n", |
||||||
|
"print(ed.text)" |
||||||
|
] |
||||||
|
}, |
||||||
|
{ |
||||||
|
"cell_type": "markdown", |
||||||
|
"id": "6a478a0c-2c53-48ff-869c-4d08199931e1", |
||||||
|
"metadata": {}, |
||||||
|
"source": [ |
||||||
|
"## Types of prompts\n", |
||||||
|
"\n", |
||||||
|
"You may know this already - but if not, you will get very familiar with it!\n", |
||||||
|
"\n", |
||||||
|
"Models like GPT4o have been trained to receive instructions in a particular way.\n", |
||||||
|
"\n", |
||||||
|
"They expect to receive:\n", |
||||||
|
"\n", |
||||||
|
"**A system prompt** that tells them what task they are performing and what tone they should use\n", |
||||||
|
"\n", |
||||||
|
"**A user prompt** -- the conversation starter that they should reply to" |
||||||
|
] |
||||||
|
}, |
||||||
|
{ |
||||||
|
"cell_type": "code", |
||||||
|
"execution_count": null, |
||||||
|
"id": "abdb8417-c5dc-44bc-9bee-2e059d162699", |
||||||
|
"metadata": {}, |
||||||
|
"outputs": [], |
||||||
|
"source": [ |
||||||
|
"# Define our system prompt - you can experiment with this later, changing the last sentence to 'Respond in markdown in Spanish.\"\n", |
||||||
|
"\n", |
||||||
|
"system_prompt = \"You are an assistant that analyzes the contents of a website \\\n", |
||||||
|
"and provides a short summary, ignoring text that might be navigation related. \\\n", |
||||||
|
"Respond in markdown.\"" |
||||||
|
] |
||||||
|
}, |
||||||
|
{ |
||||||
|
"cell_type": "code", |
||||||
|
"execution_count": null, |
||||||
|
"id": "f0275b1b-7cfe-4f9d-abfa-7650d378da0c", |
||||||
|
"metadata": {}, |
||||||
|
"outputs": [], |
||||||
|
"source": [ |
||||||
|
"# A function that writes a User Prompt that asks for summaries of websites:\n", |
||||||
|
"\n", |
||||||
|
"def user_prompt_for(website):\n", |
||||||
|
" user_prompt = f\"You are looking at a website titled {website.title}\"\n", |
||||||
|
" user_prompt += \"\\nThe contents of this website is as follows; \\\n", |
||||||
|
"please provide a short summary of this website in markdown. \\\n", |
||||||
|
"If it includes news or announcements, then summarize these too.\\n\\n\"\n", |
||||||
|
" user_prompt += website.text\n", |
||||||
|
" return user_prompt" |
||||||
|
] |
||||||
|
}, |
||||||
|
{ |
||||||
|
"cell_type": "code", |
||||||
|
"execution_count": null, |
||||||
|
"id": "26448ec4-5c00-4204-baec-7df91d11ff2e", |
||||||
|
"metadata": {}, |
||||||
|
"outputs": [], |
||||||
|
"source": [ |
||||||
|
"print(user_prompt_for(ed))" |
||||||
|
] |
||||||
|
}, |
||||||
|
{ |
||||||
|
"cell_type": "markdown", |
||||||
|
"id": "ea211b5f-28e1-4a86-8e52-c0b7677cadcc", |
||||||
|
"metadata": {}, |
||||||
|
"source": [ |
||||||
|
"## Messages\n", |
||||||
|
"\n", |
||||||
|
"The API from OpenAI expects to receive messages in a particular structure.\n", |
||||||
|
"Many of the other APIs share this structure:\n", |
||||||
|
"\n", |
||||||
|
"```\n", |
||||||
|
"[\n", |
||||||
|
" {\"role\": \"system\", \"content\": \"system message goes here\"},\n", |
||||||
|
" {\"role\": \"user\", \"content\": \"user message goes here\"}\n", |
||||||
|
"]\n", |
||||||
|
"\n", |
||||||
|
"To give you a preview, the next 2 cells make a rather simple call - we won't stretch the might GPT (yet!)" |
||||||
|
] |
||||||
|
}, |
||||||
|
{ |
||||||
|
"cell_type": "code", |
||||||
|
"execution_count": null, |
||||||
|
"id": "f25dcd35-0cd0-4235-9f64-ac37ed9eaaa5", |
||||||
|
"metadata": {}, |
||||||
|
"outputs": [], |
||||||
|
"source": [ |
||||||
|
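"# The same list-of-dicts structure described above: a system message to set the persona, then the user's question\n", |
||||||
|
"\n", |
||||||
|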
"messages = [\n", |
||||||
|
" {\"role\": \"system\", \"content\": \"You are a snarky assistant\"},\n", |
||||||
|
" {\"role\": \"user\", \"content\": \"What is 2 + 2?\"}\n", |
||||||
|
"]" |
||||||
|
] |
||||||
|
}, |
||||||
|
{ |
||||||
|
"cell_type": "code", |
||||||
|
"execution_count": null, |
||||||
|
"id": "21ed95c5-7001-47de-a36d-1d6673b403ce", |
||||||
|
"metadata": {}, |
||||||
|
"outputs": [], |
||||||
|
"source": [ |
||||||
|
"# To give you a preview -- calling OpenAI with system and user messages:\n", |
||||||
|
"\n", |
||||||
|
"response = openai.chat.completions.create(model=MODEL, messages=messages)\n", |
||||||
|
"print(response.choices[0].message.content)" |
||||||
|
] |
||||||
|
}, |
||||||
|
{ |
||||||
|
"cell_type": "markdown", |
||||||
|
"id": "d06e8d78-ce4c-4b05-aa8e-17050c82bb47", |
||||||
|
"metadata": {}, |
||||||
|
"source": [ |
||||||
|
"## And now let's build useful messages for GPT-4o-mini, using a function" |
||||||
|
] |
||||||
|
}, |
||||||
|
{ |
||||||
|
"cell_type": "code", |
||||||
|
"execution_count": null, |
||||||
|
"id": "0134dfa4-8299-48b5-b444-f2a8c3403c88", |
||||||
|
"metadata": {}, |
||||||
|
"outputs": [], |
||||||
|
"source": [ |
||||||
|
"# See how this function creates exactly the format above\n", |
||||||
|
"\n", |
||||||
|
"def messages_for(website):\n", |
||||||
|
" return [\n", |
||||||
|
" {\"role\": \"system\", \"content\": system_prompt},\n", |
||||||
|
" {\"role\": \"user\", \"content\": user_prompt_for(website)}\n", |
||||||
|
" ]" |
||||||
|
] |
||||||
|
}, |
||||||
|
{ |
||||||
|
"cell_type": "code", |
||||||
|
"execution_count": null, |
||||||
|
"id": "36478464-39ee-485c-9f3f-6a4e458dbc9c", |
||||||
|
"metadata": {}, |
||||||
|
"outputs": [], |
||||||
|
"source": [ |
||||||
|
"# Try this out, and then try for a few more websites\n", |
||||||
|
"\n", |
||||||
|
"messages_for(ed)" |
||||||
|
] |
||||||
|
}, |
||||||
|
{ |
||||||
|
"cell_type": "markdown", |
||||||
|
"id": "16f49d46-bf55-4c3e-928f-68fc0bf715b0", |
||||||
|
"metadata": {}, |
||||||
|
"source": [ |
||||||
|
"## Time to bring it together - the API for OpenAI is very simple!" |
||||||
|
] |
||||||
|
}, |
||||||
|
{ |
||||||
|
"cell_type": "code", |
||||||
|
"execution_count": null, |
||||||
|
"id": "905b9919-aba7-45b5-ae65-81b3d1d78e34", |
||||||
|
"metadata": {}, |
||||||
|
"outputs": [], |
||||||
|
"source": [ |
||||||
|
"# And now: call the OpenAI API. You will get very familiar with this!\n", |
||||||
|
"\n", |
||||||
|
"def summarize(url):\n", |
||||||
|
" website = Website(url)\n", |
||||||
|
" response = openai.chat.completions.create(\n", |
||||||
|
" model = MODEL,\n", |
||||||
|
" messages = messages_for(website)\n", |
||||||
|
" )\n", |
||||||
|
" return response.choices[0].message.content" |
||||||
|
] |
||||||
|
}, |
||||||
|
{ |
||||||
|
"cell_type": "code", |
||||||
|
"execution_count": null, |
||||||
|
"id": "05e38d41-dfa4-4b20-9c96-c46ea75d9fb5", |
||||||
|
"metadata": {}, |
||||||
|
"outputs": [], |
||||||
|
"source": [ |
||||||
|
"summarize(\"https://edwarddonner.com\")" |
||||||
|
] |
||||||
|
}, |
||||||
|
{ |
||||||
|
"cell_type": "code", |
||||||
|
"execution_count": null, |
||||||
|
"id": "3d926d59-450e-4609-92ba-2d6f244f1342", |
||||||
|
"metadata": {}, |
||||||
|
"outputs": [], |
||||||
|
"source": [ |
||||||
|
"# A function to display this nicely in the Jupyter output, using markdown\n", |
||||||
|
"\n", |
||||||
|
"def display_summary(url):\n", |
||||||
|
" summary = summarize(url)\n", |
||||||
|
" display(Markdown(summary))" |
||||||
|
] |
||||||
|
}, |
||||||
|
{ |
||||||
|
"cell_type": "code", |
||||||
|
"execution_count": null, |
||||||
|
"id": "3018853a-445f-41ff-9560-d925d1774b2f", |
||||||
|
"metadata": {}, |
||||||
|
"outputs": [], |
||||||
|
"source": [ |
||||||
|
"display_summary(\"https://edwarddonner.com\")" |
||||||
|
] |
||||||
|
}, |
||||||
|
{ |
||||||
|
"cell_type": "markdown", |
||||||
|
"id": "b3bcf6f4-adce-45e9-97ad-d9a5d7a3a624", |
||||||
|
"metadata": {}, |
||||||
|
"source": [ |
||||||
|
"# Let's try more websites\n", |
||||||
|
"\n", |
||||||
|
"Note that this will only work on websites that can be scraped using this simplistic approach.\n", |
||||||
|
"\n", |
||||||
|
"Websites that are rendered with Javascript, like React apps, won't show up. See the community-contributions folder for a Selenium implementation that gets around this. You'll need to read up on installing Selenium (ask ChatGPT!)\n", |
||||||
|
"\n", |
||||||
|
"Also Websites protected with CloudFront (and similar) may give 403 errors - many thanks Andy J for pointing this out.\n", |
||||||
|
"\n", |
||||||
|
"But many websites will work just fine!" |
||||||
|
] |
||||||
|
}, |
||||||
|
{ |
||||||
|
"cell_type": "code", |
||||||
|
"execution_count": null, |
||||||
|
"id": "45d83403-a24c-44b5-84ac-961449b4008f", |
||||||
|
"metadata": {}, |
||||||
|
"outputs": [], |
||||||
|
"source": [ |
||||||
|
"display_summary(\"https://cnn.com\")" |
||||||
|
] |
||||||
|
}, |
||||||
|
{ |
||||||
|
"cell_type": "code", |
||||||
|
"execution_count": null, |
||||||
|
"id": "75e9fd40-b354-4341-991e-863ef2e59db7", |
||||||
|
"metadata": {}, |
||||||
|
"outputs": [], |
||||||
|
"source": [ |
||||||
|
"display_summary(\"https://anthropic.com\")" |
||||||
|
] |
||||||
|
}, |
||||||
|
{ |
||||||
|
"cell_type": "markdown", |
||||||
|
"id": "c951be1a-7f1b-448f-af1f-845978e47e2c", |
||||||
|
"metadata": {}, |
||||||
|
"source": [ |
||||||
|
"<table style=\"margin: 0; text-align: left;\">\n", |
||||||
|
" <tr>\n", |
||||||
|
" <td style=\"width: 150px; height: 150px; vertical-align: middle;\">\n", |
||||||
|
" <img src=\"../business.jpg\" width=\"150\" height=\"150\" style=\"display: block;\" />\n", |
||||||
|
" </td>\n", |
||||||
|
" <td>\n", |
||||||
|
" <h2 style=\"color:#181;\">Business applications</h2>\n", |
||||||
|
" <span style=\"color:#181;\">In this exercise, you experienced calling the Cloud API of a Frontier Model (a leading model at the frontier of AI) for the first time. We will be using APIs like OpenAI at many stages in the course, in addition to building our own LLMs.\n", |
||||||
|
"\n", |
||||||
|
"More specifically, we've applied this to Summarization - a classic Gen AI use case to make a summary. This can be applied to any business vertical - summarizing the news, summarizing financial performance, summarizing a resume in a cover letter - the applications are limitless. Consider how you could apply Summarization in your business, and try prototyping a solution.</span>\n", |
||||||
|
" </td>\n", |
||||||
|
" </tr>\n", |
||||||
|
"</table>\n", |
||||||
|
"\n", |
||||||
|
"<table style=\"margin: 0; text-align: left;\">\n", |
||||||
|
" <tr>\n", |
||||||
|
" <td style=\"width: 150px; height: 150px; vertical-align: middle;\">\n", |
||||||
|
" <img src=\"../important.jpg\" width=\"150\" height=\"150\" style=\"display: block;\" />\n", |
||||||
|
" </td>\n", |
||||||
|
" <td>\n", |
||||||
|
" <h2 style=\"color:#900;\">Before you continue - now try yourself</h2>\n", |
||||||
|
" <span style=\"color:#900;\">Use the cell below to make your own simple commercial example. Stick with the summarization use case for now. Here's an idea: write something that will take the contents of an email, and will suggest an appropriate short subject line for the email. That's the kind of feature that might be built into a commercial email tool.</span>\n", |
||||||
|
" </td>\n", |
||||||
|
" </tr>\n", |
||||||
|
"</table>" |
||||||
|
] |
||||||
|
}, |
||||||
|
{ |
||||||
|
"cell_type": "code", |
||||||
|
"execution_count": null, |
||||||
|
"id": "00743dac-0e70-45b7-879a-d7293a6f68a6", |
||||||
|
"metadata": {}, |
||||||
|
"outputs": [], |
||||||
|
"source": [ |
||||||
|
"# Step 1: Create your prompts\n", |
||||||
|
"\n", |
||||||
|
"system_prompt = \"You are an office assistant helping summarize email body into an executive email subject.\"\n", |
||||||
|
"user_prompt = \"\"\"\n", |
||||||
|
" Hello world!\n", |
||||||
|
" As we build new AI programs that help us with various tasks it behooves us to think about how us humans will earn a living.\n", |
||||||
|
" What kind of value do we provide that will help us earn money? Or do we need to change the way that the society works without money and our value is determined by something else?\n", |
||||||
|
"\n", |
||||||
|
" Something to think about!!\n", |
||||||
|
"\"\"\"\n", |
||||||
|
"\n", |
||||||
|
"# Step 2: Make the messages list\n", |
||||||
|
"\n", |
||||||
|
"messages = [\n", |
||||||
|
" {\"role\": \"system\", \"content\": system_prompt},\n", |
||||||
|
" {\"role\": \"user\", \"content\": user_prompt}\n", |
||||||
|
"] # fill this ind\n", |
||||||
|
"\n", |
||||||
|
"# Step 3: Call OpenAI\n", |
||||||
|
"\n", |
||||||
|
"response = openai.chat.completions.create(\n", |
||||||
|
" model = MODEL,\n", |
||||||
|
" messages = messages\n", |
||||||
|
" )\n", |
||||||
|
"\n", |
||||||
|
"# Step 4: print the result\n", |
||||||
|
"\n", |
||||||
|
"print(response.choices[0].message.content)" |
||||||
|
] |
||||||
|
}, |
||||||
|
{ |
||||||
|
"cell_type": "markdown", |
||||||
|
"id": "36ed9f14-b349-40e9-a42c-b367e77f8bda", |
||||||
|
"metadata": {}, |
||||||
|
"source": [ |
||||||
|
"## An extra exercise for those who enjoy web scraping\n", |
||||||
|
"\n", |
||||||
|
"You may notice that if you try `display_summary(\"https://openai.com\")` - it doesn't work! That's because OpenAI has a fancy website that uses Javascript. There are many ways around this that some of you might be familiar with. For example, Selenium is a hugely popular framework that runs a browser behind the scenes, renders the page, and allows you to query it. If you have experience with Selenium, Playwright or similar, then feel free to improve the Website class to use them. In the community-contributions folder, you'll find an example Selenium solution from a student (thank you!)" |
||||||
|
] |
||||||
|
}, |
||||||
|
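{ |
||||||
|
"cell_type": "code", |
||||||
|
"execution_count": null, |
||||||
|
"id": "b7e3f0a2-3c9d-4e8a-9f21-7d5e6c4b8a10", |
||||||
|
"metadata": {}, |
||||||
|
"outputs": [], |
||||||
|
"source": [ |
||||||
|
"# A rough, optional sketch of the Selenium idea - not the official solution (see community-contributions for student versions)\n", |
||||||
|
"# Assumes Selenium is installed (pip install selenium) and Chrome is available; recent Selenium versions fetch a matching driver automatically\n", |
||||||
|
"\n", |
||||||
|
"from bs4 import BeautifulSoup\n", |
||||||
|
"from selenium import webdriver\n", |
||||||
|
"from selenium.webdriver.chrome.options import Options\n", |
||||||
|
"\n", |
||||||
|
"class RenderedWebsite:\n", |
||||||
|
"\n", |
||||||
|
"    def __init__(self, url):\n", |
||||||
|
"        # Like Website above, but renders the page in headless Chrome first so Javascript-built content is included\n", |
||||||
|
"        self.url = url\n", |
||||||
|
"        options = Options()\n", |
||||||
|
"        options.add_argument(\"--headless=new\")   # run Chrome without opening a window\n", |
||||||
|
"        driver = webdriver.Chrome(options=options)\n", |
||||||
|
"        try:\n", |
||||||
|
"            driver.get(url)\n", |
||||||
|
"            self.title = driver.title or \"No title found\"\n", |
||||||
|
"            html = driver.page_source             # the HTML after Javascript has run\n", |
||||||
|
"        finally:\n", |
||||||
|
"            driver.quit()\n", |
||||||
|
"        soup = BeautifulSoup(html, 'html.parser')\n", |
||||||
|
"        for irrelevant in soup.body([\"script\", \"style\", \"img\", \"input\"]):\n", |
||||||
|
"            irrelevant.decompose()\n", |
||||||
|
"        self.text = soup.body.get_text(separator=\"\\n\", strip=True)\n", |
||||||
|
"\n", |
||||||
|
"# Example use - messages_for() works unchanged because it only needs .title and .text:\n", |
||||||
|
"# page = RenderedWebsite(\"https://openai.com\")\n", |
||||||
|
"# response = openai.chat.completions.create(model=MODEL, messages=messages_for(page))\n", |
||||||
|
"# print(response.choices[0].message.content)" |
||||||
|
] |
||||||
|
}, |
||||||
|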
{ |
||||||
|
"cell_type": "markdown", |
||||||
|
"id": "eeab24dc-5f90-4570-b542-b0585aca3eb6", |
||||||
|
"metadata": {}, |
||||||
|
"source": [ |
||||||
|
"# Sharing your code\n", |
||||||
|
"\n", |
||||||
|
"I'd love it if you share your code afterwards so I can share it with others! You'll notice that some students have already made changes (including a Selenium implementation) which you will find in the community-contributions folder. If you'd like add your changes to that folder, submit a Pull Request with your new versions in that folder and I'll merge your changes.\n", |
||||||
|
"\n", |
||||||
|
"If you're not an expert with git (and I am not!) then GPT has given some nice instructions on how to submit a Pull Request. It's a bit of an involved process, but once you've done it once it's pretty clear. As a pro-tip: it's best if you clear the outputs of your Jupyter notebooks (Edit >> Clean outputs of all cells, and then Save) for clean notebooks.\n", |
||||||
|
"\n", |
||||||
|
"Here are good instructions courtesy of an AI friend: \n", |
||||||
|
"https://chatgpt.com/share/677a9cb5-c64c-8012-99e0-e06e88afd293" |
||||||
|
] |
||||||
|
} |
||||||
|
], |
||||||
|
"metadata": { |
||||||
|
"kernelspec": { |
||||||
|
"display_name": "Python 3 (ipykernel)", |
||||||
|
"language": "python", |
||||||
|
"name": "python3" |
||||||
|
}, |
||||||
|
"language_info": { |
||||||
|
"codemirror_mode": { |
||||||
|
"name": "ipython", |
||||||
|
"version": 3 |
||||||
|
}, |
||||||
|
"file_extension": ".py", |
||||||
|
"mimetype": "text/x-python", |
||||||
|
"name": "python", |
||||||
|
"nbconvert_exporter": "python", |
||||||
|
"pygments_lexer": "ipython3", |
||||||
|
"version": "3.11.11" |
||||||
|
} |
||||||
|
}, |
||||||
|
"nbformat": 4, |
||||||
|
"nbformat_minor": 5 |
||||||
|
} |
@@ -1,522 +0,0 @@ |
|||||||
{ |
|
||||||
"cells": [ |
|
||||||
{ |
|
||||||
"cell_type": "markdown", |
|
||||||
"id": "d15d8294-3328-4e07-ad16-8a03e9bbfdb9", |
|
||||||
"metadata": {}, |
|
||||||
"source": [ |
|
||||||
"# Welcome to your first assignment!\n", |
|
||||||
"\n", |
|
||||||
"Instructions are below. Please give this a try, and look in the solutions folder if you get stuck (or feel free to ask me!)" |
|
||||||
] |
|
||||||
}, |
|
||||||
{ |
|
||||||
"cell_type": "markdown", |
|
||||||
"id": "ada885d9-4d42-4d9b-97f0-74fbbbfe93a9", |
|
||||||
"metadata": {}, |
|
||||||
"source": [ |
|
||||||
"<table style=\"margin: 0; text-align: left;\">\n", |
|
||||||
" <tr>\n", |
|
||||||
" <td style=\"width: 150px; height: 150px; vertical-align: middle;\">\n", |
|
||||||
" <img src=\"../resources.jpg\" width=\"150\" height=\"150\" style=\"display: block;\" />\n", |
|
||||||
" </td>\n", |
|
||||||
" <td>\n", |
|
||||||
" <h2 style=\"color:#f71;\">Just before we get to the assignment --</h2>\n", |
|
||||||
" <span style=\"color:#f71;\">I thought I'd take a second to point you at this page of useful resources for the course. This includes links to all the slides.<br/>\n", |
|
||||||
" <a href=\"https://edwarddonner.com/2024/11/13/llm-engineering-resources/\">https://edwarddonner.com/2024/11/13/llm-engineering-resources/</a><br/>\n", |
|
||||||
" Please keep this bookmarked, and I'll continue to add more useful links there over time.\n", |
|
||||||
" </span>\n", |
|
||||||
" </td>\n", |
|
||||||
" </tr>\n", |
|
||||||
"</table>" |
|
||||||
] |
|
||||||
}, |
|
||||||
{ |
|
||||||
"cell_type": "markdown", |
|
||||||
"id": "6e9fa1fc-eac5-4d1d-9be4-541b3f2b3458", |
|
||||||
"metadata": {}, |
|
||||||
"source": [ |
|
||||||
"# HOMEWORK EXERCISE ASSIGNMENT\n", |
|
||||||
"\n", |
|
||||||
"Upgrade the day 1 project to summarize a webpage to use an Open Source model running locally via Ollama rather than OpenAI\n", |
|
||||||
"\n", |
|
||||||
"You'll be able to use this technique for all subsequent projects if you'd prefer not to use paid APIs.\n", |
|
||||||
"\n", |
|
||||||
"**Benefits:**\n", |
|
||||||
"1. No API charges - open-source\n", |
|
||||||
"2. Data doesn't leave your box\n", |
|
||||||
"\n", |
|
||||||
"**Disadvantages:**\n", |
|
||||||
"1. Significantly less power than Frontier Model\n", |
|
||||||
"\n", |
|
||||||
"## Recap on installation of Ollama\n", |
|
||||||
"\n", |
|
||||||
"Simply visit [ollama.com](https://ollama.com) and install!\n", |
|
||||||
"\n", |
|
||||||
"Once complete, the ollama server should already be running locally. \n", |
|
||||||
"If you visit: \n", |
|
||||||
"[http://localhost:11434/](http://localhost:11434/)\n", |
|
||||||
"\n", |
|
||||||
"You should see the message `Ollama is running`. \n", |
|
||||||
"\n", |
|
||||||
"If not, bring up a new Terminal (Mac) or Powershell (Windows) and enter `ollama serve` \n", |
|
||||||
"And in another Terminal (Mac) or Powershell (Windows), enter `ollama pull llama3.2` \n", |
|
||||||
"Then try [http://localhost:11434/](http://localhost:11434/) again.\n", |
|
||||||
"\n", |
|
||||||
"If Ollama is slow on your machine, try using `llama3.2:1b` as an alternative. Run `ollama pull llama3.2:1b` from a Terminal or Powershell, and change the code below from `MODEL = \"llama3.2\"` to `MODEL = \"llama3.2:1b\"`" |
|
||||||
] |
|
||||||
}, |
|
||||||
{ |
|
||||||
"cell_type": "code", |
|
||||||
"execution_count": 2, |
|
||||||
"id": "4e2a9393-7767-488e-a8bf-27c12dca35bd", |
|
||||||
"metadata": {}, |
|
||||||
"outputs": [], |
|
||||||
"source": [ |
|
||||||
"# imports\n", |
|
||||||
"\n", |
|
||||||
"import requests\n", |
|
||||||
"from bs4 import BeautifulSoup\n", |
|
||||||
"from IPython.display import Markdown, display" |
|
||||||
] |
|
||||||
}, |
|
||||||
{ |
|
||||||
"cell_type": "raw", |
|
||||||
"id": "07e106bd-10c5-4365-b85b-397b5f059656", |
|
||||||
"metadata": {}, |
|
||||||
"source": [ |
|
||||||
"# Constants\n", |
|
||||||
"\n", |
|
||||||
"OLLAMA_API = \"http://localhost:11434/api/chat\"\n", |
|
||||||
"HEADERS = {\"Content-Type\": \"application/json\"}\n", |
|
||||||
"MODEL = \"llama3.2\"" |
|
||||||
] |
|
||||||
}, |
|
||||||
{ |
|
||||||
"cell_type": "code", |
|
||||||
"execution_count": 5, |
|
||||||
"id": "dac0a679-599c-441f-9bf2-ddc73d35b940", |
|
||||||
"metadata": {}, |
|
||||||
"outputs": [], |
|
||||||
"source": [ |
|
||||||
"# Create a messages list using the same format that we used for OpenAI\n", |
|
||||||
"\n", |
|
||||||
"messages = [\n", |
|
||||||
" {\"role\": \"user\", \"content\": \"Describe some of the business applications of Generative AI\"}\n", |
|
||||||
"]" |
|
||||||
] |
|
||||||
}, |
|
||||||
{ |
|
||||||
"cell_type": "code", |
|
||||||
"execution_count": 6, |
|
||||||
"id": "7bb9c624-14f0-4945-a719-8ddb64f66f47", |
|
||||||
"metadata": {}, |
|
||||||
"outputs": [], |
|
||||||
"source": [ |
|
||||||
"payload = {\n", |
|
||||||
" \"model\": MODEL,\n", |
|
||||||
" \"messages\": messages,\n", |
|
||||||
" \"stream\": False\n", |
|
||||||
" }" |
|
||||||
] |
|
||||||
}, |
|
||||||
{ |
|
||||||
"cell_type": "code", |
|
||||||
"execution_count": 7, |
|
||||||
"id": "42b9f644-522d-4e05-a691-56e7658c0ea9", |
|
||||||
"metadata": {}, |
|
||||||
"outputs": [ |
|
||||||
{ |
|
||||||
"name": "stdout", |
|
||||||
"output_type": "stream", |
|
||||||
"text": [ |
|
||||||
"Generative AI (Artificial Intelligence) has numerous business applications across various industries. Here are some examples:\n", |
|
||||||
"\n", |
|
||||||
"1. **Content Generation**: Generative AI can create high-quality content such as articles, social media posts, product descriptions, and more. This can help businesses save time and resources on content creation.\n", |
|
||||||
"2. **Product Design**: Generative AI can be used to design new products, such as fashion items, jewelry, or electronics. It can also generate 3D models and prototypes, reducing the need for manual design and prototyping.\n", |
|
||||||
"3. **Image and Video Generation**: Generative AI can create realistic images and videos that can be used in marketing campaigns, advertising, and social media. This can help businesses create engaging visual content without requiring extensive photography or videography skills.\n", |
|
||||||
"4. **Chatbots and Virtual Assistants**: Generative AI can power chatbots and virtual assistants that provide customer support, answer frequently asked questions, and even engage in basic conversations.\n", |
|
||||||
"5. **Predictive Maintenance**: Generative AI can analyze sensor data from machines and predict when maintenance is needed, reducing downtime and increasing efficiency.\n", |
|
||||||
"6. **Personalized Recommendations**: Generative AI can analyze customer behavior and preferences to generate personalized product recommendations, improving the overall shopping experience.\n", |
|
||||||
"7. **Customer Segmentation**: Generative AI can help businesses segment their customers based on their behavior, demographics, and preferences, enabling targeted marketing campaigns.\n", |
|
||||||
"8. **Automated Writing Assistance**: Generative AI can assist writers with ideas, suggestions, and even full-text writing, helping to boost productivity and creativity.\n", |
|
||||||
"9. **Data Analysis and Visualization**: Generative AI can analyze large datasets and generate insights, visualizations, and predictions that can inform business decisions.\n", |
|
||||||
"10. **Creative Collaboration**: Generative AI can collaborate with human creatives, such as artists, designers, and writers, to generate new ideas, concepts, and content.\n", |
|
||||||
"\n", |
|
||||||
"Some specific industries where Generative AI is being applied include:\n", |
|
||||||
"\n", |
|
||||||
"1. **Marketing and Advertising**: generating personalized ads, content, and messaging.\n", |
|
||||||
"2. **Finance and Banking**: automating financial analysis, risk assessment, and customer service.\n", |
|
||||||
"3. **Healthcare**: generating medical images, analyzing patient data, and predicting disease outcomes.\n", |
|
||||||
"4. **Manufacturing and Supply Chain**: optimizing production workflows, predicting demand, and identifying potential bottlenecks.\n", |
|
||||||
"5. **Education**: creating personalized learning experiences, grading assignments, and developing educational content.\n", |
|
||||||
"\n", |
|
||||||
"These are just a few examples of the many business applications of Generative AI. As the technology continues to evolve, we can expect to see even more innovative uses across various industries.\n" |
|
||||||
] |
|
||||||
} |
|
||||||
], |
|
||||||
"source": [ |
|
||||||
"response = requests.post(OLLAMA_API, json=payload, headers=HEADERS)\n", |
|
||||||
"print(response.json()['message']['content'])" |
|
||||||
] |
|
||||||
}, |
|
||||||
{ |
|
||||||
"cell_type": "markdown", |
|
||||||
"id": "6a021f13-d6a1-4b96-8e18-4eae49d876fe", |
|
||||||
"metadata": {}, |
|
||||||
"source": [ |
|
||||||
"# Introducing the ollama package\n", |
|
||||||
"\n", |
|
||||||
"And now we'll do the same thing, but using the elegant ollama python package instead of a direct HTTP call.\n", |
|
||||||
"\n", |
|
||||||
"Under the hood, it's making the same call as above to the ollama server running at localhost:11434" |
|
||||||
] |
|
||||||
}, |
|
||||||
{ |
|
||||||
"cell_type": "code", |
|
||||||
"execution_count": 8, |
|
||||||
"id": "7745b9c4-57dc-4867-9180-61fa5db55eb8", |
|
||||||
"metadata": {}, |
|
||||||
"outputs": [ |
|
||||||
{ |
|
||||||
"name": "stdout", |
|
||||||
"output_type": "stream", |
|
||||||
"text": [ |
|
||||||
"Generative AI has numerous business applications across various industries. Here are some examples:\n", |
|
||||||
"\n", |
|
||||||
"1. **Content Generation**: Generative AI can be used to generate high-quality content such as articles, social media posts, product descriptions, and more. This can save time and resources for businesses that need to produce a large volume of content.\n", |
|
||||||
"2. **Product Design**: Generative AI can be used to design new products, such as furniture, electronics, and other consumer goods. It can also help optimize product designs by generating multiple versions and selecting the most suitable one based on various criteria.\n", |
|
||||||
"3. **Marketing Automation**: Generative AI can be used to create personalized marketing campaigns, such as email marketing automation, social media ads, and more. This can help businesses tailor their marketing efforts to specific customer segments and improve engagement rates.\n", |
|
||||||
"4. **Image and Video Editing**: Generative AI can be used to edit images and videos, such as removing background noise, correcting color casts, and enhancing video quality. This can save time and resources for businesses that need to create high-quality visual content.\n", |
|
||||||
"5. **Chatbots and Virtual Assistants**: Generative AI can be used to create chatbots and virtual assistants that can understand natural language and respond accordingly. This can help businesses provide better customer service and improve user experience.\n", |
|
||||||
"6. **Predictive Analytics**: Generative AI can be used to analyze large datasets and generate predictive models that can forecast future trends and behaviors. This can help businesses make data-driven decisions and stay ahead of the competition.\n", |
|
||||||
"7. **Customer Segmentation**: Generative AI can be used to segment customers based on their behavior, demographics, and preferences. This can help businesses tailor their marketing efforts and improve customer engagement.\n", |
|
||||||
"8. **Language Translation**: Generative AI can be used to translate languages in real-time, which can help businesses communicate with international clients and customers more effectively.\n", |
|
||||||
"9. **Music Composition**: Generative AI can be used to compose music for various applications such as advertising, film scoring, and video game soundtracks.\n", |
|
||||||
"10. **Financial Modeling**: Generative AI can be used to create financial models that can predict future revenue streams, costs, and other financial metrics. This can help businesses make more accurate predictions and inform better investment decisions.\n", |
|
||||||
"\n", |
|
||||||
"Some of the industries that are already leveraging generative AI include:\n", |
|
||||||
"\n", |
|
||||||
"* E-commerce\n", |
|
||||||
"* Healthcare\n", |
|
||||||
"* Finance\n", |
|
||||||
"* Marketing\n", |
|
||||||
"* Education\n", |
|
||||||
"* Entertainment\n", |
|
||||||
"* Manufacturing\n", |
|
||||||
"\n", |
|
||||||
"These applications have the potential to transform various business processes, improve customer experiences, and drive innovation in various sectors.\n" |
|
||||||
] |
|
||||||
} |
|
||||||
], |
|
||||||
"source": [ |
|
||||||
"import ollama\n", |
|
||||||
"\n", |
|
||||||
"response = ollama.chat(model=MODEL, messages=messages)\n", |
|
||||||
"print(response['message']['content'])" |
|
||||||
] |
|
||||||
}, |
|
||||||
{ |
|
||||||
"cell_type": "markdown", |
|
||||||
"id": "a4704e10-f5fb-4c15-a935-f046c06fb13d", |
|
||||||
"metadata": {}, |
|
||||||
"source": [ |
|
||||||
"## Alternative approach - using OpenAI python library to connect to Ollama" |
|
||||||
] |
|
||||||
}, |
|
||||||
{ |
|
||||||
"cell_type": "code", |
|
||||||
"execution_count": 9, |
|
||||||
"id": "23057e00-b6fc-4678-93a9-6b31cb704bff", |
|
||||||
"metadata": {}, |
|
||||||
"outputs": [ |
|
||||||
{ |
|
||||||
"name": "stdout", |
|
||||||
"output_type": "stream", |
|
||||||
"text": [ |
|
||||||
"Generative AI has numerous business applications across various industries, transforming the way companies operate, create products, and interact with customers. Some key applications include:\n", |
|
||||||
"\n", |
|
||||||
"1. **Content Generation**: Automate content creation for marketing materials, such as blog posts, product descriptions, social media posts, and more, using Generative AI-powered tools.\n", |
|
||||||
"2. **Product Design and Prototyping**: Use Generative AI to design new products, furniture, or other innovative solutions, reducing design time and costs while increasing creativity.\n", |
|
||||||
"3. **Customer Experience (CX) Tools**: Leverage Generative AI to create personalized customer experiences, such as chatbots that can respond to customer queries and provide tailored recommendations.\n", |
|
||||||
"4. **Predictive Maintenance**: Use Generative AI to analyze sensor data, identify potential issues, and predict maintenance needs for equipment, reducing downtime and increasing overall efficiency.\n", |
|
||||||
"5. **Personalized Marketing**: Use Generative AI to create targeted marketing campaigns based on individual customer preferences, behaviors, and demographics.\n", |
|
||||||
"6. **Content Optimization**: Utilize Generative AI to optimize content for better performance in search engine results pages (SERPs), ensuring improved visibility and traffic.\n", |
|
||||||
"7. **Brand Storytelling**: Automate the creation of brand stories, taglines, and overall brand narrative using Generative AI-powered tools.\n", |
|
||||||
"8. **Financial Modeling and Forecasting**: Use Generative AI to create financial models, forecasts, and predictions for businesses, helping them make data-driven decisions.\n", |
|
||||||
"9. **Supply Chain Optimization**: Leverage Generative AI to optimize supply chain operations, predicting demand, reducing inventory levels, and streamlining logistics.\n", |
|
||||||
"10. **Automated Transcription and Translation**: Use Generative AI to automate the transcription of audio and video files into written text, as well as translate materials across languages.\n", |
|
||||||
"11. **Digital Asset Management**: Utilize Generative AI to manage digital assets, such as images, videos, and documents, and automatically generate metadata for easy search and retrieval.\n", |
|
||||||
"12. **Chatbots and Virtual Assistants**: Create more advanced chatbots using Generative AI that can understand context, emotions, and intent, providing better customer service experiences.\n", |
|
||||||
"\n", |
|
||||||
"In healthcare, Generative AI is being applied to:\n", |
|
||||||
"\n", |
|
||||||
"1. Medical Imaging Analysis\n", |
|
||||||
"2. Personalized Medicine\n", |
|
||||||
"3. Patient Data Analysis\n", |
|
||||||
"\n", |
|
||||||
"In education, Generative AI is used in:\n", |
|
||||||
"\n", |
|
||||||
"1. Adaptive Learning Systems\n", |
|
||||||
"2. Automated Grading and Feedback\n", |
|
||||||
"\n", |
|
||||||
"Generative AI has numerous applications across various industries, from creative content generation to predictive maintenance and supply chain optimization.\n", |
|
||||||
"\n", |
|
||||||
"Keep in mind that these are just a few examples of the many business applications of Generative AI as this technology continues to evolve at a rapid pace.\n" |
|
||||||
] |
|
||||||
} |
|
||||||
], |
|
||||||
"source": [ |
|
||||||
"# There's actually an alternative approach that some people might prefer\n", |
|
||||||
"# You can use the OpenAI client python library to call Ollama:\n", |
|
||||||
"\n", |
|
||||||
"from openai import OpenAI\n", |
|
||||||
"ollama_via_openai = OpenAI(base_url='http://localhost:11434/v1', api_key='ollama')\n", |
|
||||||
"\n", |
|
||||||
"response = ollama_via_openai.chat.completions.create(\n", |
|
||||||
" model=MODEL,\n", |
|
||||||
" messages=messages\n", |
|
||||||
")\n", |
|
||||||
"\n", |
|
||||||
"print(response.choices[0].message.content)" |
|
||||||
] |
|
||||||
}, |
|
||||||
{ |
|
||||||
"cell_type": "markdown", |
|
||||||
"id": "1622d9bb-5c68-4d4e-9ca4-b492c751f898", |
|
||||||
"metadata": {}, |
|
||||||
"source": [ |
|
||||||
"# NOW the exercise for you\n", |
|
||||||
"\n", |
|
||||||
"Take the code from day1 and incorporate it here, to build a website summarizer that uses Llama 3.2 running locally instead of OpenAI; use either of the above approaches." |
|
||||||
] |
|
||||||
}, |
|
||||||
{ |
|
||||||
"cell_type": "code", |
|
||||||
"execution_count": 28, |
|
||||||
"id": "de923314-a427-4199-b1f9-0e60f85114c3", |
|
||||||
"metadata": {}, |
|
||||||
"outputs": [], |
|
||||||
"source": [ |
|
||||||
"# imports\n", |
|
||||||
"\n", |
|
||||||
"import requests\n", |
|
||||||
"from bs4 import BeautifulSoup\n", |
|
||||||
"from IPython.display import Markdown, display\n", |
|
||||||
"\n", |
|
||||||
"# A class to represent a Webpage\n", |
|
||||||
"# If you're not familiar with Classes, check out the \"Intermediate Python\" notebook\n", |
|
||||||
"\n", |
|
||||||
"# Some websites need you to use proper headers when fetching them:\n", |
|
||||||
"headers = {\n", |
|
||||||
" \"User-Agent\": \"Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/117.0.0.0 Safari/537.36\"\n", |
|
||||||
"}\n", |
|
||||||
"\n", |
|
||||||
"class Website:\n", |
|
||||||
"\n", |
|
||||||
" def __init__(self, url):\n", |
|
||||||
" \"\"\"\n", |
|
||||||
" Create this Website object from the given url using the BeautifulSoup library\n", |
|
||||||
" \"\"\"\n", |
|
||||||
" self.url = url\n", |
|
||||||
" response = requests.get(url, headers=headers)\n", |
|
||||||
" soup = BeautifulSoup(response.content, 'html.parser')\n", |
|
||||||
" self.title = soup.title.string if soup.title else \"No title found\"\n", |
|
||||||
" for irrelevant in soup.body([\"script\", \"style\", \"img\", \"input\"]):\n", |
|
||||||
" irrelevant.decompose()\n", |
|
||||||
" self.text = soup.body.get_text(separator=\"\\n\", strip=True)" |
|
||||||
] |
|
||||||
}, |
|
||||||
{ |
|
||||||
"cell_type": "code", |
|
||||||
"execution_count": 31, |
|
||||||
"id": "0cedada6-adc6-40dc-bdf3-bc8a3b6b3826", |
|
||||||
"metadata": {}, |
|
||||||
"outputs": [ |
|
||||||
{ |
|
||||||
"name": "stdout", |
|
||||||
"output_type": "stream", |
|
||||||
"text": [ |
|
||||||
"Home\n", |
|
||||||
"Outsmart\n", |
|
||||||
"An arena that pits LLMs against each other in a battle of diplomacy and deviousness\n", |
|
||||||
"About\n", |
|
||||||
"Posts\n", |
|
||||||
"Well, hi there.\n", |
|
||||||
"I’m Ed. I like writing code and experimenting with LLMs, and hopefully you’re here because you do too. I also enjoy DJing (but I’m badly out of practice), amateur electronic music production (\n", |
|
||||||
"very\n", |
|
||||||
"amateur) and losing myself in\n", |
|
||||||
"Hacker News\n", |
|
||||||
", nodding my head sagely to things I only half understand.\n", |
|
||||||
"I’m the co-founder and CTO of\n", |
|
||||||
"Nebula.io\n", |
|
||||||
". We’re applying AI to a field where it can make a massive, positive impact: helping people discover their potential and pursue their reason for being. Recruiters use our product today to source, understand, engage and manage talent. I’m previously the founder and CEO of AI startup untapt,\n", |
|
||||||
"acquired in 2021\n", |
|
||||||
".\n", |
|
||||||
"We work with groundbreaking, proprietary LLMs verticalized for talent, we’ve\n", |
|
||||||
"patented\n", |
|
||||||
"our matching model, and our award-winning platform has happy customers and tons of press coverage.\n", |
|
||||||
"Connect\n", |
|
||||||
"with me for more!\n", |
|
||||||
"November 13, 2024\n", |
|
||||||
"Mastering AI and LLM Engineering – Resources\n", |
|
||||||
"October 16, 2024\n", |
|
||||||
"From Software Engineer to AI Data Scientist – resources\n", |
|
||||||
"August 6, 2024\n", |
|
||||||
"Outsmart LLM Arena – a battle of diplomacy and deviousness\n", |
|
||||||
"June 26, 2024\n", |
|
||||||
"Choosing the Right LLM: Toolkit and Resources\n", |
|
||||||
"Navigation\n", |
|
||||||
"Home\n", |
|
||||||
"Outsmart\n", |
|
||||||
"An arena that pits LLMs against each other in a battle of diplomacy and deviousness\n", |
|
||||||
"About\n", |
|
||||||
"Posts\n", |
|
||||||
"Get in touch\n", |
|
||||||
"ed [at] edwarddonner [dot] com\n", |
|
||||||
"www.edwarddonner.com\n", |
|
||||||
"Follow me\n", |
|
||||||
"LinkedIn\n", |
|
||||||
"Twitter\n", |
|
||||||
"Facebook\n", |
|
||||||
"Subscribe to newsletter\n", |
|
||||||
"Type your email…\n", |
|
||||||
"Subscribe\n" |
|
||||||
] |
|
||||||
} |
|
||||||
], |
|
||||||
"source": [ |
|
||||||
"# Let's try one out. Change the website and add print statements to follow along.\n", |
|
||||||
"\n", |
|
||||||
"web_res = Website(\"https://edwarddonner.com\")\n", |
|
||||||
"print(web_res.text)" |
|
||||||
] |
|
||||||
}, |
|
||||||
{ |
|
||||||
"cell_type": "code", |
|
||||||
"execution_count": 11, |
|
||||||
"id": "64d26055-756b-4095-a1d1-298fdf4fd8f1", |
|
||||||
"metadata": {}, |
|
||||||
"outputs": [], |
|
||||||
"source": [ |
|
||||||
"\n", |
|
||||||
"# Constants\n", |
|
||||||
"\n", |
|
||||||
"OLLAMA_API = \"http://localhost:11434/api/chat\"\n", |
|
||||||
"HEADERS = {\"Content-Type\": \"application/json\"}\n", |
|
||||||
"MODEL = \"llama3.2\"\n" |
|
||||||
] |
|
||||||
}, |
|
||||||
{ |
|
||||||
"cell_type": "code", |
|
||||||
"execution_count": 52, |
|
||||||
"id": "65b08550-7506-415f-8612-e2395d6e145d", |
|
||||||
"metadata": {}, |
|
||||||
"outputs": [], |
|
||||||
"source": [ |
|
||||||
"\n", |
|
||||||
"# Define our system prompt - you can experiment with this later, changing the last sentence to 'Respond in markdown in Spanish.\"\n", |
|
||||||
"\n", |
|
||||||
"system_prompt = \"You are an helper that assist user to provide crisp summary\\\n", |
|
||||||
"of the website they pass in, respond with key points\"\n", |
|
||||||
"\n", |
|
||||||
"# A function that writes a User Prompt that asks for summaries of websites:\n", |
|
||||||
"\n", |
|
||||||
"def user_prompt_for(website):\n", |
|
||||||
" user_prompt = f\"You are looking at a website titled {website.title}\"\n", |
|
||||||
" user_prompt += \"\\nThe contents of this website is as follows; \\\n", |
|
||||||
"please provide a short summary of this website in markdown. \\\n", |
|
||||||
"If it includes news or announcements, then summarize these too with start bulletin.\\n\\n\"\n", |
|
||||||
" user_prompt += website.text\n", |
|
||||||
" return user_prompt\n" |
|
||||||
] |
|
||||||
}, |
|
||||||
{ |
|
||||||
"cell_type": "code", |
|
||||||
"execution_count": 33, |
|
||||||
"id": "36a0a2d0-f07a-40ac-a065-b713cdd5c028", |
|
||||||
"metadata": {}, |
|
||||||
"outputs": [], |
|
||||||
"source": [ |
|
||||||
"# See how this function creates exactly the format above\n", |
|
||||||
"\n", |
|
||||||
"def messages_for(website):\n", |
|
||||||
" return [\n", |
|
||||||
" {\"role\": \"system\", \"content\": system_prompt},\n", |
|
||||||
" {\"role\": \"user\", \"content\": user_prompt_for(website)}\n", |
|
||||||
" ]\n" |
|
||||||
] |
|
||||||
}, |
|
||||||
{ |
|
||||||
"cell_type": "code", |
|
||||||
"execution_count": 50, |
|
||||||
"id": "8c2b20ea-6a8e-41c9-be3b-f24a5b29e8de", |
|
||||||
"metadata": {}, |
|
||||||
"outputs": [], |
|
||||||
"source": [ |
|
||||||
"#website search\n", |
|
||||||
"\n", |
|
||||||
"web_msg=Website(\"https://www.cricbuzz.com/cricket-match-squads/91796/aus-vs-ind-3rd-test-india-tour-of-australia-2024-25\")\n", |
|
||||||
"messages=messages_for(web_msg)\n", |
|
||||||
"\n", |
|
||||||
"payload = {\n", |
|
||||||
" \"model\": MODEL,\n", |
|
||||||
" \"messages\": messages,\n", |
|
||||||
" \"stream\": False\n", |
|
||||||
" }" |
|
||||||
] |
|
||||||
}, |
|
||||||
{ |
|
||||||
"cell_type": "code", |
|
||||||
"execution_count": 54, |
|
||||||
"id": "e5636b3b-7763-4f9c-ab18-88aa25b50de6", |
|
||||||
"metadata": {}, |
|
||||||
"outputs": [ |
|
||||||
{ |
|
||||||
"name": "stdout", |
|
||||||
"output_type": "stream", |
|
||||||
"text": [ |
|
||||||
"**Summary of the Website**\n", |
|
||||||
"=========================\n", |
|
||||||
"\n", |
|
||||||
"* The website provides live updates and information about the 3rd Test match between Australia and India as part of India's tour of Australia in the 2024-25 season.\n", |
|
||||||
"* It includes news, scores, stats, and analysis from the match.\n", |
|
||||||
"* The website is affiliated with Cricbuzz.com, a popular online cricket platform.\n", |
|
||||||
"\n", |
|
||||||
"**News and Announcements**\n", |
|
||||||
"==========================\n", |
|
||||||
"\n", |
|
||||||
"* **Rashid Khan to miss the rest of the series**: Australian all-rounder Mitchell Marsh's teammate Rashid Khan has been ruled out of the remaining Tests due to a knee injury.\n", |
|
||||||
"* **Bumrah to feature in the third Test**: Indian fast bowler Jasprit Bumrah is expected to return for the third Test, which starts on January 5 at the Sydney Cricket Ground.\n" |
|
||||||
] |
|
||||||
} |
|
||||||
], |
|
||||||
"source": [ |
|
||||||
"#Using Ollama to run it in the local\n", |
|
||||||
"response = requests.post(OLLAMA_API, json=payload, headers=HEADERS)\n", |
|
||||||
"print(response.json()['message']['content'])" |
|
||||||
] |
|
||||||
} |
|
||||||
], |
|
||||||
"metadata": { |
|
||||||
"kernelspec": { |
|
||||||
"display_name": "Python 3 (ipykernel)", |
|
||||||
"language": "python", |
|
||||||
"name": "python3" |
|
||||||
}, |
|
||||||
"language_info": { |
|
||||||
"codemirror_mode": { |
|
||||||
"name": "ipython", |
|
||||||
"version": 3 |
|
||||||
}, |
|
||||||
"file_extension": ".py", |
|
||||||
"mimetype": "text/x-python", |
|
||||||
"name": "python", |
|
||||||
"nbconvert_exporter": "python", |
|
||||||
"pygments_lexer": "ipython3", |
|
||||||
"version": "3.11.11" |
|
||||||
} |
|
||||||
}, |
|
||||||
"nbformat": 4, |
|
||||||
"nbformat_minor": 5 |
|
||||||
} |
|
@@ -0,0 +1,326 @@ |
|||||||
|
{ |
||||||
|
"cells": [ |
||||||
|
{ |
||||||
|
"cell_type": "markdown", |
||||||
|
"id": "d15d8294-3328-4e07-ad16-8a03e9bbfdb9", |
||||||
|
"metadata": {}, |
||||||
|
"source": [ |
||||||
|
"# Welcome to your first assignment!\n", |
||||||
|
"\n", |
||||||
|
"Instructions are below. Please give this a try, and look in the solutions folder if you get stuck (or feel free to ask me!)" |
||||||
|
] |
||||||
|
}, |
||||||
|
{ |
||||||
|
"cell_type": "markdown", |
||||||
|
"id": "ada885d9-4d42-4d9b-97f0-74fbbbfe93a9", |
||||||
|
"metadata": {}, |
||||||
|
"source": [ |
||||||
|
"<table style=\"margin: 0; text-align: left;\">\n", |
||||||
|
" <tr>\n", |
||||||
|
" <td style=\"width: 150px; height: 150px; vertical-align: middle;\">\n", |
||||||
|
" <img src=\"../resources.jpg\" width=\"150\" height=\"150\" style=\"display: block;\" />\n", |
||||||
|
" </td>\n", |
||||||
|
" <td>\n", |
||||||
|
" <h2 style=\"color:#f71;\">Just before we get to the assignment --</h2>\n", |
||||||
|
" <span style=\"color:#f71;\">I thought I'd take a second to point you at this page of useful resources for the course. This includes links to all the slides.<br/>\n", |
||||||
|
" <a href=\"https://edwarddonner.com/2024/11/13/llm-engineering-resources/\">https://edwarddonner.com/2024/11/13/llm-engineering-resources/</a><br/>\n", |
||||||
|
" Please keep this bookmarked, and I'll continue to add more useful links there over time.\n", |
||||||
|
" </span>\n", |
||||||
|
" </td>\n", |
||||||
|
" </tr>\n", |
||||||
|
"</table>" |
||||||
|
] |
||||||
|
}, |
||||||
|
{ |
||||||
|
"cell_type": "markdown", |
||||||
|
"id": "6e9fa1fc-eac5-4d1d-9be4-541b3f2b3458", |
||||||
|
"metadata": {}, |
||||||
|
"source": [ |
||||||
|
"# HOMEWORK EXERCISE ASSIGNMENT\n", |
||||||
|
"\n", |
||||||
|
"Upgrade the day 1 project to summarize a webpage to use an Open Source model running locally via Ollama rather than OpenAI\n", |
||||||
|
"\n", |
||||||
|
"You'll be able to use this technique for all subsequent projects if you'd prefer not to use paid APIs.\n", |
||||||
|
"\n", |
||||||
|
"**Benefits:**\n", |
||||||
|
"1. No API charges - open-source\n", |
||||||
|
"2. Data doesn't leave your box\n", |
||||||
|
"\n", |
||||||
|
"**Disadvantages:**\n", |
||||||
|
"1. Significantly less power than Frontier Model\n", |
||||||
|
"\n", |
||||||
|
"## Recap on installation of Ollama\n", |
||||||
|
"\n", |
||||||
|
"Simply visit [ollama.com](https://ollama.com) and install!\n", |
||||||
|
"\n", |
||||||
|
"Once complete, the ollama server should already be running locally. \n", |
||||||
|
"If you visit: \n", |
||||||
|
"[http://localhost:11434/](http://localhost:11434/)\n", |
||||||
|
"\n", |
||||||
|
"You should see the message `Ollama is running`. \n", |
||||||
|
"\n", |
||||||
|
"If not, bring up a new Terminal (Mac) or Powershell (Windows) and enter `ollama serve` \n", |
||||||
|
"And in another Terminal (Mac) or Powershell (Windows), enter `ollama pull llama3.2` \n", |
||||||
|
"Then try [http://localhost:11434/](http://localhost:11434/) again.\n", |
||||||
|
"\n", |
||||||
|
"If Ollama is slow on your machine, try using `llama3.2:1b` as an alternative. Run `ollama pull llama3.2:1b` from a Terminal or Powershell, and change the code below from `MODEL = \"llama3.2\"` to `MODEL = \"llama3.2:1b\"`" |
||||||
|
] |
||||||
|
}, |
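Before running the cells below, it can help to confirm that the local Ollama server is reachable and that the model has actually been pulled. Here is a minimal sketch of that check, assuming the default port 11434 and Ollama's `/api/tags` endpoint for listing locally available models:

```python
# Sanity check: is the local Ollama server up, and is llama3.2 available?
# Assumes the default port 11434 and the /api/tags endpoint for listing local models.
import requests

base_url = "http://localhost:11434"
print(requests.get(base_url).text)  # expect: "Ollama is running"

models = requests.get(f"{base_url}/api/tags").json().get("models", [])
print([m["name"] for m in models])  # look for an entry like "llama3.2:latest"
```

If the model isn't listed, run `ollama pull llama3.2` (or `llama3.2:1b`) and re-check.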
||||||
|
{ |
||||||
|
"cell_type": "code", |
||||||
|
"execution_count": null, |
||||||
|
"id": "4e2a9393-7767-488e-a8bf-27c12dca35bd", |
||||||
|
"metadata": {}, |
||||||
|
"outputs": [], |
||||||
|
"source": [ |
||||||
|
"# imports\n", |
||||||
|
"\n", |
||||||
|
"import requests\n", |
||||||
|
"from bs4 import BeautifulSoup\n", |
||||||
|
"from IPython.display import Markdown, display" |
||||||
|
] |
||||||
|
}, |
||||||
|
{ |
||||||
|
"cell_type": "code", |
||||||
|
"execution_count": null, |
||||||
|
"id": "29ddd15d-a3c5-4f4e-a678-873f56162724", |
||||||
|
"metadata": {}, |
||||||
|
"outputs": [], |
||||||
|
"source": [ |
||||||
|
"# Constants\n", |
||||||
|
"\n", |
||||||
|
"OLLAMA_API = \"http://localhost:11434/api/chat\"\n", |
||||||
|
"HEADERS = {\"Content-Type\": \"application/json\"}\n", |
||||||
|
"MODEL = \"llama3.2\"" |
||||||
|
] |
||||||
|
}, |
||||||
|
{ |
||||||
|
"cell_type": "code", |
||||||
|
"execution_count": null, |
||||||
|
"id": "dac0a679-599c-441f-9bf2-ddc73d35b940", |
||||||
|
"metadata": {}, |
||||||
|
"outputs": [], |
||||||
|
"source": [ |
||||||
|
"# Create a messages list using the same format that we used for OpenAI\n", |
||||||
|
"\n", |
||||||
|
"messages = [\n", |
||||||
|
" {\"role\": \"user\", \"content\": \"Describe some of the business applications of Generative AI\"}\n", |
||||||
|
"]" |
||||||
|
] |
||||||
|
}, |
||||||
|
{ |
||||||
|
"cell_type": "code", |
||||||
|
"execution_count": null, |
||||||
|
"id": "7bb9c624-14f0-4945-a719-8ddb64f66f47", |
||||||
|
"metadata": {}, |
||||||
|
"outputs": [], |
||||||
|
"source": [ |
||||||
|
"payload = {\n", |
||||||
|
" \"model\": MODEL,\n", |
||||||
|
" \"messages\": messages,\n", |
||||||
|
" \"stream\": False\n", |
||||||
|
" }" |
||||||
|
] |
||||||
|
}, |
||||||
|
{ |
||||||
|
"cell_type": "code", |
||||||
|
"execution_count": null, |
||||||
|
"id": "479ff514-e8bd-4985-a572-2ea28bb4fa40", |
||||||
|
"metadata": {}, |
||||||
|
"outputs": [], |
||||||
|
"source": [ |
||||||
|
"# Let's just make sure the model is loaded\n", |
||||||
|
"\n", |
||||||
|
"!ollama pull llama3.2" |
||||||
|
] |
||||||
|
}, |
||||||
|
{ |
||||||
|
"cell_type": "code", |
||||||
|
"execution_count": null, |
||||||
|
"id": "42b9f644-522d-4e05-a691-56e7658c0ea9", |
||||||
|
"metadata": {}, |
||||||
|
"outputs": [], |
||||||
|
"source": [ |
||||||
|
"# If this doesn't work for any reason, try the 2 versions in the following cells\n", |
||||||
|
"# And double check the instructions in the 'Recap on installation of Ollama' at the top of this lab\n", |
||||||
|
"# And if none of that works - contact me!\n", |
||||||
|
"\n", |
||||||
|
"response = requests.post(OLLAMA_API, json=payload, headers=HEADERS)\n", |
||||||
|
"print(response.json()['message']['content'])" |
||||||
|
] |
||||||
|
}, |
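Since the payload above sets `"stream": False`, the reply arrives in one piece. If you would rather watch tokens appear as they are generated, the same endpoint can stream newline-delimited JSON instead. A sketch, reusing the `OLLAMA_API`, `HEADERS`, `MODEL` and `messages` defined above; the chunk shape in the comments reflects Ollama's streaming format as I understand it:

```python
# Optional: the same call with streaming enabled. Ollama returns one JSON object
# per line, each carrying a fragment of the reply in message.content, with a
# final object marked "done": true.
import json
import requests

stream_payload = {"model": MODEL, "messages": messages, "stream": True}

with requests.post(OLLAMA_API, json=stream_payload, headers=HEADERS, stream=True) as r:
    for line in r.iter_lines():
        if not line:
            continue
        chunk = json.loads(line)
        print(chunk["message"]["content"], end="", flush=True)
        if chunk.get("done"):
            break
print()
```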
||||||
|
{ |
||||||
|
"cell_type": "markdown", |
||||||
|
"id": "6a021f13-d6a1-4b96-8e18-4eae49d876fe", |
||||||
|
"metadata": {}, |
||||||
|
"source": [ |
||||||
|
"# Introducing the ollama package\n", |
||||||
|
"\n", |
||||||
|
"And now we'll do the same thing, but using the elegant ollama python package instead of a direct HTTP call.\n", |
||||||
|
"\n", |
||||||
|
"Under the hood, it's making the same call as above to the ollama server running at localhost:11434" |
||||||
|
] |
||||||
|
}, |
||||||
|
{ |
||||||
|
"cell_type": "code", |
||||||
|
"execution_count": null, |
||||||
|
"id": "7745b9c4-57dc-4867-9180-61fa5db55eb8", |
||||||
|
"metadata": {}, |
||||||
|
"outputs": [], |
||||||
|
"source": [ |
||||||
|
"import ollama\n", |
||||||
|
"\n", |
||||||
|
"response = ollama.chat(model=MODEL, messages=messages)\n", |
||||||
|
"print(response['message']['content'])" |
||||||
|
] |
||||||
|
}, |
||||||
|
{ |
||||||
|
"cell_type": "markdown", |
||||||
|
"id": "a4704e10-f5fb-4c15-a935-f046c06fb13d", |
||||||
|
"metadata": {}, |
||||||
|
"source": [ |
||||||
|
"## Alternative approach - using OpenAI python library to connect to Ollama" |
||||||
|
] |
||||||
|
}, |
||||||
|
{ |
||||||
|
"cell_type": "code", |
||||||
|
"execution_count": null, |
||||||
|
"id": "23057e00-b6fc-4678-93a9-6b31cb704bff", |
||||||
|
"metadata": {}, |
||||||
|
"outputs": [], |
||||||
|
"source": [ |
||||||
|
"# There's actually an alternative approach that some people might prefer\n", |
||||||
|
"# You can use the OpenAI client python library to call Ollama:\n", |
||||||
|
"\n", |
||||||
|
"from openai import OpenAI\n", |
||||||
|
"ollama_via_openai = OpenAI(base_url='http://localhost:11434/v1', api_key='ollama')\n", |
||||||
|
"\n", |
||||||
|
"response = ollama_via_openai.chat.completions.create(\n", |
||||||
|
" model=MODEL,\n", |
||||||
|
" messages=messages\n", |
||||||
|
")\n", |
||||||
|
"\n", |
||||||
|
"print(response.choices[0].message.content)" |
||||||
|
] |
||||||
|
}, |
||||||
|
{ |
||||||
|
"cell_type": "markdown", |
||||||
|
"id": "1622d9bb-5c68-4d4e-9ca4-b492c751f898", |
||||||
|
"metadata": {}, |
||||||
|
"source": [ |
||||||
|
"# NOW the exercise for you\n", |
||||||
|
"\n", |
||||||
|
"Take the code from day1 and incorporate it here, to build a website summarizer that uses Llama 3.2 running locally instead of OpenAI; use either of the above approaches." |
||||||
|
] |
||||||
|
}, |
||||||
|
{ |
||||||
|
"cell_type": "code", |
||||||
|
"execution_count": null, |
||||||
|
"id": "402d5686-4e76-4110-b65a-b3906c35c0a4", |
||||||
|
"metadata": {}, |
||||||
|
"outputs": [], |
||||||
|
"source": [ |
||||||
|
"# imports\n", |
||||||
|
"\n", |
||||||
|
"import ollama\n", |
||||||
|
"import requests\n", |
||||||
|
"from bs4 import BeautifulSoup\n", |
||||||
|
"from IPython.display import Markdown, display\n", |
||||||
|
"from selenium import webdriver\n", |
||||||
|
"from selenium.webdriver.chrome.service import Service\n", |
||||||
|
"from selenium.webdriver.common.by import By\n", |
||||||
|
"from selenium.webdriver.chrome.options import Options\n", |
||||||
|
"from openai import OpenAI\n", |
||||||
|
"\n", |
||||||
|
"#!ollama pull llama3.2\n", |
||||||
|
"MODEL = \"llama3.2\"\n", |
||||||
|
"openai = OpenAI(base_url=\"http://localhost:11434/v1\", api_key=\"ollama\")\n" |
||||||
|
] |
||||||
|
}, |
||||||
|
{ |
||||||
|
"cell_type": "code", |
||||||
|
"execution_count": null, |
||||||
|
"id": "cca8ae91-ad1e-4239-951f-e1376a5ec934", |
||||||
|
"metadata": {}, |
||||||
|
"outputs": [], |
||||||
|
"source": [ |
||||||
|
"headers = {\n", |
||||||
|
" \"User-Agent\": \"Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/117.0.0.0 Safari/537.36\"\n", |
||||||
|
"}\n", |
||||||
|
"\n", |
||||||
|
"PATH_TO_CHROME_DRIVER = \"/path/to/chromedriver\"\n", |
||||||
|
"\n", |
||||||
|
"class Website:\n", |
||||||
|
"\n", |
||||||
|
" def __init__(self, url):\n", |
||||||
|
" self.url = url\n", |
||||||
|
"\n", |
||||||
|
" options = Options()\n", |
||||||
|
"\n", |
||||||
|
" options.add_argument(\"--no-sandbox\")\n", |
||||||
|
" options.add_argument(\"--disable-dev-shm-usage\")\n", |
||||||
|
"\n", |
||||||
|
" service = Service(PATH_TO_CHROME_DRIVER)\n", |
||||||
|
" driver = webdriver.Chrome(service=service, options=options)\n", |
||||||
|
" driver.get(url)\n", |
||||||
|
"\n", |
||||||
|
" input(\"Please complete the verification in the browser and press Enter to continue...\")\n", |
||||||
|
" page_source = driver.page_source\n", |
||||||
|
" driver.quit()\n", |
||||||
|
" \n", |
||||||
|
" soup = BeautifulSoup(page_source, 'html.parser')\n", |
||||||
|
" self.title = soup.title.string if soup.title else \"No title found\"\n", |
||||||
|
" for irrelevant in soup.body([\"script\", \"style\", \"img\", \"input\"]):\n", |
||||||
|
" irrelevant.decompose()\n", |
||||||
|
" self.text = soup.get_text(separator=\"\\n\", strip=True)\n", |
||||||
|
"\n", |
||||||
|
"def messages_for(website):\n", |
||||||
|
" return [{\"role\":\"system\", \"content\": \"You are a technology trainer, please read the content provided and highlight the key points in less than 200 words.\"},\n", |
||||||
|
" {\"role\":\"user\", \"content\":website.text}]\n", |
||||||
|
"\n", |
||||||
|
"def summarize(url):\n", |
||||||
|
" website = Website(url)\n", |
||||||
|
" response = openai.chat.completions.create(\n", |
||||||
|
" model = MODEL,\n", |
||||||
|
" messages = messages_for(website)\n", |
||||||
|
" )\n", |
||||||
|
" return response.choices[0].message.content\n", |
||||||
|
"\n", |
||||||
|
"def display_summary(url):\n", |
||||||
|
" summary = summarize(url)\n", |
||||||
|
" display(Markdown(summary))\n", |
||||||
|
" \n", |
||||||
|
"display_summary(\"https://martinfowler.com/articles/serverless.html\")" |
||||||
|
] |
||||||
|
}, |
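The solution above goes through the OpenAI-compatible client; since the exercise allows either approach, here is a sketch of the same summarizer routed through the `ollama` package instead. The `summarize_with_ollama` and `display_summary_with_ollama` names are purely illustrative, and the code reuses the `Website` class and `messages_for` helper defined above:

```python
# Alternative: route the summary through the ollama package rather than the
# OpenAI-compatible client. Reuses Website and messages_for from the cell above.
import ollama

def summarize_with_ollama(url):
    website = Website(url)
    response = ollama.chat(model=MODEL, messages=messages_for(website))
    return response['message']['content']

def display_summary_with_ollama(url):
    display(Markdown(summarize_with_ollama(url)))

# Usage (same site as above):
# display_summary_with_ollama("https://martinfowler.com/articles/serverless.html")
```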
||||||
|
{ |
||||||
|
"cell_type": "code", |
||||||
|
"execution_count": null, |
||||||
|
"id": "09f59679-22ff-46c4-a736-7309a6ca4365", |
||||||
|
"metadata": {}, |
||||||
|
"outputs": [], |
||||||
|
"source": [] |
||||||
|
} |
||||||
|
], |
||||||
|
"metadata": { |
||||||
|
"kernelspec": { |
||||||
|
"display_name": "Python 3 (ipykernel)", |
||||||
|
"language": "python", |
||||||
|
"name": "python3" |
||||||
|
}, |
||||||
|
"language_info": { |
||||||
|
"codemirror_mode": { |
||||||
|
"name": "ipython", |
||||||
|
"version": 3 |
||||||
|
}, |
||||||
|
"file_extension": ".py", |
||||||
|
"mimetype": "text/x-python", |
||||||
|
"name": "python", |
||||||
|
"nbconvert_exporter": "python", |
||||||
|
"pygments_lexer": "ipython3", |
||||||
|
"version": "3.11.11" |
||||||
|
} |
||||||
|
}, |
||||||
|
"nbformat": 4, |
||||||
|
"nbformat_minor": 5 |
||||||
|
} |