From the Udemy course on LLM engineering.
https://www.udemy.com/course/llm-engineering-master-ai-and-large-language-models
{
 "cells": [
  {
   "cell_type": "markdown",
   "id": "d15d8294-3328-4e07-ad16-8a03e9bbfdb9",
   "metadata": {},
   "source": [
    "# Welcome to your first assignment!\n",
    "\n",
    "Instructions are below. Please give this a try, and look in the solutions folder if you get stuck (or feel free to ask me!)"
   ]
  },
  {
   "cell_type": "markdown",
   "id": "ada885d9-4d42-4d9b-97f0-74fbbbfe93a9",
   "metadata": {},
   "source": [
    "<table style=\"margin: 0; text-align: left;\">\n",
    "    <tr>\n",
    "        <td style=\"width: 150px; height: 150px; vertical-align: middle;\">\n",
    "            <img src=\"../resources.jpg\" width=\"150\" height=\"150\" style=\"display: block;\" />\n",
    "        </td>\n",
    "        <td>\n",
    "            <h2 style=\"color:#f71;\">Just before we get to the assignment --</h2>\n",
    "            <span style=\"color:#f71;\">I thought I'd take a second to point you at this page of useful resources for the course. This includes links to all the slides.<br/>\n",
    "            <a href=\"https://edwarddonner.com/2024/11/13/llm-engineering-resources/\">https://edwarddonner.com/2024/11/13/llm-engineering-resources/</a><br/>\n",
    "            Please keep this bookmarked, and I'll continue to add more useful links there over time.\n",
    "            </span>\n",
    "        </td>\n",
    "    </tr>\n",
    "</table>"
   ]
  },
  {
   "cell_type": "markdown",
   "id": "6e9fa1fc-eac5-4d1d-9be4-541b3f2b3458",
   "metadata": {},
   "source": [
    "# HOMEWORK EXERCISE ASSIGNMENT\n",
    "\n",
    "Upgrade the day 1 webpage-summarizer project to use an open-source model running locally via Ollama rather than OpenAI.\n",
    "\n",
    "You'll be able to use this technique for all subsequent projects if you'd prefer not to use paid APIs.\n",
    "\n",
    "**Benefits:**\n",
    "1. No API charges - it's open source\n",
    "2. Data doesn't leave your box\n",
    "\n",
    "**Disadvantages:**\n",
    "1. Significantly less powerful than a frontier model\n",
    "\n",
    "## Recap on installation of Ollama\n",
    "\n",
    "Simply visit [ollama.com](https://ollama.com) and install!\n",
    "\n",
    "Once complete, the Ollama server should already be running locally.  \n",
    "If you visit:  \n",
    "[http://localhost:11434/](http://localhost:11434/)\n",
    "\n",
    "You should see the message `Ollama is running` - the optional cell just below runs this check from Python.  \n",
    "\n",
    "If not, bring up a new Terminal (Mac) or PowerShell (Windows) and enter `ollama serve`  \n",
    "And in another Terminal (Mac) or PowerShell (Windows), enter `ollama pull llama3.2`  \n",
    "Then try [http://localhost:11434/](http://localhost:11434/) again.\n",
    "\n",
    "If Ollama is slow on your machine, try using `llama3.2:1b` as an alternative. Run `ollama pull llama3.2:1b` from a Terminal or PowerShell, and change the code below from `MODEL = \"llama3.2\"` to `MODEL = \"llama3.2:1b\"`"
   ]
  },
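  {
   "cell_type": "code",
   "execution_count": null,
   "id": "0a1b2c3d-9f01-4e5f-8a9b-0c1d2e3f4a5b",
   "metadata": {},
   "outputs": [],
   "source": [
    "# Optional sanity check (not part of the original exercise): confirm the local Ollama server is reachable\n",
    "# before going further. A minimal sketch, assuming Ollama is listening on its default port 11434.\n",
    "\n",
    "import requests\n",
    "\n",
    "try:\n",
    "    print(requests.get(\"http://localhost:11434/\", timeout=5).text)  # expect: \"Ollama is running\"\n",
    "except requests.exceptions.ConnectionError:\n",
    "    print(\"Ollama doesn't seem to be running - try `ollama serve` in a terminal first\")"
   ]
  },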
  {
   "cell_type": "code",
   "execution_count": 1,
   "id": "4e2a9393-7767-488e-a8bf-27c12dca35bd",
   "metadata": {},
   "outputs": [],
   "source": [
    "# imports\n",
    "\n",
    "import requests\n",
    "from bs4 import BeautifulSoup\n",
    "from IPython.display import Markdown, display"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 2,
   "id": "29ddd15d-a3c5-4f4e-a678-873f56162724",
   "metadata": {},
   "outputs": [],
   "source": [
    "# Constants\n",
    "\n",
    "OLLAMA_API = \"http://localhost:11434/api/chat\"\n",
    "HEADERS = {\"Content-Type\": \"application/json\"}\n",
    "MODEL = \"llama3.2\""
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 3,
   "id": "dac0a679-599c-441f-9bf2-ddc73d35b940",
   "metadata": {},
   "outputs": [],
   "source": [
    "# Create a messages list using the same format that we used for OpenAI\n",
    "\n",
    "messages = [\n",
    "    {\"role\": \"user\", \"content\": \"Describe some of the business applications of Generative AI\"}\n",
    "]"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 4,
   "id": "7bb9c624-14f0-4945-a719-8ddb64f66f47",
   "metadata": {},
   "outputs": [],
   "source": [
    "# The request payload for Ollama's /api/chat endpoint - same messages shape as OpenAI expects\n",
    "\n",
    "payload = {\n",
    "    \"model\": MODEL,\n",
    "    \"messages\": messages,\n",
    "    \"stream\": False\n",
    "}"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 5,
   "id": "479ff514-e8bd-4985-a572-2ea28bb4fa40",
   "metadata": {},
   "outputs": [
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "\u001b[?25lpulling manifest ⠋ \u001b[?25h\u001b[?25l\u001b[2K\u001b[1Gpulling manifest ⠙ \u001b[?25h\u001b[?25l\u001b[2K\u001b[1Gpulling manifest ⠹ \u001b[?25h\u001b[?25l\u001b[2K\u001b[1Gpulling manifest ⠸ \u001b[?25h\u001b[?25l\u001b[2K\u001b[1Gpulling manifest ⠼ \u001b[?25h\u001b[?25l\u001b[2K\u001b[1Gpulling manifest ⠴ \u001b[?25h\u001b[?25l\u001b[2K\u001b[1Gpulling manifest ⠦ \u001b[?25h\u001b[?25l\u001b[2K\u001b[1Gpulling manifest \n",
      "pulling dde5aa3fc5ff... 100% ▕████████████████▏ 2.0 GB \n",
      "pulling 966de95ca8a6... 100% ▕████████████████▏ 1.4 KB \n",
      "pulling fcc5a6bec9da... 100% ▕████████████████▏ 7.7 KB \n",
      "pulling a70ff7e570d9... 100% ▕████████████████▏ 6.0 KB \n",
      "pulling 56bb8bd477a5... 100% ▕████████████████▏ 96 B \n",
      "pulling 34bb5ab01051... 100% ▕████████████████▏ 561 B \n",
      "verifying sha256 digest \n",
      "writing manifest \n",
      "success \u001b[?25h\n"
     ]
    }
   ],
   "source": [
    "# Let's just make sure the model is loaded\n",
    "\n",
    "!ollama pull llama3.2"
   ]
  },
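  {
   "cell_type": "code",
   "execution_count": null,
   "id": "1b2c3d4e-9f02-4f6a-9b7c-1d2e3f4a5b6c",
   "metadata": {},
   "outputs": [],
   "source": [
    "# Optional: list the models you have pulled locally (ollama list is part of the standard Ollama CLI).\n",
    "# llama3.2 should appear here if the pull above succeeded.\n",
    "\n",
    "!ollama list"
   ]
  },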
  {
   "cell_type": "code",
   "execution_count": 6,
   "id": "42b9f644-522d-4e05-a691-56e7658c0ea9",
   "metadata": {},
   "outputs": [
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "Generative AI has numerous business applications across various industries, including:\n",
      "\n",
      "1. **Content Generation**: Generate high-quality content such as articles, social media posts, product descriptions, and more, saving time and resources for content creation teams.\n",
      "2. **Image and Video Creation**: Use generative models to create visually appealing images, videos, and 3D models for marketing campaigns, advertising, and entertainment.\n",
      "3. **Chatbots and Virtual Assistants**: Develop conversational interfaces that can understand and respond to customer inquiries, providing 24/7 support and improving customer satisfaction.\n",
      "4. **Product Design and Development**: Utilize generative design tools to create innovative product designs, reducing the need for physical prototypes and accelerating the product development cycle.\n",
      "5. **Marketing Automation**: Leverage generative AI to personalize marketing campaigns, predict customer behavior, and optimize sales funnels.\n",
      "6. **Financial Modeling and Analysis**: Use generative models to build complex financial models, forecast market trends, and identify potential investment opportunities.\n",
      "7. **Customer Segmentation and Profiling**: Develop advanced customer segmentation models that can identify high-value customers and provide personalized recommendations.\n",
      "8. **Language Translation**: Generate human-like translations for text, speech, and audio content, breaking language barriers and facilitating global communication.\n",
      "9. **Music and Audio Generation**: Create unique music tracks, sound effects, and voiceovers for film, TV, and advertising productions.\n",
      "10. **Data Augmentation**: Use generative models to augment existing datasets, increasing the size and diversity of training data, improving model performance, and reducing data bias.\n",
      "\n",
      "Some specific business use cases include:\n",
      "\n",
      "* **Product Recommendation Engine**: Develop a platform that recommends products based on customer behavior, preferences, and purchase history.\n",
      "* **Social Media Content Calendar**: Generate social media content in advance using generative models, ensuring consistent branding and messaging across platforms.\n",
      "* **Personalized Email Marketing**: Create customized email campaigns that address individual customers' interests, preferences, and behavior.\n",
      "* **Automated Customer Service**: Develop chatbots that can understand and respond to customer inquiries, providing 24/7 support and improving customer satisfaction.\n",
      "\n",
      "These are just a few examples of the many business applications of Generative AI. As the technology continues to evolve, we can expect to see even more innovative use cases emerge across various industries.\n"
     ]
    }
   ],
   "source": [
    "# If this doesn't work for any reason, try the 2 versions in the following cells\n",
    "# And double check the instructions in the 'Recap on installation of Ollama' at the top of this lab\n",
    "# And if none of that works - contact me!\n",
    "\n",
    "response = requests.post(OLLAMA_API, json=payload, headers=HEADERS)\n",
    "print(response.json()['message']['content'])"
   ]
  },
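  {
   "cell_type": "code",
   "execution_count": null,
   "id": "2c3d4e5f-9f03-4a7b-8c9d-2e3f4a5b6c7d",
   "metadata": {},
   "outputs": [],
   "source": [
    "# Optional peek at everything else the /api/chat endpoint returns besides the message itself.\n",
    "# Exact field names can vary between Ollama versions, so we just print whatever keys are present.\n",
    "\n",
    "result = response.json()\n",
    "for key, value in result.items():\n",
    "    if key != 'message':\n",
    "        print(f\"{key}: {value}\")"
   ]
  },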
  {
   "cell_type": "markdown",
   "id": "6a021f13-d6a1-4b96-8e18-4eae49d876fe",
   "metadata": {},
   "source": [
    "# Introducing the ollama package\n",
    "\n",
    "And now we'll do the same thing, but using the elegant ollama python package instead of a direct HTTP call.\n",
    "\n",
    "Under the hood, it's making the same call as above to the ollama server running at localhost:11434"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 7,
   "id": "7745b9c4-57dc-4867-9180-61fa5db55eb8",
   "metadata": {},
   "outputs": [
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "Generative AI has numerous business applications across various industries, including:\n",
      "\n",
      "1. **Content Generation**: Generative AI can create high-quality content such as blog posts, social media posts, product descriptions, and more. This can help businesses save time and resources while maintaining a consistent tone and style.\n",
      "2. **Image and Video Creation**: Generative AI can generate images and videos for various purposes, including marketing materials, product demonstrations, and entertainment. For example, AI-generated videos can be used to create explainer content or social media clips.\n",
      "3. **Chatbots and Virtual Assistants**: Generative AI can power chatbots and virtual assistants that provide customer support, answer frequently asked questions, and help with tasks such as booking appointments or making reservations.\n",
      "4. **Product Design and Development**: Generative AI can assist in the design and development of new products by generating 3D models, prototypes, and even entire product lines. This can help businesses reduce costs and speed up product development.\n",
      "5. **Marketing Automation**: Generative AI can automate marketing tasks such as email campaigns, social media advertising, and lead generation. This can help businesses personalize their marketing efforts and reach a wider audience.\n",
      "6. **Predictive Maintenance**: Generative AI can analyze data from sensors and machines to predict when maintenance is required, reducing downtime and increasing overall efficiency.\n",
      "7. **Customer Segmentation**: Generative AI can analyze customer data and behavior patterns to identify new segments and tailor marketing efforts to specific groups.\n",
      "8. **Data Visualization**: Generative AI can create interactive visualizations of complex data sets, helping businesses communicate insights and trends more effectively.\n",
      "9. **Sales Forecasting**: Generative AI can analyze historical sales data and market trends to predict future sales performance, enabling businesses to make more informed decisions.\n",
      "10. **Language Translation**: Generative AI can translate text and speech in real-time, breaking down language barriers and enabling global communication.\n",
      "\n",
      "Some specific business applications of Generative AI include:\n",
      "\n",
      "* **Amazon's Product Recommendations**: Amazon uses Generative AI to create personalized product recommendations based on customer behavior and preferences.\n",
      "* **Google's Search Results**: Google's search results are influenced by Generative AI, which helps rank relevant content and provide more accurate answers to user queries.\n",
      "* **Salesforce's Chatbots**: Salesforce uses Generative AI-powered chatbots to provide customer support and answer frequently asked questions.\n",
      "* **Netflix's Content Recommendations**: Netflix uses Generative AI to recommend personalized content to its users based on their viewing history and preferences.\n",
      "\n",
      "These are just a few examples of the many business applications of Generative AI. As the technology continues to evolve, we can expect to see even more innovative uses across various industries.\n"
     ]
    }
   ],
   "source": [
    "import ollama\n",
    "\n",
    "response = ollama.chat(model=MODEL, messages=messages)\n",
    "print(response['message']['content'])"
   ]
  },
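  {
   "cell_type": "code",
   "execution_count": null,
   "id": "3d4e5f6a-9f04-4b8c-9d0e-3f4a5b6c7d8e",
   "metadata": {},
   "outputs": [],
   "source": [
    "# Optional extra: the ollama package can also stream the reply token by token.\n",
    "# A minimal sketch, assuming stream=True returns an iterator of chunks as in the ollama package docs -\n",
    "# if your version behaves differently, just stick with the non-streaming call above.\n",
    "\n",
    "stream = ollama.chat(model=MODEL, messages=messages, stream=True)\n",
    "for chunk in stream:\n",
    "    print(chunk['message']['content'], end='', flush=True)"
   ]
  },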
  {
   "cell_type": "markdown",
   "id": "a4704e10-f5fb-4c15-a935-f046c06fb13d",
   "metadata": {},
   "source": [
    "## Alternative approach - using OpenAI python library to connect to Ollama"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "id": "23057e00-b6fc-4678-93a9-6b31cb704bff",
   "metadata": {},
   "outputs": [],
   "source": [
    "# There's actually an alternative approach that some people might prefer\n",
    "# You can use the OpenAI client python library to call Ollama:\n",
    "\n",
    "from openai import OpenAI\n",
    "ollama_via_openai = OpenAI(base_url='http://localhost:11434/v1', api_key='ollama')\n",
    "\n",
    "response = ollama_via_openai.chat.completions.create(\n",
    "    model=MODEL,\n",
    "    messages=messages\n",
    ")\n",
    "\n",
    "print(response.choices[0].message.content)"
   ]
  },
  {
   "cell_type": "markdown",
   "id": "bc7d1de3-e2ac-46ff-a302-3b4ba38c4c90",
   "metadata": {},
   "source": [
    "## Also trying the amazing reasoning model DeepSeek\n",
    "\n",
    "Here we use the version of DeepSeek-reasoner that's been distilled to 1.5B.  \n",
    "This is actually a 1.5B variant of Qwen that has been fine-tuned using synthetic data generated by DeepSeek R1.\n",
    "\n",
    "Other sizes of DeepSeek are [here](https://ollama.com/library/deepseek-r1), all the way up to the full 671B parameter version, which would use up 404GB of your drive and is far too large for most!"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "id": "cf9eb44e-fe5b-47aa-b719-0bb63669ab3d",
   "metadata": {},
   "outputs": [],
   "source": [
    "!ollama pull deepseek-r1:1.5b"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "id": "1d3d554b-e00d-4c08-9300-45e073950a76",
   "metadata": {},
   "outputs": [],
   "source": [
    "# This may take a few minutes to run! You should then see a fascinating \"thinking\" trace inside <think> tags, followed by some decent definitions\n",
    "\n",
    "response = ollama_via_openai.chat.completions.create(\n",
    "    model=\"deepseek-r1:1.5b\",\n",
    "    messages=[{\"role\": \"user\", \"content\": \"Please give definitions of some core concepts behind LLMs: a neural network, attention and the transformer\"}]\n",
    ")\n",
    "\n",
    "print(response.choices[0].message.content)"
   ]
  },
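  {
   "cell_type": "code",
   "execution_count": null,
   "id": "4e5f6a7b-9f05-4c9d-8e1f-4a5b6c7d8e9f",
   "metadata": {},
   "outputs": [],
   "source": [
    "# Optional: separate the \"thinking\" trace from the final answer.\n",
    "# A small sketch assuming the reply wraps its reasoning in <think>...</think> as described above;\n",
    "# if the tags aren't present, we just print the whole reply.\n",
    "\n",
    "reply = response.choices[0].message.content\n",
    "if \"</think>\" in reply:\n",
    "    thinking, answer = reply.split(\"</think>\", 1)\n",
    "    print(\"ANSWER ONLY:\\n\")\n",
    "    print(answer.strip())\n",
    "else:\n",
    "    print(reply)"
   ]
  },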
  {
   "cell_type": "markdown",
   "id": "1622d9bb-5c68-4d4e-9ca4-b492c751f898",
   "metadata": {},
   "source": [
    "# NOW the exercise for you\n",
    "\n",
    "Take the code from day 1 and incorporate it here, to build a website summarizer that uses Llama 3.2 running locally instead of OpenAI; use either of the above approaches."
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 14,
   "id": "6de38216-6d1c-48c4-877b-86d403f4e0f8",
   "metadata": {},
   "outputs": [
    {
     "data": {
      "text/markdown": [
       "# Website Summary\n",
       "The website is hosted by Edward Donner, a co-founder and CTO of Nebula.io, an AI startup that applies AI to help people discover their potential. The website appears to be a personal blog or profile page, showcasing his interests in LLMs (Large Language Models) and AI.\n",
       "\n",
       "### News and Announcements\n",
       "\n",
       "* **Upcoming Event**: January 23, 2025 - LLM Workshop – Hands-on with Agents – resources\n",
       "* **Past Events**:\n",
       "\t+ December 21, 2024 - Welcome, SuperDataScientists!\n",
       "\t+ November 13, 2024 - Mastering AI and LLM Engineering – Resources\n",
       "\t+ October 16, 2024 - From Software Engineer to AI Data Scientist – resources"
      ],
      "text/plain": [
       "<IPython.core.display.Markdown object>"
      ]
     },
     "metadata": {},
     "output_type": "display_data"
    }
   ],
   "source": [
    "import os\n",
    "import requests\n",
    "from dotenv import load_dotenv\n",
    "from bs4 import BeautifulSoup\n",
    "from IPython.display import Markdown, display\n",
    "from openai import OpenAI\n",
    "\n",
    "# Some websites block requests that don't send a browser-like User-Agent header\n",
    "headers = {\n",
    "    \"User-Agent\": \"Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/117.0.0.0 Safari/537.36\"\n",
    "}\n",
    "\n",
    "system_prompt = \"You are an assistant that analyzes the contents of a website \\\n",
    "and provides a short summary, ignoring text that might be navigation related. \\\n",
    "Respond in markdown.\"\n",
    "\n",
    "def user_prompt_for(website):\n",
    "    user_prompt = f\"You are looking at a website titled {website.title}\"\n",
    "    user_prompt += \"\\nThe contents of this website is as follows; \\\n",
    "please provide a short summary of this website in markdown. \\\n",
    "If it includes news or announcements, then summarize these too.\\n\\n\"\n",
    "    user_prompt += website.text\n",
    "    return user_prompt\n",
    "\n",
    "class Website:\n",
    "\n",
    "    def __init__(self, url):\n",
    "        \"\"\"\n",
    "        Create this Website object from the given url using the BeautifulSoup library\n",
    "        \"\"\"\n",
    "        self.url = url\n",
    "        response = requests.get(url, headers=headers)\n",
    "        soup = BeautifulSoup(response.content, 'html.parser')\n",
    "        self.title = soup.title.string if soup.title else \"No title found\"\n",
    "        for irrelevant in soup.body([\"script\", \"style\", \"img\", \"input\"]):\n",
    "            irrelevant.decompose()\n",
    "        self.text = soup.body.get_text(separator=\"\\n\", strip=True)\n",
    "\n",
    "def messages_for(website):\n",
    "    return [\n",
    "        {\"role\": \"system\", \"content\": system_prompt},\n",
    "        {\"role\": \"user\", \"content\": user_prompt_for(website)}\n",
    "    ]\n",
    "\n",
    "def generate_payload(website):\n",
    "    return {\n",
    "        \"model\": MODEL,\n",
    "        \"messages\": messages_for(website),\n",
    "        \"stream\": False\n",
    "    }\n",
    "\n",
    "def summarize(url):\n",
    "    website = Website(url)\n",
    "    # Call the local Ollama server rather than OpenAI\n",
    "    response = requests.post(OLLAMA_API, json=generate_payload(website), headers=HEADERS)\n",
    "    return response.json()['message']['content']\n",
    "\n",
    "def display_summary(url):\n",
    "    summary = summarize(url)\n",
    "    display(Markdown(summary))\n",
    "\n",
    "display_summary(\"https://edwarddonner.com\")"
   ]
  },
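  {
   "cell_type": "code",
   "execution_count": null,
   "id": "5f6a7b8c-9f06-4d0e-9f2a-5b6c7d8e9f0a",
   "metadata": {},
   "outputs": [],
   "source": [
    "# The same summarizer, but using the ollama package instead of a raw HTTP call -\n",
    "# one possible sketch of the \"either of the above approaches\" alternative, reusing the\n",
    "# Website class and messages_for() defined in the solution above.\n",
    "\n",
    "import ollama\n",
    "\n",
    "def summarize_with_ollama_package(url):\n",
    "    website = Website(url)\n",
    "    response = ollama.chat(model=MODEL, messages=messages_for(website))\n",
    "    return response['message']['content']\n",
    "\n",
    "display(Markdown(summarize_with_ollama_package(\"https://edwarddonner.com\")))"
   ]
  },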
  {
   "cell_type": "code",
   "execution_count": null,
   "id": "fc350dae-4666-4627-bd12-668c2399e4aa",
   "metadata": {},
   "outputs": [],
   "source": []
  }
 ],
 "metadata": {
  "kernelspec": {
   "display_name": "Python 3 (ipykernel)",
   "language": "python",
   "name": "python3"
  },
  "language_info": {
   "codemirror_mode": {
    "name": "ipython",
    "version": 3
   },
   "file_extension": ".py",
   "mimetype": "text/x-python",
   "name": "python",
   "nbconvert_exporter": "python",
   "pygments_lexer": "ipython3",
   "version": "3.11.11"
  }
 },
 "nbformat": 4,
 "nbformat_minor": 5
}