From the Udemy course on LLM engineering.
https://www.udemy.com/course/llm-engineering-master-ai-and-large-language-models
{
"cells": [
{
"cell_type": "markdown",
"id": "d15d8294-3328-4e07-ad16-8a03e9bbfdb9",
"metadata": {},
"source": [
"# Welcome to your first assignment!\n",
"\n",
"Instructions are below. Please give this a try, and look in the solutions folder if you get stuck (or feel free to ask me!)"
]
},
{
"cell_type": "markdown",
"id": "ada885d9-4d42-4d9b-97f0-74fbbbfe93a9",
"metadata": {},
"source": [
"<table style=\"margin: 0; text-align: left;\">\n",
"  <tr>\n",
"    <td style=\"width: 150px; height: 150px; vertical-align: middle;\">\n",
"      <img src=\"../resources.jpg\" width=\"150\" height=\"150\" style=\"display: block;\" />\n",
"    </td>\n",
"    <td>\n",
"      <h2 style=\"color:#f71;\">Just before we get to the assignment --</h2>\n",
"      <span style=\"color:#f71;\">I thought I'd take a second to point you at this page of useful resources for the course. This includes links to all the slides.<br/>\n",
"      <a href=\"https://edwarddonner.com/2024/11/13/llm-engineering-resources/\">https://edwarddonner.com/2024/11/13/llm-engineering-resources/</a><br/>\n",
"      Please keep this bookmarked, and I'll continue to add more useful links there over time.\n",
"      </span>\n",
"    </td>\n",
"  </tr>\n",
"</table>"
]
},
{
"cell_type": "markdown",
"id": "6e9fa1fc-eac5-4d1d-9be4-541b3f2b3458",
"metadata": {},
"source": [
"# HOMEWORK EXERCISE ASSIGNMENT\n",
"\n",
"Upgrade the day 1 webpage-summarizer project to use an open-source model running locally via Ollama rather than OpenAI.\n",
"\n",
"You'll be able to use this technique for all subsequent projects if you'd prefer not to use paid APIs.\n",
"\n",
"**Benefits:**\n",
"1. No API charges - it's open-source\n",
"2. Data doesn't leave your box\n",
"\n",
"**Disadvantages:**\n",
"1. Significantly less powerful than a frontier model\n",
"\n",
"## Recap on installation of Ollama\n",
"\n",
"Simply visit [ollama.com](https://ollama.com) and install!\n",
"\n",
"Once complete, the ollama server should already be running locally.  \n",
"If you visit:  \n",
"[http://localhost:11434/](http://localhost:11434/)\n",
"\n",
"You should see the message `Ollama is running`.  \n",
"\n",
"If not, bring up a new Terminal (Mac) or Powershell (Windows) and enter `ollama serve`  \n",
"And in another Terminal (Mac) or Powershell (Windows), enter `ollama pull llama3.2`  \n",
"Then try [http://localhost:11434/](http://localhost:11434/) again.\n",
"\n",
"If Ollama is slow on your machine, try using `llama3.2:1b` as an alternative. Run `ollama pull llama3.2:1b` from a Terminal or Powershell, and change the code below from `MODEL = \"llama3.2\"` to `MODEL = \"llama3.2:1b\"`"
]
},
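{
"cell_type": "code",
"execution_count": null,
"id": "ollama-server-check",
"metadata": {},
"outputs": [],
"source": [
"# Optional sanity check -- a minimal sketch, assuming Ollama is on its default port 11434.\n",
"# It just confirms the local server responds before we start calling it from code.\n",
"# If this raises a ConnectionError, run `ollama serve` in a Terminal / Powershell first.\n",
"\n",
"import requests\n",
"\n",
"print(requests.get(\"http://localhost:11434/\").text)  # expect: Ollama is running"
]
},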
{
"cell_type": "code",
"execution_count": 1,
"id": "4e2a9393-7767-488e-a8bf-27c12dca35bd",
"metadata": {},
"outputs": [],
"source": [
"# imports\n",
"\n",
"import requests\n",
"from bs4 import BeautifulSoup\n",
"from IPython.display import Markdown, display"
]
},
{
"cell_type": "code",
"execution_count": 2,
"id": "29ddd15d-a3c5-4f4e-a678-873f56162724",
"metadata": {},
"outputs": [],
"source": [
"# Constants\n",
"\n",
"OLLAMA_API = \"http://localhost:11434/api/chat\"\n",
"HEADERS = {\"Content-Type\": \"application/json\"}\n",
"MODEL = \"llama3.2\""
]
},
{
"cell_type": "code",
"execution_count": 3,
"id": "dac0a679-599c-441f-9bf2-ddc73d35b940",
"metadata": {},
"outputs": [],
"source": [
"# Create a messages list using the same format that we used for OpenAI\n",
"\n",
"messages = [\n",
"    {\"role\": \"user\", \"content\": \"Describe some of the business applications of Generative AI\"}\n",
"]"
]
},
{
"cell_type": "code",
"execution_count": 4,
"id": "7bb9c624-14f0-4945-a719-8ddb64f66f47",
"metadata": {},
"outputs": [],
"source": [
"payload = {\n",
"    \"model\": MODEL,\n",
"    \"messages\": messages,\n",
"    \"stream\": False\n",
"}"
]
},
{
"cell_type": "code",
"execution_count": 15,
"id": "4f2ae8e2",
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"NAME               ID              SIZE      MODIFIED     \n",
"llama3.2:latest    a80c4f17acd5    2.0 GB    3 minutes ago    \n"
]
}
],
"source": [
"!ollama list\n"
]
},
{
"cell_type": "code",
"execution_count": 9,
"id": "479ff514-e8bd-4985-a572-2ea28bb4fa40",
"metadata": {},
"outputs": [
{
"name": "stderr",
"output_type": "stream",
"text": [
"pulling manifest \n",
"pulling dde5aa3fc5ff... 100% ▕████████████████▏ 2.0 GB \n",
"pulling 966de95ca8a6... 100% ▕████████████████▏ 1.4 KB \n",
"pulling fcc5a6bec9da... 100% ▕████████████████▏ 7.7 KB \n",
"pulling a70ff7e570d9... 100% ▕████████████████▏ 6.0 KB \n",
"pulling 56bb8bd477a5... 100% ▕████████████████▏   96 B \n",
"pulling 34bb5ab01051... 100% ▕████████████████▏  561 B \n",
"verifying sha256 digest \n",
"writing manifest \n",
"success \n"
]
}
],
"source": [
"# Let's just make sure the model is loaded\n",
"\n",
"!ollama pull llama3.2"
]
},
{
"cell_type": "code",
"execution_count": 19,
"id": "42b9f644-522d-4e05-a691-56e7658c0ea9",
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"Generative AI has numerous business applications across various industries. Here are some examples:\n",
"\n",
"1. **Content Creation**: Generative AI can generate high-quality content such as articles, blog posts, social media posts, and even entire books. This technology is particularly useful for businesses that need to create a large amount of content quickly, but may not have the resources or expertise in-house.\n",
"2. **Visual Content Generation**: Generative AI can create visual content like images, videos, and 3D models. This technology is commonly used in advertising, marketing, and e-commerce to create visually appealing products, product demos, and social media campaigns.\n",
"3. **Chatbots and Virtual Assistants**: Generative AI can power chatbots and virtual assistants that provide customer support, answer frequently asked questions, and engage with customers in a more human-like way.\n",
"4. **Data Analysis and Insights**: Generative AI can analyze large datasets and generate insights, predictions, and recommendations. This technology is commonly used in data science, marketing analytics, and business intelligence to help organizations make data-driven decisions.\n",
"5. **Product Design and Development**: Generative AI can assist designers and engineers in product development by generating ideas, designs, and prototypes quickly and efficiently.\n",
"6. **Marketing Automation**: Generative AI can automate marketing campaigns by generating personalized content, emails, and social media posts based on customer behavior and preferences.\n",
"7. **Personalization**: Generative AI can help businesses personalize their products and services to individual customers based on their behavior, preferences, and demographics.\n",
"8. **Cybersecurity**: Generative AI can be used to detect and respond to cyber threats by analyzing network traffic, identifying patterns, and predicting potential attacks.\n",
"9. **Predictive Maintenance**: Generative AI can analyze equipment data and predict when maintenance is required, reducing downtime and increasing productivity.\n",
"10. **Autonomous Systems**: Generative AI can enable the development of autonomous systems that can operate independently, making decisions based on real-time data and feedback.\n",
"\n",
"Some specific industries that are already using generative AI include:\n",
"\n",
"1. **Retail**: Companies like Walmart and Amazon use generative AI to personalize customer experiences, optimize supply chains, and automate marketing campaigns.\n",
"2. **Finance**: Financial institutions like JPMorgan Chase and Goldman Sachs use generative AI to analyze financial data, predict market trends, and identify investment opportunities.\n",
"3. **Healthcare**: Medical organizations like the Mayo Clinic and Johnson & Johnson use generative AI to analyze medical images, diagnose diseases, and develop personalized treatment plans.\n",
"4. **Manufacturing**: Companies like Siemens and GE Aviation use generative AI to optimize production processes, design new products, and predict maintenance needs.\n",
"\n",
"These are just a few examples of the many business applications of Generative AI. As the technology continues to evolve, we can expect to see even more innovative uses across various industries.\n"
]
}
],
"source": [
"# If this doesn't work for any reason, try the 2 versions in the following cells\n",
"# And double check the instructions in the 'Recap on installation of Ollama' at the top of this lab\n",
"# And if none of that works - contact me!\n",
"\n",
"response = requests.post(OLLAMA_API, json=payload, headers=HEADERS)\n",
"print(response.json()['message']['content'])"
]
},
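{
"cell_type": "code",
"execution_count": null,
"id": "ollama-streaming-sketch",
"metadata": {},
"outputs": [],
"source": [
"# Optional: a streaming variant of the call above -- just a sketch, based on my understanding\n",
"# that the Ollama chat API returns newline-delimited JSON chunks when \"stream\" is true.\n",
"# Check the Ollama API docs if the response shape differs on your version.\n",
"\n",
"import json\n",
"import requests\n",
"\n",
"stream_payload = {\"model\": MODEL, \"messages\": messages, \"stream\": True}\n",
"\n",
"with requests.post(OLLAMA_API, json=stream_payload, headers=HEADERS, stream=True) as streaming_response:\n",
"    for line in streaming_response.iter_lines():\n",
"        if line:\n",
"            chunk = json.loads(line)\n",
"            print(chunk.get(\"message\", {}).get(\"content\", \"\"), end=\"\")"
]
},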
{
"cell_type": "markdown",
"id": "6a021f13-d6a1-4b96-8e18-4eae49d876fe",
"metadata": {},
"source": [
"# Introducing the ollama package\n",
"\n",
"And now we'll do the same thing, but using the elegant ollama python package instead of a direct HTTP call.\n",
"\n",
"Under the hood, it's making the same call as above to the ollama server running at localhost:11434"
]
},
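{
"cell_type": "code",
"execution_count": null,
"id": "install-ollama-package",
"metadata": {},
"outputs": [],
"source": [
"# The next cell needs the `ollama` python package. If it isn't already installed in your\n",
"# environment (the error output below shows what happens when it's missing), this should\n",
"# fetch it -- assuming pip is available in this notebook's kernel.\n",
"\n",
"!pip install -q ollama"
]
},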
{
"cell_type": "code",
"execution_count": 20,
"id": "7745b9c4-57dc-4867-9180-61fa5db55eb8",
"metadata": {},
"outputs": [
{
"ename": "ModuleNotFoundError",
"evalue": "No module named 'ollama'",
"output_type": "error",
"traceback": [
"\u001b[1;31m---------------------------------------------------------------------------\u001b[0m",
"\u001b[1;31mModuleNotFoundError\u001b[0m                       Traceback (most recent call last)",
"Cell \u001b[1;32mIn[20], line 1\u001b[0m\n\u001b[1;32m----> 1\u001b[0m \u001b[38;5;28;01mimport\u001b[39;00m\u001b[38;5;250m \u001b[39m\u001b[38;5;21;01mollama\u001b[39;00m\n\u001b[0;32m      3\u001b[0m response \u001b[38;5;241m=\u001b[39m ollama\u001b[38;5;241m.\u001b[39mchat(model\u001b[38;5;241m=\u001b[39mMODEL, messages\u001b[38;5;241m=\u001b[39mmessages)\n\u001b[0;32m      4\u001b[0m \u001b[38;5;28mprint\u001b[39m(response[\u001b[38;5;124m'\u001b[39m\u001b[38;5;124mmessage\u001b[39m\u001b[38;5;124m'\u001b[39m][\u001b[38;5;124m'\u001b[39m\u001b[38;5;124mcontent\u001b[39m\u001b[38;5;124m'\u001b[39m])\n",
"\u001b[1;31mModuleNotFoundError\u001b[0m: No module named 'ollama'"
]
}
],
"source": [
"import ollama\n",
"\n",
"response = ollama.chat(model=MODEL, messages=messages)\n",
"print(response['message']['content'])"
]
},
{
"cell_type": "markdown",
"id": "a4704e10-f5fb-4c15-a935-f046c06fb13d",
"metadata": {},
"source": [
"## Alternative approach - using OpenAI python library to connect to Ollama"
]
},
{
"cell_type": "code",
"execution_count": 17,
"id": "23057e00-b6fc-4678-93a9-6b31cb704bff",
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"Generative AI (GeAI) has numerous business applications across various industries. Here are some examples:\n",
"\n",
"1. **Content Generation**: GeAI can create high-quality content such as articles, social media posts, product descriptions, and even entire books. This technology is being used by companies to automate the creation of content, reducing the need for human writers and improving efficiency.\n",
"2. **Virtual Assistants**: GeAI-powered virtual assistants can be used to build conversational interfaces for customer service, sales, and other applications. These AI-powered chatbots can provide personalized support, recommend products, and help customers with queries.\n",
"3. **Marketing Automation**: GeAI can generate targeted marketing campaigns using real-time data analytics, social media insights, and behavioral patterns. This technology helps businesses personalize their marketing messages, improving engagement rates and conversion rates.\n",
"4. **Product Design and Development**: GeAI-powered design tools can create product prototypes, suggest design variations, and even automate product prototyping using AI-generated 3D models.\n",
"5. **Image Recognition and Classification**: GeAI can analyze large volumes of images and identify patterns, objects, and people. This technology is being used in applications such as facial recognition, object detection, and medical image analysis.\n",
"6. **Predictive Maintenance**: GeAI-powered predictive maintenance systems can predict equipment failures, detect anomalies in network traffic, and even recommend proactive maintenance schedules.\n",
"7. **Customer Service Chatbots**: GeAI-powered chatbots can provide 24/7 customer support, answering common queries, resolving basic issues, and routing complex problems to human support agents.\n",
"8. **Creative Writing Assistance**: GeAI-powered writing tools can assist writers with content ideas, suggestions for rewording or paraphrasing text, and optimizing the style of their work.\n",
"9. **Business Strategy Development**: GeAI can analyze large datasets, identify industry trends, and provide recommendations for business strategy development.\n",
"10. **Speech Synthesis and Voice Assistants**: GeAI-powered speech synthesis systems can generate natural-sounding voices, automating tasks such as text-to-speech and voice-based customer support.\n",
"\n",
"These are just a few examples of the many potential applications of Generative AI in various industries. As the technology advances, we can expect to see even more innovative uses of GeAI across businesses and organizations.\n",
"\n",
"In addition to these specific business applications, Generative AI also poses potential risks and challenges, including:\n",
"\n",
"* Job displacement\n",
"* Data quality issues\n",
"* Bias in decision-making systems\n",
"* Dependence on AI for critical tasks\n",
"\n",
"As with any emerging technology, it is essential for businesses to carefully consider the benefits and drawbacks of using GeAI and to develop responsible guidelines and regulations for its use.\n"
]
}
],
"source": [
"# There's actually an alternative approach that some people might prefer\n",
"# You can use the OpenAI client python library to call Ollama:\n",
"\n",
"from openai import OpenAI\n",
"ollama_via_openai = OpenAI(base_url='http://localhost:11434/v1', api_key='ollama')\n",
"\n",
"response = ollama_via_openai.chat.completions.create(\n",
"    model=MODEL,\n",
"    messages=messages\n",
")\n",
"\n",
"print(response.choices[0].message.content)"
]
},
{
"cell_type": "markdown",
"id": "9f9e22da-b891-41f6-9ac9-bd0c0a5f4f44",
"metadata": {},
"source": [
"## Are you confused about why that works?\n",
"\n",
"It seems strange, right? We just used OpenAI code to call Ollama?? What's going on?!\n",
"\n",
"Here's the scoop:\n",
"\n",
"The python class `OpenAI` is simply code written by OpenAI engineers that makes calls over the internet to an endpoint.  \n",
"\n",
"When you call `openai.chat.completions.create()`, this python code just makes a web request to the following url: \"https://api.openai.com/v1/chat/completions\"\n",
"\n",
"Code like this is known as a \"client library\" - it's just wrapper code that runs on your machine to make web requests. The actual power of GPT is running on OpenAI's cloud behind this API, not on your computer!\n",
"\n",
"OpenAI's API became so popular that lots of other AI providers now offer identical web endpoints, so you can use the same approach.\n",
"\n",
"So Ollama has an endpoint running on your local box at http://localhost:11434/v1/chat/completions  \n",
"And in week 2 we'll discover that lots of other providers do this too, including Gemini and DeepSeek.\n",
"\n",
"And then the team at OpenAI had a great idea: they extended their client library so you can specify a different 'base url' and use their library to call any compatible API.\n",
"\n",
"That's it!\n",
"\n",
"So when you say: `ollama_via_openai = OpenAI(base_url='http://localhost:11434/v1', api_key='ollama')`  \n",
"Then this will make the same endpoint calls, but to Ollama instead of OpenAI."
]
},
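{
"cell_type": "code",
"execution_count": null,
"id": "raw-openai-compatible-call",
"metadata": {},
"outputs": [],
"source": [
"# To make that concrete, here's the same request made \"by hand\" with requests against the\n",
"# OpenAI-compatible endpoint mentioned above -- a sketch to show the client library is just\n",
"# a wrapper around a web call. The response shape mirrors OpenAI's: choices -> message -> content.\n",
"\n",
"import requests\n",
"\n",
"raw_response = requests.post(\n",
"    \"http://localhost:11434/v1/chat/completions\",\n",
"    json={\"model\": MODEL, \"messages\": messages},\n",
"    headers={\"Content-Type\": \"application/json\"}\n",
")\n",
"print(raw_response.json()[\"choices\"][0][\"message\"][\"content\"])"
]
},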
{
"cell_type": "markdown",
"id": "bc7d1de3-e2ac-46ff-a302-3b4ba38c4c90",
"metadata": {},
"source": [
"## Also trying the amazing reasoning model DeepSeek\n",
"\n",
"Here we use the version of DeepSeek's reasoning model that's been distilled to 1.5B parameters.  \n",
"This is actually a 1.5B variant of Qwen that has been fine-tuned using synthetic data generated by DeepSeek R1.\n",
"\n",
"Other sizes of DeepSeek are [here](https://ollama.com/library/deepseek-r1), all the way up to the full 671B-parameter version, which would use up 404GB of your drive and is far too large for most!"
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "cf9eb44e-fe5b-47aa-b719-0bb63669ab3d",
"metadata": {},
"outputs": [],
"source": [
"!ollama pull deepseek-r1:1.5b"
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "1d3d554b-e00d-4c08-9300-45e073950a76",
"metadata": {},
"outputs": [],
"source": [
"# This may take a few minutes to run! You should then see a fascinating \"thinking\" trace inside <think> tags, followed by some decent definitions\n",
"\n",
"response = ollama_via_openai.chat.completions.create(\n",
"    model=\"deepseek-r1:1.5b\",\n",
"    messages=[{\"role\": \"user\", \"content\": \"Please give definitions of some core concepts behind LLMs: a neural network, attention and the transformer\"}]\n",
")\n",
"\n",
"print(response.choices[0].message.content)"
]
},
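{
"cell_type": "code",
"execution_count": null,
"id": "deepseek-think-split",
"metadata": {},
"outputs": [],
"source": [
"# The reasoning trace arrives inline inside <think>...</think> tags, so a simple string split\n",
"# is enough to separate it from the final answer. A sketch that assumes a single think block,\n",
"# which is what this model typically produces; run the cell above first so `response` exists.\n",
"\n",
"reply = response.choices[0].message.content\n",
"\n",
"if \"</think>\" in reply:\n",
"    reasoning, answer = reply.split(\"</think>\", 1)\n",
"    print(\"REASONING:\\n\", reasoning.replace(\"<think>\", \"\").strip())\n",
"    print(\"\\nANSWER:\\n\", answer.strip())\n",
"else:\n",
"    print(reply)"
]
},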
{
"cell_type": "markdown",
"id": "1622d9bb-5c68-4d4e-9ca4-b492c751f898",
"metadata": {},
"source": [
"# NOW the exercise for you\n",
"\n",
"Take the code from day 1 and incorporate it here to build a website summarizer that uses Llama 3.2 running locally instead of OpenAI; use either of the above approaches."
]
},
{
"cell_type": "code",
"execution_count": 23,
"id": "6de38216-6d1c-48c4-877b-86d403f4e0f8",
"metadata": {},
"outputs": [
{
"data": {
"text/markdown": [
"This is a basic HTML template for a website. Here's a summary of the key elements:\n",
"\n",
"1. **Head Section**:\n",
"\t* Specifies the character encoding (UTF-8) and viewport settings.\n",
"\t* Links to external resources, including:\n",
"\t\t+ Font Awesome CDN for icons\n",
"\t\t+ Google Fonts for typography (Inter)\n",
"\t\t+ A favicon image\n",
"\t* References an internal stylesheet (`style.css`) for CSS layout.\n",
"\n",
"2. **Title**:\n",
"\t* Sets the title of the webpage (\"Haq Nawaz - Bridging AI & Islamic Scholarship\").\n",
"\n",
"3. **JavaScript Code**:\n",
"\t* Handles menu toggling on mobile devices.\n",
"\t+ Adds a click event listener to toggle the `.nav-links` class and update the hamburger icon's text.\n",
"\t+ Closes the menu when clicking outside the `nav` element or a dropdown.\n",
"\n",
"4. **Layout**:\n",
"\t* The code snippets above suggest that the webpage uses a basic layout with a hamburger menu, which is toggled using JavaScript.\n",
"\n",
"However, without more information about the complete HTML structure and code, it's difficult to provide a detailed explanation of how this template interacts with other pages or external resources. \n",
"\n",
"There are several potential security issues:\n",
"\n",
"* The link integrity is specified incorrectly with `...your-integrity-key...\",` should be replaced by a valid hash for Font Awesome.\n",
"* The `crossorigin=\"anonymous\"`, setting prevents proper DNS lookup, and may cause issues.\n",
"\n",
"The given template seems to be missing a few sections typically found in an HTML page, such as:\n",
"\n",
"- `<body>` tag,\n",
"- Content,\n",
"- footer"
],
"text/plain": [
"<IPython.core.display.Markdown object>"
]
},
"metadata": {},
"output_type": "display_data"
}
],
"source": [
"from openai import OpenAI\n",
"from IPython.display import display, Markdown\n",
"\n",
"openai = OpenAI(\n",
"    base_url=\"http://localhost:11434/v1\",\n",
"    api_key=\"ollama\"\n",
")\n",
"\n",
"def messages_for(content):\n",
"    return [{\"role\": \"user\", \"content\": f\"Summarize this website:\\n{content}\"}]\n",
"\n",
"def summarize(url):\n",
"    import requests\n",
"    website = requests.get(url).text[:2000]  # Keep it short for local model\n",
"    response = openai.chat.completions.create(\n",
"        model=\"llama3.2\",\n",
"        messages=messages_for(website)\n",
"    )\n",
"    return response.choices[0].message.content\n",
"\n",
"def display_summary(url):\n",
"    summary = summarize(url)\n",
"    display(Markdown(summary))\n",
"\n",
"display_summary(\"https://haqnawaz.org\")"
]
},
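{
"cell_type": "code",
"execution_count": null,
"id": "summarizer-with-bs4",
"metadata": {},
"outputs": [],
"source": [
"# An alternative sketch that stays closer to the day 1 project: use BeautifulSoup (imported at\n",
"# the top of this notebook) to strip scripts and styles before summarizing, and call the model\n",
"# via the ollama package rather than the OpenAI client. Assumes the ollama package is installed\n",
"# (see the pip install cell earlier); the URL below is just an example.\n",
"\n",
"import requests\n",
"from bs4 import BeautifulSoup\n",
"from IPython.display import Markdown, display\n",
"import ollama\n",
"\n",
"def summarize_with_ollama(url):\n",
"    soup = BeautifulSoup(requests.get(url).text, \"html.parser\")\n",
"    for tag in soup([\"script\", \"style\", \"img\", \"input\"]):\n",
"        tag.decompose()\n",
"    text = soup.get_text(separator=\"\\n\", strip=True)[:2000]  # keep it short for a local model\n",
"    result = ollama.chat(\n",
"        model=MODEL,\n",
"        messages=[{\"role\": \"user\", \"content\": f\"Summarize this website:\\n{text}\"}]\n",
"    )\n",
"    return result[\"message\"][\"content\"]\n",
"\n",
"display(Markdown(summarize_with_ollama(\"https://edwarddonner.com\")))"
]
},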
{
"cell_type": "code",
"execution_count": null,
"id": "29563f16",
"metadata": {},
"outputs": [],
"source": []
}
],
"metadata": {
"kernelspec": {
"display_name": "venv",
"language": "python",
"name": "python3"
},
"language_info": {
"codemirror_mode": {
"name": "ipython",
"version": 3
},
"file_extension": ".py",
"mimetype": "text/x-python",
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.10.0rc2"
}
},
"nbformat": 4,
"nbformat_minor": 5
}
|
|
|