"Cell \u001b[0;32mIn[9], line 3\u001b[0m\n\u001b[1;32m 1\u001b[0m \u001b[38;5;66;03m# This code will fail with an error because one of our books doesn't have an author\u001b[39;00m\n\u001b[0;32m----> 3\u001b[0m \u001b[43m[\u001b[49m\u001b[43mbook\u001b[49m\u001b[43m[\u001b[49m\u001b[38;5;124;43m'\u001b[39;49m\u001b[38;5;124;43mauthor\u001b[39;49m\u001b[38;5;124;43m'\u001b[39;49m\u001b[43m]\u001b[49m\u001b[43m \u001b[49m\u001b[38;5;28;43;01mfor\u001b[39;49;00m\u001b[43m \u001b[49m\u001b[43mbook\u001b[49m\u001b[43m \u001b[49m\u001b[38;5;129;43;01min\u001b[39;49;00m\u001b[43m \u001b[49m\u001b[43mbooks\u001b[49m\u001b[43m]\u001b[49m\n",
"Cell \u001b[0;32mIn[9], line 3\u001b[0m, in \u001b[0;36m<listcomp>\u001b[0;34m(.0)\u001b[0m\n\u001b[1;32m 1\u001b[0m \u001b[38;5;66;03m# This code will fail with an error because one of our books doesn't have an author\u001b[39;00m\n\u001b[0;32m----> 3\u001b[0m [\u001b[43mbook\u001b[49m\u001b[43m[\u001b[49m\u001b[38;5;124;43m'\u001b[39;49m\u001b[38;5;124;43mauthor\u001b[39;49m\u001b[38;5;124;43m'\u001b[39;49m\u001b[43m]\u001b[49m \u001b[38;5;28;01mfor\u001b[39;00m book \u001b[38;5;129;01min\u001b[39;00m books]\n",
"\u001b[0;31mKeyError\u001b[0m: 'author'"
]
}
],
"source": [
"source": [
"# This code will fail with an error because one of our books doesn't have an author\n",
"# This code will fail with an error because one of our books doesn't have an author\n",
"pulling 56bb8bd477a5... 100% ▕████████████████▏ 96 B \n",
"pulling 34bb5ab01051... 100% ▕████████████████▏ 561 B \n",
"verifying sha256 digest \n",
"writing manifest \n",
"success \u001b[?25h\n"
]
}
],
"source": [
"source": [
"# Let's just make sure the model is loaded\n",
"# Let's just make sure the model is loaded\n",
"\n",
"\n",
@@ -136,10 +153,46 @@
},
{
"cell_type": "code",
"execution_count": 7,
"id": "42b9f644-522d-4e05-a691-56e7658c0ea9",
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"Generative AI has numerous business applications across various industries, including:\n",
"\n",
"1. **Content Generation**: Automate content creation for social media, blogs, and websites using text-to-image models like DALL-E or Midjourney.\n",
"2. **Image Editing**: Use generative AI to edit photos, create custom illustrations, or generate realistic images for marketing materials, product designs, or advertising campaigns.\n",
"3. **Personalized Recommendations**: Develop AI-powered recommendation systems that suggest products, services, or content based on individual user behavior and preferences using generative models like Generative Adversarial Networks (GANs) or Variational Autoencoders (VAEs).\n",
"4. **Chatbots and Virtual Assistants**: Create conversational interfaces that use natural language processing (NLP) and machine learning to understand user queries, generate responses, and provide personalized support.\n",
"5. **Design and Prototyping**: Utilize generative AI to design new products, furniture, or buildings by generating 3D models, architectural designs, or product prototypes quickly and efficiently.\n",
"6. **Marketing and Advertising**: Leverage generative AI to create engaging ad copy, generate social media posts, or design eye-catching ads using text-to-image models like Deep Dream Generator.\n",
"7. **Product Development**: Use generative AI to design and optimize products for various industries, such as fashion, electronics, or automotive.\n",
"8. **Financial Modeling**: Develop generative models that predict market trends, forecast revenue, and identify opportunities for investment using techniques like GANs or VAEs.\n",
"9. **Customer Service**: Implement AI-powered chatbots and virtual assistants to provide 24/7 customer support, respond to inquiries, and resolve issues efficiently.\n",
"10. **Innovation and Research**: Utilize generative AI as a tool for idea generation, prototyping, and experimentation in various fields like science, technology, engineering, and mathematics (STEM).\n",
"11. **Translation and Localization**: Develop generative models that translate text, speech, or images from one language to another, making it easier to expand global reach.\n",
"12. **Music and Audio Generation**: Create new music compositions, generate beats, or even create entire soundtracks using AI-powered music generation tools.\n",
"\n",
"These applications demonstrate the vast potential of Generative AI in transforming various industries and revolutionizing how businesses operate, innovate, and interact with customers.\n",
"\n",
"**Key Industries Affected:**\n",
"\n",
"1. E-commerce\n",
"2. Advertising\n",
"3. Finance\n",
"4. Healthcare\n",
"5. Education\n",
"6. Entertainment\n",
"7. Media\n",
"8. Retail\n",
"\n",
"These are just a few examples of the many business applications of Generative AI. As this technology continues to evolve, we can expect even more innovative and creative use cases across various industries.\n"
]
}
],
"source": [
"source": [
"# If this doesn't work for any reason, try the 2 versions in the following cells\n",
"# If this doesn't work for any reason, try the 2 versions in the following cells\n",
"# And double check the instructions in the 'Recap on installation of Ollama' at the top of this lab\n",
"# And double check the instructions in the 'Recap on installation of Ollama' at the top of this lab\n",
@@ -163,10 +216,50 @@
},
{
"cell_type": "code",
"execution_count": 8,
"id": "7745b9c4-57dc-4867-9180-61fa5db55eb8",
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"Generative AI has numerous business applications across various industries. Some of the most promising uses include:\n",
"\n",
"1. **Content Creation**: Generative AI can be used to generate high-quality content such as articles, social media posts, product descriptions, and more. This can help businesses save time and resources while maintaining consistency in their content.\n",
"\n",
"2. **Product Design and Engineering**: Generative AI can aid in designing new products by creating 3D models, simulations, and prototypes. This can be particularly useful for companies looking to innovate or improve existing products.\n",
"\n",
"3. **Marketing and Advertising**: Generative AI can help generate ad copy, product descriptions, and even entire marketing campaigns. It can also assist in personalizing ads based on user behavior and preferences.\n",
"\n",
"4. **Customer Service Chatbots**: Generative AI-powered chatbots can be used to provide 24/7 customer support, answering frequently asked questions, and routing complex issues to human representatives.\n",
"\n",
"5. **Music and Audio Generation**: Generative AI can create original music tracks, sound effects, and audio loops for various applications such as film scores, video games, and advertising.\n",
"\n",
"6. **Image and Video Editing**: Generative AI-powered tools can edit images and videos with unprecedented speed and accuracy. This can help businesses streamline their visual content creation processes.\n",
"\n",
"7. **Recommendation Systems**: Generative AI can be used to create personalized product recommendations based on user behavior, preferences, and demographics.\n",
"\n",
"8. **Text Summarization and Translation**: Generative AI-powered tools can summarize long documents into concise summaries, making it easier for businesses to communicate complex information quickly.\n",
"\n",
"9. **Financial Modeling and Forecasting**: Generative AI can be used to analyze large datasets, identify patterns, and make predictions about future financial trends and market performance.\n",
"\n",
"10. **Risk Analysis and Compliance**: Generative AI-powered tools can analyze vast amounts of data to identify potential risks and compliance issues, helping businesses stay ahead of regulatory requirements.\n",
"\n",
"11. **Supply Chain Optimization**: Generative AI can help optimize supply chain operations by predicting demand, identifying bottlenecks, and suggesting efficient logistics routes.\n",
"\n",
"12. **Healthcare Data Analysis**: Generative AI-powered tools can analyze large healthcare datasets to identify trends, diagnose diseases, and develop personalized treatment plans.\n",
"\n",
"13. **Real Estate Property Valuation**: Generative AI can estimate property values based on historical data, market trends, and location information, helping real estate agents and investors make informed decisions.\n",
"\n",
"14. **Cybersecurity Threat Analysis**: Generative AI-powered tools can analyze network traffic, identify potential security threats, and predict the likelihood of successful attacks.\n",
"\n",
"15. **Education and Training Development**: Generative AI can help create personalized learning experiences by generating customized educational materials, quizzes, and assessments.\n",
"\n",
"These are just a few examples of the many business applications of Generative AI. As the technology continues to evolve, we can expect even more innovative uses in various industries.\n"
]
}
],
"source": [
"source": [
"import ollama\n",
"import ollama\n",
"\n",
"\n",
@@ -184,10 +277,38 @@
},
{
"cell_type": "code",
"execution_count": 9,
"id": "23057e00-b6fc-4678-93a9-6b31cb704bff",
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"Generative AI has numerous business applications across various industries. Here are some examples:\n",
"\n",
"1. **Content Creation**: Generative AI can be used to create high-quality, personalized content such as articles, social media posts, and product descriptions, saving time and resources for content creation teams.\n",
"2. **Product Design and Development**: Generative AI can help design new products, logos, and packaging by generating 3D models, textures, and images based on input parameters such as color, shape, and style.\n",
"3. **Marketing Automation**: Generative AI can be used to create personalized marketing campaigns, automating email marketing, social media ads, and lead generation.\n",
"4. **Customer Service Chatbots**: Generative AI-powered chatbots can analyze customer queries, provide personalized responses, and help resolve issues faster.\n",
"5. **Virtual Assistants**: Generative AI can be integrated with virtual assistants like Siri, Alexa, or Google Assistant to provide customers with personalized information and assistance.\n",
"6. **Data Analysis and Insights**: Generative AI can help analyze large datasets, identify patterns, and generate insights for business decision-making.\n",
"7. **Predictive Maintenance**: Generative AI can be used to predict equipment failures, schedule maintenance, and optimize resource allocation in industries like manufacturing and healthcare.\n",
"8. **Financial Modeling**: Generative AI can help create complex financial models, predict market trends, and simulate different scenarios.\n",
"9. **Human Resources**: Generative AI can be used for tasks such as resume screening, interview suggestions, and employee onboarding.\n",
"10. **Creative Visualization**: Generative AI can generate 2D and 3D visualizations of products, buildings, or landscapes, helping architects, designers, and real estate developers visualize their ideas.\n",
"\n",
"Some specific business applications include:\n",
"\n",
"* **E-commerce**: Using generative AI to create product images, design packaging, and optimize listings.\n",
"* **Finance**: Using generative AI for forecasting, risk analysis, and portfolio optimization.\n",
"* **Healthcare**: Using generative AI to analyze medical data, predict patient outcomes, and develop personalized treatment plans.\n",
"* **Manufacturing**: Using generative AI to design new products, simulate production processes, and optimize supply chains.\n",
"\n",
"These are just a few examples of the many business applications of Generative AI. As the technology continues to evolve, we can expect to see even more innovative uses across various industries.\n"
]
}
],
"source": [
"source": [
"# There's actually an alternative approach that some people might prefer\n",
"# There's actually an alternative approach that some people might prefer\n",
"# You can use the OpenAI client python library to call Ollama:\n",
"# You can use the OpenAI client python library to call Ollama:\n",
"pulling f4d24e9138dd... 100% ▕████████████████▏ 148 B \n",
"pulling a85fe2a2e58e... 100% ▕████████████████▏ 487 B \n",
"verifying sha256 digest \n",
"writing manifest \n",
"success \u001b[?25h\n"
]
}
],
"source": [
"source": [
"!ollama pull deepseek-r1:1.5b"
"!ollama pull deepseek-r1:1.5b"
]
]
},
{
"cell_type": "code",
"execution_count": 11,
"id": "1d3d554b-e00d-4c08-9300-45e073950a76",
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"<think>\n",
"Okay, so I need to figure out the definitions of some key concepts behind large language models (LLMs), focusing on neural networks, attention, and transformers. Hmm, where should I start? Well, first off, I remember that LLMs are these AI models designed to understand and generate human languages, like speech or texts. They probably need strong processing capabilities to handle this.\n",
"\n",
"Starting with neural networks. From what I recall, neural networks are a type of machine learning model inspired by the structure of the human brain's nervous system. So they consist of layers of nodes called neurons that process information through connections representing weighted edges between them. But wait, how does a typical LLM use these? They probably process inputs layer by layer, transforming and combining features iteratively. I think each layer might correspond to different levels of understanding or processing in the language.\n",
"\n",
"Moving on to attention, which seems important for LLMs since they're such good at capturing context. I remember hearing about self-attention being a key mechanism here. Unlike traditional neural networks that treat all input tokens as linearly ordered without relationships, self-attention allows models to consider all possible pairs of words, not just their sequential order. This means the model can attend to different parts of its output when making predictions at each step. It's like considering distant dependencies beyond just next words.\n",
"\n",
"Now transformers—this part might be more complex or specific. The term \"transformers\" was used by纸片 authors like Vaswani in 2017 with a paper called \"Attention is All You Need.\" They introduced something called the \"transformer architecture,\" which consists of multiple attention layers. Unlike recurrent neural networks (RNNs), which have memory cells, transformers don't need them because each encoding layer processes all token pairs using self-attention. This makes processing in parallel, similar to how convolutional networks work on image data.\n",
"\n",
"I'm a bit fuzzy on the term \"permutation\" mentioned earlier when talking about token order. Oh right, without attention, tokens would just be reordered by permutation. But with attention, models can attend to different input positions globally and focus on specific parts based on context. This helps build more refined representations that capture semantically relevant information.\n",
"\n",
"Putting it all together: LLMs are neural networks using self-attention mechanisms to handle the order of inputs, which is then processed through multiple transformer layers that allow parallel attention across all tokens. So each layer in a transformer processes the output, gradually building up a deep and contextual representation that captures both local dependencies from earlier layers and global context from higher-order attention.\n",
"\n",
"Wait, I'm trying to remember if there are any other concepts or maybe some mistakes I might have made. For example, is there more than one self-attention layer? Or does each transformer layer handle multiple aspects through different weights? Also, how exactly do these attention weights work? Do they score alignments between tokens and adjust the output based on those scores?\n",
"\n",
"I think I've covered the main points by considering the structure of transformers as composed of self-attention layers, which process all pairs of tokens while paying attention to their position in input. This allows models to create more sophisticated representations than simpler architectures like RNNs or static word embeddings.\n",
"\n",
"Hmm, maybe a summary would help: LLMs are neural networks that use attention mechanisms to handle sequence data by considering all possible relationships between inputs, through layers of self-attention and permutation-free processing. Transformers provide an efficient way to implement these with their layer-wise attention mechanism and parallel processing capabilities compared to recurrent approaches.\n",
"</think>\n",
"\n",
"Large Language Models (LLMs) are advanced AI systems designed to understand and generate human languages. They rely on several key concepts from neural networks, attention mechanisms, and transformer architectures to achieve this capability.\n",
"\n",
"1. **Neural Networks**: Neural networks are inspired by biological neural systems, composed of interconnected nodes called neurons. These nodes process information through layers of connections that represent weighted edges between them. In the context of LLMs, each layer processes input data iteratively, transforming features, and combining them to generate outputs.\n",
"\n",
"2. **Attention Mechanisms**: Attention is a core mechanism in LLMs, enabling models to consider all pairs of tokens rather than just their sequential order. This is achieved through self-attention, which allows the model to attend to different parts of its output when making predictions at each step. Unlike traditional neural networks, attention enables the model to capture complex contextual dependencies.\n",
"\n",
"3. **Transformer Architecture**: The term \"transformer\" was introduced by Vaswani et al. in 2017 with a paper titled \"Attention is All You Need.\" Transformers consist of multiple self-attention layers where each layer processes all token pairs using self-attention. This architecture eliminates the need for sequential memory, allowing each processing step to be done in parallel. The absence of sequence dependencies improves feature representation and allows models to build refined representations that capture semantically relevant information.\n",
"\n",
"In summary, LLMs utilize neural networks with self-attention mechanisms to process input sequences and transformer architectures, which provide a permutation-free approach by considering all possible token pairs through multiple layers. These components enable the creation of sophisticated representations that enhance both local and global dependencies in language processing.\n"
]
}
],
"source": [
"source": [
"# This may take a few minutes to run! You should then see a fascinating \"thinking\" trace inside <think> tags, followed by some decent definitions\n",
"# This may take a few minutes to run! You should then see a fascinating \"thinking\" trace inside <think> tags, followed by some decent definitions\n",
"\n",
"\n",
@@ -255,9 +795,173 @@
},
{
"cell_type": "code",
"execution_count": 12,
"id": "6de38216-6d1c-48c4-877b-86d403f4e0f8",
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"API key found and looks good so far!\n"
]
}
],
"source": [
"import os\n",
"from dotenv import load_dotenv\n",
"from openai import OpenAI\n",
"\n",
"load_dotenv(override=True)\n",
"api_key = os.getenv('OPENAI_API_KEY')\n",
"\n",
"# Check the key\n",
"\n",
"if not api_key:\n",
" print(\"No API key was found - please head over to the troubleshooting notebook in this folder to identify & fix!\")\n",
"elif not api_key.startswith(\"sk-proj-\"):\n",
" print(\"An API key was found, but it doesn't start sk-proj-; please check you're using the right key - see troubleshooting notebook\")\n",
"elif api_key.strip() != api_key:\n",
" print(\"An API key was found, but it looks like it might have space or tab characters at the start or end - please remove them - see troubleshooting notebook\")\n",
"else:\n",
" print(\"API key found and looks good so far!\")\n"
]
},
{
"cell_type": "code",
"execution_count": 13,
"id": "8f54551d-9a98-4824-9604-cac56b315ae3",
"metadata": {},
"outputs": [],
"source": [
"headers = {\n",
" \"User-Agent\": \"Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/117.0.0.0 Safari/537.36\"\n",
"}\n",
"\n",
"class Website:\n",
"\n",
" def __init__(self, url):\n",
" \"\"\"\n",
" Create this Website object from the given url using the BeautifulSoup library\n",
"SUMMARY <$model='llama3.2' created_at='2025-03-04T00:12:00.263857Z' done=True done_reason='stop' total_duration=2697124208 load_duration=40565500 prompt_eval_count=311 prompt_eval_duration=165000000 eval_count=202 eval_duration=2488000000 message=Message(role='assistant', content='### Website Summary\\n\\n#### Overview\\nSpotlight Monitor is a company that provides solutions for improving Salesforce security. Their website offers various tools and services to help businesses reduce risk and identify bad actors in their Salesforce ecosystem.\\n\\n#### Key Features\\n\\n* **SpotMon**: A solution to assess your Salesforce risk, ensure compliance, and mitigate risk.\\n* **Salesforce Security Assessment**: A way to monitor user behavior to identify suspicious activity.\\n* **Shield Quickstarts**: Pre-built solutions to speed up Salesforce Shield implementation.\\n\\n#### News/Announcements\\nThe website mentions that:\\n* 65% of data breaches involve internal actors.\\n* Salesforce stores sensitive mission-critical data, making it a prime target for security threats.\\n* Most Salesforce customers have a limited understanding of what their users actually do, leaving them vulnerable to security risks.\\n* The company offers a recent webinar clip available on their website.\\n\\n#### Contact Information\\nSpotlight Monitor provides contact information, including a link to talk with an expert about improving your Salesforce security posture.', images=None, tool_calls=None)>\n"
"ChatResponse(model='llama3.2', created_at='2025-03-04T00:10:25.224805Z', done=True, done_reason='stop', total_duration=1946978625, load_duration=36116708, prompt_eval_count=311, prompt_eval_duration=166000000, eval_count=142, eval_duration=1741000000, message=Message(role='assistant', content='### Website Summary\\n\\n**Overview**\\nSpotlight Monitor is a cybersecurity firm specializing in Salesforce security solutions. The website provides various services to help businesses reduce risk and identify bad actors within their Salesforce ecosystems.\\n\\n**Key Features**\\n\\n* **SpotMon**: A solution to assess Salesforce risk and ensure compliance.\\n* **Shield Quickstarts**: Accelerate Salesforce Shield implementation for faster security.\\n* **Salesforce Security Assessment**: Monitor user behavior to detect suspicious activity.\\n\\n### News/Announcements\\n\\n* **Data Breach Statistics**: 65% of data breaches involve internal actors, highlighting the importance of effective security measures.\\n* **Webinar Clips**: The website mentions recent webinar clips available for more insights into Salesforce security best practices.', images=None, tool_calls=None))"