{ "cells": [ { "cell_type": "markdown", "id": "5c291475-8c7c-461c-9b12-545a887b2432", "metadata": {}, "source": [ "# Intermediate Level Python\n", "\n", "## Getting you up to speed\n", "\n", "This course assumes that you're at an intermediate level of python. For example, you should have a decent idea what something like this might do:\n", "\n", "`yield from {book.get(\"author\") for book in books if book.get(\"author\")}`\n", "\n", "If not - then you've come to the right place! Welcome to the crash course in intermediate level python. The best way to learn is by doing!\n" ] }, { "cell_type": "markdown", "id": "542f0577-a826-4613-a5d7-4170e9666d04", "metadata": {}, "source": [ "## First: if you need a refresher on the foundations\n", "\n", "I'm going to defer to an AI friend for this, because these explanations are so well written with great examples. Copy and paste the code examples into a new cell to give them a try. Pick whichever section(s) you'd like to brush up on.\n", "\n", "**Python imports:** \n", "https://chatgpt.com/share/672f9f31-8114-8012-be09-29ef0d0140fb\n", "\n", "**Python functions** including default arguments: \n", "https://chatgpt.com/share/672f9f99-7060-8012-bfec-46d4cf77d672\n", "\n", "**Python strings**, including slicing, split/join, replace and literals: \n", "https://chatgpt.com/share/672fb526-0aa0-8012-9e00-ad1687c04518\n", "\n", "**Python f-strings** including number and date formatting: \n", "https://chatgpt.com/share/672fa125-0de0-8012-8e35-27918cbb481c\n", "\n", "**Python lists, dicts and sets**, including the `get()` method: \n", "https://chatgpt.com/share/672fa225-3f04-8012-91af-f9c95287da8d\n", "\n", "**Python files** including modes, encoding, context managers, Path, glob.glob: \n", "https://chatgpt.com/share/673b53b2-6d5c-8012-a344-221056c2f960\n", "\n", "**Python classes:** \n", "https://chatgpt.com/share/672fa07a-1014-8012-b2ea-6dc679552715\n", "\n", "**Pickling Python objects and converting to JSON:** \n", "https://chatgpt.com/share/673b553e-9d0c-8012-9919-f3bb5aa23e31" ] }, { "cell_type": "markdown", "id": "f9e0f8e1-09b3-478b-ada7-c8c35003929b", "metadata": {}, "source": [ "## With this in mind - understanding NameErrors in Python\n", "\n", "It's quite common to hit a NameError in python. With foundational knowledge, you should always feel equipped to debug a NameError and get to the bottom of it.\n", "\n", "If you're unsure how to fix a NameError, please see this [initial guide](https://chatgpt.com/share/67958312-ada0-8012-a1d3-62b3a5fcbbfc) and this [second guide with exercises](https://chatgpt.com/share/67a57e0b-0194-8012-bb50-8ea76c5995b8), and work through them both until you have high confidence.\n", "\n", "There's some repetition here, so feel free to skip it if you're already confident.\n", "\n", "## And now, on to the code!" 
] }, { "cell_type": "code", "execution_count": 7, "id": "5802e2f0-0ea0-4237-bbb7-f375a34260f0", "metadata": {}, "outputs": [], "source": [ "# First let's create some things:\n", "\n", "fruits = [\"Apples\", \"Bananas\", \"Pears\"]\n", "\n", "book1 = {\"title\": \"Great Expectations\", \"author\": \"Charles Dickens\"}\n", "book2 = {\"title\": \"Bleak House\", \"author\": \"Charles Dickens\"}\n", "book3 = {\"title\": \"An Book By No Author\"}\n", "book4 = {\"title\": \"Moby Dick\", \"author\": \"Herman Melville\"}\n", "\n", "books = [book1, book2, book3, book4]" ] }, { "cell_type": "markdown", "id": "9b941e6a-3658-4144-a8d4-72f5e72f3707", "metadata": {}, "source": [ "# Part 1: List and dict comprehensions" ] }, { "cell_type": "code", "execution_count": 8, "id": "61992bb8-735d-4dad-8747-8c10b63aec82", "metadata": {}, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "Apples\n", "Bananas\n", "Pears\n" ] } ], "source": [ "# Simple enough to start\n", "\n", "for fruit in fruits:\n", " print(fruit)" ] }, { "cell_type": "code", "execution_count": 9, "id": "c89c3842-9b74-47fa-8424-0fcb08e4177c", "metadata": {}, "outputs": [ { "data": { "text/plain": [ "['APPLES', 'BANANAS', 'PEARS']" ] }, "execution_count": 9, "metadata": {}, "output_type": "execute_result" } ], "source": [ "# Let's make a new version of fruits\n", "\n", "fruits_shouted = []\n", "for fruit in fruits:\n", " fruits_shouted.append(fruit.upper())\n", "\n", "fruits_shouted" ] }, { "cell_type": "code", "execution_count": 10, "id": "4ec13b3a-9545-44f1-874a-2910a0663560", "metadata": {}, "outputs": [ { "data": { "text/plain": [ "['APPLES', 'BANANAS', 'PEARS']" ] }, "execution_count": 10, "metadata": {}, "output_type": "execute_result" } ], "source": [ "# You probably already know this\n", "# There's a nice Python construct called \"list comprehension\" that does this:\n", "\n", "fruits_shouted2 = [fruit.upper() for fruit in fruits]\n", "fruits_shouted2" ] }, { "cell_type": "code", "execution_count": 11, "id": "ecc08c3c-181d-4b64-a3e1-b0ccffc6c0cd", "metadata": {}, "outputs": [ { "data": { "text/plain": [ "{'Apples': 'APPLES', 'Bananas': 'BANANAS', 'Pears': 'PEARS'}" ] }, "execution_count": 11, "metadata": {}, "output_type": "execute_result" } ], "source": [ "# But you may not know that you can do this to create dictionaries, too:\n", "\n", "fruit_mapping = {fruit: fruit.upper() for fruit in fruits}\n", "fruit_mapping" ] }, { "cell_type": "code", "execution_count": 12, "id": "500c2406-00d2-4793-b57b-f49b612760c8", "metadata": {}, "outputs": [ { "data": { "text/plain": [ "['APPLES', 'BANANAS']" ] }, "execution_count": 12, "metadata": {}, "output_type": "execute_result" } ], "source": [ "# you can also use the if statement to filter the results\n", "\n", "fruits_with_longer_names_shouted = [fruit.upper() for fruit in fruits if len(fruit)>5]\n", "fruits_with_longer_names_shouted" ] }, { "cell_type": "code", "execution_count": 13, "id": "38c11c34-d71e-45ba-945b-a3d37dc29793", "metadata": {}, "outputs": [ { "data": { "text/plain": [ "{'Bananas': 'BANANAS', 'Pears': 'PEARS'}" ] }, "execution_count": 13, "metadata": {}, "output_type": "execute_result" } ], "source": [ "fruit_mapping_unless_starts_with_a = {fruit: fruit.upper() for fruit in fruits if not fruit.startswith('A')}\n", "fruit_mapping_unless_starts_with_a" ] }, { "cell_type": "code", "execution_count": 14, "id": "5c97d8e8-31de-4afa-973e-28d8e5cab749", "metadata": {}, "outputs": [ { "data": { "text/plain": [ "['Great Expectations', 'Bleak House', 'An Book By No Author', 
'Moby Dick']" ] }, "execution_count": 14, "metadata": {}, "output_type": "execute_result" } ], "source": [ "# Another comprehension\n", "\n", "[book['title'] for book in books]" ] }, { "cell_type": "code", "execution_count": 17, "id": "50be0edc-a4cd-493f-a680-06080bb497b4", "metadata": {}, "outputs": [ { "ename": "KeyError", "evalue": "'author'", "output_type": "error", "traceback": [ "\u001b[1;31m---------------------------------------------------------------------------\u001b[0m", "\u001b[1;31mKeyError\u001b[0m Traceback (most recent call last)", "Cell \u001b[1;32mIn[17], line 3\u001b[0m\n\u001b[0;32m 1\u001b[0m \u001b[38;5;66;03m# This code will fail with an error because one of our books doesn't have an author\u001b[39;00m\n\u001b[1;32m----> 3\u001b[0m \u001b[43m[\u001b[49m\u001b[43mbook\u001b[49m\u001b[43m[\u001b[49m\u001b[38;5;124;43m'\u001b[39;49m\u001b[38;5;124;43mauthor\u001b[39;49m\u001b[38;5;124;43m'\u001b[39;49m\u001b[43m]\u001b[49m\u001b[43m \u001b[49m\u001b[38;5;28;43;01mfor\u001b[39;49;00m\u001b[43m \u001b[49m\u001b[43mbook\u001b[49m\u001b[43m \u001b[49m\u001b[38;5;129;43;01min\u001b[39;49;00m\u001b[43m \u001b[49m\u001b[43mbooks\u001b[49m\u001b[43m]\u001b[49m\n", "Cell \u001b[1;32mIn[17], line 3\u001b[0m, in \u001b[0;36m\u001b[1;34m(.0)\u001b[0m\n\u001b[0;32m 1\u001b[0m \u001b[38;5;66;03m# This code will fail with an error because one of our books doesn't have an author\u001b[39;00m\n\u001b[1;32m----> 3\u001b[0m [\u001b[43mbook\u001b[49m\u001b[43m[\u001b[49m\u001b[38;5;124;43m'\u001b[39;49m\u001b[38;5;124;43mauthor\u001b[39;49m\u001b[38;5;124;43m'\u001b[39;49m\u001b[43m]\u001b[49m \u001b[38;5;28;01mfor\u001b[39;00m book \u001b[38;5;129;01min\u001b[39;00m books]\n", "\u001b[1;31mKeyError\u001b[0m: 'author'" ] } ], "source": [ "# This code will fail with an error because one of our books doesn't have an author\n", "\n", "[book['author'] for book in books]" ] }, { "cell_type": "code", "execution_count": 18, "id": "53794083-cc09-4edb-b448-2ffb7e8495c2", "metadata": {}, "outputs": [ { "data": { "text/plain": [ "['Charles Dickens', 'Charles Dickens', None, 'Herman Melville']" ] }, "execution_count": 18, "metadata": {}, "output_type": "execute_result" } ], "source": [ "# But this will work, because get() returns None\n", "\n", "[book.get('author') for book in books]" ] }, { "cell_type": "code", "execution_count": 19, "id": "b8e4b859-24f8-4016-8d74-c2cef226d049", "metadata": {}, "outputs": [ { "data": { "text/plain": [ "['Charles Dickens', 'Charles Dickens', 'Herman Melville']" ] }, "execution_count": 19, "metadata": {}, "output_type": "execute_result" } ], "source": [ "# And this variation will filter out the None\n", "\n", "[book.get('author') for book in books if book.get('author')]" ] }, { "cell_type": "code", "execution_count": 20, "id": "c44bb999-52b4-4dee-810b-8a400db8f25f", "metadata": {}, "outputs": [ { "data": { "text/plain": [ "{'Charles Dickens', 'Herman Melville'}" ] }, "execution_count": 20, "metadata": {}, "output_type": "execute_result" } ], "source": [ "# And this version will convert it into a set, removing duplicates\n", "\n", "set([book.get('author') for book in books if book.get('author')])" ] }, { "cell_type": "code", "execution_count": 21, "id": "80a65156-6192-4bb4-b4e6-df3fdc933891", "metadata": {}, "outputs": [ { "data": { "text/plain": [ "{'Charles Dickens', 'Herman Melville'}" ] }, "execution_count": 21, "metadata": {}, "output_type": "execute_result" } ], "source": [ "# And finally, this version is even nicer\n", "# curly braces creates a set, so this is 
a set comprehension\n", "\n", "{book.get('author') for book in books if book.get('author')}" ] }, { "cell_type": "markdown", "id": "c100e5db-5438-4715-921c-3f7152f83f4a", "metadata": {}, "source": [ "# Part 2: Generators\n", "\n", "We use Generators in the course because AI models can stream back results.\n", "\n", "If you've not used Generators before, please start with this excellent intro from ChatGPT:\n", "\n", "https://chatgpt.com/share/672faa6e-7dd0-8012-aae5-44fc0d0ec218\n", "\n", "Try pasting some of its examples into a cell." ] }, { "cell_type": "code", "execution_count": 22, "id": "1efc26fa-9144-4352-9a17-dfec1d246aad", "metadata": {}, "outputs": [], "source": [ "# First define a generator; it looks like a function, but it has yield instead of return\n", "\n", "import time\n", "\n", "def come_up_with_fruit_names():\n", " for fruit in fruits:\n", " time.sleep(1) # thinking of a fruit\n", " yield fruit" ] }, { "cell_type": "code", "execution_count": 23, "id": "eac338bb-285c-45c8-8a3e-dbfc41409ca3", "metadata": {}, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "Apples\n", "Bananas\n", "Pears\n" ] } ], "source": [ "# Then use it\n", "\n", "for fruit in come_up_with_fruit_names():\n", " print(fruit)" ] }, { "cell_type": "code", "execution_count": 24, "id": "f6880578-a3de-4502-952a-4572b95eb9ff", "metadata": {}, "outputs": [], "source": [ "# Here's another one\n", "\n", "def authors_generator():\n", " for book in books:\n", " if book.get(\"author\"):\n", " yield book.get(\"author\")" ] }, { "cell_type": "code", "execution_count": 25, "id": "9e316f02-f87f-441d-a01f-024ade949607", "metadata": {}, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "Charles Dickens\n", "Charles Dickens\n", "Herman Melville\n" ] } ], "source": [ "# Use it\n", "\n", "for author in authors_generator():\n", " print(author)" ] }, { "cell_type": "code", "execution_count": 26, "id": "7535c9d0-410e-4e56-a86c-ae6c0e16053f", "metadata": {}, "outputs": [], "source": [ "# Here's the same thing written with list comprehension\n", "\n", "def authors_generator():\n", " for author in [book.get(\"author\") for book in books if book.get(\"author\")]:\n", " yield author" ] }, { "cell_type": "code", "execution_count": 27, "id": "dad34494-0f6c-4edb-b03f-b8d49ee186f2", "metadata": {}, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "Charles Dickens\n", "Charles Dickens\n", "Herman Melville\n" ] } ], "source": [ "# Use it\n", "\n", "for author in authors_generator():\n", " print(author)" ] }, { "cell_type": "code", "execution_count": 28, "id": "abeb7e61-d8aa-4af0-b05a-ae17323e678c", "metadata": {}, "outputs": [], "source": [ "# Here's a nice shortcut\n", "# You can use \"yield from\" to yield each item of an iterable\n", "\n", "def authors_generator():\n", " yield from [book.get(\"author\") for book in books if book.get(\"author\")]" ] }, { "cell_type": "code", "execution_count": 29, "id": "05b0cb43-aa83-4762-a797-d3beb0f22c44", "metadata": {}, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "Charles Dickens\n", "Charles Dickens\n", "Herman Melville\n" ] } ], "source": [ "# Use it\n", "\n", "for author in authors_generator():\n", " print(author)" ] }, { "cell_type": "code", "execution_count": 30, "id": "fdfea58e-d809-4dd4-b7b0-c26427f8be55", "metadata": {}, "outputs": [], "source": [ "# And finally - we can replace the list comprehension with a set comprehension\n", "\n", "def unique_authors_generator():\n", " yield from {book.get(\"author\") for book 
in books if book.get(\"author\")}" ] }, { "cell_type": "code", "execution_count": 31, "id": "3e821d08-97be-4db9-9a5b-ce5dced3eff8", "metadata": {}, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "Charles Dickens\n", "Herman Melville\n" ] } ], "source": [ "# Use it\n", "\n", "for author in unique_authors_generator():\n", " print(author)" ] }, { "cell_type": "code", "execution_count": null, "id": "905ba603-15d8-4d01-9a79-60ec293d7ca1", "metadata": {}, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "You conspire with wobbly walruses. You eat wobbly monocles. We dramatically renounce suspicious eels. They impersonate disheveled turnips. They eat melodramatic rodents. You secretly collect overripe mustaches. I philosophize about wobbly walruses. I resent pleasing eels. We pontificate about arrogant rodents. You philosophize about pleasing bagpipes. You pontificate about pompous jellyfish. They tap dance on pleasing jellyfish. We secretly collect melodramatic rodents. You impersonate pompous spreadsheets. You misplace arrogant walruses. We dramatically renounce suspicious thermostats. We philosophize about wobbly rodents. I resent whimsical walruses. They secretly collect festering spreadsheets. We resent fluorescent jellyfish. I conspire with pretentious mustaches. They worship wobbly rodents. I worship festering kumquats. You juggle bewildered wombats. You misplace suspicious jellyfish. You deny the existence of pleasing walruses. You secretly collect suspicious eels. You conspire with turqoise rodents. We deny the existence of arrogant eels. We dramatically renounce wobbly wombats. I bathe in pleasing turnips. I tap dance on disheveled thermostats. You eat whimsical mustaches. You dramatically renounce turqoise wombats. They secretly collect arrogant accordions. They philosophize about turqoise turnips. You philosophize about wobbly thermostats. They bathe in melodramatic monocles. They deny the existence of wobbly walruses. I philosophize about fluorescent walruses. I impersonate festering walruses. I dramatically renounce pleasing walruses. You secretly collect disheveled spreadsheets. They detest pretentious rodents. They worship smelly monocles. We eat suspicious thermostats. You secretly collect turqoise calculators. I philosophize about suspicious jellyfish. You impersonate pleasing rodents. We misplace wobbly bagpipes. We philosophize about melodramatic walruses. I pontificate about arrogant spreadsheets. You pontificate about disheveled turnips. You dramatically renounce pretentious eels. I conspire with disheveled calculators. You philosophize about festering wombats. They resent pleasing rodents. We detest wobbly wombats. I tap dance on pleasing thermostats. I misplace bewildered wombats. I conspire with arrogant walruses. You bathe in bewildered rodents. I juggle turqoise wombats. You misplace melodramatic jellyfish. We bathe in whimsical eels. I impersonate bewildered calculators. They bathe in wobbly thermostats. I pontificate about melodramatic calculators. They worship pompous thermostats. We detest bewildered turnips. We juggle pretentious accordions. They tap dance on arrogant wombats. You secretly collect pretentious thermostats. They impersonate pleasing thermostats. You detest overripe monocles. You eat festering eels. We resent wobbly accordions. They juggle disheveled accordions. They tap dance on suspicious thermostats. I worship wobbly thermostats. You juggle melodramatic spreadsheets. I secretly collect smelly spreadsheets. 
I bathe in pleasing wombats. We dramatically renounce turqoise walruses. I resent wobbly thermostats. I tap dance on overripe jellyfish. We dramatically renounce turqoise accordions. They impersonate melodramatic rodents. We impersonate pretentious monocles. I impersonate arrogant spreadsheets. You bathe in pleasing spreadsheets. We deny the existence of arrogant jellyfish. You detest whimsical rodents. I resent fluorescent monocles. We worship pompous thermostats. I bathe in melodramatic calculators. You tap dance on pompous eels. They eat bewildered calculators. We secretly collect pompous bagpipes. I deny the existence of wobbly monocles. They resent disheveled spreadsheets. You dramatically renounce pleasing mustaches. I juggle smelly mustaches. We bathe in suspicious calculators. We secretly collect festering turnips. They impersonate pompous calculators. They impersonate overripe eels. They detest smelly spreadsheets. You pontificate about bewildered bagpipes. I resent turqoise monocles. You detest turqoise mustaches. We juggle arrogant thermostats. We juggle whimsical turnips. I misplace smelly calculators. They deny the existence of pleasing calculators. You secretly collect melodramatic monocles. They detest turqoise bagpipes. I impersonate pretentious calculators. We misplace fluorescent monocles. I philosophize about disheveled bagpipes. I detest disheveled thermostats. They misplace pretentious monocles. I philosophize about bewildered turnips. I juggle fluorescent thermostats. We impersonate overripe calculators. They impersonate overripe monocles. You secretly collect overripe walruses. They juggle arrogant mustaches. They conspire with suspicious spreadsheets. You misplace turqoise spreadsheets. You detest arrogant monocles. They secretly collect bewildered monocles. They detest melodramatic kumquats. They pontificate about suspicious calculators. You pontificate about pretentious wombats. We detest suspicious accordions. They detest festering kumquats. We bathe in disheveled thermostats. I juggle overripe wombats. They secretly collect pleasing accordions. I juggle overripe jellyfish. You tap dance on pleasing turnips. I tap dance on turqoise kumquats. We secretly collect suspicious spreadsheets. We secretly collect fluorescent turnips. They tap dance on disheveled wombats. They eat overripe wombats. They tap dance o" ] } ], "source": [ "# And for some fun - press the stop button in the toolbar when bored!\n", "# It's like we've made our own Large Language Model... although not particularly large..\n", "# See if you understand why it prints a letter at a time, instead of a word at a time. 
If you're unsure, try removing the keyword \"from\" everywhere in the code.\n", "\n", "import random\n", "import time\n", "\n", "pronouns = [\"I\", \"You\", \"We\", \"They\"]\n", "verbs = [\"eat\", \"detest\", \"bathe in\", \"deny the existence of\", \"resent\", \"pontificate about\", \"juggle\", \"impersonate\", \"worship\", \"misplace\", \"conspire with\", \"philosophize about\", \"tap dance on\", \"dramatically renounce\", \"secretly collect\"]\n", "adjectives = [\"turquoise\", \"smelly\", \"arrogant\", \"festering\", \"pleasing\", \"whimsical\", \"disheveled\", \"pretentious\", \"wobbly\", \"melodramatic\", \"pompous\", \"fluorescent\", \"bewildered\", \"suspicious\", \"overripe\"]\n", "nouns = [\"turnips\", \"rodents\", \"eels\", \"walruses\", \"kumquats\", \"monocles\", \"spreadsheets\", \"bagpipes\", \"wombats\", \"accordions\", \"mustaches\", \"calculators\", \"jellyfish\", \"thermostats\"]\n", "\n", "def infinite_random_sentences():\n", "    while True:\n", "        yield from random.choice(pronouns)\n", "        yield \" \"\n", "        yield from random.choice(verbs)\n", "        yield \" \"\n", "        yield from random.choice(adjectives)\n", "        yield \" \"\n", "        yield from random.choice(nouns)\n", "        yield \". \"\n", "\n", "for letter in infinite_random_sentences():\n", "    print(letter, end=\"\", flush=True)\n", "    time.sleep(0.02)" ] }, { "cell_type": "markdown", "id": "04832ea2-2447-4473-a449-104f80e24d85", "metadata": {}, "source": [ "# Exercise\n", "\n", "Write some Python classes for the books example.\n", "\n", "Write a `Book` class with a title and author. Include a `has_author()` method.\n", "\n", "Write a `BookShelf` class with a list of books. Include a generator method `unique_authors()`." ] }, { "cell_type": "markdown", "id": "35760406-fe6c-41f9-b0c0-3e8cf73aafd0", "metadata": {}, "source": [ "# Finally\n", "\n", "Here are some intermediate-level details of classes from our AI friend, including the use of type hints, inheritance and class methods. This includes a `Book` example.\n", "\n", "https://chatgpt.com/share/67348aca-65fc-8012-a4a9-fd1b8f04ba59" ] } ], "metadata": { "kernelspec": { "display_name": "Python 3 (ipykernel)", "language": "python", "name": "python3" }, "language_info": { "codemirror_mode": { "name": "ipython", "version": 3 }, "file_extension": ".py", "mimetype": "text/x-python", "name": "python", "nbconvert_exporter": "python", "pygments_lexer": "ipython3", "version": "3.11.11" } }, "nbformat": 4, "nbformat_minor": 5 }